In this Stratfor article George Friedman tries to dress up Obama's increasingly isolationist tilt, as the U.S. disengages from international flash points, as a wise move. Me, I'm not so sure -- I think building roads, spreading cell towers and in general hailing American notions of democracy will prove to be much more fertile seeds than a lot of people think at present.
Besides, dirty little wars are the lot of major powers. As unpopular as they may be, they're kind of inevitable, and avoiding them usually carries costs down the road. We shall see.
As for the Hot Stratfor Babe, since the article dealt with endless war, the endless day of the movie Groundhog Day came to mind, and so Andie MacDowell became the easy winner of this article's title of Hot Stratfor Babe.
Ms MacDowell started her career as a print model, turned to television commercials and from there moved on to a very successful stint as a leading lady in movies. She continues to be busy, working in television and film, and she still models.
Avoiding the Wars That Never End
By George Friedman, Founder and Chief Executive Officer, January 15, 2013
Last week, U.S. President Barack Obama announced that the United States would transfer the primary responsibility for combat operations in Afghanistan to the Afghan military in the coming months, a major step toward the withdrawal of U.S. forces. Also last week, France began an intervention in Mali designed to block jihadists from taking control of the country and creating a base of operations in France's former African colonies.
The two events are linked in a way that transcends the issue of Islamist insurgency and points to a larger geopolitical shift. The United States is not just drawing down its combat commitments; it is moving away from the view that it has the primary responsibility for trying to manage the world on behalf of itself, the Europeans and its other allies. Instead, that burden is shifting to those who have immediate interests involved.
Insecurity in 9/11's Wake
It is interesting to recall how the United States involved itself in Afghanistan. After 9/11, the United States was in shock and lacked clear intelligence on al Qaeda. It did not know what additional capabilities al Qaeda had or what the group's intentions were. Lacking intelligence, a political leader has the obligation to act on worst-case scenarios after the enemy has demonstrated hostile intentions and capabilities. The possible scenarios ranged from additional sleeper cells operating and awaiting orders in the United States to al Qaeda having obtained nuclear weapons to destroy cities. When you don't know, it is both prudent and psychologically inevitable to plan for the worst.
The United States had sufficient information to act in Afghanistan. It knew that al Qaeda was operating in Afghanistan and that disrupting the main cell was a useful step in taking some action against the threat. However, the United States did not immediately invade Afghanistan. It bombed the country extensively and inserted limited forces on the ground, but the primary burden of fighting the Taliban government was in the hands of anti-Taliban forces in Afghanistan that had been resisting the Taliban and in the hands of other forces that could be induced to act against the Taliban. The Taliban gave up the cities and prepared for a long war. Al Qaeda's command cell left Afghanistan and shifted to Pakistan.
The United States achieved its primary goal early on. That goal was not to deny al Qaeda the ability to operate in Afghanistan, an objective that would achieve nothing. Rather, the goal was to engage al Qaeda and disrupt its command-and-control structure as a way to degrade the group's ability to plan and execute additional attacks. The move to Pakistan at the very least bought time, and given continued pressure on the main cell, allowed the United States to gather more intelligence about al Qaeda assets around the world.
This second mission -- to identify al Qaeda assets around the world -- required a second effort. The primary means of identifying them was through their electronic communications, and the United States proceeded to create a vast technological mechanism designed to detect communications and use that detection to identify and capture or kill al Qaeda operatives. The problem with this technique -- really the only one available -- was that it was impossible to monitor al Qaeda's communications without monitoring everyone's. If there was a needle in the haystack, the entire haystack had to be examined. This was a radical shift in the government's relationship to the private communications of citizens. The justification was that at a time of war, in which the threat to the United States was uncertain and possibly massive, these measures were necessary.
This action was not unique in American history. Abraham Lincoln violated the Constitution in several ways during the Civil War, from suspending the right to habeas corpus to blocking the Maryland Legislature from voting on a secession measure. Franklin Roosevelt allowed the FBI to open citizens' mail and put Japanese-Americans into internment camps. The idea that civil liberties must be protected in time of war is not historically how the United States, or most countries, operate. In that sense there was nothing unique in the decision to monitor communications in order to find al Qaeda and stop attacks. How else could the needle be found in the haystack? Likewise, detention without trial was not unique. Lincoln and Roosevelt both resorted to it.
The Civil War and World War II were different from the current conflict, however, because their conclusions were clear and decisive. The wars would end, one way or another, and so would the suspension of rights. Unlike those wars, the war in Afghanistan was extended indefinitely by the shift in strategy from disrupting al Qaeda's command cell to fighting the Taliban to building a democratic society in Afghanistan. With the second step, the U.S. military mission changed its focus and increased its presence massively, and with the third, the terminal date of the war became very far away.
But there was a broader issue. The war in Afghanistan was not the main war. Afghanistan happened to be the place where al Qaeda was headquartered on Sept. 11, 2001. The country was not essential to al Qaeda, and creating a democratic society there -- if it were even possible -- would not necessarily weaken al Qaeda. Even destroying al Qaeda would not prevent new Islamist organizations or individuals from rising up.
A New Kind of War
The main war was not against one specific terrorist group, but rather against an idea: the radical tendency in Islamism. Most Muslims are not radicals, but any religion with 1 billion adherents will have its share of extremists. The tendency is there, and it is deeply rooted. If the goal of the war were the destruction of this radical tendency, then it was not going to happen. While the risk of attacks could be reduced -- and indeed there were no further 9/11s despite repeated attempts in the United States -- there was no way to eliminate the threat. No matter how many divisions were deployed, no matter how many systems for electronic detection were created, they could only mitigate the threat, not eliminate it. Therefore, what some called the Long War really became permanent war.
The means by which the war was pursued could not result in victory. They could, however, completely unbalance U.S. strategy by committing massive resources to missions not clearly connected with preventing Islamist terrorism. It also created a situation where emergency intrusions on critical portions of the Bill of Rights -- such as the need to obtain a warrant for certain actions -- became a permanent feature. Permanent war makes for permanent temporary measures.
The break point came, in my opinion, in about 2004. Around that time, al Qaeda was unable to mount attacks on the United States despite multiple efforts. The war in Afghanistan had dislodged al Qaeda and created the Karzai government. The invasion of Iraq -- whatever the rationale might have been -- clearly produced a level of resistance that the United States could not contain or could contain only by making agreements with its enemies in Iraq. At that point, a radical rethinking of the war had to take place. It did not.
The radical rethinking had to do not with Iraq or Afghanistan, but rather with what to do about a permanent threat to the United States, and indeed to many other countries, posed by the global networks of radical Islamists prepared to carry out terrorist attacks. The threat would not go away, and it could not be eliminated. At the same time, it did not threaten the existence of the republic. The 9/11 attacks were atrocious, but they did not threaten the survival of the United States in spite of the human cost. Combating the threat required a degree of proportionality so the fight could be maintained on an ongoing basis, without becoming the only goal of U.S. foreign policy or domestic life. Mitigation was the only possibility; the threat would have to be endured.
Washington found a way to achieve this balance in the past, albeit against very different sorts of threats. The United States emerged as a great power in the early 20th century. During that time, it fought three wars: World War I, World War II and the Cold War, which included Korea, Vietnam and other, smaller engagements. In World War I and World War II, the United States waited for events to unfold, and in Europe in particular it waited until the European powers reached a point where they could not deal with the threat of German hegemony without American intervention. In both instances, it intervened heavily only late in the war, at the point where the Germans had been exhausted by other European powers. It should be remembered that the main American push in World War II did not take place until the summer of 1944. The American strategy was to wait and see whether the Europeans could stabilize the situation themselves, using distance to mobilize as late as possible and intervene decisively only at the critical moment.
The critics of this approach, particularly prior to World War II, called it isolationism. But the United States was not isolationist; it was involved in Asia throughout this period. Rather, it saw itself as being the actor of last resort, capable of acting at the decisive moment with overwhelming force because geography had given the United States the option of time and resources.
During the Cold War, the United States modified this strategy. It still depended on allies, but it now saw itself as the first responder. Partly this could be seen in U.S. nuclear strategy. This could also be seen in Korea and Vietnam, where allies played subsidiary roles, but the primary effort was American. The Cold War was fought on a different set of principles than the two world wars.
The Cold War strategy was applied to the war against radical Islamism, in which the United States -- because of 9/11 but also because of a mindset that could be seen in other interventions -- was the first responder. Other allies followed the United States' lead and provided support to the degree to which they felt comfortable. The allies could withdraw without fundamentally undermining the war effort. The United States could not.
The approach in the U.S.-jihadist war was a complete reversal from the approach taken in the two world wars. This was understandable given that it was triggered by an unexpected and catastrophic event, the response to which flowed from a lack of intelligence. When Japan struck Pearl Harbor, emotions were at least as intense, but U.S. strategy in the Pacific was measured and cautious. And the enemy's capabilities were much better understood.
Stepping Back as Global Policeman
The United States cannot fight a war against radical Islamism and win, and it certainly cannot be the sole actor in a war waged primarily in the Eastern Hemisphere. This is why the French intervention in Mali is particularly interesting. France retains interests in its former colonial empire in Africa, and Mali is at the geographic center of these interests. To the north of Mali is Algeria, where France has significant energy investments; to the east of Mali is Niger, where France has a significant stake in the mining of mineral resources, particularly uranium; and to the south of Mali is Ivory Coast, where France plays a major role in cocoa production. The future of Mali matters to France far more than it matters to the United States.
What is most interesting is the absence of the United States in the fight, even if it is providing intelligence and other support, such as mobilizing ground forces from other African countries. The United States is not acting as if this is its fight; it is acting as if this is the fight of an ally, whom it might help in extremis, but not in a time when U.S. assistance is unnecessary. And if the French can't mount an effective operation in Mali, then little help can be given.
This changing approach is also evident in Syria, where the United States has systematically avoided anything beyond limited and covert assistance, and Libya, where the United States intervened after the French and British launched an attack they could not sustain. That was, I believe, a turning point, given the unsatisfactory outcome there. Rather than accepting a broad commitment against radical Islamism everywhere, the United States is allowing the burden to shift to powers that have direct interests in these areas.
Reversing a strategy is difficult. It is uncomfortable for any power to acknowledge that it has overreached, which the United States did both in Iraq and Afghanistan. It is even more difficult to acknowledge that the goals set by President George W. Bush in Iraq and Obama in Afghanistan lacked coherence. But clearly the war has run its course, and what is difficult is also obvious. We are not going to eliminate the threat of radical Islamism. The commitment of force to an unattainable goal twists national strategy out of shape and changes the fabric of domestic life. Obviously, overwatch must be in place against the emergence of an organization like al Qaeda, with global reach, sophisticated operatives and operational discipline. But this is very different from responding to jihadists in Mali, where the United States has limited interests and fewer resources.
Accepting an ongoing threat is also difficult. Mitigating the threat of an enemy rather than defeating the enemy outright goes against an impulse. But it is not something alien to American strategy. The United States is involved in the world, and it can't follow the founders' dictum of staying out of European struggles. But the United States has the option of following U.S. strategy in the two world wars. The United States was patient, accepted risks and shifted the burden to others, and when it acted, it acted out of necessity, with clearly defined goals matched by capabilities. Waiting until there is no choice but to go to war is not isolationism. Allowing others to carry the primary risk is not disengagement. Waging wars that are finite is not irresponsible.
The greatest danger of war is what it can do to one's own society, changing the obligations of citizens and reshaping their rights. The United States has always done this during wars, but those wars would always end. Fighting a war that cannot end reshapes domestic life permanently. A strategy that compels engagement everywhere will exhaust a country. No empire can survive the imperative of permanent, unwinnable warfare. It is fascinating to watch the French deal with Mali. It is even more fascinating to watch the United States wishing them well and mostly staying out of it. It has taken about 10 years, but here we can see the American system stabilize itself by mitigating the threats that can't be eliminated and refusing to be drawn into fights it can let others handle.
Avoiding the Wars That Never End is republished with permission of Stratfor.