Monday, September 2, 2013
The Other 99 Percent of Syrian Casualties. By Barry Rubin.
The other 99 percent of Syrian casualties. By Barry Rubin. Jerusalem Post, September 1, 2013.
Striving to Survive Beyond Urban Elites’ Horizons. By Salena Zito.
Striving to Survive Beyond Urban Elites’ Horizons. By Salena Zito. Real Clear Politics, September 2, 2013.
Indian Country. By Robert D. Kaplan.
Indian Country. By Robert D. Kaplan. Wall Street Journal, September 21, 2004. Also here.
Kaplan:
An overlooked truth about the war on terrorism, and the war in Iraq in particular, is that they both arrived too soon for the American military: before it had adequately transformed itself from a dinosauric, Industrial Age beast to a light and lethal instrument skilled in guerrilla warfare, attuned to the local environment in the way of the 19th-century Apaches. My mention of the Apaches is deliberate. For in a world where mass infantry invasions are becoming politically and diplomatically prohibitive – even as dirty little struggles proliferate, featuring small clusters of combatants hiding out in Third World slums, deserts and jungles – the American military is back to the days of fighting the Indians.
The red Indian metaphor is one with which a liberal policy nomenklatura may be uncomfortable, but Army and Marine field officers have embraced it because it captures perfectly the combat challenge of the early 21st century. But they don’t mean it as a slight against the Native North Americans. The fact that radio call signs so often employ Indian names is an indication of the troops’ reverence for them. The range of Indian groups, numbering in their hundreds, that the U.S. Cavalry and Dragoons had to confront was no less varied than that of the warring ethnic and religious militias spread throughout Eurasia, Africa and South America in the early 21st century. When the Cavalry invested Indian encampments, they periodically encountered warrior braves beside women and children, much like Fallujah. Though most Cavalry officers tried to spare the lives of noncombatants, inevitable civilian casualties raised howls of protest among humanitarians back East, who, because of the dissolution of the conscript army at the end of the Civil War, no longer empathized with a volunteer force beyond the Mississippi that was drawn from the working classes.
* * *
Indian Country has been expanding in recent years because of the security vacuum created by the collapse of traditional dictatorships and the emergence of new democracies – whose short-term institutional weaknesses provide whole new oxygen systems for terrorists. Iraq is but a microcosm of the earth in this regard. To wit, the upsurge of terrorism in the vast archipelago of Indonesia, the southern Philippines and parts of Malaysia is a direct result of the anarchy unleashed by the passing of military regimes. Likewise, though many do not realize it, a more liberalized Middle East will initially see greater rather than lesser opportunities for terrorists. As the British diplomatist Harold Nicolson understood, public opinion is not necessarily enlightened merely because it has been suppressed.
I am not suggesting that we should not work for free societies. I am suggesting that our military-security establishment be under no illusions regarding the immediate consequences.
In Indian Country, it is not only the outbreak of a full-scale insurgency that must be avoided, but the arrival in significant numbers of the global media. It would be difficult to fight more cleanly than the Marines did in Fallujah. Yet that still wasn’t a high enough standard for independent foreign television voices such as al-Jazeera, whose very existence owes itself to the creeping liberalization in the Arab world for which the U.S. is largely responsible. For the more we succeed in democratizing the world, not only the more security vacuums that will be created, but the more constrained by newly independent local medias our military will be in responding to those vacuums. From a field officer’s point of view, an age of democracy means an age of restrictive ROEs (rules of engagement).
The American military now has the most thankless task of any military in the history of warfare: to provide the security armature for an emerging global civilization that, the more it matures – with its own mass media and governing structures – the less credit and sympathy it will grant to the very troops who have risked and, indeed, given their lives for it. And as the thunderous roar of a global cosmopolitan press corps gets louder – demanding the application of abstract principles of universal justice that, sadly, are often neither practical nor necessarily synonymous with American national interest – the smaller and more low-key our deployments will become. In the future, military glory will come down to shadowy, page-three skirmishes around the globe, that the armed services will quietly celebrate among their own subculture.
The goal will be suppression of terrorist networks through the training of – and combined operations with – indigenous troops. That is why the Pan-Sahel Initiative in Africa, in which Marines and Army Special Forces have been training local militaries in Mauritania, Mali, Niger and Chad, in order to counter al-Qaeda infiltration of sub-Saharan Africa, is a surer paradigm for the American imperial future than anything occurring in Iraq or Afghanistan.
In months of travels with the American military, I have learned that the smaller the American footprint and the less notice it draws from the international media, the more effective is the operation. One good soldier-diplomat in a place like Mongolia can accomplish miracles. A few hundred Green Berets in Colombia and the Philippines can be adequate force multipliers. Ten thousand troops, as in Afghanistan, can tread water. And 130,000, as in Iraq, constitutes a mess that nobody wants to repeat – regardless of one’s position on the war.
In Indian Country, the smaller the tactical unit, the more forward deployed it is, and the more autonomy it enjoys from the chain of command, the more that can be accomplished. It simply isn’t enough for units to be out all day in Iraqi towns and villages engaged in presence patrols and civil-affairs projects: A successful FOB (forward operating base) is a nearly empty one, in which most units are living beyond the base perimeters among the indigenous population for days or weeks at a time.
Much can be learned from our ongoing Horn of Africa experience. From a base in Djibouti, small U.S. military teams have been quietly scouring an anarchic region that because of an Islamic setting offers al Qaeda cultural access. “Who needs meetings in Washington?” one Army major told me. “Guys in the field will figure out what to do. I took 10 guys to explore eastern Ethiopia. In every town people wanted a bigger American presence. They know we’re here, they want to see what we can do for them.” The new economy-of-force paradigm being pioneered in the Horn borrows more from the Lewis and Clark expedition than from the major conflicts of the 20th century.
In Indian Country, as one general officer told me, “you want to whack bad guys quietly and cover your tracks with humanitarian-aid projects.” Because of the need for simultaneous military, relief and diplomatic operations, our greatest enemy is the size, rigidity and artificial boundaries of the Washington bureaucracy. Thus, the next administration, be it Republican or Democrat, will have to advance the merging of the departments of State and Defense as never before, or risk failure. A strong secretary of state who rides roughshod over a less dynamic defense secretary – as a Democratic administration appears to promise – will only compound the problems created by the Bush administration, in which the opposite has occurred. The two secretaries must work in unison, planting significant numbers of State Department personnel inside the military’s war fighting commands, and defense personnel inside a modernized Agency for International Development.
The Plains Indians were ultimately vanquished not because the U.S. Army adapted to the challenge of an unconventional enemy. It never did. In fact, the Army never learned the lesson that small units of foot soldiers were more effective against the Indians than large mounted regiments burdened by the need to carry forage for horses: whose contemporary equivalent are convoys of humvees bristling with weaponry that are easily immobilized by an improvised bicycle bomb planted by a lone insurgent. Had it not been for a deluge of settlers aided by the railroad, security never would have been brought to the Old West.
Now there are no new settlers to help us, nor their equivalent in any form. To help secure a more liberal global environment, American ground troops are going to have to learn to be more like Apaches.
U.S. Intervention in Syria: War for Virtue. By Henry Allen.
U.S. intervention in Syria: War for virtue. By Henry Allen. Washington Post, September 1, 2013. Also here.
Why do we ignore the civilians killed in American wars? By John Tirman. Washington Post, January 6, 2012.
Allen:
Where were the smiles, the flowers? We’d expected, in a modest way, to be greeted as liberators.
This was many years ago, Chu Lai, South Vietnam, 1966, in one of the early disasters of the United States’ post-World War II attempts to fight wars for virtue. People in the villages refused to meet our eyes, and they only smiled if they were selling us something.
How disappointing. The war was young then, and so were we, but not so young that we hadn’t seen newsreel footage of the cheers from the giddy urchins of Naples, the French doing their tiptoe waves.
But not the Vietnamese. It seemed that in Chu Lai, at least, the beneficiaries of our liberation and largesse hated us, or were too scared to show they liked us.
But why? Weren’t we fighting a war of liberation, another good war in the American tradition of good wars? Wasn’t my Marine civic action team giving candy to children, the same SweeTarts you could buy in American movie theaters?
The giveaway lasted two days.
“SweeTart numbah ten!” shouted the kids who swarmed our truck on the second day. “Numbah ten” meant the worst. They flung the SweeTarts back at us. We flung them back at them, no doubt losing a heart here, a mind there. The Battle of the SweeTarts. At the end of the day you’d have to say we lost it, another case of American virtue unrewarded.
The good war, the virtuous war. We believe in it. We have to believe in it or we wouldn’t be Americans.
As John Updike wrote: “America is beyond power, it acts as in a dream, as a face of God. Wherever America is, there is freedom, and wherever America is not, madness rules with chains, darkness strangles millions. Beneath her patient bombers, paradise is possible.”
The United States doesn’t fight for land, resources, hatred, revenge, tribute, religious conversion — the usual stuff. Along with the occasional barrel of oil, we fight for virtue.
Never mind that it doesn’t work out — the Gulf of Tonkin lies, Agent Orange, waterboarding, nonexistent weapons of mass destruction, the pointless horrors of Abu Ghraib, a fighter plane wiping out an Afghan wedding party, our explanation of civilian deaths as an abstraction: “collateral damage.”
Just so. We talk about our warmaking as if it were a therapeutic science — surgical strikes, precision bombing, graduated responses, a homeopathic treatment that uses war to cure us of war. “Like cures like,” as the homeopathic slogan has it; “the war to end all wars,” as Woodrow Wilson is believed to have said of World War I. We send out our patient bombers in the manner of piling on blankets to break a child’s fever. We launch our missiles and say: “We’re doing it for your own good.”
After World War II, I was taught in school that humankind, especially Americans, hate war and love peace. The United Nations rose on New York’s East River, a foundry beating swords into plowshares. We renamed the Department of War as the Department of Defense. We had Atoms for Peace, CARE packages, UNICEF boxes at Halloween and the Berlin Airlift instead of a war against the Soviet Union.
The problem here is that humankind doesn’t hate war, it loves war. That’s why it fights so many of them. The New England Indians were so devoted to fighting each other that they couldn’t unite to drive the European settlers into the sea in King Philip’s War.
What better explains all of recorded history with its atrocity, conquest, pillage and extermination? Our love of war is the problem. War is an addiction, maybe a disease, the chronic autoimmune disease of humanity. It erupts, it subsides, but it’s always there, waiting to cripple and kill us. The best we can do is hope to keep it in remission.
And yet Americans still believe in the idea of the good and virtuous war. It scratches our Calvinist itch; it proves our election to blessedness.
Thus God is on our side. Strangely enough, though, we keep losing. Since World War II, we have failed to win any land war that lasted more than a week: Korea (a stalemate), Vietnam, little ones like Lebanon and Somalia, bigger ones like Iraq and Afghanistan. Ah, but these were all intended to be good wars, saving people from themselves.
The latest target of opportunity for our patient bombers is Syria. The purity of our motives is unassailable. We would fire our missiles only to punish sin, this time in the form of poison gas. No land grab, no oil, not even an attempt to install democracy.
Oscar Wilde said: “As long as war is regarded as wicked, it will always have its fascination. When it is looked upon as vulgar, it will cease to be popular.” He didn’t foresee a United States that would regard war as virtuous.
What a dangerous idea it is.
Tirman:
Why the American silence on our wars’ main victims? Our self-image, based on what cultural historian Richard Slotkin calls “the frontier myth” — in which righteous violence is used to subdue or annihilate the savages of whatever land we’re trying to conquer — plays a large role. For hundreds of years, the frontier myth has been one of America’s sturdiest national narratives.
When the challenges from communism in Korea and Vietnam appeared, we called on these cultural tropes to understand the U.S. mission overseas. The same was true for Iraq and Afghanistan, with the news media and politicians frequently portraying Islamic terrorists as frontier savages. By framing each of these wars as a battle to civilize a lawless culture, we essentially typecast the local populations as the Indians of our North American conquest. As the foreign policy maven Robert D. Kaplan wrote on the Wall Street Journal op-ed page in 2004, “The red Indian metaphor is one with which a liberal policy nomenklatura may be uncomfortable, but Army and Marine field officers have embraced it because it captures perfectly the combat challenge of the early 21st century.”
Politicians tend to speak in broader terms, such as defending Western values, or simply refer to resistance fighters as terrorists, the 21st-century word for savages. Remember the military’s code name for the raid of Osama bin Laden’s compound? It was Geronimo.
The frontier myth is also steeped in racism, which is deeply embedded in American culture’s derogatory depictions of the enemy. Such belittling makes it all the easier to put these foreigners at risk of violence. President George W. Bush, to his credit, disavowed these wars as being against Islam, as has President Obama.
Perhaps the most compelling explanation for indifference, though, taps into our beliefs about right and wrong. More than 30 years ago, social psychologists developed the “just world” theory, which argues that humans naturally assume that the world should be orderly and rational. When that “just world” is disrupted, we tend to explain away the event as an aberration. For example, when encountering a beggar on the street, a common reaction is indifference or even anger, in the belief that no one should go hungry in America.
This explains much of our response to the violence in Korea, Vietnam, Iraq and Afghanistan. When the wars went badly and violence escalated, Americans tended to ignore or even blame the victims. The public dismissed the civilians because their high mortality rates, displacement and demolished cities were discordant with our understandings of the missions and the U.S. role in the world.
The Scared Worker. By Robert J. Samuelson.
The Scared Worker. By Robert J. Samuelson. Real Clear Politics, September 2, 2013. Also at the Washington Post.
Tours of Duty: The New Employer-Employee Contract. By Reid Hoffman, Ben Casnocha, and Chris Yeh. Harvard Business Review, June 2013.
Samuelson:
On this Labor Day, American workers face a buyers’ market. Employers have the upper hand and, given today’s languid pace of hiring, the advantage shows few signs of ending. What looms, at best, is a sluggish descent from high unemployment (7.4 percent in July) and a prolonged period of stagnant or slow-growing wages. Since 2007, there has been no gain in average inflation-adjusted wages and total compensation, including fringes, notes the Economic Policy Institute, a liberal think tank.
The weak job market has a semi-permanence unlike anything seen since World War II, and the effects on public opinion extend beyond the unemployed. “People’s expectations have been really ratcheted down for what they can expect for themselves and their children,” says EPI economist Lawrence Mishel. There’s a sense “that the economy just doesn’t produce good jobs anymore.” Possible job loss becomes more threatening because finding a new job is harder. Says Paul Taylor of the Pew Research Center: “Security is valued more than money because it’s so fragile.”
What’s occurring is the final breakdown of the post-World War II job compact, with its promises of career jobs and something close to “full employment.” The dissolution of these expectations compounds stress and uncertainty.
Over the past century, we’ve had three broad labor regimes. The first, in the early 1900s, featured “unfettered labor markets,” as economic historian Price Fishback of the University of Arizona puts it. Competition set wages and working conditions. There was no federal unemployment insurance or union protection. Workers were fired if they offended bosses or the economy slumped; they quit if they thought they could do better. Turnover was high: Fewer than a third of manufacturing workers in 1913 had been at their current jobs for more than five years.
After World War II, labor relations became more regulated and administered — the second regime. The Wagner Act of 1935 gave workers the right to organize; decisions of the National War Labor Board also favored unions. By 1945, unions represented about a third of private workers, up from 10 percent in 1929. Health insurance, pensions and job protections proliferated. Factory workers laid off during recessions could expect to be recalled when the economy recovered. Job security improved. By 1973, half of manufacturing workers had been at the same job for more than five years.
To avoid unionization and retain skilled workers, large nonunion companies emulated these practices. Career jobs were often the norm. If you went to work for IBM at 25, you could expect to retire from IBM at 65. Fringe benefits expanded. Corporate America, unionized or not, created a private welfare state to protect millions from job and income loss.
But in some ways, the guarantees were too rigid and costly. They started to unravel with the harsh 1981-82 recession (peak monthly unemployment: 10.8 percent). As time passed, companies faced increasing competition from imports and new technologies. Pressure mounted from Wall Street for higher profits. In some industries, labor became uncompetitive. Career jobs slowly vanished as a norm; managers fired workers to cut costs. Unions provided diminishing protection. In 2012, they represented only 6.6 percent of private workers. Old organized sectors (steel, autos) have shrunk. New sectors, from high tech to fast food, have proved hard to organize. Companies have ferociously resisted. (Public unionization is 36 percent, but that’s another story.)
Now comes the third labor regime: a confusing mix of old and new. The private safety net is shredding, though the public safety net (unemployment insurance, Social Security, anti-poverty programs, anti-discrimination laws) remains. Economist Fishback suggests we may be drifting back toward “unfettered labor markets” with greater personal instability, insecurity — and responsibility. Workers are often referred to as “free agents.” An article in the Harvard Business Review argues that lifetime employment at one company is dead and proposes the following compact: Companies invest in workers’ skills to make them more employable when they inevitably leave; workers reciprocate by devoting those skills to improving corporate profitability.
“The new compact isn’t about being nice,” the article says. “It’s based on an understanding that a company is its talent, that low performers will be cut, and that the way to attract talent is to offer appealing opportunities.”
Workers can’t be too picky, because their power has eroded. Another indicator: After years of stability, labor’s share — in wages and fringes — of non-farm business income slipped from 63 percent in 2000 to 57 percent in 2013, reports the White House Council of Economic Advisers. But an even greater decline in 22 other advanced countries, albeit over a longer period, suggests worldwide pressures on workers. Take your pick: globalization; new labor-saving technologies; sluggish economies. Workers do best when strong growth and tight markets raise real wages. On Labor Day 2013, this prospect is nowhere in sight.