Sykes-Picot Agreement Map. Signed May 8, 1916. National Archives UK, Document Record MPK 1/426. Wikipedia.
Thursday, May 19, 2016
Monday, May 16, 2016
America in 2016 Resembles 1910 More Than the Postwar Era. By Michael Barone.
America Today Resembles 1910 More Than the Postwar Era. By Michael Barone. Real Clear Politics, May 17, 2016.
Barone:
What’s your benchmark? What is the historical era with which you compare life in contemporary America?
For
many astute commentators on various points of the political spectrum, it is
postwar America, the two decades after the United States and its allies won
World War II and before Lyndon Johnson sent half a million U.S. troops to
Vietnam.
Conservatives
look back fondly on postwar America’s high marriage rates and stable families,
few divorces and out-of-wedlock births, low crime rates and widely shared
cultural values celebrated in classic movies and television sitcoms that almost
everyone watched. Liberals look back fondly on postwar America’s high income
equality and labor union membership, its low rates of unemployment and rising
education levels, its high marginal tax rates and its high rates of social
mobility.
Neither
side embraces the whole package. No one today wants to go back to legally
mandated and violently enforced racial segregation. Very few Americans today
want to return to stigmatizing homosexuality.
But
some things have been lost. Books like libertarian Charles Murray’s Coming Apart or liberal Robert Putnam’s Our Kids, which lament the family
instability and economic stagnation of today’s downscale America, inspire a
nostalgia for a time widely seen as the American norm.
But was
it really the norm? Postwar America was the result of unique circumstances –
economic dominance when competitor nations were devastated, cultural uniformity
that followed from a universal popular culture and the common experience of
military service (16 million Americans served in the wartime military; the
proportional equivalent today would be 38 million).
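(A rough check of that arithmetic, using approximate census figures: 16 million out of a wartime population of about 132 million is roughly 12 percent, and 12 percent of today’s 320 million or so is about 38 million.)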
So let
me offer a different benchmark: the America of 1910 or some other year before
the outbreak of World War I in 1914.
I
started thinking about that on a recent weekend sightseeing tour of lower
Manhattan. It’s become a kind of outdoor museum, with few cars on the street
and with dozens of tourists eyeing the massive buildings – the columned stock exchange, JP Morgan’s austere headquarters, the massive Equitable Building and the 60-story Woolworth Building looming over lower Broadway – with their marble
gleaming as it must have when they were newly built 100 or so years ago.
The
America of 1910 was a lot more like today’s America than you might think. The
economy was growing, but fitfully. Disruptive technology was threatening old
industries, creating new jobs but eliminating many others.
Income
inequality was much greater than today, and living conditions more disparate.
Electricity was common in cities but unavailable on the farms where half of
Americans lived. John D. Rockefeller and Henry Ford were billionaires at a time
when average annual incomes were below $1,000.
It was
an America even more culturally divided than we are today. Within a mile or so
of Wall Street lived hundreds of thousands of Jewish and Italian immigrants in
the world’s most crowded neighborhoods. Immigration as a percentage of
pre-existing population between the opening of Ellis Island in 1892 and the
outbreak of World War I in 1914 was three times the level of 1982-2007.
The
South was in many ways a separate and underdeveloped country, still estranged
half a century after the Civil War, with income levels one-quarter those of New
York. Even as 30 million Europeans crossed the ocean to America, only 1 million
Southern whites and 1 million blacks moved North despite the promise of much
higher wages.
Marriage
rates were lower than in postwar America, and many young people dropped by the
wayside. Alcohol consumption was much higher than today; prostitution, female
and male, was common. People didn’t like to talk about these things, but you
get hints about them in the novels of Frank Norris and Theodore Dreiser.
The
Americans of 1910 faced terrorism and globalization, too. Anarchists murdered
President William McKinley in 1901 and set off a bomb that killed dozens next
to J.P. Morgan’s 23 Wall Street in 1920. This America was interlaced with the
global economy and, with its growing economic and demographic might, risked
being drawn into any world war.
So,
America in 1910, with nearly 100 million people, was in important ways less
like the postwar America of 150 million than like today’s America of 300
million. Studying how Americans handled – or mishandled – similar challenges
may prove more fruitful than yearning to restore the unique and non-replicable
America of Charles Murray’s, Robert Putnam’s and my youth.
Les Enfants Terribles of Barack Obama. By Hisham Melhem.
Les enfants terribles of Barack Obama. By Hisham Melhem. Al Arabiya English, May 7, 2016.
Melhem:
The world according to President Barack Obama described recently in the Atlantic Magazine, and the portrait of Ben Rhodes, deputy national security advisor for strategic communications, in the current issue of the New York Times Magazine, reveal an insular White House suspicious of the foreign policy establishment entrenched in Washington and New York, including senior members of Obama’s cabinet; contemptuous of traditional allies and friends in Europe and the Middle East; and disdainful of what they see as a gullible American media, whose narratives they became adept at manipulating and ventriloquizing.
Some of the president’s relatively young men, particularly Rhodes and Jon Favreau, a former speechwriter, are, like him, gifted wordsmiths who see the skillful use of “messaging” and the way the “narrative” is advanced as being as important as the content of the policy; at times the “narrative” supersedes everything else.
In Obama’s universe words sometimes are synonymous with policy and action. In
these two lengthy articles, Obama’s universe is cold, unsentimental,
calculating, deceitful, and its inhabitants are willing to live comfortably
with horrendous tragedies like Syria’s “where more than 450,000 people have
been slaughtered.” What is so egregious in these two lengthy articles is that
the President and his men did not even come close to questioning a single
decision or position they have taken in the Middle East in more than seven
years. There was no hint of an attempt at introspection or honest self-criticism;
only naked, unbridled arrogance and self-righteousness.
A portrait of the advisor as a young man
The
portrait of Ben Rhodes as “the single most influential voice shaping American
foreign policy aside from Potus (Obama) himself” is stunning. Rhodes channels
and mirrors the President. The two are inseparable. The braggart Rhodes boasts
“I don’t know anymore where I begin and Obama ends.” Rhodes and Denis
McDonough, White House Chief of Staff, and others who constitute Obama’s inner
circle of advisors are more powerful and influential than Secretaries of State
and Defense. Obama insists on controlling national security issues and foreign
policy from the White House. Former Secretary of Defense Chuck Hagel found that
out in a humiliating way when he was asked to step down because he was out of
step with the White House on Syria and ISIS, and because the inner circle never
warmed up to him. One of Obama’s most consequential and most controversial decisions was taken after he took a walk with Denis McDonough at the height of the Syrian crisis in the summer of 2013, when Obama decided to retreat from his announced decision to punish the Syrian regime after it used chemical weapons that killed 1,400 civilians.
It was after the walk that Obama called his Secretaries of State and Defense to inform them of his decision. Obama did not even bother to consult them first. When Obama began his secret contacts with Cuba, via the Vatican, he assigned that mission to Ben Rhodes, who began his contacts without the knowledge of Secretary of State John Kerry. Rhodes was tasked with selling the Iran deal to Congress and the American people. When Rhodes joined the Obama campaign in 2007
he was 30 years old, and he brought with him “a healthy contempt for the American
foreign-policy establishment, including editors and reporters at The New York
Times, The Washington Post, The New Yorker and elsewhere, who at first
applauded the Iraq war and then sought to pin all the blame on Bush and his
merry band of neocons when it quickly turned sour.” Rhodes derisively refers
“to the American foreign-policy establishment as the Blob. According to Rhodes,
the Blob includes Hillary Clinton, Robert Gates and other Iraq-war promoters
from both parties who now whine incessantly about the collapse of the American
security order in Europe and the Middle East”. One would suspect that Obama
shares this sentiment with his young guru.
Rhodes, the morally dubious and masterful manipulator, concocted a deceptive “narrative” about the evolution of the negotiations with Iran, and successfully sold it to the American media. This tale of the deal alleges that negotiations became possible in the wake of a new political reality in Iran following the elections that brought the moderates, including President Hassan Rouhani, to power. But that narrative “was largely manufactured.” When Obama claimed in 2015 that the deal was struck “after two years of negotiations,” he was technically correct but “actively misleading because the most meaningful part of the negotiations with Iran had begun in mid-2012, many months before Rouhani and the ‘moderate’ camp were chosen in an election among candidates handpicked by Iran’s supreme leader, the Ayatollah Ali Khamenei.” Obama’s advisors always understood that he had been eager for a deal with Iran since the beginning of his first term; “It’s the center of the arc,” Rhodes explained to the New York Times Magazine.
In describing how they manipulate the media, Rhodes and one of his aides drip with derision towards the reporters they spoon-feed the narratives and messaging they want. Rhodes’ in-your-face cynicism screams in the following passage: “Rhodes
singled out a key example to me one day, laced with the brutal contempt that is
a hallmark of his private utterances. ‘All these newspapers used to have
foreign bureaus,’ he said. ‘Now they don’t. They call us to explain to them
what’s happening in Moscow and Cairo. Most of the outlets are reporting on
world events from Washington. The average reporter we talk to is 27 years old,
and their only reporting experience consists of being around political
campaigns. That’s a sea change. They literally know nothing.’” This passage is
full of ironies. Rhodes was in his twenties when he was a congressional aide
writing reports, and he expressed his brutal contempt for reporters to a
journalist.
Obama’s bargain
President Obama and his men come across in the two articles as insurgents trying to disrupt the “Washington playbook” written by the despised “foreign-policy establishment” with its dangerous “credibility” fetish, which, according to Obama, tends by default to prescribe militarized options to settle international crises. After all, this is the President who was elected to end the “dumb” war in Iraq and terminate the longest war in America’s history, in Afghanistan; and who extended a hand to Iran’s clenched fist in his first inaugural speech. For all of Obama’s declarations and speeches about a “New Beginning” with the Muslim world, his intentions to settle the Arab-Israeli conflict in his first term, and his supposed sympathy with Arab and Iranian reformers, his central interest – bordering on obsession – was to strike a strategic bargain with Iran leading to a historic opening, hence his dogged determination to reach a nuclear deal with the Islamic Republic. To that end Obama and his men used subterfuge and misled the American people and their representatives about the negotiations; Obama betrayed his promises to the Syrian people when he refrained from seriously challenging Iran’s predations in Syria, fearing that such a posture could undermine the prized nuclear deal.
Now in the twilight of his presidency, with the nuclear deal with Iran behind him, Obama and his men feel liberated enough to voice their criticism of, and to express their disdain for, their traditional friends and partners in the Middle East, who are seen as “free riders” or as feeling entitled to unqualified American support. With the exception of Iran, and the imperatives of fighting al-Qaeda and the “Islamic State” (ISIS), Obama did not exhibit serious and sustained intellectual curiosity about the societies of the Middle East, or the kind of genuine sympathy with the plight of the numerous victims there that would require effective support. Reading Obama and his unscrupulous foreign policy guru Ben Rhodes, one could easily sense their disdain for things Middle Eastern, and their eagerness to abandon the region and never look back. As related by Ben Rhodes to the New York Times Magazine, the deal with
Iran “would create the space for America to disentangle itself from its
established system of alliances with countries like Saudi Arabia, Egypt, Israel
and Turkey. With one bold move, the administration would effectively begin the
process of a large-scale disengagement from the Middle East.”
Betraying Syria
Syria
hovered over the negotiations with Iran. Leon Panetta, who served as Obama’s head of the C.I.A. and later as Secretary of Defense, said that Obama was obsessed with avoiding a conflict with Iran, even at the expense of ignoring Syria’s tragedy: “If you ratchet up sanctions, it could cause a war. If you start opposing their interest in Syria, well, that could start a war, too.” When the author of the article asks Rhodes about the ability of White House officials “to get comfortable with tragedy” in reference to Syria, Rhodes’ answer is startling: “Yeah, I admit very much to that reality,” he says.
“There’s a numbing element to Syria in particular. But I will tell you this,”
he continues. “I profoundly do not believe that the United States could make
things better in Syria by being there. And we have an evidentiary record of
what happens when we’re there — nearly a decade in Iraq.”
When the author asks Rhodes why the Obama administration is “spending so much time and energy trying to strong-arm Syrian rebels into surrendering to the dictator who murdered their families, or why it is so important for Iran to maintain its supply lines to Hezbollah,” Rhodes mumbles something about John Kerry, and then says something to the effect that “the world of the Sunni Arabs that the American establishment built has collapsed. The buck stops with the establishment, not with Obama, who was left to clean up their mess.” This cowardly denial, and the claim that Obama is absolutely blameless in the slow death of Syria, is the most jarring element of the article.
The Obama administration’s claim that Syrian tyrant Assad and his cohorts should have no place in the new Syria that emerges after the negotiations rings hollow. When the author describes Rob Malley, Obama’s senior advisor on ISIS and Syria, as the official “currently running negotiations that could keep the Syrian dictator Bashar al-Assad in power,” the circle of deceit and the betrayal of Syria is complete. Les enfants terribles of Obama are like him: conceited, arrogant, contemptuous, and proud of it.
Sunday, May 15, 2016
American Capitalism’s Great Crisis. By Rana Foroohar.
American Capitalism’s Great Crisis. By Rana Foroohar. Time, May 23, 2016.
Foroohar:
How Wall Street is choking our economy and how to fix it
A
couple of weeks ago, a poll conducted by the Harvard Institute of Politics
found something startling: only 19% of Americans ages 18 to 29 identified themselves
as “capitalists.” In the richest and most market-oriented country in the world,
only 42% of that group said they “supported capitalism.” The numbers were
higher among older people; still, only 26% considered themselves capitalists. A
little over half supported the system as a whole.
This
represents more than just millennials not minding the label “socialist” or
disaffected middle-aged Americans tiring of an anemic recovery. This is a
majority of citizens being uncomfortable with the country’s economic
foundation—a system that over hundreds of years turned a fledgling society of
farmers and prospectors into the most prosperous nation in human history. To be
sure, polls measure feelings, not hard market data. But public sentiment
reflects day-to-day economic reality. And the data (more on that later) shows
Americans have plenty of concrete reasons to question their system.
This crisis of faith has had no more severe expression than the 2016 presidential campaign, which has turned on the questions of who, exactly, the system is working for and against, as well as why, eight years and several trillion dollars of stimulus on from the financial crisis, the economy is still growing so slowly. All the candidates have prescriptions: Sanders talks of breaking up
big banks; Trump says hedge funders should pay higher taxes; Clinton wants to
strengthen existing financial regulation. In Congress, Republican House Speaker
Paul Ryan remains committed to less regulation.
All of
them are missing the point. America’s economic problems go far beyond rich
bankers, too-big-to-fail financial institutions, hedge-fund billionaires,
offshore tax avoidance or any particular outrage of the moment. In fact, each
of these is symptomatic of a more nefarious condition that threatens, in equal
measure, the very well-off and the very poor, the red and the blue. The U.S.
system of market capitalism itself is broken. That problem, and what to do
about it, is at the center of my book Makers and Takers: The Rise of Finance and the Fall of American Business, a
three-year research and reporting effort from which this piece is adapted.
To
understand how we got here, you have to understand the relationship between
capital markets—meaning the financial system—and businesses. From the creation
of a unified national bond and banking system in the U.S. in the late 1790s to
the early 1970s, finance took individual and corporate savings and funneled
them into productive enterprises, creating new jobs, new wealth and,
ultimately, economic growth. Of course, there were plenty of blips along the
way (most memorably the speculation leading up to the Great Depression, which
was later curbed by regulation). But for the most part, finance—which today
includes everything from banks and hedge funds to mutual funds, insurance
firms, trading houses and such—essentially served business. It was a vital
organ but not, for the most part, the central one.
Over
the past few decades, finance has turned away from this traditional role.
Academic research shows that only a fraction of all the money washing around
the financial markets these days actually makes it to Main Street businesses.
“The intermediation of household savings for productive investment in the
business sector—the textbook description of the financial sector—constitutes
only a minor share of the business of banking today,” according to academics
Oscar Jorda, Alan Taylor and Moritz Schularick, who’ve studied the issue in
detail. By their estimates and others’, around 15% of capital coming from financial institutions today is used to fund business investments, whereas such funding would have been the majority of what banks did earlier in the 20th century.
“The
trend varies slightly country by country, but the broad direction is clear,”
says Adair Turner, a former British banking regulator and now chairman of the
Institute for New Economic Thinking, a think tank backed by George Soros, among
others. “Across all advanced economies, and the United States and the U.K. in
particular, the role of the capital markets and the banking sector in funding
new investment is decreasing.” Most of the money in the system is being used
for lending against existing assets such as housing, stocks and bonds.
To get
a sense of the size of this shift, consider that the financial sector now represents
around 7% of the U.S. economy, up from about 4% in 1980. Despite currently
taking around 25% of all corporate profits, it creates a mere 4% of all jobs.
Trouble is, research by numerous academics as well as institutions like the
Bank for International Settlements and the International Monetary Fund shows
that when finance gets that big, it starts to suck the economic air out of the
room. In fact, finance starts having this adverse effect when it’s only half
the size that it currently is in the U.S. Thanks to these changes, our economy
is gradually becoming “a zero-sum game between financial wealth holders and the
rest of America,” says former Goldman Sachs banker Wallace Turbeville, who runs
a multiyear project on the rise of finance at the New York City-based nonprofit
Demos.
It’s
not just an American problem, either. Most of the world’s leading market
economies are grappling with aspects of the same disease. Globally, free-market
capitalism is coming under fire, as countries across Europe question its merits
and emerging markets like Brazil, China and Singapore run their own forms of
state-directed capitalism. An ideologically broad range of financiers and elite
business managers—Warren Buffett, BlackRock’s Larry Fink, Vanguard’s John
Bogle, McKinsey’s Dominic Barton, Allianz’s Mohamed El-Erian and others—have
started to speak out publicly about the need for a new and more inclusive type
of capitalism, one that also helps businesses make better long-term decisions
rather than focusing only on the next quarter. The Pope has become a vocal
critic of modern market capitalism, lambasting the “idolatry of money and the
dictatorship of an impersonal economy” in which “man is reduced to one of his
needs alone: consumption.”
During
my 23 years in business and economic journalism, I’ve long wondered why our
market system doesn’t serve companies, workers and consumers better than it
does. For some time now, finance has been thought by most to be at the very top
of the economic hierarchy, the most aspirational part of an advanced service
economy that graduated from agriculture and manufacturing. But research shows
just how the unintended consequences of this misguided belief have endangered
the very system America has prided itself on exporting around the world.
America’s
economic illness has a name: financialization. It’s an academic term for the
trend by which Wall Street and its methods have come to reign supreme in
America, permeating not just the financial industry but also much of American
business. It includes everything from the growth in size and scope of finance
and financial activity in the economy; to the rise of debt-fueled speculation
over productive lending; to the ascendancy of shareholder value as the sole
model for corporate governance; to the proliferation of risky, selfish thinking
in both the private and public sectors; to the increasing political power of
financiers and the CEOs they enrich; to the way in which a “markets know best”
ideology remains the status quo. Financialization is a big, unfriendly word
with broad, disconcerting implications.
University
of Michigan professor Gerald Davis, one of the pre-eminent scholars of the
trend, likens financialization to a “Copernican revolution” in which business
has reoriented its orbit around the financial sector. This revolution is often
blamed on bankers. But it was facilitated by shifts in public policy, from both
sides of the aisle, and crafted by the government leaders, policymakers and
regulators entrusted with keeping markets operating smoothly. Greta Krippner,
another University of Michigan scholar, who has written one of the most
comprehensive books on financialization, believes this was the case when
financialization began its fastest growth, in the decades from the late 1970s
onward. According to Krippner, that shift encompasses Reagan-era deregulation,
the unleashing of Wall Street and the rise of the so-called ownership society
that promoted owning property and further tied individual health care and
retirement to the stock market.
The
changes were driven by the fact that in the 1970s, the growth that America had
enjoyed following World War II began to slow. Rather than make tough decisions
about how to bolster it (which would inevitably mean choosing among various
interest groups), politicians decided to pass that responsibility to the
financial markets. Little by little, the Depression-era regulation that had
served America so well was rolled back, and finance grew to become the dominant
force that it is today. The shifts were bipartisan, and to be fair they often
seemed like good ideas at the time; but they also came with unintended
consequences. The Carter-era deregulation of interest rates—something that was,
in an echo of today’s overlapping left- and right-wing populism, supported by an
assortment of odd political bedfellows from Ralph Nader to Walter Wriston, then
head of Citibank—opened the door to a spate of financial “innovations” and a
shift in bank function from lending to trading. Reaganomics famously led to a
number of other economic policies that favored Wall Street. Clinton-era
deregulation, which seemed a path out of the economic doldrums of the late
1980s, continued the trend. Loose monetary policy from the Alan Greenspan era
onward created an environment in which easy money papered over underlying
problems in the economy, so much so that it is now chronically dependent on
near-zero interest rates to keep from falling back into recession.
This
sickness, not so much the product of venal interests as of a complex and
long-term web of changes in government and private industry, now manifests
itself in myriad ways: a housing market that is bifurcated and dependent on
government life support, a retirement system that has left millions insecure in
their old age, a tax code that favors debt over equity. Debt is the lifeblood
of finance; with the rise of the securities-and-trading portion of the industry
came a rise in debt of all kinds, public and private. That’s bad news, since a
wide range of academic research shows that rising debt and credit levels stoke
financial instability. And yet, as finance has captured a greater and greater
piece of the national pie, it has, perversely, all but ensured that debt is
indispensable to maintaining any growth at all in an advanced economy like the
U.S., where 70% of output is consumer spending. Debt-fueled finance has become
a saccharine substitute for the real thing, an addiction that just gets worse.
(The amount of credit offered to American consumers has doubled in real dollars
since the 1980s, as have the fees they pay to their banks.)
As the
economist Raghuram Rajan, one of the most prescient seers of the 2008 financial
crisis, argues, credit has become a palliative to address the deeper anxieties
of downward mobility in the middle class. In his words, “let them eat credit”
could well summarize the mantra of the go-go years before the economic
meltdown. And things have only deteriorated since, with global debt levels $57
trillion higher than they were in 2007.
The
rise of finance has also distorted local economies. It’s the reason rents are
rising in some communities where unemployment is still high. America’s housing
market now favors cash buyers, since banks are still more interested in making profits by trading than in the traditional business of lending out our savings to people and businesses looking to make long-term investments (like buying a house), which helps ensure that younger people can’t get on the housing ladder. One perverse result: Blackstone, a private-equity firm, is currently the largest single-family-home landlord in America, since it had the money to buy up properties cheaply in bulk following the financial crisis. It’s at the heart
of retirement insecurity, since fees from actively managed mutual funds “are
likely to confiscate as much as 65% or more of the wealth that … investors
could otherwise easily earn,” as Vanguard founder Bogle testified to Congress
in 2014.
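To see how fees can compound to that scale, consider a rough illustration (the assumed rates are illustrative, not Bogle’s): if a 7 percent gross annual return is reduced to 5 percent by roughly two points of all-in costs, then over a 60-year investing lifetime an investor keeps 1.05^60 / 1.07^60, or about 32 percent, of the wealth the market delivered, meaning costs consume roughly two-thirds of the final sum.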
It’s
even the reason companies in industries from autos to airlines are trying to
move into the business of finance themselves. American companies across every
sector today earn five times the revenue from financial activities—investing,
hedging, tax optimizing and offering financial services, for example—that they
did before 1980. Traditional hedging by energy and transport firms, for
example, has been overtaken by profit-boosting speculation in oil futures, a
shift that actually undermines their core business by creating more price
volatility. Big tech companies have begun underwriting corporate bonds the way
Goldman Sachs does. And top M.B.A. programs would likely encourage them to do
just that; finance has become the center of all business education.
Washington,
too, is so deeply tied to the ambassadors of the capital markets—six of the 10
biggest individual political donors this year are hedge-fund barons—that even
well-meaning politicians and regulators don’t see how deep the problems are.
When I asked one former high-level Obama Administration Treasury official back
in 2013 why more stakeholders aside from bankers hadn’t been consulted about
crafting the particulars of Dodd-Frank financial reform (93% of the consultation on the Volcker Rule, for example, was conducted with the financial industry itself),
he said, “Who else should we have talked to?” The answer—to anybody not
profoundly influenced by the way finance thinks—might have been the people
banks are supposed to lend to, or the scholars who study the capital markets,
or the civic leaders in communities decimated by the financial crisis.
Of
course, there are other elements to the story of America’s slow-growth economy,
including familiar trends from globalization to technology-related job
destruction. These are clearly massive challenges in their own right. But the
single biggest unexplored reason for long-term slower growth is that the
financial system has stopped serving the real economy and now serves mainly
itself. A lack of real fiscal action on the part of politicians forced the Fed
to pump $4.5 trillion in monetary stimulus into the economy after 2008. This
shows just how broken the model is, since the central bank’s best efforts have
resulted in record stock prices (which enrich mainly the wealthiest 10% of the
population that owns more than 80% of all stocks) but also a lackluster 2%
economy with almost no income growth.
Now, as
many top economists and investors predict an era of much lower asset-price
returns over the next 30 years, America’s ability to offer up even the
appearance of growth—via financially oriented strategies like low interest
rates, more and more consumer credit, tax-deferred debt financing for
businesses, and asset bubbles that make people feel richer than they really are,
until they burst—is at an end.
This
pinch is particularly evident in the tumult many American businesses face.
Lending to small business has fallen particularly sharply, as has the number of
startup firms. In the early 1980s, new companies made up half of all U.S.
businesses. For all the talk of Silicon Valley startups, the number of new
firms as a share of all businesses has actually shrunk. From 1978 to 2012 it
declined by 44%, a trend that numerous researchers and even many investors and
businesspeople link to the financial industry’s change in focus from lending to
speculation. The waning of entrepreneurship means less economic vibrancy, given
that new businesses are the nation’s foremost source of job creation and GDP
growth. Buffett summed it up in his folksy way: “You’ve now got a body of
people who’ve decided they’d rather go to the casino than the restaurant” of
capitalism.
In
lobbying for short-term share-boosting management, finance is also largely
responsible for the drastic cutback in research-and-development outlays in
corporate America, investments that are seed corn for future prosperity. Take
share buybacks, in which a company—usually with some fanfare—goes to the stock
market to purchase its own shares, usually at the top of the market, and often
as a way of artificially bolstering share prices in order to enrich investors
and executives paid largely in stock options. Indeed, if you were to chart the
rise in money spent on share buybacks and the fall in corporate spending on
productive investments like R&D, the two lines make a perfect X. The former
has been going up since the 1980s, with S&P 500 firms now spending $1
trillion a year on buybacks and dividends—equal to about 95% of their net
earnings—rather than investing that money back into research, product
development or anything that could contribute to long-term company growth. No
sector has been immune, not even the ones we think of as the most innovative.
Many tech firms, for example, spend far more on share-price boosting than on
R&D as a whole. The markets penalize them when they don’t. One case in
point: back in March 2006, Microsoft announced major new technology
investments, and its stock fell for two months. But in July of that same year,
it embarked on $20 billion worth of stock buying, and the share price promptly
rose by 7%. This kind of twisted incentive for CEOs and corporate officers has
only grown since.
As a
result, business dynamism, which is at the root of economic growth, has
suffered. The number of new initial public offerings (IPOs) is about a third of
what it was 20 years ago. True, the dollar value of IPOs in 2014 was $74.4
billion, up from $47.1 billion in 1996. (The median IPO rose to $96 million
from $30 million during the same period.) This may show investors want to make
only the surest of bets, which is not necessarily the sign of a vibrant market.
But there’s another, more disturbing reason: firms simply don’t want to go
public, lest their work become dominated by playing by Wall Street’s rules
rather than creating real value.
An
IPO—a mechanism that once meant raising capital to fund new investment—is
likely today to mark not the beginning of a new company’s greatness, but the
end of it. According to a Stanford University study, innovation tails off by
40% at tech companies after they go public, often because of Wall Street
pressure to keep jacking up the stock price, even if it means curbing the
entrepreneurial verve that made the company hot in the first place.
A flat
stock price can spell doom. It can get CEOs canned and turn companies into
acquisition fodder, which often saps once innovative firms. Little wonder,
then, that business optimism, as well as business creation, is lower than it
was 30 years ago, or that wages are flat and inequality growing. Executives who
receive as much as 82% of their compensation in stock naturally make
shorter-term business decisions that might undermine growth in their companies
even as they raise the value of their own options.
It’s no
accident that corporate stock buybacks, corporate pay and the wealth gap have
risen concurrently over the past four decades. There are any number of studies
that illustrate this type of intersection between financialization and
inequality. One of the most striking was by economists James Galbraith and
Travis Hale, who showed how during the late 1990s, changing income inequality
tracked the go-go Nasdaq stock index to a remarkable degree.
Recently,
this pattern has become evident at a number of well-known U.S. companies. Take
Apple, one of the most successful companies of the past 50 years. Apple has around $200 billion sitting in the bank, yet it has borrowed billions of dollars cheaply over the past several years, thanks to superlow interest rates (themselves a response to the financial crisis), to fund payouts to investors in order to bolster its share price. Why borrow? In part because it’s cheaper than repatriating cash and paying U.S. taxes. All the financial engineering helped boost the
California firm’s share price for a while. But it didn’t stop activist investor
Carl Icahn, who had manically advocated for borrowing and buybacks, from
dumping the stock the minute revenue growth took a turn for the worse in late
April.
It is
perhaps the ultimate irony that large, rich companies like Apple are most
involved with financial markets at times when they don’t need any financing.
Top-tier U.S. businesses have never enjoyed greater financial resources. They
have a record $2 trillion in cash on their balance sheets—enough money combined
to make them the 10th largest economy in the world. Yet in the bizarre order
that finance has created, they are also taking on record amounts of debt to buy
back their own stock, creating what may be the next debt bubble to burst.
You and
I, whether we recognize it or not, are also part of a dysfunctional ecosystem
that fuels short-term thinking in business. The people who manage our
retirement money—fund managers working for asset-management firms—are typically
compensated for delivering returns over a year or less. That means they use their
financial clout (which is really our financial clout in aggregate) to push
companies to produce quick-hit results rather than execute long-term
strategies. Sometimes pension funds even invest with the activists who are
buying up the companies we might work for—and those same activists look for
quick cost cuts and potentially demand layoffs.
It’s a
depressing state of affairs, no doubt. Yet America faces an opportunity right
now: a rare second chance to do the work of refocusing and right-sizing the
financial sector that should have been done in the years immediately following
the 2008 crisis. And there are bright spots on the horizon.
Despite
the lobbying power of the financial industry and the vested interests both in
Washington and on Wall Street, there’s a growing push to put the financial
system back in its rightful place, as a servant of business rather than its
master. Surveys show that the majority of Americans would like to see the tax
system reformed and the government take more direct action on job creation and
poverty reduction, and address inequality in a meaningful way. Each candidate
is crafting a message around this, which will keep the issue front and center
through November.
The
American public understands just how deeply and profoundly the economic order
isn’t working for the majority of people. The key to reforming the U.S. system
is comprehending why it isn’t working.
Remooring
finance in the real economy isn’t as simple as splitting up the biggest banks
(although that would be a good start). It’s about dismantling the hold of
finance-oriented thinking in every corner of corporate America. It’s about
reforming business education, which is still permeated with academics who
resist challenges to the gospel of efficient markets in the same way that
medieval clergy dismissed scientific evidence that might challenge the
existence of God. It’s about changing a tax system that treats one-year
investment gains the same as longer-term ones, and induces financial
institutions to push overconsumption and speculation rather than healthy
lending to small businesses and job creators. It’s about rethinking retirement,
crafting smarter housing policy and restraining a money culture filled with
lobbyists who violate America’s essential economic principles.
It’s
also about starting a bigger conversation about all this, with a broader group
of stakeholders. The structure of American capital markets and whether or not
they are serving business is a topic that has traditionally been the sole
domain of “experts”—the financiers and policymakers who often have a
self-interested perspective to push, and who do so in complicated language that
keeps outsiders out of the debate. When it comes to finance, as with so many
issues in a democratic society, complexity breeds exclusion.
Finding
solutions won’t be easy. There are no silver bullets, and nobody really knows
the perfect model for a high-functioning, advanced market system in the 21st
century. But capitalism’s legacy is too long, and the well-being of too many
people is at stake, to do nothing in the face of our broken status quo. Neatly
packaged technocratic tweaks cannot fix it. What is required now is lifesaving
intervention.
Crises
of faith like the one American capitalism is currently suffering can be a good
thing if they lead to re-examination and reaffirmation of first principles. The
right question here is in fact the simplest one: Are financial institutions
doing things that provide a clear, measurable benefit to the real economy?
Sadly, the answer at the moment is mostly no. But we can change things. Our
system of market capitalism wasn’t handed down, in perfect form, on stone tablets.
We wrote the rules. We broke them. And we can fix them.
Rana Foroohar is an assistant managing
editor at TIME and the magazine’s economics columnist. She’s the author of Makers and Takers: The Rise of Finance and the Fall of American Business.
This appears in the May 23, 2016 issue of
TIME.
American “Success” In the Middle East Has Only Created More Problems. By Fareed Zakaria.
The U.S.’s “success” in the Middle East has only created more problems. By Fareed Zakaria. Washington Post, May 12, 2016.
Zakaria:
Iraq is collapsing as a country. This week’s bombings in Baghdad, which killed more than 90 people, are just further reminders that the place remains deeply unstable and violent. There is a lesson to be drawn from this, one that many powerful people in Washington are still resisting.
As Iraq
has spiraled downward, policymakers have been quick to provide advice.
Perennial hawks such as Sen. John McCain (R-Ariz.) have argued that if only the
Obama administration would send more troops to the region, it would be more stable. Others say we need more
diplomats and political advisers who can buttress military efforts. Still
others tell us to focus on Iraqi leaders and get them to be more inclusive.
Perhaps
it is worth stepping back from Iraq and looking at another country where the
United States has been involved. The United States has been engaged in
Afghanistan militarily, politically and economically for 15 years. It has had
many “surges” of troops. It has spent more than $1 trillion on the war, by some estimates, and still pays a large portion of Afghanistan’s defense budget.
Afghanistan has an elected government of national unity.
And
yet, in October, the United
Nations concluded that the insurgency had spread to more places in the
country than at any point since 2001. Danielle Moylan reported in the New York Times
that the Taliban now controls or contests all but three districts in Helmand
province. She said that 36,000 police officers — almost a quarter of the force
— are believed to have deserted the ranks last year. And last month, the
Taliban penetrated Kabul itself, attacking a building run by the National
Directorate of Security, which is responsible for much of the security in the
capital, as the New Yorker’s Dexter Filkins has reported.
Some
argue that 15 years is not enough. They point to South Korea and Germany and
say that the United States should simply stay unendingly. I am not opposed to a
longer-term U.S. presence in Afghanistan, especially because the country’s
elected government seems to want it. But the analogy is misplaced. In Germany
and South Korea, U.S. forces remained to deter a foreign threat. They were not
engaged in a never-ending battle within the country to help the government gain
control over its own people. The more appropriate analogue is Vietnam.
Much
has been made recently of a pair of interviews on U.S. foreign policy, one with
President Obama, the other with one of his closest aides, Ben Rhodes. Both men have been described as arrogant, self-serving and brimming
with contempt for the foreign policy establishment. Certainly, as most
administrations would, Obama and Rhodes sought to present their actions in a
positive light. So Obama congratulates himself for stepping back from the edge
of military intervention in Syria. He never grapples with the fact that his own
careless rhetoric — about Bashar al-Assad’s fate and “red lines” — pushed Washington
to the edge in the first place.
But on
the most important issue of substance, Obama is right and his critics are
wrong. The chief lesson for U.S. foreign policy from the past 15 years is that
it is much easier to defeat a military opponent in the greater Middle East than
to establish political order in these troubled lands.
The
mantra persists in Washington that Obama has “overlearned” the lessons of Iraq.
But the lessons come not just from Iraq. In Iraq, Afghanistan and Libya, it
took weeks to defeat the old regime. Years later, despite different approaches,
all of these countries remain in chaos. Can anyone seriously argue that a few
more troops, or a slightly different strategy, would have created stability and
peace?
The Obama administration’s policy is to battle the Islamic State while steering clear of anything that would lead it to occupy and control lands in the
region. I worry that the United States is veering toward too much involvement,
which will leave Washington holding the bag, but I understand the balance the
administration is trying to strike.
In
Syria, Washington’s real dilemma would arise if the effort worked and the Islamic
State were defeated. This would result in a collapse of authority in large
swaths of Iraq and Syria that are teeming with radicalized Sunnis who refuse to
accept the authority of Baghdad or Damascus. Having led the fight, Washington
would be forced to assert control over the territory, set up prisons to house
thousands of Islamic State fighters, and provide security and economic
assistance for the population while fighting the inevitable insurgency.
You
know you’re in trouble when success produces more problems than failure.
Wednesday, May 11, 2016
Ice Age Europeans Had Some Serious Drama Going On, According to Their Genomes. By Sarah Kaplan.
Ice Age Europeans had some serious drama going on, according to their genomes. By Sarah Kaplan. Washington Post, May 5, 2016.
Game of bones: first Europeans’ shifting fortunes found in DNA. By Colin Barras. New Scientist, May 2, 2016.
Genetic analysis of Ice Age Europeans. Phys.org, May 2, 2016.
The genetic history of Ice Age Europe. By Qiaomei Fu et al. Nature, published online, May 2, 2016.
Nature abstract:
Modern humans arrived in Europe ~45,000 years ago, but little is known about their genetic composition before the start of farming ~8,500 years ago. Here we analyse genome-wide data from 51 Eurasians from ~45,000–7,000 years ago. Over this time, the proportion of Neanderthal DNA decreased from 3–6% to around 2%, consistent with natural selection against Neanderthal variants in modern humans. Whereas there is no evidence of the earliest modern humans in Europe contributing to the genetic composition of present-day Europeans, all individuals between ~37,000 and ~14,000 years ago descended from a single founder population which forms part of the ancestry of present-day Europeans. An ~35,000-year-old individual from northwest Europe represents an early branch of this founder population which was then displaced across a broad region, before reappearing in southwest Europe at the height of the last Ice Age ~19,000 years ago. During the major warming period after ~14,000 years ago, a genetic component related to present-day Near Easterners became widespread in Europe. These results document how population turnover and migration have been recurring themes of European prehistory.
Kaplan:
The entire drama of human history is encoded in our DNA.
Where
we went. Who we slept with. How we died — or almost did. It's basically a
scientific soap opera, complete with occasional discoveries of long-lost
cousins we never knew we had.
Take
Ice Age Europe, for example. A new study of genetic material from the period reveals a continent roiling with
change.
First,
an upstart band of modern humans arrived, slowly pushing their ancient
predecessors out of existence. But soon that new lineage was swept aside by a
group of big-game hunters. For the next 15,000 years, the older community lay
in wait in a remote corner of the continent before bursting back onto the
scene. The usurpers were overturned, and history barreled forward. And all of
this happened against a backdrop of dramatic environmental change — waves of
cold and heat that sent glaciers surging back and forth across the continent.
“The
demographic history of early European populations was much more dynamic than
previously thought,” Cosimo Posth, a PhD student in archaeogenetics at the
University of Tübingen in Germany and a co-author of the study, told the New Scientist.
Posth
was just one of some six dozen researchers on four different continents who
teamed up for the survey, which was published this week in Nature. The result
of their efforts is the most comprehensive account of Europe's Ice Age
population changes yet, and it's told entirely through ancient DNA.
But
before researchers could start analyzing that genetic material, they had to get
it. DNA degrades over time, so extracting it from ancient human remains is
difficult and costly.
Much of
that delicate work was done by Qiaomei Fu, the lead
author of the paper and a genetics researcher at Harvard and the Chinese Academy
of Sciences in Beijing. She had to make sure that each genome was
uncontaminated by material picked up from microbes or present-day humans.
Over
and over again, she screened the samples, which came from long-buried remains
spanning nearly 40,000 years of history.
“It’s a
great privilege to be able to work on these samples,” David Reich, the head of
the Harvard Genetics Lab where Fu did some of her work, said in a news release. “It’s like being an art historian given full
access to the treasures of the Louvre.”
In the
end, they had data from 51 individuals — a more than tenfold increase over the measly four that once gave researchers their only glimpses into this period.
“Trying
to represent this vast period of European history with just four samples is
like trying to summarize a movie with four still images,” Reich said. “With
51 samples, everything changes; we can follow the narrative arc; we get a vivid
sense of the dynamic changes over time.”
One of
the oldest genomes studied came from a thigh bone discovered in Goyet Cave in
Belgium and given the unwieldy name GoyetQ116-1. Radiocarbon dating pegs the
Goyet individual at some 35,000 years old, making him a likely member of the Aurignacian culture.
These stone toolmakers produced the oldest known example of human figurative
art — a 40,000-year-old figurine called the “Venus of Hohle Fels”
— as well as countless cave paintings.
Goyet
guy’s DNA is also strikingly similar to many modern Europeans’. Does this mean
that his family were the final colonizers of the continent?
Not quite. Around 1,000 years after the Goyet individual lived, a new culture swept through Europe: the Gravettians. Analysis of genetic material from the time shows that art and artifacts weren’t the only things changing. The Gravettians’ DNA was significantly different from that of their Aurignacian predecessors, suggesting that they were a completely separate lineage.
Goyet
guy’s descendants retreated to the Iberian Peninsula (modern day Spain and
Portugal) and waited for their time to come again.
It did,
some 15,000 years later. Probably spurred by climate changes as glaciers began
to recede, this dormant lineage expanded back into the rest of Europe, bearing
a new culture known as the Magdalenian. Not long after that, their genomes started
to look like those of people from the Middle East and the Caucasus, suggesting
that new arrivals from the southeast were mingling with — and in some cases
supplanting — the existing population.
Image: Impression of one of the Ice Age modern humans analyzed in this study, drawn by Stefano Ricci, who is both a professional graphic artist and an author. Credit: Stefano Ricci.
This
was a surprise, because researchers used to think that transition happened much
later, when Turkish farmers introduced agriculture to Europe some 8,500 years ago.
“It is
amazing how ancient DNA now starts to provide us with a detailed account of the
earliest history of present-day Europeans,” Max Planck Institute anthropologist
Svante Pääbo,
another author of the study, said in a news release.
But
like any good soap opera, this one is about disaster as much as it’s about
success. The genetic analysis allowed researchers to trace the inexorable
decline of Neanderthal DNA, which was two to three times more prominent in
early human genomes than it is in modern-day ones. This supports theories that early humans interbred with Neanderthals, but that Neanderthal DNA was toxic to us and was gradually weeded out by natural selection over the course of millennia.
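(A back-of-envelope illustration, not from the study itself: halving Neanderthal ancestry, say from 4 percent to 2 percent, over roughly 45,000 years means about 1,500 generations at 29 years each, and solving (1 − s)^1500 = 0.5 gives s ≈ ln 2 / 1500 ≈ 0.0005, an average disadvantage of only about 0.05 percent per generation against Neanderthal variants.)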
For
those among us who still carry fractions of Neanderthal DNA, that process is probably still happening, Pääbo said. The drama isn't over yet.