Sunday, May 15, 2016

American Capitalism’s Great Crisis. By Rana Foroohar.

American Capitalism’s Great Crisis. By Rana Foroohar. Time, May 12, 2016.

Foroohar:

How Wall Street is choking our economy and how to fix it

A couple of weeks ago, a poll conducted by the Harvard Institute of Politics found something startling: only 19% of Americans ages 18 to 29 identified themselves as “capitalists.” In the richest and most market-oriented country in the world, only 42% of that group said they “supported capitalism.” The numbers were higher among older people; still, only 26% considered themselves capitalists. A little over half supported the system as a whole.

This represents more than just millennials not minding the label “socialist” or disaffected middle-aged Americans tiring of an anemic recovery. This is a majority of citizens being uncomfortable with the country’s economic foundation—a system that over hundreds of years turned a fledgling society of farmers and prospectors into the most prosperous nation in human history. To be sure, polls measure feelings, not hard market data. But public sentiment reflects day-to-day economic reality. And the data (more on that later) shows Americans have plenty of concrete reasons to question their system.

This crisis of faith has had no more severe expression than the 2016 presidential campaign, which has turned on the questions of who, exactly, the system is working for and against, as well as why, eight years and several trillion dollars of stimulus after the financial crisis, the economy is still growing so slowly. All the candidates have prescriptions: Sanders talks of breaking up big banks; Trump says hedge funders should pay higher taxes; Clinton wants to strengthen existing financial regulation. In Congress, Republican House Speaker Paul Ryan remains committed to less regulation.

All of them are missing the point. America’s economic problems go far beyond rich bankers, too-big-to-fail financial institutions, hedge-fund billionaires, offshore tax avoidance or any particular outrage of the moment. In fact, each of these is symptomatic of a more nefarious condition that threatens, in equal measure, the very well-off and the very poor, the red and the blue. The U.S. system of market capitalism itself is broken. That problem, and what to do about it, is at the center of my book Makers and Takers: The Rise of Finance and the Fall of American Business, a three-year research and reporting effort from which this piece is adapted.

To understand how we got here, you have to understand the relationship between capital markets—meaning the financial system—and businesses. From the creation of a unified national bond and banking system in the U.S. in the early 1790s to the early 1970s, finance took individual and corporate savings and funneled them into productive enterprises, creating new jobs, new wealth and, ultimately, economic growth. Of course, there were plenty of blips along the way (most memorably the speculation leading up to the Great Depression, which was later curbed by regulation). But for the most part, finance—which today includes everything from banks and hedge funds to mutual funds, insurance firms, trading houses and such—essentially served business. It was a vital organ but not, for the most part, the central one.

Over the past few decades, finance has turned away from this traditional role. Academic research shows that only a fraction of all the money washing around the financial markets these days actually makes it to Main Street businesses. “The intermediation of household savings for productive investment in the business sector—the textbook description of the financial sector—constitutes only a minor share of the business of banking today,” according to academics Oscar Jorda, Alan Taylor and Moritz Schularick, who’ve studied the issue in detail. By their estimates and others’, around 15% of capital coming from financial institutions today is used to fund business investments, whereas such funding made up the majority of what banks did earlier in the 20th century.

“The trend varies slightly country by country, but the broad direction is clear,” says Adair Turner, a former British banking regulator and now chairman of the Institute for New Economic Thinking, a think tank backed by George Soros, among others. “Across all advanced economies, and the United States and the U.K. in particular, the role of the capital markets and the banking sector in funding new investment is decreasing.” Most of the money in the system is being used for lending against existing assets such as housing, stocks and bonds.

To get a sense of the size of this shift, consider that the financial sector now represents around 7% of the U.S. economy, up from about 4% in 1980. Despite currently taking around 25% of all corporate profits, it creates a mere 4% of all jobs. Trouble is, research by numerous academics as well as institutions like the Bank for International Settlements and the International Monetary Fund shows that when finance gets that big, it starts to suck the economic air out of the room. In fact, finance starts having this adverse effect when it’s only half the size that it currently is in the U.S. Thanks to these changes, our economy is gradually becoming “a zero-sum game between financial wealth holders and the rest of America,” says former Goldman Sachs banker Wallace Turbeville, who runs a multiyear project on the rise of finance at the New York City-based nonprofit Demos.

It’s not just an American problem, either. Most of the world’s leading market economies are grappling with aspects of the same disease. Globally, free-market capitalism is coming under fire, as countries across Europe question its merits and emerging markets like Brazil, China and Singapore run their own forms of state-directed capitalism. An ideologically broad range of financiers and elite business managers—Warren Buffett, BlackRock’s Larry Fink, Vanguard’s John Bogle, McKinsey’s Dominic Barton, Allianz’s Mohamed El-Erian and others—have started to speak out publicly about the need for a new and more inclusive type of capitalism, one that also helps businesses make better long-term decisions rather than focusing only on the next quarter. The Pope has become a vocal critic of modern market capitalism, lambasting the “idolatry of money and the dictatorship of an impersonal economy” in which “man is reduced to one of his needs alone: consumption.”

During my 23 years in business and economic journalism, I’ve long wondered why our market system doesn’t serve companies, workers and consumers better than it does. For some time now, finance has been thought by most to be at the very top of the economic hierarchy, the most aspirational part of an advanced service economy that graduated from agriculture and manufacturing. But research shows just how the unintended consequences of this misguided belief have endangered the very system America has prided itself on exporting around the world.

America’s economic illness has a name: financialization. It’s an academic term for the trend by which Wall Street and its methods have come to reign supreme in America, permeating not just the financial industry but also much of American business. It includes everything from the growth in size and scope of finance and financial activity in the economy; to the rise of debt-fueled speculation over productive lending; to the ascendancy of shareholder value as the sole model for corporate governance; to the proliferation of risky, selfish thinking in both the private and public sectors; to the increasing political power of financiers and the CEOs they enrich; to the way in which a “markets know best” ideology remains the status quo. Financialization is a big, unfriendly word with broad, disconcerting implications.

University of Michigan professor Gerald Davis, one of the pre-eminent scholars of the trend, likens financialization to a “Copernican revolution” in which business has reoriented its orbit around the financial sector. This revolution is often blamed on bankers. But it was facilitated by shifts in public policy, from both sides of the aisle, and crafted by the government leaders, policymakers and regulators entrusted with keeping markets operating smoothly. Greta Krippner, another University of Michigan scholar, who has written one of the most comprehensive books on financialization, believes this was the case when financialization began its fastest growth, in the decades from the late 1970s onward. According to Krippner, that shift encompasses Reagan-era deregulation, the unleashing of Wall Street and the rise of the so-called ownership society that promoted owning property and further tied individual health care and retirement to the stock market.

The changes were driven by the fact that in the 1970s, the growth that America had enjoyed following World War II began to slow. Rather than make tough decisions about how to bolster it (which would inevitably mean choosing among various interest groups), politicians decided to pass that responsibility to the financial markets. Little by little, the Depression-era regulation that had served America so well was rolled back, and finance grew to become the dominant force that it is today. The shifts were bipartisan, and to be fair they often seemed like good ideas at the time; but they also came with unintended consequences. The Carter-era deregulation of interest rates—something that was, in an echo of today’s overlapping left- and right-wing populism, supported by an assortment of odd political bedfellows from Ralph Nader to Walter Wriston, then head of Citibank—opened the door to a spate of financial “innovations” and a shift in bank function from lending to trading. Reaganomics famously led to a number of other economic policies that favored Wall Street. Clinton-era deregulation, which seemed a path out of the economic doldrums of the late 1980s, continued the trend. Loose monetary policy from the Alan Greenspan era onward created an environment in which easy money papered over underlying problems in the economy, so much so that it is now chronically dependent on near-zero interest rates to keep from falling back into recession.

This sickness, not so much the product of venal interests as of a complex and long-term web of changes in government and private industry, now manifests itself in myriad ways: a housing market that is bifurcated and dependent on government life support, a retirement system that has left millions insecure in their old age, a tax code that favors debt over equity. Debt is the lifeblood of finance; with the rise of the securities-and-trading portion of the industry came a rise in debt of all kinds, public and private. That’s bad news, since a wide range of academic research shows that rising debt and credit levels stoke financial instability. And yet, as finance has captured a greater and greater piece of the national pie, it has, perversely, all but ensured that debt is indispensable to maintaining any growth at all in an advanced economy like the U.S., where 70% of output is consumer spending. Debt-fueled finance has become a saccharine substitute for the real thing, an addiction that just gets worse. (The amount of credit offered to American consumers has doubled in real dollars since the 1980s, as have the fees they pay to their banks.)

As the economist Raghuram Rajan, one of the most prescient seers of the 2008 financial crisis, argues, credit has become a palliative to address the deeper anxieties of downward mobility in the middle class. In his words, “let them eat credit” could well summarize the mantra of the go-go years before the economic meltdown. And things have only deteriorated since, with global debt levels $57 trillion higher than they were in 2007.

The rise of finance has also distorted local economies. It’s the reason rents are rising in some communities where unemployment is still high. America’s housing market now favors cash buyers, since banks are still more interested in making profits by trading than in the traditional role of lending out our savings to people and businesses looking to make long-term investments (like buying a house), which keeps younger people off the housing ladder. One perverse result: Blackstone, a private-equity firm, is currently the largest single-family-home landlord in America, since it had the money to buy up properties cheaply in bulk following the financial crisis. The rise of finance is also at the heart of retirement insecurity, since fees from actively managed mutual funds “are likely to confiscate as much as 65% or more of the wealth that … investors could otherwise easily earn,” as Vanguard founder Bogle testified to Congress in 2014.

It’s even the reason companies in industries from autos to airlines are trying to move into the business of finance themselves. American companies across every sector today earn five times the revenue from financial activities—investing, hedging, tax optimizing and offering financial services, for example—that they did before 1980. Traditional hedging by energy and transport firms, for example, has been overtaken by profit-boosting speculation in oil futures, a shift that actually undermines their core business by creating more price volatility. Big tech companies have begun underwriting corporate bonds the way Goldman Sachs does. And top M.B.A. programs would likely encourage them to do just that; finance has become the center of all business education.

Washington, too, is so deeply tied to the ambassadors of the capital markets—six of the 10 biggest individual political donors this year are hedge-fund barons—that even well-meaning politicians and regulators don’t see how deep the problems are. When I asked one former high-level Obama Administration Treasury official back in 2013 why more stakeholders aside from bankers hadn’t been consulted about crafting the particulars of Dodd-Frank financial reform (93% of the consultation on the Volcker Rule, for example, was conducted with the financial industry itself), he said, “Who else should we have talked to?” The answer—to anybody not profoundly influenced by the way finance thinks—might have been the people banks are supposed to lend to, or the scholars who study the capital markets, or the civic leaders in communities decimated by the financial crisis.

Of course, there are other elements to the story of America’s slow-growth economy, including familiar trends from globalization to technology-related job destruction. These are clearly massive challenges in their own right. But the single biggest unexplored reason for long-term slower growth is that the financial system has stopped serving the real economy and now serves mainly itself. A lack of real fiscal action on the part of politicians forced the Fed to pump $4.5 trillion in monetary stimulus into the economy after 2008. This shows just how broken the model is, since the central bank’s best efforts have resulted in record stock prices (which enrich mainly the wealthiest 10% of the population that owns more than 80% of all stocks) but also a lackluster 2% economy with almost no income growth.

Now, as many top economists and investors predict an era of much lower asset-price returns over the next 30 years, America’s ability to offer up even the appearance of growth—via financially oriented strategies like low interest rates, more and more consumer credit, tax-deferred debt financing for businesses, and asset bubbles that make people feel richer than they really are, until they burst—is at an end.

This pinch is particularly evident in the tumult many American businesses face. Lending to small business has fallen particularly sharply, as has the number of startup firms. In the early 1980s, new companies made up half of all U.S. businesses. For all the talk of Silicon Valley startups, the number of new firms as a share of all businesses has actually shrunk. From 1978 to 2012 it declined by 44%, a trend that numerous researchers and even many investors and businesspeople link to the financial industry’s change in focus from lending to speculation. The waning of entrepreneurship means less economic vibrancy, given that new businesses are the nation’s foremost source of job creation and GDP growth. Buffett summed it up in his folksy way: “You’ve now got a body of people who’ve decided they’d rather go to the casino than the restaurant” of capitalism.

In lobbying for short-term share-boosting management, finance is also largely responsible for the drastic cutback in research-and-development outlays in corporate America, investments that are seed corn for future prosperity. Take share buybacks, in which a company—usually with some fanfare—goes to the stock market to purchase its own shares, usually at the top of the market, and often as a way of artificially bolstering share prices in order to enrich investors and executives paid largely in stock options. Indeed, if you were to chart the rise in money spent on share buybacks and the fall in corporate spending on productive investments like R&D, the two lines make a perfect X. The former has been going up since the 1980s, with S&P 500 firms now spending $1 trillion a year on buybacks and dividends—equal to about 95% of their net earnings—rather than investing that money back into research, product development or anything that could contribute to long-term company growth. No sector has been immune, not even the ones we think of as the most innovative. Many tech firms, for example, spend far more on share-price boosting than on R&D as a whole. The markets penalize them when they don’t. One case in point: back in March 2006, Microsoft announced major new technology investments, and its stock fell for two months. But in July of that same year, it embarked on $20 billion worth of stock buying, and the share price promptly rose by 7%. This kind of twisted incentive for CEOs and corporate officers has only grown since.

As a result, business dynamism, which is at the root of economic growth, has suffered. The number of new initial public offerings (IPOs) is about a third of what it was 20 years ago. True, the dollar value of IPOs in 2014 was $74.4 billion, up from $47.1 billion in 1996. (The median IPO rose to $96 million from $30 million during the same period.) This concentration of money in fewer, bigger deals may show that investors want to make only the surest of bets, which is not necessarily the sign of a vibrant market. But there’s another, more disturbing reason: firms simply don’t want to go public, lest their work become dominated by playing by Wall Street’s rules rather than creating real value.

An IPO—a mechanism that once meant raising capital to fund new investment—is likely today to mark not the beginning of a new company’s greatness, but the end of it. According to a Stanford University study, innovation tails off by 40% at tech companies after they go public, often because of Wall Street pressure to keep jacking up the stock price, even if it means curbing the entrepreneurial verve that made the company hot in the first place.

A flat stock price can spell doom. It can get CEOs canned and turn companies into acquisition fodder, which often saps once innovative firms. Little wonder, then, that business optimism, as well as business creation, is lower than it was 30 years ago, or that wages are flat and inequality growing. Executives who receive as much as 82% of their compensation in stock naturally make shorter-term business decisions that might undermine growth in their companies even as they raise the value of their own options.

It’s no accident that corporate stock buybacks, corporate pay and the wealth gap have risen concurrently over the past four decades. There are any number of studies that illustrate this type of intersection between financialization and inequality. One of the most striking was by economists James Galbraith and Travis Hale, who showed how during the late 1990s, changing income inequality tracked the go-go Nasdaq stock index to a remarkable degree.

Recently, this pattern has become evident at a number of well-known U.S. companies. Take Apple, one of the most successful companies of the past 50 years. Apple has around $200 billion sitting in the bank, yet it has borrowed billions of dollars cheaply over the past several years, thanks to super-low interest rates (themselves a response to the financial crisis), to fund payouts to investors in order to bolster its share price. Why borrow? In part because it’s cheaper than repatriating cash and paying U.S. taxes. All the financial engineering helped boost the California firm’s share price for a while. But it didn’t stop activist investor Carl Icahn, who had manically advocated for borrowing and buybacks, from dumping the stock the minute revenue growth took a turn for the worse in late April.

It is perhaps the ultimate irony that large, rich companies like Apple are most involved with financial markets at times when they don’t need any financing. Top-tier U.S. businesses have never enjoyed greater financial resources. They have a record $2 trillion in cash on their balance sheets—enough money combined to make them the 10th largest economy in the world. Yet in the bizarre order that finance has created, they are also taking on record amounts of debt to buy back their own stock, creating what may be the next debt bubble to burst.

You and I, whether we recognize it or not, are also part of a dysfunctional ecosystem that fuels short-term thinking in business. The people who manage our retirement money—fund managers working for asset-management firms—are typically compensated for delivering returns over a year or less. That means they use their financial clout (which is really our financial clout in aggregate) to push companies to produce quick-hit results rather than execute long-term strategies. Sometimes pension funds even invest with the activists who are buying up the companies we might work for—and those same activists look for quick cost cuts and potentially demand layoffs.

It’s a depressing state of affairs, no doubt. Yet America faces an opportunity right now: a rare second chance to do the work of refocusing and right-sizing the financial sector that should have been done in the years immediately following the 2008 crisis. And there are bright spots on the horizon.

Despite the lobbying power of the financial industry and the vested interests both in Washington and on Wall Street, there’s a growing push to put the financial system back in its rightful place, as a servant of business rather than its master. Surveys show that the majority of Americans would like to see the tax system reformed and the government take more direct action on job creation and poverty reduction, and address inequality in a meaningful way. Each candidate is crafting a message around this, which will keep the issue front and center through November.

The American public understands just how deeply and profoundly the economic order isn’t working for the majority of people. The key to reforming the U.S. system is comprehending why it isn’t working.

Remooring finance in the real economy isn’t as simple as splitting up the biggest banks (although that would be a good start). It’s about dismantling the hold of financial-oriented thinking in every corner of corporate America. It’s about reforming business education, which is still permeated with academics who resist challenges to the gospel of efficient markets in the same way that medieval clergy dismissed scientific evidence that might challenge the existence of God. It’s about changing a tax system that treats one-year investment gains the same as longer-term ones, and induces financial institutions to push overconsumption and speculation rather than healthy lending to small businesses and job creators. It’s about rethinking retirement, crafting smarter housing policy and restraining a money culture filled with lobbyists who violate America’s essential economic principles.

It’s also about starting a bigger conversation about all this, with a broader group of stakeholders. The structure of American capital markets and whether or not they are serving business is a topic that has traditionally been the sole domain of “experts”—the financiers and policymakers who often have a self-interested perspective to push, and who do so in complicated language that keeps outsiders out of the debate. When it comes to finance, as with so many issues in a democratic society, complexity breeds exclusion.

Finding solutions won’t be easy. There are no silver bullets, and nobody really knows the perfect model for a high-functioning, advanced market system in the 21st century. But capitalism’s legacy is too long, and the well-being of too many people is at stake, to do nothing in the face of our broken status quo. Neatly packaged technocratic tweaks cannot fix it. What is required now is lifesaving intervention.

Crises of faith like the one American capitalism is currently suffering can be a good thing if they lead to re-examination and reaffirmation of first principles. The right question here is in fact the simplest one: Are financial institutions doing things that provide a clear, measurable benefit to the real economy? Sadly, the answer at the moment is mostly no. But we can change things. Our system of market capitalism wasn’t handed down, in perfect form, on stone tablets. We wrote the rules. We broke them. And we can fix them.

Rana Foroohar is an assistant managing editor at TIME and the magazine’s economics columnist. She’s the author of Makers and Takers: The Rise of Finance and the Fall of American Business.


This appears in the May 23, 2016 issue of TIME.


American “Success” In the Middle East Has Only Created More Problems. By Fareed Zakaria.

The U.S.’s “success” in the Middle East has only created more problems. By Fareed Zakaria. Washington Post, May 12, 2016.

Zakaria:

Iraq is collapsing as a country. This week’s bombings in Baghdad, which killed more than 90 people, are just further reminders that the place remains deeply unstable and violent. There is a lesson to be drawn from this, one that many powerful people in Washington are still resisting.

As Iraq has spiraled downward, policymakers have been quick to provide advice. Perennial hawks such as Sen. John McCain (R-Ariz.) have argued that if only the Obama administration would send more troops to the region, it would be more stable. Others say we need more diplomats and political advisers who can buttress military efforts. Still others tell us to focus on Iraqi leaders and get them to be more inclusive.

Perhaps it is worth stepping back from Iraq and looking at another country where the United States has been involved. The United States has been engaged in Afghanistan militarily, politically and economically for 15 years. It has had many “surges” of troops. It has spent more than $1 trillion on the war, by some estimates, and still pays a large portion of Afghanistan’s defense budget. Afghanistan has an elected government of national unity.

And yet, in October, the United Nations concluded that the insurgency had spread to more places in the country than at any point since 2001. Danielle Moylan reported in the New York Times that the Taliban now controls or contests all but three districts in Helmand province. She said that 36,000 police officers — almost a quarter of the force — are believed to have deserted the ranks last year. And last month, the Taliban penetrated Kabul itself, attacking a building run by the National Directorate of Security, which is responsible for much of the security in the capital, as the New Yorker’s Dexter Filkins has reported.

Some argue that 15 years is not enough. They point to South Korea and Germany and say that the United States should simply stay unendingly. I am not opposed to a longer-term U.S. presence in Afghanistan, especially because the country’s elected government seems to want it. But the analogy is misplaced. In Germany and South Korea, U.S. forces remained to deter a foreign threat. They were not engaged in a never-ending battle within the country to help the government gain control over its own people. The more appropriate analogue is Vietnam.

Much has been made recently of a pair of interviews on U.S. foreign policy, one with President Obama, the other with one of his closest aides, Ben Rhodes. Both men have been described as arrogant, self-serving and brimming with contempt for the foreign policy establishment. Certainly, as most administrations would, Obama and Rhodes sought to present their actions in a positive light. So Obama congratulates himself for stepping back from the edge of military intervention in Syria. He never grapples with the fact that his own careless rhetoric — about Bashar al-Assad’s fate and “red lines” — pushed Washington to the edge in the first place.

But on the most important issue of substance, Obama is right and his critics are wrong. The chief lesson for U.S. foreign policy from the past 15 years is that it is much easier to defeat a military opponent in the greater Middle East than to establish political order in these troubled lands.

The mantra persists in Washington that Obama has “overlearned” the lessons of Iraq. But the lessons come not just from Iraq. In Iraq, Afghanistan and Libya, it took weeks to defeat the old regime. Years later, despite different approaches, all of these countries remain in chaos. Can anyone seriously argue that a few more troops, or a slightly different strategy, would have created stability and peace?

The Obama administration is trying to battle the Islamic State and yet steer clear of anything that would lead the United States to occupy and control lands in the region. I worry that the United States is veering toward too much involvement, which will leave Washington holding the bag, but I understand the balance the administration is trying to strike.

In Syria, Washington’s real dilemma would arise if the effort worked and the Islamic State were defeated. This would result in a collapse of authority in large swaths of Iraq and Syria that are teeming with radicalized Sunnis who refuse to accept the authority of Baghdad or Damascus. Having led the fight, Washington would be forced to assert control over the territory, set up prisons to house thousands of Islamic State fighters, and provide security and economic assistance for the population while fighting the inevitable insurgency.

You know you’re in trouble when success produces more problems than failure.


Wednesday, May 11, 2016

Ice Age Europeans Had Some Serious Drama Going On, According to Their Genomes. By Sarah Kaplan.

Three ~31,000-year-old skulls from Dolní Věstonice in the Czech Republic. For the next five thousand years, all samples analyzed in this study—whether from Belgium, the Czech Republic, Austria, or Italy—are closely related, reflecting a population expansion associated with the Gravettian archaeological culture. Credit: Martin Frouz and Jiří Svoboda.


Ice Age Europeans had some serious drama going on, according to their genomes. By Sarah Kaplan. Washington Post, May 5, 2016.

Game of bones: first Europeans’ shifting fortunes found in DNA. By Colin Barras. New Scientist, May 2, 2016.

Genetic analysis of Ice Age Europeans. Phys.org, May 2, 2016.

The genetic history of Ice Age Europe. By Qiaomei Fu et al. Nature, published online, May 2, 2016.

Nature abstract:

Modern humans arrived in Europe ~45,000 years ago, but little is known about their genetic composition before the start of farming ~8,500 years ago. Here we analyse genome-wide data from 51 Eurasians from ~45,000–7,000 years ago. Over this time, the proportion of Neanderthal DNA decreased from 3–6% to around 2%, consistent with natural selection against Neanderthal variants in modern humans. Whereas there is no evidence of the earliest modern humans in Europe contributing to the genetic composition of present-day Europeans, all individuals between ~37,000 and ~14,000 years ago descended from a single founder population which forms part of the ancestry of present-day Europeans. An ~35,000-year-old individual from northwest Europe represents an early branch of this founder population which was then displaced across a broad region, before reappearing in southwest Europe at the height of the last Ice Age ~19,000 years ago. During the major warming period after ~14,000 years ago, a genetic component related to present-day Near Easterners became widespread in Europe. These results document how population turnover and migration have been recurring themes of European prehistory.


Kaplan:

The entire drama of human history is encoded in our DNA.

Where we went. Who we slept with. How we died — or almost did. It's basically a scientific soap opera, complete with occasional discoveries of long-lost cousins we never knew we had.

Take Ice Age Europe, for example. A new study of genetic material from the period reveals a continent roiling with change.

First, an upstart band of modern humans arrived, slowly pushing their ancient predecessors out of existence. But soon that new lineage was swept aside by a group of big game hunters. For the next 15,000 years, the older community lay in wait in a remote corner of the continent before bursting back onto the scene. The usurpers were overturned, and history barreled forward. And all of this happened against a backdrop of dramatic environmental change — waves of cold and heat that sent glaciers surging back and forth across the continent.

“The demographic history of early European populations was much more dynamic than previously thought,” Cosimo Posth, a PhD student in archaeogenetics at the University of Tübingen in Germany and a co-author of the study, told the New Scientist.

Posth was just one of some six dozen researchers on four different continents who teamed up for the survey, which was published this week in Nature. The result of their efforts is the most comprehensive account of Europe's Ice Age population changes yet, and it's told entirely through ancient DNA.


But before researchers could start analyzing that genetic material, they had to get it. DNA degrades over time, so extracting it from ancient human remains is difficult and costly.

Much of that delicate work was done by Qiaomei Fu, the lead author of the paper and a genetics researcher at Harvard and the Chinese Academy of Sciences in Beijing. She had to make sure that each genome was uncontaminated by material picked up from microbes or present-day humans.

Over and over again, she screened the samples, which came from long-buried remains spanning nearly 40,000 years of history.

“It’s a great privilege to be able to work on these samples,” David Reich, the head of the Harvard Genetics Lab where Fu did some of her work, said in a news release.  “It’s like being an art historian given full access to the treasures of the Louvre.”

In the end, they had data from 51 individuals — a tenfold increase over the measly four that once gave researchers their only glimpses into this period.

“Trying to represent this vast period of European history with just four samples is like trying to summarize a movie with four still images,” Reich said. “With 51 samples, everything changes; we can follow the narrative arc; we get a vivid sense of the dynamic changes over time.”

One of the oldest genomes studied came from a thigh bone discovered in Goyet Cave in Belgium and given the unwieldy name GoyetQ116-1. Radiocarbon dating pegs the Goyet individual at some 35,000 years old, making him a likely member of the Aurignacian culture. These stone toolmakers produced the oldest known example of human figurative art — a 40,000-year-old figurine called the “Venus of Hohle Fels” — as well as countless cave paintings.

Goyet guy’s DNA is also strikingly similar to many modern Europeans’. Does this mean that his family were the final colonizers of the continent?

Not quite. Around 1,000 years after the Goyet individual lived, a new culture swept through Europe: the Gravettians. Analysis of genetic material from the time shows that art and artifacts weren’t the only things changing. The Gravettians’ DNA was significantly different from that of their Aurignacian predecessors, suggesting that they were a completely separate lineage.


Goyet guy’s descendants retreated to the Iberian Peninsula (modern-day Spain and Portugal) and waited for their time to come again.

It did, some 15,000 years later. Probably spurred by climate changes as glaciers began to recede, this dormant lineage expanded back into the rest of Europe, bearing a new culture known as Magdalenian. Not long after that, their genomes started to look like those of people from the Middle East and the Caucasus, suggesting that new arrivals from the southeast were mingling with — and in some cases supplanting — the existing population.


Impression of one of the Ice Age modern humans analyzed in this study, drawn by Stefano Ricci, who is both a professional graphic artist and an author of the study. Credit: Stefano Ricci.


This was a surprise, because researchers used to think that transition happened much later, when Turkish farmers introduced agriculture to Europe some 8,500 years ago.

“It is amazing how ancient DNA now starts to provide us with a detailed account of the earliest history of present-day Europeans,” Max Planck Institute anthropologist Svante Pääbo, another author of the study, said in a news release.

But like any good soap opera, this one is about disaster as much as it’s about success. The genetic analysis allowed researchers to trace the inexorable decline of Neanderthal DNA, which was two to three times more prominent in early human genomes than it is in modern-day ones. This supports theories that early humans interbred with Neanderthals, but that their DNA was toxic to us and gradually weeded out by natural selection over the course of millennia.

For those among us who still carry fractions of Neanderthal DNA, that process is probably still happening, Pääbo said. The drama isn't over yet.


Tuesday, May 10, 2016

The Scariest Reason Trump Won. By Dennis Prager.

The Scariest Reason Trump Won. By Dennis Prager. National Review Online, May 10, 2016.

Prager:

There are many reasons Donald Trump is the presumptive Republican presidential nominee. The four most often cited reasons are the frustrations of white working-class Americans, a widespread revulsion against political correctness, disenchantment with the Republican “establishment,” and the unprecedented and unrivaled amount of time the media afforded Trump.

They are all valid.

But the biggest reason is this: The majority of Republicans are not conservative.

Conservatives who opposed Trump kept arguing — indeed provided unassailable proof — that Donald Trump is not a conservative and has never been one. But the argument meant little or nothing to two types of Republicans: the majority of Trump voters who don’t care whether he is a conservative, and the smaller number of Trump voters who are conservative but care about illegal immigration more than all other issues, including Trump’s many and obvious failings.

So, then, what happened to the majority of Republicans? Why aren’t they conservative?

The answer lies in America’s biggest — and scariest — problem: Most Americans no longer know what America stands for. For them, America has become just another country, a place located between Canada and Mexico.

But America was founded to be an idea, not another country. As Margaret Thatcher put it: “Europe was created by history. America was created by philosophy.”

Why haven’t Americans over the past three generations known what America stands for?

Probably the biggest reason is the influence of left-wing ideas.

Since its inception, the Left has opposed the American idea, and for good reason. Everything the American idea represents undermines leftist ideas. And the Left, unlike most Americans, has always understood that either the Left is right or America is right.

America stands for small government, a free economy (and therefore capitalism), liberty (and it therefore allows for liberty’s inevitable consequence, inequality), the “melting pot” ideal, and a God-centered population rooted in Judeo-Christian values (so that a moral society is created by citizens exercising self-control rather than relying on the state to impose controls).

Only America was founded on the idea of small government. But the Left is based on big government.

America was founded on the principle that human rights come from the Creator. For the Left, rights come from the state.

America was founded on the belief that in order to maintain a small government, a God-fearing people is necessary. The Left opposes God-based religions, particularly Judeo-Christian religions. Secularism is at the core of Leftism every bit as much as egalitarianism is.

The American Revolution, unlike the French Revolution, placed liberty above equality. For the Left, equality is more important than all else. That’s why so many American and European leftists have celebrated left-wing regimes, no matter how much they squelched individual liberty, from Stalin to Mao to Che and Castro to Hugo Chávez. They all preached equality.

It took generations, but the Left has succeeded (primarily through the schools, but also through the media) in substituting its values for America’s.

While the Left has been the primary cause, there have been others.

The most significant is success.

American values were so successful that Americans came to take America’s success for granted. They forgot what made America uniquely free and affluent. And now, it’s not even accurate to say “forgot,” because, in the case of the current generation, they never knew. While the schools, starting with the universities, were being transformed into institutions for left-wing indoctrination, American parents, too, ceased teaching their children American values (beginning with not reading to their children the most popular book in American history, the Bible).

Schools even stopped teaching American history. When American history is taught today, it is taught as a history of oppression, imperialism, and racism. Likewise, there is essentially no civics education, once a staple of the public-school system. Young Americans are not taught either the Constitution or how American government works. I doubt many college students even know what “separation of powers” means, let alone why it is so significant.

So, then, thanks to leftism and America’s taken-for-granted success, most Americans no longer understand what it means to be an American. Those who do are called “conservatives” because they wish to conserve the unique American idea. But conservatives now constitute not only a minority of Americans, but a minority of Republicans. That is the primary reason Donald Trump — a nationalist but not a conservative — is the presumptive Republican nominee.

As I noted from the outset, I will vote for him if he wins the nomination — because there is no choice. But the biggest reason he won is also the scariest.


Hillary: The Conservative Hope. By Bret Stephens.

Hillary: The Conservative Hope. By Bret Stephens. Wall Street Journal, May 9, 2016.

The Republican Party ruined conservatism long before Trump. By Sean Illing. Salon, May 10, 2016.


Stephens:

The best hope for what’s left of a serious conservative movement in America is the election in November of a Democratic president, held in check by a Republican Congress. Conservatives can survive liberal administrations, especially those whose predictable failures lead to healthy restorations—think Carter, then Reagan. What isn’t survivable is a Republican president who is part Know Nothing, part Smoot-Hawley and part John Birch. The stain of a Trump administration would cripple the conservative cause for a generation.

This is the reality that wavering Republicans need to understand before casting their lot with a presumptive nominee they abhor only slightly less than his likely opponent. If the next presidency is going to be a disaster, why should the GOP want to own it?

In the 1990s, when another Clinton was president, conservatives became fond of the phrase “character counts.” This was a way of scoring points against Bill Clinton for his sexual predations and rhetorical misdirections, as well as a statement that Americans expected honor and dignity in the Oval Office. I’ll never forget the family friend, circa 1998, who wondered how she was supposed to explain the meaning of a euphemism for oral sex to her then 10-year-old daughter.

Conservatives still play the character card against Hillary Clinton, citing her disdain for other people’s rules, her Marie Antoinette airs and her potential law breaking. It’s a fair card to play, if only the presumptive Republican nominee weren’t himself a serial fabulist, an incorrigible self-mythologizer, a brash vulgarian, and, when it comes to his tax returns, a determined obfuscator. Endorsing Mr. Trump means permanently laying to rest any claim conservatives might ever again make on the character issue.

Conservatives are also supposed to believe that it’s folly to put hope before experience; that leopards never change their spots. So what’s with the magical thinking that, nomination in hand, Mr. Trump will suddenly pivot to magnanimity and statesmanship? Where’s the evidence that, as president, Mr. Trump will endorse conservative ideas on tax, trade, regulation, welfare, social, judicial or foreign policy, much less personal comportment?

On Monday, former Louisiana Gov. Bobby Jindal, who savaged Mr. Trump during the campaign, published an op-ed in these pages on why he plans to cast his vote for the real-estate developer as “the second-worst thing we could do this November.” Too much is at stake, Mr. Jindal said, on everything from curbing the regulatory excesses of the Obama administration to appointing a conservative judge to the Supreme Court, to risk another Democratic administration.

Mr. Jindal holds out the hope that Mr. Trump, who admires the Supreme Court’s 2005 Kelo decision on eminent domain (the one in which Susette Kelo’s little pink house was seized by the city of New London for the intended benefit of private developers), might yet appoint strict constructionists to the bench. Mr. Jindal also seems to think that a man whose preferred style of argument is the threatened lawsuit and the Twitter tantrum can be trusted with the vast investigative apparatus of the federal government.

The deeper mistake that Mr. Jindal and other lukewarm Trump supporters make is to assume that policy counts for more than ideas—that is, that the policy disasters he anticipates from a Clinton administration will be indelible, while Trumpism poses no real threat to the conservative ideas he has spent a political career championing. This belief stems from a failure to take Trumpism seriously, or to realize just how fragile modern conservatism is as a vital political movement.

But Trumpism isn’t just a triumph of marketing or the excrescence of a personality cult. It is a regression to the conservatism of blood and soil, of ethnic polarization and bullying nationalism. Modern conservatives sought to bury this rubbish with a politics that strikes a balance between respect for tradition and faith in the dynamic and culture-shifting possibilities of open markets. When that balance collapses—under a Republican president, no less—it may never again be restored, at least in our lifetimes.

For liberals, all this may seem like so much manna from heaven. Mr. Trump’s nomination not only gives his Democratic opponent the best possible shot at winning the election (with big down-ballot gains, too), but also at permanently discrediting the conservative movement as a serious ideological challenger. They should be careful what they wish for. Mr. Trump could yet win, or one of his epigones might in four or eight years. This will lead to its own left-wing counter-reactions, putting America on the road to Weimar.

For conservatives, a Democratic victory in November means the loss of another election, with all the policy reversals that entails. That may be dispiriting, but elections will come again. A Trump presidency means losing the Republican Party. Conservatives need to accept that most conservative of wisdoms—sometimes, losing is winning, especially when it offers an education in the importance of political hygiene.


Sunday, May 8, 2016

The Conservative Case Against Trump. By Ross Douthat.

The Conservative Case Against Trump. By Ross Douthat. New York Times, May 7, 2016.

Douthat:

THERE are many lessons that conservatives need to learn from the rise of Donald Trump. There are elements of his message that the party should embrace. There are grievances among his voters that the Republican Party must address.

But for conservatives to support Trump himself, to assist in his election as president of the United States, would be a terrible mistake.

It would be a particularly stark mistake for conservatives who feel that the basic Reaganite vision that’s dominated their party for decades — a fusion of social conservatism, free-market economics, and a hawkish internationalism — still gets things mostly right.

In large ways and small, Trump has consistently arrayed himself against this vision. True, he paid lip service to certain Reaganite ideas during the primaries — claiming to be pro-life, promising a supply-side tax cut, pledging to appoint conservative judges. But the core of his message was protectionist and nativist, comfortable with an expansive welfare state, bored with religious conservatism, and dismissive of the commitments that constitute the post-Cold War Pax Americana. And Trump’s policy forays since clinching the nomination have only confirmed his post-Reagan orientation.

Reaganite conservatives who help elevate Trump to the presidency, then, would be sleepwalking toward a kind of ideological suicide. Successful party leaders often transform parties in their image. William Jennings Bryan and Woodrow Wilson between them turned a conservative Democratic Party progressive. Dwight Eisenhower all but extinguished G.O.P. isolationism. Reagan himself set liberal Republicanism on the path to extinction.

A successful President Trump (and to support him is to hope for such a thing) could easily do the same to Reaganism. In a fully Trumpized G.O.P., Reagan’s ideological coalition would crack up, with hawks drifting toward the Democrats, supply-siders fading into crankery, religious conservatives entering semi-permanent exile. And in its place a Trumpized Republican intelligentsia would arise, with as little interest in Reaganism as today’s conservatives have in the ideas of Nelson Rockefeller or Jacob Javits.

The things conservatives are telling themselves to justify supporting him — at least he might appoint good judges — miss this long-term point. The Reagan coalition might — might! — get an acceptable Supreme Court appointment out of the Trump presidency. But that could easily be the last thing it ever got.

But what if you’re a conservative who isn’t a Reaganite, or you believe that Reaganite ideas have long passed their sell-by dates? What if you agree with Trump about the folly of the Iraq War, the perils of open immigration policies, or the need for a different right-wing economic agenda? What if you think his populism might bring about some necessary creative destruction to a backward-looking G.O.P.?

Then supporting Trump for president could make ideological sense, and the crackup I’ve just described might seem like an advertisement for doing so.

But there still remains the problem of Trump himself. Even if you find things to appreciate in Trumpism — as I have, and still do — the man who has raised those issues is still unfit for an office as awesomely powerful as the presidency of the United States.

His unfitness starts with basic issues of temperament. It encompasses the race-baiting, the conspiracy theorizing, the flirtations with violence, and the pathological lying that have been his campaign-trail stock in trade.

But above all it is Trump’s authoritarianism that makes him unfit for the presidency — his stated admiration for Putin and the Chinese Politburo, his promise to use the power of the presidency against private enterprises, the casual threats he and his surrogates toss off against party donors, military officers, the press, the speaker of the House, and more.

All presidents are tempted by the powers of the office, and congressional abdication has only increased that temptation’s pull. President Obama’s power grabs are part of a bipartisan pattern of Caesarism, one that will likely continue apace under Hillary Clinton.

But far more than Obama or Hillary or George W. Bush, Trump is actively campaigning as a Caesarist, making his contempt for constitutional norms and political niceties a selling point. And given his mix of proud ignorance and immense self-regard, there is no reason to believe that any of this is just an act.

Trump would not be an American Mussolini; even our sclerotic institutions would resist him more effectively than that. But he could test them as no modern president has tested them before — and with them, the health of our economy, the civil peace of our society and the stability of an increasingly perilous world.

In sum: It would be possible to justify support for Trump if he merely promised a period of chaos for conservatism. But to support Trump for the presidency is to invite chaos upon the republic and the world. No policy goal, no court appointment, can justify such recklessness.

To Trumpism’s appeal, to Trump’s constituents, conservatives should listen and answer “yes,” or “maybe,” or “not that, but how about…”

But to Trump himself, there is no patriotic answer except “no.”


The Defeat of True Conservatism. By Ross Douthat.

The Defeat of True Conservatism. By Ross Douthat. New York Times, May 3, 2016.

True True Conservatism. By Andrew C. McCarthy. National Review Online, May 4, 2016.


Douthat:

When Donald Trump knocked first Jeb Bush and then Marco Rubio out of the Republican primary campaign, he defeated not only the candidates themselves but their common theory of what the G.O.P. should be — the idea that the party could essentially recreate George W. Bush’s political program with slightly different domestic policy ideas and recreate Bush’s political majority as well.

Now, after knocking Ted Cruz out of the race with a sweeping win in Indiana, Trump has beaten a second theory of where the G.O.P. needs to go from here: a theory you might call True Conservatism.

True Conservatism likes to portray itself as part of an unbroken tradition running back through Ronald Reagan to Barry Goldwater and the Founding Fathers. It has roots in that past, but it’s also a much more recent phenomenon, conceived in the same spirit as Bushism 2.0 but with the opposite intent.

If Bushism 2.0 looked at George W. Bush’s peaks — his post-Sept. 11 popularity, his 2004 majority — and saw a model worth recovering, True Conservatism looked at his administration’s collapse and argued that it proved that he had been far too liberal, and that all his “compassionate conservative” heresies had led the Republican Party into a ditch.

Thus True Conservatism’s determination to avoid both anything that savored of big government and anything that smacked of compromise. Where Bush had been softhearted, True Conservatism would be sternly Ayn Randian; where Bush had been free-spending, True Conservatism would be austere; where Bush had taken working-class Americans off the tax rolls, True Conservatism would put them back on — for their own good. And above all, where Bush had sometimes reached for the center, True Conservatism would stand on principle, fight hard, and win.

This philosophy found champions on talk radio, it shaped the Tea Party’s zeal, it influenced Paul Ryan’s budgets, it infused Mitt Romney’s “You built that” rhetoric. But it was only in the government shutdown of 2013 that it found its real standard-bearer: Ted Cruz.

And Cruz ended up running with it further than most people thought possible. His 2016 campaign strategy was simple: Wherever the party’s most ideological voters were, there he would be. If Obama was for it, he would be against it. Where conservatives were angry, he would channel their anger. Where they wanted a fighter, he would be a fighter. Wherever the party’s activists were gathered, on whatever issue — social or economic, immigration or the flat tax — he would be standing by their side. He would win Iowa, the South, his native Texas, the Mountain West. They wanted Reagan, or at least a fantasy version of Reagan? He would give it to them.

It didn’t work — but the truth is it almost did. In the days before and after the Wisconsin primary, with delegate accumulation going his way and the polling looking plausible once the Northeastern primaries were over, it seemed like Cruz could reasonably hope for a nomination on the second or third ballot.

So give the Texas senator some credit. He took evangelical votes from Mike Huckabee, Ben Carson and Rick Santorum; he took libertarian votes from Rand Paul; he outlasted and outplayed Marco Rubio; he earned support from Mitt Romney, Jeb Bush and Lindsey Graham, who once joked about his murder. Nobody worked harder; no campaign ran a tighter ship; no candidate was more disciplined.

But it turned out that Republican voters didn’t want True Conservatism any more than they wanted Bushism 2.0. Maybe they would have wanted it from a candidate with more charisma and charm and less dogged unlikability. But the entire Trump phenomenon suggests otherwise, and Trump as the presumptive nominee is basically a long proof against the True Conservative theory of the Republican Party.

Trump proved that movement conservative ideas and litmus tests don’t really have any purchase on millions of Republican voters. Again and again, Cruz and the other G.O.P. candidates stressed that Trump wasn’t really a conservative; they listed his heresies, cataloged his deviations, dug up his barely buried liberal past. No doubt this case resonated with many Republicans. But not with nearly enough of them to make Cruz the nominee.

Trump proved that many evangelical voters, supposedly the heart of a True Conservative coalition, are actually not really values voters or religious conservatives after all, and that the less frequently evangelicals go to church, the more likely they are to vote for a philandering sybarite instead of a pastor’s son. Cruz would probably be on his way to the Republican nomination if he had simply carried the Deep South. But unless voters were in church every Sunday, Trump’s identity politics had more appeal than Cruz’s theological-political correctness.

Trump proved that many of the party’s moderates and establishmentarians hate the thought of a True Conservative nominee even more than they fear handing the nomination to a proto-fascist grotesque with zero political experience and poor impulse control. That goes for the prominent politicians who refused to endorse Cruz, the prominent donors who sat on their hands once the field narrowed and all the moderate-Republican voters in blue states who turned out to be #NeverCruz first and #NeverTrump less so or even not at all.

Finally, Trump proved that many professional True Conservatives, many of the same people who flayed RINOs and demanded purity throughout the Obama era, were actually just playing a convenient part. From Fox News’ 10 p.m. hour to talk radio to the ranks of lesser pundits, a long list of people who should have been all-in for Cruz on ideological grounds either flirted with Trump, affected neutrality or threw down their cloaks for the Donald to stomp over to the nomination. Cruz thought he would have a movement behind him, but part of that movement was actually a racket, and Trumpistas were simply better marks.

Cruz will be back, no doubt. He’s young, he’s indefatigable, and he can claim — and will claim, on the 2020 hustings — that True Conservatism has as yet been left untried. But that will be a half-truth; it isn’t being tried this year because the Republican Party’s voters have rejected him and it, as they rejected another tour for Bushism when they declined to back Rubio and Jeb.

What remains, then, is Trumpism. Which is also, in its lurching, sometimes insightful, often wicked way, a theory of what kind of party the Republicans should become, and one that a plurality of Republicans have now actually voted to embrace.

Whatever reckoning awaits the G.O.P. and conservatism after 2016 will have to begin with that brute fact. Where the reckoning goes from there — well, now is a time for pundit humility, so your guess is probably as good as mine.


The Problem with Trump as CEO of America: Government Is Not a Business. By Fareed Zakaria.



A display of Donald Trump-branded products at a press conference after his March 8 Florida Primary victory. Reuters/Joe Skipper.


The Problem with Trump as CEO of America: Government Is Not a Business. By Fareed Zakaria. Washington Post, May 5, 2016.

Zakaria:

At the heart of Donald Trump’s appeal is his fame as a successful businessman. It’s why most of his supporters don’t worry about his political views or his crude rhetoric and behavior. He’s a great chief executive and will get things done. No one believes this more than Trump himself, who argues that his prowess in the commercial world amply prepares him for the presidency. “In fact I think in many ways building a great business is actually harder,” he told GQ last year.

There is some debate about Trump’s record as a businessman. He inherited a considerable fortune from his father and, by some accounts, would be wealthier today if he had simply invested in a stock index fund. His greatest skill has been to play a successful businessman on his television show “The Apprentice.”

Regardless, it is fair to say that Trump has formidable skills in marketing. He has been able to create a brand around his name like few others. The real problem is that these talents might prove largely irrelevant because commerce is quite different from government. The modern presidents who achieved the most — Franklin Roosevelt, Lyndon Johnson and Ronald Reagan — had virtually no commercial background. Some who did, George W. Bush and Herbert Hoover, fared worse in the White House. There is no clear pattern. One of the few successful CEOs who did well in Washington was Robert Rubin. A former head of Goldman Sachs, he served as the chief White House aide on economics and then treasury secretary in Bill Clinton’s administration. When he left Washington, he reflected in his memoirs that he had developed “a deep respect for the differences between the public and private sectors.”

“In business, the single, overriding purpose is to make a profit,” he wrote. “Government, on the other hand, deals with a vast number of legitimate and often potentially competing objectives — for example, energy production versus environmental protection, or safety regulations versus productivity. This complexity of goals brings a corresponding complexity of process.”

He then noted that a big difference between the two realms is that no political leader, not even the president, has the kind of authority every corporate chief does. CEOs can hire and fire based on performance, pay bonuses to incentivize their subordinates, and promote capable people aggressively. By contrast, Rubin pointed out that he had the authority to hire and fire fewer than 100 of the 160,000 people who worked under him at the Treasury Department. Even the president has limited authority and mostly has to persuade rather than command.

This is a feature, not a flaw, of American democracy. Power is checked, balanced and counterbalanced to ensure that no one branch is too powerful and that individual liberty can flourish. It is no accident that Trump admires Vladimir Putin, who doesn’t have to deal with the complications of modern democratic government and can simply get things done.

In interviews with the New York Times, Trump imagined his first 100 days in office: He would summon congressional leaders to lobster dinners at Mar-a-Lago, threaten CEOs in negotiations at the White House (“The Oval Office would be an amazing place [from which] to negotiate”) and make great deals. When talking about the positions he would fill, Trump explained, “I want people in those jobs who care about winning. The U.N. isn’t doing anything to end the big conflicts in the world, so you need an ambassador who would win by really shaking up the U.N.”

This displays an astonishing lack of understanding about the world. The United Nations can’t end conflicts because it has no power. That rests with sovereign governments (unless Trump wants to cede U.S. authority to U.N. Secretary General Ban Ki-moon). The notion that all it would take is a strong U.S. ambassador to shake up the U.N., end conflicts and “win” is utterly removed from reality. Yet it is a perfect example of business thinking applied in a completely alien context.

Success in business is important, honorable and deeply admirable. But it requires a particular set of skills that are often very different from those that produce success in government. As Walter Lippmann wrote in 1930 about Herbert Hoover, possibly the most admired business leader of his age, “It is true, of course, that a politician who is ignorant of business, law, and engineering will move in a closed circle of jobs and unrealities. ... [But the] popular notion that administering a government is like administering a private corporation, that it is just business, or housekeeping, or engineering, is a misunderstanding. The political art deals with matters peculiar to politics, with a complex of material circumstances, of historic deposit, of human passion, for which the problems of business or engineering as such do not provide an analogy.”


General James Mattis: The Middle East at an Inflection Point.

The Middle East at an Inflection Point with Gen. Mattis. Video. CSIS, April 22, 2016. YouTube.