Don’t Mention The War

[Image: Angela Merkel]

It is wrong to suggest that people should be held accountable for the actions of their ancestors. Blaming each other for the deeds of our ancestors is the cause of vast tracts of human suffering and conflict. Tribes and nations have fought each other for centuries — and, in some places, continue to do so — over the deeds of their forefathers. This is irrational. We cannot change the actions of our ancestors. That is perhaps one reason why John Cleese’s portrayal of an idiot hotelier beating down his German guests with a spiel of cringe-inducing World War 2 and Nazi clichés is so absurdly funny.

That being said, we do have a responsibility to learn from and not repeat the mistakes of our ancestors. Failure to learn from the mistakes of one’s ancestors is the point at which the actions of past generations become relevant in a discussion of the present.

There is a line of reasoning that suggests that the first person to compare their opponent to Adolf Hitler or Nazism in an argument on the internet just lost the argument. I tend to see this view as generally correct. The acts and beliefs of the Nazis were unusually horrific, and comparing your opponent or the person or group you are criticizing to the Nazis is often an act of rhetorical desperation, and often a symptom of a lack of imagination. However, what is generally correct is often locally wrong. Sometimes, a Nazi or World War 2 analogy really cuts to the core of a problem.

This, I believe, is one of those times. Having the German government and its allies dictate to the Greek people the terms of Greece’s euro membership, and the standards by which they should run their government, economy, civil service, and welfare state, must feel painfully close to a new German occupation. Greece is a country, we should not forget, that suffered greatly under a German military occupation less than a lifetime ago. It is now experiencing a brutal and prolonged economic depression at the hands of a new generation of austerity-obsessed Germans.

Greece has been a willing victim of German austerity. The Greeks have taken Merkel’s medicine. Greece has enacted deep spending cuts, so many, in fact, that by 2012 it was running a primary surplus.

Unfortunately, Merkel’s and the Troika’s medicine was a load of horse shit. Instead of recovering, the Greek economy just got even more depressed. Unemployment has been at Great Depression levels ever since Merkel and the Troika began dictating how the Greeks ran their economy. Greek real GDP continues to trend downward. Indeed, Europe itself remains in an epic depression. The austerians keep making it worse.

Now, nobody is saying that the Greeks are blameless. Obviously, they took on a load of relatively unproductive debt they couldn’t afford, and they colluded with financiers to falsify economic data to get into the eurozone. But the country has already suffered massively as a result of those decisions (which of course were not Greece’s alone — the creditors clearly did not do their homework).

The goal now should be getting Greece — and the wider continent and world, which would also suffer greatly from a default cascade or economic slump as a result of the Greek crisis — out of the mess they are in. What Greece really needs is debt forgiveness. Even the IMF recognizes that Greece’s debts are unrepayable. But that is not Merkel and Schäuble’s goal. Instead of recognizing that their policies have failed, and that a change in course is necessary, their goal for Greece is complete capitulation to the stormtroopers. Their goal for Greece is punishment, in order to set an example to other euro members who might get into fiscal trouble.

The great irony — and the thing that makes the Nazi references really begin to stick — is that earlier German governments received massive debt relief. Indeed, after Germany started the Second World War — which killed 50 million people, including the 6 million who died in the Holocaust — it had its war debt written off under the 1953 London Debt Agreement, allowing the West German economy to begin to recover and rebuild. Indeed, Germany was the biggest defaulter of the 20th century. Yet now the very descendants of those Germans refuse the same treatment for today’s Greeks, whose troubles pale against the crimes of Germany’s Nazi past.

This is sickening. Not only are they shredding to pieces the European unity and the European Union that have kept a formerly war-torn Europe at peace with itself for the past seventy years, they are doing it in the name of an ignorant program of austerity that does nothing other than punish and degrade. And they are doing it in complete ignorance of how their own ancestors benefited from others’ forgiveness. Do they not understand the value of European unity? Of economic growth? Of peace and prosperity?

In choosing the path of sadomasochism, punishment and German supremacism, Schäuble and Merkel and their allies are risking turning what is already a terrible depression for the continent — and a ravaging for Greece — into something deeper, gloomier and more painful.

Deflation is Here — And The Government is Poised to Make it Worse

Consumer prices may not be deflating as quickly as Labour’s electoral chances did earlier this month, but — even after £375 billion of quantitative easing — price deflation has arrived for the first time in more than half a century. The Bank of England continues to throw everything at keeping prices rising at close to their 2 percent target. Yet it’s not working. And this is not just about cheaper oil. Core inflation has also been dropping like a rock.

I argued that “deflation was looming” for Britain last year, and feel a little vindicated that it has come to pass. But I don’t feel at all gratified about the thing itself.

In a highly indebted economy such as Britain’s — where private debt dwarfs government debt — deflation is a dangerous thing. Past debts — and the interest rates paid on those debts — are nominally rigid. Unless specifically stipulated as inflation-adjusted, like TIPS in the United States or index-linked gilts in Britain, they don’t scale to price changes in the broader economy.

Under positive rates of inflation, inflation assists in keeping debt under control by shrinking the quantity of goods, services and labour that a nominal amount of currency represents. Under deflation, the opposite process occurs: the real value of currency — as well as that of historical debt — rises, making the debt harder to service and pay down, especially as interest continues to accumulate.
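To make the mechanism concrete, here is a minimal sketch (illustrative numbers, not UK data) of how a fixed, unpaid nominal debt behaves in real terms under modest inflation versus modest deflation:

```python
# Real burden of an unpaid nominal debt under inflation vs. deflation.
# Illustrative numbers only: £100 borrowed at a 3% nominal rate, nothing repaid.

def real_debt_after(principal: float, nominal_rate: float,
                    inflation: float, years: int) -> float:
    """Real value of the debt (in year-0 pounds) after `years` years."""
    nominal_debt = principal * (1 + nominal_rate) ** years  # interest accrues
    price_level = (1 + inflation) ** years                  # prices drift up or down
    return nominal_debt / price_level

print(round(real_debt_after(100, 0.03, 0.02, 10), 1))   # 110.2: 2% inflation blunts the interest
print(round(real_debt_after(100, 0.03, -0.02, 10), 1))  # 164.5: 2% deflation compounds it
```

Same contract, same borrower: a swing from 2 percent inflation to 2 percent deflation raises the ten-year real burden by roughly half.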

On the face of it, that is good news for net savers and bad news for net debtors. But raising the difficulty of deleveraging and debt service can often be bad for both, because debtors who cannot pay default, bankrupting themselves and injuring their creditors. It can also depress the economy, as individuals and firms are forced to stop spending and investing and start devoting more and more of their income to the rising real cost of deleveraging.

With growth last quarter dropping to 0.3 percent from 0.6 percent, this process may very well already be under way. That raises the prospect of the nightmarish debt-deflationary spiral described above.
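The feedback loop itself can be caricatured in a few lines of code. This is a toy model in the spirit of Irving Fisher’s debt-deflation story, with coefficients chosen purely for illustration, but it shows how falling spending and a rising real debt burden feed on each other:

```python
# Toy Fisher-style debt-deflation loop (all coefficients are illustrative).
# Debtors cut spending in proportion to their real debt burden; weaker
# spending drags prices down, which raises the real burden again.

debt = 150.0       # nominal private debt, fixed in money terms
prices = 1.0       # price level
spending = 100.0   # nominal spending

for year in range(1, 6):
    real_burden = debt / prices                # deflation makes this grow
    spending *= 1 - 0.001 * real_burden        # heavier burden, less spending
    prices *= 1 + 0.005 * (spending - 100.0)   # weak spending, falling prices
    print(f"year {year}: real debt burden {real_burden:.0f}, price level {prices:.2f}")
```

Each pass through the loop leaves prices lower and the real burden heavier than the last, which is exactly the trap a demand stimulus is meant to break.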

The last thing that the economy needs in that circumstance is more money being sucked out of it through slashing public spending. Sucking money out of the economy will make deleveraging even more difficult for debtors, and slow growth further as individuals and firms adjust their spending plans to lower levels of national and individual income. Yet that is the manifesto that the country elected to power earlier this month. And although Osborne and Cameron can get out of it — by offsetting cuts in spending with tax cuts — if they go through with their election promises, the prospect of recession, continued deflation and rising unemployment looms large.

What the economy really needed in 2010 was a deep and long commitment to public stimulus to provide the economic growth needed to let the private sector deleverage. Unlike the public sector — a sovereign borrowing in its own currency — the private sector is far from a secure debtor. Private borrowers can — unlike the central government — “become the next Greece” and run out of money.

With interest rates in the last parliament having sunk to new historic lows, such a thing was affordable and achievable. Instead, by trying to do public deleveraging at the same time as the private sector was deleveraging, Osborne, Cameron and Clegg chose a much rockier path, one in which private deleveraging and public deleveraging are slow and grinding. With private debt levels still very high, the country remains vulnerable to another deleveraging-driven recession.

On Trade Unions & Inequality

This chart is pretty wow:

[Chart: union density plotted against income inequality]

Florence Jaumotte and Carolina Osorio Buitron of the International Monetary Fund have some ideas about how the correlation may have been caused:

The main channels through which labor market institutions affect income inequality are the following:

Wage dispersion: Unionization and minimum wages are usually thought to reduce inequality by helping equalize the distribution of wages, and economic research confirms this.

Unemployment: Some economists argue that while stronger unions and a higher minimum wage reduce wage inequality, they may also increase unemployment by maintaining wages above “market-clearing” levels, leading to higher gross income inequality. But the empirical support for this hypothesis is not very strong, at least within the range of institutional arrangements observed in advanced economies (see Betcherman, 2012; Baker and others, 2004; Freeman, 2000; Howell and others, 2007; OECD, 2006). For instance, in an Organisation for Economic Co-operation and Development review of 17 studies, only 3 found a robust association between union density (or bargaining coverage) and higher overall unemployment.

Redistribution: Strong unions can induce policymakers to engage in more redistribution by mobilizing workers to vote for parties that promise to redistribute income or by leading all political parties to do so. Historically, unions have played an important role in the introduction of fundamental social and labor rights. Conversely, the weakening of unions can lead to less redistribution and higher net income inequality (that is, inequality of income after taxes and transfers).

I have spent a lot of time thinking about what has caused the major upswing in inequality since the 1980s.

Back in 2011 and 2012, my analysis tended to emphasize financialization, and specifically the massive growth in credit creation that has taken place since the 1980s. I think this was a rather naive view to take.

I don’t think I was wrong to look at financialization. Obviously, unchecked credit creation is a plausible pathway for the rich to make themselves and their friends richer. I just think it was naive not to see financialization — like deunionization, like globalization, and like trends in housing wealth — as part of a broader picture.

My hypothesis is that what changed is that politicians decided that greed was good and that “industrial policy” was a dirty phrase. The political structures that emerged in the wake of the Great Depression and World War 2 — welfare states, nationalized industries, unionized workforces, constrictive financial regulations like Glass-Steagall — which together greatly limited inequality, were severely rolled back. This created an opening for the rich to get much richer very fast, which they did.

If I’m right, it would take a major political shift in the other direction to start reducing inequality.

How To Euthanize Rentiers (Wonkish)

In my last post, I established that the “rentier’s share” of interest — resulting from, as Keynes put it, the “power of the capitalist to exploit the scarcity-value of capital” — can be calculated as the real interest rate on lending to the monetary sovereign, typically known as the real risk-free interest rate. That is because it is the rate left over after deducting for credit risk and inflation risk.
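In symbols (my notation, not Keynes’s): decompose the nominal rate on any loan into a real risk-free component, expected inflation and a credit-risk premium; the rentier’s share is what remains once the latter two are deducted:

```latex
n = r + \pi^{e} + \rho
\qquad\Longrightarrow\qquad
r = n - \pi^{e} - \rho
% n: nominal interest rate
% r: real risk-free rate (the rentier's share)
% \pi^e: expected inflation
% \rho: credit-risk premium (approximately zero for the monetary sovereign)
```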

However, I have been convinced that my conclusion — that euthanizing rentiers should be an objective of monetary policy — is either wrong or impractical.

It would at the very least require a dramatic shift in monetary policy orthodoxy. My initial thought was thus: the real risk-free interest rate (r) can be expressed as the nominal risk-free interest rate (n) minus the rate of inflation (i), so that r = n − i. To eliminate the rentier’s share, simply substitute 0 for r, so that 0 = n − i and thus n = i. In other words, have the central bank target a rate of inflation that offsets the expected future nominal risk-free interest rate, resulting in a future real risk-free interest rate as close to zero as possible.
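As a toy illustration of what such a rule would ask of a central bank (the forecasts are invented, and the function is just the substitution above, not an actual policy rule):

```python
# Toy "euthanize the rentier" targeting rule: set the inflation target
# equal to the expected nominal risk-free rate, so that the expected
# real risk-free rate (the rentier's share) comes out at zero.

def rentier_neutral_inflation_target(nominal_rate_forecast: float) -> float:
    """Inflation target i such that r = n - i = 0."""
    return nominal_rate_forecast

# Invented forecasts of the nominal risk-free rate over three years:
for year, n in enumerate([0.015, 0.020, 0.025], start=1):
    i = rentier_neutral_inflation_target(n)
    print(f"year {year}: inflation target {i:.1%}, expected real rate {n - i:.1%}")
```

Note that the target has to move every time the rate forecast moves, which is precisely the problem with fixed-target orthodoxy discussed below.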

There are some major problems with this. Presently, most major central banks target inflation. But they target a fixed rate of inflation of around 2 percent. The Fed explains its rationale:

Over time, a higher inflation rate would reduce the public’s ability to make accurate longer-term economic and financial decisions. On the other hand, a lower inflation rate would be associated with an elevated probability of falling into deflation, which means prices and perhaps wages, on average, are falling — a phenomenon associated with very weak economic conditions. Having at least a small level of inflation makes it less likely that the economy will experience harmful deflation if economic conditions weaken. The FOMC implements monetary policy to help maintain an inflation rate of 2 percent over the medium term.

Now, it is possible to argue that inflation targets should vary with macroeconomic conditions. For example, if you’re having a problem with deflation and getting stuck in a liquidity trap, a higher inflation target might be appropriate, as Jared Bernstein and Larry Ball argue. And on the other side of the coin, if you’re having a problem with excessive inflation — as occurred in the 1970s — it is arguable that an inflation target lower than 2 percent may be appropriate.

But shifting to a variable rate targeting regime would be a very major policy shift, likely to be heavily resisted simply because the evidence shows that a fixed rate target results in more predictability, and therefore enhances “the public’s ability to make accurate longer-term economic and financial decisions”.

A second sticking point is the argument that such a regime would be trying to target a real variable (the real risk-free interest rate), which central banks have at best a very limited ability to do.

A third sticking point is Goodhart’s Law: “when a measure becomes a target, it ceases to be a good measure.” By making the future spread between the nominal risk-free interest rate and inflation a target, the spread would lose any meaning as a measure.

A fourth sticking point is the possibility that such a severe regime change might create a regime susceptible to severe accelerative macroeconomic problems like inflationary and deflationary spirals.

And in this age of soaring inequality, the euthanasia of the rentier is simply too important an issue to hinge on being able to formulate a new workable policy regime and convince the central banking establishment to adopt it. Even if variable-rate inflation targeting or some alternative was actually viable, I don’t have the time, or the energy, or the inclination, or the expertise to try to do what Scott Sumner has spent over half a decade trying to do — change the way central banks work.

Plus, there is a much better option: make the euthanasia of the rentier a matter for fiscal policy and specifically taxation and redistribution. So here’s a different proposal: a new capital gains tax at a variable rate equal to the real risk-free interest rate, with the proceeds going toward business grants for poor people to start new businesses.
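As a back-of-the-envelope sketch of that proposal (the rates and the gain are invented, and the zero floor for negative real rates is my assumption, not part of the proposal):

```python
# Sketch of a capital gains tax levied at a variable rate equal to the
# real risk-free interest rate (all figures invented for illustration).

def capital_gains_tax(gain: float, nominal_risk_free: float,
                      inflation: float) -> float:
    """Tax owed on a realized gain, at rate r = n - i (floored at zero)."""
    real_risk_free = max(nominal_risk_free - inflation, 0.0)  # assumed floor
    return gain * real_risk_free

# A £50,000 realized gain when the nominal risk-free rate is 4% and
# inflation is 1.5%, giving a 2.5% tax rate:
print(capital_gains_tax(50_000, 0.04, 0.015))  # 1250.0
```

The appeal of the design is that the tax automatically scales with the rentier’s share: when the real risk-free rate is zero, the rentier is already euthanized and the tax vanishes.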

What The UK’s Low Productivity Is Really Telling Us

This, I would argue, is one of the scariest charts in the world today. The green line is output per hour worked, and the dotted green line is the pre-crisis trend:

[Chart: UK output per hour worked, with pre-crisis trend (dotted)]

It’s what the Bank of England calls the “UK productivity puzzle.” As the BBC’s Linda Yueh notes: “output per hour is around 16 percentage points lower than it should be if productivity had grown at its pre-crisis pace.”

I don’t think it should be called a “productivity puzzle”. That would imply that we don’t really understand the phenomenon — that it genuinely is a puzzle. But it’s really a simple phenomenon: people are producing less output per hour than they were before the financial crisis. Work is getting done. But the quality of the work is not improving.
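For what it’s worth, here is a quick sketch of where a figure like Yueh’s comes from; the index values and trend growth rate are stand-ins, not official ONS data:

```python
# How a "productivity gap vs. pre-crisis trend" figure is computed
# (stand-in numbers, not official ONS data).

pre_crisis_growth = 0.022   # assumed pre-crisis trend growth of output per hour
years_since_crisis = 7      # e.g. 2008 through 2015
actual_index = 100.0        # output per hour, roughly flat since the crisis

trend_index = 100.0 * (1 + pre_crisis_growth) ** years_since_crisis
gap = (trend_index - actual_index) / trend_index
print(f"trend {trend_index:.1f} vs. actual {actual_index:.1f}: {gap:.0%} below trend")
# With these stand-ins the gap comes out around 14%; the official series
# puts it nearer 16 percentage points.
```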

The Bank of England points to “reduced investment in both physical and intangible capital, such as innovation, and impaired resource allocation from low to high productive uses” as a cause. In other words, the work is crap because firms aren’t deploying the resources needed to do good work. And this is a trend that predates the election of the Coalition government in 2010. As the Bank of England notes, the UK has lagged behind its fellow G8 economies in investment as a percentage of GDP since at least the 1990s.

But things got really bad under the Coalition. And that shouldn’t really be news. There was a recession resulting from the financial crisis. The recession — as recessions tend to do — resulted in a severe drop in business investment. In the wake of the recession, what did the newly elected government decide to do? It decided to enact sweeping austerity programs — to slash investment even more.

So the story is that the government decided to compound the after-effects of the financial crisis with an austerity program, depriving the economy of even more of the resources needed for productivity, growth and prosperity. And — in truly, truly shocking news — UK investment is currently languishing at a pathetic 15 percent of GDP, behind Belgium, Gambia, Jordan, Equatorial Guinea and Costa Rica, and barely ahead of Greece!

The austerian view, of course, is that the austerity was necessary because otherwise the bond vigilantes would have sold UK public debt, and we would have turned into Greece, or something.

The so-called “productivity puzzle” and the related low-investment puzzle categorically disprove this claim. If austerity were imbuing the market with the confidence necessary for growth, we would expect to see productivity and investment rising.

That has not been the case. What has occurred is a zombie recovery caused by zombified economic policies. Yes, there has been substantial job growth, and GDP is now above its pre-crisis peak — albeit in the slowest recovery since the South Sea bubble 300 years ago. But the weakness in productivity continues to illustrate the rottenness.

You can’t starve yourself to strength. You can’t beat yourself to growth.

Is China’s economy headed for a crash?

In his assessment of the global economy’s performance in 2013, legendary financier George Soros warned of dangers in the Chinese economy:

The major uncertainty facing the world today is not the euro but the future direction of China. The growth model responsible for its rapid rise has run out of steam.

That model depended on financial repression of the household sector, in order to drive the growth of exports and investments. As a result, the household sector has now shrunk to 35 percent of GDP, and its forced savings are no longer sufficient to finance the current growth model. This has led to an exponential rise in the use of various forms of debt financing.

There are some eerie resemblances with the financial conditions that prevailed in the U.S. in the years preceding the crash of 2008. [Project Syndicate]

That, as William Pesek notes, is a rather ominous conclusion. So is China due a crash?

Read More At TheWeek.com

Explaining The WTI-Brent Spread Divergence

Something totally bizarre has happened in the last three years. Oil in America has become much, much cheaper than oil in Europe — American crude now sells at a discount of almost $30 a barrel.

This graph is the elephant in the room:

[Chart: WTI–Brent spread over the past three years]

And this graph shows how truly historic a move this has been:

[Chart: WTI–Brent spread, long-run history]

Why?

The ostensible reason for this is oversupply in America. That’s right — American oil companies have supposedly been producing much, much more than they can sell:

This is hilarious if prices weren’t so damn high, but despite a robust export market for finished products, crude oil is backing up all the way to Cushing, Oklahoma, and is only going to get worse in 2013.

Now that Enterprise Products Partners LLP has let the cat out of the bag that less than a month after expanding the Seaway pipeline capacity to 400,000 barrels per day, The Jones Creek terminal has storage capacity of 2.6 million barrels, and it is basically maxed out in available storage.

But there’s something fishy about this explanation. I don’t know for sure about the underlying causality — and it is not impossible that the oil companies are acting incompetently — but are we really supposed to believe that today’s oil conglomerates in America are so bad at managing their supply chain that they will oversupply the market to such an extent that oil sells at a 25% discount on the price in Europe? Even at an expanded capacity, is it really so hard for oil producers to shut down the pipeline, and clear inventories until the price rises so that they are at least not haemorrhaging such a huge chunk of potential profit on every barrel of oil they are selling? I mean, that’s what corporations do (or at least, what they’re supposed to do) — they manage the supply chain to maximise profit.

To me, this huge disparity seems like funny business. What could possibly be making US oil producers behave so ridiculously, massively non-competitively?

The answer could be government intervention. Let’s not forget that the National Defense Resources Preparedness executive order gives the President and the Department of Homeland Security the authority to:

(c)  be prepared, in the event of a potential threat to the security of the United States, to take actions necessary to ensure the availability of adequate resources and production capability, including services and critical technology, for national defense requirements;

(d)  improve the efficiency and responsiveness of the domestic industrial base to support national defense requirements; and

(e)  foster cooperation between the defense and commercial sectors for research and development and for acquisition of materials, services, components, and equipment to enhance industrial base efficiency and responsiveness.

And the ability to:

(e)  The Secretary of each resource department, when necessary, shall make the finding required under section 101(b) of the Act, 50 U.S.C. App. 2071(b).  This finding shall be submitted for the President’s approval through the Assistant to the President and National Security Advisor and the Assistant to the President for Homeland Security and Counterterrorism.  Upon such approval, the Secretary of the resource department that made the finding may use the authority of section 101(a) of the Act, 50 U.S.C. App. 2071(a), to control the general distribution of any material (including applicable services) in the civilian market.

My intuition is that it is possible that oil companies may have been advised (or ordered) under the NDRP (or under the 1950 Defense Production Act) to keep some slack in the supply chain in case of a war, or other national or global emergency. This would provide a capacity buffer in addition to the Strategic Petroleum Reserve.

If that’s the case, the question we need to ask is what does the US government know that other governments don’t? Is this just a prudent measure to reduce the danger of a resource or energy shock, or does the US government have some specific information of a specific threat?

The other possible explanation, of course, is ridiculous incompetence on the part of US oil producers. Which, I suppose, is almost believable in the wake of Deepwater Horizon…

America Loves Drone Strikes

This graph shows everything we need to know about the geopolitical reality of Predator Drones (coming soon to the skies of America to hunt down fugitives?).

The American public loves drone strikes:

[Chart: US public opinion polling on drone strikes]

The American public does not approve of the extrajudicial killing of American citizens. But for everyone else, it’s open season.

But everyone else — most particularly and significantly, the countries in the Muslim world — largely hates and resents drone strikes.

And it is the Muslim world that produces the radicalised extremists who commit acts like 9/11, 7/7, the Madrid bombings, and the Bali bombings. With this outpouring of contempt for America’s drone strikes, many analysts are coming to believe that Obama’s drone policy is now effectively a recruitment tool for al-Qaeda, the Taliban and similar groups.


Indeed, evidence is beginning to coalesce to suggest exactly this. PressTV recently noted:

The expanding drone war in Yemen, which often kills civilians, does in fact cause blowback and help al-Qaeda recruitment – as attested to by numerous Yemen experts, investigative reporting on the ground, polling, testimony from Yemen activists, and the actual fact that recent bungled terrorist attacks aimed at the U.S. have cited such drone attacks as motivating factors.

After another September drone strike that killed 13 civilians, a local Yemeni activist told CNN, “I would not be surprised if a hundred tribesmen joined the lines of al-Qaeda as a result of the latest drone mistake. This part of Yemen takes revenge very seriously.”

“Our entire village is angry at the government and the Americans,” a Yemeni villager named Mohammed told the Post. “If the Americans are responsible, I would have no choice but to sympathize with al-Qaeda because al-Qaeda is fighting America.”

Many in the U.S. intelligence community also believe the drone war is contributing to the al-Qaeda presence in Yemen. Robert Grenier, who headed the CIA’s counter-terrorism center and was previously a CIA station chief in Pakistan, told The Guardian in June that he is “very concerned about the creation of a larger terrorist safe haven in Yemen.”

“We have gone a long way down the road of creating a situation where we are creating more enemies than we are removing from the battlefield,” he said regarding drones in Yemen.

Iona Craig reports that civilian casualties from drone strikes “have emboldened al-Qaeda” and cites the reaction to the 2009 U.S. cruise missile attack on the village of al-Majala in Yemen that killed more than 40 civilians (including 21 children):

“That one bombing radicalized the entire area,” Abdul Ghani al-Iryani, a Yemeni political analyst, said. “All the men and boys from those families and tribes will have joined [al-Qaeda] to fight.”

And al-Qaeda’s presence and support in Yemen has grown, not shrunk since the start of the targeted killing program:

Meanwhile Yemen Central Security Force commander Brig. Gen. Yahya Saleh, nephew of ousted president Ali Abdullah Saleh, told Abdul-Ahad that al-Qaeda has more followers, money, guns and territory than they did a year and a half ago.

All at a time when Yemen is facing a “catastrophic” food crisis, with at least 267,000 children facing life-threatening levels of malnutrition. Hunger has doubled since 2009, and the number of displaced civilians is about 500,000 and rising.

As U.S. drones drop bombs on south Yemen villages and AQAP provides displaced civilians with “free electricity, food and water,” tribes in the area are becoming increasingly sympathetic to AQAP.

Let’s be intellectually honest. If a country engages in a military program that carries out strikes that kill hundreds of civilians — many of whom have no connection whatsoever to terrorism or radicalism — that country is going to become increasingly hated. People in the countries targeted — those who may have lost friends or family members — are going to plot revenge, and take revenge. That’s just how war works. It infuriates. It radicalises. It instils hatred.

The reality of Obama’s drone program is that it is creating new generations of America-hating radicalised individuals, who may well go on to become the next Osama bin Laden, the next Ayman al-Zawahiri, the next Abu Musab al-Zarqawi. The reality of Obama’s drone program is that it is sowing the seeds of the next 9/11 — just as American intervention in the Middle East sowed the seeds of the last, as Osama bin Laden readily admitted.

The Decline and Fall of the American Empire

Does the hypochondriac who is ultimately diagnosed with a real, physiological illness have the right to say “I told you so”?

Well, maybe. Sometimes a “hypochondriac” might be ill all along, but those diagnosing him just did not conduct the right test, or look at the right data. Medical science and diagnostics are nothing like as advanced as we like to hope. There are still thousands of diseases and ailments which are totally unexplained. Sometimes this means a “hypochondriac” might be dead or comatose before he ever gets the chance to say “I told you so.”

Similarly, there are many who suggest that their own nations or civilisations are in ailing decline. Some of them might be crankish hypochondriacs. But some of them might be shockingly prescient.

Is Marc Faber being a hypochondriac in saying that the entire derivatives market is headed to zero? Maybe. It depends whether his analysis is proven correct by events. I personally believe that he is more right than he is wrong: the derivatives market is deeply interconnected, and counter-party risk really does threaten to destroy a huge percentage of it.

More dangerous to health than hypochondria is what I might call hyperchondria.


This is the condition under which people are unshakeably sure that they are fine. They might sustain a severe physical injury and refuse medical treatment. They brush off any and all sensations of physical illness. They suffer from an interminable and unshakeable optimism. Government — or, at least, the public face of government — is littered with them. John McCain blustered that the economy was strong and robust — until he had to suspend his presidential campaign to return to Washington to vote for TARP. Tim Geithner stressed there was “no chance of a downgrade” — until S&P downgraded U.S. debt. Such is politics — politicians like to exude the illusion of control. So too do economists, if they become too politically active. Ben Bernanke boasted he could stanch inflation in “15 minutes”.

So, between outsiders like Ron Paul who have consistently warned of the possibility of economic disaster, and insiders like Ben Bernanke who refuse to conceive of such a thing, where can we get an accurate portrait of the shape of Western civilisation and the state of the American empire?

Professor Alfred McCoy — writing for CBS News — paints a fascinating picture:

A soft landing for America 40 years from now?  Don’t bet on it.  The demise of the United States as the global superpower could come far more quickly than anyone imagines.  If Washington is dreaming of 2040 or 2050 as the end of the American Century, a more realistic assessment of domestic and global trends suggests that in 2025, just 15 years from now, it could all be over except for the shouting.

Despite the aura of omnipotence most empires project, a look at their history should remind us that they are fragile organisms. So delicate is their ecology of power that, when things start to go truly bad, empires regularly unravel with unholy speed: just a year for Portugal, two years for the Soviet Union, eight years for France, 11 years for the Ottomans, 17 years for Great Britain, and, in all likelihood, 22 years for the United States, counting from the crucial year 2003.

Future historians are likely to identify the Bush administration’s rash invasion of Iraq in that year as the start of America’s downfall. However, instead of the bloodshed that marked the end of so many past empires, with cities burning and civilians slaughtered, this twenty-first century imperial collapse could come relatively quietly through the invisible tendrils of economic collapse or cyberwarfare.

But have no doubt: when Washington’s global dominion finally ends, there will be painful daily reminders of what such a loss of power means for Americans in every walk of life. As a half-dozen European nations have discovered, imperial decline tends to have a remarkably demoralizing impact on a society, regularly bringing at least a generation of economic privation. As the economy cools, political temperatures rise, often sparking serious domestic unrest.

Available economic, educational, and military data indicate that, when it comes to U.S. global power, negative trends will aggregate rapidly by 2020 and are likely to reach a critical mass no later than 2030. The American Century, proclaimed so triumphantly at the start of World War II, will be tattered and fading by 2025, its eighth decade, and could be history by 2030.

Significantly, in 2008, the U.S. National Intelligence Council admitted for the first time that America’s global power was indeed on a declining trajectory. In one of its periodic futuristic reports, Global Trends 2025, the Council cited “the transfer of global wealth and economic power now under way, roughly from West to East” and “without precedent in modern history,” as the primary factor in the decline of the “United States’ relative strength — even in the military realm.” Like many in Washington, however, the Council’s analysts anticipated a very long, very soft landing for American global preeminence, and harbored the hope that somehow the U.S. would long “retain unique military capabilities… to project military power globally” for decades to come.

No such luck.  Under current projections, the United States will find itself in second place behind China (already the world’s second largest economy) in economic output around 2026, and behind India by 2050. Similarly, Chinese innovation is on a trajectory toward world leadership in applied science and military technology sometime between 2020 and 2030, just as America’s current supply of brilliant scientists and engineers retires, without adequate replacement by an ill-educated younger generation.

Wrapped in imperial hubris, like Whitehall or Quai d’Orsay before it, the White House still seems to imagine that American decline will be gradual, gentle, and partial. In his State of the Union address last January, President Obama offered the reassurance that “I do not accept second place for the United States of America.” A few days later, Vice President Biden ridiculed the very idea that “we are destined to fulfill [historian Paul] Kennedy’s prophecy that we are going to be a great nation that has failed because we lost control of our economy and overextended.” Similarly, writing in the November issue of the establishment journal Foreign Affairs, neo-liberal foreign policy guru Joseph Nye waved away talk of China’s economic and military rise, dismissing “misleading metaphors of organic decline” and denying that any deterioration in U.S. global power was underway.

Frankly — given how deeply America is indebted, given that crucial American military and consumer supply chains are controlled by China, given how dependent America is on foreign oil for transport and agribusiness — I believe that the end of American primacy by 2025 is an extraordinarily optimistic estimate. The real end of American primacy may have been as early as 9/11/2001.