Ben Bernanke, in his final press conference as chairman of the Federal Reserve, announced today that the central bank would taper asset purchases to $75 billion a month, down from $85 billion, a move widely seen as a modest first step toward reducing the Fed’s outsized role in financial markets and the economy.
The move caught many economists by surprise — a USA Today survey found that most economists polled said the Fed would maintain its current levels of quantitative easing, as the policy is known, before trimming down in January.
After the financial crisis in 2008, spooked investors started piling into low-risk assets like Treasuries, driving prices dramatically higher. The Fed’s aim in buying these assets was to take safe investments like Treasuries off the market, in order to encourage investors to take more risk and invest in higher-yielding and more productive ventures like stocks, equipment, and new employees.
The ultimate objective was more jobs, and more economic activity.
J.P. Morgan Chase is nearing a settlement with federal regulators over the bank’s ties to convicted fraudster Bernie Madoff, reports The New York Times. The deal would involve penalties of up to $2 billion and a rare criminal action. The government intends to use the money to compensate Madoff’s victims.
For two decades before his arrest, Madoff had banked with J.P. Morgan, apparently laundering up to $76 billion through the bank. Employees at the bank had raised concerns about Madoff’s business: in 2006, after studying some of Madoff’s trading records, a J.P. Morgan employee wrote that “I do have a few concerns and questions,” and expressed worry that Madoff would not disclose exactly which trades he had made. Madoff’s company turned out to be an elaborate Ponzi scheme that stole an estimated $18 billion from clients; it collapsed in 2008.
Is it fair to blame J.P. Morgan for the activities of Madoff? Do banks have a responsibility to know if their clients are involved in criminal activities? I think so — banks should have strong checks and balances to prevent fraud and money laundering, because if they don’t then criminals like Madoff can get away with it for years and years. According to Robert Lenzner of Forbes, “J.P. Morgan never reported to the Treasury or the Federal Reserve a huge cache of checks going back and forth for seven years between Madoff’s Investment Account 703 and Bank Customer Number One, belonging to real estate developer Norman Levy, who died in 2005.”
By agreeing to pay the fine and accept the government’s rebuke, J.P. Morgan is admitting a failure of oversight. But it’s not as if J.P. Morgan is the only one to blame. Others on Wall Street had expressed concern about Madoff’s business much earlier.
But isn’t there a better way to increase a borrowing limit — and one that doesn’t freak out markets, investors, and, well, just about everyone every few months?
Brower claims his actions are a product of his frustration over the existence of homelessness in his district, telling Hawaii News Now, “I got tired of telling people I’m trying to pass laws. I want to do something practical that will really clean up the streets.”
Brower also wakes those he finds sleeping and tells them to sleep somewhere else. “If someone is sleeping at night on the bus stop, I don’t do anything, but if they are sleeping during the day, I’ll walk up and say, ‘Get your ass moving,’” he said.
An impressive video featuring former Treasury Secretary Larry Summers has been making the rounds.
Summers makes the case that the United States and other Western nations may have reached a state of permanent stagnation in growth and employment. In Japan, per capita incomes grew strongly until the 1990s, and since then they have been growing very weakly and intermittently. Summers cites Japan as an early example of what might occur elsewhere.
Japan’s stagnation is shocking — today, the Japanese economy is only half the size economists in the 1990s predicted it would be if it had continued on its pre-1990s growth trend. As Summers notes, in the U.S., growth is also well below its pre-crisis trend, and unemployment remains persistently high. More than 12 million people who want work and are actively looking cannot find it. That’s a very ugly situation.
Under normal conditions, central banks can lower interest rates on lending to banks as a way to encourage activity and fight unemployment. Lower rates make business projects easier to afford, and more business projects should mean more jobs. If an economic shock pushes the unemployment rate up, central banks can lower lending rates to ease conditions. And conversely, if economic conditions are overheating and inflation is pushing up above the Federal Reserve’s target of 2 percent, interest rates can be hiked to encourage saving and discourage spending.
Yet in the current slump, unemployment has remained elevated even though interest rates have been close to zero for four years and inflation has remained contained. This suggests that the interest rate level required to bring unemployment down significantly is actually below zero. Summers agrees:
Suppose that the short-term real interest rate that was consistent with full employment had fallen to negative 2 percent or negative 3 percent sometime in the middle of the last decade.
But central banks can’t lower interest rates below zero percent because people can just hold cash instead. Why invest if you’re going to lose money doing so?
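A back-of-the-envelope way to see how the rate “consistent with full employment” can go negative is a Taylor-type rule. This is a standard textbook formula, not Summers’ own calculation, and the slump-scenario inflation and output-gap figures below are purely illustrative:

```python
def taylor_rule(inflation, output_gap, neutral_real_rate=2.0, target_inflation=2.0):
    """Classic Taylor (1993) rule: the nominal policy rate implied by
    current inflation and the output gap (all figures in percent)."""
    return (neutral_real_rate + inflation
            + 0.5 * (inflation - target_inflation)
            + 0.5 * output_gap)

# Normal times: inflation at target, economy at potential -> ~4% nominal rate.
print(taylor_rule(inflation=2.0, output_gap=0.0))   # 4.0

# Deep slump: low inflation plus a large negative output gap push the
# implied rate below zero -- a rate the central bank cannot actually set.
print(taylor_rule(inflation=1.0, output_gap=-8.0))  # -1.5
```

The second scenario is the zero-lower-bound problem in miniature: the rule prescribes a negative nominal rate, but cash yields zero, so conventional policy runs out of room.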
When I see discussion of Obamacare in the media and especially on blogs, I often come across the impression that Obamacare is a communist scheme to impose socialised medicine on the United States:
Actually, Obamacare was first dreamt up by the conservative Heritage Foundation, and first implemented at the state level by the Republican former Massachusetts governor Mitt Romney. (And for what it’s worth, I wrongly judged Republican opposition to Obamacare as an immovable obstacle in Romney’s quest to become the Republican Presidential nominee, but I guess Republicans were far more fickle than I thought.) So as its origin implies, Obamacare is not exactly a communist, or even social democratic, idea. A charge of socialism or communism might be more fairly levelled against Obamacare if it were a law to confiscate all hospitals, drug companies, biotechnology companies and insurance companies from private hands. But it does no such thing. The opposite, in fact. More principled critics of Obamacare might more accurately describe it as corporatist — guaranteeing revenue streams for the insurance industry through the individual mandate — but that has not exactly been the Republican Party’s line of attack.
Given that opposition by the Republican-controlled House to Obamacare is the most significant cause of the current government shutdown, it is worthwhile looking over how Americans actually feel about the law, not least to gauge the extent to which Americans may or may not support the Republicans now that their opposition to Obamacare is having real consequences.
It has long been said that Obamacare is unpopular, and the polls bear this out. A September CNN/ORC poll showed that Obamacare was supported by 43% of respondents, and opposed by 51%. But here’s the catch: 16% of respondents opposed Obamacare for not being liberal enough. Presumably, they would prefer a single-payer system, as is the reality throughout most of Europe and Canada. (Of course, a move to such a system might be more fairly described as socialist, but that is another argument for another day.) A sizeable number want something more liberal than Obamacare, and so would presumably prefer Obamacare to the status quo, even if they still claim to oppose it. So the consensus is actually against the Republican position by 59% to 35%. And that is why opposing Obamacare in this fight-to-the-death manner will be received negatively by a majority of Americans. Only 35% of Americans are against Obamacare because it is too liberal, and even then a substantial number of those — such as seniors who receive government benefits, or poor rural Republicans receiving food stamps — may be against shutting down the government to fight Obamacare. The Republicans are fighting a losing fight, and as the shutdown grinds on may be doing irreparable damage to their 2014 election prospects.
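The arithmetic behind that 59-to-35 split can be laid out explicitly, using the CNN/ORC figures quoted above:

```python
# Headline CNN/ORC poll figures (percentages of respondents).
support = 43
oppose = 51

# Of the opponents, 16 points oppose the law for not being liberal
# enough, and would presumably prefer Obamacare to the status quo.
oppose_not_liberal_enough = 16

# Regroup: those who want Obamacare or something more liberal,
# versus those who oppose it for being too liberal.
prefer_obamacare_or_more = support + oppose_not_liberal_enough   # 43 + 16 = 59
oppose_too_liberal = oppose - oppose_not_liberal_enough          # 51 - 16 = 35

print(prefer_obamacare_or_more, oppose_too_liberal)  # 59 35
```

The headline numbers make the law look unpopular, but regrouping by preferred direction of change flips the majority against the Republican position.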
More generally, I find it rather puzzling that Republicans — convinced Obamacare will fail disastrously — are going to such lengths to oppose it. Like Prohibition before it, Obamacare is now law, and if it is destined to precipitate disaster — by increasing unemployment, by increasing healthcare costs, by increasing strain on the healthcare system, or by any other means — then it will be quickly rejected and repealed in the future.
George Osborne just came out in favour of counter-cyclical policy — saving more in the boom, and spending more during a “rainy day”. This is consistent with John Maynard Keynes’ notion that “the time for austerity at the Treasury is the boom, not the slump”.
The thing is, George Osborne seems to believe that right now we are moving toward a boom and need to adopt the policies of the boom:
Chancellor George Osborne has said he wants the government to be running a surplus in the next Parliament and can get there without raising taxes.
He told the Conservative conference the public finances should be in the black when the economy was strong as insurance against a “rainy day”.
His comments were taken as suggesting more years of spending restraint.
Business welcomed the goal but Labour said Mr Osborne had missed targets before and could not be trusted.
The BBC News Channel’s chief political correspondent Norman Smith said Mr Osborne’s underlying message was that austerity would continue after the next election despite the return to growth.
If 7.8% unemployment and a real economy smaller than it was five years ago don’t constitute a rainy day, I’d like to know what does. To me, and to many economists, this kind of thing doesn’t just constitute a rainy day; it constitutes a full-blown depression. Sooner or later, the economy will return to growth and full employment. With the right luck — technology breakthroughs and other exogenous shocks — that could be two or five years from now. The experience of Japan, however, which has endured a 20-year depression, suggests that it could be much later rather than sooner.
The safer alternative is to use fiscal policy — as Osborne himself implies — during the rainy day to directly bring back full employment sooner rather than later, by engaging in infrastructure projects and the like. Even if a government hasn’t saved money during the boom, interest rates are so low during the slump that borrowing to do so is cheap, even in the context of soaring national debt levels, as is the case in Japan today.
Getting the economy to a point where the government can run a budget surplus, of course, is still a noble ambition. But Osborne has shown no awareness whatever of the steps that need to be taken to get to that position. Infrastructure and housing investment and a jobs program would be a start. So too would liberalising planning laws and lending to and deregulating business startups so that more houses can get built and more businesses can get started. For now, Osborne is preaching responsibility while doing something deeply irresponsible — prolonging a depression with unnecessary demand-sucking job-killing austerity. The boom, not the slump, is the time for austerity at the Treasury and this (for the love of God) is not the boom.
Paul Krugman says that the notion that the weak economy is due to policy uncertainty has been thoroughly debunked. The Stanford/Chicago uncertainty index has fallen considerably:
Without any corresponding boost to job growth:
While policy uncertainty is concerned with policy in general, and not executive policy in particular, Krugman’s analysis is that “policy uncertainty” is a thinly-veiled attempt to blame Obama for the sluggishness of the recovery:
One of the remarkable things about the ongoing economic crisis is the endless search for explanations of something that’s actually quite simple — the sluggish pace of recovery. You have a large overhang of private debt; you have a still-depressed housing sector; and you have contractionary fiscal policy. Add to this the well-established fact that recovery tends to be slow after recessions caused not by tight money but by private-sector overreach, and there’s just no mystery that needs explaining.
Yet we’ve seen an endless series of analyses declaring that there is indeed a deep mystery, and it must be Obama’s Fault. Probably the most influential of these analyses was the claim that Obama was creating “uncertainty”, and this was holding everything back.
This crude notion of policy uncertainty is often attached to the notion of the Confidence Fairy; the idea that by running large deficits, government is crowding out private investment due to fears of future tax increases. The corollary of the Confidence Fairy view is that the only way to bring back private investment is to have large-scale austerity, to solidify expectations of lower future taxes. This view has been the basis for David Cameron’s economic policy in the UK, which can only be soberly judged as a large-scale failure.
Krugman is right to trash the Confidence Fairy — austerity at this point in the business cycle is a catastrophic error, because it sucks money out of the real economy. And he’s also right to trash those who view the sluggishness of the recovery as solely Obama’s fault. But he’s wrong, I think, to throw policy uncertainty out of the window entirely as a proximate cause of some of the problems we’re now facing.
Broadly, policy uncertainty goes both ways. That is simply because not all entrepreneurs in the private sector are looking for or worrying about tax cuts. People are heterogeneous. While there are some entrepreneurs worried about the future trajectory of taxes, many other entrepreneurs may be hoping for fiscal stimulus either because they would expect to receive orders from the government (for example, construction firms, defence contractors, universities, energy companies) or because they would be hoping that with stimulus, more people would have money in their pockets and they would be spending it.
While this, of course, cannot explain the crisis itself, nor the long and slow deleveraging since, having a deadlocked Congress erring on the side of austerity could be a major headache for many private enterprises. The fact that the more severe austerity experienced in Europe and Britain has actually led to bigger budget deficits there could breed even deeper uncertainty for businesses. Put more simply, many businessmen could be reading Paul Krugman and others like him, agreeing with their interpretations, and worrying about the confused and deadlocked approach that the Federal government has taken to the post-2007 economy, and about the dangers of austerity. This could contribute just as much to the post-2007 uptick in policy uncertainty measured by the Stanford/Chicago index as Wall Street Journal-reading Republicans worrying about the Confidence Fairy and taxes.
While gold is widely held as a store of purchasing power, and while it is possible to use gold as a unit of account (by converting its floating value to denominate anything in gold terms), gold is no longer widely used as a medium of exchange.
In the days when people carried around gold doubloons and whatnot as money, you had a global political system characterized by pockets of stability (the Spanish Empire, or the Chinese Empire, or whatever) scattered among large areas of anarchy. Those stable centers minted and gave out the gold coins. But in the event of a massive modern global catastrophe that brought widespread anarchy, the gold bars buried in your backyard would not be swappable for eggs or butter at the corner store. You’d need some big organization to turn the gold bars into coins of standard weights and purity. And that big organization is not going to do that for you as a free service. More likely, that big organization will simply kill you and take your gold bars, Dungeons and Dragons style.
In other words, I think gold is never coming back as a medium of exchange, under any circumstances. It is no more likely than a return of the Holy Roman Empire. Say goodbye forever to gold money.
Well, forever is a very long time. The human lineage stretches back some six million years, but recorded history suggests that gold has only been used as a medium of exchange for five or six thousand of them. Yet for that tiny sliver of human history, gold became for many cultures entirely synonymous with money, and largely synonymous with wealth. So I think Noah is over-egging his case by using the word forever. Societies have drastically changed in the last six thousand years, let alone the last one hundred. We don’t know how human culture and technology and societies will progress in the future. As humans colonise space, we may see a great deal of cultural and social fragmentation; deeper into the future, believers in gold as money may set up their own planetary colonies or space stations.
But what about the near future? Well, central banks are still using gold as a reserve. In the medium term, it is a hedge against the counter-party risks of a global fiat reserve system in flux. But central banks buying and acquiring gold is not the same thing as gold being used as a medium of exchange. Gold as a reserve never went away, and even in the most Keynesian of futures may not fully die for a long time yet.
And what about this great hypothetical scenario that many are obsessed with, where the fragile interconnected structure of modern society — including electronics — briefly or not-so-briefly collapses? Such an event could result from a natural disaster like a megatsunami, or extreme climate change, or a solar flare, or from a global war. Well, again, we can’t really say what will or won’t be useful as a medium of exchange under such circumstances. My intuition is that we would experience massive decentralisation, and that exchange would be conducted predominantly through barter — and theft. If you have gold coins or bars, and want to engage in trade using them — and have a means to protect yourself from theft, like guns and ammunition — then it is foreseeable that these could be bartered. But so too could whiskey, cigarettes, beer, canned food, fuel, water, IOUs and indeed state fiat currencies. If any dominant medium of exchange emerges, it is likely to be localised and ad hoc. In the longer run, if modern civilisation does not return swiftly but instead has to be rebuilt from the ground up over generations, then it is foreseeable that physical gold (and other precious metals, including silver) could emerge as the de facto medium of exchange, simply because such things are nonperishable, fungible, and relatively difficult to fake. On the other hand, if modern civilisation is swiftly rebuilt, then it is much more foreseeable that precious metal-based media of exchange will not have the time to get off the ground on anything more than the most localised and ad hoc of bases.
So when does gold actually pay off? Well, remember that stories do not have to be true for people to believe them. Lots and lots of people believe that gold or gold-backed money will make a comeback in the event of a global social disruption. And so when this story becomes more popular (possibly with the launching of websites like Zero Hedge?), or when large-scale social disruption seems more likely while holding the popularity of the story constant, gold pays off. Gold is like a credit default swap backed by an insolvent counterparty – it has no hope of actually being redeemed, but you can keep it around forever, and it goes up in price whenever people get scared.
In other words, gold pays off when there is an outbreak of goldbug-ism. Gold is a bet that there will be more goldbugs in the future than there are now. And since the “gold will be money again” story is very deep and powerful, based as it is on thousands of years of (no longer applicable) historical experience, it is highly likely that goldbug-ism will break out again someday. So if you’re the gambling type, or if you plan to start the next Zero Hedge, or if your income for some reason goes down when goldbug-ism breaks out, well, go ahead and place a one-way bet on gold.
Noah, of course, is right that gold is valuable when other people are willing to pay for it. The reason why gold became money in the first place was because people chose to use it as a medium of exchange. They liked it, and they used it, and that created demand for it. If that happens again, then gold will be an in-demand medium of exchange again. But for many reasons — including that governments want monetary flexibility — most of the world today has rejected gold as a medium of exchange.
But there is another pathway for gold to pay off. Noah is overlooking the small possibility that gold may at some point become more than a speculative investment based on the future possibility of its return as a monetary medium. In 2010, scientists from the Brookhaven National Laboratory on Long Island, using the Relativistic Heavy Ion Collider (RHIC), collided gold nuclei traveling at 99.999% of the speed of light. The plasma that resulted was so energetic that a tiny cube of it with sides measuring about a quarter of the width of a human hair would contain enough energy to power the entire United States for a year. So there exists a possibility that gold could be used at some date in the future as an energy source — completely obliterating any possibility of gold becoming a medium of exchange again. Of course, capturing and storing that energy is another matter entirely, and may prove impossible. In that case — if gold does not become a valuable energy source — it is almost inevitable that some society somewhere at some stage will experiment again with gold as a medium of exchange.