Sunday, December 30, 2012

The New Depression?

In my final post of 2012, I will draw on insights from three heterodox economists to critically examine the state of the global economy. First, Richard Duncan, author of The New Depression, was recently interviewed by the New Left Review (h/t, Fred Lecoq):
You were one of the very few analysts to predict the full enormity of the financial crisis, writing as early as 2003 of a coming credit crunch that would have ramifications throughout the asset-backed securities sector, necessitating giant bail-outs for Fannie Mae, Freddie Mac and financial-insurance companies, and a possible meltdown in the multi-trillion-dollar derivatives market. This prescience was in stark contrast to the complacency of most mainstream economists.
Could you describe how you came to write The Dollar Crisis—what was the course of your intellectual development and what did you learn from your experience as a Far East securities analyst?

I grew up in Kentucky and went to Vanderbilt University. My plan was to go to law school, but I didn’t get in. Plan B was to go to France for a year, picking grapes. I got a job as a chauffeur in Paris, driving rich Americans, and made enough money to backpack around the world for a year, in 1983 and 84. So I was lucky enough to see the world when I was very young. I spent a couple of months in Thailand, Malaysia and Singapore—and even a couple of months there was long enough to realize: go east, young man.

Go east, because?

Economic opportunity. It was obviously booming—there were big skyscrapers going up, and people couldn’t read maps of their own street. So I went back to business school in Boston, at a time when there was of course very little economic growth in the United States. When I finished business school, going to Asia seemed the obvious thing to do. I found a job in Hong Kong, as a securities analyst with a local, Hong Kong–Chinese stock-broking company. This was 1986. In the first twelve months I was there, the Hong Kong stock market doubled—then I woke up one morning and learned that Wall Street had fallen 23 per cent overnight, and Hong Kong immediately fell back to where it had started. By 1990 I had joined James Capel, the oldest and largest UK stock-broking company at that time, and they sent me to Thailand to manage their research department there. We had ten analysts watching all the companies on the Bangkok stock market. At first, there really was something of a Thai miracle—the growth was solid and fundamental. But very quickly, by 1994, it was obviously a bubble and I started being bearish on the market. I wasn’t saying it was going to collapse, but the growth was going to slow down. But it just kept accelerating, and the bubble turned into a balloon. When it did finally pop, in 1997, Thailand’s GDP contracted by 10 per cent and the stock market fell 95 per cent in dollar terms, top to bottom.

So I witnessed at close quarters a very big boom-and-bust cycle, over a very short period of time. And while I was wrong for several years, I had plenty of time to think about why I was wrong. I started reading a lot of macro-economics: Keynes, Schumpeter, Milton Friedman’s monetary history of the US, the classic works. There was also a sort of lightning-flash moment, around 1994. Five years earlier I had taken a group of fund managers on a trip around the Pearl River Delta, from Hong Kong up to Canton, and back down the other side to Macao. What we saw, all along this vast delta, were miles and miles of factories, as far as the eye could see, full of nineteen-year-old girls earning $3 a day. It was in 1994 that the meaning of this really became clear to me: globalization was not going to work. The US would have a bigger and bigger trade deficit, and the American economy would continue to be hollowed out. It was unsustainable—the demographics made it impossible for this system to work. The Dollar Crisis, which came out in 2003, examined the way those global imbalances were blowing bubbles in the trade-surplus economies, and how the money boomeranged back into the US. I came to see that the unlimited credit expansion enabled by the post-gold, post-Bretton Woods international monetary system was where it all began.

Yet you’re not advocating a return to gold?

No. That is, I think that if the US had remained on the gold standard, it wouldn’t now be teetering on the edge of collapse. The global economy would be much smaller than it is; China would look nothing like it does. There would have been much less growth, but it would have been more stable. But now that we’re here, there’s no going back. If the US was to go back, the sort of deflation that would be required to take us there would be absolutely unbearable—like 1926 in Britain. But it’s important to understand what the effects have been of abandoning the automatic adjustment mechanisms inherent in the gold-linked Bretton Woods system and the classical, pre-1914 gold standard—they automatically served to correct large-scale trade imbalances and government deficits. Officially, the international monetary system that emerged after 1973 and the breakdown of Bretton Woods still doesn’t have a name. In the book I called it the ‘dollar standard’, because the US dollar became the medium for the world’s reserve assets, in place of gold. The Dollar Crisis focused on how this system had enabled worldwide credit bubbles to be created. Total international reserves, the best measure of global money supply, soared by almost 2,000 per cent between 1969 and 2000 (Figure 1), with the central banks creating paper money on an unprecedented scale.

The quantity of US dollars in circulation soared (Figure 2). One of the main features of the dollar standard is that it allows the US to incur a huge current-account deficit, as it pays for its imports in dollars—of which the Federal Reserve can print as many as it needs, without having to back them with gold—and then gets these dollars back from its trading partners when they invest them in dollar-denominated assets—Treasury bonds, corporate bonds, equity, mortgage instruments—as they must do, if they are to earn any interest on them. The French economist Jacques Rueff once compared this process to a game of marbles in which, after each round, the winners give their marbles to the losers. The larger the US current-account deficit has become, the larger the amount of dollars that wash back into the US through its equally vast financial-account surplus (Figure 3). The other option for America’s trading partners—the one US pundits are always calling for—would be to exchange the dollars they’re earning for their own currency, which would drive up its value and thereby make their exports too expensive for the US market, knocking them out of the game.
The post-Bretton Woods era had been plagued by financial crises long before 2008—Latin America in the 1980s, Japan in 1990, Scandinavia in 1992, the Asian Crisis of 1997, Russia, Argentina, Brazil, the dot-com bust. What is your explanation for this?

The Austrian economists were basically right in their understanding of the role credit plays. As long as it is expanding, credit will create an artificial boom, driving an upward spiral of economic growth and inflating asset prices, which create further collateral for yet more credit expansion. But the day always comes when ever-faster economic overheating and rising asset prices outstrip the growth of wages and incomes, to such an extent that these can no longer service the interest on the credit. Bubbles always pop and when that happens, it all begins to spiral into reverse: falling consumption, falling asset prices, bankruptcies, business failures, rising unemployment and a financial sector left in tatters. The depression begins—which, according to the Austrians, is the period in which the economy returns to some sort of pre-credit equilibrium. Nothing drops forever; at some point the asset price comes more closely in line with the income of the public, and the economy stabilizes. What changed under the ‘dollar standard’ was the advent of vastly greater quantities of credit, creating harder and faster boom-and-bust cycles. In fact the first boom-and-bust crisis of the post-Bretton Woods era was sparked off in the 1970s, when the New York banks recycled petro-dollars from the OPEC states as loans to South American and African countries, flooding their economies with credit. When the ‘miracle’ booms deflated into busts, this created the Third World debt crisis of the 1980s.

But destabilizing credit creation really took off once the US started to run current-account deficits of over $100 billion, from the early 1980s; a few years later it began running large government budget deficits, too, which it could fund through the resulting financial-account inflows. It could run the deficits because it could print all the dollars it needed. As these dollars entered the banking systems of countries with a current-account surplus against the US, they acted as ‘high-powered money’—that is, the original amount could be lent and re-lent by the banks, many times over—setting off an explosion of credit creation that would generate economic overheating and soaring asset prices, first in Japan in the 1980s, then in the ‘Asian Tiger’ economies in the 90s. In countries like Thailand, in particular, inflows of ‘hot’ capital attracted by the initial growth served to blow the credit bubble even bigger. Eventually, over-investment produced over-capacity and over-supply, followed by a downward spiral of falling profits, bankruptcies and stock-market crashes, leaving their banks laden with non-performing loans and their governments deep in debt. After the 1997 Asian Crisis, a surge of capital inflows washed back into the US, creating the ‘new economy’ stock-market bubble and credit boom there.
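The ‘high-powered money’ mechanism described here follows the textbook money-multiplier logic: each dollar of inflow is lent, partially re-deposited, and re-lent. A minimal sketch, with an assumed 10 per cent reserve ratio (an illustrative figure, not one from the interview):

```python
# Money-multiplier sketch: an initial inflow is lent and re-lent,
# with a fixed fraction held back as reserves at each round.
def total_credit(initial_inflow: float, reserve_ratio: float, rounds: int = 500) -> float:
    """Cumulative deposits created as the inflow cycles through the banks."""
    total, tranche = 0.0, initial_inflow
    for _ in range(rounds):
        total += tranche
        tranche *= (1.0 - reserve_ratio)  # remainder re-lent next round
    return total

# $100bn entering the banking system at a 10% reserve ratio approaches
# the closed-form limit initial_inflow / reserve_ratio = $1,000bn.
print(round(total_credit(100.0, 0.10), 1))  # -> 1000.0
```

The geometric series makes clear why Duncan says the original amount could be "lent and re-lent, many times over": the lower the reserves held back, the larger the eventual credit expansion.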

Now, there’s no doubt that Japan, for instance, derived tangible economic benefits from its export-led growth. Without the purchasing power that came from its trade surpluses with the US, its economy would have grown at a much slower rate through the 60s and 70s. But what’s less appreciated is the expansionary impact those surpluses had on domestic credit, once they entered Japan’s banking system. It was this that helped inflate the great Japanese bubble economy—the ratio of domestic credit to GDP rose from 135 per cent in 1970 to a massive 265 per cent in 1989. Japan actually tried to export large amounts of capital in the mid-80s, to avoid its economy overheating: after 1985, faced with the sharp appreciation of the yen, there was a big relocation of Japanese manufacturing capacity to other East Asian economies, setting off the growth of the ‘Asian tigers’, Thailand, Indonesia, South Korea, Malaysia (Figure 4). But after so many years of trade surpluses, rising international reserves and swelling money supply, it was impossible to stop a further surge causing drastic overheating in the late 80s. After the Japanese bubble popped in 1990, property prices fell by more than 50 per cent and the stock market by 75 per cent; twenty-two years later, its banks are still laden with bad loans and government debt is the highest in the world—230 per cent of GDP.

What was your assessment of the IMF's handling of the Asian Crisis?

I’d left Thailand before the bubble popped in 1997, but after six years studying the market there I felt I had a good understanding of what was happening, so I started calling up the IMF, the World Bank and the US Treasury Department, and harassed them until the IMF hired me as a consultant in May 1998. I flew over to Bangkok with them—there was a group of about thirty people from the IMF and the World Bank, and we all stayed at the very nice Oriental Hotel on the river. For three weeks I got to spend a little time with them, and got a glimpse into how they worked and what their thinking processes were. I have to say I was shocked at how little they seemed to know about Thailand’s economy and the nature of the crisis there. Maybe I’m being a bit unfair, because I’d had so much intensive experience there, but I had assumed that the IMF would be at least as knowledgeable as I was. They were a lot of very intelligent people, who had a great deal of experience in many economies around the world, but they didn’t seem to know much about what was happening in Thailand. At one meeting they decided—without any particularly good reasons that I could determine—to project a 3 per cent contraction for the Thai economy that year. I spent the next week writing reports explaining why I thought the economy would shrink by 9 per cent in 1998 and 9 per cent the following year, if it continued with the same IMF-imposed policies that were being pursued at that time. In the end, the economy did contract by about 10 per cent in 98, but then it rebounded the following year. By that time, they’d reversed many of the initial policies the IMF had demanded in the early days of the crisis. What really made the difference was a massive devaluation of the currency—from 25 baht to the dollar to 50 baht, at one point—which was very helpful in allowing Thailand to grow its way out of the crisis, by exporting into the still relatively booming global economy.

After that, I got a full-time job with the World Bank in Washington for two years, starting in October 98, and that was also very interesting. Both these Bretton Woods institutions had been created to replicate the automatic stabilizers of the gold standard, to help countries to re-establish an overall balance-of-payments equilibrium when they ran out of cash. The end of Bretton Woods and the expansion of global trade imbalances transformed the situation; but at the time of the East Asian crisis, I don’t think the IMF and World Bank quite understood how destabilizing the larger and larger cross-border capital flows had become. They didn’t understand how the capital inflows that had washed into Thailand during the 80s and 90s had completely changed and distorted the economy and blown it into a bubble—and when all the money washed back out, the economy really deflated. It would have destroyed all the banks and all the Thais’ savings, had they carried out the harsh policies that might have been appropriate in the 1950s or 60s.

To go back for a moment to the growing US trade deficit, which you see as at the root of the explosion of credit creation: the deficit had begun to widen in the early 80s, but by 1985 in fact the US was pushing for a sharply lower dollar, as agreed with Germany and Japan in the ‘Plaza Accord’, and this did succeed in boosting American manufacturing and narrowing the deficit. By 1995, though, that policy had gone into reverse. Why do you think the US did not continue to push for a lower dollar over the longer run, and what would the effects have been had it done so?

I’m not sure I know a complete answer to that question. By 1985, the US trade deficit was something like 3.5 per cent of GDP and this was very alarming, not only to US policy-makers but around the world, because the economies of the surplus countries, primarily Japan and Germany, were getting over-heated. The agreement at the Plaza Hotel in 1985 was that the dollar would be devalued against the yen and the mark, and over the next two years the dollar fell by roughly 50 per cent. That was enough to bring the US trade deficit more or less back into balance by around 1990. But by that point, Japan and Germany were no longer the problem. It was the Asian Tigers that were increasingly becoming large exporters to the US, with growing trade surpluses, followed by China. Once China really got going, its trade surplus became larger and larger. But the US didn’t have the same sort of control over China’s currency as it did over Japanese and German policies. In fact in 1994, China had a massive devaluation of its currency, which made the situation much worse in terms of the US trade deficit.

So the rise of China came athwart the US low-dollar export policy?

It’s a very complicated subject, but I think that, as time went by, American industry gave up on American manufacturing, and realized that they could make a profit by manufacturing outside the US in ultra-low-wage countries. And so it began. Eventually, more and more corporations realized that they could do very well by outsourcing. A tipping point came in the early 90s, when it was actually in the interest of major sectors of American society to have a strong dollar and a weak Chinese currency, or weak currencies in all the other countries from which American firms were exporting goods back to the US. The issue with Germany and Japan in the 80s had been different, because the workforces of those countries already had relatively high wages compared to the US. It was really only after the rise of the Asian Tigers, and above all when they were joined by China in the 90s, that American industry realized that it could make a lot more money just by making everything offshore. From 1997, the US deficit widened dramatically (Figure 3, above).

More generally, how has the ‘dollar standard’ affected the US economy itself?

Once the requirement was removed that the US hold 25 per cent gold backing for every dollar it issued, any constraint on how much credit could be created was lifted as well. It had been easy for the US to maintain gold backing in the first post-war decades, because it owned most of the world’s gold. But with multinationals relocating industry abroad and growing government spending, it finally came up against that binding constraint in 1968. So Congress simply changed the law, at Johnson’s request, removing any requirement for a gold link. With no restraint on credit, credit growth exploded. Of course, credit and debt are simply two sides of the same coin. In the US, total debt—government, household, corporate and financial-sector debt, combined—expanded from $1 trillion in 1964 to over $50 trillion by 2007 (Figure 5). Credit growth on this scale has been taken for granted as natural; but in fact it is something entirely new under the sun—only made possible because the US broke the link between dollars and gold. This explosion of credit created today’s world. It made Americans much more materially prosperous than we would have been otherwise. It financed Asia’s strategy of export-led growth and it ushered in the age of globalization. Not only did it make the global economy much bigger than it would have been otherwise, it changed the nature of the economic system itself. I would argue that American capitalism has evolved into something different—in my latest book, The New Depression, I call it ‘creditism’.

How would you define the chief features of ‘creditism’?

First, an expanded role for the state. The US government now spends 24 per cent of GDP—one out of every four dollars. All the major industries are state subsidized, one way or another, and half the US population gets some sort of government support. Now, one can argue that capitalism was a 19th-century phenomenon that’s been dead since World War One; but clearly, this is not how capitalism’s supposed to function. Secondly, the central bank now creates the money and manipulates its value. Thirdly, and more interestingly, perhaps, the growth dynamic is entirely different now. Under capitalism, businessmen would invest, some would make a profit, which they’d save, in other words accumulate capital, and repeat: investment, saving, investment, saving. It was slow and difficult, but that was how economic growth worked. But for decades, the growth dynamic of the American economy, and hence increasingly the world economy as a whole, has been driven by credit creation and consumption. Total reserve assets had already swelled by almost 2,000 per cent between the end of Bretton Woods and the late 1990s (see Figure 1, above). Since then, they’ve quintupled (Figure 6).

The problem is that ‘creditism’ can no longer create more growth because the US private sector can’t sustain any more debt. The ratio of household debt to disposable personal income was around 70 per cent, from the mid-60s to the mid-80s; since then, it soared to reach nearly 140 per cent in 2007, on the eve of the crisis (Figure 7). At the same time, median US income is declining and the level of owners’ equity as a percentage of household real estate has plunged to a record low (Figure 8). In 2010, American households owed $13.4 trillion—92 per cent of US GDP (Table 1).

May we press you a bit on this concept of creditism, as a successor to capitalism. Firstly, of course, agencies of credit—banks, factors, money lenders—existed in the 19th century, on quite a large scale. Secondly, capitalism itself has developed through a series of historical phases, but arguably it has never been entirely ‘pure’ and free from state support; it has always been ‘mixed’ to some degree and there have been times when capital was a good deal more constrained than it is today. Nineteenth-century American capitalism was protected by high tariff walls and aided by US military expansionism, conquering territory and resources—iconically, the US Cavalry massacring the indigenous Americans, to clear the way for the railroads. Unprofitable sectors of American industry may be heavily subsidized today, but isn’t it precisely capitalism in general—however wrecked in parts—that Federal funds are supporting? There seems to be an argument for retaining the classical concept, which has been a trusty tool of analysis for both left and right, as long as the broad relations of private capitalist ownership and wage labour still persist. ‘Creditism’ may be a corruption of capitalism, but isn’t capitalism still there, underneath?

Yes and no. In the US, at the biggest level, it’s not, because every major industry is subsidized one way or another, by the government—all the manufacturing that’s still there, much of it related to military spending. All the hospitals and pharmaceutical companies benefit from Medicare and Medicaid. The universities also get government subsidies through the medical and military industries. Farmers get subsidies from the government. Price levels are still generally determined by market forces, but government spending directs those market forces—at the bottom, they allow the price system to work, but at the top level it’s all directed and supported by government spending. I think that the biggest impediment to fixing this crisis is the misconception that we have a capitalist economy. Fox News watchers in America all think, red, white and blue, we’re a capitalist economy, the government is evil and there’s nothing it can do that would help the situation.

They don’t understand what a large role the government plays—and that if government spending is reduced, the economy immediately collapses. I think it would help if they understood that we don’t have capitalism to begin with, we have a different kind of economy now. This is not a crisis of capitalism, it’s a crisis of creditism, and we have to work with the system that we have. And while it would be nice to rein in the bankers, if you rein them in too hard it’s going to blow up the whole system—the banks are so worthless that the losses would be enormous, if they were actually exposed; all the savings in the world would be destroyed as the banking sector failed. Creditism as a system requires credit growth to survive, and only the government can provide the credit growth now—the private sector can’t bear any more debt.

So there’s a polemical character to the concept of creditism, in the sense that it’s targeted at a policy level?

Right. And I would like to persuade not only policy-makers, but the general public as well. It’s not impossible to swing public opinion away from where it is now, which is stuck in a very boring debate between austerity and Keynesianism, neither of which, as it’s presented, makes any sense whatsoever.

Another term that’s been applied to this latest stage is ‘financialization’, or financialized capitalism, and it would be interesting to know how you’d compare that to creditism. It’s been suggested that, as the momentum of the American economy began to falter, the government stepped in in the 1990s with a form of privatized Keynesianism, or asset-price Keynesianism: that credit was used, in other words, to maintain the level of demand when it threatened to flag, rather than the big public programmes of classical Keynesianism.

I think that’s probably true, if you look at the way Alan Greenspan encouraged the expansion of credit and the way they all denied there was any kind of bubble: that benefited the bankers and the policy-makers, but it also benefited the people, as long as everything was expanding, because this was against the background of increasing globalization, which put strong downward pressure on US wages. The way to buy off the voting public, who were losing their jobs and not seeing any wage increases, was to make their asset prices go up—their houses increased in value, so they could spend more even if their wages didn’t go up. This worked very nicely for ten or fifteen years, and the authorities seem to have wanted to keep it going even longer—but bubbles always have to pop, in the end. So yes, I think that’s probably right, though it’s hard to know whether this was actually what was planned or whether it just evolved that way, as it could have done, because that was the easiest way to go.

But it’s worth emphasizing that the credit expansion in the US from the 1990s on couldn’t have taken place without the disinflationary impact of manufactured imports from extremely low-wage economies: low inflation permitted low interest rates. The scale of the income gap is enormous: Mexican GDP per capita is around 20 per cent of the US rate; Chinese GDP per capita is only 11 per cent. But another effect of globalization was that the expansion of credit was beginning to produce diminishing returns in economic growth in the US, well before the 2008 crisis. In The New Depression I show how total credit growth has correlated with economic growth in the US since the 1950s (Figure 9). Whenever total credit expanded by less than 2 per cent, the US economy fell into recession—or nearly did, in 1970. But from the early 1980s, the difference between the two growth rates became much more pronounced: total credit soared, but economic growth continued to weaken, cycle by cycle, apart from a slight increase during the late 90s ‘new economy’ boom. Part of the explanation for this must be that while credit growth did stimulate demand, that demand was largely met by imports, so there was little of the multiplier effect that US production would have achieved.

On top of this, the excess productive capacity created by years of credit expansion and capital misallocation has been a further disinflationary factor. It’s easy to increase aggregate supply in an economy: simply increase the flow of credit to the manufacturing sector—this is what happened with the ‘new economy’ boom in the United States (Figure 10). But once industrial capacity is put in place, it doesn’t go away again just because demand for its products doesn’t keep up; instead, excess capacity puts a downward pressure on the price of goods, even as capacity utilization slackens. It’s much more difficult to increase aggregate demand, which is ultimately linked to the public’s purchasing power. Over the past thirty years, the expansion of credit has produced a vast expansion in global industrial productive capacity—witness the Pearl River Delta—but the purchasing power of the world’s population has not risen at anything like the same pace. So we’re facing a glut of industrial capacity on a world scale.

In The Dollar Crisis you suggested a radical solution to the problem of aggregate global demand…

One of the cures I suggested was a global minimum wage, starting with raising the wages of Chinese workers in foreign-owned factories by a dollar a day, every year—it wouldn’t break Apple or Foxconn. To be diplomatic, I suggested that the poor developing countries could form a labour cartel, the way that OPEC has formed an oil cartel; but in reality that wouldn’t work—everyone would cheat. The most effective way to make it happen would be for the US Treasury Secretary to go on TV and announce to the world: if you cannot prove to us that you pay your workers six dollars a day, instead of five, then we’re going to put a 20 per cent tariff on your imports. And we’re going to ask the workers to report on whether it’s really being paid. That was written ten years ago, and if it had been implemented, by now the minimum wage would have tripled, from five dollars to fifteen, and that would have created much more aggregate demand to absorb all of this excess capacity.
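The arithmetic of the escalator proposal is quickly checked: a dollar-a-day raise each year, starting from the five-dollar floor quoted above, gives the tripling Duncan describes.

```python
# Duncan's proposed escalator: raise the wage floor by $1/day each year.
start_wage = 5            # dollars per day, at the time of writing (2003)
years_elapsed = 10        # 2003 to roughly the date of this interview
wage_now = start_wage + 1 * years_elapsed

print(wage_now)                 # 15
print(wage_now / start_wage)    # 3.0 -- the floor has tripled
```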

So yes, it’s crucial to find a way to increase purchasing power at the bottom of the pyramid—otherwise the world economy will be heading back to what it was like at the beginning of the industrial revolution, when workers only earned subsistence wages and couldn’t afford to buy what they were making. In a sense, that’s the world economy in the age of globalization. As new manufacturing countries enter the world market, especially China, the ability to produce has skyrocketed; but wages don’t go up anymore. They’re going down in the West, and demographic trends, the sheer numbers of young people looking for jobs, don’t let them go up quickly enough in developing countries. That’s at the core of the global crisis. For a good fifteen or twenty years, that gap was filled by inflating US asset prices, which allowed the Americans to withdraw equity and spend it, consume with it, to import and to fill the gap that couldn’t be filled with normal wage income. But now that game seems to be over. Americans can’t sustain any more debt; home prices have dropped 34 per cent, on average, across the US. The only thing that’s filling the gap is government spending—that’s all that’s preventing the US from spiralling into depression.

What have been the US government’s aims in handling the crisis? How would you assess its policies to date?

The aim of US government policy has been to perpetuate the credit expansion, to prevent a collapse. So far it’s more or less been able to sustain the level of total credit market debt (Figure 11). It’s done so by racking up around $5 trillion in budget deficits, which it probably wouldn’t have been able to finance if the Federal Reserve had not printed $2 trillion and injected that into the economy. Initially, in 2008 and 2009, the financial-sector bail-out and the $787 billion stimulus for the economy were funded by selling government bonds. But that initial round of support for the financial sector already cost around $1 trillion—some $544 billion in loans to US banks, $118 billion to Bear Stearns and AIG, $333 billion to the Commercial Paper Funding Facility and more. So the Fed began its policy of quantitative easing in November 2008. Of course, QE is a euphemism for fiat money creation: the ‘quantity’ refers to the amount of money in existence, and ‘easing’ means creating more—‘easing’ liquidity conditions. The first round, QE1, was mostly used to relieve the banks and other institutions of mortgage-backed securities. It was expanded in March 2009, from a $600 billion to a $1.75 trillion money-printing programme, through to March 2010. As soon as it stopped, the US economy entered its ‘soft patch’, in summer 2010. By August 2010 Bernanke was hinting at another round, and QE2 was formally announced that November, to run till June 2011. This time the Fed printed $600 billion, which it mostly used to buy government bonds, to fund the budget deficit. With some differences, the same course has pretty much been followed by the ECB and the Bank of England, on a smaller scale.

Given the nature of the debate around the budget deficit in the US, it’s important to stress what the alternative would have been if governments had not jumped in. Total credit would have begun contracting in 2008, when the private sector could no longer cover the interest payments on its debt, and the sort of debt–deflation spiral that Irving Fisher described would have taken hold. The US economy would have already collapsed into a new Great Depression, and with it, the rest of the world. US GDP is about $16 trillion, and the US budget deficit is $1.3 trillion. So if the government had balanced the budget in 2009—if there had been a balanced-budget constitutional amendment, for example—it would have shrunk to being a $14.7 trillion economy. There would have been an immediate contraction of around 8 per cent, but with a large multiplier effect, because unemployment would skyrocket, consumption would drop, business profits would plummet and the economy would go into a sharp downward spiral. Now, the argument against huge budget deficits under Bretton Woods or on the gold standard was that government borrowing on such a scale meant pushing up interest rates and crowding out the private sector. But that’s not the case any more. In today’s world, there’s no limit to the amount of money that governments can create—or so it seems. Even though the US has trillion-dollar budget deficits, interest rates are at a historic low; the ten-year bond yield in the US is 1.5 per cent. Never lower. Today, if the US government cuts its spending, there’s no offsetting benefit of lower interest rates—however much government spending is cut by, the economy simply contracts by that amount.
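The contraction arithmetic can be checked directly from the round figures quoted: a balanced budget removes the deficit from GDP before any multiplier effect kicks in. The multiplier value below is an illustrative assumption, not a figure from the text.

```python
gdp = 16.0        # US GDP, trillions of dollars
deficit = 1.3     # budget deficit, trillions of dollars

gdp_after_cut = gdp - deficit          # the smaller, balanced-budget economy
direct_contraction = deficit / gdp     # just over 8 per cent, before multipliers

print(f"{gdp_after_cut:.1f}")          # 14.7
print(f"{direct_contraction:.3f}")     # the direct hit as a fraction of GDP

# An assumed fiscal multiplier of 1.5 would amplify the downturn further,
# via falling consumption, profits and employment.
multiplier = 1.5
print(f"{direct_contraction * multiplier:.3f}")
```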

What’s been the impact of quantitative easing on the economy as a whole?

The most important short-term effect has been to allow government spending to support the economy while keeping interest rates low. Another aspect, with QE1 in particular, was that the government bought up toxic assets, like the debt issued by Fannie Mae and Freddie Mac. That allowed the financial sector to deleverage by $1.75 trillion, as it swapped mortgage-backed securities for cash. It didn’t work that way in Britain, because the Bank of England didn’t buy assets like that from the banking system, it only bought government bonds. So the British financial sector is still very highly leveraged, whereas in the US it is much less leveraged than it was. Thirdly, every round of quantitative easing drives up the stock market and commodity prices (Figure 12). To some extent higher stock prices create a positive wealth effect, which supports the economy; some sectors will benefit from higher food prices—Mid-West agribusiness, for example—but it’s bad for American consumers; the same goes for the rising price of oil.

Since around 2011, I’d say the costs of QE have been starting to overtake its benefits, which are subject to diminishing returns. Quantitative easing has created food-price inflation that is very harmful for the two billion people who live on less than $2 a day. I’ve read that global food prices went up 60 per cent during QE2, and this was one of the factors that sparked off the Arab Spring. The oil-price spike has been very negative for the US economy; the 2011 slowdown in US consumption was due to higher food and oil prices. It comes back to the old quantity theory of money: if you increase the quantity of money, prices go up. So far, this has barely affected manufactured goods because of the huge deflationary impact of globalization and the 95 per cent drop in the marginal cost of labour that it’s brought. So we don’t see any CPI inflation, because of this offsetting deflationary force. But food prices have gone up everywhere. If the dollar price of food goes up—if rice prices go up in dollars—then rice prices go up everywhere in the world, because otherwise they’d just sell into the dollar market. So if US rice prices go up, Thai rice prices go up. And when the Fed prints dollars, food prices go up. That’s the main drawback, the one real big problem of QE—otherwise it’d be a great thing: print money, make the stock market go up, everybody’s rich and happy. But it has this impact of creating food-price inflation.

What effect has it had on profits and investment? US business profits have been hitting 15 per cent this year, according to the Economist, but corporations seem to be sitting on cash mountains that aren’t being used.

Yes, profits are very high, first of all because labour is getting a lower and lower share. Also, as a percentage of GDP, US corporate tax last year was the lowest it has been since the 1950s. In total, the tax revenue for the country as a whole was under 15 per cent of GDP, which is, again, the lowest since the 1950s. So, yes, corporate profits have been exceptionally good, although this quarter, suddenly everyone’s concerned that they may be dropping. But there’s a fundamental problem: there are no viable investment opportunities. So much credit has been expended and so much capacity built that we already have too much of everything relative to the amount of income, as it’s currently distributed, to absorb it. If you invest more, you’re going to lose your money; if you take your corporate cash-flow every year and buy government bonds, you can preserve your money for a better day—but that helps push down bond yields to these historic low levels. That’s why, even in Japan, after two decades of massive fiscal deficits, the ten-year government bond yield is only 0.8 per cent; in Germany, it’s 1.2 per cent; US, 1.5 per cent; UK, around 1.6 per cent. They’ve never been lower, and this is part of the reason. When bubbles pop, there’s no place to invest the money profitably, so it’s better to put it in government bonds.

What are the options, over the longer term?

I think there are three ways forward for the US economy—three paths policy makers could take. Option one is what the libertarians and Tea Party people want: balance the budget. That would result in immediate depression and collapse, the worst possible scenario. The second option is what I call the Japan model. When Japan’s great economic bubble popped twenty-two years ago, the Japanese government started running very large budget deficits, and have done that now for twenty-two years. The ratio of government debt to GDP has increased from 60 per cent to 240 per cent. That’s effectively what the US and British governments are doing now: running massive budget deficits to keep the economy from collapsing. They can carry on doing this for another five years with very little difficulty, and maybe even for ten years. The US government debt is only 100 per cent of GDP, so they could carry on for another five years and still not hit 150 per cent. But though it’s not clear how high it can go, it can’t go on forever. Sooner or later—say, ten or fifteen years from now—the US government will be just as bankrupt as Greece, and the American economy will collapse into a new Great Depression. So, that’s option two. It’s better than option one, because it’s better to die ten years from now than to die now; but it’s not ideal.
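The ‘Japan model’ trajectory can be roughed out numerically. This is a deliberately crude sketch: I assume deficits stay near 8 per cent of GDP (roughly the $1.3 trillion deficit against a $16 trillion economy) and that nominal GDP is flat, so each year's deficit adds about 8 points to the debt ratio. Neither simplification comes from the interview:

```python
# Crude projection of US debt-to-GDP under sustained Japan-style deficits.
# Assumptions (mine, for illustration): deficits stay near 8% of GDP and
# nominal GDP is flat, so each year adds ~8 points to the ratio.
debt_ratio = 100.0   # government debt as % of GDP (starting point in the text)
deficit_pts = 8.0    # annual deficit as % of GDP

trajectory = {}
for year in range(1, 16):
    debt_ratio += deficit_pts
    trajectory[year] = debt_ratio

print(f"after 5 years:  {trajectory[5]:.0f}% of GDP")   # still under 150%
print(f"after 10 years: {trajectory[10]:.0f}% of GDP")
print(f"after 15 years: {trajectory[15]:.0f}% of GDP")  # nearing Japan's 240%
```

Even on these crude assumptions, five more years of deficits leaves the ratio around 140 per cent, consistent with the claim that the US could ‘carry on for another five years and still not hit 150 per cent’, while ten to fifteen years takes it toward Japanese levels.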

Option number three is for the US government to keep borrowing and spending aggressively, as they’re doing now, but to change the way they spend. Rather than spending it on too much consumption, and on war, for instance—the US government has so far spent $1.4 trillion invading Iraq and Afghanistan—they should invest it; not just in patching up the roads and the bridges, but invest it very aggressively in transformative 21st-century technologies like renewable energy, genetic engineering, biotechnology and nanotechnology, on a huge scale. The US government could put a trillion dollars into each of these industries over the next ten years—have a plan to develop these new sectors. A trillion dollars, let’s say, in solar energy over the next ten years: I’m not talking about building solar panels for sale in the market; I’m talking about carpeting the Nevada desert with solar panels, building a grid coast-to-coast to transmit it; converting the automobile industry to electricity, replacing all the gas stations with electric charging stations, and developing new technology to make electric cars run at 70 miles an hour. Then, ten years from now, the US will have free, limitless energy. Trade will come back into balance, because we won’t have to import any foreign oil, and the US will be able to spend $100 billion less a year on the military, because it won’t have to defend Gulf oil. The US government could tax the domestically generated electricity, and help bring down the budget deficit; and the cost of energy to the private sector would probably fall by 75 per cent—that in itself could set off a wave of private-sector innovation that would generate new prosperity.

If the US government invested a trillion dollars in genetic engineering, it’s probable they could create medical miracles: a cancer cure, or ways to slow the metabolic processes of ageing. We need to think in terms of peace-time Manhattan Projects: bring together all the best brains, the best technology, set them targets; use ‘creditism’ to produce results. We can all now see the flaws in creditism—they’re obvious. But as a society, I think the US is overlooking the opportunities that exist within this new economic system—the opportunity for the government to borrow massive amounts of money at 1.5 per cent interest and invest it aggressively in transformative technologies that restructure the US economy, so that it can get off its debilitating dependence on the financial sector, which has developed into a giant Ponzi scheme, before it all collapses. If not, then the US economy is likely to go down sooner or later into a lethal debt–deflation spiral.

Presumably this ‘creditist’ strategy could only apply to the US economy, though?

Not necessarily. For example, the Bank of England has printed so much money to buy up government bonds that it now owns more than a third of Britain’s entire debt. Now, it didn’t cost the Bank a single penny to buy all those bonds—it didn’t even have to buy any paper or ink to print the money; it’s all electronic now. So why not just cancel them? It wouldn’t cost anybody a thing; even if somehow it bankrupted the Bank of England, it could just print more money to recapitalize itself. Overnight, Britain would have a third less outstanding government debt and its credit rating would improve enormously. The government would announce that it was going to take advantage of this historic opportunity to increase government spending and invest it in new industries, so that Britain can finally wean itself off its debilitating dependence on Ponzi finance and develop manufacturing industries again. For example: throw $100 billion at Cambridge to invest in genetic engineering over the next three years, to become the dominant genetic-technology force on earth. Meanwhile create jobs and fix the infrastructure, at the same time.

But wouldn’t these new industries be subject to the same relative lack of aggregate demand?

Well, there would be no lack of demand for a molecular therapy that slows down ageing or cures a killer disease. The point would be to aim for technological breakthroughs that are completely transformative, like the agricultural technological revolution in the 1960s that changed the nature of global food production. In some respects this is an unprecedented opportunity because of the amount of money that governments could invest now, when interest rates are at such low levels. If they directed them into transformative technologies, they could create markets for products that just don’t exist at all now, where there would be demand. If we could actually shift the US economy from oil to solar, that would free up a lot of money that could be spent on other things. Polemically, if you like, the point is to stress that we can’t just wait for an old-fashioned cyclical recovery—it’s not going to come. We have a new kind of economic system, and either we master this system and take ultimate advantage of its opportunities to borrow and invest, or else it collapses into a severe depression, unwinding a $50 trillion expansion of credit. It’s going to be at least as bad as the 1930s.

In 2003 you called the Chinese economy a bubble waiting to pop. How do you see it today?

An even bigger bubble waiting to pop. When I wrote The Dollar Crisis, China’s trade surplus with the US was $80 billion a year; now it’s $300 billion a year, but the US can’t keep expanding its trade deficits, and that means China’s trade surplus is going to flatten out, creating a much more difficult environment there. In 2009, when the trade surplus corrected quite significantly, the headline was: 20 million factory workers lose their jobs and head back home to the countryside to grow rice. That almost popped the whole bubble then and there. The Chinese government’s policy response was to let Chinese banks increase total system bank loans by 60 per cent over the next two years. As a result of this massive stimulus, everybody borrowed money and property prices soared. But now, three or four years on, no one can repay the money, the banking system must be on the verge of collapse—although officially, non-performing loans are reported to be extremely low—and will have to be bailed out by the government. The whole China model is in serious trouble: they’ve been expanding industrial production by 20 per cent a year for decades, and now there’s massive excess capacity in every sector. The Americans can’t buy any more of it, and 80 per cent of the Chinese earn less than $10 a day, so they can’t buy what they’re making in their own factories. If they continue expanding their industrial production, the problem is only going to get worse. I think they’re going to have to follow the Japan model as well, and have very big government budget deficits to keep the economy from collapsing into a depression; if they do that aggressively, in a best-case scenario China can perhaps achieve 3 per cent growth a year on average for the next ten years.

Nevertheless, there is a potential market for first-generation purchases of cars and washing machines that’s still to be realized, on a massive scale—hundreds of millions of people. Isn’t that still ahead?

Not necessarily, unless Chinese wages go up—because people who earn $10 a day can’t afford a washing machine; even if they could, their flat wouldn’t be big enough to fit a washing machine. And the challenge is, if Chinese wages ever went to the astronomical level of $15 a day, then there are 500 million people in India who will work for $5 a day, and the jobs will move there. So there’s a real danger of a race to the bottom, unless we can agree on a global minimum wage.

How do you see the current state of the US banking sector? In August this year the New York Times was sounding the alarm about the fact that the cartel of the big banks was the sole regulator for the $700 trillion derivatives market, although it seems to have fallen silent again now.

One way of approaching this is, whoever creates the wealth has the political power. Under feudalism, power lay with the landed aristocracy. Under industrial capitalism, the captains of industry controlled political power. But in the last few decades, wealth in the US has come from credit creation. As bankers created more and more wealth, they became increasingly influential, politically; by the late 1990s they were unstoppable. First they repealed Glass–Steagall and then, the following year, they passed something called the Commodity Futures Modernization Act, which removed the regulations from the derivatives market and allowed them to trade over the counter with almost no regulation whatsoever. Since 1990, the total amount of derivatives contracts has increased from $10 trillion, which was already a very big number, to $700 trillion—the equivalent of $100,000 per person on earth, or global GDP for the last twenty years combined. There is nothing in the world you can hedge with that many derivatives contracts; the system has become increasingly surreal. You can imagine how much profit the banks make from $700 trillion—first from creating the derivatives, then from trading them and using them for structured finance.

Derivatives are basically used as gambling vehicles: you can gamble on the direction of interest rates or commodities, or anything else; if you actually want to hedge something, you can take out insurance by hedging it that way. But most of the trading is not between the real sectors of the economy; around two-thirds of it is done between the banks themselves. Ninety per cent of derivatives contracts trade over the counter, which means no regulator can see what’s going on; but 10 per cent of them do trade through exchanges, so we know something about them. The last time I looked, the average daily turnover for that 10 per cent—the amount they changed hands for, every day—was $4 trillion. Now, if the other 90 per cent traded as much—and it could be more, it could be less, I don’t know—that would be something like $40 trillion of turnover a day. If there were even a very tiny tax on each of these derivatives transactions, the government would have an enormous source of revenue, a tax that other people wouldn’t have to pay. Most of the trading is done in London and New York, so there’s no problem about relocation—the threat that all this business will move to China; the Chinese don’t let their banks do crazy things like this. Every major accounting scandal for the past twenty years—Fannie Mae, Freddie Mac, General Electric—has involved structured finance, with the culprits using derivatives to manipulate their accounts to avoid paying taxes; the bankers make a big fee on that. Given what we know about unregulated markets and the incentive structure of the banking industry, it seems unlikely that in this $700 trillion unregulated market there wouldn’t be every kind of fraud and shenanigan taking place. If you were a major oil-producing Gulf state, for example—not to name names—why would you not manipulate the price of oil, with the help of one of the large US investment banks and/or one of the major oil multinationals, when no one can see what you’re doing? 
You write contracts that push up the oil price, and the futures price pulls up the spot price. Most of the commodities are probably being manipulated this way, oil being the most obvious one.
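The turnover extrapolation in this passage, and the revenue a ‘very tiny tax’ might raise, can be roughed out as follows. Only the $700 trillion notional, the 10 per cent exchange-traded share and the $4 trillion observed daily turnover come from the text; the 0.01 per cent tax rate and the 250-trading-day year are illustrative assumptions of mine:

```python
# Extrapolating the derivatives-turnover figures quoted above, plus a
# hypothetical tiny transaction tax. Tax rate and trading days are assumed.
notional_total = 700e12    # total outstanding derivatives notional, dollars
exchange_share = 0.10      # fraction of contracts trading through exchanges
exchange_turnover = 4e12   # observed daily turnover of that 10%, dollars

# If the unobserved 90% (over-the-counter) turned over at the same rate:
est_daily_turnover = exchange_turnover / exchange_share
print(f"Estimated total daily turnover: ${est_daily_turnover / 1e12:.0f} trillion")

tax_rate = 0.0001      # 0.01% per transaction (assumed for illustration)
trading_days = 250     # trading days per year (assumed)
annual_revenue = est_daily_turnover * tax_rate * trading_days
print(f"Annual revenue at 0.01%: ${annual_revenue / 1e12:.1f} trillion")
```

On these assumptions a tax of one basis point on turnover would raise on the order of a trillion dollars a year, which is the sense in which ‘an enormous source of revenue’ should be read; the interview itself names no rate, and the OTC turnover figure is, as Duncan says, a guess.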

Didn’t the Dodd–Frank law aim to put an end to over-the-counter derivatives trading?

Dodd–Frank required the banks to put all the derivatives through exchanges by the middle of 2011—more than a year ago. But it keeps getting pushed back to some unspecified date in the future. Somewhere along the way, the regulators may have realized that, if you actually put them all through exchanges, it would reveal such a degree of fraud and corruption that the whole system would implode. The actual net worth of the banks could turn out to be something like minus $30 trillion—that’s why they don’t break them apart; they’re too big to fail, because they’re too bankrupt for the government to take them on. They should be made to trade through exchanges and also to have proper margins on both sides, just like when people have an account with a stock broker; it’s okay to borrow money, but you have to have a certain degree of margin; and then, if anyone gets in trouble, they have enough margin to cover their losses or cut their positions. As it is now, there’s no exchange, so there’s no transparency—no one can see who’s doing what or why—and there are no margins. The industry complains that having to put up margins will be so expensive, it will damage their business. It’s like saying, I have to pay health insurance and I have to insure my house, that damages my business—but that’s the price of insurance. You don’t have insurance for free, you have to pay for it. But, of course, the industry is fighting this tooth and nail, because if they can no longer create more credit, since the private sector can’t take on more debt; and if they’re actually forced to stop proprietary trading on their own accounts, as the Volcker Rule requires; and if they are forced to put their derivatives through exchanges—then suddenly they will not be the major source of wealth-creation anymore, and their hold on political power will be greatly weakened. They are desperately trying to maintain their wealth-creating abilities in a very difficult environment.
Creditism is much less stable, or sustainable, than industrial capitalism—and it seems to be teetering on the edge of collapse.

So there’s no hope that banking legislation will reform the sector? You’d argue that the banking system has to be propped up, because it would be such a global disaster if it was restructured?

I wouldn’t say I’m entirely hopeless about it, but it’s very difficult, because they’d have to find a way to restructure the banking system that doesn’t cause it to collapse completely, and I’m not sure there is such a formula. I don’t know what’s going to happen to the banking system. It’s not clear how they’re going to make any profits if they can’t continue to increase credit and can’t expand the unregulated portion of the derivatives market at an exponential rate anymore. The problem is that if the banking system went down, it would destroy so much credit that everything would collapse, just as it collapsed when the money supply was destroyed in 1930 and 31. Now it’s the credit supply that the policy-makers are determined not to allow to contract, for the same reason. So I don’t think any of the European banks are going to be allowed to fail. In November 2011 there was a lot of talk about the French banks going under, but it was clear that either the ECB or the IMF would bail them out. Or else, if no one else, the Fed would bail out Société Générale (for instance)—because if Soc Gen falls, Deutsche Bank’s going to fall, and then J. P. Morgan. They’re all going to fall together. So you might as well just bail out Soc Gen—it’ll be a whole lot cheaper than the Fed trying to bail out everybody. They have no choice. Sure enough, the ECB did a back-flip, printed a trillion new euros, and bailed everybody out. That’s what they’re going to continue to do as long as they can do it, because otherwise they know we’re going to collapse into the 1930s.

Working out positive and negative forms of creditism—this seems to be the crux of what you’re saying. This is the system we’ve got, but what we have to do is take control of it, and submit it to debt-forgiveness programmes and rational investment strategies that have a promise of being productive.

Exactly right. I think we can do better this time.
Second, Michael Hudson, author of The Bubble and Beyond, wrote another superb piece, America’s Deceptive 2012 Fiscal Cliff:
When World War I broke out in August 1914, economists on both sides forecast that hostilities could not last more than about six months. Wars had grown so expensive that governments quickly would run out of money. It seemed that if Germany could not defeat France by springtime, the Allied and Central Powers would run out of savings and reach what today is called a fiscal cliff and be forced to negotiate a peace agreement.

But the Great War dragged on for four destructive years. European governments did what the United States had done after the Civil War broke out in 1861 when the Treasury printed greenbacks. They paid for more fighting simply by printing their own money. Their economies did not buckle and there was no major inflation. That would happen only after the war ended, as a result of Germany trying to pay reparations in foreign currency. This is what caused its exchange rate to plunge, raising import prices and hence domestic prices. The culprit was not government spending on the war itself (much less on social programs).

But history is written by the victors, and the past generation has seen the banks and financial sector emerge victorious. Holding the bottom 99% in debt, the top 1% are now in the process of subsidizing a deceptive economic theory to persuade voters to pursue policies that benefit the financial sector at the expense of labor, industry, and democratic government as we know it.

Wall Street lobbyists blame unemployment and the loss of industrial competitiveness on government spending and budget deficits – especially on social programs – and labor’s demand to share in the economy’s rising productivity. The myth (perhaps we should call it junk economics) is that (1) governments should not run deficits (at least, not by printing their own money), because (2) public money creation and high taxes (at least on the wealthy) cause prices to rise. The cure for economic malaise (which they themselves have caused) is said to be less public spending, along with more tax cuts for the wealthy, who euphemize themselves as “job creators.” Demanding budget surpluses, bank lobbyists promise that banks can provide the economy with enough purchasing power to grow. Then, when this ends in crisis, they insist that austerity can squeeze out enough income to enable private-sector debts to be paid.

The reality is that when banks load the economy down with debt, this leaves less to spend on domestic goods and services while driving up housing prices (and hence the cost of living) with reckless credit creation on looser lending terms. Yet on top of this debt deflation, bank lobbyists urge fiscal deflation: budget surpluses rather than pump-priming deficits. The effect is to further reduce private-sector market demand, shrinking markets and employment. Governments fall deeper into distress, and are told to sell off land and natural resources, public enterprises, and other assets. This creates a lucrative market for bank loans to finance privatization on credit. This explains why financial lobbyists back the new buyers’ right to raise the prices they charge for basic needs, creating a united front to endorse rent extraction. The effect is to enrich the financial sector owned by the 1% in ways that indebt and privatize the economy at large – individuals, business and the government itself.

This policy was exposed as destructive in the late 1920s and early 1930s when John Maynard Keynes, Harold Moulton and a few others countered the claims of Jacques Rueff and Bertil Ohlin that debts of any magnitude could be paid if governments would impose deep enough austerity and suffering. This is the doctrine adopted by the International Monetary Fund to impose on Third World debtors since the 1960s, and by European neoliberals defending creditors imposing austerity on Ireland, Greece, Spain and Portugal.

This pro-austerity mythology aims to distract the public from asking why peacetime governments can’t simply print the money they need. Given the option of printing money instead of levying taxes, why do politicians only create new spending power for the purpose of waging war and destroying property, not to build or repair bridges, roads and other public infrastructure? Why should the government tax employees for future retirement payouts, but not Wall Street for similar user fees and financial insurance to build up a fund to pay for future bank over-lending crises? For that matter, why doesn’t the U.S. Government print the money to pay for Social Security and medical care, just as it created new debt for the $13 trillion post-2008 bank bailout? (I will return to this question below.)

The answer to these questions has little to do with markets, or with monetary and tax theory. Bankers claim that if they have to pay more user fees to pre-fund future bad-loan claims and deposit insurance to save the Treasury or taxpayers from being stuck with the bill, they will have to charge customers more – despite their current record profits, even as they grab everything they can get. But they support a double standard when it comes to taxing labor.

Shifting the tax burden onto labor and industry is achieved most easily by cutting back public spending on the 99%. That is the root of the December 2012 showdown over whether to impose the anti-deficit policies proposed by the Bowles-Simpson commission of budget cutters whom President Obama appointed in 2010. Shedding crocodile tears over the government’s failure to balance the budget, banks insist that today’s 15.3% FICA wage withholding be raised – as if this will not raise the break-even cost of living and drain the consumer economy of purchasing power. Employers and their work force are told to save in advance for Social Security or other public programs. This is a disguised income tax on the bottom 99%, whose proceeds are used to reduce the budget deficit so that taxes can be cut on finance and the 1%. To paraphrase Leona Helmsley’s quip that “Only the little people pay taxes,” the post-2008 motto is that only the 99% have to suffer losses, not the 1% as debt deflation plunges real estate and stock market prices to inaugurate a Negative Equity economy while unemployment rates soar.

There is no more need to save in advance for Social Security than there is to save in advance to pay for war. Selling Treasury bonds to pay for retirees has the identical monetary and fiscal effect as selling newly printed securities. It is a charade – to shift the tax burden onto labor and industry. Governments need to provide the economy with money and credit to expand markets and employment. They do this by running budget deficits, and this can be done by creating their own money. That is what banks oppose, accusing it of leading to hyperinflation rather than helping economies grow.

Their motivation for this wrong accusation is self-serving and their logic is deceptive. Bankers always have fought to block government from creating its own money – at least under normal peacetime conditions. For many centuries, government bonds were the largest and most secure investment for the financial elites that hold most savings. Investment bankers and brokers monopolized public finance, at substantial underwriting commissions. The market for stocks and corporate bonds was rife with fraud, dominated by insiders for the railroads and great trusts being organized by Wall Street, and the canal ventures organized by French and British stockbrokers.

However, there was little alternative to governments creating their own money when the costs of waging an international war far exceeded the volume of national savings or tax revenue available. This obvious need quieted the usual opposition mounted by bankers to limit the public monetary option. It shows that governments can do more under force majeure emergencies than under normal conditions. And the September 2008 financial crisis provided an opportunity for the U.S. and European governments to create new debt for bank bailouts. This turned out to be as expensive as waging a war. It was indeed a financial war. Banks already had captured the regulatory agencies to engage in reckless lending and a wave of fraud and corruption not seen since the 1920s. And now they were holding economies hostage to a break in the chain of payments if they were not bailed out for their speculative gambles, junk mortgages and fraudulent loan packaging.

Their first victory was to disable the ability – or at least the willingness – of the Treasury, Federal Reserve and Comptroller of the Currency to regulate the financial sector. Goldman Sachs, Citicorp and their fellow Wall Street giants hold veto power over the appointment of key administrators at these agencies. They used this beachhead to weed out nominees who might not favor their interests, preferring ideological deregulators of the stripe of Alan Greenspan and Tim Geithner. As John Kenneth Galbraith quipped, a precondition for obtaining a central bank post is tunnel vision when it comes to understanding that governments can create their credit as readily as banks can. What is necessary is for one’s political loyalties to lie with the banks.

In the post-2008 financial wreckage it took only a series of computer keystrokes for the U.S. Government to create $13 trillion in debt to save banks from suffering losses on their reckless real estate loans (which computer models pretended would make banks so rich that they could pay their managers enormous salaries, bonuses and stock options), insurance bets gone bad (underpricing risk to win business to pay their managers enormous salaries and bonuses), arbitrage gambles and outright fraud (to give the illusion of earnings justifying enormous salaries, bonuses and stock options). The $800 billion Troubled Asset Relief Program (TARP) and $2 trillion of Federal Reserve “cash for trash” swaps enabled the banks to continue their remuneration of executives and bondholders with hardly a hiccup – while incomes and wealth plunged for the remaining 99% of Americans.

A new term, Casino Capitalism, was coined to describe the transformation that finance capitalism was undergoing in the post-1980 era of deregulation that opened the gates for banks to do what governments hitherto did in time of war: create money and new public debt simply by “printing it” – in this case, electronically on their computer keyboards.

Taking the insolvent Fannie Mae and Freddie Mac mortgage financing agencies onto the public balance sheet for $5.2 trillion accounted for over a third of the $13 trillion bailout. This saved their bondholders from having to suffer losses from the fraudulent appraisals on the junk mortgages with which Countrywide, Bank of America, Citibank and other “too big to fail” banks had stuck them. This enormous debt increase was done without raising taxes. In fact, the Bush administration cut taxes, giving the largest cuts to the highest income and wealth brackets who were its major campaign contributors. Special tax privileges were given to banks so that they could “earn their way out of debt” (and indeed, out of negative equity).[1] The Federal Reserve gave a free line of credit (Quantitative Easing) to the banking system at only 0.25% annual interest by 2011 – that is, one quarter of a percentage point, with no questions asked about the quality of the junk mortgages and other securities pledged as collateral at their full face value, which was far above market price.

This $13 trillion debt creation to save banks from having to suffer a loss was not accused of threatening economic stability. It enabled them to resume paying exorbitant salaries and bonuses, dividends to bondholders and also to pay counterparties on casino-capitalist arbitrage bets. These payments have helped the 1% receive a reported 93% of the gains in income since 2008. The bailout thus polarized the economy, giving the financial sector more power over labor and consumers, industry and the government than has been the case since the late 19th-century Gilded Age.

All this makes today’s financial war much like the aftermath of World War I and countless earlier wars. The effect is to impoverish the losers, appropriate hitherto public assets for the victors, and impose debt service and taxes much like levying tribute. “The financial crisis has been as economically devastating as a world war and may still be a burden on ‘our grandchildren,’” Bank of England official Andrew Haldane recently observed. “In terms of the loss of incomes and outputs, this is as bad as a world war,” he said. The rise in government debt has prompted calls for austerity – on the part of those who did not receive the giveaway. “It would be astonishing if people weren’t asking big questions about where finance has gone wrong.”[2]

But as long as the financial sector is winning its war against the economy at large, it prefers that people believe that There Is No Alternative. Having captured mainstream economics as well as government policy, finance seeks to deter students, voters and the media from questioning whether the financial system really needs to be organized in the way it is. Once such a line of questioning is pursued, people may realize that banking, pension and Social Security systems and public deficit financing do not have to be organized in the way they are. There are better alternatives to today’s road to austerity and debt peonage.

Today’s financial war against the economy at large

Today’s economic warfare is not the kind waged a century ago between labor and its industrial employers. Finance has moved to capture the economy at large, industry and mining, public infrastructure (via privatization) and now even the educational system. (At over $1 trillion, U.S. student loan debt came to exceed credit-card debt in 2012.) The weapon in this financial warfare is no longer military force. The tactic is to load economies (governments, companies and families) with debt, siphon off their income as debt service and then foreclose when debtors lack the means to pay. Indebting government gives creditors a lever to pry away land, public infrastructure and other property in the public domain. Indebting companies enables creditors to seize employee pension savings. And indebting labor means that it no longer is necessary to hire strikebreakers to attack union organizers and strikers.

Workers have become so deeply indebted on their home mortgages, credit cards and other bank debt that they fear to strike or even to complain about working conditions. Losing work means missing payments on their monthly bills, enabling banks to jack up interest rates to levels that used to be deemed usurious. So debt peonage and unemployment loom on top of the wage slavery that was the main focus of class warfare a century ago. And to cap matters, credit-card bank lobbyists have rewritten the bankruptcy laws to curtail debtor rights, and the referees appointed to adjudicate disputes brought by debtors and consumers are subject to veto from the banks and businesses that are mainly responsible for inflicting injury.

The aim of financial warfare is not merely to acquire land, natural resources and key infrastructure rents as in military warfare; it is to centralize creditor control over society. In contrast to the promise of democratic reform nurturing a middle class a century ago, we are witnessing a regression to a world of special privilege in which one must inherit wealth in order to avoid debt and job dependency.

The emerging financial oligarchy seeks to shift taxes off banks and their major customers (real estate, natural resources and monopolies) onto labor. Given the need to win voter acquiescence, this aim is best achieved by rolling back everyone’s taxes. The easiest way to do this is to shrink government spending, headed by Social Security, Medicare and Medicaid. Yet these are the programs that enjoy the strongest voter support. This fact has inspired what may be called the Big Lie of our epoch: the pretense that governments can only create money to pay the financial sector, and that the beneficiaries of social programs should be entirely responsible for paying for Social Security, Medicare and Medicaid, not the wealthy. This Big Lie is used to reverse the concept of progressive taxation, turning the tax system into a ploy of the financial sector to levy tribute on the economy at large.

Financial lobbyists quickly discovered that the easiest ploy to shift the cost of social programs onto labor is to conceal new taxes as user fees, using the proceeds to cut taxes for the elite 1%. This fiscal sleight-of-hand was the aim of the 1983 Greenspan Commission. It confused people into thinking that government budgets are like family budgets, concealing the fact that governments can finance their spending by creating their own money. They do not have to borrow, or even to tax (at least, not tax mainly the 99%).

The Greenspan tax shift played on the fact that most people see the need to save for their own retirement. The carefully crafted and well-subsidized deception at work is that Social Security requires a similar pre-funding – by raising wage withholding. The trick is to convince wage earners it is fair to tax them more to pay for government social spending, yet not also to ask the banking sector to pay a similar user fee to pre-save for the next time it itself will need bailouts to cover its losses. Also asymmetrical is the fact that nobody suggests that the government set up a fund to pay for future wars, so that future adventures such as Iraq or Afghanistan will not “run a deficit” to burden the budget. So the first deception is to treat only Social Security and medical care as user fees. The second is to aggravate matters by insisting that such fees be paid long in advance, by pre-saving.

There is no inherent need to single out any particular area of public spending as causing a budget deficit if it is not pre-funded. It is a travesty of progressive tax policy to levy this FICA wage withholding only on wages below (at present) $105,000, exempting higher earnings as well as capital gains, rental income and profits. The raison d’être for taxing the 99% for Social Security and Medicare is simply to avoid taxing wealth, by falling on low wage income at a much higher rate than on the income of the wealthy. This is not how the U.S. income tax worked at its inception in 1913. During its early years only the wealthiest 1% of the population had to file a return. There were few loopholes, and capital gains were taxed at the same rate as earned income.

The government’s seashore insurance program, for instance, recently incurred a $1 trillion liability to rebuild the private beaches and homes that Hurricane Sandy washed out. Why should this insurance subsidy at below-commercial rates for the wealthy minority who live in this scenic high-risk property be treated as normal spending, but not Social Security? Why save in advance by a special wage tax to pay for these programs that benefit the general population, but not levy a similar “user fee” tax to pay for flood insurance for beachfront homes or war? And while we are at it, why not save another $13 trillion in advance to pay for the next bailout of Wall Street when debt deflation causes another crisis to drain the budget?

But on whom should we levy these taxes? To impose user fees for the beachfront reconstruction would require a tax falling mainly on the wealthy owners of such properties. Their dominant role in funding the election campaigns of the Congressmen and Senators who draw up the tax code suggests why they are able to avoid prepaying for the cost of rebuilding their seashore property. Such taxation is only for wage earners on their retirement income, not the 1% on their own vacation and retirement homes.

By not raising taxes on the wealthy, and by refusing to let the central bank monetize spending on anything except bailing out the banks and subsidizing the financial sector, the government follows a pro-creditor policy. Tax favoritism for the wealthy deepens the budget deficit, forcing governments to borrow more. Paying interest on this debt diverts revenue from being spent on goods and services. This fiscal austerity shrinks markets, reducing tax revenue and pushing governments to the brink of default. This enables bondholders to treat the government the way banks treat a bankrupt family, forcing the debtor to sell off assets – in this case the public domain, as if it were the family silver, as Britain’s former Prime Minister Harold Macmillan characterized Margaret Thatcher’s privatization sell-offs.

In an Orwellian doublethink twist this privatization is done in the name of free markets, despite being imposed by global financial institutions whose administrators are not democratically elected. The International Monetary Fund (IMF), European Central Bank (ECB) and EU bureaucracy treat governments like banks treat homeowners unable to pay their mortgage: by foreclosing. Greece, for example, has been told to start selling off prime tourist sites, ports, islands, offshore gas rights, water and sewer systems, roads and other property.

Sovereign governments are, in principle, free of such pressure. That is what makes them sovereign. They are not obliged to settle public debts and budget deficits by asset selloffs. They do not need to borrow more domestic currency; they can create it. This self-financing keeps the national patrimony in public hands rather than turning assets over to private buyers, or having to borrow from banks and bondholders.

Why today’s fiscal squeeze adds to the economy’s costs and imposes needless austerity

The financial sector promises that privatizing roads and ports, water and sewer systems, bus and railroad lines (on credit, of course) is more efficient and will lower the prices charged for their services. The reality is that the new buyers put up rent-extracting tollbooths on the infrastructure being sold. Their break-even costs include the high salaries and bonuses they pay themselves, as well as interest and dividends to their creditors and backers, spending on stock buy-backs and political lobbying.

Public borrowing creates a dependency that shifts economic planning to Wall Street and other financial centers. When voters resist, it is time to replace democracy with oligarchy. “Technocratic” rule replaces that of elected officials. In Europe the IMF, ECB and EU troika insists that all debts must be paid, even at the cost of austerity, depression, unemployment, emigration and bankruptcy. This is to be done without violence where possible, but with police-state practices when grabbers find it necessary to quell popular opposition.

Financializing the economy is depicted as a natural way to gain wealth – by taking on more debt. Yet it is hard to think of a more highly politicized policy, shaped as it is by tax rules that favor bankers. It also is self-terminating, because when public debt grows to the point where investors (“the market”) no longer believe that it can be repaid, creditors mount a raid (the military analogy is appropriate) by “going on strike” and not rolling over existing bonds as they fall due. Bond prices fall, yielding higher interest rates, until governments agree to balance the budget by voluntary pre-bankruptcy privatizations.
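The mechanics of such a creditors’ “strike” follow from simple bond arithmetic: a bond’s market price and its yield move inversely, so refusing to buy at the old price is the same thing as demanding a higher interest rate. The sketch below is illustrative only – the bond terms and yields are hypothetical, not figures from the text.

```python
# Illustrative only: the inverse relation between a bond's market price
# and its yield. All bond terms and rates here are hypothetical.

def bond_price(face, coupon_rate, yield_rate, years):
    """Present value of a fixed-coupon bond discounted at the market yield."""
    coupon = face * coupon_rate
    pv_coupons = sum(coupon / (1 + yield_rate) ** t for t in range(1, years + 1))
    pv_face = face / (1 + yield_rate) ** years
    return pv_coupons + pv_face

# A 10-year bond with a 4% coupon trades at par when market yields are 4%:
print(round(bond_price(1000, 0.04, 0.04, 10), 2))   # 1000.0

# If creditors balk and demand 6%, the same bond's price falls below par:
print(round(bond_price(1000, 0.04, 0.06, 10), 2))   # 852.8
```

Falling prices and rising yields are two descriptions of the same event, which is why a bond-market raid translates directly into higher government borrowing costs.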

Selling saved-up Treasury bonds to fund public programs is like new deficit borrowing

If the aim of America’s military spending around the world is to prepare for future warfare, why not aim at saving up a fund of $10 trillion or even $30 trillion in advance, as with Social Security, so that we will have the money to pay for it?

The answer is that selling saved-up Treasury bills to finance Social Security, military spending or any other program has the same monetary and price effect as issuing new Treasury bills. Paying Social Security out of past savings – that is, selling the Treasury securities in which Social Security funds are invested – affects financial markets and the private sector’s holdings of government debt much as borrowing by selling new securities does. It makes little difference whether the Treasury sells newly printed IOUs or sells bonds that it has been accumulating in a special fund. The effect is to increase public debt owed to the financial sector.

If the savings are to be invested in Treasury bonds (as is the case with Social Security), will this pay for tax cuts elsewhere in the budget? If so, will these cuts be for the wealthy 1% or the 99%? Or, will the savings be invested in infrastructure, or turned over to states and cities to help balance their budget shortfalls and underfunded pension plans?

Another problem concerns who should pay for this pre-saving. The taxes needed to pre-fund a savings build-up siphon off income from somewhere in the economy. How much will the economy shrink by diverting income from being spent on goods and services? And whose income will be taxed? These questions illustrate how politically self-interested it is to single out taxing wages to save for Social Security, in contrast to war-making and beach-house rebuilding.

Government budgets usually are designed to be in balance under normal peacetime conditions, so most public debt has been brought into being by war (prior to today’s financial war of slashing taxes on the wealthy). Adam Smith’s Wealth of Nations (Book V) traced how each new British bond issue to raise funds for a military action had a dedicated tax to pay its interest charges. The accumulation of such war debts thus raised the cost of living and hence the break-even price of labor. To prevent this from undercutting British competitiveness, Smith urged that wars be waged on a pay-as-you-go basis – financed by full taxation rather than by borrowing, which entailed ongoing interest payments and the taxes to cover them (as the debt itself rarely was amortized). Smith thought that populations should feel the cost of war directly and immediately, presumably leading them to be vigilant in checking grandiose projects of empire.

The United States issued fiat greenback currency to pay for much of its Civil War, but also issued bonds. In analyzing this war finance the Canadian-American astronomer and monetary theorist Simon Newcomb pointed out that all wars must be paid for in the form of tangible material and lives by the generation that fights them. Paying for the war by borrowing from bondholders, he explained, involved levying taxes to pay the interest. The effect was to transfer income from the Western states (taxpayers) to bondholders in the East.

In the case of Social Security today the beneficiary of government debt is still the financial sector. The economy must provide the housing, food, health care, transportation and clothing to enable retirees to live normal lives. This economic surplus can be paid for either out of taxation, new money creation or borrowing. But instead of “the West,” the major payers of the Social Security tax are wage earners across the nation. Taxing labor shrinks markets and forces the economy into austerity.

Quantitative easing as free money creation – to subsidize the big banks

The Federal Reserve’s three waves of Quantitative Easing since 2008 show how easy it is to create free money. Yet this has been provided only to the largest banks, not to strapped homeowners or industry. An immediate $2 trillion in “cash for trash” took the form of the Fed creating new bank-reserve credit in exchange for mortgage-backed securities valued far above market prices. QE2 provided another $800 billion in 2011-12. The banks used this injection of credit for interest rate arbitrage and exchange rate speculation on the currencies of Brazil, Australia and other high-interest-rate economies. So nearly all the Fed’s new money went abroad rather than being lent out for investment or employment at home.

U.S. Government debt was run up mainly to re-inflate prices for packaged bank mortgages, and hence real estate prices. Instead of alleviating private-sector debt by writing down mortgages in line with the homeowners’ ability to pay, the Federal Reserve and Treasury created money to support property prices – to push the banking system’s balance sheets back above negative net worth. The Fed’s QE3 program in 2012-13 created money to buy mortgage-backed securities each month, to provide banks with money to lend to new property buyers.

For the economy at large, the debts were left in place. Yet commentators focused only on government debt. In a double standard, they accused budget deficits of inflating wages and consumer prices, yet the explicit aim of quantitative easing was to support asset prices. Inflating asset prices on credit is deemed to be good for the economy, despite loading it down with debt. But public spending into the “real” economy, raising employment levels and sustaining consumer spending, is deemed bad – except when this is financed by personal borrowing from the banks. So in each case, increasing bank profits is the standard by which fiscal policy is to be judged!

The result is a policy asymmetry that is opposite from what most epochs have deemed fair or helpful to economic growth. Bankers and bondholders insist that the public sector borrow from them, blocking the government’s power to self-finance its operations – with one glaring exception. That exception occurs when the banks themselves need free money creation. The Fed provided nearly free credit to the banks under QE2, and Chairman Ben Bernanke promised to continue this policy until such time as the unemployment rate drops to 6.5%. The pretense is that low interest rates spur employment, but the most pressing aim is to provide easy credit to revive borrowing and bid asset prices back up.

Fiscal deflation on top of debt deflation

The main financial problem with funding war occurs after the return to normalcy, when creditors press for budget surpluses to roll back the public debt that has been run up. This imposes fiscal austerity, reducing wages and commodity prices relative to the debts that are owed. Consumer spending shrinks and prices decline as governments spend less, while higher taxes withdraw revenue. This is what is occurring in today’s financial war, much as it has in past military postwar returns to peace.

Governments have the power to resist this deflationary policy. Like commercial banks, they can create money on their computer keyboards. Indeed, since 2008 the government has created debt to support the Finance, Insurance and Real Estate (FIRE) sector more than the “real” production and consumption economy.

In contrast to public spending for goods and services (or social programs that increase market demand), most of the bank credit that led to the 2008 financial collapse was created to finance the purchase of property already in place, stocks and bonds already issued, or companies already in existence. The effect has been to load down the economy with mortgages, bonds and bank debt whose carrying charges eat into spending on current output. The $13 trillion bank subsidy since 2008 (to enable banks to earn their way out of negative equity) brings us back to the question of why taxes should be levied on the 99% to pre-save for Social Security and Medicare, but not for the bank bailout.

Current tax policy encourages the financial and rent extraction that has become the major economic problem of our epoch. Industrial productivity continues to rise, but debt is growing even more inexorably. Instead of fueling economic growth, this overgrowth of credit/debt threatens to absorb the economic surplus, plunging the economy into austerity, debt deflation and negative equity.

So despite the fact that the financial system is broken, it has gained control over public policy to sustain and even extend tax favoritism for a dysfunctional overgrowth of bank credit. Unlike the progress of science and technology, this debt is not part of nature. It is a social construct. The financial sector has politicized it by pressing to privatize economic rent rather than collect it as the tax base. This financialization of rent-extracting opportunities does not reflect a natural or inevitable evolution of “the market.” It is a capture of market structures and fiscal policy. Bank lobbyists have campaigned to shift the battle to the political sphere of lawmaking and tax policy, with side battlegrounds in the mass media and universities, seeking to persuade voters that the quickest and most efficient way to build up wealth is by bank credit and debt leverage.

Budget deficits as an antidote to austerity

Public debts everywhere are growing, as taxes only cover part of public spending. The least costly way to finance this expenditure is to issue money – the paper currency and coins we carry in our pockets. Holders of this currency technically are creditors to the government – and to society, which accepts this money in payment. Yet despite being nominally a form of public debt, this money serves as public capital inasmuch as it is not normally expected to be repaid. This government money does not bear interest, and may be thought of as “equity capital” or “equity money,” and hence part of the economy’s net worth.

If taxes did fully cover government spending, there would be no budget deficit – or new public money creation. Government budget deficits pump money into the economy. Conversely, running a budget surplus retires the public debt or currency outstanding. This occurred in the late 19th century, causing monetary deflation that plunged the U.S. economy into depression. Likewise, when President Bill Clinton ran a budget surplus late in his administration, the economy had to rely on commercial banks to supply credit to use as the means of payment, charging interest for this service. As Stephanie Kelton summarizes this historical experience:
The federal government has achieved fiscal balance (even surpluses) in just seven periods since 1776, bringing in enough revenue to cover all of its spending during 1817-21, 1823-36, 1852-57, 1867-73, 1880-93, 1920-30 and 1998-2001. We have also experienced six depressions. They began in 1819, 1837, 1857, 1873, 1893 and 1929.

Do you see the correlation? The one exception to this pattern occurred in the late 1990s and early 2000s, when the dot-com and housing bubbles fueled a consumption binge that delayed the harmful effects of the Clinton surpluses until the Great Recession of 2007-09.[3]

When taxpayers pay more to the government than the economy receives in public spending, the effect is like paying banks more than they provide in new credit. The debt volume is reduced (increasing the reported savings rate). The resulting austerity is favorable to the financial sector but harmful to the rest of the economy.
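The logic of the two preceding paragraphs is the sectoral-balances accounting identity: in a closed economy, the government’s fiscal balance and the non-government sector’s balance are mirror images. A minimal sketch, with purely hypothetical figures:

```python
# A minimal sketch of the sectoral-balances identity (hypothetical figures):
# in a closed economy, the government's balance and the private sector's
# net financial saving always sum to zero.

def sector_balances(gov_spending, taxes):
    gov_balance = taxes - gov_spending       # surplus if positive, deficit if negative
    private_balance = gov_spending - taxes   # private sector's net financial saving
    assert gov_balance + private_balance == 0
    return gov_balance, private_balance

# A deficit of 100 supplies the private sector with 100 of net financial assets:
print(sector_balances(gov_spending=500, taxes=400))   # (-100, 100)

# A surplus of 100 drains them, as in the Clinton-era episode described above:
print(sector_balances(gov_spending=400, taxes=500))   # (100, -100)
```

The identity says nothing about which policy is desirable; it only shows why a public surplus must show up somewhere else in the economy as reduced net saving or increased private debt.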

Most people think of money as a pure asset (like a coin or a $10 bill), not as being simultaneously a public debt. But to an accountant, a balance sheet always balances: Assets = Liabilities + Net Worth. This liability-side ambivalence is confusing to most people. It takes some time to think of offsetting assets and liabilities as mirror images of each other. Much as cosmologists assume that the universe is symmetrical – with positively charged matter having an anti-matter counterpart somewhere at the other end – so accountants view the money in our pockets as being created by the government’s deficit spending. Holders of the Federal Reserve’s paper currency technically can redeem it, but they will simply get paid in other denominations of the same currency.
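The double-entry view described above can be made concrete. In this hedged sketch (a toy two-sector model, not any official accounting standard), currency created by deficit spending is booked twice – as the holder’s asset and as the government’s liability – so each balance sheet still satisfies Assets = Liabilities + Net Worth.

```python
# Toy two-sector bookkeeping: money created by deficit spending is
# simultaneously the public's asset and the government's liability.

government = {"assets": 0, "liabilities": 0}
public = {"assets": 0, "liabilities": 0}

def deficit_spend(amount):
    """Government pays the public with newly created money."""
    government["liabilities"] += amount   # currency outstanding
    public["assets"] += amount            # cash in people's pockets

deficit_spend(100)

# Net worth = Assets - Liabilities on each balance sheet; the public's
# gain is exactly the government's new liability.
gov_net_worth = government["assets"] - government["liabilities"]
pub_net_worth = public["assets"] - public["liabilities"]
print(gov_net_worth, pub_net_worth)   # -100 100
```

Consolidating the two sectors nets the position to zero, which is the accountant’s way of saying that the money in our pockets is the government’s deficit viewed from the other side of the ledger.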

The word “redeem” comes from settling debts. This was the purpose for which money first came into being. Governments redeem money by accepting it for tax payment. In addition to issuing paper currency, the Federal Reserve injects money into the economy by writing checks electronically. The recipients (usually banks selling Treasury bonds or, more recently, packages of mortgage loans) gain a deposit at the central bank. This is the kind of deposit that was created by the above-mentioned $13 trillion in new debt that the government turned over to Wall Street after the September 2008 crisis. The price impact was felt in financial asset markets, not in prices for goods and services or labor’s wages.

This Federal Reserve and Treasury credit was not counted as part of the government’s operating deficit. Yet it increased public debt, without being spent on “real” GDP. The banks used this money mainly to gamble on foreign exchange and interest-rate arbitrage as noted above, to buy smaller banks (helping make themselves Too Big To Fail), and to keep paying their managers high salaries and bonuses.

This monetization of debt shows how different government budgets are from family budgets. Individuals must save to pay for retirement or other spending. They cannot print their own money, or tax others. But governments do not need to “save” (or tax) to pay for their spending. Their ability to create money means that they do not need to save in advance to pay for wars, Social Security or other needs.

Keynesian deficit spending vs. bailing out Wall Street to keep the debt overhead in place

There are two kinds of markets: hiring labor to produce goods and services in the “real” economy, and transactions in financial assets and property claims in the FIRE sector. Governments can run budget deficits by financing either of these two spheres. Since President Franklin Roosevelt’s WPA programs in the 1930s, along with his public infrastructure investment in roads, dams and other construction – and military arms spending after World War II broke out – “Keynesian” spending on goods and services has been used to hire labor or pay for social programs. This pumps money into the economy via the GDP-type transactions that appear in the National Income and Product Accounts. It is not inflationary when unemployment exists.

However, the debt that characterized the Paulson-Geithner bailout of Wall Street was created not to spend on goods and services, but to buy (or take liability for) mortgages and bank loans, insurance default bets and arbitrage gambles. The aim was to subsidize financial losses while keeping the debt overhead in place, so that banks and other financial institutions could “earn their way” out of negative net worth, at the economy’s expense. The idea was that they could start lending again to prevent real estate prices from falling further, saving them from having to write down their debt claims to bring debt levels back within the debtors’ ability to pay.

Why tax the economy at all? And why financial and tax reform should go together.

Taxes pay for the cost of government by withdrawing income from the parties being taxed. From Adam Smith through John Stuart Mill to the Progressive Era, general agreement emerged that the most appropriate taxes should not fall on labor, capital or on sales of basic consumer needs. Such taxes raise the break-even cost of employing labor. In today’s world, FICA wage withholding for Social Security raises the price that employers must pay their work force to maintain living standards and buy the products they produce.

However, these economists singled out one kind of tax that does not increase prices: taxes on the land’s rental value, natural resource rents and monopoly rents. These payments for rent-extraction rights are not a return to “factors of production,” but are a privatized levy reflecting privileges that have no ongoing cost of production. They are rentier rake-offs.

Land is the economy’s largest asset. A site’s rental value is set by market conditions – what people pay for being able to live in a good location. People pay more to live in prestigious and convenient neighborhoods. They pay more if there is local investment in roads and public transportation, and if there are parks, museums and cultural centers nearby, or nice shopping districts. People also pay more as the economy grows more prosperous, because one of the first things they desire is status, and in today’s world this is defined largely by where one lives.

Landlords do not create this site value. But speculators may seek to ride the wave by buying property on credit, where the rate of land-price gain exceeds the interest rate. This “capital” gain is the proverbial free lunch. It is created by public investment, by the general level of prosperity, and by the terms on which banks extend credit. In a nutshell, a property is worth whatever a bank will lend, because that is the price that new buyers will be able to pay for it.

This logic was more familiar to the public a century ago than it is today. A property tax to collect this “free lunch” rent is paid out of the rent. This leaves less to be capitalized into new interest-bearing loans – while freeing the government from having to tax labor and industrial capital. So this tax not only is “less bad” than others; it is actively desirable to reduce the debt overhead. Rent levels are not affected, but the government collects the rent instead of the property owner or, at one remove, the mortgage banker who turns this rent into a flow of interest by advancing the purchase price of rent-yielding properties to new buyers.
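The capitalization arithmetic behind this argument can be written out. In this hedged sketch the rents, tax take and interest rate are invented for illustration; the point is only that a property’s price equals its after-tax rent divided by the interest rate at which banks will capitalize that rent into a loan.

```python
# Illustrative rent-capitalization arithmetic (all figures hypothetical):
# a site's price is the rent left after taxes, capitalized at the going
# interest rate - roughly, what a bank will lend against the rent flow.

def capitalized_value(annual_rent, property_tax, interest_rate):
    """Price supportable by the after-tax rental flow."""
    return (annual_rent - property_tax) / interest_rate

# Untaxed, $30,000 of annual site rent at 5% interest supports a $600,000 price:
print(capitalized_value(30_000, 0, 0.05))          # 600000.0

# Collecting half the rent as tax leaves less to pledge to the mortgage banker:
print(capitalized_value(30_000, 15_000, 0.05))     # 300000.0
```

This is why, in the passage above, taxing rent lowers the volume of mortgage debt that can be loaded onto a site without changing the rent itself.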

Real estate was the major source of rising net worth and wealth for America’s middle class for over sixty years, from the return to peace in 1945 until the 2008 financial collapse. Rising property prices were fueled largely by banks providing mortgage credit on easier terms. But by 2008 these terms had reached their limit. Interest rates were seemingly as low as they could go. So were down payments (zero down payment) and amortization rates (zero, with interest-only loans). Property values were becoming fictitious as a result of a tidal wave of fraud by the banking system’s property appraisers, while the income statements of borrowers also were becoming fictitious (“liars’ loans,” with the main liars being the mortgage writers).

If the rise in real estate prices (mainly site values) had been taxed, there would have been no financial overgrowth, because this price-gain would have been collected as the tax base. The government would not have needed to tax labor either via income tax, FICA wage withholding or consumer sales. And taken in conjunction with the government’s money-creating power, there would have been little need for public debt to grow. Taxing rent extraction privileges thus would minimize debt levels and taxes on the 99%.

The next leading form of economic rent is taken by oil, gas and mining companies from the mineral deposits created by nature, as well as by owners or lessees of forests and other natural resources. Classical economics from David Ricardo onward defined such income received by landlords, mining companies, forestry and fisheries as “economic rent.” It is not profit on capital investment, because nature has provided the resource, not human labor or expenditure on capital – except for tangible capital investment in the buildings erected on the land, saws to cut down trees, earth-moving equipment to do the mining, and so forth.

The basic contrast is between a productive industrial economy and a rent-extracting one in which special privileges, monopoly pricing and economic rents divert spending away from tangible capital investment and real output. Classical economists defined economic rent generically as “empty” pricing in excess of technologically necessary costs of production. This would include payments to pharmaceutical companies, health management organizations (HMOs) and monopolies above their necessary cost of doing business. Much like paying debt service, such economic rent siphons market revenue away from tangible production and consumption.

It was to demonstrate this that François Quesnay developed the first national income statistics, the Tableau Économique. His aim was to show that the landed aristocracy’s rental rake-offs should form the basis for taxation rather than the excise taxes that were burdening industry and making it uncompetitive. But for the past hundred years, commercial banks have opposed property taxes, because taxing the land’s rent would mean less left over to pay interest. Some 80 percent of bank loans are for real estate, mainly to capitalize the rental value left untaxed. A property and wealth tax would reduce this market – along with the government’s need to borrow, and hence to pay interest to bondholders. And without a fiscal squeeze there would have been less of an opportunity for the financial sector to push to privatize what remains of the public domain.

Today’s central financial problem is that the banking system lends mainly for rent extraction opportunities rather than for tangible capital investment and economic growth to raise living standards. To maximize rent, it has lobbied to untax land and natural resources. At issue in today’s tax and financial crisis is thus whether the world is going to have an economy based on progressive industrial democracy or a financialized and polarizing rent-extracting society.

The ideological crisis underlying today’s tax and financial policy

From antiquity and for thousands of years, land, natural resources and monopolies, seaports and roads were kept in the public domain. In more recent times railroads, subway lines, airlines, and gas and electric utilities were made public. The aim was to provide their basic services at cost or at subsidized prices rather than letting them be privatized into rent-extracting opportunities. The Progressive Era capped this transition to a more equitable economy by enacting progressive income and wealth taxes.

Economies were liberating themselves from the special privileges that European feudalism and colonialism had granted to favored insiders. The aim of ending these privileges – or taxing away economic rent where it occurs naturally, as in the land’s site value and natural resource rent – was to lower the costs of living and doing business. This was expected to make progressive economies more competitive, obliging other countries to follow suit or be rendered obsolete. The era of what was considered to be socialism in one form or another seemed to be at hand – a rising role for the public sector as part and parcel of the evolution of technology and prosperity.

But the landowning and financial classes fought back, seeking to expunge the central policy conclusion of classical economics: the doctrine that free-lunch economic rent should serve as the tax base for economies seeking to be most efficient and fair. Imbued with academic legitimacy by the University of Chicago (which Upton Sinclair aptly named the University of Standard Oil) the new post-classical economics has adopted Milton Friedman’s motto: “There Is No Such Thing As A Free Lunch” (TINSTAAFL). If it is not seen, after all, it has less likelihood of being taxed.
The political problem faced by rentiers – the “idle rich” siphoning off most of the economy’s gains for themselves – is to convince voters to agree that labor and consumers should be taxed rather than the financial gains of the wealthiest 1%. How long can they keep people from seeing that making interest tax-exempt pushes the government’s budget further into deficit? To free financial wealth and asset-price gains from taxes – while blocking the government from financing its deficits by its own public option for money creation – the academics sponsored by financial lobbyists hijacked monetary theory, fiscal policy and economic theory in general. On seeming grounds of efficiency they claimed that government no longer should regulate Wall Street and its corporate clients. Instead of criticizing rent seeking as in earlier centuries, they depicted government as an oppressive Leviathan for using its power to protect markets from monopolies, crooked drug companies, health insurance companies and predatory finance.

This idea that a “free market” is one free for Wall Street to act without regulation can be popularized only by censoring the history of economic thought. It would not do for people to read what Adam Smith and subsequent economists actually taught about rent, taxes and the need for regulation or public ownership. Academic economics is turned into an Orwellian exercise in doublethink, designed to convince the population that the bottom 99% should pay taxes rather than the 1% that obtain most interest, dividends and capital gains. By denying that a free lunch exists, and by confusing the relationship between money and taxes, they have turned the economics discipline and much political discourse into a lobbying effort for the 1%.

Lobbyists for the 1% frame the fiscal question in terms of “How can we make the 99% pay for their own social programs?” The implicit follow-up is, “so that we (the 1%) don’t have to pay?” This is how the Social Security system came to be “funded” and then “underfunded.” The most regressive tax of all is the FICA payroll tax at 15.3% of wages up to about $105,000. Above that, the rich don’t have to contribute. This payroll tax exceeds the income tax paid by many blue-collar families. The pretense is that not taxing these free lunchers will make economies more competitive and pull them out of depression. The reality is the opposite: Instead of taxing the wealthy on their free lunch, the tax burden raises the cost of living and doing business. This is a major reason why the U.S. economy is being de-industrialized today.
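To see why the cap makes the payroll tax regressive, here is a quick back-of-the-envelope sketch in Python. The 15.3% combined rate and roughly $105,000 cap are the approximate 2012 figures cited above; the wage levels are hypothetical examples, not data.

```python
# Sketch of why a capped payroll tax is regressive: the effective rate
# falls as wages rise past the cap. The 15.3% rate and ~$105,000 cap are
# the article's approximate 2012 figures, not current law.
FICA_RATE = 0.153
WAGE_CAP = 105_000

def effective_fica_rate(wages: float) -> float:
    """FICA owed as a share of total wages (employer + employee halves combined)."""
    taxed = min(wages, WAGE_CAP)  # only wages up to the cap are taxed
    return FICA_RATE * taxed / wages

for w in (50_000, 105_000, 500_000, 5_000_000):
    print(f"${w:>9,}: effective rate {effective_fica_rate(w):.2%}")
```

A blue-collar earner pays the full 15.3% on every dollar, while an earner at ten times the cap pays an effective rate under 2% – the arithmetic behind the claim that "above that, the rich don't have to contribute."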

The key question is what the 1% do with their revenue “freed” from taxes. The answer is that they lend it out to indebt the 99%. This polarizes the economy between creditors and debtors. Over the past generation the wealthiest 1% have rewritten the tax laws to a point where they now receive an estimated 66% – two thirds – of all returns to wealth (interest, dividends, rents and capital gains), and a reported 93% of all income gains since the Wall Street bailout of September 2008.

They have used this money to finance the election campaigns of politicians committed to shifting taxes onto the 99%. They also have bought control of the major news media that shape peoples’ understanding of what is happening. And as Thorstein Veblen described nearly a century ago, businessmen have become the heads of most universities and directed their curricula along “business friendly” lines.
The clearest way to analyze any financial system is to ask Who/Whom. That is because financial systems are basically a set of debts owed to creditors. In today’s neo-rentier economy the bottom 99% (labor and consumers) owe the 1% (bondholders, stockholders and property owners). Corporate business and government bodies also are indebted to this 1%. The degree of financial polarization has sharply accelerated as the 1% are making their move to indebt the 99% – along with industry, state, local and federal government – to the point where the entire economic surplus is owed as debt service. The aim is to monopolize the economy, above all the money-creating privilege of supplying the credit that the economy needs to grow and transact business, enabling them to extract interest and other fees for this privilege.

The top 1% have nearly succeeded in siphoning off the entire surplus for themselves, receiving 93% of U.S. income growth since September 2008. Their control over the political process has enabled them to use each new financial crisis to strengthen their position by forcing companies, states and localities to relinquish property to creditors and financial investors. So after monopolizing the economic surplus, they now are seeking to transfer to themselves the economic infrastructure, land and natural resources, and any other asset on which a rent-extracting tollbooth can be placed.

The situation is akin to that of medieval Europe in the wake of the Nordic invasions. The supra-national force of Rome in feudal times is now situated in Washington, with Christianity replaced by the Washington Consensus wielded via the IMF, World Bank, WTO and its satellite institutions such as the European Central Bank, backed by the moral and ideological role of academic economists rather than the Church. And on the new financial battlefield, Wall Street underwriters have used the crisis as an opportunity to press for privatization. Chicago’s strong Democratic political machine sold rights to install parking meters on its sidewalks, and has tried to turn its public roads into privatized toll roads. And the city’s Mayor Rahm Emanuel has used privatization of its airport services to break labor unionization, Thatcher-style. The class war is back in business, with financial tactics playing a leading role barely anticipated a century ago.

This monopolization of property is what Europe’s medieval military conquests sought to achieve, and what its colonization of foreign continents replicated. But whereas it achieved this originally by military conquest of the land, today’s 1% do it by financializing the economy (although the military arm of force is not absent, to be sure, as the world saw in Chile after 1973).

The financial quandary confronting us

The economy’s debt overhead has grown so large that not everyone can be paid. Rising default rates pose the age-old question of Who/Whom. The answer almost always is that big fish eat little fish. Big banks (too big to fail) are eating little banks, while the 1% try to take the lion’s share for themselves by annulling public and corporate debts owed to the 99%. Their plan is to downgrade Social Security and Medicare savings to “entitlements,” as if it is a matter of sound fiscal choice not to pay low-income earners while rentiers at the top re-christen themselves “job creators,” as if they have made their gains by helping wage-earners rather than waging war against them.

The problem is not Social Security, which can be paid out of normal tax revenue, as in Germany’s pay-as-you-go system. This fiscal problem – untaxing real estate, oil and gas, natural resources, monopolies and the banks – has been depicted as financial – as if one needs to save in advance by a special tax to lend to the government to cut taxes on the 99%.

The real pension cliff is with corporate, state and local pension plans, which are being underfunded and looted by financial managers. The shortfall is getting worse as the downturn reduces local tax revenues, leaving states and cities unable to fund their programs, to invest in new public infrastructure, or even to maintain and repair existing investments. Public transportation in particular is suffering, raising user fees to riders in order to pay bondholders. But it is mainly retirees who are being told to sacrifice. (The sanctimonious verb is “share” in the sacrifice, although this evidently does not apply to the 1%.)

The bank lobby would like the economy to keep trying to borrow its way out of debt and thus dig itself deeper into a financial hole that puts yet more private and public property at risk of default and foreclosure. The idea is for the government to “stabilize” the financial system by bailing out the banks – that is, doing for them what it has not been willing to do for recipients of Social Security and Medicare, or for states and localities no longer receiving revenue sharing, or for homeowners in negative equity suffering from exploding interest rates even while bank borrowing costs from the Fed have plunged. The dream is that the happy Greenspan financial bubble can be recovered, making everyone rich again, if only they will debt-leverage to bid up real estate, stock and bond prices and create new capital gains.

Realizing this dream is the only way that pension funds can pay retirees. They will be insolvent if they cannot make their scheduled 8+%, giving new meaning to the term “fictitious capital.” And in the real estate market, prices will not soar again until speculators jump back in as they did prior to 2008. If student loans are not annulled, graduates face a lifetime of indentured servitude. But that is how much of colonial America was settled, after all – working off the price of their liberty, only to be plunged into the cauldron of vast real estate speculations and fortunes-by-theft on which the Republic was founded (or at least the greatest American fortunes). It was imagined that such bondage belonged only to a bygone era, not to the future of the West. But we may now look back to that era for a snapshot of our future.

The financial plan is for the government to supply nearly free credit to the banks, so that they can lend debtors enough – at the widest interest-rate markups in recent memory (what banks charge borrowers and credit-card users over their less-than-1% borrowing costs) – to pay down the debts that were run up before 2008.

This is not a program to increase market demand for the products of labor. It is not the kind of circular flow that economists have described as the essence of industrial capitalism. It is a financial rake-off of a magnitude such as has not existed since medieval European times, and the last stifling days of the oligarchic Roman Empire two thousand years ago.

Imagining that an economy can be grounded on these policies will further destabilize the economy rather than alleviate today’s debt deflation. But if the economy is saved, the banks cannot be. This is why the Obama Administration has chosen to save the banks, not the economy. The Fed’s prime directive is to keep interest rates low – to revive lending not to finance new business investment to produce more, but simply to inflate the asset prices that back the bank loans that constitute bank reserves. It is the convoluted dream of a new Bubble Economy – or more accurately a new Great Giveaway.

Here’s the quandary: If the Fed keeps interest rates low, how are corporate, state and local pension plans to make the 8+% returns needed to pay their scheduled pensions? Are they to gamble more with hedge funds playing Casino Capitalism?

On the other hand, if interest rates rise, this will reduce the capitalization multiple at which banks lend against current rental income and profits. Higher interest rates will lower prices for real estate, corporate stocks and bonds, pushing the banks (and pension funds) even deeper into negative equity.
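The capitalization arithmetic behind this point can be sketched simply: treating rental income as a perpetuity, its capitalized value is roughly income divided by the going rate of interest, so rising rates mean falling asset prices. A minimal illustration in Python, with purely hypothetical numbers:

```python
# Hedged illustration of the capitalization effect described above.
# A perpetual income stream discounted at `rate` is worth income / rate,
# so a higher interest rate lowers the price lenders will capitalize
# rental income at. All figures are illustrative, not market data.
def capitalized_value(annual_net_income: float, rate: float) -> float:
    """Value of a perpetual income stream discounted at `rate`."""
    return annual_net_income / rate

rent = 30_000  # hypothetical annual net rental income
for r in (0.03, 0.05, 0.08):
    print(f"rate {r:.0%}: capitalized value ${capitalized_value(rent, r):,.0f}")
```

The same $30,000 income stream capitalizes at $1,000,000 when rates are 3% but only $375,000 at 8% – which is why higher rates push leveraged banks and pension funds toward negative equity.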

So something has to give. Either way, the financial system cannot continue along its present path. Only debt write-offs will “free” markets to resume spending on goods and services. And only a shift of taxes onto rent-yielding property and tollbooths, finance and monopolies will save prices from being loaded down with extractive overhead charges and refocus lending to finance production and employment. Unless this is done, there is no way the U.S. economy can become competitive in international markets, except of course for military hardware and intellectual property rights for escapist cultural artifacts.

The solution for Social Security, Medicare and Medicaid is to de-financialize them. Treat them like government programs for military spending, beachfront rebuilding and bank subsidies, and pay their costs out of current tax revenue and new money creation by central banks doing what they were founded to do.

Politicians shy away from confronting this solution mainly because the financial sector has sponsored a tunnel vision that ignores the role of debt, money, and the phenomena of economic rent, debt leverage and asset-price inflation that have become the defining characteristics of today’s financial crisis. Government policy has been captured to try to save – or at least subsidize – a financial system that cannot be saved more than temporarily. It is being kept on life support at the cost of shrinking the economy – while true medical spending for real life support is being cut back for much of the population.

The economy is dying from a financial respiratory disease, or what the Physiocrats would have called a circulatory disorder. Instead of freeing the economy from debt, income is being diverted to pay credit card debt and mortgage debts. Students without jobs remain burdened with over $1 trillion of student debt, with the time-honored safety valve of bankruptcy closed off to them. Many graduates must live with their parents as marriage rates and family formation (and hence, new house-buying) decline. The economy is dying. That is what neoliberalism does.

Now that the debt build-up has run its course, the banking sector has put its hope in gambling on mathematical probabilities via hedge fund capitalism. This Casino Capitalism has become the stage of finance capitalism following Pension Fund capitalism – and preceding the insolvency stage of austerity and property seizures.

The open question now is whether neofeudalism will be the end stage. Austerity deepens rather than cures public budget deficits. Unlike past centuries, these deficits are not being incurred to wage war, but to pay a financial system that has become predatory on the “real” economy of production and consumption. The collapse of this system is what caused today’s budget deficit. Instead of recognizing this, the Obama Administration is trying to make labor pay. Pushing wage-earners over the “fiscal cliff” to make them pay for Wall Street’s financial bailout (sanctimoniously calling their taxes “user fees”) can only shrink the market more, pushing the economy into a fatal combination of tax-ridden and debt-ridden fiscal and financial austerity.

The whistling in the intellectual dark that central bankers call by the technocratic term “deleveraging” (paying off the debts that have been run up) means diverting yet more income to pay the financial sector. This is antithetical to resuming economic growth and restoring employment levels. The recent lesson of European experience is that despite austerity, debt has risen from 381% of GDP in mid-2007 to 417% in mid-2012. That is what happens when economies shrink: debts mount up in arrears (and with stiff financial penalties).
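The mechanics of debts mounting up in arrears in a shrinking economy can be shown with a toy calculation. The 4% arrears rate and 2% annual contraction below are assumptions chosen for illustration, not Europe's actual figures; only the 381%-of-GDP starting point is taken from the text.

```python
# Toy illustration of the paragraph's point: even if no new debt is taken
# on, unpaid interest accruing onto a frozen debt stock while GDP shrinks
# makes the debt/GDP ratio climb. Rates are assumed for illustration only.
debt, gdp = 3.81, 1.00              # start at 381% of GDP, as cited above
interest, gdp_growth = 0.04, -0.02  # assumed arrears rate and contraction

for year in range(5):
    debt *= 1 + interest    # unpaid interest compounds onto the debt
    gdp *= 1 + gdp_growth   # the economy shrinks under austerity
    print(f"year {year + 1}: debt/GDP = {debt / gdp:.0%}")
```

Even under these mild assumptions the ratio passes 500% of GDP within five years – austerity worsens the very measure it is supposed to cure.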

But even as economies shrink, the financial sector enriches itself by turning its debt claims – what 19th-century economists called “fictitious capital” before it was called finance capital – into a property grab. This makes an unrealistic debt overhead – unrealistic because there is no way that it can be paid under existing property relations and income distribution – into a living nightmare. That is what is happening in Europe, and it is the aim of the Obama Administration of Tim Geithner, Ben Bernanke, Eric Holder et al. They would make America look like Europe, wracked by rising unemployment, falling markets and the related syndrome of adverse social and political consequences of the financial warfare waged against labor, industry and government together. The alternative to the road to serfdom – governments strong enough to protect populations against predatory finance – turns out to be a detour along the road to debt peonage and neofeudalism.

So we are experiencing the end of a myth, or at least the end of an Orwellian rhetorical patter about what free markets really are. They are not free if they are to pay rent-extractors rather than producers to cover the actual costs of production. Financial markets are not free if fraudsters are not punished for writing fictitious junk mortgages and paying ratings agencies to sell “opinions” that their clients’ predatory finance is sound wealth creation. A free market needs to be protected against fraud and against rent seeking.

The other myth is that it is inflationary for central banks to monetize public spending. What increases prices is building interest and debt service, economic rent and financial charges into the cost of living and doing business. Debt-leveraging the price of housing, education and health care to make wage-earners pay over two-thirds of their income to the FIRE sector, FICA wage withholding and other taxes falling on labor are responsible for de-industrializing the economy and making it uncompetitive.

Central bank money creation is not inflationary if it funds new production and employment. But that is not what is happening today. Monetary policy has been hijacked to inflate asset prices, or at least to stem their decline, or simply to give to the banks to gamble. “The economy” is less and less the sphere of production, consumption and employment; it is more and more a sphere of credit creation to buy assets, turning profits and income into interest payments until the entire economic surplus and repertory of property is pledged for debt service.

To celebrate this as a “postindustrial society” as if it is a new kind of universe in which everyone can get rich on debt leveraging is a deception. The road leading into this trap has been baited with billions of dollars of subsidized junk economics to entice voters to act against their interests. The post-classical pro-rentier financial narrative is false – intentionally so. The purpose of its economic model is to make people see the world and act (or invest their money) in a way so that its backers can make money off the people who follow the illusion being subsidized. It remains the task of a new economics to revive the classical distinction between wealth and overhead, earned and unearned income, profit and rentier income – and ultimately between capitalism and feudalism. 

Footnotes
[1] No such benefits were given to homeowners whose real estate fell into negative equity. For the few who received debt write-downs to current market value, the credit was treated as normal income and taxed!
[2] Philip Aldrick, “Loss of income caused by banks as bad as a ‘world war’, says BoE’s Andrew Haldane,” The Telegraph, December 3, 2012. Mr. Haldane is the Bank’s executive director for financial stability.
[3] Stephanie Kelton, “The ‘Fiscal Cliff’ Hoax,” http://www.latimes.com/news/opinion/commentary/la-oe-kelton-fiscal-cliff-economy-20121221,0,2129176.story, December 21, 2012.
Finally, Greek economist Yanis Varoufakis, author of The Global Minotaur: America, the True Origins of the Financial Crisis and the Future of the World Economy, spoke about debt and the global economy at a Seattle Town Hall. I urge my readers to listen to the entire lecture and question period by clicking here (video lecture not embeddable).

Varoufakis argues against the existence of a debt crisis in the global economy or in individual countries, asserting that the problem is instead a crisis of too much savings “with no place to go.” The breakdown of the global recycling mechanism is what is causing this global crisis.

I realize there is a lot to digest in this post, but I wanted to share some excellent insights with my readers. Even though I don't believe the end is here, we are witnessing the end of an economic order. As Jonathan Nitzan reported in his lecture on the prison colony, the systemic crisis started before 2008, during the tech implosion, and it remains to be seen how the global economy evolves from here on.

One thing is for sure, none of these topics will be discussed at the next summit in Davos. Soros and a few other enlightened billionaires know what the risks are but most of the power elite are so far out of touch that they fail to see the bigger picture that threatens the system that enabled them to amass great wealth.

Below, Richard Duncan, author of "The New Depression," discusses whether too much credit in the U.S. could lead to the downfall of capitalism. I have also embedded the audio version of Yanis Varoufakis's lecture on debt crises and the world economy. Again, it is better to watch the video lecture on C-Span (click here).