Economic Models used in the OECD Economics Department

In case you’ve ever wondered what an Ornstein-Uhlenbeck process satisfying the stochastic differential equation dX = dB – 5X dt looks like
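For readers who would like to generate such a path themselves, here is a minimal Euler–Maruyama sketch of that equation; the horizon, step size and random seed are illustrative choices, not those behind the chart.

```python
import numpy as np
import matplotlib.pyplot as plt

# Euler-Maruyama simulation of the SDE dX = dB - 5*X dt:
# an Ornstein-Uhlenbeck process that reverts towards zero at speed 5.
rng = np.random.default_rng(42)            # seed chosen arbitrarily

T, n = 10.0, 10_000                        # horizon and number of steps (assumed)
dt = T / n
t = np.linspace(0.0, T, n + 1)

x = np.zeros(n + 1)                        # start the process at X(0) = 0
dB = rng.normal(0.0, np.sqrt(dt), size=n)  # Brownian increments
for i in range(n):
    x[i + 1] = x[i] + dB[i] - 5.0 * x[i] * dt

plt.plot(t, x)
plt.xlabel("t")
plt.ylabel("X(t)")
plt.title("Sample path of dX = dB - 5X dt")
plt.show()
```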

This month marks the centennial of the birth of mathematician Alan Turing, the “father” of modern computing and artificial intelligence. To celebrate the occasion, we’ll be publishing a series of articles on modelling and economics. In today’s article, Dave Turner, Head of the Macroeconomic Analysis Division in the OECD’s Economics Department, gives a personal view of the models we use here. 

Macroeconomics and, more specifically, economic models have come in for widespread criticism for their failure to predict the financial crisis. Informed criticism often focuses on so-called “Dynamic Stochastic General Equilibrium” models (mercifully abbreviated to DSGE models), which had become the dominant approach to macroeconomic modelling in both academic and policy circles. Such models based on assumptions of “efficient markets” and “optimising agents” with “rational expectations”, seemed to rule out the possibility of financial crises by the very nature of their assumptions.

The approach to economic modelling within the OECD is, however, much more eclectic with a large number and wide variety of different models used for different purposes. This can be illustrated by a few examples of the models which are currently used within the Economics Department at the OECD to generate the twice-yearly projections published in the OECD Economic Outlook.

The projections produced for the OECD Economic Outlook place a high weight on the judgment of country specialists rather than relying on numbers mechanically generated by a single econometric model. At the same time, these country specialists increasingly have the option to compare their projections with what econometrically estimated equations would produce. Additionally, simulations from a conventional large-scale macro model provide further information on the effect of changes since the previous forecasting round in variables including oil and other commodity prices, and fiscal and monetary policy settings. Moreover, importance is attached to ensuring that the set of country projections is globally consistent, in particular that growth in world exports is in line with growth in world imports (so avoiding implicit exports to the Moon), and estimated trade equations often play a role in ensuring this global consistency.
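As a purely stylised illustration of that consistency requirement (invented numbers, not the Department’s estimated trade equations), the gap between projected world exports and world imports can be measured and reallocated as follows.

```python
# Stylised global trade consistency check: world export growth should
# match world import growth, so any gap is an implicit "export to the Moon".
# Country figures below are hypothetical levels in billions of dollars.
projections = {
    "Country A": {"exports": 500.0, "imports": 480.0},
    "Country B": {"exports": 300.0, "imports": 350.0},
    "Country C": {"exports": 220.0, "imports": 210.0},
}

world_exports = sum(c["exports"] for c in projections.values())
world_imports = sum(c["imports"] for c in projections.values())
gap = world_exports - world_imports
print(f"Implicit exports to the Moon: {gap:+.1f}")

# One crude reconciliation: scale every country's exports so that
# world exports equal world imports (real trade equations do far more).
scale = world_imports / world_exports
for country in projections.values():
    country["exports"] *= scale
```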

With the onset of financial turmoil, further guidance for the Economic Outlook projections has been provided through the development of financial conditions indices for the major OECD countries. These capture the effect of broadly defined financial conditions on economic activity and include not only standard text-book measures of policy interest rates and exchange rates, but also survey measures of bank lending conditions and interest rate spreads (the difference between government interest rates and the rates at which companies can borrow). The latter, less conventional, components have been crucial in tracking the adverse effects of the financial crisis.  In addition to providing input to the main projections, these financial conditions indices have also been used as the basis for constructing upside and downside scenarios in relation to the ongoing financial and sovereign debt crisis.
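As a rough sketch of how such an index can be put together, the example below combines made-up component readings with made-up weights; the OECD’s actual indices use estimated weights and a richer set of components.

```python
# Illustrative financial conditions index (FCI): a weighted sum of
# standardised components, each expressed so that a positive reading
# means easier financial conditions. All numbers are invented.
readings = {                          # standard-deviation units
    "policy_rate_gap":        -0.5,   # policy rate above neutral -> tighter
    "real_exchange_rate":      0.2,
    "bank_lending_conditions": -1.3,  # survey balance: net tightening
    "corporate_bond_spread":   -1.8,  # wide spreads over government rates
}
weights = {
    "policy_rate_gap":         0.35,
    "real_exchange_rate":      0.15,
    "bank_lending_conditions": 0.30,
    "corporate_bond_spread":   0.20,
}

fci = sum(weights[k] * readings[k] for k in readings)
print(f"Financial conditions index: {fci:+.2f} (negative = tighter than usual)")
```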

Other models are used in the Economic Outlook projections to assess the current state of the main OECD economies using high-frequency data. “Indicator models” combine estimated weights on monthly hard indicators, such as industrial production and retail sales, with soft indicators such as consumer and business surveys to forecast GDP over the current and following quarter. Even here, treating the model predictions with caution is often warranted, for example if recent indicators have been affected by unseasonable weather.
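The sketch below illustrates the general idea of such an indicator (or “bridge”) model on synthetic data: quarterly GDP growth is regressed on hard and soft indicators, and the latest readings are then used to nowcast the current quarter. It is not the OECD’s estimated model.

```python
import numpy as np

# Toy indicator model: regress quarterly GDP growth on hard indicators
# (industrial production, retail sales) and a soft indicator (a survey),
# then nowcast the current quarter from the latest readings.
rng = np.random.default_rng(1)
n = 60                                    # 60 quarters of synthetic history
X = np.column_stack([
    rng.normal(size=n),                   # industrial production growth
    rng.normal(size=n),                   # retail sales growth
    rng.normal(size=n),                   # business confidence survey
    np.ones(n),                           # intercept
])
true_beta = np.array([0.4, 0.3, 0.2, 0.5])
y = X @ true_beta + rng.normal(scale=0.3, size=n)   # GDP growth, per cent

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)    # OLS estimates

latest = np.array([0.2, -0.1, 0.8, 1.0])            # latest indicator readings
print(f"Nowcast of current-quarter GDP growth: {latest @ beta_hat:.2f}%")
```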

At the other extreme of the projection horizon, a model has recently been developed to extend the Economic Outlook projections over a further 50 years. While such projections are inevitably “heroic” and subject to many qualifications, such a long-term horizon is needed to address a number of important policy issues that will only play out over a period of decades. Such issues include the implications for government debt of current high fiscal deficits; the impact of ageing populations on growth and government budgets; the impact of structural policy changes on how economies catch up with the technological leaders; and the growing importance of China and India in the global economy.

Beyond the Economic Outlook projections, much of the other empirical work undertaken in the OECD Economics Department can be described as using economic models, if “economic models” are defined more loosely to include any quantitative (usually estimated) relationship between economic outcomes and variables which are readily amenable to policy influence. Such models are often characterised by the construction of summary indicators which try to capture and contrast some salient features of member country economies and relate them to policy levers and/or economic outcomes.

Examples of such approaches include quantifying the effect of product market regulation on productivity, tax policy on R&D spending, or the design of pension systems on retirement decisions. Such specialised “models” are usually small, and do not pretend to provide a universal approach to economics or provide answers to questions across many different policy fields.

Moreover, work is ongoing to evaluate the impact of structural policies on macroeconomic performance, an area the OECD has been pioneering and in which it has already contributed significantly to the G20 process. By exploiting its access to a rich cross-country information set made available by its member countries, the OECD is uniquely well placed to provide policy advice through this type of modelling, rather than by attempting to develop the next generation of all-encompassing whole-economy models with a fancy new acronym.

Useful links

OECD economic outlook, analysis and forecasts

What can NASCAR teach NASDAQ about avoiding crashes?

“Safety advances have, with a few exceptions, come as the result of tragic consequences”

This month marks the centennial of the birth of mathematician Alan Turing, the “father” of modern computing and artificial intelligence. To celebrate the occasion, we’ll be publishing a series of articles on modelling and economics. Today’s article is from David Leinweber, head of the Lawrence Berkeley National Laboratory’s Center for Innovative Financial Technology and author of “Nerds on Wall Street: Math, Machines and Wired Markets”.

The Flash Crash wiped one trillion dollars off US stocks in 20 minutes on May 6, 2010, with most of the damage being done in only five minutes. But it took the Securities and Exchange Commission (SEC) and the Commodity Futures Trading Commission (CFTC) nearly five months to produce a report on those five minutes. If it takes so long to reconstruct and analyze an event that has already happened, imagine the difficulties in trying to regulate and prevent such incidents in markets where 1500 trades are made in the time it takes you to blink and where dozens of globally interconnected exchanges and trading facilities have replaced a small number of centralized stock markets.

The Lawrence Berkeley National Laboratory (LBNL) is actually a Department of Energy national laboratory, but we work in a number of data-intensive scientific areas where detecting and predicting particular events is crucial, ranging from cosmology to climate change. In 2010, Horst Simon, then director of LBNL’s Computational Research Division (now deputy director of LBNL) and I co-founded LBNL’s Center for Innovative Financial Technology (CIFT) to help build a bridge between the computational science and financial markets communities.

At present, a basic tool in regulating financial markets is the “circuit breaker” that stops trading, and after the Flash Crash new circuit breakers were instituted that stop the trading of individual stocks if their price variations exceed a prescribed threshold. However, as different markets and venues become more interdependent, sudden halts in one market segment can ripple into others and cause new problems.

What’s needed is a system to detect and predict hazardous conditions in real-time to allow the regulatory agencies to slow down rather than stop markets. Energy networks do this with brownouts to prevent blackouts, but we can also seek inspiration in NASCAR racing, where, faced with a growing number of increasingly gruesome crashes as the cars got too fast for the tire technology of the day, officials introduced the yellow flag to slow the races down when things got too dangerous.

Racetrack officials (like air traffic controllers or weather forecasters) can see trouble looming and intervene to prevent disaster. We are exploring the possibility of using supercomputers to survey markets in real time and turn on a “warning light” to advise regulators to slow things down when anomalies start to appear. Anomaly is in fact a rather bland term for some of the weirdness seen during the Flash Crash. For instance, at one point you could buy Accenture shares for one cent or for more than $30 within the same second.

Based on recommendations from traders, regulators, and academicians, we have implemented two sets of indicators that have “early warning” properties when applied to the data for the period preceding the Flash Crash. The Volume Synchronized Probability of Informed Trading (VPIN) measures the balance between buy and sell activities using volume intervals rather than time intervals. A variant of the Herfindahl-Hirschman Index (HHI) of market fragmentation measures how concentrated trading is across exchanges, since fragmentation is considered a source of market instability.
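To make the fragmentation measure concrete, here is a minimal sketch of an HHI computed over hypothetical venue volumes; the variant used in the study, and the VPIN calculation itself, involve considerably more detail than is shown here.

```python
# Herfindahl-Hirschman Index (HHI) of market fragmentation: the sum of
# squared volume shares across trading venues. Values near 1 mean trading
# is concentrated on one venue; values near 1/N mean it is spread evenly
# across N venues. The volumes below are invented for illustration.
venue_volumes = {
    "Venue 1": 4_200_000,
    "Venue 2": 3_800_000,
    "Venue 3": 1_500_000,
    "Venue 4":   900_000,
    "Venue 5":   600_000,
}

total = sum(venue_volumes.values())
shares = {venue: vol / total for venue, vol in venue_volumes.items()}
hhi = sum(share ** 2 for share in shares.values())
print(f"HHI = {hhi:.3f}  (perfectly even split would give {1/len(shares):.3f})")
```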

Because of the computational demands, computing indicators like VPIN and HHI in real-time will require high performance computing (HPC) resources. It will also need reliable data. For example, we discovered that different sources disagree on how many trades there were of Apple Inc at $100,000 per share on May 6, 2010.

Is real-time high-frequency monitoring needed? The SEC and CFTC have announced their intention to direct many billions from the financial industry to this effort, which some have criticized as unnecessary overkill. We disagree with the critics. It is worth spending money on improving regulatory approaches based on circuit breakers. Stopping trading is a very blunt instrument that does not allow the market to self-correct and stabilize, and it can easily make a bad situation worse.

Our tests show that VPIN, HHI and similar indicators could provide the early-warning signals needed for a gradual “slow down” mechanism to replace on/off circuit breakers, a view our high-frequency trading and academic collaborators hold strongly as well. Furthermore, we believe that the same approach, likely with additional computation, is applicable to financial market cyber-security, which is widely acknowledged as important but largely ignored in the regulatory debate.

Useful links

For a detailed account of the work summarized above, see Federal Market Information Technology in the Post Flash Crash Era: Roles for Supercomputing

OECD work on financial market trends and policies

Going with the flow: Can analog simulations make economics an experimental science?

Turbulence ahead? Click to see an animation of how it develops

This month marks the centennial of the birth of mathematician Alan Turing, the “father” of modern computing and artificial intelligence. To celebrate the occasion, we’ll be publishing a series of articles on modelling and economics. Today’s article is from John Hulls, of the Cambiant Project at the Dominican University of California that uses a fluid dynamics modeling concept he developed to simulate economic performance. John is also an affiliate at Lawrence Berkeley National Laboratory, working principally in the area of environmental applications of the LBL Phylochip microarray technology.

Nobel Laureate James Meade related how, “Once upon a time a student at London School of Economics got into difficulties with such questions as whether Savings are necessarily related to Investments, but he realized that monetary flows could be viewed as tankfuls of water…” Meade’s support led to the young Bill Phillips’ creation of the Moniac, a sophisticated hydromechanical analog simulator, a cascade of tanks and interconnecting valves controlled by slotted plastic graphs representing the major sectors of the British economy. It was a fully dynamic simulator that, as Phillips explained in a 1950 paper, “will give solutions for non-linear systems as easily as for linear ones. The relationships need not be in analytical form if the curves can be drawn”.

Meade recognized the machine’s dynamic nature and the visibility of its flows as a powerful teaching tool. A favorite exercise at London School of Economics was to run an experiment on the impact of uncoordinated government intervention. One student (the “Chancellor of the Exchequer”) controlled taxation and public spending, a second managed monetary policy (“Head of the Bank of England”). Told to achieve a stable target level of national income while disregarding each other’s actions, they produced results that were invariably messy. Phillips used his machine to investigate many complex issues, leading to his applications of feedback control theory to economics and the Phillips curve showing the relationship between inflation and unemployment for which he is famous.

So why the current lack of analog simulations, which can reveal an economy’s dynamics yet are regulated by the physical laws of the analogy on which they are based? Julian Reiss examined the value of simulation as a means of economic experiment in a 2011 paper, defining the difference between mathematical modeling and simulation. He shows that only a tiny percentage of economic papers employ true simulations, despite the success of simulation in many other fields, from aeronautics to population genetics. Yet the money is in developing highly constrained, complex mathematical models to interpret market statistics. One need only consider the financial sector, with trading algorithms stalking each other through a cybernetic market ecology, where the difference between the quick and the dead in making a trade is as little as a few microseconds, with billions of dollars to the survivors.

This “black box” trading ecology requires precise, highly constrained mathematical analysis of market statistics to build these algorithms, which obviously affect the pricing of financial instruments associated with the current Euro crisis. Yet, as Kevin Slavin points out, these algorithms, performing around 70% of all securities trades in the U.S. market, all came together two years ago and, for a few minutes, “vanished” 9% of the entire value of the stock market, though no human asked for it. He says that we’ve turned these algorithms loose in the world, invisible to humans. We can only watch the numbers scroll on our trading screens, our only power “a big, red ‘Stop’ button.” Some high-speed digital glitch, invisible to humans, caused the Flash Crash of 2010 (the subject of tomorrow’s article in this series). Contrast this with the analog basis of Phillips’ simulator. A few small mistakes and leaks are inconsequential, but vanishing 9% of the market would leave large puddles on the floor.

Here’s where the power of simulation as a tool for experimentation really counts. It is worth watching the video of Allan McRobie, demonstrating Cambridge University’s restored Moniac, quickly adjusting valves and graphs to demonstrate multiplier effects, interest-rate impacts and business cycles. Best quote: “Let’s just shut off the banking sector for a moment…”. McRobie shows the many metaphors relating to economics and flow – liquidity, income streams, asset flow, etc. – but notes that Phillips’ machine is a direct analog device tied to physical laws governing flow, not the mathematics of digital instruction defining Slavin’s stalking algorithms.

In a Dominican University Green MBA project to develop an analog economic policy “flight simulator”, we invoked the shade of Phillips and his stocks and flows to show policymakers how resource utilization and environmental considerations affect economic performance. Instead of hydraulics, we used the flow over a specific cambered surface to drive the simulation, essentially a “wing” flying through an atmosphere of potential transactions, with the surface representing the structure of a given economy. The idea came from the serendipitous observation of the similarity between pressure distributions over an airfoil and income distribution, producing an analog where the principal forces on the surface and dynamic outputs have direct economic equivalents. Results are shown with stocks and resources represented as altitude and potential energy, and with the kinetic energy of flow represented as velocity through the atmosphere of transactions.

The properties of the simulation’s cambered surfaces were validated by comparing output from varying the growth coefficient with U.S. income from 1979-2007, and comparing the overall force coefficients developed by the U.S. and Sweden over the same period vs. GDP. The simulation displays all the economies’ characteristics including long- and short-term cyclic behavior, efficiency and stability, with relative income shown by one’s position on the cambered surface. The results are highly visible, shown in the project website video, which includes an analog replication of the “Crash of 87” and the collapse of the housing bubble. We also show that the metaphor assumed by former U.S. Treasury Secretary Larry Summers, that “the U.S. is flying out of the recession dangerously close to the stall”, is actually a direct analog.

The simulation, disturbingly, shows that there is a minimum velocity below which austerity will have negative effects, directly opposite to the intended policy, literally a region of reversed commands. Phillips’ Moniac simply runs dry, but our model shows that catastrophic stall is inevitable unless velocity is restored to the point where growth is possible.

Math models and economists’ other tools all count, but the profession must develop good simulations that let policymakers evaluate the potential consequences of their actions in an accessible, comprehensible and visible way.

Useful links

OECD economic outlook, analysis and forecasts

Turing’s Economics: A Birth Centennial Homage

Click to go to the Alan Turing Year website

This month marks the centennial of the birth of mathematician Alan Turing, the “father” of modern computing and artificial intelligence. To celebrate the occasion, we’ll be publishing a series of articles on modelling and economics. The series starts with a contribution from Professor K. Vela Velupillai of the Algorithmic Social Sciences Research Unit at Trento University’s Economics Department, and Elected Member of the Turing Centenary Advisory Committee.

The “Five Turing Classics” – On Computable Numbers, Systems of Logic, Computing Machinery and Intelligence, The Chemical Basis of Morphogenesis, and Solvable and Unsolvable Problems – should be read together to understand why there can be something called Turing’s Economics. Herbert Simon, one of the founding fathers of computational cognitive science, was deeply indebted to Turing in the way he tried to fashion what I have called “computable economics”, acknowledging that “If we hurry, we can catch up to Turing on the path he pointed out to us so many years ago.”

Simon was on that path, for almost the whole of his research life. It has been my mission, first to learn to take this “path”, and then to teach others the excitement and fertility for economic research of taking it too.

A comparison of Turing’s classic formulation of Solvable and Unsolvable Problems in his last published paper in 1954 and Simon’s variation on that theme, as Human Problem Solving, would show that the human problem solver in the world of Simon needs to be defined – as Simon did – in the same way Turing’s approach was built on the foundations he had established in 1936-37. At a deeper epistemological level, I have come to characterize the distinction between orthodox economic theory and Turing’s Economics in terms of the last sentence of Turing’s paper (italics added): “These, and some other results of mathematical logic may be regarded as going some way towards a demonstration, within mathematics itself, of the inadequacy of ‘reason’ unsupported by common sense.”

We – at ASSRU – characterize every kind of orthodox economic theory, including orthodox behavioural economics, as advocating the adequacy of “reason” unsupported by common sense; contrariwise, in Turing’s Economics we take seriously what we now refer to as Turing’s Precept: ‘the inadequacy of reason unsupported by common sense’.

At another frontier of research in many of what are fashionably referred to as “the sciences of complexity”, references to Turing’s The Chemical Basis of Morphogenesis are becoming routine, even in varieties of computational economics exercises, especially when concepts such as “emergence” are invoked. It is now increasingly realized that the notion of “emergence” originates in the works of the British Emergentists, from John Stuart Mill to C. Lloyd Morgan, in the half-century straddling the last quarter of the 19th and the first quarter of the 20th century.

A premature obituary of British Emergentism was proclaimed on the basis of a rare, rash claim by Dirac (italics added): “The underlying physical laws necessary for the mathematical theory of a large part of physics and the whole of chemistry are thus completely known, and the difficulty is only that the exact application of these laws leads to equations much too complicated to be soluble. It therefore becomes desirable that approximate practical methods of applying quantum mechanics should be developed, which can lead to an explanation of the main features of complex atomic systems without too much computation.”

Contrast this with Turing’s wonderfully laconic, yet eminently sensible precept in his 1954 paper (italics added): “No mathematical method can be useful for any problem if it involves much calculation.”

Turing’s remarkably original work on The Chemical Basis of Morphogenesis was neither inspired by the British Emergentist tradition, nor did it influence later allegiances to it – such as the neurological and neurophilosophical work of Nobel Laureate Roger Sperry. On the other hand, the structure of the experimental framework Turing chose to construct was uncannily similar to the one devised by Fermi, Pasta and Ulam in 1955, although with different purposes in mind.

Turing’s aim was to devise a mechanism by which a spatially homogeneous distribution of chemicals – i.e., formless or patternless structure – could give rise to form or patterns via what has come to be called a Turing Bifurcation, the basic bifurcation that lies at the heart of almost all mathematical models for patterning in biology and chemistry, a reaction-diffusion mechanism formalised as a (linear) dynamical system and subject to what I refer to as the linear mouse theory of self-organisation, for reasons you can discover here.
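For readers who want the bare bones of that mechanism, the standard textbook form of a two-chemical reaction-diffusion system of the kind Turing analysed (not his original notation) is

\[
\frac{\partial u}{\partial t} = f(u,v) + D_u \nabla^2 u, \qquad
\frac{\partial v}{\partial t} = g(u,v) + D_v \nabla^2 v ,
\]

where u and v are chemical concentrations, f and g the reaction kinetics, and D_u, D_v the diffusion coefficients. A spatially uniform steady state that is stable when diffusion is switched off can become unstable to perturbations of particular wavelengths once the two chemicals diffuse at sufficiently different rates; the growth of those wavelengths out of an initially formless state is the Turing Bifurcation referred to above.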

Those interested in the nonlinear, endogenous, theory of the business cycle know that Turing Bifurcations are at least as relevant as the Hopf Bifurcation in modeling the “emergence” and persistence of unstable dynamics in aggregative economic dynamics.

Turing’s Economics straddles the micro-macro divide in a way that makes the notion of microfoundations of macroeconomics thoroughly irrelevant; more importantly, it is also a way of circumventing the excessive claims of reductionists in economics, and their obverse! This paradox would have, I conjecture, provided much amusement to the mischievous child that Turing was, all his life.

Useful links

Prof. Velupillai kindly provided this extended version of his article, including notes and comments

Computable Economics (Elgar, 2012), edited by Velupillai, Zambelli and Kinsella, brings together the seminal papers of computable economics from the last sixty years and encompasses the works of some of the most influential researchers in this area, including Turing

Applications of complexity science for public policy from the OECD Global Science Forum

Algorithmic Social Sciences Research Unit (ASSRU) at the University of Trento

Alan Turing

Rio+20=0?

Click to find out more about OECD work of relevance to Rio+20

Today we publish the third in a series of articles on the OECD’s contribution to the Rio+20 UN Conference on Sustainable Development

Many politicians “cannot resist the power of the Invisible Demons, because they Secretly Serve the Invisible Demons”, according to one comment on the Rio+20 outcome document on the blog of Kumi Naidoo, Executive Director of Greenpeace International. I had a more lurid image of Satan and his minions, but it’s true that an eternity spent affirming, acknowledging, underscoring, stressing, recognising and recalling the need for holistic and integrated approaches to this and that would qualify as a reasonable definition of Hell in most religions.  And speaking of definitions, Greenpeace’s political director Daniel Mittler described Rio+20 as an “epic failure … developed countries have given us a new definition of hypocrisy”. Other civil society organisations agree, including Oxfam, WWF, and the International Trade Union Confederation (ITUC).

How about the OECD? The document you can click on at the top of this article opens with a message from OECD Secretary-General Angel Gurría saying that 20 years on from Rio 1992, sustainable development remains a powerful message but it still isn’t a reality. It’s unlikely to become a reality unless we start changing what can be changed now, but as Gurría points out, “even the best policies are nothing without the political will to implement them”.

It’s not that the political will for change doesn’t exist. On the contrary, governments are always looking for new ways to develop the economy, but what we’ve seen since Rio 1992 is that economic growth on its own isn’t enough to address problems such as inequality, and it can even make environmental and other problems worse. And as we saw with Rio+20, countries at different stages of development and with different natural resources do not share a common view as to what the best policies are, even when they agree on the scale and causes of environmental degradation and climate change.

There’s also a problem of time scales and a related one of habit. It’s a bit like the character played by Marcello Mastroianni in Fellini’s 8½ (or 8.5 as the OECD Style Guide would have it). Somebody tells him about this great method to quit smoking in a fortnight. “It’s taken me 40 years to get up to two packets a day,” he replies, “So why do you think I’d want to quit in two weeks?” Our current model of economic growth has brought enormous progress to billions of people, and we’re hooked, even though the costs keep growing. Today’s technologies and ways of doing things will be expensive and difficult to replace, and many of the benefits may not appear for some time, or be so diffuse that the impact on individual people (or businesses) may not be very noticeable. The effects of the crisis and anything that would slow growth are, however, immediate.

The OECD proposes green growth as a way to meet the challenges. A report prepared with the World Bank and UN for the G20 summit that preceded Rio+20 starts from the fact that structural reform agendas exist already, so green growth and sustainable development policies could be incorporated into them. The main elements of a “green” policy package are those of any structural reform – investment, tax, regulation, innovation and so on – but the report is accompanied by a policy toolkit for different national situations. For example, in many OECD countries the main energy issue may be reducing greenhouse gas emissions, whereas in a developing country, access to electricity supplies may be the priority.

The report provides a good overview of the main questions, but even if you’re familiar with the subject, take a look at the last section on the strengths, weaknesses and conditions for using the various market-based and non-market-based policies, for example if you want to compare taxes on pollution with stricter technology standards.

Finally, what do you think Rio+40 will be? A shout of triumph or a cry of despair?

Useful links

OECD’s contribution to Rio+20

OECD work on green growth

Inequality, the crash and the crisis. Part 3: The Limit to Inequality

Click to see the book

Today we publish the last of three articles on inequality and the crisis by Stewart Lansley, visiting fellow at The Townsend Centre for International Poverty Research, Bristol University and the author of The Cost of Inequality: Why Economic Equality is Essential for Recovery, Gibson Square, 2012. He was one of the speakers at the 2012 OECD Forum session: How Is Inequality Holding Us Back?

The key lessons of the 2008 Crash are now becoming clear. For the last thirty years, some of the world’s most important economies have been applying a faulty theory on the way the economy works. Demand in most large economies is wage-led not profit-led. That is, a lower wage share leads to lower growth. This is also true in aggregate of the global economy.

The evidence from the last 100 years is that more equal societies soften, and more polarized ones intensify, the gyrations of the business cycle. Inequality is not just an issue about fairness and proportionality, it is integral to economic success. A capitalist model that allows the richest members of society to accumulate a larger and larger share of the cake merely brings a lethal mix of demand deflation, asset appreciation and a long squeeze on the productive economy that will end in economic turmoil.

Yet that model has survived the second deepest recession of the last 100 years largely intact. In contrast, the economic crisis of the 1930s was to give way to a very different model of political economy, one that eroded the extremes of wealth that had helped create the crisis.

Today, it is largely business as usual. The world’s rich have been the main winners from the global recession. In the United States, profits and dividends have risen since 2008 while real wages have fallen. According to the American economist Emmanuel Saez, average real family income declined by a remarkable 17.4 per cent between 2007 and 2009.

Profits and dividends are up largely because wages are down. As JP Morgan Chase chief investment officer Michael Cembalest has documented: “U.S. labor compensation is now at a 50-year low relative to both company sales and U.S. GDP.”

A key consequence of this trend is that all income growth in the US in 2010 went to the wealthiest 10 per cent of households, with 93 per cent of it going to the wealthiest one per cent.

In the UK, there has been a similar, if less extreme, pattern. Real wages have fallen on average by seven per cent in the last two years and are set to continue to fall. Indeed, the independent Office for Budget Responsibility (OBR) has forecast that the wage share will have fallen by a further four percentage points between 2010 and 2016. In contrast, incomes at the top have continued to rise through the slump. In 2007, the ratio of the median earnings of FTSE 100 top executives to median wages stood at 92:1. By 2011, it had risen to 102:1. Not only did executive pay greatly outstrip average earnings growth up to 2007; apart from a slight blip in 2009, it has continued to do so.

There has been much talk about the need to tackle growing inequality, but little real action. Ending the present crisis and building a sustainable global economy requires a much more fundamental leap that accepts that there is a limit to the level of inequality – one that is still being breached in a majority of nations – that is consistent with stability.

The successful management of economies depends especially on securing a more equal distribution of market incomes, before the application of taxes and benefits. Tackling the unequal “pre-distribution” of incomes means elected governments taking more responsibility for both the distribution of factor shares and of relative levels of pay.

It is a role that most, if not all, governments have been and remain reluctant to play. For most national governments – and global institutions from the IMF to the OECD – reducing inequality has not been a central economic goal alongside, say, controlling inflation or tackling fiscal deficits.

In the US, the UK and most rich nations, the economic role and impact of inequality has been at best a side-issue in economic decision-making. Too many governments have, by default, allowed the relationship between wages and output to become dangerously imbalanced. They have permitted remuneration practices to emerge that have distorted incentives and sanctioned business activity geared more closely to wealth diversion than wealth creation.

Translating talk into action requires governments to set clear targets for a number of key economic relationships. These should include the balance between wages and profits, the pay gap between top and bottom and the degree of income concentration. In a majority of countries, the wage share is too low and heading lower; the pay gap, already at historic highs, is heading higher while income concentrations are above the limit consistent with stability.

Meeting these targets means ditching many of the failed economic shibboleths – that inequality leads to faster growth, that allowing the rich to keep more of their own money boosts growth and tax revenue, that a larger pay gap reduces unemployment – of the last thirty years. It will require much tougher policy measures aimed at keeping economic elites in check. National governments need to develop a new contract with labour that raises the wage floor, bolsters the middle and lowers the ceiling. This means the taming of excessive corporate power and a rebalancing of bargaining power in favour of the workforce. It means moving towards more progressive tax regimes with much tougher global action on tax havens.

None of this will be easy. Despite the accumulated evidence that fairer societies and economic success go hand in hand, and the mounting pressure for change, the political and economic consensus remains rooted in the past. Radical change will be heavily opposed by those with most to lose. Yet a model of capitalism that fails to share the proceeds of growth more proportionately is not sustainable.

Useful links

Divided We Stand: Why Inequality Keeps Rising

Growing Unequal? Income Distribution and Poverty in OECD Countries

Inequality, the crash and the crisis. Part 2: A model of capitalism that fails to share the fruits of growth

Click to see the book

Today we publish the second of three articles on inequality and the crisis by Stewart Lansley, visiting fellow at The Townsend Centre for International Poverty Research, Bristol University and the author of The Cost of Inequality: Why Economic Equality is Essential for Recovery, Gibson Square, 2012. He was one of the speakers at the 2012 OECD Forum session: How Is Inequality Holding Us Back?

The driving force behind the widening income gap of the last thirty years has been a shift in the distribution of “factor shares” – the way the output of the economy is divided between wages and profits. In the first two decades after the Second World War, a transformed model of capitalism emerged – across the rich world – in which it was accepted that the fruits of growth should be more evenly shared than they had been in the pre-war era. In the US, the share of output allocated to wages rose and stayed high. In the UK the “wage share” settled at between 58 and 60 per cent of output, a higher rate than achieved in the pre-war era and the Victorian age. It was this elevated wage share that helped drive the “great leveling” of the post-war decades.

From the late 1970s, the capitalist model underwent another transformation, one characterised by a backward shift in the way the proceeds of growth were divided. By 2007, the share of output going to wages had fallen to 53 per cent in the UK. In the US, the fruits of growth became even more unevenly divided, with the workforce ending up with an even smaller share of the economic cake. There were similar, if shallower trends in most rich nations.

This process of decoupling wages from output has led to a growing “wage-output gap”, with a very profound, and negative, impact on the way economies function. This is for three key reasons. First, by cutting the purchasing power needed to buy the extra output being produced, the long wage squeeze brought domestic and global deflation. Consumer societies started to lose the capacity to consume.

The solution to this problem – which would have brought a prolonged recession much earlier – was to allow an explosion in private debt to fill the demand gap. In the UK, levels of personal debt rose from 45 per cent of incomes in 1981 to 157 per cent in 2008. In the US, debt reached a third more than national income by 2008. This helped to fuel a domestic boom from the mid-1990s but was never going to be sustainable. Far from preventing recession, it just delayed it.

The same factors were at work in the 1920s. The 1929 Crash was preceded by a sharp rise in inequality with the resulting demand gap also filled by an explosion in private debt. In 1920s America, the ratio of household debt to national income rose by 70 per cent in less than a decade.

Second, the intensified concentration of income led to a tidal wave of footloose global capital – a mix of corporate surpluses and burgeoning personal wealth. According to the pro-inequality theorists, these growing surpluses should have led to a boom in productive investment. Instead, they ended up fuelling commodity speculation, financial engineering and hostile corporate raids – activity geared more to transferring existing wealth than to creating new wealth, reinforcing the shift towards greater inequality.

Little of this benefitted the real economy. Of the £1.3 trillion lent by British banks between 1997 and 2007, 84 per cent was in mortgages and financial services. The proportion of lending going to manufacturing halved over the same period. It was this combination of the erosion of ordinary living standards and the accumulation of massive global cash surpluses that created the bubbles – in housing, property and business – that eventually brought the global economy to its knees. Again there are striking parallels with the 1920s, when swelling surpluses in the US were poured into real estate and the stock market, creating the bubbles that triggered the 1929 Crash.

Third, the effect of these trends has been to intensify the concentration of power, with wealth and economic decision-making heavily concentrated in the hands of a tiny minority. In the US, such is the concentration of income that 5 per cent of earners account for 35 per cent of all consumer spending. A new elite has been able to exercise its muscle to ensure that economic policies work in its interest. Hence the inaction on tax havens, the blind-eye approach to tax avoidance and the scaling back of regulations on the City and Wall Street – policies that have simultaneously accentuated the risk of economic failure.

Not only did the growing income divide help to drive the global economy over the cliff in 1929 and 2008; it is now helping to prolong the crisis. UK wage-earners today have around £100 billion less in their pockets (roughly equivalent to the size of the nation’s health budget) than if the cake were shared as it was in the late 1970s. In the bigger economy of the US the sum stands at £500 billion. In contrast, the winners from the process of upward redistribution – big business and the top one per cent – are sitting on growing corporate surpluses and soaring private fortunes that are mostly lying idle. This is a perfect recipe for paralysis.

The economic thrust of the last thirty years – greater reliance on markets, the weakened bargaining power of labour and hiked fortunes at the top – was aimed at dealing with the crisis of the 1970s, a mix of “stagflation” (stagnation and rising inflation) and falling productivity. It succeeded in squeezing out inflation but replaced these fault lines with an equally toxic mix – global deflation, rising indebtedness and booming asset prices – that eventually brought economic collapse.

Part 3 looks at the lessons to be drawn from these trends.

Useful links

Divided We Stand: Why Inequality Keeps Rising

Growing Unequal? Income Distribution and Poverty in OECD Countries