The rising complexity of the global economy

Sony Kapoor, Managing Director, Re-Define International Think Tank and CEO of Court Jesters Consulting

A complicated system (such as a car) can be disassembled and understood as the sum of its parts. In contrast, a complex system (such as traffic) exhibits emergent characteristics that arise out of the interaction between its constituent parts. Applying complexity theory to economic policy making requires this important recognition – that the economy is not a complicated system, but a complex one.

Historically, economic models and related policy making have treated the economy as a complicated system: simplified and stylised models, often applied to a closed economy, a specific sector, or particular channels of interaction such as interest rates, seek first to simplify the real economy, then to understand it, and finally to generalise in order to make policy.

This approach is increasingly outdated and produces results that simply fail to capture the rising complexity of the modern economy. Policy decisions based on the notion of a complicated system that is the sum of its parts can be dangerously inaccurate and inappropriate. What forces are driving this increasing complexity in the global economy? And what, if anything, can be done about it?

A complex system can be roughly understood as a network of nodes, where the nodes are interconnected to varying degrees through single or multiple channels. Whatever happens at one node is transmitted through the network and is likely to affect other nodes to varying degrees. The behaviour of the system as a whole thus depends on the nodes, as well as on the nature of the inter-linkages between them. The complexity of the system, in this instance the global economy, is influenced by a number of factors: first, the number of nodes; second, the number of inter-linkages; third, the nature of those inter-linkages; and fourth, the speed at which a stimulus or shock propagates to other nodes. Let us now apply each of these factors to the global economy.

The global economy has seen a rapid increase in the number of nodes. One way of understanding this is to look at countries that are active participants in the global economy. The growth of China and other emerging markets, as well as their increasing integration into the world trading system and, more recently, the global financial system, is a good proxy for tracking the rise in the number of nodes. The relative size and importance of these nodes has also risen, with China by some measures already the world’s largest economy.

Simultaneously, the number of inter-linkages between nodes has risen even more rapidly. The number of possible connections between nodes increases non-linearly with the number of nodes, so the global economy now has more financial, economic, trade, information, policy, institutional, technology, military, travel and human links between nodes than ever before. Increasingly complex supply chains in trade and manufacturing, ever greater outsourcing of services, rising military collaboration, the global nature of new technological advances, increasing migration and travel, as well as the rise and rise of internet and telecommunications traffic across the world, have all greatly increased the number of connections across the nodes.
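To make the non-linearity concrete: in a network of n nodes, the number of possible pairwise links is n(n-1)/2, so doubling the number of nodes roughly quadruples the number of possible connections. A minimal sketch (the function name is mine, purely illustrative):

```python
def possible_links(n: int) -> int:
    """Number of possible pairwise connections in a network of n nodes."""
    return n * (n - 1) // 2

# Links grow quadratically: each doubling of nodes roughly quadruples links
for n in [10, 20, 100, 200]:
    print(n, possible_links(n))
```

With multiple channels (trade, finance, information, travel) between each pair, the actual link count is a multiple of this already fast-growing figure.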

It is not just that the number of interconnections between nodes has risen almost exponentially. The scope and nature of these inter-linkages has broadened significantly. The most notable broadening has come in the form of the rapid rise of complex manufacturing supply chains; financial links that result directly from the gradual dismantling of capital controls; and the rise of cross-border communication and spread of information through the internet. These ever-broadening connections between different nodes fundamentally change the behaviour of the system and how the global economy will react to any stimulus, change or shock in one or more nodes, in ways that become ever harder to model or predict.

Last but not least, it is not just the number and intensity of links between nodes that have risen, but also the speed at which information, technology, knowledge, shocks, finance and pathogens move between them. In complexity theory parlance, the result is an ever more tightly coupled global economy. Tightly coupled systems are more efficient, and the quest for efficiency has given rise to just-in-time supply chains, ever faster financial trading and other such developments. But this efficiency comes at the cost of rising fragility. Evidence is mounting that financial, economic, pathogenic, security and other shocks are spreading more rapidly through the world.

To sum up, the Dynamic Stochastic General Equilibrium (DSGE) models and other traditional approaches to modelling the global economy are increasingly inadequate and inaccurate in capturing the rising complexity of the global economy. This complexity is being driven both by the rising number of nodes (countries) now integrated into the global economy, as well as the number and nature of the interconnections between these, which are intensifying at an even faster pace.

This calls for a new approach to policymaking that incorporates lessons from complexity theory by using a system-wide approach to modelling, changes institutional design to reduce the fragility of the system and deepens international and cross-sector policy making and policy coordination.

Useful links

OECD-EC-INET Oxford Workshop on Complexity and Policy, 29-30 September, OECD HQ, Paris

New Approaches to Economic Challenges – Complexity of the Economy (October 2015 OECD Workshop)

Economic Models used in the OECD Economics Department Dave Turner, Head of the Macroeconomic Analysis Division in the OECD’s Economics Department, on Insights

Simple Policy Lessons from Embracing “Complexity”

NAEC main messages

Bill White, Chair of the OECD Economic and Development Review Committee (EDRC)

The dominant school of economic thought, prior to the crisis, essentially modelled the national economy as a totally understandable and changeless machine (DSGE models). Moreover, the machine almost always operated at its optimal speed, churning out outputs in an almost totally predictable (linear) way, under the close control of its (policy) operators. While the sudden and unexpected onslaught of the current crisis, to say nothing of its unexpected depth and duration, might have been expected to put paid to this false belief, in practice it has not. Nevertheless, the crisis has significantly increased interest in another viewpoint. Rather than being a machine, the economy should instead be viewed as a complex adaptive system, like a forest, with massive interdependencies among its parts and the potential for highly nonlinear outcomes. Such systems evolve in a path-dependent way and there is no equilibrium to return to. There are in fact many such systems in both nature and society: traffic patterns, movements of crowds, the spread of crime and diseases, social networks, urban development and many more. Moreover, their properties have been well studied and a number of common features stand out. Economic policymakers could learn a great deal from these interdisciplinary studies. Four points are essential.

First, all complex systems fail regularly; that is, they fall into crisis. Moreover, the literature suggests that the distribution of outcomes commonly follows a power law: big crises occur infrequently, while smaller ones are more frequent. A look at economic history, which has become more fashionable after decades of neglect, indicates that the same patterns apply. For example, there were big crises in 1825, 1873 and 1929, as well as smaller ones more recently in the Nordic countries, Japan and South East Asia. The policy lesson to be drawn is that, if crises are indeed inevitable, then we must have ex ante mechanisms in place for managing them. Unfortunately, this was not the case when the global crisis erupted in 2007 and when the Eurozone crisis erupted in 2010.
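The size-frequency pattern described above can be illustrated by sampling from a power-law (Pareto) distribution. This is an illustrative sketch, not a calibrated model of crisis sizes; the tail index of 1.5 is an arbitrary choice:

```python
import random

def pareto_sample(rng: random.Random, alpha: float, xm: float = 1.0) -> float:
    """Draw one value from a Pareto (power-law) distribution via inverse-CDF sampling."""
    return xm / (1.0 - rng.random()) ** (1.0 / alpha)

rng = random.Random(42)
sizes = [pareto_sample(rng, alpha=1.5) for _ in range(100_000)]

moderate = sum(s > 2 for s in sizes)   # moderate "crises"
severe = sum(s > 20 for s in sizes)    # big "crises": far rarer, but they do occur
print(moderate, severe)
```

The point of the power law is precisely that `severe` is never zero: big crises are rare but structurally unavoidable, which is why ex ante crisis-management mechanisms matter.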

Second, the trigger for a crisis is irrelevant. It could be anything, perhaps even of trivial importance in itself. It is the system that is unstable. For example, the current global crisis began in 2006 in the subprime sector of the US mortgage market. Federal Reserve Chairman Bernanke originally estimated that the losses would not exceed 50 billion dollars and would not extend beyond the subprime market. Today, eight years later and still counting, the crisis has cost many trillions and has gone global. It seems totally implausible that this was merely “contagion”. Similarly, how could difficulties in tiny Greece in 2010 have had such far-reaching and lasting implications for the whole Eurozone? The global crisis was in fact an accident waiting to happen, as indeed was the crisis within the Eurozone. The lesson to be drawn is that policy makers must focus more on interdependencies and systemic risks. Even if the timing and triggers of crises are impossible to predict, it remains feasible to identify signs of potential instability building up and to react to them. In particular, economic and financial systems tend towards instability as credit and debt levels build up, either to high levels or very quickly. Both are dangerous developments and commonly precede steep economic downturns.

Third, complex systems can result in very large economic losses much more frequently than a Normal distribution would suggest. Moreover, large economic crises often lead to social and political instability. The lesson to be drawn is that policymakers should focus more on avoiding really bad outcomes than on optimising good ones. We simply do not have the knowledge to do policy optimisation, as Hayek emphasised in his Nobel Prize lecture entitled “The Pretence of Knowledge”. In contrast, policymakers have pulled out all the stops to resist little downturns over the course of the last few decades. In this way, they helped create the problem of debt overhang that we still face today. Indeed, the global ratio of (non-financial) debt to GDP was substantially higher in 2014 than it was in 2007.
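The gap between Normal-distribution intuition and fat-tailed reality can be made concrete with a quick tail comparison (the Pareto tail index of 2 here is an arbitrary illustrative choice):

```python
import math

def normal_tail(k: float) -> float:
    """P(X > k) for a standard Normal distribution, via the complementary error function."""
    return 0.5 * math.erfc(k / math.sqrt(2))

def pareto_tail(k: float, alpha: float = 2.0) -> float:
    """P(X > k) for a Pareto distribution with minimum 1 and tail index alpha."""
    return k ** (-alpha)

# A "5-sigma" event is vanishingly rare under the Normal, but routine under a power law
for k in [3, 5, 10]:
    print(k, normal_tail(k), pareto_tail(k))
```

Under the Normal, a 5-sigma loss has probability of roughly three in ten million; under this fat-tailed alternative, the corresponding tail probability is several orders of magnitude larger, which is why "once in a millennium" events keep arriving every few decades.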

Fourth, looking at economic and financial crises throughout history, they exhibit many similarities but also many differences. As Mark Twain suggested, history never repeats itself but it does seem to rhyme. In part this is due to adaptive human behaviour, both in markets and on the part of regulators, in response to previous crises. While excessive credit growth might be common to most crises, both the source of the credit (banks vs non-banks) and the character of the borrowers (governments, corporations and households) might well be different. Note too that such crises have occurred under a variety of regulatory and exchange rate regimes. Moreover, prized stability in one area today (say payment systems) does not rule out that area being the trigger for instability tomorrow. Changes in economic structure or behaviour can all too easily transform today’s “truth” into tomorrow’s “false belief”. The lesson to be drawn is that policymakers need eternal vigilance and, indeed, institutional structures that are capable of responding to changed circumstances. Do not fight the last war.

It is ironic that the intellectual embrace of complexity by economic policymakers should lead to such simple policy lessons. Had they been put into practice before the current crisis, a lot of economic, social and political damage might have been avoided. As Keynes rightly said “The ideas of economists and political philosophers, both when they are right and when they are wrong, are more powerful than is commonly understood”. Nor is the hour too late to embrace these ideas now. The recognition that the pursuit of ultra-easy monetary policies could well have undesirable and unexpected consequences, in our complex and adaptive economy, might lead to a greater focus on alternative policies to manage and resolve the crisis. Absent such policies, the current crisis could easily deepen in magnitude rather than dissipate smoothly over time. This is an outcome very much to be avoided, but it will take a paradigm shift for this to happen.

Useful links

Complexity of the economy: research and policy implications Workshop organised on 26-27 October 2015 by the OECD New Approaches to Economic Challenges project with GloComNet

Economic Models used in the OECD Economics Department

In case you’ve ever wondered what an Ornstein-Uhlenbeck process satisfying the stochastic differential equation dX = dB – 5X dt looks like
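For readers who would rather simulate than stare at the equation, the SDE in that last link, dX = dB - 5X dt, can be approximated with a basic Euler-Maruyama scheme (the step size, horizon and parameter names below are my own choices):

```python
import math
import random

def simulate_ou(theta: float = 5.0, sigma: float = 1.0, x0: float = 0.0,
                dt: float = 1e-3, n_steps: int = 10_000, seed: int = 0) -> list:
    """Euler-Maruyama approximation of the Ornstein-Uhlenbeck SDE
    dX = sigma dB - theta X dt."""
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(n_steps):
        dB = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment over dt
        x += sigma * dB - theta * x * dt    # drift pulls X back towards 0
        path.append(x)
    return path

path = simulate_ou()
# X is noisy but mean-reverting; its stationary variance is sigma^2 / (2 * theta) = 0.1
```

The resulting path looks like white noise tethered to zero by a spring: the stronger the mean-reversion coefficient (5 in the linked equation), the tighter the tether.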

This month marks the centennial of the birth of mathematician Alan Turing, the “father” of modern computing and artificial intelligence. To celebrate the occasion, we’ll be publishing a series of articles on modelling and economics. In today’s article, Dave Turner, Head of the Macroeconomic Analysis Division in the OECD’s Economics Department, gives a personal view of the models we use here. 

Macroeconomics and, more specifically, economic models have come in for widespread criticism for their failure to predict the financial crisis. Informed criticism often focuses on so-called “Dynamic Stochastic General Equilibrium” models (mercifully abbreviated to DSGE models), which had become the dominant approach to macroeconomic modelling in both academic and policy circles. Such models, based on assumptions of “efficient markets” and “optimising agents” with “rational expectations”, seemed to rule out the possibility of financial crises by the very nature of their assumptions.

The approach to economic modelling within the OECD is, however, much more eclectic with a large number and wide variety of different models used for different purposes. This can be illustrated by a few examples of the models which are currently used within the Economics Department at the OECD to generate the twice-yearly projections published in the OECD Economic Outlook.

The projections produced for the OECD Economic Outlook place a high weight on the judgment of country specialists rather than relying on numbers mechanically generated by a single econometric model. On the other hand, these country specialists increasingly have the option to compare their projections with what econometrically estimated equations would produce. Additionally, simulations from a conventional large-scale macro model provide further information on the effect of changes since the previous forecasting round in variables including oil and other commodity prices, and fiscal and monetary policy settings. Moreover, importance is attached to ensuring that the set of country projections is globally consistent, in particular that growth in world exports is in line with growth in world imports (so avoiding implicit exports to the Moon), and estimated trade equations often play a role in ensuring this global consistency.

With the onset of financial turmoil, further guidance for the Economic Outlook projections has been provided through the development of financial conditions indices for the major OECD countries. These capture the effect of broadly defined financial conditions on economic activity and include not only standard text-book measures of policy interest rates and exchange rates, but also survey measures of bank lending conditions and interest rate spreads (the difference between government interest rates and the rates at which companies can borrow). The latter, less conventional, components have been crucial in tracking the adverse effects of the financial crisis.  In addition to providing input to the main projections, these financial conditions indices have also been used as the basis for constructing upside and downside scenarios in relation to the ongoing financial and sovereign debt crisis.
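As a stylised illustration of how such an index can be built (the component names, readings and weights below are hypothetical, not the OECD's actual specification), a financial conditions index is essentially a weighted sum of standardised indicators:

```python
def financial_conditions_index(components: dict, weights: dict) -> float:
    """Weighted sum of standardised financial indicators (illustrative weights)."""
    return sum(weights[name] * value for name, value in components.items())

# Hypothetical standardised readings (z-scores); negative = tighter conditions
components = {
    "policy_rate": -0.5,
    "exchange_rate": 0.2,
    "lending_survey": -1.5,    # banks tightening credit standards
    "corporate_spread": -2.0,  # spreads widening sharply
}
weights = {"policy_rate": 0.3, "exchange_rate": 0.1,
           "lending_survey": 0.3, "corporate_spread": 0.3}

print(financial_conditions_index(components, weights))
```

Note how, in this toy reading, the survey and spread components dominate the index even though the policy rate has barely moved: that is exactly the property that made the less conventional components crucial during the crisis.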

Other models are used in the Economic Outlook projections to assess the current state of the main OECD economies, using high-frequency data. Thus “indicator models” use estimated combinations of monthly data on hard indicators, such as industrial production and retail sales, as well as soft indicators such as consumer and business surveys, to make forecasts of GDP over the current and following quarter. Even here, treating the model predictions with caution is often warranted, especially, for example, if recent indicators have been affected by unusually unseasonal weather.

At the other extreme of the projection horizon, a model has recently been developed to extend the Economic Outlook projections over a further 50 years. While such projections are inevitably “heroic” and subject to many qualifications, such a long-term horizon is needed to address a number of important policy issues that will only play out over a period of decades. Such issues include the implications for government debt of current high fiscal deficits; the impact of ageing populations on growth and government budgets; the impact of structural policy changes on how economies catch-up with the technological leaders; and the growing importance of China and India in the global economy.
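One of the workhorse relations behind such long-term fiscal projections is the debt dynamics identity: the debt-to-GDP ratio evolves with the gap between the interest rate and the growth rate, minus the primary balance. A hedged sketch, with constant rates and purely illustrative numbers:

```python
def debt_path(b0: float, r: float, g: float, primary_balance: float,
              years: int) -> list:
    """Debt-to-GDP path under b[t+1] = b[t] * (1 + r) / (1 + g) - primary_balance."""
    b = b0
    path = [b]
    for _ in range(years):
        b = b * (1 + r) / (1 + g) - primary_balance
        path.append(b)
    return path

# Hypothetical country: debt at 100% of GDP, interest rate 3%, growth 2%,
# running a primary deficit of 1% of GDP (primary_balance = -0.01)
path = debt_path(1.0, 0.03, 0.02, -0.01, 50)
# With r > g and a persistent primary deficit, the ratio drifts upward without limit
```

Even this toy version shows why a 50-year horizon matters: a modest interest-growth gap plus a small deficit compounds into a very different fiscal position over decades.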

Beyond the Economic Outlook projections, much of the other empirical work undertaken in the OECD Economics Department can be described as using economic models, if “economic models” are defined more loosely to include any quantitative (usually estimated) relationship between economic outcomes and variables which are readily amenable to policy influence. Such models are often characterised by the construction of summary indicators which try to capture and contrast some salient features of member country economies and relate them to policy levers and/or economic outcomes.

Examples of such approaches include quantifying the effect of product market regulation on productivity, tax policy on R&D spending, or the design of pension systems on retirement decisions. Such specialised “models” are usually small, and do not pretend to provide a universal approach to economics or provide answers to questions across many different policy fields.

Moreover, work is ongoing to evaluate the impact of structural policies on macroeconomic performance, an area the OECD has been pioneering and in which it has already contributed significantly to the G20 process. By exploiting its access to a rich cross-country information set made available by its member countries, it is in this type of modelling that the OECD is uniquely well placed to provide policy advice to its member countries, rather than attempting to develop the next generation of all-encompassing whole-economy models with a fancy new acronym.

Useful links

OECD economic outlook, analysis and forecasts

Going with the flow: Can analog simulations make economics an experimental science?

Turbulence ahead?

Today’s article is from John Hulls, of the Cambiant Project at the Dominican University of California, which uses a fluid dynamics modeling concept he developed to simulate economic performance. John is also an affiliate at Lawrence Berkeley National Laboratory, working principally in the area of environmental applications of the LBL Phylochip microarray technology.

Nobel Laureate James Meade related how, “Once upon a time a student at London School of Economics got into difficulties with such questions as whether Savings are necessarily related to Investments, but he realized that monetary flows could be viewed as tankfuls of water…” Meade’s support led to the young Bill Phillips’ creation of the Moniac, a sophisticated hydromechanical analog simulator, a cascade of tanks and interconnecting valves controlled by slotted plastic graphs representing the major sectors of the British economy. It was a fully dynamic simulator that, as Phillips explained in a 1950 paper, “will give solutions for non-linear systems as easily as for linear ones. The relationships need not be in analytical form if the curves can be drawn”.

Meade recognized the machine’s dynamic nature and the visibility of its flows as a powerful teaching tool. A favorite exercise at London School of Economics was to run an experiment on the impact of uncoordinated government intervention. One student (the “Chancellor of the Exchequer”) controlled taxation and public spending, a second managed monetary policy (“Head of the Bank of England”). Told to achieve a stable target level of national income while disregarding each other’s actions, they produced results that were invariably messy. Phillips used his machine to investigate many complex issues, leading to his applications of feedback control theory to economics and the Phillips curve showing the relationship between inflation and unemployment for which he is famous.

So why the current lack of analog simulations, which can reveal an economy’s dynamics yet are regulated by the physical laws of the analogy on which they are based? Julian Reiss examined the value of simulation as a means of economic experiment in a 2011 paper, defining the difference between mathematical modeling and simulation. He shows that only a tiny percentage of economic papers employ true simulations, despite its success in many other fields from aeronautics to population genetics. Yet the money is in developing highly constrained, complex mathematical models to interpret market statistics. One need only consider the financial sector, with trading algorithms stalking each other through a cybernetic market ecology, where the difference between the quick and the dead in making a trade is as little as a few microseconds, with billions of dollars to the survivors.

This “black box” trading ecology requires precise, highly constrained mathematical analysis of market statistics to build these algorithms, which obviously affect the pricing of financial instruments associated with the current Euro crisis. Yet, as Kevin Slavin points out, these algorithms, performing around 70% of all securities trades in the U.S. market, all came together two years ago and, for a few minutes, “vanished” 9% of the entire value of the stock market, though no human asked for it. He says that we’ve turned these algorithms loose in the world, invisible to humans. We can only watch the numbers scroll on our trading screens, our only power “a big, red ‘Stop’ button.” Some high-speed digital glitch, invisible to humans, caused the Flash Crash of 2010 (the subject of tomorrow’s article in this series). Contrast this with the analog basis of Phillips’ simulator: a few small mistakes and leaks are inconsequential, but vanishing 9% of the market would leave large puddles on the floor.

Here’s where the power of simulation as a tool for experimentation really counts. It is worth watching the video of Allan McRobie demonstrating Cambridge University’s restored Moniac, quickly adjusting valves and graphs to demonstrate multiplier effects, interest-rate impacts and business cycles. Best quote: “Let’s just shut off the banking sector for a moment….” McRobie shows the many metaphors relating economics to flow (liquidity, income streams, asset flows and so on), but notes that Phillips’ machine is a direct analog device tied to physical laws governing flow, not the mathematics of digital instruction defining Slavin’s stalking algorithms.

In a Dominican University Green MBA project to develop an analog economic policy “flight simulator”, we invoked the shade of Phillips and his stocks and flows to show policymakers how resource utilization and environmental considerations affect economic performance. Instead of hydraulics, we used the flow over a specific cambered surface to drive the simulation, essentially a “wing” flying through an atmosphere of potential transactions, with the surface representing the structure of a given economy. The idea came from the serendipitous observation of the similarity between pressure distributions over an airfoil and income distribution, producing an analog where the principal forces on the surface and dynamic outputs have direct economic equivalents. Results are shown with stocks and resources represented as altitude (potential energy) and the kinetic energy of flow represented as velocity through the atmosphere of transactions.

The properties of the simulation’s cambered surfaces were validated by comparing output from varying the growth coefficient with U.S. income data from 1979-2007, and by comparing the overall force coefficients developed by the U.S. and Sweden over the same period against GDP. The simulation displays all the economies’ characteristics, including long- and short-term cyclic behavior, efficiency and stability, with relative income shown by one’s position on the cambered surface. The results are highly visible, as shown in the project website video, which includes an analog replication of the “Crash of ’87” and the collapse of the housing bubble. We also show that former U.S. Treasury Secretary Larry Summers’ metaphor that “the U.S. is flying out of the recession dangerously close to the stall” is actually a direct analog.

The simulation, disturbingly, shows that there is a minimum velocity below which austerity will have negative effects, directly opposite to the intended policy, literally a region of reversed command. Phillips’ Moniac simply runs dry, but our model shows that catastrophic stall is inevitable unless velocity is restored to the point where growth is possible.

Math models and economists’ other tools all count, but the profession must develop good simulations that let policymakers evaluate the potential consequences of their actions in an accessible, comprehensible and visible way.

Useful links

OECD economic outlook, analysis and forecasts