Complexity: A new approach to economic challenges

William Hynes, OECD New Approaches to Economic Challenges (NAEC) initiative

The OECD launched its “New Approaches to Economic Challenges” (NAEC) initiative in 2012 to reflect on the lessons of the financial crisis and Great Recession for economic analysis and policymaking. Former European Central Bank President Jean-Claude Trichet said that: “as a policy-maker during the crisis, I found the available models of limited help. In fact, I would go further: in the face of the crisis, we felt abandoned by conventional tools”. But even before the crisis, Greg Mankiw of Harvard University lamented that “macroeconomic research of the past three decades has had only minor impact on the practical analysis of monetary or fiscal policy”.

NAEC examines the shortcomings of analytical models and promotes new policy tools and data. It questions traditional ideas and methods and challenges group-think and silo approaches by inviting comment and criticism from outside the Organisation, and by soliciting input from disciplines such as sociology, psychology and history to enrich the policy discussions.

While the financial crisis struck at the core of traditional economic theory and models, it became apparent by 2016 that the failure of economic thinking and action was far deeper and more destabilising than first thought. Part of NAEC’s mandate is therefore to develop an agenda for inclusive and sustainable growth.

This is all the more urgent given the backlash against globalisation, increased inequalities of income and opportunities, and the negative impact of growth on the environment. We need to develop what Eric Beinhocker calls a “new narrative of growth”, one that puts people at the centre of economic policy. Therefore NAEC is helping to focus on redistribution, a concept neglected in economic analysis for many years, and helping to ensure that policy decisions improve the lives of those at the bottom of the income distribution.

It is also helping to consider the well-being of people as a multidimensional concept, which implies reconsidering important elements of the economic narrative, such as justice and social cohesion. NAEC does so by thinking “out of the box”, emphasising the need to empower people, regions and firms to realise their full potential. This is at the core of the Productivity-Inclusiveness Nexus, which considers how to expand the productive assets of an economy by investing in the skills of its people, and how to provide a level playing field for firms to compete, including in lagging regions.

However, the challenges are too complex and interconnected for conventional models and analyses. As Andy Haldane argues, the global economy is increasingly characterised by discontinuities, tipping points, multiple equilibria, radical uncertainties and the other characteristics of complex systems. This is why a key theme of NAEC has been the complexity and interconnectedness of the economy, exemplified by the Productivity-Inclusiveness Nexus.

The contributors to this series argue that complexity and systems thinking can improve understanding of issues such as financial crises, sustainability of growth, competitiveness, innovation, and urban planning. Recognising the complexity of the economy implies that greater attention should be paid to interactions, unintended consequences, stability, resilience, policy buffers and safeguards.

Working with the European Commission and the Institute for New Economic Thinking (INET) Oxford, the NAEC initiative demonstrated in a number of workshops that complexity economics is a promising approach for delivering new insights into major public policy challenges and an exciting research agenda going forward.

The workshops offered a timely opportunity for policy-makers, academics and researchers to discuss the policy applications emerging from the study of complexity. The NAEC Roundtable in December 2016 discussed whether economics was close to a tipping point – a transition to a new behavioural complexity paradigm. There is wide agreement among economists on the limitations and the shortcomings of the rational expectations paradigm and much discussion on how to move forward.

The first phase of NAEC’s complexity work has made the case for further and deeper examination of complexity. Going forward, the challenge will be to demonstrate the value of complexity, systems thinking and agent-based models in concrete areas, including financial networks, urban systems and the other issues highlighted in this series.

Complexity offers an opportunity for addressing long-held concerns about economic assumptions, theories and models. For the OECD, it also holds out the potential for creating better policies for better lives.

Useful links

The OECD organised a Workshop on Complexity and Policy, 29-30 September 2016, at OECD HQ, Paris, along with the European Commission and INET. Watch the webcast: 29/09 morning, 29/09 afternoon, 30/09 morning

Out of complexity, a third way?

Bill Below, OECD Directorate for Public Governance and Territorial Development (GOV)

The perennial curmudgeon H.L. Mencken is famously misquoted as saying: “For every complex problem there is an answer that is clear, simple, and wrong.” The ability to simplify is of course one of our strengths as humans. As a species, we might just as well have been called homo reductor—after all, to think is to find patterns and organize complexity, to reduce it to actionable options or spin it into purposeful things. Behavioural economists have identified a multitude of short-cuts we use to reduce complex situations into actionable information. These hard-wired tricks, or heuristics, allow us to make decisions on the fly, providing quick answers to questions such as ‘should I trust you?’ or ‘is it better to cash in now, or hold out for more later?’ Are these tricks reliable? Not always. A little due diligence never hurts when listening to one’s gut instincts, and the value of identifying heuristics is in part to understand the limits of their usefulness and the potential blind spots they create.

The point is, there is no shortage of solutions to problems, whether we generate them ourselves or receive them from experts. And there’s no dearth of action plans and policies built on them. So the issue isn’t so much how do we find answers?—we seem to have little trouble doing that. The real question is: how do we get to the right answers, particularly in the face of unrelenting complexity?

There’s a nomenclature in the hierarchy of complexity, as well as proper and improper ways of going about problem-solving at each level. This is presented in the new publication “From Transactional to Strategic: Systems Approaches to Public Challenges” (OECD, 2017), a survey of strategic systems thinking in the public sector. Developed by Dave Snowden at IBM in the early 2000s, the Cynefin Framework posits four levels of systems complexity: obvious, complicated, complex and chaotic. Obvious challenges imply obvious answers. But the next two levels are less obvious. While we tend to use the adjectives ‘complicated’ and ‘complex’ interchangeably, the framework imposes a formal distinction. Complicated systems and issues have at least one answer and are characterised by causal relationships (although these are sometimes hidden at first). Complex systems are in constant flux. In complicated systems, we know what we don’t know (known unknowns) and apply our expertise to fill in the gaps. In complex systems, we don’t know what we don’t know (unknown unknowns) and cause-and-effect relations can only be deduced after the fact. That doesn’t mean one can’t make inroads into understanding and even shaping a complex system, but you need to use methods adapted to the challenge. A common bias is to mistake complexity for mere complication. The result is overconfidence that a solution is just around the corner, and the wrong choice of tools.

Unfortunately, mismatches between organisational structures and problem structures are common. For example, in medicine, without proper coordination, two specialists can work at cross-purposes on a single patient. While the endocrinologist treats the patient’s hyperglycaemia (a complicated system) with pharmaceuticals and diet, the nephrologist might treat her kidney failure (also a complicated system) through a separate set of pharmaceuticals and dietary recommendations. Not only can these two pursuits be at odds (what may be good for the kidneys may be bad for blood sugar, for example), but both treatments can have effects on other systems of the body that may go unmonitored. Understanding these interactions, and the effects of each treatment on the body’s individual systems as well as on the body as a joined-up, holistic entity (which it certainly is), would be the broader, complex and more desirable goal.

The body politic may not be so different. Institutions have specific and sometimes rather narrow remits and often act without a broader vision of what other institutions are doing or planning. Each institution may have its specific expertise yet few opportunities for sustained, trans-agency approaches to solving complex issues.

Thus, top-down, command-and-control institutional structures breed their own resistance to the kind of holistic, whole-of-government approach that complex problems and systems thinking require. This may be an artefact of the need for structures that adapt efficiently to new mandates, in the form of political appointees overseeing a stable core of professional civil servants. Also, the presence of elected or appointed officials at the top of clearly defined government institutions may be emblematic of the will of the people being heard. Structural resistance may also stem from competitive political cycles, which discourage candidates from engaging in cycle-spanning, intertemporal trade-offs or committing to projects with complex milestones. In a world of sound-bites, fake news and scorched-earth tactics, a reasoned, methodical and open-ended systems approach can be a large, slow-moving political target.

And that is the challenge of approaching complex, ‘wicked’ problems with the appropriate institutional support and scale: there must be fewer sweeping revolutions and fewer cries of total failure from the opposition. Disruption gives way to continuous progress as the complex system evolves from within. It is a kind of third way that eschews polarisation and favours collaboration, and that blends market principles with what might be called ‘state guidance’ rather than top-down intervention.

Global warming, policies for ageing populations, child protection services and transportation management are all examples of complex systems and challenges. To take the last example, in the US, traffic congestion is estimated to cost households USD 120 billion per year, and businesses a further USD 30 billion (OECD, 2016). But where to start? With a massive infrastructure building spree? Where would you add additional capacity? How much would you invest in roads, and how much in public transportation? What are the relative advantages of toll roads vs increases in gas or vehicle taxes? What are the likely effects of gas price fluctuations and the onset of fleets of electric, self-driving cars? What about the technologies that have yet to be invented? And what will be the impact of policies on income inequality, gender equality, the environment and well-being? Finally, how do you efficiently join up levels of government and all the stakeholders potentially involved?

Complex systems are hard to define at the outset and open ended in scope. They can only be gradually altered, component by component, sub-system by sub-system, by learning from multiple feedback loops, measuring what works and evaluating how much closer it takes you to your goals.

General Systems Theory (GST), that is, thinking about what is characteristic of systems themselves, sprang from a bold new technological era in which individual fields of engineering were no longer sufficient to master the breathtaking range of knowledge and skills required by emerging systems integration. That know-how gave us complex entities as fearsome as the intercontinental ballistic missile and as inspiring as manned space flight. Today, the world seems to be suffering from complexity fatigue, whose symptoms are a longing for simple answers and a world free of interdependencies, with clear good guys and bad guys and brash, unyielding voices that ‘tell it like it is’, a world with lines drawn, walls built and borders closed. Bringing back a sense of excitement and purpose in mastering complexity may be the first ‘wicked’ problem we should tackle.

In the meantime, we need to find a way to stop approaching complex challenges through the limits of our institutions and start approaching them through the contours of the challenges themselves. Otherwise too many important decisions will be clear, simple and wrong.

Useful links

OECD Observatory of Public Sector Innovation

OECD Directorate for Public Governance and Territorial Development

Comparing Governments for Long Term Threats and Complex Challenges (OECD, 2016)

Building a Government for the Future: Survey of Strategic Systems Thinking in the Public Sector (OECD, 2013)

A new narrative for a complex age

Eric Beinhocker, Executive Director, The Institute for New Economic Thinking at the Oxford Martin School

If 2008 was the year of the financial crash, 2016 was the year of the political crash. In that year we witnessed the collapse of the last of the four major economic-political ideologies that dominated the 20th century: nationalism, Keynesian pragmatism, socialism and neoliberalism. In the 1970s and 80s the centre-right in many countries abandoned Keynesianism and adopted neoliberalism. In the 1980s and 90s the centre-left followed, largely abandoning democratic socialism and adopting a softer version of neoliberalism.

For a few decades we thought the end of history had arrived and political battles in most OECD countries were between centre-right and centre-left parties arguing in a narrow political spectrum, but largely agreeing on issues such as free trade, the benefits of immigration, the need for flexible efficient markets, and the positive role of global finance. This consensus was reinforced by international institutions such as the IMF, World Bank, and OECD, and the Davos political and business elite.

In 2008 that consensus was rocked, last year it crumbled. Some will cling on to the idea that the consensus can be revived. They will say we just need to defend it more vigorously, the facts will eventually prevail, the populist wave is exaggerated, it’s really just about immigration, Brexit will be a compromise, Clinton won more votes than Trump, and so on. But this is wishful thinking. Large swathes of the electorate have lost faith in the neoliberal consensus, the political parties that backed it, and the institutions that promoted it. This has created an ideological vacuum being filled by bad old ideas, most notably a revival of nationalism in the US and a number of European countries, as well as a revival of the hard socialist left in some countries.

History tells us that populist waves can lead to disaster or to reform. Disaster is certainly a realistic scenario now with potential for an unravelling of international cooperation, geopolitical conflict, and very bad economic policy. But we can also look back in history and see how, for example, in the US at the beginning of the 20th century Teddy Roosevelt harnessed populist discontent to create a period of major reform and progress.

So how might we tilt the odds from disaster to reform? First, listen. The populist movements do contain some racists, xenophobes, genuinely crazy people, and others whom we should absolutely condemn. But they also contain many normal people who are fed up with a system that doesn’t work for them. People who have seen their living standards stagnate or decline, who live precarious lives one paycheque at a time, who think their children will do worse than they have. And their issues aren’t just economic, they are also social and psychological. They have lost dignity and respect, and crave a sense of identity and belonging.

They feel – rightly or wrongly – that they played by the rules, but others in society haven’t, and those others have been rewarded. They also feel that their political leaders and institutions are profoundly out of touch, untrustworthy, and self-serving. And finally they feel at the mercy of big impersonal forces – globalisation, technology change, rootless banks and large faceless corporations. The most effective populist slogan has been “take back control”.

After we listen we then have to give new answers. New narratives and policies about how people’s lives can be made better and more secure, how they can fairly share in their nation’s prosperity, how they can have more control over their lives, how they can live with dignity and respect, how everyone will play by the same rules and the social contract will be restored, how openness and international cooperation benefits them not just an elite, and how governments, corporations and banks will serve their interests, and not the other way around.

This is why we need new economic thinking. This is why the NAEC initiative is so important. The OECD has been taking economic inequality and stagnation seriously for longer than most, and has some of the best data and analysis of these issues around. It has done leading work on alternative metrics to GDP, such as well-being, that give insight into how people are really doing. It is working hard to articulate new models of growth that are inclusive and environmentally sustainable. It has leading initiatives on education, health, cities, productivity, trade, and numerous other topics that are critical to a new narrative.

But there are gaps too. Rational economic models are of little help on these issues, and a deeper understanding of psychology, sociology, political science, anthropology, and history is required. Likewise, communication is critical – thick reports are important for government ministries, but stories, narratives, visuals, and memes are needed to shift media and public thinking.

So what might such a new narrative look like? My hope is that even in this post-truth age it will be based on the best facts and science available. I believe it will contain four stories:

  • A new story of growth
  • A new story of inclusion
  • A new social contract
  • A new idealism.

This last point doesn’t get discussed enough. Periods of progress are usually characterised by idealism, common projects we can all aspire to. Populism is a zero-sum mentality – the populist leader will help me get more of a fixed pie. Idealism is a positive-sum mentality – we can do great things together. Idealism is the most powerful antidote to populism.

Economics has painted itself as a detached amoral science, but humans are moral creatures. We must bring morality back into the centre of economics in order for people to relate to and trust it. Some might question whether this is territory the OECD should get into. But the OECD was founded “to improve the economic and social well-being of people around the world” and provide a forum for governments to “seek solutions to common problems.” These issues will dramatically impact the well-being of people around the world for decades to come and are certainly a common problem.

So my hope is that the OECD will continue to play a leadership role, through NAEC and its other initiatives, on new economic thinking, not just in a narrow technical sense, but in the broad sense of helping forge a new vision that puts people back at the centre of our economy. We are truly at a fluid point in history. It could be a great step backwards or a great step forwards. We must all push forwards together.

Useful links

The OECD organised a Workshop on Complexity and Policy, 29-30 September 2016, at OECD HQ, Paris, along with the European Commission and INET. Watch the webcast: 29/09 morning, 29/09 afternoon, 30/09 morning

Agent-based models to help economics do a better job

Richard Bookstaber, University of California

Economics has not done a very good job of dealing with crises. I think this is because there are four characteristics of human experience that manifest themselves in crises and that cannot be addressed well by the methods of traditional economics.

The first of these is computational irreducibility. You may be able to reduce the behaviour of a simple system to a mathematical description that provides a shortcut to predicting its future behaviour, the way a map shows that following a road gets you to a town without having to physically travel the road first. Unfortunately, for many systems, as Stephen Wolfram argues, you only know what is going to happen by faithfully reproducing the path the system takes to its end point, through simulation and observation, with no chance of getting to the final state before the system itself. It’s a bit like the map Borges describes in On Rigor in Science, where “the Map of the Empire had the size of the Empire itself and coincided with it point by point”. Not being able to reduce the economy to a computation means you can’t predict it using analytical methods, but economics requires that you can.
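Computational irreducibility is easy to see in a toy setting. The sketch below, a minimal illustration only, runs an elementary cellular automaton (Wolfram's Rule 30); the grid width, seed and number of steps are arbitrary choices, and nothing here is drawn from an economic model.

```python
# Computational irreducibility in miniature: an elementary cellular automaton
# (Wolfram's Rule 30). Grid width, seed and step count are illustrative choices.

def step(cells, rule=30):
    """Apply one synchronous update of an elementary cellular automaton."""
    n = len(cells)
    new = []
    for i in range(n):
        left, centre, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        neighbourhood = (left << 2) | (centre << 1) | right   # a value from 0 to 7
        new.append((rule >> neighbourhood) & 1)               # read that bit of the rule
    return new

cells = [0] * 41
cells[20] = 1                        # a single live "seed" cell in the middle
for t in range(20):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
# There is no known shortcut to the pattern at step t: the only way to find it
# is to compute every intermediate step, which is Wolfram's point about
# irreducible systems.
```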

The second characteristic property is emergence. Emergent phenomena occur when the overall effect of individuals’ actions is qualitatively different from what each of the individuals is doing. You cannot anticipate the outcome for the whole system on the basis of the actions of its individual members, because the large system will show properties its individual members do not have. For example, some people pushing others in a crowd may lead to nothing, or it may lead to a stampede with people getting crushed, despite nobody wanting this or acting intentionally to produce it. Likewise, no one decides to precipitate a financial crisis, and indeed at the level of individual firms, decisions are generally made to take prudent action to avoid the costly effects of a crisis. But what is locally stable can become globally unstable.

The name for the third characteristic, non-ergodicity, comes from the Austrian physicist Ludwig Boltzmann, who defined as “ergodic” (from the Greek ergon, work or energy, and odos, path) a concept in statistical mechanics whereby a single trajectory, continued long enough at constant energy, would be representative of an isolated system as a whole. The mechanical processes that drive our physical world are ergodic, as are many biological processes. We can predict how a ball will move when struck without knowing how it got into its present position – the past doesn’t matter. But the past matters in social processes, and you cannot simply extrapolate it to know the future. The dynamics of a financial crisis are not reflected in the pre-crisis period, for instance, because financial markets are constantly innovating, so the future may look nothing like the past.

Radical uncertainty completes our quartet. It describes surprises—outcomes or events that are unanticipated, that cannot be put into a probability distribution because they are outside our list of things that might occur. Electric power, the atomic bomb, or the internet are examples from the past, and of course by definition we don’t know what the future will be. As Keynes put it, “There is no scientific basis to form any calculable probability whatever. We simply do not know.” Economists also talk about “Knightian uncertainty”, after Frank Knight, who distinguished between risk, for example gambling in a casino where we don’t know the outcome but can calculate the odds; and what he called “true uncertainty” where we can’t know everything that would be needed to calculate the odds. This in fact is the human condition. We don’t know where we are going, and we don’t know who we will be when we get there. The reality of humanity means that a mechanistic approach to economics will fail.

So is there any hope of understanding what’s happening in our irreducible, emergent, non-ergodic, radically uncertain economy? Yes, if we use methods that are more robust, that are not embedded in the standard rational expectations, optimisation mode of economics. To deal with crises, we need methods that deal with computational irreducibility; recognise emergence; allow for the fact that not even the present is reflected in the past, never mind the future; and that can deal with radical uncertainty. Agent-based modelling could be a step in the right direction.

Agent-based models (ABM) use a dynamic system of interacting, autonomous agents to allow macroscopic behaviour to emerge from microscopic rules. The models specify rules that dictate how agents will act based on various inputs. Each agent individually assesses its situation and makes decisions on the basis of its rules. Starlings swirling in the sky (a “murmuration”) are a good illustration. The birds appear to operate as a system, yet the flight is based on the decisions of the individual birds. Building a macro, top-down model will miss the reality of the situation, because at the macro level the movements of the flock are complex and non-linear, yet are not based on any system-wide programme. But you can model the murmuration with simple rules about how each bird reacts to the distance, speed and direction of the other birds, and heads for the perceived centre of the flock in its immediate neighbourhood.
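As a rough illustration of the kind of local rules described above, the sketch below implements a boids-style flocking model. The weights, radii, flock size and update scheme are assumptions chosen for readability, not a calibrated model of starling behaviour.

```python
import random

# A boids-style sketch of the murmuration example: each bird follows purely
# local rules (head for the perceived centre of nearby birds, avoid collisions,
# match neighbours' speed and direction). All parameter values are illustrative.

class Bird:
    def __init__(self):
        self.x, self.y = random.uniform(0, 100), random.uniform(0, 100)
        self.vx, self.vy = random.uniform(-1, 1), random.uniform(-1, 1)

def update(birds, radius=15.0, w_centre=0.01, w_avoid=0.05, w_match=0.05):
    for b in birds:
        near = [o for o in birds
                if o is not b and (o.x - b.x) ** 2 + (o.y - b.y) ** 2 < radius ** 2]
        if not near:
            continue
        # Rule 1: steer towards the perceived centre of the local flock
        cx = sum(o.x for o in near) / len(near)
        cy = sum(o.y for o in near) / len(near)
        b.vx += (cx - b.x) * w_centre
        b.vy += (cy - b.y) * w_centre
        # Rule 2: keep a minimum distance from very close neighbours
        for o in near:
            if (o.x - b.x) ** 2 + (o.y - b.y) ** 2 < 4.0:
                b.vx -= (o.x - b.x) * w_avoid
                b.vy -= (o.y - b.y) * w_avoid
        # Rule 3: match the average velocity of neighbours
        b.vx += (sum(o.vx for o in near) / len(near) - b.vx) * w_match
        b.vy += (sum(o.vy for o in near) / len(near) - b.vy) * w_match
    for b in birds:
        b.x += b.vx
        b.y += b.vy

flock = [Bird() for _ in range(100)]
for _ in range(200):
    update(flock)     # coherent flock-level motion emerges from local rules only
```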


Likewise, the agent-based approach recognises that individuals interact and in interacting change the environment, leading to the next course of interaction. It operates without the fiction of a representative consumer or investor who is as unerringly right as a mathematical model can dream. It allows for construction of a narrative—unique to the particular circumstances in the real world—in which the system may jump the tracks and careen down the mountainside. This narrative gives us a shot at pulling the system back safely.

In short, agent-based economics arrives ready to face the real world, the world that is amplified and distorted during times of crisis. This is a new paradigm rooted in pragmatism and in the complexities of being human.

Useful links

Richard Bookstaber video contribution, OECD New Approaches to Economic Challenges (NAEC)

Richard Bookstaber will be giving a conference within the framework of the OECD NAEC initiative in Paris in June 2017. His latest book, discussing the ideas outlined above, has just been published by Princeton University Press: The End of Theory: Financial Crises, the Failure of Economics, and the Sweep of Human Interaction

The OECD organised a Workshop on Complexity and Policy, 29-30 September 2016, at OECD HQ, Paris, along with the European Commission and INET. Watch the webcast: 29/09 morning, 29/09 afternoon, 30/09 morning

The aid community should stop pretending to know the answers and start asking the right questions

Frans Lammersen and Jorge Moreira da Silva (Director), OECD Development Co-operation Directorate (DCD-DAC)

In The Wealth of Nations, Adam Smith wrote that: “Little else is requisite to carry a state to the highest degree of opulence from the lowest barbarism but peace, easy taxes, and a tolerable administration of justice: all the rest being brought about by the natural course of things.” Others were less optimistic. They argued that nations are rich or poor because of differences in religion, culture, endowments, and/or geography.

Modern economic development theories originate from thinking about how to reconstruct Europe in the aftermath of World War II. The European Recovery Program – or the Marshall plan – was based on the notion that economic growth can be stifled by local institutions and social attitudes, especially if these influence the domestic savings and investments rate. According to this linear growth model, a correctly-designed massive injection of capital coupled with public sector intervention to address market failures would ultimately lead to industrialisation and economic development. Many other economic development theories have since followed, but none have been able to explain convincingly why some countries experience rapid economic growth and others not.

The development community has continued its quest for the missing ingredient to ignite economic growth. Candidates have included capital, technology, policies, institutions, better politics, and market integration. Every time we think we have identified what’s missing, we find that it is actually not something which can be provided from outside, but turns out to be an endogenous characteristic of the system itself. Traditionally, development assistance has been rooted in an engineering, mass-production, conveyor-belt mentality, with agencies promoting “silver bullet” solutions for such complex problems as eradicating malaria, reducing vulnerability, improving resilience, strengthening connectivity and so on. Unfortunately, these piecemeal, one-step-at-a-time development programmes have often failed to deliver.

Increasingly, complexity thinking – a way of understanding how elements of systems interact and change over time – has found its way into the development discourse. After all, what could be more complex than promoting development, sustainability, human rights, peace, and governance? We should think of the economy and society as being composed of a rich set of interactions between large numbers of adaptive agents, all of which are coevolving. On this approach, development is not just an increase in outputs, but the emergence of an interlinked system of economic, financial, legal, social and political institutions, firms, products and technologies. Together these elements and their interactions provide citizens with the capabilities to live happy, healthy and fulfilling lives.

Once we look at development as the outcome of a complex adaptive system instead of the sum of what happens to the people and firms, we will get better insights into how we can help accelerate and shape development. We would be more effective if we assess development challenges through this prism of complex adaptive systems. This could yield important insights about how best to prioritise, design and deliver holistic development programmes for achieving the multiple goals of inclusiveness, sustainability and economic growth that underpin the 2030 Sustainable Development Agenda. There is increasing support in aid agencies for the idea that solutions to complex problems must evolve, through trial and error – and that successful programmes are likely to be different for each local context, with its particular history, natural resources and webs of social relations. The key for anyone engaged in the aid business is to put their own preconceived ideas aside and first observe, map, and listen carefully to identify the areas where change for the better is already happening and then try to encourage and nurture that change further.

Complexity matters particularly when the knowledge and capacities required for tackling problems are spread across actors without strong, formalised institutional links. Inherent to many complex problems are divergent interests, conflicting goals or competing narratives. Moreover, it is often unclear how to achieve a given objective in a specific context, or how to change processes that involve significant, unpredictable forces. At the same time, it is important to emphasise that the counsel of complexity should not be taken as a counsel of despair for development. There has been immense social and economic progress, and development assistance has been found to be helpful overall. Development co-operation has contributed to achieving economic objectives by helping developing countries connect their firms to international markets; to achieving social objectives by making globalisation pro-poor and reducing inequalities; and to environmental objectives by helping countries adapt to climate change while exploiting comparative advantages.

Not all development challenges are inherently complex though. For those that are, complexity should not be used as an excuse for fatalism and inertia. Instead we should strive to promote innovation, experimentation and renewal. We should build partnerships to learn about the past, allowing us to shape approaches that are more likely to work and that are owned by the people we are trying to help. They will tell us what is working and what is not. Together we should build a narrative for change involving many different voices and perspectives. We should also be modest and realise that it might be better to start small and learn and adapt as we go along, in iterative processes of dialogue. We should keep looking for change, scanning widely for new factors emerging in the wider world, and listen to a wide range of opinions so as to be better able to anticipate, adapt and seize opportunities.

Embracing complexity where it matters will allow us to contribute more effectively to the 2030 Sustainable Development Agenda.

Useful links

The OECD and the Sustainable Development Goals

The OECD organised a Workshop on Complexity and Policy, 29-30 September 2016, at OECD HQ, Paris, along with the European Commission and INET. Watch the webcast: 29/09 morning, 29/09 afternoon, 30/09 morning

Should we rely on economic forecasts? The wisdom of the crowds and the consensus forecast

Brian Dowd, FocusEconomics

Laurence J. Peter, a Canadian educator and author, is often quoted as saying, “an economist is an expert who will know tomorrow why the things he predicted yesterday didn’t happen today.”

Economics and especially economic forecasting are often given a bad rap. Many people think of forecasting as akin to licking a finger and testing the wind. However, there is a science to it.

Forecasting is essentially attempting to predict the future, and predicting the future behavior of anything – much less something as complex and enormous as an entire economy – is no easy task, to say the least. Accurate forecasts, therefore, are often in short supply.

There are a few reasons for this. The first is that economies are in perpetual motion, and extrapolating behaviors and relationships from past economic cycles into the next one is, as one might imagine, tremendously complicated.

The second reason, and perhaps the most surprising, has to do with the vast amount of raw economic data available. In an ideal world, economic forecasts would consider all of the information available. In the real world, however, that is nearly impossible, as information is scattered in myriad news articles, press releases, government communications, along with the aforementioned mountain of raw data.

Although some might consider having all of that information an advantage, nothing could be further from the truth. The thousands of economic indicators and data available tend to produce a vast amount of statistical noise, making the establishment of meaningful relations of causation between variables a serious challenge.

And, of course, we cannot forget the uncertainty that is inherent with forecasting, something that forecasters must take into account and which creates even more noise to deal with.

The question then becomes, is there a way to cancel out all of that noise to get a more accurate forecast? This is where the wisdom of the crowds comes in.

Is there wisdom in the crowds?

To illustrate how the wisdom of the crowds works, it’s best to tell the story of Sir Francis Galton, a Victorian polymath, who was the first to note the wisdom of the crowds at a livestock fair he visited in 1906. In one event, fairgoers were given the opportunity to guess the weight of an ox. The person with the closest guess to the actual weight would win a prize.

Galton hypothesized that not one person would get the answer right, but that everyone would get it right. Bear with me.

Over 750 participants made their guesses and, unsurprisingly, no one guessed the weight perfectly. However, when Galton calculated the mean of all of the guesses, it turned out, incredibly, to be within a single pound of the ox’s actual weight of 1,198 pounds.

Tapping economic analysts’ wisdom with consensus forecasts

The basic idea of the wisdom of the crowds is that the average of the answers of a group of individuals is often more accurate than the answer of any one individual expert. This was evident in the story of Galton’s experiment at the fair.

How accurate the wisdom of the crowds is depends on the number of participants and the diversity of their expertise. The more participants involved and the more diverse they are, the lower the margin of error, as sketched below.
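The statistical intuition can be written down in a stylised way. Assuming each forecast equals the true value plus an independent, zero-mean error with the same variance (a strong idealisation, since real forecast errors are often correlated), the error of the average shrinks as the number of forecasters grows:

```latex
% Stylised assumption: forecaster i reports f_i = y + \varepsilon_i, where the
% errors \varepsilon_i are independent with mean zero and variance \sigma^2.
\bar{f} = \frac{1}{n}\sum_{i=1}^{n} f_i = y + \frac{1}{n}\sum_{i=1}^{n}\varepsilon_i,
\qquad
\operatorname{Var}\!\left(\bar{f} - y\right) = \frac{\sigma^{2}}{n},
\qquad
\text{so the standard error of the consensus falls as } \frac{\sigma}{\sqrt{n}}.
```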

So what does the wisdom of the crowds have to do with economic forecasting? Remember all of that noise that makes economic forecasting so difficult and as a result affects the accuracy of forecasts? The theory is that idiosyncratic noise is associated with any one individual answer and by taking the average of multiple answers, the noise tends to cancel itself out, presenting a far more accurate picture of the situation.

Sometimes also referred to as simply combining forecasts, the consensus forecast borrows from the same idea of Galton’s wisdom of the crowds – a consensus forecast is essentially the average of forecasts from various sources. Averaging multiple forecasts cancels out the statistical noise to yield a more accurate forecast.
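A toy numerical illustration of this averaging is given below. All figures are made up: the “true” outcome, the number of analysts and the noise level are assumptions for the example, not real forecasts.

```python
import random
import statistics

# Toy illustration of a consensus forecast with entirely invented numbers.
random.seed(1)
true_growth = 5.0                                    # the outcome we pretend to know
forecasts = [round(true_growth + random.gauss(0, 0.6), 1) for _ in range(25)]

consensus = statistics.mean(forecasts)               # the consensus is just the average
errors = [abs(f - true_growth) for f in forecasts]

print(f"consensus forecast:        {consensus:.2f}%")
print(f"consensus error:           {abs(consensus - true_growth):.2f} points")
print(f"average individual error:  {statistics.mean(errors):.2f} points")
print(f"best individual error:     {min(errors):.2f} points")
# The consensus error is typically well below the average individual error,
# even though a few individual forecasts beat it in any single run.
```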

But don’t take my word for it. Over the last few decades there has been a great deal of empirical research that has shown consensus forecasts to increase forecast accuracy, including those cited below.

With that said, it is possible for an individual forecast to beat the consensus; however, it is unlikely that the same forecaster will do so consistently, one forecast period after another. Moreover, the individual forecasts that do happen to beat the consensus in one period are impossible to pick out ahead of time, since they vary significantly from period to period.

Taking a look at a practical example may serve to clear things up a bit further.

A practical example of a consensus forecast

In the graph above, the Consensus Forecast for Malaysia’s GDP growth in 2015, taken in January 2015, was 5.1%. All the other points along the same axis, marked in grey, represent the individual forecasts from 25 prominent sources taken at the same time.

In March 2016, the actual reading came in at 5.0%. A few individual forecasts were closer to the end result; however, as mentioned previously, some individual forecasts will beat the consensus from time to time, but not consistently, and it is impossible to know which forecasts those will be until after the fact.

The second graph uses the same example as before: 25 different economic analysts forecast Malaysia’s 2015 GDP growth in January 2015. By March 2016, the maximum forecast turned out to be 16% above the actual reading and the minimum 10% below it, while the consensus was only 1.9% above the actual reading. By taking the average of all forecasts, the upside and downside errors of the different forecasts mostly cancelled each other out. As a result, the consensus forecast was much closer to the actual reading than the majority of the individual forecasts.

Consistency and reducing the margin of error are key

The point to keep in mind is that whether they are consensus forecasts or individual forecasts or any other kind of forecast, predicting the future is seldom going to be perfect. In the Malaysia GDP example, the Consensus wasn’t spot on, but it did certainly reduce the margin of error. It is important to note that there is almost always going to be some error, but reducing that error is the key, and more often than not, it will result in a more accurate forecast.

The consensus not only reduces the margin of error, but it also provides some consistency and reliability. As was mentioned previously, an individual forecaster can beat the consensus, however, it is impossible to know which of hundreds of forecasts will be the most accurate ahead of time. As is evident in our previous example, the forecasts from individual analysts can vary significantly from one to another, whereas the consensus will consistently provide accurate forecasts.

Forecasting isn’t perfect, but does it need to be?

Forecasting is a science, but it isn’t an exact science. Forecasts may not be perfect, but they are still very important to businesses and governments, as they shed light on an uncertain future and help them make vital decisions on strategy, plans and budgets.

So, should you trust forecasts? That is a tough question to answer. Yes, forecasting is complicated and, yes, forecasts are notoriously inaccurate, and there are few ways to consistently improve forecast accuracy. The point is, however, that forecasts don’t necessarily need to be perfect to be useful. They just need to be as accurate as possible. One way to achieve that is to leverage the wisdom of a crowd of analysts to produce a consensus forecast.

As French mathematician, physicist and philosopher Henri Poincaré put it, “It is far better to foresee even without certainty than not to foresee at all.”

The consensus forecast is a more accurate way to “foresee.”

Useful links

OECD forecasting methods and analytical tools

OECD Economic outlook, analysis and forecasts

Academic research on consensus forecasts

“Consider what we have learned about the combination of forecasts over the past twenty years. (…) The results have been virtually unanimous: combining multiple forecasts leads to increased forecast accuracy. This has been the result whether the forecasts are judgmental or statistical, econometric or extrapolation. Furthermore, in many cases one can make dramatic performance improvements by simply averaging the forecasts.”- Clemen Robert T. (1989) “Combining forecasts: A review and annotated bibliography” International Journal of Forecasting 5: 559-560

“A key reason for using forecast combinations […] is that individual forecasts may be differently affected by non-stationarities such as structural breaks caused by institutional change, technological developments or large macroeconomic shocks. […] Since it is typically difficult to detect structural breaks in ‘real time’, it is plausible that on average, across periods with varying degrees of stability, combinations of forecasts from models with different degrees of adaptability may outperform forecasts from individual models.” Aiolfi M. and Timmermann A. (2004) “Structural Breaks and the Performance of Forecast Combinations”

From economic crisis to crisis in economics

Andy Haldane, Chief Economist and Executive Director, Monetary Analysis & Statistics, Bank of England

It would be easy to become very depressed at the state of economics in the current environment. Many experts, including economics experts, are simply being ignored. But the economic challenges facing us could not be greater: slowing growth, slowing productivity, the retreat of trade, the retreat of globalisation, high and rising levels of inequality. These are deep and diverse problems facing our societies and we will need deep and diverse frameworks to help understand them and to set policy in response to them. In the pre-crisis environment when things were relatively stable and stationary, our existing frameworks in macroeconomics did a pretty good job of making sense of things.

But the world these days is characterised by features such as discontinuities, tipping points, multiple equilibria, and radical uncertainty. So if we are to make economics interesting and the response to the challenges adequate, we need new frameworks that can capture the complexities of modern societies.

We are seeing increased interest in using complexity theory to make sense of the dynamics of economic and financial systems. For example, epidemiological models have been used to understand and calibrate regulatory capital standards for the largest, most interconnected banks, the so-called “super-spreaders”. Less attention has been placed on using complexity theory to understand the overall architecture of public policy – how the various pieces of the policy jigsaw fit together as a whole in relation to modern economic and financial systems. These systems can be characterised as a complex, adaptive “system of systems”, a nested set of sub-systems, each one itself a complex web. The architecture of a complex system of systems means that policies with varying degrees of magnification are necessary to understand and to moderate fluctuations. It also means that taking account of interactions between these layers is important when gauging risk.
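To give a flavour of the epidemiological analogy, the sketch below runs a simple default-contagion cascade on a randomly generated interbank exposure network. The network structure, the loss rule and every parameter value are assumptions made purely for illustration; this is not a description of the models actually used by regulators.

```python
import random

# Illustrative contagion cascade on a random interbank exposure network,
# in the spirit of the "super-spreader" analogy. All parameters are invented.
random.seed(0)
N_BANKS = 60
CONNECT_PROB = 0.08       # chance that bank j holds an exposure to bank i
EXPOSURE = 1.0            # size of each bilateral exposure

# creditors[i] = banks exposed to bank i, which take a loss if i fails
creditors = {i: [j for j in range(N_BANKS) if j != i and random.random() < CONNECT_PROB]
             for i in range(N_BANKS)}

def cascade_size(initial_failure, capital):
    """Number of banks that end up failing after one initial default."""
    losses = [0.0] * N_BANKS
    failed = {initial_failure}
    frontier = [initial_failure]
    while frontier:
        newly_failed = []
        for bank in frontier:
            for c in creditors[bank]:
                if c in failed:
                    continue
                losses[c] += EXPOSURE              # creditor writes off the exposure
                if losses[c] >= capital:           # capital buffer exhausted -> default
                    failed.add(c)
                    newly_failed.append(c)
        frontier = newly_failed
    return len(failed)

for capital in (0.8, 1.5, 2.5):                    # thin vs thick capital buffers
    sizes = [cascade_size(b, capital) for b in range(N_BANKS)]
    print(f"capital {capital}: mean cascade {sum(sizes)/N_BANKS:.1f}, worst {max(sizes)}")
# With thin buffers a single default can propagate like an epidemic through the
# most connected nodes; thicker buffers damp the cascade, which is the intuition
# behind capital surcharges on the most interconnected banks.
```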

Although there is no generally-accepted definition of complexity, that proposed by Herbert Simon in The Architecture of Complexity – “one made up of a large number of parts that interact in a non-simple way” – captures well its everyday essence. The whole behaves very differently than the sum of its parts. The properties of complex systems typically give rise to irregular, and often highly non-normal, statistical distributions for these systems over time. This manifests itself as much fatter tails than a normal distribution would suggest. In other words, system-wide interactions and feedbacks generate a much higher probability of catastrophic events than Gaussian distributions would imply.
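A quick numerical illustration of the fat-tails point compares the chance of a six-standard-deviation move under a normal distribution with that under a Student-t distribution with 3 degrees of freedom, a textbook example of a fat-tailed distribution. The threshold and the distributions are illustrative choices, and the t is not rescaled, so the comparison is qualitative rather than calibrated.

```python
from scipy.stats import norm, t

# Tail probabilities of a "six-sigma" move: thin (Gaussian) vs fat (Student-t) tails.
threshold = 6.0
p_normal = norm.sf(threshold)     # upper-tail probability under the normal distribution
p_fat = t.sf(threshold, 3)        # upper-tail probability under a Student-t with 3 d.o.f.

print(f"P(X > 6) under the normal distribution: {p_normal:.1e}")   # about 1e-9
print(f"P(X > 6) under a Student-t(3):          {p_fat:.1e}")      # about 5e-3
print(f"ratio: roughly {p_fat / p_normal:,.0f} times more likely")
# System-wide feedbacks that fatten the tails turn "practically impossible"
# events into merely rare ones, which is why Gaussian models understate risk.
```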

For evolutionary reasons of survival of the fittest, Simon posited that “decomposable” networks were more resilient and hence more likely to proliferate. By decomposable networks, he meant organisational structures which could be partitioned such that the resilience of the system as a whole was not reliant on any one sub-element. This may be a reasonable long-run description of some real-world complex systems, but less suitable as a description of the evolution of socio-economic systems. The efficiency of many of today’s networks relies on their hyper-connectivity. There are, in the language of economics, significantly increasing returns to scale and scope in a network industry. Think of the benefits of global supply chains and global interbank networks for trade and financial risk-sharing. This provides a powerful secular incentive for non-decomposable socio-economic systems.

Moreover, if these hyper-connected networks do face systemic threat, they are often able to adapt in ways which avoid extinction. For example, the risk of social, economic or financial disorder will typically lead to an adaptation of policies to prevent systemic collapse. These adaptive policy responses may preserve otherwise-fragile socio-economic topologies. They may even further encourage the growth of connectivity and complexity of these networks. Policies to support “super-spreader” banks in a crisis for instance may encourage them to become larger and more complex. The combination of network economies and policy responses to failure means socio-economic systems may be less Darwinian, and hence decomposable, than natural and biological systems.

Andy Haldane addresses OECD New Approaches to Economic Challenges (NAEC) Roundtable

What public policy implications follow from this complex system of systems perspective? First, it underscores the importance of accurate data and timely mapping of each layer in the system. This is especially important when these layers are themselves complex. Granular data is needed to capture the interactions within and between these complex sub-systems.

Second, modelling of each of these layers, and their interaction with other layers, is likely to be important, both for understanding system risks and dynamics and for calibrating potential policy responses to them.

Third, in controlling these risks, something akin to the Tinbergen Rule is likely to apply: there is likely to be a need for at least as many policy instruments as there are complex sub-components of a system of systems if risk is to be monitored and managed effectively. Put differently, an under-identified complex system of systems is likely to result in a loss of control, both system-wide and for each of the layers.
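In its stylised textbook form, the Tinbergen Rule can be stated for a linear policy model; the linearity is an assumption made only to keep the statement simple.

```latex
% With n targets y^{*} and m instruments x related by y = Bx, hitting the
% targets requires solving B x^{*} = y^{*}, which for arbitrary targets is
% possible only with at least as many independent instruments as targets.
y = B x, \quad y^{*} \in \mathbb{R}^{n}, \; x \in \mathbb{R}^{m}
\qquad\Rightarrow\qquad
\exists\, x^{*}\ \text{with}\ B x^{*} = y^{*}\ \text{for any}\ y^{*}
\iff \operatorname{rank} B = n \ (\text{so } m \ge n).
```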

In the meantime, there is a crisis in economics. For some, it is a threat. For others it is an opportunity to make a great leap forward, as Keynes did in the 1930s. But seizing this opportunity requires first a re-examination of the contours of economics and an exploration of some new pathways. Second, it is important to look at economic systems through a cross-disciplinary lens. Drawing on insights from a range of disciplines, natural as well as social sciences, can provide a different perspective on individual behaviour and system-wide dynamics.

The NAEC initiative does so, and the OECD’s willingness to consider a complexity approach puts the Organisation at the forefront of bringing economic analysis and policy-making into the 21st century.

Useful links

This article draws on contributions to the OECD NAEC Roundtable on 14 December 2016; The GLS Shackle Biennial Memorial Lecture on 10 November 2016; and “On microscopes and telescopes”, at the Lorentz centre, Leiden, workshop on socio-economic complexity on 27 March 2015.

The OECD organised a Workshop on Complexity and Policy, 29-30 September 2016, at OECD HQ, Paris, along with the European Commission and INET. Watch the webcast: 29/09 morning, 29/09 afternoon, 30/09 morning