Should we rely on economic forecasts? The wisdom of the crowds and the consensus forecast

Brian Dowd, FocusEconomics

Laurence J. Peter, a Canadian educator and author, is often referenced as saying, “an economist is an expert who will know tomorrow why the things he predicted yesterday didn’t happen today.”

Economics and especially economic forecasting are often given a bad rap. Many people think of forecasting as akin to licking a finger and testing the wind. However, there is a science to it.

Forecasting is essentially an attempt to predict the future, and predicting the future behavior of anything, let alone something as complex and enormous as an entire economy, is no easy task, to say the least. Accurate forecasts, therefore, are often in short supply.

There are a few reasons for this. The first is that economies are in perpetual motion, so extrapolating behaviors and relationships from past economic cycles into the next one is, as one might imagine, tremendously complicated.

The second reason, and perhaps the most surprising, has to do with the vast amount of raw economic data available. In an ideal world, economic forecasts would consider all of the information available. In the real world, however, that is nearly impossible, as information is scattered in myriad news articles, press releases, government communications, along with the aforementioned mountain of raw data.

Although some might consider having all of that information an advantage, nothing could be further from the truth. The thousands of economic indicators and data series available tend to produce a vast amount of statistical noise, making it a serious challenge to establish meaningful causal relationships between variables.

And, of course, we cannot forget the uncertainty inherent in forecasting, something forecasters must take into account and which creates even more noise to deal with.

The question then becomes, is there a way to cancel out all of that noise to get a more accurate forecast? This is where the wisdom of the crowds comes in.

Is there wisdom in the crowds?

To illustrate how the wisdom of the crowds works, it’s best to tell the story of Sir Francis Galton, a Victorian polymath, who was the first to note the wisdom of the crowds at a livestock fair he visited in 1906. In one event, fairgoers were given the opportunity to guess the weight of an ox. The person with the closest guess to the actual weight would win a prize.

Galton hypothesized that no single person would get the answer exactly right, but that everyone, taken together, would. Bear with me.

Over 750 participants made their guesses and, unsurprisingly, no one guessed the weight perfectly. However, when Galton calculated the mean of all the guesses, it incredibly came within a single pound of the ox’s actual weight of 1,198 pounds.

Tapping economic analysts’ wisdom with consensus forecasts

The basic idea of the wisdom of the crowds is that the average of the answers of a group of individuals is often more accurate than the answer of any one individual expert. This was evident in the story of Galton’s experiment at the fair.

How accurate the wisdom of the crowds is depends on the number of participants and on the diversity of their expertise: the more participants involved, and the more diverse they are, the lower the margin of error.

So what does the wisdom of the crowds have to do with economic forecasting? Remember all of that noise that makes economic forecasting so difficult and, as a result, hurts the accuracy of forecasts? The theory is that idiosyncratic noise attaches to any one individual answer, and that by taking the average of multiple answers the noise tends to cancel itself out, presenting a far more accurate picture of the situation.

Sometimes also referred to as simply combining forecasts, the consensus forecast borrows from the same idea of Galton’s wisdom of the crowds – a consensus forecast is essentially the average of forecasts from various sources. Averaging multiple forecasts cancels out the statistical noise to yield a more accurate forecast.
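To make the noise-cancellation idea concrete, here is a minimal simulation sketch (with purely illustrative noise levels, not figures from any real forecast panel): each forecaster’s error is split into a common component that averaging cannot remove and an idiosyncratic component that it largely can.

```python
import numpy as np

rng = np.random.default_rng(42)

true_value = 5.0      # the outcome being forecast (e.g. GDP growth, in %)
n_forecasters = 25    # hypothetical panel size
n_trials = 10_000     # Monte Carlo repetitions

# Assumed error structure: a common component (shared blind spots, data
# revisions) plus an idiosyncratic component (model quirks, judgment).
common = rng.normal(0.0, 0.2, size=(n_trials, 1))
idiosyncratic = rng.normal(0.0, 0.8, size=(n_trials, n_forecasters))
forecasts = true_value + common + idiosyncratic

individual_error = np.abs(forecasts - true_value).mean()
consensus_error = np.abs(forecasts.mean(axis=1) - true_value).mean()

print(f"average individual forecast error: {individual_error:.2f}")
print(f"average consensus forecast error:  {consensus_error:.2f}")
```

In this stylized setting the consensus error shrinks towards the common component alone, which is why averaging helps considerably without ever being perfect.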

But don’t take my word for it. Over the last few decades there has been a great deal of empirical research that has shown consensus forecasts to increase forecast accuracy, including those cited below.

With that said, it is possible for an individual forecast to beat the consensus; however, it is unlikely that the same forecaster will do so consistently, one forecast period after another. Moreover, the individual forecasts that do happen to beat the consensus in one period are impossible to pick out ahead of time, since they vary significantly from period to period.

Taking a look at a practical example may serve to clear things up a bit further.

A practical example of a consensus forecast

In the graph above, the Consensus Forecast for Malaysia’s 2015 GDP growth, taken in January 2015, was 5.1%. All the other points along the same axis, marked in grey, represent the individual forecasts from 25 prominent sources taken at the same time.

In March 2016, the actual reading came out at 5.0%. A few forecasts were closer to the end result; however, as mentioned previously, some individual forecasts are going to beat the consensus from time to time, but it won’t happen consistently, and it would be impossible to know which forecasts those will be until after the fact.

The second graph uses the same example as before: 25 different economic analysts forecast Malaysia’s 2015 GDP growth in January of 2015. By March 2016, the maximum forecast turned out to be 16% above the actual reading and the minimum 10% below it, while the consensus was only 1.9% above the actual reading. By taking the average of all forecasts, the upside and downside errors of the different forecasts mostly cancelled each other out. As a result, the consensus forecast was much closer to the actual reading than the majority of the individual forecasts.

Consistency and reducing the margin of error are key

The point to keep in mind is that whether they are consensus forecasts or individual forecasts or any other kind of forecast, predicting the future is seldom going to be perfect. In the Malaysia GDP example, the Consensus wasn’t spot on, but it did certainly reduce the margin of error. It is important to note that there is almost always going to be some error, but reducing that error is the key, and more often than not, it will result in a more accurate forecast.

The consensus not only reduces the margin of error, but it also provides some consistency and reliability. As was mentioned previously, an individual forecaster can beat the consensus, however, it is impossible to know which of hundreds of forecasts will be the most accurate ahead of time. As is evident in our previous example, the forecasts from individual analysts can vary significantly from one to another, whereas the consensus will consistently provide accurate forecasts.

Forecasting isn’t perfect, but does it need to be?

Forecasting is a science, but it isn’t an exact science. Forecasts may not be perfect, but they are still very important to businesses and governments, shedding light on an uncertain future and helping them make vital decisions on strategy, plans and budgets.

So, should you trust forecasts? That is a tough question to answer. Yes, forecasting is complicated and, yes, forecasts are notoriously inaccurate, and there are few ways to consistently improve forecast accuracy. The point, however, is that forecasts don’t need to be perfect to be useful. They just need to be as accurate as possible, and one way to achieve that is to leverage the wisdom of a crowd of analysts to produce a consensus forecast.

As French mathematician, physicist and philosopher Henri Poincaré put it, “It is far better to foresee even without certainty than not to foresee at all.”

The consensus forecast is a more accurate way to “foresee.”

Useful links

OECD forecasting methods and analytical tools

OECD Economic outlook, analysis and forecasts

Academic research on consensus forecasts

“Consider what we have learned about the combination of forecasts over the past twenty years. (…) The results have been virtually unanimous: combining multiple forecasts leads to increased forecast accuracy. This has been the result whether the forecasts are judgmental or statistical, econometric or extrapolation. Furthermore, in many cases one can make dramatic performance improvements by simply averaging the forecasts.” - Clemen, Robert T. (1989), “Combining forecasts: A review and annotated bibliography”, International Journal of Forecasting, 5: 559-560

“A key reason for using forecast combinations […] is that individual forecasts may be differently affected by non-stationarities such as structural breaks caused by institutional change, technological developments or large macroeconomic shocks. […] Since it is typically difficult to detect structural breaks in ‘real terms’, it is plausible that on average, across periods with varying degrees of stability, combinations of forecasts from models with different degrees of adaptability may outperform forecasts from individual models.” - Aiolfi, M. and Timmermann, A. (2004), “Structural Breaks and the Performance of Forecast Combinations”

Responsible Algorithms in Business: Robots, fake news, spyware, self-driving cars and corporate responsibility

Roel Nieuwenkamp, Chair of the OECD Working Party on Responsible Business Conduct (@nieuwenkamp_csr)

Why is the topic of robots frequently being raised at recent conferences on responsible business conduct? For example, in October last year the Polish Deputy Prime Minister noted the connection between robotisation and corporate responsibility during the opening of the conference in Warsaw celebrating the 40th anniversary of the OECD Guidelines for Responsible Business.

The potential negative impacts of robots or automated systems have proved cause for concern. In May 2010 there was a trillion-dollar stock market crash, a ‘flash crash’, attributed to algorithmic trading, or in other words robot investors. And let’s not forget the mathematical models that contributed to the financial crisis of 2007 and 2008. Recent events surrounding fake news, with Pizzagate as the most extreme example, are also contributing to these concerns.

What is the common denominator of these automated systems? Algorithms! These rule-based processes for solving mathematical problems are being applied to more and more areas of our daily lives. We are likely only at the beginning of the era of algorithms, and their widespread application is raising many ethical questions for society, and for businesses in particular.

For example, “killer robots”, weapons systems that select and attack targets without meaningful human control, raise questions about the dehumanisation of killing and about who is responsible. In December the United Nations decided to set up an expert group to look into this issue, following the ‘Stop Killer Robots’ campaign by Human Rights Watch and other NGOs. While self-driving cars will never be at risk of driving while intoxicated, they can make decisions that might pose moral dilemmas for humans. Online face recognition technology raises concerns around privacy. These are just a few examples.

The pervasiveness of the use of algorithms may result in many unintended consequences. In her book ‘Weapons of Math Destruction’, Cathy O’Neil describes how algorithms in combination with big data increase inequality and threaten democracy. She provides examples from the financial crisis and the housing market, but also of a college student who does not get a minimum-wage job in a grocery store because of answers provided on a personality test, of people whose credit card spending limits are lowered because they shopped at certain stores, and so on. She also discusses predictive policing models, such as those that predict recidivism, and algorithms that send police to patrol areas on the basis of crime data, which can have a racist effect because of harmful or self-fulfilling-prophecy feedback loops.

Scholars and practitioners in this field are beginning to consider the ethical implications of the application of algorithms. Julia Bossmann of the Foresight Institute described her top nine ethical issues in artificial intelligence. Prof. Susan Leigh Anderson of the University of Connecticut stated: “If Hollywood has taught us anything, it’s that robots need ethics.” Cathy O’Neil proposes a ‘Hippocratic oath’ for data scientists. Recently a group of scholars developed Principles for Accountable Algorithms. In the private sector, Elon Musk, CEO of SpaceX, and other business leaders have founded OpenAI, an R&D company created to address ethical issues related to artificial intelligence. Amazon, Facebook, DeepMind, IBM and Microsoft founded a new organisation called the Partnership on Artificial Intelligence to Benefit People & Society, which seeks to facilitate dialogue on the nature and purpose of artificial intelligence and its impacts on people and society at large. It is encouraging that such industry efforts are being undertaken in this area. Additionally, one thing should be clear for businesses that create and use these technologies: when things go wrong, using algorithms as a scapegoat won’t do the trick.

What guidance on these issues can be found in the most important instrument on business ethics, the OECD Guidelines for Multinational Enterprises (MNE Guidelines), a multilateral agreement of 46 states on corporate responsibility? Cases brought to National Contact Points, the globally active complaints mechanism of the Guidelines, provide a good illustration of what the Guidelines recommend with respect to these issues. For example, in February 2013 a consortium of NGOs led by Privacy International (PI) submitted a complaint to the UK National Contact Point (NCP) alleging that Gamma International had supplied a spyware product, FinFisher, to agencies of the Bahrain government, which then used it to target pro-democracy activists.

The NCP concluded that Gamma had not acted consistently with the provisions of the OECD Guidelines requiring enterprises to do appropriate due diligence, to undertake a policy commitment to respect human rights and to remediate human rights impacts. Furthermore, the company’s approach did not meet the OECD Guidelines’ standards on respecting human rights, and the company’s engagement with the NCP process was unsatisfactory, particularly in view of the serious nature of the issues. The NCP recommended that the company engage in human rights due diligence.

What is human rights due diligence and what does it mean for companies developing algorithms? Under the Guidelines, due diligence is a process that should be carried out by corporations as part of a broader range of actions to respect human rights. The rights to privacy, freedom of speech, and freedom from torture and arbitrary detention are examples of the many human rights that could potentially be impacted. Due diligence is the process of identifying, preventing and mitigating actual and potential adverse human rights impacts, and accounting for how these impacts are addressed. If there is a risk of severe human rights impacts, a heightened form of due diligence is recommended. For example, significant caution should be taken with regard to the sale and distribution of surveillance technology when the buyer is a government with a poor human rights track record. Due diligence should be applied not only to a company’s own activities but across its business relationships. For a company producing algorithms, therefore, it is not sufficient to behave responsibly in its own operations; due diligence should also be applied to ensure that buyers of the technology are not using it irresponsibly. Where that is the case, the company that created and sold the technology is expected to use its leverage in the value chain to prevent or mitigate the impact.

A number of valuable tools to respect human rights and implement the ’know your client’ principle have been developed in the context of ICT business operations. For example, the European Commission has developed a useful guide for companies on respecting human rights in the ICT sector. TechUK, an industry association of ICT companies in the UK, in partnership with the UK government has published a guide on how to design and implement appropriate due diligence processes for assessing cyber security export risks. Additionally the Electronic Frontier Foundation has developed a guide on How Corporations Can Avoid Assisting Repressive Regimes and the Global Network Initiative has developed Principles on Freedom of Expression and Privacy.

Beyond the human rights related recommendations, the OECD Guidelines make other relevant recommendations for companies developing algorithms. For example, the Environment Chapter recommends environmental, health and safety impact assessments.[1] The Consumer Chapter advises companies to provide accurate, verifiable and clear information to consumers.[2] In addition, companies should respect consumer privacy and take reasonable measures to ensure the security of the personal data that they collect, store, process or disseminate.[3]

Businesses that create algorithms should do their due diligence on potential human rights impacts. Companies should also carry out due diligence on labour, environmental, and health and safety impacts. They should provide accurate, verifiable and clear information about their algorithms and take measures to protect personal data. Collaborative industry efforts on responsible algorithms are much needed to shape these expectations in concrete terms. Responsible algorithms will not only generate profit, but protect the rights of individuals worldwide while doing so.

Useful links

There’s an algorithm for that. Or there soon will be Marina Bradbury on Insights

[1] OECD Guidelines for Multinational Enterprises, Chapter VI.3

[2] OECD Guidelines for Multinational Enterprises, Chapter VIII.2

[3] OECD Guidelines for Multinational Enterprises, Chapter VIII.6

From economic crisis to crisis in economics

Andy Haldane, Chief Economist and Executive Director, Monetary Analysis & Statistics, Bank of England

It would be easy to become very depressed at the state of economics in the current environment. Many experts, including economics experts, are simply being ignored. But the economic challenges facing us could not be greater: slowing growth, slowing productivity, the retreat of trade, the retreat of globalisation, high and rising levels of inequality. These are deep and diverse problems facing our societies and we will need deep and diverse frameworks to help understand them and to set policy in response to them. In the pre-crisis environment when things were relatively stable and stationary, our existing frameworks in macroeconomics did a pretty good job of making sense of things.

But the world these days is characterised by features such as discontinuities, tipping points, multiple equilibria, and radical uncertainty. So if we are to make economics interesting and the response to the challenges adequate, we need new frameworks that can capture the complexities of modern societies.

We are seeing increased interest in using complexity theory to make sense of the dynamics of economic and financial systems. For example, epidemiological models have been used to understand and calibrate regulatory capital standards for the largest, most interconnected banks, the so-called “super-spreaders”. Less attention has been placed on using complexity theory to understand the overall architecture of public policy – how the various pieces of the policy jigsaw fit together as a whole in relation to modern economic and financial systems. These systems can be characterised as a complex, adaptive “system of systems”, a nested set of sub-systems, each one itself a complex web. The architecture of a complex system of systems means that policies with varying degrees of magnification are necessary to understand and to moderate fluctuations. It also means that taking account of interactions between these layers is important when gauging risk.

Although there is no generally-accepted definition of complexity, that proposed by Herbert Simon in The Architecture of Complexity – “one made up of a large number of parts that interact in a non-simple way” – captures well its everyday essence. The whole behaves very differently than the sum of its parts. The properties of complex systems typically give rise to irregular, and often highly non-normal, statistical distributions for these systems over time. This manifests itself as much fatter tails than a normal distribution would suggest. In other words, system-wide interactions and feedbacks generate a much higher probability of catastrophic events than Gaussian distributions would imply.
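As a rough numerical illustration of how much fatter those tails can be (a sketch assuming a Student-t distribution with three degrees of freedom as the fat-tailed case, rescaled to the same variance as the Gaussian), the probability of a four-standard-deviation event comes out roughly a hundred times higher than the normal benchmark suggests:

```python
import numpy as np
from scipy import stats

threshold = 4.0                 # event size, in standard deviations
df = 3                          # illustrative degrees of freedom for the t
std_t = np.sqrt(df / (df - 2))  # standard deviation of a raw t(df) variable

p_gauss = stats.norm.sf(threshold)          # P(X > 4) for X ~ N(0, 1)
p_fat = stats.t.sf(threshold * std_t, df)   # same-sized event, fat tails

print(f"Gaussian tail probability:   {p_gauss:.1e}")
print(f"Fat-tailed tail probability: {p_fat:.1e}")
print(f"ratio: roughly {p_fat / p_gauss:.0f} times more likely")
```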

For evolutionary reasons of survival of the fittest, Simon posited that “decomposable” networks were more resilient and hence more likely to proliferate. By decomposable networks, he meant organisational structures which could be partitioned such that the resilience of the system as a whole was not reliant on any one sub-element. This may be a reasonable long-run description of some real-world complex systems, but less suitable as a description of the evolution of socio-economic systems. The efficiency of many of today’s networks relies on their hyper-connectivity. There are, in the language of economics, significantly increasing returns to scale and scope in a network industry. Think of the benefits of global supply chains and global interbank networks for trade and financial risk-sharing. This provides a powerful secular incentive for non-decomposable socio-economic systems.

Moreover, if these hyper-connected networks do face systemic threat, they are often able to adapt in ways which avoid extinction. For example, the risk of social, economic or financial disorder will typically lead to an adaptation of policies to prevent systemic collapse. These adaptive policy responses may preserve otherwise-fragile socio-economic topologies. They may even further encourage the growth of connectivity and complexity of these networks. Policies to support “super-spreader” banks in a crisis for instance may encourage them to become larger and more complex. The combination of network economies and policy responses to failure means socio-economic systems may be less Darwinian, and hence decomposable, than natural and biological systems.

Andy Haldane addresses OECD New Approaches to Economic Challenges (NAEC) Roundtable

What public policy implications follow from this complex system of systems perspective? First, it underscores the importance of accurate data and timely mapping of each layer in the system. This is especially important when these layers are themselves complex. Granular data is needed to capture the interactions within and between these complex sub-systems.

Second, modelling of each of these layers, and their interaction with other layers, is likely to be important, both for understanding system risks and dynamics and for calibrating potential policy responses to them.

Third, in controlling these risks, something akin to the Tinbergen Rule is likely to apply: there is likely to be a need for at least as many policy instruments as there are complex sub-components of a system of systems if risk is to be monitored and managed effectively. Put differently, an under-identified complex system of systems is likely to result in a loss of control, both system-wide and for each of the layers.
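A stylised way to see the counting logic behind the Tinbergen Rule (a linear sketch, not a model drawn from the text above): suppose n policy targets depend linearly on m instruments,

```latex
% Linear sketch of the Tinbergen counting argument.
\[
  y = A x, \qquad y \in \mathbb{R}^{n} \ \text{(targets)}, \qquad
  x \in \mathbb{R}^{m} \ \text{(instruments)}.
\]
```

An arbitrary target vector can then be reached only if A has full row rank, rank(A) = n, which requires m ≥ n: at least as many instruments as targets. With fewer independent instruments the system is under-identified, and some combination of targets cannot be steered independently.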

In the meantime, there is a crisis in economics. For some, it is a threat. For others it is an opportunity to make a great leap forward, as Keynes did in the 1930s. But seizing this opportunity requires first a re-examination of the contours of economics and an exploration of some new pathways. Second, it is important to look at economic systems through a cross-disciplinary lens. Drawing on insights from a range of disciplines, natural as well as social sciences, can provide a different perspective on individual behaviour and system-wide dynamics.

The NAEC initiative does so, and the OECD’s willingness to consider a complexity approach puts the Organisation at the forefront of bringing economic analysis and policy-making into the 21st century.

Useful links

This article draws on contributions to the OECD NAEC Roundtable on 14 December 2016; The GLS Shackle Biennial Memorial Lecture on 10 November 2016; and “On microscopes and telescopes”, at the Lorentz centre, Leiden, workshop on socio-economic complexity on 27 March 2015.

The OECD organised a Workshop on Complexity and Policy, 29-30 September, OECD HQ, Paris, along with the European Commission and INET. Watch the webcast: 29/09 morning, 29/09 afternoon, 30/09 morning

What is key for the OECD in 2017? An open economy perspective

Noe van Hulst, Ambassador of the Netherlands to the OECD

As we start a year that Ian Bremmer (President of the Eurasia Group) has described as the beginning of a ‘geopolitical recession’, it is worth asking what the OECD focus could be in 2017. I see two key issues worth highlighting in this context. First: Escaping the Low-Growth Trap. Escape games are popular nowadays, but this one is of eminent importance to all of us. In the latest Economic Outlook (November 2016) the OECD has aptly demonstrated how global growth has been stuck at around 3% per year for the last five years. How can we get out of this low-growth trap? Now that extraordinarily accommodative monetary policy has reached its limits, the OECD recommends a more balanced policy set, with a much stronger role for collective fiscal action and for more inclusive structural and trade policies.

Although the Economic Outlook makes a passionate case for a more expansionary fiscal stance in many countries, the reality is that this is unlikely to happen. Partly because some countries are cautious in the light of a heavy public debt burden. Partly because they are already growing at or above potential growth, as we heard from Prof. Christoph Schmidt (Chairman of the German Council of Economic Experts) a week after the publication of the Economic Outlook. The reason that potential growth is so low has, of course, everything to do with the productivity slowdown that was – very appropriately – the main topic of the OECD Ministerial Council Meeting in June 2016. Against this background, I think we will find more common OECD ground in 2017 if we focus strongly on boosting smarter structural policies as the main avenue to get out of the low-growth trap.

Let me mention just two concrete examples. The first is harvesting the great potential of the digital economy, both a priority of the German G20 presidency and a promising new horizontal project within the OECD. The second example is inclusive structural reforms, particularly in product markets, which can deliver short-term benefits in terms of output, investment and employment. Making reforms more inclusive is also about exploiting the complementarities between product and labour market reforms, the synergies between growth and equity objectives, and designing policy packages that help vulnerable groups or mitigate trade-offs. The launch of the 2017 OECD Going for Growth publication is an excellent opportunity to highlight this key point. Reinvigorating good old competition policy will also reinforce stronger and faster diffusion of new (digital) technologies from frontier to laggard firms and hence boost average productivity. Let’s not forget that structural policies are a traditional OECD strength, an area where the OECD holds a strong comparative advantage and rightly enjoys high international credibility.

What about inclusive trade policies? Well, that is my second key issue to focus on in 2017. Global trade growth has been very weak relative to historic norms for five years. The general consensus is that the relationship between trade and GDP growth is undergoing a fundamental shift. In the ‘good old days’ we enjoyed trade growth at a rate of twice global GDP growth; now trade barely keeps pace with global output growth. According to OECD analysis this also contributes to the productivity slowdown. So what exactly is going on with trade? Is low trade growth somehow intertwined with the general global growth malaise? To what extent is this due to global value chains contracting, as reflected in OECD analysis? Is the current slowdown in global trade only natural, and therefore not a major concern? In any case, it is clear that the rise of trade restrictions in G20 countries, still continuing in stunning contradiction to countless G20 communiqués, is not helpful. Deeper OECD analysis is required to pin down more precisely how the different factors contribute to the trade slowdown, and how trade impacts labour markets and economic growth in different regions within countries.

Deeper analysis, however, is not enough. We definitely need to ask ourselves some tough questions about where the public backlash against trade and globalisation is coming from and what went wrong. And even more importantly, what we can and should do better. One area is the need to rebalance our trade and investment policies, towards a more fair, sustainable and inclusive system. Making the OECD Guidelines for Multinational Enterprises the centerpiece of trade and investment policies would be a concrete step. Another area is more effective complementary domestic policies to help people deal faster and more successfully with trade-related job losses if and when they occur. Ideally, this entails not only effective ‘safety net’ policies but also so-called “trampoline” policies offering a tangible springboard to new jobs.

In any case, it is obvious that trade and trade policies are politically more under fire now than I can remember – and I am not young. As Martin Wolf wrote in the Financial Times, “The era of globalisation under a US-led order is drawing to a close…the question is whether protectionism and conflict will define the next phase”. For very open economies like the Netherlands it is of critical importance how this ‘next phase’ will shape up in 2017 and beyond. At this juncture, the Dutch economy is growing at a solid 2% per year (in 2016 and 2017) with unemployment coming down rapidly to 5%, but the downside risks are all related to where the global economy is heading. Many other OECD member countries have a similarly high exposure to shifts in the global economy. According to Open Market Index data from the International Chamber of Commerce (ICC), more than two-thirds of OECD countries have above-average openness, as measured by observed openness to trade, trade policy, openness to foreign direct investment and infrastructure for trade.

The OECD has a crucial role to play, in cooperation with other international organisations, in clearly demonstrating the adverse impact of rising protectionism, in monitoring what’s happening in trade and stimulating policy dialogue on better alternatives that help global growth.  In this light it is very fitting that the OECD Ministerial Meeting in June 2017 will focus on the theme of making globalization work for all. Let’s try to come up with concrete policy improvements that can help us preserve a well-functioning open global economy.

Useful links

OECD Global Economic Outlook, November 2016

The Future of Economics: From Complexity to Commons

Paul B. Hartzog, Futurist

This article looks at three crucial insights for the future of economics: complex adaptive systems; how technologies of cooperation enable commons-based peer-to-peer networks; and why we need complex adaptive systems to understand new economies.

COMPLEX ADAPTIVE SYSTEMS

The Edge of Chaos

The study of complex adaptive systems has enjoyed considerable attention in recent decades. Chaos theory reveals that out of turbulence and nonlinear dynamics, complex systems emerge: order from chaos.

We learned that complex systems are poised on the “edge of chaos” and generate “order for free” (Stuart Kauffman). They are composed of many parts connected into a flexible network. As matter and energy flow through, they spontaneously self-organize into increasingly complex structures. These systems, continuously in flux, operate “far from equilibrium” (Ilya Prigogine). Beyond critical thresholds, differences in degree become differences in kind. “More is different.” (Phil Anderson)

Complexity science reveals the difference between prediction and attraction. We can know that a marble in a bowl will reach the bottom even though we cannot predict its exact path because of sensitivity to initial conditions. Deterministic chaos means path dependence, where future states are highly influenced by small changes in previous states. A typical economic example is the lock-in of the now-standard “QWERTY” keyboard.
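The difference between prediction and attraction can be seen in a toy sketch (using the textbook logistic map, purely as an illustration): two trajectories that start a billionth apart quickly become unrecognizably different, yet both remain confined to the same “bowl”, the interval between 0 and 1.

```python
# Logistic map x -> r*x*(1-x) in its chaotic regime (r = 4): sensitive
# dependence on initial conditions inside a known, bounded attractor.
def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-9)   # perturb the start by one part in a billion

for t in (0, 10, 20, 30, 40, 50):
    print(f"step {t:2d}:  a = {a[t]:.6f}   b = {b[t]:.6f}   gap = {abs(a[t] - b[t]):.2e}")
```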

Networks

We see network effects: adding another node increases the value of a network disproportionately, because the number of possible connections grows roughly with the square of the number of nodes, economically “increasing returns to scale” (Brian Arthur). Reed’s Law goes even further: because new sub-groups can be formed, the number of possible groups grows exponentially. We know about “small-world” and “scale-free” networks, the latter so called because no statistic at any single scale is representative of the network as a whole: instead of a bell-curve average there is a “long tail”, mathematically a power law (a straight line on log-log axes). Some networks are robust to random failures but vulnerable to selective damage, i.e. network attacks that target nodes with higher centrality. Furthermore, “centrality” means different things in different network topologies. Network structure affects the frequency and magnitude of cascades. Like avalanches in sand piles, such cascades exhibit “self-organized criticality” (Per Bak), with event sizes following power laws.
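To see the growth rates involved, a back-of-the-envelope sketch (illustrative numbers only): with n members, the number of possible pairwise links grows with the square of n, while the number of possible sub-groups grows exponentially, which is the gap between Metcalfe’s Law and Reed’s Law.

```python
# Possible pairwise links (Metcalfe) versus possible sub-groups of two or
# more members (Reed) as a network grows.
for n in (5, 10, 20, 40):
    links = n * (n - 1) // 2      # quadratic growth
    groups = 2**n - n - 1         # exponential growth
    print(f"n = {n:2d}   links = {links:4d}   sub-groups = {groups:,}")
```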

Information Landscapes

Complex systems constitute “fitness landscapes,” exhibit cycles of growth and decline, are punctuated by explosions of diversity and periods of stasis, and show waves of ebb and flow, seen in traffic patterns. On fitness landscapes, algorithms that pursue merely maximization, without the ability to observe remote information from the landscape, freeze in local optima. Without system diversity, there is no improvement. Swarms escape because they not only read information from the landscape but also write to it, creating shared information environments.
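A small sketch of freezing in local optima versus the benefit of diversity (the landscape and all numbers are invented for illustration): a single greedy climber on a rugged fitness function stalls at the nearest bump, while a handful of climbers started from diverse points finds the global peak.

```python
import math
import random

def fitness(x):
    # An invented rugged landscape: a broad global peak plus local wiggles.
    return math.sin(x) + 0.3 * math.sin(5 * x) - 0.01 * (x - 7) ** 2

def hill_climb(x, step=0.05, iters=2000):
    # Greedy local search: accept a small move only if it improves fitness.
    for _ in range(iters):
        candidate = x + random.choice((-step, step))
        if fitness(candidate) > fitness(x):
            x = candidate
    return x

random.seed(0)
single = hill_climb(0.0)  # one climber, one starting point
swarm = max((hill_climb(random.uniform(0, 14)) for _ in range(20)), key=fitness)

print(f"single climber: x = {single:.2f}, fitness = {fitness(single):.2f}")
print(f"diverse swarm:  x = {swarm:.2f}, fitness = {fitness(swarm):.2f}")
```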

Landscapes and occupants impart selection pressures on each other. Good employees and good jobs both outperform bad ones. Agents and strategies evolve. Adaptation can become maladaptation when selection pressures change.

Dynamics and Time

When we study the spread of disease through a forest we see a slow progression of infected trees. However, when we study the spread of fire, we see the same pattern enacted much faster.

Complex systems and their dynamics are not new. What is new is that human systems have accelerated to the point where political, economic, and social changes now occur rapidly enough to appear within the threshold of human perception. We have moved from slow social movements to an era of “smart mobs.” Consequently, while it may be true that we did not need the tools of complex systems in the past, because economic change was slow and did not require a dynamical viewpoint, the current speed of economic change demands this new lens.

THE EMERGENCE OF COMMONS-BASED PEER-TO-PEER NETWORKS

A crucial global economic phenomenon is the rise of commons-based peer-to-peer networks. “Technologies of cooperation” (Howard Rheingold) enable people to self-organize in productive ways. Open-source software was one of the first clues to powerful new ways of organizing labor and capital. “Commons-based peer-production” is radically cost-effective (Yochai Benkler). By “governing the commons” (Elinor Ostrom), shared resources managed by communities with polycentric horizontal rules, without reliance on either the state or the market, escape the “tragedy of the commons.” Our thinking about production, property, and even the state must evolve to reflect the growing participatory economy of global stewardship and collectively-driven “platform cooperatives” (Michel Bauwens). New commons include food, energy, “making,” health, education, news, and even currency.

The rise of 3D printing and the Internet of Things, combined with participatory practices, yields new forms of value production, paralleling new forms of value accounting and exchange. We witness a “Cambrian explosion” of new currency species, like Bitcoin, and innovative trust technologies to support them: the blockchain and distributed ledgers. Just as 20th century electrical infrastructure remained fragmented until standards enabled a connected network (Thomas Hughes), new infrastructure matures when separate solutions merge and the parts reinforce the stability of the whole.

THE FUTURE FATE OF ECONOMICS

Economics as a discipline can only remain relevant as long as it can provide deep engagement with contemporary reality. Overly-simplified models and problematic axioms cannot guide us forward. The world is an interwoven, heterogeneous, adaptive “panarchy.”

Harnessing complexity requires understanding the frequency, intensity, and “sync” of global connectivity. Analyzing many futures demands better tools. To analyze “big data,” first we need data. Complexity science utilizes multi-agent simulations to investigate many outcomes, sweep parameters, and identify thresholds, attractors, and system dynamics. Complexity methods provide unique metrics and representations, animated visuals rather than static graphs.
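As an example of the kind of parameter sweep meant here, the following toy multi-agent contagion model (the network, spread rule and all numbers are illustrative assumptions) shows how sweeping a single parameter reveals a threshold at which small events tip into system-wide cascades.

```python
import random

def cascade_size(n=300, avg_degree=4.0, p_spread=0.1, seed=None):
    rng = random.Random(seed)
    p_edge = avg_degree / (n - 1)
    # Build a random (Erdos-Renyi) network as adjacency lists.
    neighbors = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p_edge:
                neighbors[i].append(j)
                neighbors[j].append(i)
    # Spread from one random seed node: each newly affected node gets one
    # chance to pass the shock to each still-unaffected neighbor.
    affected = {rng.randrange(n)}
    frontier = list(affected)
    while frontier:
        nxt = []
        for node in frontier:
            for nb in neighbors[node]:
                if nb not in affected and rng.random() < p_spread:
                    affected.add(nb)
                    nxt.append(nb)
        frontier = nxt
    return len(affected) / n

# Sweep the spread probability and watch cascade size jump near a threshold.
for p in (0.05, 0.15, 0.25, 0.35, 0.45):
    runs = [cascade_size(p_spread=p, seed=s) for s in range(20)]
    print(f"p_spread = {p:.2f}   mean cascade size = {sum(runs)/len(runs):.2f}")
```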

This is not just big data; it’s dynamic data. With distributed systems, it becomes peer-to-peer data: shared infrastructure. Just as ants leave trails for others, shared infrastructure bolsters interoperability through a knowledge commons. Restricting connectivity and innovation, e.g. with intellectual property rights, now carries extreme costs. Uncooperative agents and strategies pay a fitness penalty. Fortunately, new commons already have novel “copyleft” licenses, promoting fairness and equity.

Complexity science shows us not only what to do, but also how to do it: build shared infrastructure, improve information flow, enable rapid innovation, encourage participation, support diversity and citizen empowerment.

Useful links

Panarchy 101, or How I Learned to Stop Worrying and Love Global Collapse Paul B. Hartzog

The OECD organised a Workshop on Complexity and Policy, 29-30 September, OECD HQ, Paris, along with the European Commission and INET. Watch the webcast: 29/09 morning, 29/09 afternoon, 30/09 morning

Don’t be a skinny blue mushroom! Try the Insights quiz!

New year quiz 2017

Relaaaax, Jiminey, nobody cares

We were going to start by asking which of these expressions became popular in 2016: post-alive for dead; post-faithful for cheating; or post-truth for lies. But we’ve decided to make it easy and concentrate on celebrity gossip, reality TV, and sports. [Post-post-truth disclaimer: it’s on economics, politics and the other stuff we published on the blog over the past year.]