Civilization and the Complexity Trap

Civilization has set a trap for itself. Humanity has made amazing progress over the centuries — progress made possible by the rise of the expert. But this progress creates a new problem, as ever-more-complex technologies are deployed at an accelerating rate. Billions of different devices, protocols, ideas, traditions, and people interact — second by second, all over the planet, creating an emergent complexity crisis.

Experts understand parts of this system, but the overall complexity is far beyond the comprehension of any individual — whether scientist, citizen, or political leader. This complexity presents a huge challenge for society: It’s hard to ensure the stability of the system if you can’t track all the key interactions and feedback loops that may cause trouble.

So how will we address the wicked challenges of the next decade? We need a “paradigm shift” in societal regulatory systems, with bold, creative new ideas to break us out of the complexity trap.

Unexpected consequences

While we have come to this point gradually, we have seen foreshocks at earlier stages of technological development. Looking back over the last several hundred years, science and technology, guided by reason and knowledge, have clearly improved daily life for most of humanity. Yet progress is not linear. It always involves some kind of disruption: Each advance leaves side effects that society then struggles to solve.

For example, the Haber-Bosch process for artificial nitrogen fixation increased agricultural yields, but runoff from excess fertilizer use now pollutes waterways around the world. The internal combustion engine made transportation far more convenient, but its emissions contribute to global warming. Chlorofluorocarbons, used as refrigerants, caused the ozone hole, and efforts to replace them gave rise to other compounds, hydrofluorocarbons, that are potent greenhouse gases. Antibiotics have saved hundreds of millions of lives but are now used so widely that drug-resistant strains pose a new risk to human health. There are many more such examples across all areas of science and technology.

Such problems arise because of system-level effects that are not obvious when new technologies are first introduced. Such unanticipated consequences can occur at almost any level — chemical, biological, computational, economic/financial, or social/political. But emergent complexity becomes an increasingly serious problem with the rise of computers — as individual components of the system “get smarter,” interact more rapidly, and now get coupled on a global scale. On one hand, connecting everyone and everything via the Internet offers tremendous convenience in many realms of life, but this connectivity also has allowed the rise of cybercrime and cyber-warfare.

Social media have facilitated information sharing but also enabled information abuse, as seen, for example, in Russian interference in the 2016 U.S. presidential election. In the financial sector, computers allow transactions to occur at such blinding speed and in such complex patterns of action and reaction that it took months to analyze the “flash crash” of May 2010, when the Dow Jones Industrial Average dropped about 1,000 points and then recovered, all within roughly half an hour. New financial derivatives that exacerbate this complexity exist only because of advances in mathematics and computer science, yet it was impossible for U.S. officials to foresee all the global consequences when Lehman Brothers was allowed to go bankrupt in 2008. The financial ties were so complex that some court cases resulting from the bankruptcy were not resolved until 2018.

A growing threat

There is a danger for democracy when the complexity of real-world events moves beyond the limits of human comprehension, and we see this daily in the news. It becomes impossible for voters and elected leaders to make wise choices when we cannot even agree on the facts or settle on reasonable priorities for governance. And there is a further risk: other, non-democratic forms of governance may be less vulnerable to the effects of such complexity and may gain a competitive advantage. Anarchy may thrive, and dictators need not devote resources to sharing information, building consensus, or addressing social problems.

While these challenges may seem purely political, I think they are intertwined with broader issues about science and society. As a scientist whose career focused on the structure and design of DNA-binding proteins, I resigned a tenured faculty position in the Department of Biology at MIT to examine these larger challenges of human thought and the human future. As I studied finance, cognitive neuroscience, governance, climate change, environmental degradation, and the risks posed by the rise of artificial intelligence, it became clear that the limits of human cognitive capacity leave us struggling amid the complexity of the challenges now facing the planet.

So, what are we as a society to do? It is not reasonable to ask scientists or experts to anticipate the full effects of their work. Science is driven by discovery, often involving a race to find answers. Anyone who pauses too long to think about implications could “lose” to someone who keeps the blinders in place and dashes to the finish line. But, at the same time, it is very much in the interest of scientists to ensure that their discoveries are used in a way that offers a net positive benefit to society.

Innovations in other areas, like finance, lead to similar challenges. Regulatory efforts, such as the Dodd-Frank Act in the United States and the tens of thousands of pages of associated regulatory documents, are far too complex for even policy-makers to read or understand in meaningful detail. Meanwhile, blockchain and bitcoin, and the prospect of new currencies sponsored by companies, raise a host of new regulatory challenges that society has barely begun to explore.

There also are limits of human knowledge, and limits to our modeling capacity, that prevent us from taming emergent complexity by developing some “full model” of all our interconnected scientific/technical/social/political/financial systems. Models break down at this global system level. Scientific research typically depends on isolating a subsystem in the laboratory, adjusting one variable at a time as the experiment proceeds, having a suitable control, and letting others repeat the experiment and verify the results. All of this is lost when we struggle with full-world complexity.

A new way forward

We need some new approach, some new way of handling emergent complexity, and I think that begins with recognizing the following:

1) New levels of complexity create and exacerbate challenges for society.

2) These new levels of complexity engender two kinds of “external costs” paid by society as a whole. Some involve direct damage, as when flaws in Facebook were used to incite hatred and disrupt the U.S. elections in 2016. Other external costs are less direct, as when precious time and attention are needed to sort out the problems and develop some effective plan.

As with other external costs — like those of using fossil fuels — society faces a fundamental challenge in apportioning “costs” and “benefits” of complexity in some fair, reliable, well-structured way. The current debate over a carbon tax, for example, shows some of the general problems we have in measuring and distributing such costs; and related challenges are seen in all of the ongoing struggles to help regulate social media.

3) Society needs better ways to evaluate potential problems. Companies developing revolutionary new technologies, for example, should evaluate and mitigate risks at key points in research, development, and implementation. This evaluation should aim to anticipate a range of potential outcomes, weighing costs and benefits to society.

4) We need some approximate ways of measuring these costs of complexity, even while acknowledging that such measurements will never be as precise as we would like. We also must apportion costs, benefits, and responsibility, ensuring that those who develop and sell new systems repay society for the external costs of their technologies.

These first working assumptions do not yet solve the problem, but they frame it well enough to serve as a call for advice and comment from the scientific and policy communities and to provide the foundation for some productive meetings. These could be open-ended discussions funded by government organizations, or tech companies, or by philanthropists who want to preserve a democratic system and a livable human future.

Democracy and capitalism, coupled with modern science, have allowed a wonderful flourishing of thought, creativity, expression, and invention. Experts have done brilliant work, operating under the assumption that knowledge would steadily increase, guided by rational thought. But we now enter a new phase of civilization, in which emergent complexity is creating a world that no one understands in detail.

We need a wider discussion: This goes beyond a technical solution with some clever new program or device, or some new brain implant. I think it begins with “gating mechanisms” and new kinds of regulatory schemes that can act as precautionary measures when a technology is first introduced. Ultimately, we will need to upgrade our very methods of thought and hold complexity at a level that allows for effective mechanisms of governance. This is a call to action, worthy of our brightest minds, to help ensure a livable human future.


This essay was also published on Medium.