Title: Inter-Systemic Risk: resilience and the grand challenge
Approx. Reading Time: 15 minutes
Author: Peter Kingsley
Tipping points, cascading events, runaway climate change: just some of the language that attempts to grasp the realities of a chaotic, radically uncertain future. Peter Kingsley explores how catastrophic inter-systemic events will dominate the landscape, and why cultural transformation and breakthrough invention will be key to countering them. The grand challenge is how to navigate a path when no assumptions are safe and historical models are irrelevant. Time for predictive modelling that reflects the real world.
Title: Beware Blindspots
Approx. Reading Time: 9 minutes
Author: Peter Kingsley
The defining characteristic of human consciousness is that we project ourselves forward, exploring possible future worlds.
Peter Kingsley argues that innate neurophysiological forces are fundamental to how we think about long-term strategy and interpret emerging complex risks, and to how mental models can either inspire invention or act as barriers. One of the recurring lessons of recent history is that risk management shrinks the world to events that can be priced or codified, rather than addressing the broader vulnerability of complex infrastructures, social systems and financial markets. The challenge is to re-invent strategy, then risk.
Article publicly available
Title: Weak Signals: no surprises?
Approx. Reading Time: 8 minutes
Author: Peter Kingsley
The COVID-19 pandemic has demonstrated the destructive impact that ‘wild card’ events may have. Pandemics were well-known ‘high impact’ risks categorised as ‘near certain’ in risk registers.
Governments and companies around the world have learned a harsh lesson: wild card events should be constantly monitored for the first signs of emergence. Early action is critical.
Peter Kingsley revisits another equally important category in strategic risk management: weak signals—a key to thinking about the future.
Weak signals are ambiguous, barely defined stories that are easily dismissed, yet are a missing link in how to navigate a world of growing complexity, radical uncertainty and above all, speed.
The world’s most profound problems are often the result of short-term thinking, probabilistic worldviews, consensus culture and lack of imagination. Like wild cards, weak signals are ignored at our peril.
There should be few surprises.
Article publicly available
Title: Through a Glass Darkly
Approx. Reading Time: 12 minutes
Author: Peter Kingsley
With rationalism, scientism, big data and artificial intelligence promising answers to everything, we have forgotten the importance of creative imagination.
The paradox is that, contrary to conventional wisdom, confronted by an increasingly complex, interconnected and uncertain environment, imagined futures dominate our lives. We are shaped by our simulations, predictions and mental frameworks, continually re-inventing the world around us. We communicate our imagined futures through the stories we tell. Imagination is the engine of creation.
Article publicly available
Title: Politics and Machine Language Revisited
Approx. Reading Time: 11 minutes
Author: Peter Kingsley
In 1946, George Orwell, in his famous essay ‘Politics and the English Language’, wrote that
“In our age there is no such thing as ‘keeping out of politics’. All issues are political issues, and politics itself is a mass of lies, evasions, folly, hatred and schizophrenia. When the general atmosphere is bad, language must suffer.”
Today machines automate deep fakes. China is characterised as a ‘surveillance state’ and the West is captive to ‘surveillance capitalism’. Mass-scale manipulation by governments, political activists and terrorists makes daily headlines, but the workings of complex networks remain opaque. Secret states and unregulated technology companies host the ‘lies, evasions, folly, hatred and schizophrenia’ that Orwell talked about.
Yet, even now, few people recognise the real danger that machines will not ‘keep out of politics’. Peter Kingsley revisits Politics and Machine Language, with a new introduction.
Article publicly available
Title: The Misuse of Foresight
Approx. Reading Time: 8 minutes
Author: Peter Kingsley
Imagine you had an accurate picture of the future. Then imagine that the future threatened your interests. What would you do? We argue that the answer, too often, for corporate and political leadership teams, is to keep the picture secret, or create confusion and a web of deception. In other words, to misuse foresight.
Title: Propaganda Futures: the narrative is the message
Approx. Reading Time: 9 minutes
Author: Peter Kingsley
Marshall McLuhan famously said that ‘the medium is the message’. At the dawn of the information age, this had an element of truth. It has framed commentary on the role of media technology ever since. Most recently, Tim Berners-Lee talked about his fears that the technology he invented might yet be the ‘destroyer of worlds’, a technology so powerful that in the wrong hands it may undermine democracy.
Yet in a new media era where propaganda plays a decisive role in political security and stability, content, not technology, is the message, the manipulator of minds and behaviour.
Jorge Luis Borges’ 1941 short story
Amid the growing chaos and uncertainty pervading everything from climate risk to geopolitical instability and runaway artificial intelligence, resilience is a concept high on many leadership agendas. Yet resilience is a contested, ambiguous, and in some ways outdated word. What constitutes resilience in a future of chaos and catastrophic—even existential—risk?
Political, media and technological landscapes are dominated by contests about the future—a defining feature not only of culture, but of power. Political, corporate and Big Tech leaders battle over their own visions of ‘technologies of the future’ to create hope and aspiration, and to set agendas. For populist leaders like President Putin, Prime Minister Modi, former President Trump and many more, the contests are fought not only on the battlefields of imagined futures, but also in the re-imagined past—a world of history, myth and false memory.
Extreme weather around the world, a foretaste of things to come, has sent shockwaves through the scientific, political and financial communities. Suddenly, dangerous global warming is no longer an abstract and distant threat, but a real and immediate peril. Time is short. Few are ready for the possible fundamental shocks ahead.
The finance, pensions and insurance sectors have long relied on data and mathematical models to justify decisions. Risk is traded, based on projected returns. In this essay Peter Kingsley explores how in ‘edge of chaos’ environments, these culturally defined value models break down.
As tensions over trade, supply chains and ‘industries of the future’ continue to rise, the search for national innovation models is more intense than ever. DARPA is seen as the prime example of radical invention sponsored by government. Until recently, it had not been successfully imitated.
Complexity science has long had an image problem. Complex systems are opaque. Yet simplification is also a risk. Woe betide modellers who miss the key variables, like culture, or make flawed assumptions.
‘Predictive Analytics’ dominates dialogues from policy development and risk assessment to financial market forecasting. This briefing argues there is some way to go: claims are often overblown, and simulation, not prediction, better matches a real world characterised by complexity and radical uncertainty.
One of the critical measures of long-term value is the ability to act successfully on a mental model of the future operating environment. This is about acting on a fiction – an imagined future. In a word: foresight.
Foresight is about exploring possible futures, not prediction. It is innate, from our moment-by-moment simulations to our culturally shared imagined futures. This is not all: imaginative foresight is a primary source of innovation.
Organisational culture is a critical determinant of the success or failure of foresight. At the same time, how we think about the future is more important than what we know, or even what we think about.
There are critical relationships between scenarios and real options in long-term strategy, vision-led innovation and strategic risk. All depend on imaginative human talent. AI is best in ‘narrow’ applications, relying on big data and pattern discovery. Scenarios will remain the most flexible set of techniques, but AI promises a future where machines and human creativity are combined.
On the horizon: AI-driven foresight tools, a future world where man and machine work together and everything is about simulation, predictive systems and human imagination. Scenarios will remain the most flexible set of tools and techniques, but AI and data promise a future where machines and human creativity are combined, in increasingly sophisticated forms.
Dystopian fiction is more popular than ever. Orwell’s 1984 and Huxley’s Brave New World are etched in the public imagination. Here we explore how themes of totalitarianism, mind control and the end of the world shape real politics and inspire violence. Yet climate change seems beyond even the most talented fiction writers.
Right wing populists are setting agendas, evoking history to create a false sense of security. Their readings of the past may be selective, but they have a crucial advantage: they can call on an endless pool of evocative stories that recall former glories and mythical worlds.
Liberal leaders risk underestimating the long-term disruption to the world order if they fail to develop competing narratives.
Imagined futures and the narratives that describe them are cultural realities, influencing present day decision-making, investment priorities and judgment about future value. They are fundamental to identifying the early signs of emerging systemic shocks and to understanding how markets may navigate out of them.
Contrary to conventional wisdom and practice, some extreme events are foreseen and warnings issued, only to be ignored or misinterpreted. Worst case scenarios may make uncomfortable reading, but modern-day Cassandras should not be ignored.
Academic research and media coverage of artificial intelligence (AI) rightly majors on the enormous potential and, at the other extreme, the possible existential risks posed by ‘generalised’ machine intelligence. Yet three human talents will remain elusive, possibly for decades: prediction, creativity and storytelling. They may also hold the keys to breakthroughs.