
Long Read Overview

Title: Politics and machine language
Author: Peter Kingsley
Approx. Reading Time: 10 minutes

Long Read

Politics and machine language

In 1946, George Orwell published a famous essay entitled ‘Politics and the English Language’.  He wrote:

The great enemy of clear language is insincerity.  When there is a gap between one’s real and one’s declared aims, one turns as it were instinctively to long words and exhausted idioms, like a cuttlefish squirting out ink.  In our age there is no such thing as ‘keeping out of politics’.  All issues are political issues, and politics itself is a mass of lies, evasions, folly, hatred and schizophrenia. When the general atmosphere is bad, language must suffer.

On the horizon, machines may not only understand language – helping us search for the latest gadget, or acting as servants – but become authors.  The signs are there.  Machines already write routine news reports and even formulaic popular novels, driven not by understanding language, but by a growing recognition that emotive words sell.

In academic terms, machines have agency.  They will increasingly manipulate our emotions, shape what we think, frame our daily lives.   Will they be sincere?  Will they ‘keep out of politics’?

No.  Welcome to the age of politics and machine language, circa 2025 and beyond.  Machines will disrupt culture and commerce and potentially destroy political stability.  With power shifting away from elected governments, trust in public institutions and old media eroding, and the liberal consensus weakening, some extreme scenarios may play out sooner rather than later.

Competing machine narratives

There are competing narratives in which artificial intelligence (AI) and robotics threaten everything from jobs to the economy and social and political stability at one extreme, or at the other emerge as a transformative force for good.  Many of the strategic risks and opportunities will take decades to emerge.  Others are clear and present dangers.

It is at the opaque and secretive nexus of four deep driving forces that fundamental structural and political change will emerge:

  • Artificial intelligence (AI);
  • Cyberwar;
  • Radical new media ‘social machines’ (in simple terms, highly automated social networks); and
  • Targeted propaganda.

The fragmented narratives that surround these complex systems – each one fragile, fast moving and disruptive – reflect deep structural uncertainty.  Yet it is clear they are on a collision course.  In the vacuum left by weak political leadership, they will set the stage for 21st century chaos.

The stakes are high.  The underlying problem is that no-one understands these systems and their interdependencies with real-world social change and political economy.  There is no widely accepted working model of how the world works, or may work in future.  What is emerging instead, with full force and bewildering speed, is a post-modern ‘no-truth’ extremist nightmare in which there are no grand political narratives that bind large groups together, or reconcile looming large-scale conflicts.

Artificial intelligence: language barriers

Taking each in turn, AI, at one extreme, will develop the ability to accurately interpret text, music, image and emotion.  Machines will generate expression and meaning, acting as artificial authors, composers and artists.

At the other extreme, AI will never reach any kind of parity with humans.  There are many reasons to be sceptical.  AI ‘deep learning’ depends on vast amounts of data, much of which is flawed and so skews accuracy.  Messy data, messy algorithms.

Less obviously, AI is designed by humans and so biases – by gender, race, cultural group, political affiliation – are embedded, pervasive and invisible.  Messy humans, messy algorithms.[1]

Similarly, machines cannot understand ‘unwritten’ cultural cues.  The unconscious assumptions and intentions of the author are not made explicit in any text, video or music.  They work beneath the surface.  Machines cannot yet understand context.  Nor can they extract underlying storylines, narratives and myths that together create human associations and meaning.  Arguably, they never will.  Human understanding, communication and meaning are profoundly complex.

Somewhere between these extremes, another more dangerous outcome may be that machines appear to be human.  They may convince people that they are human, and since perception is reality, an Orwellian nightmare may emerge in which trust and sincerity are lost in a hall of mirrors, indistinguishable from the real thing.

Similarly, in the short term, research into human neurobiology will uncover not the secrets of language, but of emotional engagement, with even more far-reaching implications for businesses and politicians focused on competing for attention.  Emotion sells.

There is more.  Algorithms reflect the commercial and political intentions of their creators. Commercial interests will likely favour creating emotional impact to drive sales, build brand awareness, or influence political outcomes.  There is a profound difference between human writers expressing a point of view and future machines, not least because they will operate far more rapidly, on a wider scale and process output through predictive models that sense readers’ responses.

Imagine a million invisible intelligent agents – ‘bots’ – swamping all present and future forms of media.

Cyberwar

The invisible influence of AI will become the dominant soft power for corporations, intelligence services and political parties, playing with people’s minds.  At the same time, cyberwar may get physical, as illustrated by the Stuxnet virus.  Electrical power grids will be targeted, as the UK National Cyber Security Centre alleged in July 2017.[2]  Air traffic control disrupted.  Internet blackouts.  Health services hit by ransomware attacks such as WannaCry.  It may verge on the modern equivalent of a ‘shooting war’ on multiple critical infrastructures, or even the stand-off commonly known as ‘mutually assured destruction’.  The emergence of the Internet of Things (IoT), insecure and vulnerable from the outset, will add fuel to the fire.

Cyberwar has been hidden from public view for some years.  Only recently has the severity of the risks of miscalculation and escalation been revealed in public.  The full details of the alleged attacks on the US Presidential election of 2016 have yet to play out.  So far, the ‘spectacular’ that some analysts fear has not materialised, but it cannot be discounted given the revelations about the Nitro Zeus cyberwar capabilities.

TV5 Monde was reported to be within hours of complete meltdown in 2015, as a result of an alleged Russian cyber assault.  ‘Old media’ appear to be a popular target against which to make shows of strength, a reminder that coups have frequently begun with television and radio channels.  Yet it is core infrastructures – power networks, water, communications – that will be the primary targets in the event of real escalation.

Media old, new and endangered

Against this background, old media is in perpetual crisis as Facebook, Google and what remains of a fragmented open web dominate the gateways of news consumption, with profound implications for political and social cohesion.

Old media newspapers like the New York Times, the Washington Post and The Guardian in the UK have become technology-driven businesses in search of readers.  Public service broadcasters will continue to struggle to maintain relevance and authority and to extend their reach.  Demographics and technology will fragment audiences, and recent political upheavals, such as Brexit and the US Presidential election, illustrate that substantial numbers of people are either vulnerable to the lure of populist slogans or remain ‘off-grid’.

The old adage that ‘the average newspaper is simply a business enterprise that sells news and uses that lure to sell advertising space’ holds true.  The NYT, together with the vast majority of old media companies, has embraced Facebook, battling for attention from 1.94 billion active Facebook users.  Meanwhile, nine of Facebook’s top 10 ’trusted’ sources are old media.  The only new name is Buzzfeed.  New meets old in another sense: Jeff Bezos is the highest-profile media mogul, buying The Washington Post with his own money, leaving editorial direction to its editors and reinventing the paper as a ‘media and technology’ company.

Even so, the dominant narrative is that traditional media is unable to generate revenues to underwrite long-term production of high quality, independent journalism, or rebuild trust.  In the fight for survival, the risk is that revenues from populist reporting come before independent journalism.  As Edward Luce put it at the height of the US election, under a headline that read ‘The art of defrauding America’, ‘ratings beat integrity’:

Mr Trump’s genius is to grasp that television’s desperate quest for ratings outweighs any ideological leanings.  Leslie Moonves, chairman of CBS, put it well earlier this year. Mr Trump’s celebrity had worked miracles on the network’s advertising revenues. “It may not be good for America,” he said. “But it’s damn good for CBS.” In an age of ever-thinner gruel for the TV business, Mr Trump offers repeated sugar highs.[3]

Old, mainstream media is failing on both richness and reach, which means its real impact on public discourse will continue to decline.

Yet it is not a narrative of ‘new media wins, old media loses’.  The onward march of social networks is fragile.  Signs of the backlash will emerge.  Millennials will soon tire of being used by businesses they see as parasitic.  Many already have.  Ad blockers will undermine deep assumptions about the value chain.  Deep learning depends, at least for the time being, on deep data, so the vulnerability of Facebook and Google to a mass exodus may over time create a domino effect.  Imagine the anti-social network.  Privacy-oriented search engines will go from strength to strength.

“You can make the walled garden very very sweet,” Berners-Lee said at a web summit last year. “But the jungle outside is always more appealing in the long term.”[4]

As the next generation of social machines, armed with a flawed understanding of language, text, image and emotion, takes to the stage, there may be unlimited potential for audience manipulation and for errors.

These developments are cause for concern.  Even simple Twitter messages can change history.  In 2015, the German Federal Office for Migration and Refugees made a mistake, tweeting that the Dublin procedure[5] had been suspended.  It made Germany the first choice for Syrian refugees.[6]

The emerging narrative is that social machines, for all their potential benefits in terms of collaboration, will at worst manipulate audience emotion and attract both large-scale attention and advertising.  They will sway not just media and consumer behaviour, but dominate and shape politics.

Propaganda transformed

This kind of manipulation will go far beyond the extremism already rampant, without meaningful sanction, on social networks.  It will involve emotional targeting as well as ‘big data’ gathered by bringing together multiple sources of public and private intelligence.  Unpredictable systemic impact on the fabric of society and political culture is the most likely outcome.  The allegations made by Carole Cadwalladr in The Observer about the role of Cambridge Analytica, under the headline ‘The great British Brexit robbery: how our democracy was hijacked’, may only be the tip of the iceberg.[7]

If the algorithms remain commercially proprietary and opaque, then in the extreme, we will be governed by invisible, non-human hands, as well as by the anonymous trolls that already stir up violent, negative emotion.  Whoever controls the machines controls global consciousness and perception.

At the intersection of AI, cyberwar and social media, there are more disturbing goings on, largely hidden from public view in the media underworld and the deep web.

Russia and China counter Western mainstream media by targeting specific audiences via social media channels.  China’s ‘Three Warfares’ integrates psychological warfare, media (or ‘public opinion’) warfare and legal warfare.  Meanwhile, extremist groups like ISIS run propaganda and recruitment campaigns across the globe on the deep web, Facebook and Twitter.

Joseph Nye argues that 21st century conflicts are less about whose armies win and more about whose story wins.  Academics refer to this as ‘non-linear’ power.  The objective, now widely accepted in Western defence circles, is to dominate the ‘strategic narrative’ and counter disinformation.

Propaganda is not new, but a more dangerous era of direct intervention may emerge.  To some, this represents the key component of next generation warfare, which is about ‘winning without fighting’.

Sound familiar?

What is new is scale and sophistication.  Imagine machines, operating in real time, personalised to individual and group targets, driven by emotionally sensitive messages designed to undermine mainstream political discourse; throw authority into doubt; drive instability; and create chaos and confusion.

Politics and machine language

All this builds up to the critical question of what impact AI, cyberwar, social machines and a new generation of propaganda may have on political stability.

In an age of radical post-modernism and in what is commonly called a ‘post-truth’ society, the commercial issues may pale into insignificance against the possible political implications and structural outcomes.

It is routine for political campaigns of all colours to use big data, social media sentiment analysis and demographic segmentation to ‘get the word out’, as President Obama’s campaigns repeatedly demonstrated.  Yet this is just the beginning.

Of more concern, in a ‘post-factual’ world, false accusations travel around the world faster than corrections, despite the emergence of more than 100 ‘fact-checking’ web sites.  This then is a picture of chaos that is set to continue.

Twitter, according to research in 2016, ‘made Donald Trump’, or rather created the conditions within which populism has emerged.  Whether it is state intelligence services or populist politicians, a new age of machine-driven propaganda has boundless possibilities.  To some commentators, the purpose of propaganda is to undermine basic social assumptions about how the world works in order to maintain power and the status quo.[8]

The politics of emotion is uncharted territory, experienced by everyone, understood by few.  If the primary focus of populists of all kinds is the politics of fear, then rational analysis, debate and stability are all at risk from supercharged AI.

Machines and those behind them will not only change what we mean by ‘sincerity’ in the language of politics, they will change emotional landscapes and with that, the world order.

Peter Kingsley
01.08.2017


Footnotes

[1] ‘Semantics derived automatically from language corpora necessarily contain human biases’, Aylin Caliskan-Islam, Joanna Bryson and Arvind Narayanan, August 2016.
[2] https://www.theguardian.com/technology/2017/jul/18/energy-sector-compromised-state-hackers-leaked-gchq-memo-uk-national-cybersecurity-centre
[3] https://www.ft.com/content/99ff79ea-826f-11e6-a29c-6e7d9515ad15
[4] https://www.wired.com/2016/06/inventors-internet-trying-build-truly-permanent-web/
[5] The Dublin Regulation (Regulation No. 604/2013; sometimes the Dublin III Regulation; previously the Dublin II Regulation and Dublin Convention) is a European Union (EU) law that determines which EU Member State is responsible for examining an application for international protection lodged by an asylum seeker under the Geneva Convention and the EU Qualification Directive.  Source: Wikipedia
[6] https://www.theguardian.com/world/2016/aug/25/it-took-on-a-life-of-its-own-how-rogue-tweet-led-syrians-to-germany
[7] https://www.theguardian.com/technology/2017/may/07/the-great-british-brexit-robbery-hijacked-democracy
[8] Adam Curtis: ‘Hyper-normalisation’ – http://www.bbc.co.uk/mediacentre/latestnews/2016/adam-curtis-hypernormalisation


