Briefing
Total Surveillance Gets Emotional
Privacy In Crisis
Over the last decade, privacy fears have gathered pace, as intrusive social media and state security applications alike have threatened to erode long-established principles. Most recently, the realisation that integrated monitoring systems can use face recognition to identify both suspects and suspect behaviour in large crowds has deepened concerns, particularly in the West.
In all parts of the world, security monitoring continues to grow in sophistication and reach. In China, as The Economist put it recently, intelligence integration is the guiding principle:
“A system called the Integrated Joint Operations Platform (IJOP), first revealed by Human Rights Watch, uses machine-learning systems, information from cameras, smartphones, financial and family-planning records and even unusual electricity use to generate lists of suspects for detention.”[1]
Nowhere To Hide
The emerging narrative is that there is nowhere to hide. Governments and private corporations can mesh together, or ‘fuse’, what intelligence services typically call ‘all-source’ data. In particular, they can predict potential criminal activity and social unrest, and monitor political opponents. Philip K. Dick’s vision of ‘pre-crime’ technology in Minority Report has already been realised, imperfect as it is.
Even in the most liberal societies, the political trade-offs between public security and individual privacy are far from resolved. Weakening governments, some beset by increasing terror risks, lack the authority to impose fair, well-developed and sustainable policies. Authoritarian states have few reservations. Meanwhile, technological ingenuity accelerates the rate of change, and with it the challenge to ethics and governance.
There are few signs, even in the liberal West, that powers will be restricted. Since there is little effective scrutiny and little public understanding, surveillance regimes of all kinds can hide behind complexity. Even where there is some transparency, lawmakers lack the depth of knowledge to make the right judgments.
Meanwhile, state and private intelligence services make the case that new technology heralds a better, more secure world. Unfortunately, innovation is not always good, nor always in the hands of ‘responsible actors’.
From another perspective, someone, somewhere, knows more about us than we can imagine.
Watching Our Emotions
What is missing from the alarmist narratives is that the real revolution is in the politics of emotion. Cultural privacy norms may be under pressure, but current technology operates only at a superficial level. We have seen nothing yet.
What comes next should be a source of wonder and fear. Sensors and machines are on the brink of developing insight into our most private, unconscious emotions and moods. It is one thing to map relationships between people in networks, track travel movements and follow buying patterns and outward behaviours. It is quite another to gain entry into our unconscious, inner worlds.
Yet this is what we will soon see emerge. Predictive models that draw on everything from heart-rate monitors and biometric sensors to facial expressions and gait will transform the accuracy of both private monitoring applications and surveillance.
To put this in context, current ‘sentiment analysis’ is largely based on measurements of arousal (excitement versus relaxation) such as heart rate or skin conductance. This tells us nothing about another important dimension of emotion and mood – valence – that contrasts the state of pleasure (e.g. joy) at one end with displeasure (e.g. fear) at the other.
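To make the two dimensions concrete, the sketch below encodes the familiar two-dimensional (valence by arousal) picture of affect, in which an emotional state is a point on a valence axis and an arousal axis. It is an illustrative toy with assumed value ranges, thresholds and labels, not a description of any deployed monitoring system:

```python
# A minimal sketch of the two-dimensional (valence x arousal) model of affect.
# Value ranges, thresholds and labels here are illustrative assumptions,
# not any real sensor pipeline.

from dataclasses import dataclass

@dataclass
class AffectReading:
    valence: float  # -1.0 (displeasure, e.g. fear) .. +1.0 (pleasure, e.g. joy)
    arousal: float  # -1.0 (relaxation) .. +1.0 (excitement)

def label_affect(r: AffectReading) -> str:
    """Map a (valence, arousal) point to a coarse emotional quadrant."""
    if r.arousal >= 0:
        return "excited / joyful" if r.valence >= 0 else "stressed / fearful"
    return "calm / content" if r.valence >= 0 else "bored / dejected"

# Arousal alone (as measured by heart rate or skin conductance) cannot
# tell these two states apart; only valence separates them.
print(label_affect(AffectReading(valence=0.7, arousal=0.8)))   # excited / joyful
print(label_affect(AffectReading(valence=-0.7, arousal=0.8)))  # stressed / fearful
```

Both example readings show the same high arousal; only the valence dimension distinguishes joy from fear, which is precisely the gap in arousal-only ‘sentiment analysis’.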
On the one hand, if machines ‘know’ that we are feeling stressed and angry, they can help us relax. We will welcome well-being and health-monitoring applications that help us calm down after a stressful day. We will willingly allow trusted partners to access our emotion-sensor data if there are health benefits and our data is secure.
On the other hand, we can be manipulated, for good or ill, without being aware of our own ‘state’, or of what the machine is doing. We may be calmed, say, in public spaces, echoing the soma of Huxley’s Brave New World. New technologies will not just monitor facial expressions to predict criminal behaviour. They will enable the large-scale monitoring of the emotional experiences and ‘states’ of everyone in public and, with some limitations, private spaces.
Some of these technologies are already in general use. Facebook, Google and Twitter mine for emotional clues on the pretext that if emotional states can at least be estimated, then advertising can be more ‘targeted’, encouraging what is usually called engagement. Face recognition, linked to ever-growing public and private data sources, is improving fast as machine-learning techniques gain access to more data. When ‘nanoscale’ drones take to the air, the tension between private good and public security will take on new dimensions.
Peter Kingsley
July 2018
Footnotes
[1] https://www.economist.com/briefing/2018/05/31/china-has-turned-xinjiang-into-a-police-state-like-no-other