The narcissism of the digital world
Technology is not to blame for the never-ending erosion of our privacy.
I’ve spent over a decade writing and talking about technology, and how central it is to our changing way of life. I am excited about the potential of technology to improve human lives, from medical advances to more plentiful, clean energy and faster, cheaper travel.
Inevitably, though, I also spend a lot of time warning about the potential downsides, especially of the digital, data-driven technologies through which we live more and more of our lives. It is, after all, an asymmetric world, in which a few companies and organisations collect data about us all, aggregate and analyse that data, and use it for their own purposes.
This shift into a data-driven society certainly has implications for our privacy. Almost anything we do, from texting our friends to walking around our own home, may be feeding data into an algorithm without our knowledge. The fastest growing area of the tech market, pre-pandemic, was smart speakers – those devices, like Amazon’s Alexa, that we install in our own homes to listen for our requests, play us music or answer our questions, like attentive servants.
The timesaving, effort-reducing qualities of this all-pervasive technology have led some people to talk about a ‘Digital Athens’. Like the citizens of that first democracy, we will be freed up by the labour of our new AI slaves to engage in political and philosophical debate. But I wonder if our technological world is not more like a digital Wolf Hall. Our houses are full of servants, but who are they really working for? They fetch what you command, but their true loyalty is to other masters.
Technology also increasingly dominates our time and attention. There’s always a funny new video to watch, or a response to our latest witty post to check out. Most of all, the ease of following where the algorithms lead puts us all in the passenger seat of our own lives. Human judgement, responsibility and initiative are handed over more and more to automated systems. Even when the systems are supposedly designed for our own good, handing over control to technology is eroding what it means to be an adult, a citizen, a person.
Grasping this change – which I explore in my new book, Technology is Not the Problem – is difficult, however. Indeed, it’s hard even to remember a time when things were different, so let me give you an example.
When I first moved to London, way back in the previous century, I never went anywhere without my London A-Z. It was an atlas of London’s streets, with every road name indexed in the back. It didn’t matter where I was in London, I could look up the street I was on and the street where I was going, and work out how to get there, following the map from page to page as I walked.
I can’t remember exactly when I, and many others like me, stopped carrying our A-Zs. But it probably coincided with the moment we could access all that information, and more, on our mobile phones. All I now need to do is put in a postcode or address, and an app will offer me a menu of ways to get there – walking, cycling, driving or public transport. It will tell me when the buses are due, and whether there are delays on the Tube. I can follow a blue dot showing my location or, if I don’t want to read a map, some apps just put an arrow on the screen showing me which way to go.
This example captures a key aspect of the impact of technology on our lives. Instead of accessing the same information as everyone else, I now get information tailored to my specific needs. This is increasingly how we expect the world to work – addressing us as specific, unique individuals.
But there are limits to this expectation. In 2019, some television viewers in New Zealand were startled to hear an advert address them by name. ‘Welcome, Rebecca, to the Unlimited’, said the presenter of Skinny Mobile’s ad on TVNZ – but only to viewers called Rebecca. Streaming customers with any of the 200 most popular names also heard the presenter say theirs. Channel 4 in the UK experimented with advertising so personalised that it said your name back in 2017. It didn’t go down well. Hearing our own name does grab our attention, but there’s a big difference between unexpectedly hearing a friend call out to you from across the street, and finding an over-familiar stranger in your living room.
So we don’t like it when the attentive algorithms get too personal. But we also don’t like it when our digital world isn’t personalised enough – when we get adverts for the thing we’ve already bought, or for the wrong sports team. We have an ambivalent relationship to the online and offline technology that profiles and targets us.
When I visited Samsung’s smart-home hub in Surrey, in south-east England, a representative told me that its system knows when adults are watching TV downstairs, and kids are watching it upstairs. Samsung could target TV ads to each individual in the house, but as it stands most of us wouldn’t accept that.
In the physical world, too, customised products and services are proliferating. You can buy children’s books in which the hero has not only your kid’s name, but their skin, hair and eye colour. You can get an Ordnance Survey map centred on your house, or pyjamas printed with your pet’s face on.
This personalised world would amaze our great-grandparents. In their day, only the rich got bespoke products and personal services. Everyone else got mass-produced goods and mass culture. But from roughly the 1960s onwards, that all started to change. Since then we’ve seen the slow and uneven transformation of a mass society into an increasingly personalised one.
So how has digital technology, collecting data and building profiles, helped make this personalisation possible? How have algorithms turned all that information into useful insights about what makes each of us unique? Can Big Data really know me better than I know myself?
Curious about claims that algorithms can map our personalities and predict our behaviour, I gave the University of Cambridge Psychometrics Centre access to my X account, so it could profile me from my tweets. According to its analysis of my social-media activity, I am less than conscientious, sometimes disagreeable and positively introverted. I am also, according to its analysis, male (I’m not) and aged around 30 (I’m not), which does help to explain the adverts I get for watches and beard-care products.
Predicting personality traits from social-media activity is not as accurate as you might have heard. Those anti-Trump and anti-Brexit journalists who claimed that voters were tricked by micro-targeted Facebook campaigns in 2016 should relax. Those techniques are nowhere near as powerful as companies like Cambridge Analytica claimed.
But there are aspects of personality profiling that should still concern us. The Cambridge Psychometrics Centre carried out the original research on Facebook data that was later used by Cambridge Analytica. According to a 2013 paper by Michal Kosinski and colleagues, ‘Private traits and attributes are predictable from digital records of human behaviour’. The researchers claimed that age, gender, sexual orientation, intelligence and even key personality traits can all be predicted from what people like on Facebook. ‘People may choose not to reveal certain pieces of information about their lives, such as their sexual orientation or age’, they wrote, ‘yet this information might be predicted in a statistical sense from other aspects of their lives that they do reveal’.
Some of the findings were surprising. Liking curly fries on Facebook was found to predict higher intelligence, for example. There’s obviously no guarantee that any one curly-fries-liking person they tag as intelligent has that quality in real life, but it is statistically likely that they will.
The claim that ‘Big Data knows you better than you know yourself’, which I shamelessly used to sell my first book, is not true of course. Neither data nor the computer programmes that analyse them can know anybody in the way one person knows another person. Having no mind of its own, a computer programme can never imagine itself inside somebody else’s mind. It can’t empathise, or intuit why somebody said or did something. All it can do is measure, record, compare with previous data and assign a probability that some other data will also be observed.
That is why the Cambridge Psychometrics Centre’s algorithm assigns me a position well into the masculine side of the gender axis, and predicts an age of around 30. It’s because others who previously participated in the centre’s research, and whose tweets resemble mine in some way, were male and aged around 30.
This character portrait is almost the opposite of being intimately known and understood by another person. Humans have a theory of mind. We use our imaginations to understand what it might be like to be someone else, and what we would do if we were them. The data-driven technology, by contrast, is engaged in a mass sorting exercise, looking for statistical relationships in large populations and then applying the results to each of us. So, on average, I’m a 30-year-old bloke.
That profile of me is a mathematical model of a person, constructed by automated systems. It becomes me in the digital world without my knowledge or control. Most options, be they job adverts or dating opportunities, are removed without my ever knowing they existed. This is how technology creates our personalised world. Instead of me choosing the direction of my life and seeking out the opportunities, people or ideas that I think will help me on my way, I get to choose from a limited menu designed for my digital double. Unlike me, that persona cannot think for itself or decide to do anything I haven’t done before. There is no ‘who’ at the centre of my personalised world, only a ‘what’. The first casualty of personalisation is the person.
But this proxy personalisation wouldn’t work at all if I didn’t play my part. However much we worry about the creepiness of the profiling and personalisation, and adjust our device settings to block ads and cookies, we keep going back for more. So why do we choose to live in this personalised world?
The answer started to crystallise when, out of the blue, I was invited to Berlin to talk to the leaders of some fashion companies about what was changing in their industry.
I am the least fashion-conscious person you’re ever likely to meet. But the conference organiser recognised that digital technology was transforming the fashion business, and wanted me, as the author of a book on Big Data, to spell out the implications.
Sometimes, to understand something, you need to look at it from the other side. I was used to talking to people about Big Data, from how it works to the social and ethical pitfalls it brings. But to get that across to people who lived and breathed fashion, I needed to start from where they were – in an industry that anticipates what people will want to say about themselves, and produces a range of options from which customers can choose, and express who they are.
Data and profiling are disrupting their businesses. Top-down advertising is becoming less important than peer-to-peer influence, and brands need to engage directly with their audience via social media. The digital world is allowing consumers to reshape fashion themselves. And that means that companies have to learn in real time how their customers are making fashion their own, and adapt accordingly.
It was at this moment that I realised that the world of personalising and profiling technology was a continuation of fashion by other means. The fashion industry has always been powered by the human urge to say something about ourselves by how we appear to others. How we look has always mattered.
Technology is now simply giving us more choice than ever before, and more platforms on which to present ourselves. The business models are new, but they’re built on the deep and ancient foundations of the human urge for self-expression. I had the answer to my question. Why do we ultimately embrace intrusive data-collecting technology? Because we want the recognition of us as individuals it brings. We need to feel that we can be seen by others as our authentic selves.
This increasingly personalised world is not being driven by technology. It is being driven by us, by our emotional, psychological, even existential needs. I spend my professional life studying the trends that will shape our future, especially around technology. But when people ask me to name the big idea that will most influence Western societies in the next 30 years or so, it’s not colonising Mars, extreme longevity or brain-computer interfaces. It’s not even digital data or AI, though they will be everywhere, and we’ll conduct most of our everyday lives through them. The idea that will do most to shape our immediate future is identity.
Identity is the dominant way we understand ourselves in the world today. The idea of identity captures our need to be recognised, as both unique and authentic, and as a member of a particular group. For most of history, rigid social expectations pressured people to play narrowly defined roles in public, to hide sides of themselves that were important to them but socially taboo. Today, we are encouraged to play roles that feel authentic to us, bringing to a public stage elements of our personality that would once have been considered private matters.
Singer Sam Smith expressed this very eloquently on Instagram in 2019:
‘Today is a good day so here goes. I’ve decided I am changing my pronouns to THEY / THEM after a lifetime of being at war with my gender I’ve decided to embrace myself for who I am, inside and out… I’ve been very nervous about announcing this because I care too much about what people think but fuck it! I understand there will be many mistakes and mis gendering but all I ask is you please please try. I hope you can see me like I see myself now. Thank you.’
Why would a successful, wealthy singer with millions of fans care so much about what pronouns other people use to talk about them? Because Sam Smith wants others to ‘see me like I see myself now’.
As our freedom to choose how to live increases, our freedom to explore and express who we are expands with it. Identity has become a powerful but contradictory idea: it refers both to a unique, inner kernel, and an outward projection; both an essence and a performance for which we write the script. The quest for identity underpins the embrace of digital ‘personalisation’.
The word ‘persona’ itself comes from the masks actors used in Ancient Greek theatre. The mask would project the actor’s voice and give emphasis to his physical expressions. Now we each have an online persona, a way to project our identity, our vision of how we see ourselves out into the world.
The problem is that our sense of our identity is fragile. We want affirmation and reassurance from others that we are being seen as we want to be seen. That we’re not being misrecognised. Why else would people – including me, obviously – care so much how strangers respond to them on social media?
It is wonderful that we have so much more choice than our ancestors about how to live our lives. That freedom could give us a far stronger sense of who we are than any number of social-media likes or followers. Ask yourself this: on your deathbed, how will you want to look back at your own life? You’d want to be recognised for your achievements, not your identity or online persona. You’d want to be recognised for raising kind children, founding a movement for political change, inventing a new form of transport, composing a song that will be sung long after you are gone, or something else you have accomplished.
None of these things is easy. They’re not available at the twitch of a thumb on a smartphone screen. They demand effort, imagination and risk-taking.
Technology is not the root of the problem. But it does make it ever easier to choose from a menu of pre-selected options. At the same time, it makes it ever harder to find our own direction and build our own lives. Technology offers a constant menu of small choices and small social rewards, tiny ‘like’-button affirmations showing that we are recognised. The personalised, social-media world is designed to keep us coming back for more. Our screens are more seductive than the pond in which Narcissus gazed at his own reflection. We don’t just get our online persona reflected back to us, in all its edited perfection. We get instant assurance, in the form of likes, replies and personalised feeds, that others also find our reflection irresistible.
Instead of pointing the finger at technology, we need to look at why we want to live in this digital hall of mirrors. We don’t have a shrinking sense of who we are because of personalising technology. We have personalising technology because of our shrinking sense of who we are. The problem is not technology, but our own obsession with how others see us, and our insatiable need to be reassured that we are the person others see.
How to turn our energy away from the digital mirror of Narcissus, and out to the wider world, is the challenge that awaits all of us.
Timandra Harkness is a writer, performer and broadcaster. She presents the BBC Radio 4 series FutureProofing and How To Disagree.
The above is an edited extract from Timandra’s new book, Technology Is Not the Problem: The ultimate history of our relationship with technology and how it has shaped our world today, from smartphones to AI, published by HarperCollins.