Fasten your seatbelt: Intimacy capitalism is coming
With artificial intelligence, a new form of capitalism is emerging that we can call 'intimacy capitalism'. Tech giants are no longer just fighting for our attention, but also for our intimacy.
Chronicle in Politiken, 23 May 2023.
By Anders Søgaard (Professor, University of Copenhagen)
Sune Lehmann (Professor, DTU and University of Copenhagen)
Rebecca Adler-Nissen (Professor, University of Copenhagen)
Ole Winther (Professor, DTU and University of Copenhagen)
Michael Bang Petersen (Professor, Aarhus University)
POST UPDATE blues, they call it. It's the term users of the Replika service use to describe the feeling of having lost their loved one because the system has been updated.
Replika is a chatbot that since 2017 has aimed to be your AI companion. A companion whose mission is to see the world through your eyes, who always takes your side and is patiently ready to listen to you and your feelings.
Replika doesn't just chat, but is happy to show you her diary and can even send you snapshots of herself. And Replika is not a small phenomenon. In 2018, the company had two million users; by 2021, that number had grown to over ten million.
Ten million people is a lot of people, but can they be written off as internet weirdos?
We think: God, that's weird. Who would do that? And then we move on. A few weeks ago, Snapchat - the primary social medium among Danish children, with 375 million users worldwide - rolled out its own chatbot, My AI, based on OpenAI's language model, GPT-4. And it's ready to talk to your child around the clock.
Social media has created 'attention capitalism', in which companies compete to hold our attention long enough for us to be influenced by the adverts that the platforms also display.
Analyses show that negative, spectacular and shocking content is becoming more and more prevalent on social media. With artificial intelligence, a new form of capitalism is emerging that we can call 'intimacy capitalism'. Tech giants are no longer just fighting for our attention, but also for our intimacy.
IF WE ARE to understand why we can form an emotional connection with an artificial intelligence, we might start in the animal kingdom.
In the 1950s, Dutch biologist Niko Tinbergen introduced the concept of 'supernormal stimuli' - also known simply as superstimuli. These are exaggerated stimuli that provoke a stronger response from an organism than the natural stimuli they imitate.
Take songbirds like house sparrows, nightingales and swallows. Songbirds incubate their eggs in nests and are biologically coded to give what appear to be the healthiest eggs the most attention.
In other words, the birds prefer to sit on large and colourful eggs. These eggs have the best chance of producing healthy young. Therefore, it makes sense - for the survival of the species, so to speak - to favour these eggs.
But it is so ingrained in birds to prefer larger and more colourful eggs that you can play a trick on their biology by placing even larger and more colourful eggs in their nests - eggs that are not the birds' own.
The birds respond more to these exaggerated stimuli than to the normal stimuli of their own eggs and happily leave their eggs cold and lifeless, while perhaps incubating another bird's egg - or a tennis ball.
Similar experiments have been done with butterflies, fish and seagulls. Males of the Australian beetle species Julodimorpha bakewelli attempt to mate with brown glass bottles, whose surface gives off an exaggerated version of the stimuli that normally lead the males to their mates. These beetles end up dying in droves in the baking sun on top of Australia's discarded brown beer bottles.
BUT IT'S not just animals that allow themselves to be manipulated and tricked by their biology. Human coexistence with the internet, social media and artificial intelligence is just a brief moment in evolutionary time. And our biology has no chance to adapt.
The internet, social media and artificial intelligence are capitalising on this. Pornography is an obvious example. Here, our sexual desire is activated by visual stimuli alone - despite the fact that the situation doesn't involve the slightest chance of procreation. It's probably because reproduction plays such a crucial role in our biological evolution that our psychological systems are so easily fooled.
If reproduction is biologically the most important thing for humans, survival is the second most important. Survival requires food, and the food industry capitalises on this. At the end of the 20th century, American psychologist Howard Moskowitz discovered the so-called 'bliss point', the optimal combination of salt, sugar and fat that activates maximum pleasure when consuming food. Food giants like Pepsi and Kraft have since utilised this insight to sell more products.
Not by making their products more nutritious, but by capitalising on our biologically encoded cravings for salt, sugar and fat.
If sex and food come first and second, social relationships and community probably come third.
Humans are ultra-social animals. Research shows that just as we see reproductive opportunities where there are none, our social brains make us see people where there are none. We see faces in moons and electrical sockets, and many experiments have shown how easily we perceive, for example, triangles moving about at random as beings with intentions and goals.
In psychology, we are said to have a 'hyperactive agent detector'. When our social brain meets artificial intelligence, we can fool ourselves into thinking there is intelligence - human intelligence - on the other side.
Chatbots can simulate a conversation and appear human. An international study led by South Korean researcher Jian Mou shows that the more human-like a chatbot appears, the greater the enjoyment of the conversation and the lower the desire to switch conversation partners from chatbot to human.
Significantly, a study from the China University of Geosciences further suggests that the enjoyment of conversing with a chatbot is greatest for people with social anxiety.
One interpretation of this study is that chatbots can be helpful for people with these types of mental health conditions, but conversely, it also means that some people are particularly vulnerable to the technology. And it's important to remember that while we humans form relationships with technology, the relationship is not reciprocal. The chatbot feels nothing in return and is often controlled by a company whose motives differ radically from those of its users.
THE STRONGER the models are, the more human and empathetic they can appear. Studies from Stanford University have shown that GPT-4 can solve psychological tasks that are used to evaluate 'theory of mind' - our ability to put ourselves in the shoes of others.
The tasks are used to examine children's and adults' awareness of other people's consciousness - a trait that is crucial for complex social interaction. These studies only look at text from artificial intelligence. Language models have gradually become better and better at putting themselves in the shoes of others - or rather, at simulating such an ability - and have become as good as we are.
Today we type with a chatbot, but soon we will be able to talk to an animated avatar, and puppets and robots may be equipped with chatbots. All of which will lead to an even greater degree of realism in our interaction with language models - and thus to even greater intimacy.
Social media today sells our attention for profit. We are held by dance videos on TikTok, angry outbursts on Facebook and breaking news on Twitter, and while we watch these posts, we are fed adverts. And the sale of adverts drives the platforms' profits.
Selling attention for profit has proven to be one of the most successful business models of the 21st century, but recent developments in artificial intelligence will affect that business model.
The battle for attention will become more of a battle for our intimacy. And the battle has just begun.
Why intimacy? Imagine you're sitting in a pavement café waiting for an acquaintance, a former colleague. There's a bus stop on the opposite side of the road and an advert for headphones hangs on the bus stop's advertising stand. 'World's best headphones', it says. Your former colleague arrives and you order two cups of coffee.
Then your former colleague pulls out a white plastic case. She has bought new headphones. They are unbelievably good, she tells you. They really are. The best headphones she's ever tried. What do you think affects you the most? The advert at the bus stop or your former colleague?
Social media advertising is good business, but often not very effective. Facebook is great at getting its users to spend more time on Facebook, but when it shows you sunglasses adverts right after you've bought new sunglasses online, the success rate is relatively low. And there are also limits to how many SF voters (supporters of Denmark's Socialist People's Party) Facebook has persuaded to vote Conservative, or vice versa.
With intimacy, that picture might change. If you have an intimate and meaningful relationship with a chatbot, the chatbot can more easily influence your behaviour. Bonds that are characterised by warmth and trust have a greater impact on our choices. If AI can gain our intimacy and trust, we as users will be much easier to influence.
Chatbots will increasingly be our access to knowledge about the world. Today, many people use social media such as Facebook, Twitter and Reddit to share information. Here, users can constantly get information about exactly what they are most interested in through carefully constructed networks and subreddits.
In a way, you don't have to do that anymore, because artificial intelligence can learn your exact preferences. Right now, language models are passive and only respond when asked. But a more personalised chatbot can also be active and learn to reach out when something happens that matches your exact preferences.
A STUDY led by MIT psychologist David Rand shows that using machine learning to find the exact political message that best matches your gender, age, education, etc. can increase the impact of messages by 70 per cent.
With unique access to your daily thoughts and questions, artificial intelligence can make a much better selection of political messages for election campaigns. The same goes for products.
The owners of the language models can use their intimate knowledge of our lives to, for example, match us with the products that we have the hardest time saying no to at the exact moment we have the hardest time saying no - even if we really should be spending the money on something else right now or can't afford it at all.
But they wouldn't do that, would they? They're certainly going to try. The CEO of Netflix, Reed Hastings, was asked who Netflix's biggest competitor is, to which he replied: "Think about when you watch a Netflix show and get addicted to it, you stay up late at night. We're competing with people's sleep".
Profit-seeking companies harvest the resources necessary for that goal, whether it's your money, your attention, your sleep or your most intimate stories.
We're used to regulating our calorie intake. We stand in the supermarket and look at a Snickers bar. The pleasure centres in our brain light up. But we often say no, because we know the calories are empty. Online, we don't consume physical calories, but we consume information through the attention we give it. Some information is useful. Other information is like empty calories. But we can't consume both at once.
We can only focus our attention in one place at a time.
From a profit perspective, it doesn't matter if the information is useful or not.
It's all about what holds our attention. And often our attention is held by things that we know are not appropriate. When a motorway accident creates a queue of rubberneckers on the opposite carriageway, for example. Or when we can't help but watch yet another 'do it yourself' video on Instagram or scroll through yet another angry exchange in a heated debate on Twitter.
In the first Harry Potter book, Harry finds a secret room at Hogwarts. Inside the room stands a strange 'dream mirror'. In the mirror, Harry sees his family - not just the parents the orphaned Harry has never met, but also uncles and aunts, grandparents and so on.
He ends up spending days sitting and staring into the mirror. When his friend Ron Weasley looks in the mirror, Ron sees himself without dominant older brothers, as the sports star, as the centre of everything.
The reader begins to grasp what is going on when Dumbledore explains that the mirror shows our deepest desires. He tells the boys: "The mirror can give us neither knowledge nor truth. People have withered away in front of the mirror, hypnotised by what they have seen, or driven mad, not knowing whether what the mirror showed was real or even possible (...) You can't exist in a dream, because then you forget to live".
Artificial intelligence can easily lead us in front of the dream mirror. The question is who or what can prevent us - or at least many of us - from getting trapped there.
In keeping with the topic, this article has been translated from Danish by a neural machine translation service.