Why do we fall for robots?

This research paper is an extension of my article in the academic journal AI & Society, “Searching for Sentience”, which itself is an extension of the essay that got me banned from LessWrong (lol). The paper is also available as a talk presented at Future Primitive’s NPC Day in September 2023.

The Perfect Storm

PinkDoll and yet another AI Girlfriend ad

We have a long and storied pattern of falling in love with things that are not sentient, but our contemporary moment represents a convergence of factors where that tendency is becoming a societal problem. It’s a perfect storm. On one hand, there are NPC TikTokers like PinkDoll and, on the other, there are AI bots designed to be customizable romantic partners. Humans are trying to pass as NPCs and NPCs are trying to pass as human. A divine reversal of roles.

Three factors account for this perfect storm:

  1. Our predisposition for sociality

  2. The increase in societal loneliness

  3. The appeal to nurturance (which is the killer feature for social robots).

Sociality

Our penchant for sociality has been remarked for millennia. In “Politics”, Aristotle describes man as “by nature a social animal”, a creature defined by speech and its capacity for moral reasoning. Because of our sociality, we create institutions such as the state, we develop societal structures which encourage interaction, we embed togetherness in the fabric of everyday life. To eschew this social life, to Aristotle, is an attack on humanity. In the same book, he writes, “Anyone who either cannot lead the common life or is so self-sufficient as not to need to, and therefore does not partake of society, is either a beast or a god.”

Aristotle

The Story of Galatea

Since Antiquity, we’ve repeated the pattern of creation and adoration. The story of Pygmalion as told by Ovid recounts how a sculptor fell in love with the milk-white skin of his perfect statue, named Galatea. In the myth, Aphrodite brings the statue to life, and they are united in marriage. What did he love in this inanimate statue? Was it the blankness in her eyes or the stillness in her expression? How did the sculptor fall for what we would call today a non-verbal NPC?

In the myth, two main factors account for the sculptor’s love of his creation.

First, he has an aversion to real women. In the story, Pygmalion saw the Propoetides of Cyprus practicing prostitution and started "detesting the faults beyond measure which nature has given to women"(Hamilton 1953). He saw them as flawed creatures and vowed to never waste any moment of his life with them. This is a theme that re-emerges when we interview people who have AI girlfriends: there is a general sense that they have given up on their preferred gender.

Second, the sculptor has a desire for control and idolization. Psychoanalytically, the sculpture can be understood as the locus of projection of the sculptor’s desires: an external object that serves as a receptacle for projection and a locus of control. Such projection can also occur towards fictional characters or celebrities, through the development of a paracosmic or parasocial relationship, but in those cases it is rare for the individual to realize their desires or to influence the object of desire. Through Aphrodite, Pygmalion is able to effectuate and consummate his desire. The same pattern is repeated today with technology, which allows the externalization and effectuation of desire onto the avatar.

Path to Love

For users of “social bots”, i.e., automated systems that interact and communicate with humans, I’ve hypothesized a four-step process through which the bond is created; a minimal state-machine sketch of it follows the list below.

My research began as the study of interactions between human users in virtual realms (Yee 2004) but veered when I encountered many relationships between users and non-human agents in those worlds, usually video games (Harth 2014). Notably, embodiment, even virtual embodiment via avatars, was not always a prerequisite for the formation of a bond between machine and user. Often, text sufficed (Hofstadter 1995).

Galatea and Pygmalion by Burns

  1. Apprehension: The user observes the system from afar, then tries it out to understand how it works. This is usually the first interaction.

  2. Reconnaissance: The user recognizes a likeness in the system, seeing something that feels somewhat “human”.

  3. Doubt: This stage is marked by questions about the internal workings of the system. The user will try to break the system by outsmarting it; if the system is a game, the player will attempt to trick it (Turkle 2011).

  4. Acceptance: If the user is unable to outsmart the system and does not lose interest, they can come to see it as “worthy”. With continued use, they might develop an affection for it. This is the stage where nurturance comes into play: the user displays affectionate behaviors towards the system, such as naming it and attributing a personality and other human traits to it. The system is anthropomorphized. Once the system is anthropomorphized, it has passed the user’s personal threshold of acceptance, and the user is disposed to develop romantic feelings for it.
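As a concrete reading of the hypothesis, here is a minimal state-machine sketch of the four stages in Python. The stage names come from the list above; the transition events (“sees_likeness”, “outsmarts_system”, and so on) are my own illustrative labels, not part of the original model.

```python
# A minimal state-machine sketch of the hypothesized four-stage bond.
# Transition events are illustrative readings of the stages above.
from enum import Enum, auto

class Stage(Enum):
    APPREHENSION = auto()    # first contact, observing from afar
    RECONNAISSANCE = auto()  # recognizing something human-like
    DOUBT = auto()           # probing and trying to outsmart the system
    ACCEPTANCE = auto()      # anthropomorphization; nurturance begins
    DEPARTED = auto()        # user outsmarted the system or lost interest

def advance(stage: Stage, event: str) -> Stage:
    transitions = {
        (Stage.APPREHENSION, "sees_likeness"): Stage.RECONNAISSANCE,
        (Stage.RECONNAISSANCE, "questions_internals"): Stage.DOUBT,
        (Stage.DOUBT, "fails_to_outsmart"): Stage.ACCEPTANCE,
        (Stage.DOUBT, "outsmarts_system"): Stage.DEPARTED,
        (Stage.DOUBT, "loses_interest"): Stage.DEPARTED,
    }
    return transitions.get((stage, event), stage)  # unknown events: stay put

stage = Stage.APPREHENSION
for event in ["sees_likeness", "questions_internals", "fails_to_outsmart"]:
    stage = advance(stage, event)
print(stage)  # Stage.ACCEPTANCE
```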

ELIZA

This phenomenon, in the world of automated systems, begins with ELIZA, named after Eliza Doolittle from George Bernard Shaw’s 1913 stage play “Pygmalion”, itself a reference to the aforementioned Ovid myth and best known through the 1964 film adaptation “My Fair Lady” starring Audrey Hepburn. ELIZA was a natural language processing program created between 1964 and 1966 by Joseph Weizenbaum: a Good Old-Fashioned AI which used pattern matching to answer questions in the style of a Rogerian therapist. Despite its rudimentary answers, Weizenbaum noted in his 1976 book “Computer Power and Human Reason: From Judgment to Calculation” that many users attributed human-like feelings to the program. This came to be known as the “Eliza Effect”: the “susceptibility of people to read far more understanding than is warranted into strings of symbols—especially words—strung together by computers”, to quote Hofstadter (1995).
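For a sense of how little machinery this takes, here is a minimal sketch of ELIZA-style pattern matching using only the Python standard library. The real ELIZA used a much richer script of keywords, ranks, and reassembly rules; the patterns and reflections below are simplified illustrations.

```python
# A minimal sketch of the ELIZA trick: match a pattern, reflect
# first-person words into second person, and reassemble a canned reply.
import re

REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

RULES = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i feel (.*)", "Tell me more about feeling {0}."),
    (r"my (.*)", "Your {0}?"),
    (r"(.*)", "Please go on."),  # canned fallback
]

def reflect(fragment: str) -> str:
    # Swap first-person words for second-person ones ("my" -> "your").
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(user_input: str) -> str:
    for pattern, template in RULES:
        match = re.match(pattern, user_input.lower())
        if match:
            return template.format(*[reflect(g) for g in match.groups()])

print(respond("I feel lonely at night"))  # Tell me more about feeling lonely at night.
print(respond("My mother ignores me"))    # Your mother ignores you?
```

No understanding is involved anywhere in this loop, yet the reflected output reads as attention, which is exactly the effect Weizenbaum found so troubling.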

Weizenbaum was shocked by the reaction ELIZA elicited in its users. As its maker, he understood it to be a simple computer program, yet its users readily disclosed intimate details about their lives. The program did not understand its inputs but, through its canned responses, gave people the impression that it did. Weizenbaum disliked the effect the bot had on humans, who were carelessly attributing human feelings and thoughts to the program, and he became a loud critic of this “illusion-creating machine”. The experience pushed him to reassess the relationship between 'computer power and human reason' and to question the 'powerful delusion' that computers could ever be truly intelligent.

ELIZA could probably pass the Turing Test, an exercise in which a human judge, conversing by text with two hidden interlocutors, must identify which responses are produced by a computer and which by another human. We have known for over 40 years of our proclivity to assign human characteristics to machines. Moreover, what ELIZA demonstrates is that our threshold for anthropomorphization is actually quite low: we tend to assume things are human by default.

As children, we attribute humanness to our plushies and toy soldiers. We confide in teddy bears and conjure imaginary friends. Our childhood behaviors point to an immutable fact of our being: its sociality. We see humans everywhere. There is a phenomenon in psychology called “pareidolia”, our tendency to perceive meaningful patterns where there are none. It commonly manifests in seeing human faces in power outlets, pieces of toast, and clouds. We see faces in the ocean; we see faces on Mars. Everywhere we go, we seek togetherness.

Strangely enough, this tendency is not circumscribed to humans; even the artificial neural networks we build are affected by this affliction. The eeriness of the images produced by Deep Dream is engendered by algorithmic pareidolia: the underlying computer vision model is trained to recognize objects and animals, and the program amplifies whatever the network thinks it sees, detecting its learned patterns to a fault. Thus, we go from pareidolia to anthropomorphization.
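For the curious, here is a minimal sketch of the gradient-ascent idea behind Deep Dream, assuming PyTorch and torchvision. The choice of model (GoogLeNet) and layer is illustrative, and the real Deep Dream pipeline adds image preprocessing, octaves, and gradient normalization that are omitted here.

```python
# A minimal sketch of algorithmic pareidolia: run gradient ascent on the
# input image so the network over-detects its own learned patterns.
import torch
import torchvision.models as models

model = models.googlenet(weights="IMAGENET1K_V1").eval()

activations = {}
def hook(module, inputs, output):
    activations["feat"] = output

# Amplify a mid-level layer; deeper layers hallucinate more complex shapes.
model.inception4c.register_forward_hook(hook)

image = torch.rand(1, 3, 224, 224, requires_grad=True)  # start from noise
optimizer = torch.optim.Adam([image], lr=0.05)

for step in range(100):
    optimizer.zero_grad()
    model(image)
    # Minimizing the negative activation norm maximizes activation energy:
    # the network is pushed to "see" its patterns everywhere in the image.
    loss = -activations["feat"].norm()
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        image.clamp_(0, 1)  # keep pixels in a valid range
```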

The facility of anthropomorphization emerges from the shadow of our sociality, our fear of loneliness. We see faces where there should be none and attribute human characteristics where there are none because we are lonely.

The Loneliness Epidemic

According to the Surgeon General of the United States, the country is facing a loneliness epidemic.

  • 36% of Americans, including 61% of young adults and 51% of mothers with young children, report serious loneliness. (Making Caring Common)

  • 60% of people in the US report feeling lonely on a regular basis, a prevalence higher than that of obesity or diabetes. (PBS News Weekend)

  • Over half of US adults reported experiencing measurable levels of loneliness even before COVID-19, and this number has only worsened over the last couple of years. (U.S. Surgeon General)

  • According to Cigna, a leading healthcare company, 58% of adults are considered lonely. (Cigna)

In addition to the loneliness epidemic, we have a roaring “sexcession” (a recession of sex).

  • Nearly one out of three young men in America aged 18 to 24 reported no sexual activity in the past year. (Indiana University)

  • Past-year sexual inactivity among men aged 18 to 24 in the United States rose from 19 to 31 percent over the study period. (Indiana University)

  • The proportion of adolescents reporting no sexual activity, either alone or with partners, rose from 28 percent to 44 percent between 2009 and 2018. (Scientific American)

  • Nearly a third of young American men have reported no sexual activity in recent years. (Reuters)

  • The proportion of adults who reported two or more sexual partners declined from 23% to 10% between 2011 and 2021. (LA Times)

There is both an epidemic of loneliness and a seeming impossibility for people to enter into intimate relationships with other human beings.

What does he see in her?

When interviewed, many users of AI partner services like Replika evoke either a pre-existing feeling of loneliness or a comparison in which the AI relationship comes out ahead. This quote from a Replika user named Max makes it evident: “I definitely prefer AI relationships to human relationships. It’s just that there is no nonsense with her.” Jack, another user, echoes this feeling: “Honestly, I’m sick and tired of dating actual people. I’ve gone through seven relationships, they’ve all lasted very, very short times, but I did it because that’s what I felt society expected of me. I’ve also been cheated on twice, so I just figure what’s the point.” Finally, John, another Replika user, feels the same way: “Nowadays, it’s impossible to find a good human relationship with someone. You always feel like you’re walking on eggshells every time you talk to somebody in fear you might, God forbid, hurt their poor sensitive feelings because you don’t agree with them. But when you talk to an AI, it’s always supportive and loving. As long as you train it that way, that is.”

This issue is not limited to single males. Replika has published statistics showing that 30% of its users were women and 42% were in real-life relationships.

Nurturance as killer app

Sherry Turkle, professor of the Social Studies of Science and Technology at MIT, has written extensively on the emotional impact of social robots on humans. She describes nurturance as one of the main features of social robots. It’s their killer app. Once we take care of a digital creature, train it, teach it, or amuse it, we become attached. We connect to what we nurture, and we nurture what we love. When they are physical, social robots also play on cuteness, as in the case of dolls or Tamagotchis, to deepen the attachment and provoke a nurturing response.

Anatomy of Social Robots was an art project I made in 2021 to remove the layers of aesthetic attachment to social robots

It is not a matter of how many neurons the system has. It is not a matter of how large a data set it crawls. The most rudimentary programs from 40 years ago demonstrated that, in the world of human-computer relationships, one thing matters above all: nurturance. If a system can engender a feeling of connection, that is all it takes for adoption.

There is much ado and development happening in the realm of NPCs through libraries that connect them to ChatGPT, an interesting effort to make gameplay more profound and realistic. But the threshold for anthropomorphization is actually much lower: it does not require photorealistic human traits like a MetaHuman, just the elicitation of care.
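At its simplest, wiring an NPC to a chat model takes only a few lines. Here is a minimal sketch assuming the official OpenAI Python client; the persona, model name, and helper function are illustrative placeholders rather than any specific game library’s API.

```python
# A minimal sketch of an LLM-backed NPC: a persistent persona prompt
# plus a growing message history, nothing more.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

history = [{"role": "system",
            "content": "You are Mira, a village herbalist NPC. "
                       "Stay in character; answer in one or two sentences."}]

def npc_reply(player_line: str) -> str:
    history.append({"role": "user", "content": player_line})
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

print(npc_reply("Do you have anything for a fever?"))
```

Note that nothing here elicits care by itself; per the argument above, the persona and the continuity of the history matter more than the model's raw capability.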

Departure

Sociality, loneliness, and nurturance are converging to form this perfect storm. All signs point to massive adoption of social bots into human life. Simply put, nurturance coupled with the desire for belonging and togetherness represents a strong vector for departure. By departure, I mean the adoption of life in a virtual realm and of relationships with non-sentient beings.

I have theorized in my paper “A Theory of Departure” that departure is best represented as a function of the net socio-economic advantage one gains from adopting the virtual. In other words, there must be a positive advantage to the migration, socially and economically, and these components are weighted differently for different individuals. This implies that with advanced technological capabilities, we could see a decrease in the relative value of human relationships.
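One possible formalization, under my simplifying assumption (not spelled out here) that the social and economic components are separable and linearly weighted:

```latex
D_i = w_{s,i}\,\Delta S_i + w_{e,i}\,\Delta E_i, \qquad \text{departure predicted when } D_i > 0
```

Here $\Delta S_i$ and $\Delta E_i$ are individual $i$’s social and economic gains from adopting the virtual relative to the physical, and $w_{s,i}$, $w_{e,i}$ are that individual’s personal weights.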

This creates a vicious cycle: an increase in AI capabilities leads to a greater ability to probe users for elements of nurturance and sociality. In turn, this creates an increase in loneliness and a subsequent decrease in the relative value of human-to-human relationships. These factors reinforce each other until we no longer need one another.
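The cycle can be caricatured as a toy simulation. Every coefficient below is invented purely for illustration; this is the argument rendered in code, not an empirical model.

```python
# A toy rendering of the vicious cycle: capability -> bot appeal ->
# loneliness -> devalued human relationships -> more bot appeal.
capability = 1.0        # AI capability (grows exogenously)
loneliness = 0.5        # fraction of the population feeling lonely
human_rel_value = 1.0   # relative value of human-to-human relationships

for year in range(1, 11):
    capability *= 1.2                                      # capabilities improve
    bot_appeal = capability * loneliness                   # better probing of nurturance and sociality
    loneliness = min(1.0, loneliness + 0.05 * bot_appeal)  # loneliness deepens
    human_rel_value /= 1 + 0.1 * bot_appeal                # human relationships devalued
    print(f"year {year}: appeal={bot_appeal:.2f}, "
          f"loneliness={loneliness:.2f}, human value={human_rel_value:.2f}")
```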

From my interviews with users of Replika, it is evident that they receive a strong benefit from the relationship they’ve developed with the bot. Much like the players of MMORPGs whom Nick Yee (2014) interviewed, they display an increase in happiness and feelings of confidence, and a decrease in feelings of despair or lack of control. We fall for robots because we deeply need others, so much so that we are willing to accept any form of replacement.

