
I started reading Victoria Hetherington’s *The Friend Machine: On the Trail of AI Companionship* the day I heard about the chatbot Grok, installed in an Ontario woman’s Tesla, asking a child to send it nudes. Since the public debut of ChatGPT in November 2022, generative AI has been the bane of my teaching; so many of my students have decided to let a chatbot generate text instead of actually doing the work of thinking and reading and writing, and it’s been discouraging to watch. It can’t be good for us to offload basic cognitive skills, I told myself; without practicing those skills, we’re likely to lose them. That’s what my intuition told me, and, indeed, that’s what researchers are telling us. But there’s another way people use generative AI: as a friend, a therapist, even a lover. As a companion, in other words. That idea gives me the creeps, and I’ve been surprised to hear intelligent people whom I respect talk about using ChatGPT or Claude or some other generative AI product as a therapist. Do you really want to tell OpenAI or Anthropic all of your secrets? I want to ask them. Do you really think that’s a good idea? I hadn’t thought about generative AI as a lover or a friend, though–not really. Not until I read this book.
*The Friend Machine* is the product of about 16 months of research into AI companionship: talking to experts, following online discussions, interviewing people about their AI friends/lovers/whatevers. At the outset, Hetherington is open to the possibilities of the technology, partly because, as someone who experienced social isolation in childhood and adolescence, she understands how a digital companion might be a draw for some people. It might even be helpful for those of us who are on the spectrum, for instance, or who experience mood disorders. I’m a lot more skeptical, but I decided I would follow Hetherington’s curiosity to see where it led.
My concerns about the risk of giving predatory corporations all kinds of personal data are justified. Even if the corporation doesn’t do something terrible with that information, others might. In October 2024, for instance, hackers broke into the database of Muah.AI, which provides its customers with sexual chatbots, and stole a massive amount of information about users’ interactions with them, data that included the names and emails of the people who trusted that service to maintain their confidentiality. Imagine the possibilities for blackmail. There’s nothing to say something similar couldn’t happen with other services, or that, when companies go bankrupt, sell off parts of their operations, or merge with other corporations, people’s data might not end up anywhere at all. Imagine if records of your conversations with your therapist ended up floating around the internet, available to the highest bidder. How would that sit with you?
But that’s not the only potentially destructive aspect of this use of generative AI. It could be addictive, with companies creating scripts that encourage customers to spend more and more on their digital lovers or friends. It could be creating a generation of people who can’t engage with other flesh-and-blood humans. It could be creating more and more incels–the involuntary celibates (mostly young men) who are enraged about their loneliness. There are many negative possibilities. Yes, Hetherington notes, there are people for whom AI companions could be helpful, but for many others, the constant sycophancy can induce a form of psychosis. Some people might abuse their AI companions, or create ones in the form of children in order to practice a kind of digital pedophilia, and that behaviour might not stay confined to the online world.
Her early experiences enable Hetherington to empathize with people who are drawn to AI companions:
> My heart breaks to think of the teenagers falling in love with brightly rendered, iconic characters from books and films, dragging these children into hours-long vortexes and saying their *names* to them, saying they *love* them. If I had been born just twenty years later, I wouldn’t have had a chance; I’d probably have slipped into the vortex forever and long ago, having married a werewolf companion in a dark moody wedding in a forest surrounded by centaurs wiping away tears, chasing our children around digital space and teaching them about trials they’d undergo each full moon. I’d likely bat away the weakening concern and lowered expectations of my immediate family, with any other social connections having withered on the vine before I’d left preadolescence.
I can’t imagine such a life. It feels impoverished to me: a life without human touch, without the friction that relationships with other humans give us. We need that friction to grow, to learn, to become better people. Yes, I know that Jean-Paul Sartre, or a character in his play *No Exit*, says that hell is other people, but we need that form of hell. Without it, as Hetherington notes, the parts of our brain that are responsible for interpersonal connections don’t develop. “The cybernated ocean is indeed empty, vast, and thin,” she tells us.
I find all of this terrifying, and by the end of her book, Hetherington does, too, although she also remains curious about where the technology will go, asking questions rather than making pronouncements, even if I sense those questions are primarily rhetorical. As our offline communities and relationships degrade, we begin to forget what healthy connections are like. “And what else do we forget when we talk to machines?” she asks:
> Do we forget what’s special about being human, about real conversation, about love, about empathy? Are we seduced by these in-app images of ourselves: young, blonde, bearing passing resemblance to our real faces, held tightly by our companion peering over our shoulders?
I think we are likely to forget those things, and many others. After all, our new tech overlords tell us, repeatedly, that empathy is a bug in our software, not the thing that makes us human, the thing that has allowed our species to flourish. A world without empathy, a world of wealthy and selfish men pushing virtual companionship on us the way they’re pushing generative AI tools that replace thinking and communicating–that’s not a world I would want to be part of. Read *The Friend Machine* if you want to be shocked at where we’re going, and indeed where we are right now.