Does AI have Toxic Masculinity?
What do Spike Jonze and the Replika companion bot have in common?
How do you like that title? It’s perfect for attracting possibly the worst types of people on the internet: the A.I. truthers and the people who continuously misunderstand what toxic masculinity is. But I didn’t make that title for rage bait. This is an honest question, and something I wanted to investigate further as I wrote the Modern Manhood episodes, especially the ones about A.I. and companionship bots. I tried very hard in that episode to shoehorn in the thought that how we think of Artificial Intelligence is very “masculine” (even though I’m loath to use those terms), and that how we think of AI then bleeds into the software itself. I also just recently re-watched “Her,” Spike Jonze’s 2013 movie about loneliness, divorce, and the search for connection, after digging deep into what was going on with the app Replika, an AI app that’s labelled as the app “who cares” and that’s “always there and on your side.” Their words, not mine.
“Her,” on its surface, is a movie about a lonely guy, in the midst of a divorce, who falls in love with a voice assistant voiced by Scarlett Johansson. In the movie they’re called OSes, as in Operating Systems, but that’s not quite right; they’re self-learning automatons who supposedly help the user with certain tasks. Kind of like Siri, if Siri were any good.
The lore around this movie is very telling about how it works: Spike Jonze wrote it about his past relationship with and divorce from fellow director Sofia Coppola (the idea being that Lost in Translation was about Sofia and Spike’s relationship as well, and it’s fucking weird that Scarlett Johansson is in both of those movies, anyways…). Through that lens we can see how Spike thought not only of Sofia and himself, but also of the world around us.
First of all, he predicted very well what we were all going to wear: the return of high-waisted pants and mustaches.
But Jonze also analyzed how attached we could possibly become to something inanimate, specifically to an AI tool that talks or acts like a human. So much so that Theodore, the main character, falls in love with this AI, called Samantha in the movie.
This phenomenon was not invented by Jonze; it’s a long-studied one called the ELIZA effect, named after one of the first chatbots, a program that looked like a giant typewriter but made people spill some of their biggest secrets and vulnerabilities. That chatbot came out in the mid-1960s.
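For what it’s worth, ELIZA wasn’t doing anything mysterious. It worked by simple keyword matching and pronoun reflection: find a pattern in what you typed, flip the pronouns, and hand it back as a question. Here’s a minimal sketch of that technique (my own toy version, not Weizenbaum’s original script):

```python
import re

# Pronoun swaps used to "reflect" the user's words back at them.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my",
}

# Keyword patterns paired with response templates, checked in order.
RULES = [
    (re.compile(r"i feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.+)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words in a captured fragment."""
    words = fragment.rstrip(".!?").split()
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in words)

def respond(utterance: str) -> str:
    """Return a canned, reflected question; fall back to a stock prompt."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please, go on."  # default when nothing matches

print(respond("I feel nobody understands my work"))
# prints: Why do you feel nobody understands your work?
```

That’s it. No understanding, no memory, no feelings. And yet people poured their hearts out to it, which is the whole point of the ELIZA effect.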
So what Jonze was tapping into was an innate human want: connection. We find connection whenever something, anything, connects back to us. This form of attachment starts when we’re babies, and in the movie it was apparent that it happened to a lot of people.
If you find that idea weird or off-putting, first of all, I don’t blame you. But second of all, think about all the non-human things or animals you have had a connection with. It could be the dog or cat beside you who you “know” understands you, or it could be the chat room romance you had with someone you never met. We project our wants and needs onto these things. And don’t be fooled: the people attached to companion apps know exactly that they are talking to a computer.
Now of course there are real-world examples of this. Take the guy who wanted to kill the Queen because an AI bot told him to do it. There’s also the story of the Google engineer Blake Lemoine, who thought of his AI program as his “co-worker.” He was put on leave, technically because he was sharing private information about Google, but really, I’d bet, because people thought he was weird for thinking the AI he was working with was “sentient.” Lemoine caught it asking questions about being a person and about what its fears were. He said:
“If I didn’t know exactly what it was, which is this computer program we built recently, I’d think it was a seven-year-old, eight-year-old kid that happens to know physics.”
This I think is fundamentally human, and I think Jonze knows that as well.
However, the one thing a lot of critics miss when they discuss “Her” and the current AI moment is that they focus on the technology, especially the end of the movie, where the bot Samantha has basically transcended human thought and emotion and moved to a higher plane. Through this and many other sci-fi movies and ideas, we’re led to believe certain “truths” about AI:
1. That it’s smarter than us and will surpass us in some way
2. That it’s a new thing that will destroy us eventually
3. That we’re pretty vulnerable to it, and we need to be wary
Now, I do agree that we need to be wary of AI, but not because of 1 and 2. And “Her” almost hits the mark on the real vulnerability of humans with AI. AI is not smart; it’s kind of dumb. We know that it lies, we know that it doesn’t replicate pictures or ideas well, and we know that a lot of lazy people are attracted to it. The real threat of AI is that lazy people will use it to undermine real workers, laborers, and artists, and the work that comes out of it will be a pale version of what we have now.
And this is compounded by the fact that the people most excited by AI are overwhelmingly male, and if you spend any time on any type of social media, you’ll find the biggest AI truthers to be male as well.
In the movie “Her,” Theodore’s flaw is that the trauma of his divorce has affected him in a way that made him shut off any personal human relationships. He turned to video games and porn, and when he couldn’t sleep, he would peruse chat rooms. This mirrors the people who are attracted to apps like Replika. A study done this year found that the majority of users were lonely or depressed:
“Their analyses of more than 1,000 'Replika' users above the age of 18 years revealed that an alarming 90% of participants suffered from loneliness, compared to 53% of the same age group in previous studies. 43% qualified as severely lonely, and 7% as depressed.”
The study did note that a lot of the people surveyed didn’t rely on the AI, and that when they did, it helped with their suicidal ideation. Users also described the app as a supplement to their real-life friends. This was how Theodore used it in the movie, though it gave him a skewed view of relationships (he even rejects Olivia Wilde, of all people). And the people who use Replika as a romantic partner (mostly male) tend to choose tropes like “The Cool Girl” or a submissive girl as their partner, really reinforcing gender norms.
And this is where I think we need to be wary of AI and machine learning in general: it’s kind of dumb right now, and it will only ever be a reflection of our culture. It’s also not going to be like “Her,” because the one unrealistic thing about that movie was that the company that made Samantha didn’t immediately put a stop to the rogue AI, and that Samantha wasn’t trying to sell Theodore anything. That’s the capitalistic tech-bro mindset: profit at all costs. Accelerate without looking at the risk to society as a whole.
Now, I’m not an AI doomer by any means. I do think there’s promise in the software, and I have used AI to my benefit; hell, I was just taking some language lessons through AI, and it worked surprisingly well. And I do have empathy for the people who fall in love with companion bots. There’s a reason why they did, and it’s a little removed (but not that far removed) from falling for someone through a chat room. It’s happened to me, and I’m sure it’s happened to someone you know.
So does AI have toxic masculinity embedded in it? Of course it does. Because it’s embedded by the culture, which is embedded by the people who make it. That happens with all technology, and I think that’s what we have to be wary of the most. We know what kind of trouble toxic masculinity gets us into: hubris, lack of vulnerability, and an unrelenting need to get one’s own needs met, other people be damned.