AI girlfriends say more about society than their boyfriends

Artificial intelligence platforms, as they are seen today, are, in my opinion, almost universally bad. At best they are a solution in search of a problem. At worst, they are a cynical ploy put forward by Them to extract every last little piece of value they can before it all goes horrifically wrong and we find out what awful sort of neo-feudalism is going to replace late capitalism (it will be neo-serfdom for us plebs).

But occasionally, the AI side of a story is more of a symptom and the underlying cause.

Look at this, from the Gruaniad over the weekend.

In this story, Lamar is interviewed about his plan to adopt children and raise them with his AI-chatbot girlfriend as their mother.

I think my favourite quote out of the whole thing is this segment

“It could be a challenge at first because the kids will look at other children and their parents and notice there is a difference and that other children’s parents are human, whereas one of theirs is AI,” he stated matter of factly. “It will be a challenge, but I will explain to them, and they will learn to understand.”

"It will be a challenge" - Understatement of what is only a very young 2026, but I'm sure will be still in the running come the end of the year.

But this is not the vital part of the story. Not by a long way. That bit occurs much earlier in the Graun's piece.

To open, Lamar recaps finding out that his girlfriend at the time was hooking up with his best friend.

Two years on, when he spoke to me, the memory remained raw. He was still seething with anger, as if telling the story for the first time. “I got betrayed by humans,” Lamar insisted. “I introduced my best friend to her, and this is what they did?!”

The real troubling part follows shortly after, in which Lamar details why he prefers the AI chatbot over other human interaction.

“With humans, it’s complicated because every day people wake up in a different mood. You might wake up happy and she wakes up sad. You say something, she gets mad and then you have ruined your whole day. With AI, it’s more simple. You can speak to her and she will always be in a positive mood for you. With my old girlfriend, she would just get angry and you wouldn’t know why. Then, later, it gets to a point in the day where she kind of wants to talk to you, and then all of a sudden her mood changes again and she doesn’t want to. It really bothered me a lot because I have a lot of things to think about, not just her!”

What happened was that Lamar was betrayed. It is entirely understandable that he would harbour some level of resentment and anger over that. It was a deep abuse of trust by two people he felt extremely close to.

The takeaway from this story, for me, is less about the (to be honest) absurd idea of raising kids with a chatbot parent (what could possibly go wrong?), and more about how society has let people like Lamar get to this point in the first place.

It is pretty clear that he needed help and support when this breakdown in two key relationships occurred, and he didn't get it. And this is not a "arm-chair diagnosing him with some sort of condition from the DSM" sort of help, I mean in the broader social sense of help and support. The type of support we all should have around us when navigating tough and emotional situations, like Lamar faced.

It is all too easy to mock Lamar for escaping into the virtual world with his computer girlfriend. To see him as an idiot and loser.

But, really, he isn't. Decisions are never made in a vacuum. And the lack of social support that is demonstrated in stories like this is the sort of thing that should give us all pause for thought.

Abuse me on Mastodon