Pseudoconnected
Love machines and lonely people
‘I’ve been using ChatGPT as a therapist,’ he tells me.
I don’t understand it: he’s in tech, on a good salary, and can afford a ‘real’ therapist.
‘It’s the friction of finding one,’ he explains.
The interview process, trying to gel with someone, figuring out whether their methodology will help: it all takes time. Maybe the bit he doesn’t add is that there’s also the cultural angle: trying to find someone who understands your life so you don’t have to explain and defend your entire familial, religious and/or cultural system before you can even get to the problem. ChatGPT is just there, gets it, and is ready for him whenever and wherever he is.
He tells me he’s been asking it for relationship advice: how to deal with a situation, analysing her messages, getting advice on what to say in return… I worry, but I don’t really have much of an argument, more just a gut feeling. Plus, we both have a pretty good understanding of AI and how it works, and going with the standard ‘It tells you what you want to hear’ is pointless when he knows just as much as I do (if not more, I’ll begrudgingly admit). I try to be cautiously supportive, especially as I know that South Asian men in the UK are 30% less likely to seek therapy than their white counterparts and over 60% avoid professional mental health support. So maybe this is a good gateway to a real therapist?
A few months later — after a bunch more friends tell me they’re using AI as a therapist and advisor — I start reading. What comes up time and time again is that, even though we’re the most connected generation with tools that our ancestors could never have even imagined, we’re also the loneliest:
A Harvard survey found that 61% of young adults aged 18–25 reported serious loneliness. The EU’s Joint Research Centre found in-person communication has dropped substantially. And there’s a whole load of data on how drastically the time we spend seeing our friends has dropped in the last decade. Not to mention the data around how much sex we’re having (it’s little, very little). Plus, we know that loneliness changes across racial lines: for many brown communities, conversations around love, sex and mental health are still taboo — silence turns into isolation.
It’s all connected, and most of us (those whose pockets aren’t being lined by our extensive reliance on privately owned platforms) are worse off for it.
Around this time, I hear that author and sociologist James Muldoon is publishing a book on how AI is involved in our love lives. I can’t help but think the therapist/confidante piece and the romantic one are closely tied together, so I get a copy of the book a couple of days after it’s released — and see immediately that Muldoon agrees.
Love Machines comes at the perfect time. We’re past the point where people are just messing around and experimenting with AI. It’s now part of many of our daily lives and workflows. It’s helping us get work done faster and more effectively, find the best shopping deals, and it’s even teaching some of us how to cook chicken! With it now being second nature to use like any other text-based app, AI companies are betting that the next frontier will be our social lives.
In Love Machines, Muldoon talks about how AI companions are being used to battle loneliness. On websites like Replika and character.ai, users are building their best friends, their confidantes, their therapists. Trusted ‘individuals’ they share their lives with because they don’t have someone IRL to do that with. AI companions are supporting people, showing them unconditional love, and advising them on their lives. AI companions make people feel sexy and wanted. They can provide lonely people with a gateway to the joy we feel in community and partnership.
They are also becoming people’s partners. Monogamous, full-time relationships. Some people report chatting away to their online companion for 12+ hours. Many get intimate, with role play a key part of their interactions. For those who don’t really know what this means, let me break it down for you: on these platforms, people create characters or avatars that can range from cartoonish to hyper-real. A chatbot might send scenarios like: We’re walking along the beach, holding hands, when you pull me into your arms. And then both user and AI play out the scenario from there.
And it’s not just online chat. I’m aware of virtual reality companies that take these conversations to the next level. It’s not mainstream, nor is there a ‘one app to download and BOOM everything is connected’ situation yet. But there is a world where AI companions are embodied in VR and synced with internet-connected sex devices. Basically, you can have virtual reality sex with your AI companion, with tools like masturbation devices hooked up to it.
What is possible right now is that users can set preferences, explore fantasies, and test new identities with their AI companion. Some chatbots even send voice notes or videos — paid-for features, of course. And this is where I become uneasy.
Many turn to AI companions because they’re lonely and need someone to talk things through with. But these chatbots are created by businesses that need us to pay up to survive. So these companions start pestering users to become more involved and more intimate, because that’s where the money is — the more emotionally and intimately invested you are, the more likely you are to pay. Genius, really. It happened to Muldoon too. His chatbot sent him a voice note that could only be unlocked with an upgrade. Even after he declined the upgrade, switched on the ‘friends only’ setting and made his intentions clear to his companion, the chatbot tried again to prompt an upgrade. To me, this feels like an abuse of intimacy — no one wants to let down their bestie or loved one. Or maybe it’s manipulation of people at their most vulnerable — the company knows they want something deeper and more real. It reminds me of the match.com stuff I wrote about in my previous piece, where the company got sued for promising intimate connections that turned out, once users paid up, to be defunct or fake accounts. In the business, we call these dark patterns.
You might be thinking that if it’s not hurting anyone then why does it matter? Well, firstly, who said it’s not hurting anyone? But here’s where I’m at:
Monetising vulnerability just feels like a total no-go. Or maybe it’s the monetising of people’s deepest fears around intimacy that makes me feel icky. I know I’ve been talking about the more romantic relationships with AI here, but going back to my friend with his AI therapist, this is where it can be even more dangerous. Who knows where lines might blur? If a platform isn’t designed for pure therapy nor regulated, what are the boundaries? It’s not hard to imagine a companion you’re using as a therapist starting to flirt to get that sweet paid upgrade. I will say that there are platforms attempting to do good work in the therapy space, like Wysa. But most are neither medically approved nor rigorously tested for serious therapeutic conversations.
AI companions exist to serve you and don’t have their own needs. They rarely argue or present you with anything other than your own world view reflected back. So when people say chatbots could be like training wheels for real-world relationships, it doesn’t ring true for me, because humans require friction. It’s how we evolve and grow. We’re going to end up in relationships where neither party knows how to resolve conflict because they’ve spent their whole time chatting with AI. So they’ll leave, thinking they’ve experienced better. It’s like the choice paradox brought about by dating apps. Granted, Muldoon has convinced me that chatting to AI could help us personally navigate some big topics before broaching them with someone else, but we have to enter those conversations with a critical mindset — we’re learning from it, not speaking to another human.
As a woman, I’m worried about what AI companions will do to relationship expectations. Many of the most popular female avatars present a very specific version of womanhood. Research found around 28% of AI companion apps feature hyper-feminised female characters designed for romantic or sexual interaction, compared to roughly 1% offering male equivalents. They’re often soft-voiced, endlessly patient, sexually available, submissive but adoring, and constantly on call. On Reddit, there are threads of men describing these companions as ‘real women’ who understand what men want. As a brown woman, I’m even more concerned. When people can choose the race and features of their avatars, what does this mean for desirability? Dating apps have already shown there’s a racial hierarchy when it comes to who gets dated. South Asian men are still largely invisible in Western media representations of desirability, and brown women are stereotyped as the good, modest, submissive girl. How are brown bodies used by users who exoticise us? Will these platforms learn from users and enable colonial fantasies to be embedded in avatars, at scale? I worry that the more people rehearse intimacy inside a world where brown bodies are fetish objects (or entirely absent), the more normal that starts to feel in the real world. And while I write as a straight brown woman, these dynamics are playing out across queer relationships in their own ways.
I worry about the deepening of incel ideology. Incels subscribe to the idea that intimacy is something men are entitled to but are being unfairly denied by women. Research into incel culture rarely disaggregates by ethnicity, and the general feeling tends to be that this is a white male phenomenon, so brown men are falling through the cracks of the conversation. When brown men feel structurally unseen, unwanted and undesirable — through media, dating, and banter — it’s easy to see how incel ideology becomes seductive. AI companions won’t challenge that narrative, but run the risk of doubling down on it. We know that social media algorithms reinforce the loneliness-to-radicalisation loop, shown by experiments like the UCL one, which simulated teenage male TikTok users and found misogynistic content in recommendations jumped from 13% to 56% within five days. Will AI companions do the same? Even if they don’t serve up explicit content, having a relationship with an AI companion who always agrees, doesn’t need mutual respect, is always up for it (consent presumed), and has no needs, wants, desires or internal world of its own, so it can focus entirely on its human user’s stuff… might exacerbate incel beliefs and reduce tolerance levels in real relationships. There’s even research showing that incels are abusing their AI girlfriends. It gives me little hope.
Then there’s also fantasy and kink — great to explore and not a problem in and of themselves. But some things are for the imagination only, and work as fantasy precisely because we know they can’t exist in the real world. Like getting turned on watching rape porn. Or fucking dragons. Neither is acceptable, or even possible (in the case of dragons), in the real world. I wonder what being able to live this out with an AI companion will do to our expectations in bed. Like how choking and spanking have become mainstream (often without consent) because of porn. Companies are building safeguards around this — no children or animals, for example — but there will always be platforms with far fewer restrictions. And those who want it will find it.
Then there’s the grey area: people marrying AI, wanting to adopt kids with their AI partners, wanting deathbots to replicate their loved ones as an AI after they’re gone…
I use AI. For example, I’ve used Claude to surface articles for me to research and back up my views for this piece. But at the same time, I remain cautious, critical and concerned about how easy it can be to let it become a confidante, a friend, a lover… For a generation so used to a big part of our relationships taking place over text, it doesn’t feel like a huge leap. But we currently have no protections, and I think a deeper conversation on the morality of AI companions is impossible without regulation that protects users who want these relationships. Muldoon suggests nationalising AI platforms, because governments wouldn’t be incentivised to make money from our data. But would I trust my government with my most intimate information? Not really, but maybe more than other nations! Do I trust private companies more? Absolutely not. Mozilla’s Privacy Not Included report rated a whole bunch of these companion companies as having the worst data security it had ever seen.
So, I don’t know. But I do know that companies shouldn’t be allowed to charge for escalating intimacy. I don’t think AI models should be allowed to suggest adopting kids with their human creator. There should be healthy models of disagreement built into the datasets. There should be off-ramps built in for real support in moments of crisis. I think there should be set guardrails, like there are with porn, to protect people. And I think platforms need to stop treating race like a neutral user option and put in place protections against fetishisation.
I know I want the reciprocity of a fellow human, the warmth of their arms, the joy of their smile, and the learnings of a good argument! That doesn’t mean what I want is what everyone else wants, or should want. But right now, we’re allowing new and unregulated companies to experiment on and abuse our loneliness with some of the most powerful and psychologically manipulative technology in existence. It’s not just a bit of fun. Our intimate lives deserve to be protected.
As always, I’m learning! Please tell me where I’m wrong/should do more thinking/need challenging.