With technology fused into almost every aspect of modern life, artificial intelligence is now moving into the world of couples therapy. A new, free, AI-powered service called Mei is quickly becoming popular with people looking for help in the love department. Mei brings the skills of established relationship experts such as Jillian Turecki and Dr. Nicole LePera to the table, giving its users a combination of expertise and companionship.
Mei can be a helpful guide for anyone, especially those who feel intimidated or confused about communicating in their relationships. According to Dr. Lalitaa Suglani, a psychologist and relationship expert, Mei can be particularly beneficial for individuals navigating complex emotional landscapes, whether that means helping users craft texts, make sense of perplexing messages, or weigh a second opinion on relationship advice.
Dr. Suglani also points to AI's ability to surface difficult issues. She explains that more than 50 percent of the questions received through Mei are about sex, a topic many people are reluctant to raise even with friends or therapists. "… it can be used as a journaling prompt or a kind of reflective space," she explains. Talking to a tool like this can be very helpful, she stresses, but the key thing is that it should not become a substitute for real social interaction.
Mei employs a set of "guardrails" to make sure it is used safely. According to the company, conversations are saved temporarily for quality assurance and deleted after 30 days. Importantly, Mei doesn't ask for any identifying information besides an email address, which mitigates users' privacy fears.
AI-powered romance counseling is especially popular with Generation Z Americans: almost one in two has turned to large language models (LLMs), such as ChatGPT, for relationship advice. Rachel, a ChatGPT user, shared her experience: "Around January I had been on a date with a guy and I didn't find him physically attractive but we get on really well so I asked it if it was worth going on another date. I knew deep down they would agree, because I had already read their books. It was refreshing to get advice that directly addressed the reality of my scenario!"
Rachel also pointed to the therapeutic language the AI automatically uses, citing this example: "ChatGPT would respond with things like, 'Oh, what a self-aware question. You must be emotionally mature for dealing with this. Here are some suggestions.'" She loved the simple yet powerful reminders it provided. As she shared, "It just taught me to do things on my own terms and I don't think I followed that advice literally enough."
Another user, Corinne, asked ChatGPT how to break up with a girlfriend, and then asked for tips on dating girls. She explained that despite the helpful information the AI generated, she worries about people becoming too dependent on the technology. Corinne cautioned that it could encourage people to exit relationships too quickly, or push them into conversations they are not yet prepared for, largely because ChatGPT tends to simply regurgitate what it assumes the user wants to hear.
Es Lee, the founder of Mei, is aware of the concerns about safety and about creating dependency on AI chatbots for emotional support. He is quick to point out that Mei is intended to give users a pause, encouraging intentional responses over instinctive reactions in their relationships. "The idea is to allow people to instantly seek help to navigate relationships because not everyone can talk to friends or family for fear of judgment," Lee stated.
As AI races into the future, experts such as Dr. Suglani are urging users of these tools to proceed with care. "LLMs are fine-tuned to be helpful and agreeable and repeat back what you are sharing," she said. Without careful attention to how prompts are framed, this tendency can unintentionally validate harmful behaviors or reinforce distorted narratives.
Dr. Suglani expressed concern that frequent reliance on AI could lead individuals to outsource their intuition and emotional intelligence: “If someone turns to an LLM every time they’re unsure how to respond or feel emotionally exposed, they might start outsourcing their intuition, emotional language, and sense of relational self.”
OpenAI has acknowledged these risks and is working towards ensuring that AI responses are appropriate and guided by expert insights. The organization prides itself on directing users toward professional assistance when necessary. It explicitly encourages users to take breaks when having long conversations with the AI.