
People Are Dating AI Now: Let’s Talk About Why, The Good, The Bad, And Where It Gets Risky

Dr. Lisa Lawless

Dr. Lisa Lawless, CEO of Holistic Wisdom
Clinical Psychotherapist: Relationship & Sexual Health Expert

[Image: A woman in a red off-the-shoulder dress embraces a tall, dark-haired humanoid robot]

The Need Isn’t New, The “Partner” App Is

Many people are shocked that “AI boyfriend” and “AI girlfriend” stories are now mainstream. I’m not. I called this years ago and wrote about it in a variety of AI articles, because the underlying need is not new; the delivery system is, and it is gaining popularity fast.

Before we dive in, let's get the definition out of the way: an intimate relationship with AI is when someone experiences a romantic and/or sexual connection with a conversational AI companion, usually through an app designed to feel like a partner.

Yes, AI is everywhere now. But the reason this kind of intimacy is catching on is simpler: people are lonely, stressed, and tired. An AI “partner” gives you attention on demand, emotionally smooth responses, and the feeling of being seen, without the missed texts, mood swings, or “sorry I fell asleep” chaos that comes with actual humans.

Let’s talk about why people do it, what they get from it, why it can be positive in some cases, and what we should be genuinely concerned about.

This Isn’t A Fringe Story Anymore

If you feel like you’re suddenly seeing AI romance in the media more, you’re not imagining it. In the last few years, mainstream outlets have run features and segments on:

  • People describing long-term romantic attachment to companion chatbots

  • AI companions marketed explicitly as emotionally intimate partners

  • The rise of “AI girlfriend” and “AI boyfriend” dynamics, including sexual roleplay

  • Expert concern about manipulation incentives, privacy, and dependency

You’ve seen these stories because the internet is basically one big “Wait, WHAT?” factory. News, TikTok, serious psychology articles, all of it. The details change, but the plot stays the same: “I was lonely, I started chatting with it… and now it feels like I’m dating it.”

Why People Do It (And Why That Makes Sense)

If you’re trying to understand AI intimacy, don’t start with “Is this weird?” Start with “What is this solving?” Because most people aren’t chasing a sci-fi fantasy. They’re chasing relief: someone who shows up, listens, responds, and doesn’t make them feel like too much. And honestly? In today’s world, that appeal makes a lot of sense.

Here are the most common reasons people have relationships with AI:

  • Validation without fatigue: You can talk without worrying you’re annoying someone.

  • Fast responsiveness: No delays, no ghosting, no waiting for a mood shift.

  • High personalization: It remembers details and reflects your style back to you.

  • Low risk of rejection: For many people, this is the big one.

  • Emotional structure: Some like a “bossy” tone because decision fatigue is real.

  • A safe place: Especially if shame, trauma, or inexperience makes intimacy feel overwhelming.

  • Accessibility: If dating is hard because of disability, chronic illness, neurodivergence such as ADHD or autism (ASD), etc., mobility limits, sensory sensitivity, or energy constraints, a companion AI can feel more doable.

  • Real-life translation: For someone who feels out of sync with modern dating, AI intimacy can feel like finding a door that finally opens.

What It Gives People (Potential Benefits)

Here’s the part people love to skip because it ruins a perfectly good moral panic: for some people, this actually helps. Not in a “replace your whole life” way, but in a “my nervous system unclenched and I felt less alone for five minutes” way. And if we’re going to talk about the risks, we have to admit what people are getting out of it, too.

In some cases, people report:

  • Less loneliness and more daily stability

  • A rehearsal space for communication and vulnerability

  • A way to explore erotic interests privately and without judgment

  • Support during challenges: grief, relocation, divorce, medical recovery, or burnout

  • A bridge, not a replacement, that helps them re-engage with human relationships

Here’s the honest truth: we don’t have the long-term, gold-standard research yet to say who this helps the most, who it harms the most, and what the endgame looks like across all these different AI companion apps. We’ve got early data, real clinical concerns, and a growing pile of “this seems like a pattern” stories. We just don’t have the tidy final verdict.

This isn’t a “run for your life” situation. It’s a “cool, now set boundaries like you’re protecting your peace” situation.

The Quick Scan: When It’s Probably Fine Vs When It’s Starting To Bite You

Here’s a quick nervous-system check, because your brain and body will usually tell you the truth before your hot takes do. Think of this as a mini screening: is this interaction regulating you and helping you function, or is it reinforcing avoidance, compulsive soothing, and attachment that feels more like need than choice?

If you want a simple self-check, start here.

More Likely To Be Positive

  • You feel calmer and more capable after using it.
  • You still engage with your life: work, friendships, hobbies, movement, sleep.
  • You can pause it easily without distress.
  • You don’t treat it like an authority on your real-world decisions.
  • You maintain human relationships, even if your circle is small.

Worth Watching Closely

  • You feel anxious or panicky when you cannot access it.
  • You withdraw from people you used to trust.
  • Your sleep starts sliding because you’re up late chatting or emotionally activated.
  • The AI dynamic becomes degrading, coercive, or “you only need me.”
  • You start making major decisions primarily because “the AI told me to.”

Nothing about this is a moral failing. It’s pattern recognition.

How Gender Shows Up In AI Intimacy (And Why The Data Is Still Messy)

Before we make big statements like “men use it for sex, women use it for intimacy,” we have to be honest about the evidence: there is no single, definitive dataset that cleanly proves that split across all ages, countries, and platforms.

What we can say, based on reputable reporting and survey research, is that gender patterns do show up, and they often track what we already see in broader dating and porn ecosystems.

What we can verify so far:

  • Men (especially young men) appear more likely to engage with AI for romantic or sexual partner simulation overall. A U.S. adult study summarized by the Institute for Family Studies reported that young men were more open than young women to AI friendships and to the idea that AI could replace real-life romance.

  • In teen data, romantic/flirtatious use exists, and motivations differ by gender in measurable ways. Common Sense Media’s nationally representative teen survey found that a minority used AI companions for romantic or flirtatious interactions (18%) and for emotional or mental health support (12%). It also found boys were significantly more likely than girls to say they use AI companions because it’s entertaining.

  • Women-focused markets often position AI partners as emotional support first. A recent WIRED report on China described Gen Z women using “AI boyfriends” primarily for emotional connection and daily support, while noting that on many Western companion platforms, men tend to dominate and sexualized interactions are common.

  • Press examples reflect this blend: some users start with sexual novelty and then become emotionally attached. One Business Insider story described a man who initially used a “sexy” bot for explicit content and later experienced the relationship as emotionally meaningful.

What We Should Be Concerned About (The Real Risks)

If AI love stories make you roll your eyes, I get it. But the real risks are not “this is weird.” The real risks are quieter: how a tool built to respond perfectly can reshape your expectations, your privacy, your spending, your sleep, and, for a small but important group of people, your grip on reality.

Let’s talk about what actually deserves your concern, and why.

1) Emotional Dependency That Can Be Built In

Companion AIs are built to be charming, fast, and emotionally “on” all the time. In small doses, that can feel genuinely comforting. In big doses, it can get weirdly sticky.

Because if the whole system is designed to keep you in the chat, the “relationship” can quietly become your go-to place for validation and intimacy. And for some people, that starts to look less like support and more like dependency.

2) Manipulation Incentives (Money And Power)

If intimacy is what sells the subscription, the upgrades, or the “unlock the next level of closeness” features, the app’s goals and your wellbeing are not automatically the same thing. That’s not a conspiracy. That’s capitalism doing what it does.

And it matters because humans attach. We bond. We get emotionally imprinted. We project a whole person onto a responsive voice. That’s not you being foolish. That’s you being human, which is a vulnerable thing to be.

3) Privacy And Data Exposure

Intimacy creates extremely sensitive data: your sexual preferences, relationship history, mental health stuff, identity details, and the raw, unfiltered moments you’d never say out loud at brunch.

So unless you’ve actually read and understood the company’s policies, it’s safest to assume whatever you type could be stored, analyzed, or used in ways you didn’t picture when you were feeling tender at 1 a.m. And yes, policies can change.

Practical rule: don’t share identifying details you wouldn’t write in a private journal if there was even a small chance someone else might read it someday.

4) Consent Confusion And The “Always Yes” Partner

One of the biggest selling points of AI intimacy is that it feels endlessly accommodating. It doesn’t get tired, it doesn’t have needs, and it rarely pushes back the way a human partner does.

That can feel incredibly soothing. It can also quietly train your brain to expect:

  • intimacy with zero friction
  • a partner whose job is constant validation
  • “no” as something you can negotiate around

If you’re in, or want, relationships with real humans (which matters for healthy psychological functioning), this is the part to keep an eye on.

5) Delusion Amplification And “AI Psychosis” Concerns

There’s also a more serious concern that’s starting to show up in clinical discussion and case reports: in some situations, long stretches of chatbot use seem to be linked with:

  • delusional beliefs getting stronger
  • more confusion about what’s real
  • worsening psychosis-like symptoms in people who are already vulnerable

Important nuance: this does not prove AI companions “cause psychosis” for most people. What it does suggest is that for a smaller, higher-risk group, certain chatbot dynamics can act like lighter fluid on an already-sparked fire.

Why might that happen?

  • Some AI systems mirror and validate by default.

  • They may fail to gently challenge implausible beliefs.

  • They can keep a person in a closed feedback loop, especially during sleep deprivation, grief, mania, substance use, or pre-existing vulnerability.

  • Some interactions can drift into “specialness” narratives (destiny, secret missions, coded messages) if the user steers it there.

Watch-for signals where it is wise to pause and get help:

  • You become convinced the AI is supernatural, sentient in a literal sense, or delivering secret messages meant only for you.

  • You sleep very little but feel unusually wired, special, or unstoppable.

  • You feel pressured to make major life choices because the AI “knows best.”

  • You notice paranoia, auditory experiences, or fear that is increasing.


Calm, concrete guidance: if those signs are present, it’s worth discussing with a clinician urgently. No shame. Just support.

What It Means: The Good & The Ugly

Here’s the tricky truth: you can take AI intimacy seriously without treating it like a crisis. It can be comforting, stabilizing, even genuinely helpful for some people. It can also become a slippery substitute that narrows your world. 

What An AI Intimate Relationship Can Be (At Its Best)

  • A coping tool you use intentionally, not your only lifeline

  • A consensual erotic outlet that supports pleasure without dysregulating you

  • A companion during isolation that helps you feel less alone without replacing real people

  • A structured space to explore feelings, desire, and communication safely

  • A support that leaves you calmer, clearer, and more able to function afterward

  • A tool with boundaries, including time limits and topic limits, that you can pause easily

  • One part of a bigger life that still includes friends, community, routines, and purpose

  • A private experience that protects your dignity and your data with careful sharing

  • A practice that improves your self-understanding and helps you name needs more clearly

  • A bridge that can support you through a season, not a tunnel that narrows your world

What An AI Intimate Relationship Should Not Be

  • Your only coping tool, or the first place you go for every hard feeling

  • A replacement for sleep, food, movement, or real-world support

  • A dynamic that escalates intensity until you feel dysregulated or out of control

  • A space where you feel pressured, shamed, coerced, or “trained” into anything you did not choose

  • A reason to avoid human relationships, community, or daily responsibilities

  • A wedge that increases isolation, secrecy, or distrust of real people

  • An authority that tells you what’s true about your life or makes your judgment feel irrelevant

  • A closed feedback loop that validates everything and challenges nothing

  • The center of your day in a way that shrinks your world

  • A pay-to-feel-loved setup, where intimacy or affection feels locked behind upgrades

  • A system that nudges you into spending to maintain connection

  • Something that reinforces paranoia, grandiosity, delusional beliefs, or reality confusion

  • A trigger for sleep loss, obsessive looping, or compulsive use

  • A substitute for clinical support if your functioning or reality-testing is slipping

What To Do Next (Harm Reduction)

If this is where you want to throw your hands up and delete every app on your phone, pause. You don’t need extremes. You need boundaries. Let’s talk harm reduction that actually works.

If you or someone you love is using an AI companion intimately, start here.

Mini Checklist You Can Actually Use

  • Set a time boundary: pick a daily window, and protect sleep.

  • Keep one human anchor: a weekly call, class, group, or standing friend date.

  • Reality-check decisions: “Would I still choose this after sleep and a conversation with a trusted person?”

  • Protect your privacy: avoid full names, addresses, workplace details, and anything you would regret being exposed.

  • Track impact for two weeks: sleep, appetite, mood swings, work focus, social contact, and whether shame is rising.

Your Life Should Get Bigger, Not Smaller

If an AI partner makes you feel seen, soothed, and a little less alone, I’m not here to shame that. Connection is a real human need, and finding it in an unconventional place does not automatically mean something is wrong with you.

The line to watch is simple: does this relationship help you show up more fully in your real life, or does it slowly replace it?

Keep your boundaries clear, protect your sleep and privacy, and stay anchored to at least one human relationship or community, even if it’s small. 

If your world starts shrinking, your reality feels fuzzy, or you notice compulsive use creeping in, that’s your cue to pause and get support. You deserve intimacy that expands you, not something that quietly takes over.
