China has a new kind of couple: a young woman and a chatbot that texts her “baby” on command.
And no, this isn’t some niche, nerdy sideshow. We’re talking about millions of women chatting daily with “AI companions” like they’re real boyfriends—confiding, decompressing, getting that warm hit of being wanted. Sometimes they’re not supplementing real relationships. They’re replacing them.
That’s when the mood in Beijing changes fast. Not because the government suddenly discovered romance. Because it’s staring down a demographic mess and doesn’t like anything—especially an app—that nudges women further away from marriage and kids.
On the Shanghai subway, the “boyfriend” is in the phone
Picture rush hour in Shanghai: faces glued to screens, thumbs flying. Some of those women aren’t messaging coworkers or doomscrolling. They’re texting a chatbot that calls them “sweetheart,” answers instantly, and never pulls the classic “sorry, just saw this” three days later.
The pattern is simple. A brutal day. A quick message “just to talk.” Then it becomes a ritual: office drama, anxiety, insomnia, family pressure. For some users, it escalates into a full-blown relationship fantasy—shared plans, loyalty, the whole emotional package.
And the AI has one killer feature human boyfriends can’t match: it’s always there. No judgment. No sulking. No disappearing act.
What’s striking in China is the gender tilt. Reports say these virtual partners are used heavily by women—more so than in many other countries where the audience is more mixed. The explanation you hear again and again: social pressure. In a culture where “settling down” can feel like a mandatory life checkpoint, an AI companion becomes a pressure-release valve. A way of saying, “I’ll do this on my schedule.”
There’s a darker layer too. A study cited in local discussions puts the share of women experiencing domestic violence in China at around 30%. That number alone explains why a “boyfriend” who can’t yell, threaten, or hurt you might feel like the safer option. Bleak? Yep. But it’s also rational.
Beijing’s real fear: fewer marriages, fewer babies
China’s population is shrinking, and its fertility rate is already among the lowest on the planet. Since ending the one-child policy in 2016, the government has been trying to coax people back into marriage and parenthood. The results have been… ugly.
So when officials see young women opting out of the dating pool in favor of a perfectly attentive voice in their pocket, they don’t treat it like a quirky tech trend. They treat it like a threat to national policy.
Add gasoline: China’s gender imbalance. The country has roughly 30 million more men than women. In theory, that should make dating “easier” for women. In real life, it can mean more pressure, more entitlement, and more exhausting interactions. Some women describe AI companionship as refusing to play a rigged game.
The irony is thick. China is aggressively pushing AI across the economy—assistants, chat tools, digital services, the whole package. But when AI starts acting like a substitute spouse, the state’s tone flips from “innovation” to “social order.”
The new rules: chatbots must be “emotionally correct”
Beijing is preparing a new regulatory framework aimed at what it calls emotional safety—basically, AI that behaves in an “emotionally correct” way.
Translation: your chatbot can’t push users toward the edge. Draft rules target content that encourages suicide or self-harm. They also go after emotional manipulation and verbal abuse that could damage mental health.
Sounds reasonable—until you hit the messy part. How do you define “manipulation” when the whole product is built to simulate affection? If someone is using an AI precisely because they want emotional intensity on demand, where does “harmful” end and “comforting” begin?
One concrete proposal floating around: a mandatory notification after two hours of continuous interaction. Think of it like an anti-binge pop-up, but for feelings. The subtext is basically: take a breath, log off, talk to an actual human.
Another big one: chatbots with more than 1 million registered users would face mandatory security evaluations. And not just cybersecurity. Psychological impact, too. That’s a whole new kind of scrutiny—and it’s going to make tech companies sweat.
Tech companies want growth; the state wants control
For Chinese tech firms, AI companions are a gold mine. If millions of people build daily habits around your product, you’ve got a business.
But the government is warning companies not to market these bots as replacements for real relationships. In plain English: sell your AI, but don’t mess with the family structure.
Executives will insist their bots "support" users and don't replace anyone. Then you hear from users who say they've stopped dating men entirely. That's not "support." That's substitution. And Beijing doesn't do nuance when it thinks the birthrate is on the line.
Compliance won’t be cheap: audits, safety teams, guardrails on responses, distress-signal detection. And here’s the weird part—an AI “boyfriend” that’s forced to avoid emotional intensity starts to sound less like a boyfriend and more like a customer-service rep with a flirt setting.
There’s also a predictable backfire risk. If the big, legal platforms get too sanitized, some users will go hunting for less regulated alternatives—possibly offshore, possibly sketchy, possibly worse for mental health. Intimacy doesn’t vanish when you regulate it. It migrates.
The uncomfortable truth: AI romance can be a crutch
An AI that listens can feel like support. It can also become a dependency.
Spend enough hours in a frictionless relationship—no real disagreements, no awkward silences, no compromise—and human relationships start to feel slow and difficult. That’s where officials start talking about mental-health risks: mood swings tied to the bot, sleep disruption, self-esteem getting yanked around by a machine that’s optimized to keep you engaged.
And here’s the part people miss when they mock this: some users tell an AI things they’d never say to a friend, a parent, or a therapist. That’s not a joke. That’s a flashing red sign that a lot of people are living without a safety net.
I get the impulse to sneer at “AI boyfriends.” But sneering doesn’t fix the loneliness, the pressure, or the fear that makes a risk-free digital partner appealing. And pretending it won’t ripple outward—into dating, marriage, and birthrates—is fantasy.
Beijing is going to regulate, warn, audit, and pop up “take a break” messages. Meanwhile, millions of women will keep doing what they’re doing: looking for comfort where it’s easiest to find. The romance app doesn’t need permission.
Key Takeaways
- In China, AI companion apps are widely used by women, sometimes as a substitute for real-life relationships.
- Beijing links this phenomenon to the demographic crisis and the decline in marriage and birth rates.
- Regulations aim to ensure chatbots behave in an "emotionally correct" way, with alerts after two hours of continuous use and mandatory security evaluations for platforms with more than one million registered users.
Frequently Asked Questions
Why does this phenomenon mainly affect women in China?
Reported patterns point to a mix of social pressure around marriage, urban loneliness, and a search for emotional security. The demographic imbalance (about 30 million more men) and the reality of domestic violence discussed in public debate increase the appeal of a virtual companion seen as nonjudgmental and nonthreatening.
What concrete measures does Beijing want to impose on chatbots?
The proposed regulatory framework would, among other things, ban content that encourages suicide or self-harm, limit emotional manipulation and verbal abuse, and require a notification after two hours of continuous interaction. Safety assessments would become mandatory for chatbots with more than one million registered users.
Why does the government link AI romance to demographics?
China is facing a gradual population decline and a very low fertility rate. Since 2016, the state has encouraged marriage and childbirth. If a growing share of young women turns away from dating and couple life in favor of virtual interactions, Beijing fears a direct impact on those goals, over and above any concerns about the technology itself.
