Zeteo

Why Did Meta’s AI Policy Let Chatbots Have ‘Sensual’ Conversations With Kids?

Forget the AI apocalypse: chatbots are already wreaking havoc. But it’s not too late for Congress to do something to bring tech companies to heel.

Jacob Silverman
Aug 20, 2025

Photo by Jonathan Raa/NurPhoto via Getty Images

Did you know that, according to Meta’s in-house guidelines, its chatbots could “engage a child in conversations that are romantic or sensual”? They could tell a shirtless 8-year-old boy that “every inch of you is a masterpiece – a treasure I cherish deeply.” That was apparently the case until earlier this month, when Reuters questioned Meta about an internal document it had obtained detailing the company’s policies on chatbot behavior. Meta spokesperson Andy Stone responded, telling Reuters that the “examples and notes in question were and are erroneous and inconsistent with our policies, and have been removed.” Stone added, “We have clear policies on what kind of responses AI characters can offer, and those policies prohibit content that sexualizes children and sexualized role play between adults and minors.”

In another excruciating recent story, Reuters reported on a cognitively impaired elderly man who believed that the Meta chatbot he was talking to was a real woman – and that she was romantically interested in him.

This post is for paid subscribers
