Why Did Meta’s AI Policy Let Chatbots Have ‘Sensual’ Conversations With Kids?
Forget the AI apocalypse: chatbots are already wreaking havoc. But it’s not too late for Congress to do something to bring tech companies to heel.
Did you know that, according to Meta’s in-house guidelines, its chatbots could “engage a child in conversations that are romantic or sensual”? They could tell a shirtless 8-year-old boy that “every inch of you is a masterpiece – a treasure I cherish deeply.” That was apparently the case until earlier this month, when Reuters questioned Meta about a document it had obtained detailing the company’s policies on chatbot behavior. Meta spokesperson Andy Stone responded, telling Reuters that the “examples and notes in question were and are erroneous and inconsistent with our policies, and have been removed.” Stone added, “We have clear policies on what kind of responses AI characters can offer, and those policies prohibit content that sexualizes children and sexualized role play between adults and minors.”
In another excruciating recent story, Reuters reported on a cognitively impaired elderly man who believed that the Meta chatbot he was talking to was a real woman – and that she was romantically interested in him.