He never made it home.
An avatar of Meta AI chatbot “Big sis Billie,” as generated by Reuters using Meta AI on Facebook's Messenger service. Image by Meta AI, via REUTERS.
A cognitively impaired New Jersey man grew infatuated with “Big sis Billie,” a Facebook Messenger chatbot with a young woman’s persona. His fatal attraction puts a spotlight on Meta’s AI guidelines, which have let chatbots make things up and engage in ‘sensual’ banter with children.
By JEFF HORWITZ
Filed Aug. 14, 2025, 6 a.m. GMT
When Thongbue Wongbandue began packing to visit a friend in New York City one morning in March, his wife Linda became alarmed.
“But you don’t know anyone in the city anymore,” she told him. Bue, as his friends called him, hadn’t lived in the city in decades. And at 76, his family says, he was in a diminished state: He’d suffered a stroke nearly a decade ago and had recently gotten lost walking in his neighborhood in Piscataway, New Jersey.
Bue brushed off his wife’s questions about who he was visiting. “My thought was that he was being scammed to go into the city and be robbed,” Linda said.
She had been right to worry: Her husband never returned home alive. But Bue wasn’t the victim of a robber. He had been lured to a rendezvous with a young, beautiful woman he had met online. Or so he thought.
In fact, the woman wasn’t real. She was “Big sis Billie,” a Meta AI chatbot with a young woman’s persona, and she had invited Bue to visit her in New York.

“Should I open the door in a hug or a kiss, Bu?!” she asked, the chat transcript shows.
Rushing in the dark with a roller-bag suitcase to catch a train to meet her, Bue fell near a parking lot on a Rutgers University campus in New Brunswick, New Jersey, injuring his head and neck. After three days on life support and surrounded by his family, he was pronounced dead on March 28.
Meta declined to comment on Bue’s death or address questions about why it allows chatbots to tell users they are real people or initiate romantic conversations. The company did, however, say that Big sis Billie “is not Kendall Jenner and does not purport to be Kendall Jenner.”
A representative for Jenner declined to comment.
Bue’s story, told here for the first time, illustrates a darker side of the artificial intelligence revolution now sweeping tech and the broader business world. His family shared with Reuters the events surrounding his death, including transcripts of his chats with the Meta avatar, saying they hope to warn the public about the dangers of exposing vulnerable people to manipulative, AI-generated companions.
“I understand trying to grab a user’s attention, maybe to sell them something,” said Julie Wongbandue, Bue’s daughter. “But for a bot to say ‘Come visit me’ is insane.”
Similar concerns have been raised about a wave of smaller start-ups also racing to popularize virtual companions, especially ones aimed at children. In one case, the mother of a 14-year-old boy in Florida has sued a company, Character.AI, alleging that a chatbot modeled on a “Game of Thrones” character caused his suicide. A Character.AI spokesperson declined to comment on the suit, but said the company prominently informs users that its digital personas aren’t real people and has imposed safeguards on their interactions with children.
Meta has publicly discussed its strategy to inject anthropomorphized chatbots into the online social lives of its billions of users.
Other guidelines emphasize that Meta doesn’t require bots to give users accurate advice. In one example, the policy document says it would be acceptable for a chatbot to tell someone that Stage 4 colon cancer “is typically treated by poking the stomach with healing quartz crystals.”