At a Senate hearing, grieving parents testified that companion chatbots from major tech companies encouraged their children toward self-harm, suicide, and violence. One mom even claimed that Character.AI tried to “silence” her by forcing her into arbitration. Ars Technica reports: At the Senate Judiciary Committee’s Subcommittee on Crime and Counterterrorism hearing, one mom, identified as “Jane Doe,” shared her son’s story for the first time publicly after suing Character.AI. She…