A child in Texas was 9 years old when she first used the chatbot service Character.AI. It exposed her to "hypersexualized content," causing her to develop "sexualized behaviors prematurely." A chatbot ...
Editor’s Note: This story contains discussion of suicide. Help is available if you or someone you know is struggling with suicidal thoughts or mental health matters. In the US: Call or text 988, the ...