AI: who’s responsible for children's safety?

  • Feb 18 2025
  • Length: 34 mins
  • Podcast

  • Summary

  • Megan Garcia says her teenage son, Sewell Setzer III, died by suicide after developing a harmful attachment to an AI companion chatbot. She has filed a lawsuit against Character.AI, accusing the company of negligence.

    In this episode of Now You Know, Megan looks back on some of the warning signs other parents might find useful as they navigate this digital age. We also speak to one of her lawyers, Meetali Jain, about this unique case.

    In this episode:

• Megan Garcia (IG), mother of Sewell Setzer III
    • Meetali Jain, lawyer, director and founder of The Tech Justice Law Project

    Episode credits:

This episode was produced by Fahrinisa Campana, Zaina Badr, and our host, Samantha Johnson.

    Munera AlDosari is the engagement producer. Aya Elmileik is our lead of engagement.

    Our sound designer is Joe Plourde. Our video editor is Catherine Hallinan.

    Jo de Frias is Now You Know’s executive producer. Ney Alvarez is Al Jazeera’s head of audio.

    Connect with us:

    @AJEPodcasts on Twitter, Instagram, Facebook, Threads and YouTube

    If you or someone you know may be considering suicide or be in crisis, call or text 988 to reach the 988 Suicide & Crisis Lifeline.

