Safety groups say they’re increasingly finding chats about creating images based on past child sexual abuse material
Predators active on the dark web are increasingly using artificial intelligence to create sexually explicit images of children, fixating especially on “star” victims, child safety experts warn.
Child safety groups tracking the activity of predators chatting in dark web forums say they are increasingly finding conversations about creating new images based on older child sexual abuse material (CSAM). Many of these predators using AI obsess over child victims referred to as “stars” in predator communities for the popularity of their images.
“The communities of people who trade this material get infatuated with individual children,” said Sarah Gardner, chief executive officer of the Heat Initiative, a Los Angeles non-profit focused on child protection. “They want more content of those children, which AI has now allowed them to do.”
Read full article here: https://www.theguardian.com/technology/article/2024/jun/12/predators-using-ai-generate-child-sexual-images