Editor's Note: The following was translated from a report originally published in Portuguese in Núcleo.
Investigation uncovered 14 Instagram profiles with hundreds of thousands of followers. None are online anymore
Meta's communications office in Brazil has confirmed that the company removed from Instagram 14 profiles containing child sexual exploitation content created with artificial intelligence, in response to a Pulitzer Center-supported report by Núcleo. Instagram is owned by Meta Platforms.
Núcleo’s investigation, produced in partnership with the Center’s AI Accountability Network, had uncovered 14 Instagram profiles with thousands of followers that were sharing sexualized images of children and teenagers. In Brazil, this type of content is considered a violation of the Statute of the Child and Adolescent, even if it is generated by AI.
The accounts also directed users to subscription content platforms such as Patreon and Fanvue, where more illegal material could be purchased, and included links to WhatsApp and Telegram groups that contained not only AI-generated images but also sexual abuse material depicting real children.
During the investigation, Patreon said it had removed the creators of the illegal content in response to the report. Fanvue never replied.