June 25, 2025

After Pulitzer Center-Supported Report, Google Removes AI App With Child Abuse Content

Graphic distortion of portraits of children.

Investigating the scope and impact of AI-generated child sexual abuse material

  • SeaArt, a Chinese AI image generation app, has been removed from Google's Play Store after reports revealed child sexual abuse content on the platform.
  • Another investigation exposed the use of AI to create deepfakes of Brazilian celebrities, including actresses and influencers, without consent.
  • Following the reports, SeaArt announced changes to its moderation and security policies, aimed at curbing NSFW content and child abuse.

To read this report in Portuguese, click here.

SeaArt, a Chinese app that allows users to generate images using artificial intelligence, has been removed from Google’s app store.

The removal from the Play Store came just a few days after a report by Núcleo, in partnership with the Pulitzer Center, revealed on June 5, 2025, that the platform hosted dozens of AI-generated images depicting scenes of child sexual abuse.

In addition to violent content involving minors, a separate report exposed how SeaArt’s free AI tools were being used to create deepfakes of Brazilian celebrities, influencers, and political figures—among them actresses Paolla Oliveira and Bárbara Paz, as well as influencers Belle Belinha and Martina Oliveira.

After the publication of both investigations, moderators of SeaArt's Discord server, which has over 170,000 members, announced updates to the platform's NSFW (Not Safe for Work, i.e., consensual adult sexual content) policies in an attempt to curb the creation of child abuse imagery by users.

"Regardless of artistic style or cultural context, any creation, publication, or distribution of content that implies child sexualization will face zero tolerance on SeaArt," a community moderator said on June 10, 2025.

The same message also announced the implementation of a "multi-layered security system" combining real-time AI detection, human moderation (though SeaArt did not clarify how many moderators are involved, where they are located, or what criteria they follow), algorithmic monitoring, and continuous oversight.


Screenshot from SeaArt's Discord server showing an update and a statement about child protection in the app. Image courtesy of Núcleo.

Both the policy changes and public statements, as well as the app's removal from the Play Store, took place only after Núcleo's reports were published. The app is still available on Apple's App Store.

Núcleo reached out to both Google and SeaArt to confirm whether these changes were directly prompted by the reporting. Google said SeaArt's app had already been under review before the articles were released but did not clarify when the review process began.

“Following our assessment, the app was removed for violating Google Play’s policies,” the company’s press office said.

When asked which specific rules had been breached, Google provided links to its policy center covering AI-generated content, inappropriate content, and harmful content involving children.
