AI companies agree to crack down on child abuse images


Tech companies including Google, Meta, OpenAI, Microsoft and Amazon have agreed to review their AI training data for child sexual abuse material (CSAM) and remove it from use in any future models, The Verge reports.

As we previously reported, an annual report from the National Center for Missing and Exploited Children (NCMEC) revealed that child abuse cases involving AI increased in 2023.

The agreement commits the companies to a new set of principles designed to limit the proliferation of CSAM. They promise to ensure that training datasets do not contain CSAM, to avoid datasets with a high risk of including it, and to remove CSAM images or links to CSAM from data sources.

The companies also commit to “stress testing” AI models to ensure they do not generate CSAM images, and to releasing models only after they have been assessed for child safety. Other signatories include Anthropic, Civitai, Metaphysic, Mistral AI and Stability AI.

How Generative AI Increased the Proliferation of Online Child Abuse

  • Generative AI has raised deepfake imaging concerns, including the proliferation of fake CSAM photos online.
  • Stanford researchers released a report in December that found that a dataset used to train some AI models contained links to CSAM images.
  • The researchers also discovered that a tip line run by NCMEC, which is already struggling to handle the volume of reported CSAM content, is quickly being overwhelmed by AI-generated CSAM images.

Thorn, a non-profit anti-child-abuse organization that helped create the principles alongside All Tech Is Human, states that AI-generated imagery can impede efforts to identify victims, create more demand for CSAM, enable new ways to victimize children, and even make it easier to find information on how to share problematic material.

Google says that in addition to committing to the principles, it has also increased advertising grants for NCMEC to promote its initiatives.

Google’s vice president of trust and safety solutions, Susan Jasper, claims that supporting these campaigns raises public awareness and gives people tools to identify and report abuse.

