Apple removed AI apps that create “non-consensual nudity” photos


Apple appears to be removing all applications that can use artificial intelligence to create “non-consensual nudity” photographs.

The information was reported by the website 404 Media, which notes that some of these apps have already been removed from Apple's virtual store, the App Store.

The apps in question began to be removed after ads appeared on Instagram for apps claiming to “Undress any girl for free”.

The publication says that Apple began removing the apps after being contacted for more information on the topic. The process is still ongoing, so it is possible that more will be banned in the coming days.


