
Apple and Google Remove AI ‘Undressing’ Apps from Stores

The Apple and Google app stores have hosted dozens of “nudify” apps that let users take a photo of a person and use artificial intelligence to generate nude images of them, according to experts from the Tech Transparency Project (TTP).

During the investigation, analysts found 55 such apps on Google Play and 47 on the App Store. They contacted both companies to request the removal of these services. Apple removed 28 nudify tools from its store and warned the developers of others that they risked removal if they violated its guidelines.

Two apps were later restored after all issues were resolved.

A Google representative stated that the company had suspended several apps and is conducting investigations in response to such reports.

“Both firms claim to care about user safety, yet they host apps that can turn a harmless photo of a woman into an offensive sexual image,” TTP experts wrote in their report.

They identified the apps by searching for terms like “nudify” and “undress” and tested them using AI-generated images. The experiment covered two types of services.

“It is clear these are not just ‘clothing change’ apps. They are clearly designed to sexualize individuals without their consent,” said TTP director Katie Paul.

Analysts reported that 14 of the apps were developed in China.

“China’s data storage laws mean the government has the right to access any company’s information anywhere in the country. So if someone created deepfakes with your image, they are now in the hands of the authorities,” Paul added.

AI Misuse

Artificial intelligence has made it easier than ever to “undress” women and create deepfake pornography. In January, the chatbot Grok was embroiled in a scandal over a similar feature; the company subsequently disabled the generation of explicit images of real people.

In August 2024, San Francisco City Attorney David Chiu’s office filed a lawsuit against the owners of 16 of the world’s largest websites that allow women and girls to be “undressed” in photos using AI without their consent.

The document cites violations of state and federal laws on deepfake pornography, revenge porn, and materials related to child sexual abuse.

“Generative AI holds great promise, but as with all new technologies, there are unforeseen consequences and criminals looking to exploit new technology for their own ends. We must be clear that this is not innovation — it is sexual violence,” Chiu stated.

The websites named in the case offer user-friendly interfaces for uploading photos and producing realistic pornographic versions of them. These images are nearly indistinguishable from real ones and are used for extortion, intimidation, threats, and humiliation, the statement said.

In September 2024, Microsoft announced a partnership with the organization StopNCII to combat deepfake pornography in the Bing search engine.
