Apple has decided to remove several artificial intelligence apps from its App Store because they could be used to generate sexually explicit images.
Artificial intelligence tools that generate images from user prompts have evolved into an important resource for photography and design. However, the same capabilities can be abused to create fake or sexually explicit images.
According to news reports, Apple was alerted to several AI-powered image generation apps in the App Store that were reportedly capable of producing nude images of people without their consent.
As reported by 404 Media, the developers of these apps promoted them through Instagram ads claiming they could “undress girls for free,” with the ads linking directly to the apps’ pages in the App Store.
The outlet said Apple initially did not respond to its request for comment, but after the report was published, the company asked for more information. Once Apple was given the links to the ads and to the apps’ App Store pages, it began removing the apps from the store.
Apple ultimately removed three apps from the App Store, but only after receiving direct reports about them. This suggests the company was unable to find the policy-violating apps on its own, and that they had slipped past the review process meant to keep their developers from publishing them to users.