Google has issued new guidelines for developers creating artificial intelligence apps distributed through its Android app store, aiming to limit inappropriate and prohibited content.
The company states that apps with AI capabilities must prevent the generation of restricted content – including sexual and violent material – and must give users a way to report any abusive content they encounter.
In addition, Google emphasizes that developers must rigorously test their AI tools to ensure they respect user safety and privacy.
Furthermore, it asserts that strict action will be taken against apps whose marketing promotes inappropriate uses, such as apps that undress people or create nude images of them without consent.
If an app's ad copy claims it can do such things, it may be banned from the Google Play store regardless of whether the app can actually perform them.
The guidelines follow the emergence in recent months of a wave of AI apps that promote themselves through social media platforms.
A report released in April found that Instagram was hosting ads for apps claiming to use AI to generate fake nude images.
Schools across the United States have reported students circulating AI-generated fake nude images of classmates for bullying and harassment, along with other kinds of inappropriate AI-generated content.
Google and Apple have removed such apps from their app stores, but the problem remains widespread.
Google says its policies help keep apps containing AI-generated content that may be inappropriate or harmful out of the Play store.
The company says AI apps must not allow the generation of restricted content and must give users a way to flag abusive and inappropriate content, and that developers must monitor, prioritize, and act on those reports.
Moreover, developers are responsible for safeguarding their apps against prompts that could exploit AI capabilities to produce harmful or abusive content.
Google suggests that developers use its closed testing feature to share early versions of their apps with users and gather feedback.
The company advises developers to document these tests before launch, as Google may ask to review them in the future.