Google cracks down on offensive AI apps, disruptive notifications

Google is requiring developers to include in-app reporting of offensive AI-generated material, restricting broad photo and video access, and introducing stricter notification rules.

Representative image. (Photo: Tayfun Coskun/Anadolu Agency via Getty Images)

IANS

Google has categorically told developers that all apps, including AI content generators, must comply with its existing developer policies, which prohibit the generation of restricted content like child sexual abuse material (CSAM) and content that enables "deceptive behaviour".

The company announced updates to its developer policies to further elevate the quality of apps on Google Play.

In line with its commitment to responsible AI practices, Google said it wants to help ensure AI-generated content is safe for people and that their feedback is incorporated.

"Early next year, we'll be requiring developers to provide the ability to report or flag offensive AI-generated content without needing to exit the app," the tech giant said in a statement.

Developers can use these reports to inform content filtering and moderation in their apps – similar to the in-app reporting system required under the 'User Generated Content' policies.
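Google's policy describes the requirement rather than a specific API, so what follows is a purely illustrative Kotlin sketch of an in-app "report this output" flow; every type and queue here is a hypothetical stand-in for an app's own moderation backend.

```kotlin
// Purely illustrative: the policy requires an in-app way to flag offensive
// AI-generated content, but does not prescribe an API. All names below are
// hypothetical stand-ins for an app's own moderation pipeline.
import java.time.Instant

data class ContentReport(
    val contentId: String,     // identifier of the AI-generated output being flagged
    val reason: String,        // user-selected reason, e.g. "offensive"
    val reportedAt: Instant = Instant.now(),
)

// In-memory queue standing in for a call to the app's moderation service.
val pendingReports = ArrayDeque<ContentReport>()

// Called from a report button shown next to each piece of generated content,
// so the user can flag it without leaving the app.
fun reportGeneratedContent(contentId: String, reason: String) {
    pendingReports.addLast(ContentReport(contentId, reason))
    // A real app would forward these reports to its moderation backend and
    // use them to tune its content filtering, as the policy suggests.
}
```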

"As a reminder, apps that generate content using AI must also continue to comply with all other developer policies," Google noted.

To safeguard privacy, some app permissions require an additional review by the Google Play team and have additional guardrails.

"We've found this has been an effective strategy to protect people's privacy and are expanding these requirements further, including a new policy to reduce the types of apps allowed to request broad photo and video permissions," the company added.

Under the new policy, apps will only be able to access photos and videos for purposes directly related to app functionality.

Apps that have a one-time or infrequent need to access these files are instead asked to use a system picker, such as the Android photo picker.
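For context, a minimal Kotlin sketch of that approach using the AndroidX photo picker (ActivityResultContracts.PickVisualMedia), which grants the app access only to the items the user selects rather than broad media permissions; the activity and method names here are illustrative, not part of Google's policy text.

```kotlin
// Minimal sketch: using the system photo picker instead of requesting broad
// READ_MEDIA_IMAGES / READ_MEDIA_VIDEO permissions. Names other than the
// AndroidX APIs are illustrative.
import android.net.Uri
import android.os.Bundle
import androidx.activity.result.PickVisualMediaRequest
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity

class ProfilePhotoActivity : AppCompatActivity() {

    // Registers the system photo picker; no runtime media permission is needed.
    private val pickMedia =
        registerForActivityResult(ActivityResultContracts.PickVisualMedia()) { uri: Uri? ->
            if (uri != null) {
                // The app receives temporary read access to just this one item.
                showSelectedImage(uri)
            }
        }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // In a real app this would be triggered by a button tap; launching here
        // keeps the sketch short. Limits the picker to images only.
        pickMedia.launch(
            PickVisualMediaRequest(ActivityResultContracts.PickVisualMedia.ImageOnly)
        )
    }

    private fun showSelectedImage(uri: Uri) {
        // Display or upload the selected image here.
    }
}
```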

The policy update also sets stronger boundaries around the use of full screen intent notifications that share high-priority messages and require the user’s immediate attention.

"For apps targeting Android 14 and above, only apps whose core functionality requires a full screen notification will be granted Full Screen Intent permission by default and all others will need to request consent for use of this permission," Google announced.
