zdnet[.]com/article/a-new-android-feature-is-scanning-your-photos-for-sensitive-content-how-to-stop-it/
In October of last year, Google released a security update for Android 9 and later that included a component called Android System SafetyCore. Users reportedly could not stop the installation and may not be able to disable or uninstall it. While the app itself does not contact Google, it has near-total permissions and can interact with other software.
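If you want to check whether the component is on your own device, here is a minimal Kotlin sketch. It assumes the widely reported package name com.google.android.safetycore, which you should verify yourself; the adb command in the comments reflects the removal route most reports describe, not an official Google procedure.

```kotlin
import android.content.Context
import android.content.pm.PackageManager

// Widely reported package name for Android System SafetyCore; verify it on
// your own device, e.g. with:  adb shell pm list packages | grep safetycore
private const val SAFETYCORE_PACKAGE = "com.google.android.safetycore"

// Returns true if SafetyCore is present. Note: on Android 11+ an app needs a
// <queries> declaration in its manifest to see packages it doesn't otherwise
// interact with, or this lookup can fail even when the package is installed.
fun isSafetyCoreInstalled(context: Context): Boolean =
    try {
        context.packageManager.getPackageInfo(SAFETYCORE_PACKAGE, 0)
        true
    } catch (e: PackageManager.NameNotFoundException) {
        false
    }

// Ordinary apps cannot remove a system component; reports suggest removal,
// where it is possible at all, goes through adb:
//   adb shell pm uninstall --user 0 com.google.android.safetycore
```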
Google said in a developer note that the release was an "Android system component that provides privacy-preserving on-device user protection infrastructure for apps." The update said nothing else. That left ordinary users in the dark and, frankly, did little for programmers, either.
After the release, Google described the service's functionality in a listing of new Google Messages security features, without mentioning SafetyCore by name: "Sensitive Content Warnings is an optional feature that blurs images that may contain nudity before viewing and then prompts with a 'speed bump' that contains help-finding resources and options, including to view the content. When the feature is enabled, and an image that may contain nudity is about to be sent or forwarded, it also provides a speed bump to remind users of the risks of sending nude imagery and preventing accidental shares."
In the same listing, Google assured users: "Sensitive Content Warnings doesn't allow Google access to the contents of your images, nor does Google know that nudity may have been detected."
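Reading the two notes together, the mechanics appear straightforward: an on-device model scores each image locally, with no network call, and a score above some threshold triggers the blur and the "speed bump." The Kotlin sketch below illustrates that receive-side flow purely conceptually; SafetyCore's actual API is not public, so every name and the threshold value here is a hypothetical stand-in.

```kotlin
// Conceptual sketch only: SafetyCore's real API is not public. Every name
// here (Classification, classifyOnDevice, and so on) is a hypothetical
// stand-in for the flow Google describes.

data class Classification(val nudityScore: Double)

// Stand-in for the on-device model: takes raw image bytes and returns a
// score with no network call, so the image never leaves the phone.
fun classifyOnDevice(imageBytes: ByteArray): Classification =
    Classification(nudityScore = 0.87) // stub; a real local model would run here

fun blurAndWarn() =
    println("Image blurred; 'speed bump' shown with help resources and a view option")

fun displayNormally() =
    println("Image shown as usual")

// Receive-side flow: score locally, then blur and warn above a threshold.
fun handleIncomingImage(imageBytes: ByteArray, threshold: Double = 0.8) {
    val result = classifyOnDevice(imageBytes)
    if (result.nudityScore >= threshold) blurAndWarn() else displayNormally()
}

fun main() {
    handleIncomingImage(ByteArray(0)) // demo input; the stub score triggers the warning path
}
```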
However, we now know SafetyCore does more than detect nudity: its built-in machine-learning functionality can also be used to target, detect, and filter images for other kinds of sensitive content.