Apple photos update

1/21/2023

Apple later this year will roll out new tools that will warn children and parents if the child sends or receives sexually explicit photos through the Messages app. The feature is part of a handful of new technologies Apple is introducing that aim to limit the spread of Child Sexual Abuse Material (CSAM) across Apple's platforms and services.

As part of these developments, Apple will be able to detect known CSAM images on its mobile devices, like iPhone and iPad, and in photos uploaded to iCloud, while still respecting consumer privacy, the company says.

Through a software update rolling out later this year, Messages will be able to use on-device machine learning to analyze image attachments and determine if a photo being shared is sexually explicit. Nothing is passed back to Apple's servers in the cloud. This technology does not require Apple to access or read the child's private communications, as all the processing happens on the device.

The new Messages feature, meanwhile, is meant to enable parents to play a more active and informed role when it comes to helping their children learn to navigate online communication.
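To make the on-device matching idea concrete, here is a minimal sketch of how known-image detection can work with a perceptual hash. This is only an illustrative analogy: Apple's actual system uses its own proprietary hash (NeuralHash) and cryptographic matching, whereas this demo uses a simple average hash over an 8x8 grayscale image, and all names below are hypothetical.

```python
# Illustrative sketch only: a simple "average hash" stands in for
# Apple's proprietary perceptual hash. The point is that the image
# is reduced to a compact fingerprint and compared locally -- only
# a match decision, not the image, would ever leave the device.

def average_hash(pixels):
    """Compute a 64-bit hash from an 8x8 grayscale image.

    `pixels` is an 8x8 list of lists of brightness values (0-255).
    Each bit is 1 if the pixel is brighter than the image's mean,
    so small edits to the image usually leave most bits unchanged.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of differing bits between two hashes."""
    return bin(h1 ^ h2).count("1")

def matches_known(image_hash, known_hashes, threshold=5):
    """On-device check: is the hash close to any known hash?"""
    return any(hamming_distance(image_hash, k) <= threshold
               for k in known_hashes)

# Two nearly identical 8x8 images: a light half over a dark half.
img_a = [[200] * 8 for _ in range(4)] + [[30] * 8 for _ in range(4)]
img_b = [row[:] for row in img_a]
img_b[0][0] = 190  # small perturbation; pixel stays above the mean

known = {average_hash(img_a)}
print(matches_known(average_hash(img_b), known))  # True
```

The threshold makes the match tolerant of minor re-encoding or cropping artifacts, which is why perceptual hashes (rather than exact cryptographic hashes) are used for this kind of detection.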