We recently learned that Apple might be analyzing the images we upload to iCloud. The fact is that they have implemented a checking mechanism, and in this video Apple's software chief, Craig Federighi, explains what happens to your images when you use iCloud Photos.
There is also a check done on images received in Messages on a child's device, which warns the child that an image may be unsafe. That analysis runs against a database stored on-device, so there's no communication between your phone and an online service.
With this video, I learned what actually goes on when we sign up for a cloud service. Documents are analyzed so they can be searched easily, images get scanned to identify the objects in them, and with this change, I think Apple wants to use that capability in an important way. I am all for protecting children and for making the authorities aware of anything illegal, but the fact that Apple can check what's on our devices came as a shock. I thought I owned the devices; I paid for them. It is an eye-opener to see to what degree Apple can access and get into our devices.
The iCloud screening is being overblown. Every image online has a hash assigned to it: a unique number that allows duplicates to be identified. They are not looking at your images; they are comparing the hash assigned to each one. From a simple search:
"Apple is using its NeuralHash system to match a database of image hashes provided by agencies like the National Center for Missing Children (NCMEC) to images on user devices to search for CSAM. The system is designed to produce exact matches and Apple says there's a one in a trillion chance that an iCloud account can be accidentally flagged."
They are looking for copies of KNOWN child porn, not looking at naked pictures of your grandkids and trying to decide whether they are porn. If you were Apple, would you want child porn stored on your servers? Of course not. This is a great way to make sure child porn consumers don't store their images on Apple's servers (a rough sketch of the hash-matching idea is below).
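To make the hash comparison concrete, here is a minimal sketch in Python. It uses a plain SHA-256 file digest and a made-up placeholder database purely for illustration; Apple's actual system uses NeuralHash, a perceptual hash computed on-device and matched against a blinded NCMEC-supplied database, so this is only meant to show the idea of "compare fingerprints, never look at the picture":

```python
import hashlib
from pathlib import Path

# Hypothetical set of fingerprints of KNOWN images (placeholder value only).
# In Apple's system these would be NeuralHash values shipped as a blinded
# on-device database, not SHA-256 digests.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_fingerprint(path: Path) -> str:
    """Return a fingerprint of the raw file bytes; the picture itself is never inspected."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_known_match(path: Path) -> bool:
    """True only if this exact fingerprint appears in the known-image database."""
    return image_fingerprint(path) in KNOWN_BAD_HASHES

if __name__ == "__main__":
    for photo in Path("uploads").glob("*.jpg"):
        print(photo.name, "matches known database" if is_known_match(photo) else "no match")
```

The point of the sketch is that a photo can only match if its fingerprint is already in the known database; an ordinary family photo has nothing to match against.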
The iMessage feature is automatic for kids and optional for adults. It blurs nude images based on an on-device AI algorithm, and it reports to no one but the child's parents.
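Roughly, the flow described above looks like the following sketch. The threshold, the classifier stub, and all names here are made up for illustration; the real on-device model is not public:

```python
from dataclasses import dataclass

# Hypothetical cutoff; the real model's threshold is not documented.
NUDITY_THRESHOLD = 0.9

@dataclass
class IncomingImage:
    data: bytes
    recipient_is_child: bool

def nudity_score(image: IncomingImage) -> float:
    """Placeholder for the on-device classifier; a real ML model would run here."""
    return 0.0

def handle_incoming(image: IncomingImage, feature_enabled: bool) -> str:
    """Decide what to do with a received image, per the behavior described above."""
    if not feature_enabled or nudity_score(image) < NUDITY_THRESHOLD:
        return "show normally"
    if image.recipient_is_child:
        # Blurred; the child can still choose to view, and parents may be notified.
        # Nothing is sent to Apple or to the authorities.
        return "blur and offer parental notification"
    return "blur with option to view"
```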
This is a poorly written article that wasn't researched at all; the author got his info from reading the headlines.
How are people still not understanding that they are NOT looking through the images on your iPhone? The images being checked are the ones being stored on Apple's servers. If it's that big of a deal for you, then just don't upload your images to iCloud.
How much do you think Apple paid for this [well edited] advertorial, complete with infographics?
Read this before commenting; it's from an authority on security: https://www.schneier.com/blog/archives/2021/08/more-on-apples-iphone-bac...