We recently learned that Apple might be analyzing the images we upload to its iCloud servers. The fact is that they have implemented a checking mechanism, and in this video, Apple's Software Chief, Craig Federighi, explains what happens to your images when you use iCloud Photo Library.
There is also a check done on images received in Messages on a child's device, which warns if an image may be unsafe. That analysis is performed entirely on the device, so there's no communication between your phone and an online service.
With this video, I learned what actually goes on when we sign up for a cloud service. Documents are analyzed so they can be searched easily, images get scanned to identify what objects are in them, and with this change, I think Apple wants to use that capability in an important way. I am all for protecting children and for making the authorities aware of any illegal activity, but the fact that Apple can check what's on our devices came as a shock. I thought I owned the devices — I paid for them. It is an eye-opener seeing to what degree Apple can access and get into our devices.
The iCloud screening is being overblown. All images online have a hash assigned to them: a unique number that allows duplicates to be identified. They are not looking at your images; they are comparing the hash assigned to each one. From a simple search:
"Apple is using its NeuralHash system to match a database of image hashes provided by agencies like the National Center for Missing Children (NCMEC) to images on user devices to search for CSAM. The system is designed to produce exact matches and Apple says there's a one in a trillion chance that an iCloud account can be accidentally flagged."
They are looking for copies of KNOWN porn, not looking at naked pictures of your grandkids and trying to decide if they are porn. If you were Apple, would you want child porn stored on your servers? Of course not. This is a great way to make sure child porn consumers don't store their images on Apple's servers.
The iMessage feature is automatic for kids and optional for adults. It blurs naked pics based upon an AI algorithm. It does not report to anyone but the child's parents.
A poorly written article that was not researched at all; the author just got his info from reading the headlines.
How are people still not understanding that they are NOT looking through the images on your iPhone? The images being checked are the ones stored on Apple's servers. If it's that big of a deal for you, then just don't upload your images to iCloud.
Hashes are being generated on the device, and that has been happening for quite a while. Literally the only new thing here is that if a hash that Apple has designated as CSAM shows up on their servers, they flag it and send it for review. Why is this so difficult to understand? I get that you have a crusade against Apple - get over it. If you don't want a giant corporation flagging your data, then don't put it on their servers.
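To spell out that split, here is a rough sketch of the device generating the hash and the server only checking it against its designated list. The names `compute_hash_on_device` and `server_receive_upload` are mine for illustration, not Apple's API, and the hash is a simple stand-in rather than NeuralHash.

```python
import hashlib

def compute_hash_on_device(image_bytes: bytes) -> str:
    """Device side: derive a hash from the image before it is uploaded."""
    # Stand-in for a perceptual hash such as NeuralHash.
    return hashlib.sha256(image_bytes).hexdigest()

def server_receive_upload(image_hash: str, flagged_hashes: set) -> None:
    """Server side: flag the upload only if its hash is on the designated list."""
    if image_hash in flagged_hashes:
        print("Hash matches the designated list - send for human review")

# The check runs against hashes, not against the image content itself.
payload = b"example image bytes"
server_receive_upload(compute_hash_on_device(payload), {"example_flagged_hash"})
```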
You have spent time creating an account on a photography website just to make negative comments about a company, whilst also accusing someone else of being fanatical… Surely you must see the irony in this?
I have no issue with what Apple are doing with this update, but I'm not here to discuss that with you. I was merely pointing out that you have an account with only 4 comments, all posted on this article, yet you're accusing someone whose view is different from yours of being a fanatic. In reality, the one whose behaviour is more fanatical is the person who created the account solely for the purpose of commenting on this article.
In other words you’re essentially a troll, and not a very good one at that.
This is my only account, so no, we don’t ‘all have our troll accounts’, that would just be people like you.
No, it’s not a bad argument, it is an accurate statement. You choose who you send your files to and which company’s servers you want to store data on. Don’t like what Apple is doing? STOP USING THEIR SERVERS. There are plenty of other cloud solutions from other companies, and you can always create your own server to store your data that you are so protective of.
I am not defending Apple because they are Apple. If Google, Facebook, or another tech giant announced something like this I would have the same response.
You keep saying that there are a bunch of tech and security experts decrying this move by Apple. Show me some evidence of this; so far all I've seen is someone with a fanatical hatred of Apple accusing everyone who disagrees with them of being brainwashed. If you have actual arguments, please present them and I'll be happy to refute them.
"You keep saying that there are a bunch of tech and security experts decrying this move by Apple. Show me some evidence of this"
I think you really need to answer this; it's a perfectly reasonable question.
Do I keep calling you a troll? Or did I reference it once, due to the fact that you specifically made an account to post on this article, and your 2nd post on the whole website was to belittle someone as a 'fanboy' just because they commented that they had no issue with this tech being installed?
I then asked you to answer the perfectly reasonable question levelled at you as previously you hadn’t backed up your statements with any links.
You have nothing to convince me of because I know in my own mind that I have no issue with what is being done by Apple. I've read up on what they are doing and it's of little interest to me; I don't feel violated or anything else.
I’m still here because you have thrown accusations at me and I decided to defend myself. I have no need to defend Apple’s decision to do this, and I haven’t done so. What I have done is point out the misinformation about what is going to be done concerning images on someone’s device versus those stored on Apple’s servers. I ask again: please provide some actual evidence that Apple is creating a back door into their devices. So far you just keep accusing anyone who disagrees with you of fanboyism, and then continue to say that there is evidence to support your point of view without actually providing any.
Yours is the first intelligent response I have read since this news was announced.
I agree 100% with Apple's move to flag and remove CSAM off their servers. If the complainers want to keep their child porn, they will have to find another place to hide it. That is the bottom line here. Apple does not want child porn on their servers.
How much do you think Apple paid for this [well edited] advertorial, complete with infographics?
Read this before commenting - from an authority on security: https://www.schneier.com/blog/archives/2021/08/more-on-apples-iphone-bac...