Apple Plans To Scan All Your Images and Report People to the Police?

Apple has announced this week that it is going to start rolling out new child safety features. These features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. Apple describes the program as ambitious and says that protecting children is an important responsibility.
In this video, iCave Dave outlines the new child safety features which will start to appear later this year with iOS 15. Dave gives a good breakdown of how the new features will work and how well Apple is handling such a sensitive issue. There are three new ways that Apple will aim to protect children online.

Safety in Messages

The Messages features will not be activated by default on all devices; they will need to be opted into for children’s devices set up as part of a family group on your Apple devices. This is what Apple has to say about the functionality of the protection for children coming to the Messages app as part of iOS 15:

The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos. When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.

There will also be Siri warnings in place if a user tries to search for images of Child Sexual Abuse Material (CSAM). This is how Apple says these features will work:

Apple is also expanding guidance in Siri and Search by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report. 

Siri and Search are also being updated to intervene when users perform searches for queries related to CSAM. These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.

I think these features sound like an excellent way to help protect children online. 

CSAM Detection

Finally, the most contentious feature Apple is rolling out involves the on-device scanning of all images before they are backed up to your iCloud account. The images remain encrypted, so Apple still can’t see them. They will simply be flagged if the markers derived from a user’s image match markers in the database of known material maintained by the National Center for Missing and Exploited Children. Here’s what Apple has to say on this feature:

New technology in iOS and iPadOS will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC).

Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

This innovative new technology allows Apple to provide valuable and actionable information to NCMEC and law enforcement regarding the proliferation of known CSAM. And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM.
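To make that description a little more concrete, here is a minimal sketch of the matching flow in Python. It is a simplification, not Apple’s implementation: the real system uses a perceptual “NeuralHash” and cryptographic private set intersection rather than the plain hash used here, and the database, threshold, and function names below are all hypothetical.

```python
import hashlib
from pathlib import Path

# Stand-in for Apple's NeuralHash. A real perceptual hash is designed to
# survive resizing and re-encoding; SHA-256 is not, so this illustrates
# only the matching flow, not the hashing itself.
def image_hash(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of hashes of known, already-seized CSAM supplied
# by NCMEC and other child safety organizations. On a real device this
# ships as an unreadable, blinded set of hashes.
KNOWN_CSAM_HASHES: set[str] = set()

# Apple has said a threshold of matches must be reached before anything
# is surfaced for human review; the exact number here is illustrative.
MATCH_THRESHOLD = 30

def scan_before_upload(photo_paths: list[Path]) -> None:
    """Hash each photo locally and count matches against the database."""
    matches = [
        path for path in photo_paths
        if image_hash(path.read_bytes()) in KNOWN_CSAM_HASHES
    ]
    if len(matches) >= MATCH_THRESHOLD:
        # Only now would the flagged images be reviewed by a human
        # before any report is made to NCMEC.
        print(f"{len(matches)} matches: flag account for manual review")
    else:
        # Below the threshold, no match information leaves the device.
        print("No actionable matches; Apple learns nothing about these photos")
```

The key point the sketch tries to capture is that a photo not already in the database can never match, which is also why critics note the system only detects known, previously catalogued images.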

Concerns Over This Technology

It would be hard for anyone to fault Apple for making changes to protect children online and report images of CSAM. I completely agree with iCave Dave on the handling of these types of images and content of that nature. It seems as though Apple is handling the protection of children in a considered and appropriate way. 

Personally, I’m inclined to agree with some critics of the image-scanning technology and the precedent it sets. While we would all agree that the production and sharing of CSAM is simply wrong, the issue with scanning images is deciding when reporting users is appropriate: where should the line be drawn? Should images of drug use be flagged? Some would say they absolutely should. What about terrorism? Would that be defined by the government of each territory? In the West, we’re probably okay, but other parts of the world might have different definitions of “terrorist.” Who would decide what should be reported, and to whom?

I think we all agree that the types of images being discussed in this video, and specifically mentioned by Apple, are bad; perpetrators should be flagged and reported, and the world would be a better place if these types of images were not being produced or shared. I have yet to see anyone arguing in defense of CSAM. However, I do believe there is a discussion to be had around any further use of this technology. What about countries where homosexuality is illegal? Is it a possible future outcome that images of consenting adults doing something a government doesn’t approve of get flagged and reported? This might seem like an unlikely possibility, but with the precedent this technology sets, it is a possible eventuality.

Would governments with questionable ethics in the future be able to leverage Apple into flagging images they dictate in order to keep selling iPhones in that country? I believe, with how focused Apple currently is on customers and their privacy, it’s unlikely to be an issue anytime soon.

Google and Facebook have been scanning uploaded images for this type of content for a number of years. Apple is now going to do it on the device itself. Does this detract from Apple's previous statement that "privacy is a human right"?

A cynic might say that this technology is being introduced in the interest of protecting children because that’s a very difficult subject for anyone to disagree with.

What are your thoughts on Apple scanning users' images? Are critics of the technology overreacting? Should a service provider be able to check anything stored on their servers? How would you feel if Adobe started scanning images on Creative Cloud or your Lightroom library for specific image types?

Let me know in the comments, but please remember to be polite, even if you disagree with someone’s point of view.


Brad Wendes is a British photographer and travel lover.
He began photographing parkour and acrobatics in 2010 and has since taken to portraiture and fitness photography.
Brad is a self-confessed geek, Star Wars fan, tech enthusiast, cat lover and recently converted Apple Fanboy.

97 Comments

Seems to me there is ample room for abuse, but something needs to be done about producers of child pornography. I am concerned about those who genuinely shoot art.


My concern is the precedent this technology sets. If we concede that it’s acceptable for a digital service provider to check and report images on our devices, should we be concerned about who decides what we are and aren’t allowed images of?


I have to transfer images, and Google Drive is the easiest to use. It's cloud-based. Of course I don't take lewd pictures of anyone, either.

Couldn't care less. If it helps put an end to child abuse, then carry on.

That’s an interesting statement. Hypothetically speaking, would you agree that it’s acceptable to infringe the rights of others if it is in the interest of child protection?

There is nothing on my phone that would attract the attention of anyone. If you have nothing to hide, what exactly is being infringed? It’s the same situation as people who moan about CCTV being installed everywhere: who cares? I’m not doing anything wrong, so there’s nothing to worry about.

In this case, aren’t they merely trying to match binary codes of existing images that are stored in a database? And only when there is a match (much like a fingerprint) is someone tasked with reviewing it before sending it to the authorities? I literally don’t see a problem with it.

The precedent set by the act of scanning images on-device seems to be the point of contention. We’re more fortunate in the Western world that our governments tend to be more tolerant. The same can’t be said for many other nations around the world.
The ability to scan and report content opens up the possibility of scanning devices for images of any number of things.

That’s up to Apple, where they choose to distribute the tech, isn’t it? I’m fairly certain working with the Saudi Arabian authorities won’t be top of the priority list, for example.

China and Russia might be more interested in what’s on their citizens’ devices.

Again, it’s up to Apple to choose whether to roll the tech out to those markets. And bear in mind, it’s still Apple who have control of this process, and still Apple who choose to send the info to the authorities. Are you suggesting they are going to start selling this to government agencies to spy on citizens? I doubt it.

I’m suggesting that it might be possible for governments to insist Apple give access to them in return for the ability to trade in their country. The existence of this technology on-device is of concern and is a strange direction from a company who previously stated that “privacy is a human right”.

I’m interested to see how this rolls out and is implemented over the coming years.

Privacy is a human right that, in my opinion, you lose the minute you commit a serious crime, of which child abuse is the absolute worst.

I also think Apple will happily pull their products out of a country if that level of bribery becomes apparent. There are rumours they are pulling out of the U.K. market for far less.

So, if you lose privacy when you commit a serious crime, why bother to have a trial? Oh yeah - it's called the rule of law. Damned inconvenient to have rights and constitutional protections. Minority Report, here we come.

This makes no sense?

Wrong… they don’t scan the photos at all. They scan the digital fingerprint created by the hash, then compare it to a database of previously seized, illegal images to check for matches. If a match is found, they decide what to do after reviewing it. If there is no match, nothing is done.

Please don’t miss out 80% of the process just so you can come here and tell me I ‘don’t get it’. Are you people seriously that bad at taking in information? Or are you just trolling?

It’s encrypted and created as a code using the image data. That encrypted code is then compared to an existing database of images that have been seized by authorities. If there is a match in those codes, the image is then reviewed.

How exactly are they ‘accessing your files’? It’s merely comparing a database of numbers. If there is no match, then literally nothing else happens; if there is a match, then a flag is raised. What exactly are you so concerned about?

Stop pretending that just because other people are making this into an issue, I’m automatically wrong. You clearly haven’t read the process and are choosing instead to parrot some nonsense about them invading your privacy when they aren’t.

It’s really simple: if you don’t like this, then just don’t buy an Apple product… I’d rather they performed this scan and weeded out a few thousand nonces than sit on the internet crying about my supposed privacy being violated.

It’s sometimes good to see past the end of your own nose, but it’s now apparent you are unable to do so.

This makes zero sense. What exactly have I done wrong by seeing how this can produce a positive outcome? Please explain to me, oh wise one, exactly what is so wrong with me not having a problem with it.

What do I stand to lose? Why would some random photographer from Leeds be so much more knowledgeable about this than a huge international company? Why are other large tech companies already doing this if it’s so morally wrong?

Stop making pathetic sarcastic remarks and tell me exactly what I, as a normal person, stand to lose by this being rolled out.

Which is fine, but they have already stated they won't be manipulated by governments to expand the tech to do anything other than this task.

My view on the privacy and snooping: years ago, I used to get up to things that could be construed as breaking the law and therefore subject to surveillance (namely dance music and dark nightclubs, without going into it too much). Nowadays I just live my life and try to be the best person I can for both myself and others, so as far as I'm concerned, nothing I do or have on my device gives me any reason to be concerned about being 'watched'.

I understand there are nations/regimes out there who would love to use such technology for ill-gotten gains, but I'm 99.9% certain the very liberal people who make the decisions at Apple are not going to let this tech get anywhere near anybody who would look to violate people's human rights. I can't imagine, for example, Tim Cook agreeing to the Hungarian government weeding out gay people by scanning their Apple devices.

If it was to get into the wrong hands for the wrong reasons, then there absolutely is a negative, but seeing as they have quashed that theory, people here are just making wild assumptions, or accusations based on nothing more than opinion. In fact, reading through the comment history of some of them, it's almost a cliché that they are in here with these views.

Aren't other large tech companies already doing it? So Apple are just falling in line. In my view, I'd rather trust the tech companies to have some morals than any government, from any country. You are correct that there is no law to stop them changing their stance, but there are such things as brand image, etc., which companies rely on to have people buy into their system, so I can't see them putting that at risk.

For Apple in particular, this just looks like another excuse to stick the boot in (something these websites and the people who frequent them seem to enjoy), which is patently obvious from the wording used in the titles on sites like this and Petapixel. My belief is that, regardless of the label people like to give Apple, their intentions are mainly well natured.

Perhaps reading this article will help you with some actual facts around the tech and what it is going to do. If you can’t see how this is a good thing then sorry, but you have some serious issues.

https://www.bbc.co.uk/news/technology-58145943

What is it that is confusing to you? Is it the sarcasm? Is it the irony of having a company become the world's largest police informant? Or is it the possibility that the software may have flaws and incorrectly report to the police? Maybe it's that you'd be happy with Apple trawling through your phone, because if you have nothing to hide then you don't need privacy. (Sarcasm again.)

You’re making a whole lot of assumptions there that aren’t based on fact, and yes I have zero issue with this scan taking place on my devices.

Again, you haven’t even read through the process. You have clearly just read the title of the article and decided to make comments that, in your head, seem to be somehow correct.

I’m done dealing with yet another moron on this website.

Ah, now the ad hominem rebuttal. Ingenious. I'm glad you like to have your phone scanned. Some people - people with nothing to hide - will still object. And yes, I did read and understand the article.

China and Russia already know what is on the phones of their citizens.

I don’t understand your point? What am I ‘defending’?

I’ve just read through your comment history…. Don’t bother replying to me, thanks.

In reply to "In the case of this aren’t they merely trying to match binary codes of existing images that are stored on a database?".

Yes they are, which I think makes the whole thing even more futile. It sets a precedent for the public allowing their devices to be scanned, and it's not even going to be that effective in actually stopping child abuse.

If images are being matched to images that are already in a database, then it means the acts depicted in the images in the database have already been committed. If someone is actually abusing a child, and sharing the images, those images wouldn't even be flagged.

It's one way of combating child abuse, but I don't think it's even a particularly effective one.

And what's next? Cross referencing everyone's touch ID with a criminal database to solve open cases?

If it stops just one person it has done a job as far as I’m concerned, another scumbag off the streets.

If a cop says he is going to search your car because he smells a joint, and in the process sees that you are not wearing shoes while driving, he can cite you for that (actually, I can't remember if that is illegal). When you allow any door to open with law enforcement, they can search for whatever. Maybe the algorithm doesn't find child porn but recognizes that you are selling commercially without paying business tax. What if suspicious pics are brought to court and you are exonerated, but the dates of the pictures, which were recorded into evidence, show that you were not working during your employer's work hours, and you are fired?

I suspect your answer is that you get what you deserve, but that's not how the Constitution works; the 4th Amendment protects you from unreasonable search and seizure unless there is a reasonable suspicion that a crime has been committed. Scanning everyone's pictures to identify those that may be illegal is too broad.

We get it, Stuart. You don't like pedophiles. None of us do either. But when you say that you are willing to open everyone's door to the cops so that you can catch one bad guy, you create a slippery slope. It's like saying I will search everyone's pockets on this street corner because I think that one of you stole an apple.

Catching bad guys is hard and many times it seems the bad guys have the advantage. As the father of two daughters I deeply desire these predators off the street too. But not at the expense of civil liberties.

What makes this a unique situation is that Apple is providing a service, and the participant willingly abdicates his/her rights to receive its value. Therefore, Apple is likely within their rights, just like some pic-sharing apps said that submitted pics could be used by them commercially, although this would seem like copyright infringement (eventually some use was considered improper).

You are correct that Apple is taking a risk with consumer backlash, but I suspect it won't hurt them, because more people will be like you, thinking in basic emotions, than understanding what rights they have given away. We have already given away our privacy with Facebook (or logging into Fstoppers). Or have we just given up for a nanny state?


It would seem that there are a lot of people online who are concerned that’s the likely eventuality of any form of covert surveillance.


That’s a very good point. I hadn’t considered parents taking innocent images of their own children. At what age is a nude child indecent? And who decides what is a parent documenting their child growing up and what is more nefarious?


If those parents’ photos of their kids in bathing suits aren’t stored in a database of already seized images that are needed for a match, they won’t be getting a knock on the door.

It seems these people commenting are only reading half the info then making assumptions.

No-one has an issue with what Apple are doing with regard to child protection. The technology, and its potential future misuse or misunderstanding, is the concern.

Anyone concerned by this has a serious lack of understanding around just how much we as citizens can be surveilled by the authorities if they need to. It takes the police what, 20 minutes to obtain phone records at a car accident to see if someone was texting whilst driving etc?

There is no privacy when it comes to technology, so you have to either suck it up or stop using it if it bothers you. Me personally, I don’t care.


Is the fact that we are already being surveilled a reason to accept giving up more of our freedoms? At what point will it be too far?

That’s a really interesting take. If we accept that we live in a virtual panopticon already, is it too late to make any meaningful change to privacy rights?
I don’t think it’s ever right to stop pushing for personal privacy, no matter how far things go, I won’t quietly and voluntarily choose to give up personal freedoms.

But let's not encourage more.

Don't update the phone or software to allow this. Tell Apple you don't want it. Help governments pass privacy laws. Just a few suggestions - some obviously more workable than others. Best bet: don't buy the phone.

Care to answer my response above, or just trolling?


Not in the Western World.
