Apple announced this week that it will begin rolling out new child safety features later this year, in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. Apple describes the program as ambitious and says that protecting children is an important responsibility.
In this video, iCave Dave outlines the new child safety features which will start to appear later this year with iOS 15. Dave gives a good breakdown of how the new features will work and how well Apple is handling such a sensitive issue. There are three new ways that Apple will aim to protect children online.
Safety in Messages
The Messages features will not be activated by default on all devices; they will need to be opted into for children's devices set up as part of a family group on your Apple devices. This is what Apple has to say about the functionality of the protections for children coming to the Messages app as part of iOS 15:
The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos. When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.
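To picture the flow Apple describes, here is a rough, purely illustrative sketch of how such an intervention might be structured on a child's device. None of the type or function names below come from Apple; they are assumptions for the sake of the example.

```swift
// Hypothetical sketch of the Messages intervention Apple describes above.
// Every type and function name is invented for illustration; this is not Apple's API.

enum InterventionAction {
    case blurImage
    case warnChild(resources: [String])
    case notifyParents
}

struct MessagesSafetyPolicy {
    let parentalOptInEnabled: Bool   // the feature is off unless a parent opts in
    let notifyParentsOnView: Bool    // the optional extra step for younger children

    /// Decide what happens when a child receives an image that on-device
    /// analysis has judged to be sexually explicit.
    func actions(forExplicitIncomingImage isExplicit: Bool) -> [InterventionAction] {
        guard parentalOptInEnabled, isExplicit else { return [] }

        var result: [InterventionAction] = [
            .blurImage,
            .warnChild(resources: ["It is okay not to view this photo."])
        ]
        if notifyParentsOnView {
            result.append(.notifyParents)
        }
        return result
    }
}

// Example: a family that has opted in, with parental notification enabled.
let policy = MessagesSafetyPolicy(parentalOptInEnabled: true, notifyParentsOnView: true)
print(policy.actions(forExplicitIncomingImage: true))
```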
New Guidance in Siri and Search
There will also be Siri warnings in place if a user tries to search for Child Sexual Abuse Material (CSAM). This is how Apple says these features will work:
Apple is also expanding guidance in Siri and Search by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.
Siri and Search are also being updated to intervene when users perform searches for queries related to CSAM. These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.
I think these features sound like an excellent way to help protect children online.
CSAM Detection
Finally, the most contentious feature Apple is rolling out involves the on-device scanning of images before they are backed up to your iCloud account. The images are still encrypted, so Apple still can't see them; they will simply be flagged if markers (hashes) derived from a user's image match the same markers in the database from the National Center for Missing and Exploited Children. Here's what Apple has to say on this feature:
New technology in iOS and iPadOS will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC).
Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
This innovative new technology allows Apple to provide valuable and actionable information to NCMEC and law enforcement regarding the proliferation of known CSAM. And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM.
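To make the on-device matching idea a bit more concrete, here is a minimal sketch of hashing a photo and checking it against a set of known hashes before upload. This is not Apple's implementation: the real system reportedly relies on a perceptual hash (Apple calls it NeuralHash) and a blinded, encrypted database, whereas this toy version substitutes an ordinary SHA-256 digest (so it would only catch bit-identical files), and every name in it is hypothetical.

```swift
import CryptoKit
import Foundation

// Minimal sketch of matching a photo against a set of known image hashes before upload.
// SHA-256 here is only a stand-in for Apple's perceptual NeuralHash, so this toy version
// would only catch bit-identical files; the hash values and function names are hypothetical.

/// Hash an image file's bytes; a stand-in for a perceptual image hash.
func imageHash(for fileURL: URL) throws -> String {
    let data = try Data(contentsOf: fileURL)
    return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

/// Hypothetical check performed on device before a photo is uploaded to the cloud library.
func shouldFlagBeforeUpload(_ fileURL: URL, knownHashes: Set<String>) -> Bool {
    guard let hash = try? imageHash(for: fileURL) else { return false }
    return knownHashes.contains(hash)
}

// Example usage with an invented, obviously fake database of known hashes.
let knownHashes: Set<String> = ["placeholder-hash-1", "placeholder-hash-2"]
let photo = URL(fileURLWithPath: "/tmp/example.jpg")
print(shouldFlagBeforeUpload(photo, knownHashes: knownHashes))
```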
Concerns Over This Technology
It would be hard for anyone to fault Apple for making changes to protect children online and report images of CSAM. I completely agree with iCave Dave on the handling of these types of images and content of that nature. It seems as though Apple is handling the protection of children in a considered and appropriate way.
Personally, I'm inclined to agree with some critics of the image-scanning technology and the precedent it sets. While we would all agree that the production and sharing of CSAM images is simply wrong, the issue with scanning images is deciding when reporting users is appropriate: where should the line be drawn? Should images of drug use be flagged? Some would say they absolutely should. What about terrorism? Would that be defined by the government of each territory? In the West, we're probably okay, but other parts of the world might have different definitions of "terrorist." Who would decide what should be reported and to whom it is reported?
I think we all agree that the types of images being discussed in this video, and specifically mentioned by Apple, are bad; perpetrators should be flagged and reported, and the world would be a better place if these types of images were not being produced or shared. I have yet to see anyone arguing in defense of CSAM images. However, I do believe there is a discussion to be had around any further use of this technology. What about countries where homosexuality is illegal? Is it a possible future outcome that images of consenting adults doing something the government doesn't approve of get flagged and reported? This might seem like an unlikely possibility, but with the precedent this technology sets, it is a possible eventuality.
Would governments with questionable ethics be able to leverage Apple into flagging images they dictate in order to keep selling iPhones in those countries? I believe, with how focused Apple currently is on its customers and their privacy, it's unlikely to be an issue anytime soon.
Google and Facebook have been scanning uploaded images for this type of content for a number of years. Apple is now going to do it on the device. Does this detract from Apple's previous statement that "privacy is a human right"?
A cynic might say that this technology is being introduced in the interest of protecting children because that’s a very difficult subject for anyone to disagree with.
What are your thoughts on Apple scanning users' images? Are critics of the technology overreacting? Should a service provider be able to check anything stored on their servers? How would you feel if Adobe started scanning images on Creative Cloud or your Lightroom library for specific image types?
Let me know in the comments, but please remember to be polite, even if you disagree with someone’s point of view.
Apple have stated that flagged images will be reviewed by humans and reported to the authorities as appropriate.
And who at Apple gets the fun job of deciding whether it's a family pic of the kids on the beach in bathing suits or child porn? If it's child porn, then who wants a job at Apple looking at it? Sounds like a perverts' dream job has just opened at Apple. And when did Apple suddenly become part of the police department? This is a terrible idea on so many levels.
You need to read the article before making uninformed comments. But hey, uninformed comments, it’s the internet way.
I don't like the idea of Apple controlling my data. Not that I have images to hide, but there's no guarantee the software will be free of mistakes. Some say Apple is using people to verify matches; I didn't find confirmation of that. They put the odds of it going wrong at one in 1,000,000,000 to the power of two. That's almost nothing, but it's not zero. And when it goes wrong? You lose access to your data, without being able to know what's going on and with no way of recovering it... And what's happening with Apple may happen with Amazon, Google, Microsoft, and many others in the cloud. It's already going on!
It'll be AI-driven, without human control, and mistakes and bugs will happen.
When I put data in the cloud, it'll be double encrypted, that's for certain. No company should act as police investigator, judge, or prosecutor (Apple is taking on all of those roles)! It's a simple rule, and I'd like to keep it that way. So Apple stays out of my house.
Do I have something to hide? Not that I'm aware of. But I'd rather not be in a defenseless position, that's for sure. Every company deserves your maximal distrust; that's the first lesson in economics class. And it's one that many won't learn.
A fair summary of the main criticisms of this announcement and the principal concerns. Thank you.
I think it is unfair to target Apple in this manner. If you read the article, you'll know that Apple isn't playing the role of police investigator, judge, or prosecutor. Apple and every other company has the right to prohibit users who violate their user agreement. Apple and every other company has the right to pass on such degrading images to law enforcement. Facebook and Google have already been scanning your photos for years; are you bothered about that too?
Do you use the internet? Then the state and others know what sites you visit.
Do you use a smartphone? Then the state and others already know where you are and where you go.
Do you use a Chinese phone? Where do you think your usage data is going?
Do you use the internet in the UK? Then it all goes through GCHQ.
Do you have an Amazon Echo, Google Home? Then Amazon and Google know what you say, how you say it, and please don't have one in your bedroom.
Do you use Facebook, Amazon or Google? Then those companies know your comments, what you buy, when you buy it, and a shed load of other data you GIVE them for FREE.
Do you have a smart doorbell? Who do you think knows who calls on you?
Do you have a smart fridge? Then they know what you eat, how fat you are, and will probably try and sell you medical insurance for that stroke/heart attack you're going to have.
You have already lost control of your data. You never had control anyway. There is no privacy. IF Apple stays out of your house, then to be fair, every other company has to stay out as well. They want your data.
I don't agree with it either.
PS. The use of you, your, you're is not necessarily referring to you.
Some loss of privacy is inevitable in the internet and credit card age, where data is collected by service/goods providers. However, the best thing to do is to limit what you have made available. Don't have Facebook. Don't store images with Google. Don't have an Amazon Echo or similar device. Don't have a smart doorbell. In short, try to remember a time (not that long ago) when these devices weren't here to "improve" our lives. Turn on your own lights and answer your own door. It is bizarre how people will give up their private information in exchange for a gadget or service that isn't needed. Give the inventors of this stuff credit: they have created solutions in search of a problem and managed to convince people that they can't live without these things. No need to become a cave dweller - just limit the number of privacy-stealing devices and services you have.
Myself and no one I know wants child pornography anywhere, anytime. That issue is settled in the USA. But have Apple stick their nose in this slime? NOPE. Who owns my phone? I do. I don't want granddaddy Apple looking at my photos. I own my photos. I suggest if Apple is going this route, do a deep dive in the HR dept and see how many pedophiles are hidden in your company, if any. Scan every photo of employees at Apple. See if your "house" is clean before you big brother us. My rant for the day.
As the article says, Google and Facebook have been scanning uploaded images for this type of content for a number of years. Are you equally outraged by that? Or are you just trolling Apple?
Here's the one difference between this Apple news and Google/Facebook: Google/Facebook are scanning content that is uploaded onto their servers, which in my opinion they have every right to do. They are legally responsible for the content they allow to be stored on their servers, whereas Apple is stating that not only are they going to scan the content sent through their servers, but they're also going to scan content on privately owned devices. The key difference is ownership of where the content is being stored.
The police would not serve Apple a warrant for seizure of property to seize your phone, but they would serve Google/Facebook a warrant for seizure of their servers.
What would you suggest to stop these perverted bastards generating and sharing these vile images?
"Apple also said its anti-CSAM tool will not allow the company to see or scan a user's photo album. It will only scan photos that are shared on iCloud."
Taken from the BBC article on the subject.
Apple have confirmed that scanning will be done on device, but only for images which are backed up to iCloud.
Have they? Can you point me to where they said that?
Thinking about it logistically, why would they scan millions of individual devices when they could just perform the scan on the servers they own and manage, allowing them to do it in bulk and scan images as they are uploaded?
As part of iOS 15, all images are scanned when taken or saved, making use of the neural cores of Apple silicon.
Images which contain text will immediately have that text ready to copy and paste. It’s a great feature that’s been widely praised.
The same technology will be used to generate hashes and compare them with the database of known inappropriate images.
You are correct, as stated in the article, that only images backed up to iCloud will be compared to the database of images. The scanning will take place on device; this is to maintain a degree of privacy and makes up part of the new Photos features in iOS 15.
Hey Stuart, there's a great video on YouTube by Rene Ritchie which was posted this week. It's fairly long, but it goes into all the technology Apple are introducing and how it's implemented on device and in the cloud. The video also outlines people's concerns and discusses them in an impartial and nonjudgmental way. The video is called "The Ugly Truth About Apple Privacy Panic".
https://youtu.be/Dq_hvJI76IY
Here's further evidence that it'll be done on device, taken directly from the 3rd and 4th paragraphs of the CSAM Detection section at https://www.apple.com/child-safety/
"Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image."
Effectively, they're scanning for CSAM on the device prior to uploading onto the iCloud server. Where this eventually leads is, once the machine learning algorithms have been tried in a real-world setting, Apple will proactively refuse to back up photos identified algorithmically as CSAM to the iCloud server as a means of protecting themselves from liability. They'll simply keep track of the number of safety violations, and after a certain number of violations, the images will be uploaded to a separate server for review/processing by law enforcement. While in part it's to protect children, it's also a means for Apple to protect themselves from liability and property seizure.
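For anyone trying to picture that voucher-and-threshold flow, here is a very rough sketch of the idea. It is not Apple's implementation: the real scheme uses private set intersection and threshold secret sharing so that individual vouchers reveal nothing on their own, which a plain counter like this cannot capture, and every name and number below is an assumption.

```swift
import Foundation

// Rough, illustrative sketch of the "safety voucher" plus threshold idea quoted above.
// All names and the threshold value are invented; Apple's actual scheme uses private set
// intersection and threshold secret sharing, so single vouchers reveal nothing on their own.

struct SafetyVoucher {
    let photoID: String
    let matchedKnownHash: Bool      // in the real system this result is cryptographically hidden
    let encryptedImageInfo: Data    // stand-in for the encrypted payload Apple describes
}

struct AccountReviewQueue {
    let reviewThreshold: Int        // number of matches before human review is possible
    private(set) var vouchers: [SafetyVoucher] = []

    init(reviewThreshold: Int) {
        self.reviewThreshold = reviewThreshold
    }

    /// Record a voucher uploaded alongside a photo; report whether the account
    /// has now crossed the threshold for human review.
    mutating func upload(_ voucher: SafetyVoucher) -> Bool {
        vouchers.append(voucher)
        let matches = vouchers.filter { $0.matchedKnownHash }.count
        return matches >= reviewThreshold
    }
}

var queue = AccountReviewQueue(reviewThreshold: 30)   // the threshold value is an assumption
let needsHumanReview = queue.upload(
    SafetyVoucher(photoID: "IMG_0001", matchedKnownHash: false, encryptedImageInfo: Data())
)
print(needsHumanReview)   // false: a single non-matching voucher never triggers review
```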
They aren’t doing this to protect children. They are doing it to spy on Americans.
Apple has no need to spy on Americans; their government is already doing that.
Anyone who trusts Big Tech or the Government to limit the use of your private data for only the purposes they say is naive, to say the least.
There is a lot of ignorance when it comes to law, which is understandable, given how complex law is. I note this issue brings in contract law, IP law, criminal law, and constitutional law.
In the interests of clarity, you guys may wish to interview a lawyer or legal academic.
Turns out that, as well as totally screwing up the law, no one here bothered taking the time to understand the thing Apple has rolled out.
Imagine my surprise.
Is anyone actually seriously stupid enough to have CSAM images on their phone, or in fact any device?
No issue scanning for trafficking images. However, this slippery slope is coming eventually, whether you want it or not. Next, scan for "most wanted" criminals. Scan for stolen goods. Scan if you have a beer open in the car. Scan for guns. Scan if you are attending a rally deemed unacceptable by the powers that be. Scan for drugs, etc... you name it. Basically, when all individual rights and privacy are taken away, it's always in the name of "safety." Remember "Ident-a-kid" back in the early 80's? In the name of safety, the Feds were able to build a huge database of children's fingerprints. So at the end of the day, there is NOTHING you can do about it unless you go off the grid (which is mostly impossible in the modern age). Remember, everybody is fine with book burning until someone gets in power and burns the books they like.
I have nothing to hide, and nothing to share. Stay out of my private life.
It is time we all leave these Social Justice WOKE corporations that do not understand that it is our data and that they should stop scanning through it by constantly redefining their privacy terms. It is not their job to do policing. As far as I know, Apple's mission is to squeeze as much money as they can from their users while providing a lame service to their customers.
This is borderline fascism.
I would rather be a fascist than a Troll.
That is a ridiculous comment. You appear to be both a fascist AND a troll
Paul Ass lin, Appearances can be deceptive.
Chris, mine is a personal opinion. Your reply is a personal attack. This is no way to carry a conversation and learn from other people's perspectives. Your loss.
Alan, it is your opinion, but you did troll Apple, sorry. I do agree with you that it IS our data and that corporations should stay out of our lives. How would you stem the tide of CSAM and the perverted sickos who generate and share this vile form of imagery?
Here's a hypothetical: if someone in their private digital life is plotting a kidnap/rape/assault on someone you know/love, where, in the process, would you want the law agencies to intervene?
Chris, surveilling private citizens is not the answer.
Amongst other rights, we have the right to privacy, and I don't care what people may or may not be plotting. Sometimes it's just a fantasy and it will never be acted upon.
That said, there is no perfect answer to your hypothetical question. I certainly don't think that surveilling the private lives of the majority of the (honest) population is the answer.
Think of the ramifications of said surveillance; I can collect dirt on you and blackmail you into silence and compliance (or whatever I want). And that, to me, is simply unacceptable.
Think of the Stasi in East Germany. That is a classic example of surveillance of private citizens used to control the people.
BTW - I used to have all Apple devices until I got fed up with their practices and attitudes, so I am speaking from personal experience and not just trolling Apple. They are a for-profit corporation and they should not engage in surveillance (hard stop).
Paul, I am certainly NOT advocating surveillance. Quite the contrary, I agree with you. It is a dilemma to which I have no answer. I am sure you are aware of the TV series of a few years ago, Person of Interest. I enjoyed the series very much, I don't want that future either. There are a lot of questions to be asked and a lot of answers to be found, if indeed, they can be.
Chris, you are so very very clever! In 67 years on this planet no one has ever made jest of my surname yet here you are. Such a wit!
Sorry, I have been practicing my wit. Cheeky I know. No one has ever made jest of your surname! You shock me! It could have been worse, you could be Indian and have the name Dikshit.
Scanning my phone photos? They're going to wonder why I was taking pictures of a kitchen faucet at Lowe's. After all, that's about all I use my phone camera for. ;-)
I hope the first innocent person who finds the police at his door because of a false positive by this infallible AI sues Apple for enough money to cause them some real pain.
The police will not burst into someone's house at the behest of Apple. Well, not in the Western world, anyway. Apple will pass any relevant information to the authorities, who will investigate and then arrest if required. In the UK, whilst there is no legal requirement to report a crime, there is, according to the police, a moral duty to do so. You could take it that Apple (and everyone else) has a moral duty to report illegal activities to the authorities. I think your country has much more important issues of innocent people being shot by the police than any future Apple-related issues.
Apple may have a moral duty to report illegal activities that it becomes aware of in the course of normal business. It has no "moral duty" to monitor the users of its products for signs of criminal activity.
I have a moral duty to report a neighbor if I suspect child abuse. I have no "duty" to walk around the neighborhood at night looking in windows.
While I share the concern that a capability like this can be co-opted by a malicious actor, Apple's chosen implementation is about as secure as one can get from a privacy standpoint short of doing absolutely nothing. This isn't AI analyzing the content of your images to try to interpret what it's seeing and reporting it to the police if it thinks you have something. This is comparing image hashes to a database of a specific set of known illegal images. Even then, you would have to have a collection of that specific set of known illegal images in order to be flagged for review. At that point, a human being would manually review those images to check if they match, and only if they are verified would Apple contact the National Center for Missing and Exploited Children (not the police, as is often being reported). Furthermore, this capability only works if you're connecting your device photo library to iCloud in the first place. If you're not connected to iCloud, nothing gets scanned.
There are always going to be concerns with something like this, because if you can search for one thing, it's not impossible to change the parameters to search for something else. Because of this, many people are rightly concerned about governments requesting changes to search for things they want to find. To this, Apple has already responded that they are not going to modify this to appease a government, and I don't really see a reason not to believe them, since all of their past behavior supports that general attitude. They've held the line against governments before, so there's really no reason to think that they won't continue to do so. Of course, things can always change in the future, but while I would prefer that no scanning function exist at all, I do think that this is about the best implementation I can think of if one is going to exist.
Finally someone who has actually read up on the process and used some common sense to assess the implications and what it actually means.
That's how it begins. Where it will be 10 years from now is anyone's guess.
Even Orwell didn't predict this. He believed the government would be monitoring all of us, not big business. Now it seems the world's most profitable company is about to monitor the people. That, by definition, is fascism. I hope Apple isn't monitoring this thread of comments as I type on my iMac.