Apple announced this week that it will start rolling out new child safety features later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. Apple describes the program as ambitious and says that protecting children is an important responsibility.
In this video, iCave Dave outlines the new child safety features which will start to appear later this year with iOS 15. Dave gives a good breakdown of how the new features will work and how well Apple is handling such a sensitive issue. There are three new ways that Apple will aim to protect children online.
Safety in Messages
The Messages features will not be activated by default on all devices; parents will need to opt in for children's devices that are set up as part of a family group. This is what Apple has to say about the child protection functionality coming to the Messages app as part of iOS 15:
The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos. When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.
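To make that flow concrete, here is a minimal sketch in Swift of the decision logic Apple describes. It is purely illustrative: Apple has not published its implementation, and every name here (ExplicitImageAction, handleIncomingImage, and the parameters) is invented. The on-device classification of a photo as sexually explicit is done by machine learning; this sketch only shows what happens once that judgment exists.

```swift
// Hypothetical sketch of the Messages flow described above; not Apple's code.
enum ExplicitImageAction {
    case showNormally
    case blurAndWarn(notifyParentsOnView: Bool)
}

func handleIncomingImage(isSexuallyExplicit: Bool,
                         recipientIsChild: Bool,
                         parentalNotificationsEnabled: Bool) -> ExplicitImageAction {
    // The feature is opt-in and applies only to child accounts in a family group.
    guard recipientIsChild, isSexuallyExplicit else { return .showNormally }
    // The photo is blurred, the child is warned and offered resources, and if
    // notifications are enabled, viewing the photo anyway alerts the parents.
    return .blurAndWarn(notifyParentsOnView: parentalNotificationsEnabled)
}
```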
New Guidance in Siri and Search
There will also be Siri warnings in place if a user tries to search for Child Sexual Abuse Material (CSAM). This is how Apple says these features will work:
Apple is also expanding guidance in Siri and Search by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.
Siri and Search are also being updated to intervene when users perform searches for queries related to CSAM. These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.
I think these features sound like an excellent way to help protect children online.
CSAM Detection
Finally, the most contentious feature Apple is rolling out involves the on-device scanning of all images before they are backed up to your iCloud account. The images remain encrypted, so Apple still can't see them; they will simply be flagged if markers on a user's image match the markers held in the database at the National Center for Missing and Exploited Children. Here's what Apple has to say on this feature:
New technology in iOS and iPadOS will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC).
Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
This innovative new technology allows Apple to provide valuable and actionable information to NCMEC and law enforcement regarding the proliferation of known CSAM. And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM.
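For readers who want a concrete picture of the matching step described above, here is a minimal Swift sketch of on-device hash matching. It is emphatically not Apple's implementation: the real system uses NeuralHash (a perceptual hash designed to survive resizing and recompression) and a cryptographically blinded database, neither of which is public, so an ordinary cryptographic hash, a plain set, and an invented threshold stand in here.

```swift
import Foundation
import CryptoKit

// Illustrative stand-in for a perceptual hash. SHA-256 is used only to show
// the shape of the flow; unlike NeuralHash, it breaks on any re-encoding.
func imageFingerprint(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Placeholder for the NCMEC-derived hash table that ships with the OS.
// On a real device, this is transformed into an unreadable (blinded) set.
let knownImageHashes: Set<String> = []

// Matching happens locally, before an image is uploaded to iCloud Photos.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownImageHashes.contains(imageFingerprint(imageData))
}

// Per Apple, a single match reveals nothing to anyone; only an account whose
// match count passes a threshold becomes readable for human review.
let reviewThreshold = 30  // Invented for illustration; not Apple's figure.

func accountFlaggedForReview(matchCount: Int) -> Bool {
    matchCount >= reviewThreshold
}
```

The detail the sketch is meant to surface is the one Apple leans on: the device only tests membership in a fixed list of known images, so nothing is "looked at," and no novel photo can produce a match.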
Concerns Over This Technology
It would be hard for anyone to fault Apple for making changes to protect children online and report images of CSAM. I completely agree with iCave Dave on the handling of these types of images and content of that nature. It seems as though Apple is handling the protection of children in a considered and appropriate way.
Personally, I’m inclined to agree with some critics of the image-scanning technology and the precedent it sets. While we would all agree that the production and sharing of CSAM images is simply wrong, the issue with scanning images is deciding when reporting users is appropriate: where should the line be drawn? Should images of drug use be flagged? Some would say they absolutely should. What about terrorism? Would that be defined by the government of each territory? In the West, we’re probably okay, but other parts of the world might have different definitions of “terrorist.” Who would decide what should be reported, and to whom?
I think we all agree that the types of images being discussed in this video, and specifically mentioned by Apple, are bad; perpetrators should be flagged and reported, and the world would be a better place if these types of images were not being produced or shared. I have yet to see anyone arguing in defense of CSAM images. However, I do believe there is a discussion to be had around any further use of this technology. What about countries where homosexuality is illegal? Is it a possible future outcome that images of consenting adults doing something their government doesn’t approve of get flagged and reported? This might seem like an unlikely possibility, but the precedent this technology sets makes it a possible eventuality.
Would governments with questionable ethics in the future be able to leverage Apple into flagging images they dictate in order to keep selling iPhones in that country? I believe, with how focused Apple currently is on customers and their privacy, it’s unlikely to be an issue anytime soon.
Google and Facebook have been scanning uploaded images for this type of content for a number of years. Apple is now going to do it on the device. Does this detract from Apple's previous statement that "privacy is a human right"?
A cynic might say that this technology is being introduced in the interest of protecting children because that’s a very difficult subject for anyone to disagree with.
What are your thoughts on Apple scanning users' images? Are critics of the technology overreacting? Should a service provider be able to check anything stored on their servers? How would you feel if Adobe started scanning images on Creative Cloud or your Lightroom library for specific image types?
Let me know in the comments, but please remember to be polite, even if you disagree with someone’s point of view.
Seems to me there is ample room for abuse, but something needs to be done about producers of child pornography. I am concerned about those who genuinely shoot art.
My concern is the precedent this technology sets. If we concede that it’s acceptable for a digital service provider to check and report images on our devices, should we be concerned about who decides what we are and aren’t allowed to have images of?
Couldn't care less; if it helps put an end to child abuse, then carry on.
That’s an interesting statement. Hypothetically speaking, would you agree that it’s acceptable to infringe the rights of others if it is in the interest of child protection?
There is nothing on my phone that would attract anyone's attention. If you have nothing to hide, what exactly is being infringed? It’s the same situation as people who moan about CCTV being installed everywhere: who cares? I’m not doing anything wrong, so there's nothing to worry about.
In this case, aren’t they merely trying to match the binary codes of existing images that are stored in a database? And only when there is a match (much like a fingerprint) is someone tasked with reviewing it before sending it to the authorities? I literally don’t see a problem with it.
The precedent set by the act of scanning images on-device seems to be the point of contention. We’re more fortunate in the Western world that our governments tend to be more tolerant. The same can’t be said for many other nations around the world.
The ability to scan and report content opens up the possibility of scanning devices for images of any number of things.
That’s up to Apple, where they choose to distribute the tech, isn’t it? I’m fairly certain working with the Saudi Arabian authorities won’t be top of the priority list, for example.
China and Russia might be more interested in what’s on their citizens’ devices.
Again, it’s up to Apple to choose whether to roll the tech out to those markets. And bear in mind, it’s still Apple who have control of this process, and still Apple who choose to send the info to the authorities. Are you suggesting they are going to start selling this to government agencies to spy on citizens? I doubt it.
I’m suggesting that it might be possible for governments to insist Apple give them access in return for the ability to trade in their country. The existence of this technology on-device is of concern and is a strange direction from a company that previously stated that “privacy is a human right.”
I’m interested to see how this rolls out and how it’s implemented over the coming years.
Privacy is a human right that, in my opinion, you lose the minute you commit a serious crime, of which child abuse is the absolute worst.
I also think Apple will happily pull their products out of a country if that level of bribery becomes apparent. There are rumours they are pulling out of the U.K. market for far less.
So, if you lose privacy when you commit a serious crime, why bother to have a trial? Oh yeah - it's called the rule of law. Damned inconvenient to have rights and constitutional protections. Minority Report, here we come.
This makes no sense?
What is it that confuses you? Is it the sarcasm? Is it the irony of a company becoming the world's largest police informant? Or is it the possibility that the software may have flaws and incorrectly report to the police? Maybe it's that you'd be happy with Apple trawling through your phone, because if you have nothing to hide then you don't need privacy. (Sarcasm again.)
You’re making a whole lot of assumptions there that aren’t based on fact, and yes I have zero issue with this scan taking place on my devices.
Again, you haven’t even read through the process. You have clearly just read the title of the article and decided to make comments that, in your head, seem somehow correct.
I’m done dealing with yet another moron on this website.
Ah, now the ad hominem rebuttal. Ingenious. I'm glad you like having your phone scanned. Some people - people with nothing to hide - will still object. And yes, I did read and understand the article.
China and Russia already know what is on the phones of their citizens.
In reply to "In this case, aren’t they merely trying to match the binary codes of existing images that are stored in a database?".
Yes they are, which I think makes the whole thing even more futile. It sets a precedent for the public allowing their devices to be scanned, and it's not even going to be that effective in actually stopping child abuse.
If images are being matched to images that are already in a database, then it means the acts depicted in the images in the database have already been committed. If someone is actually abusing a child, and sharing the images, those images wouldn't even be flagged.
It's a way of combating child abuse, but I don't think it's a particularly effective one.
And what's next? Cross referencing everyone's touch ID with a criminal database to solve open cases?
If it stops just one person, it has done its job as far as I’m concerned: another scumbag off the streets.
If a cop says he is going to search your car because he smells a joint, and in the process sees that you are not wearing shoes while driving, he can cite you for that (actually, I can't remember if that is illegal). When you allow any door to open with law enforcement, they can search for whatever. Maybe the algorithm doesn't find child porn but recognizes that you are selling commercially without paying business tax. What if suspicious pics are brought to court and you are exonerated, but the dates of the pictures, which were recorded into evidence, show that you were not working during your employer's work hours, and you are fired?
I suspect your answer is that you get what you deserve, but that's not how the Constitution works; the 4th Amendment protects you from unreasonable search and seizure unless there is a reasonable suspicion that a crime has been committed. Scanning everyone's pictures to identify those that may be illegal is too broad.
We get it, Stuart. You don't like pedophiles. None of us do either. But when you say that you are willing to open everyone's door to the cops so that you can catch one bad guy, you create a slippery slope. It's like saying I will search everyone's pockets on this street corner because I think that one of you stole an apple.
Catching bad guys is hard and many times it seems the bad guys have the advantage. As the father of two daughters I deeply desire these predators off the street too. But not at the expense of civil liberties.
What makes this a unique situation is that Apple is providing a service, and the participant willingly gives up his/her rights in order to receive its value. Therefore, Apple is likely within their rights, just as some pic-sharing apps said that submitted pics could be used by them commercially, although this would seem like copyright infringement (eventually, some use was considered improper).
You are correct that Apple is taking a risk with consumer backlash, but I suspect it won't hurt them, because more people will be like you, thinking with basic emotions rather than understanding what rights they have given away. We have already given away our privacy with Facebook (or logging into Fstoppers). Or have we just given in to a nanny state?
It would seem that there are a lot of people online who are concerned that that's the likely eventuality of any form of covert surveillance.
That’s a very good point. I hadn’t considered parents taking innocent images of their own children. At what age is a nude child indecent? And who decides what is a parent documenting their child growing up and what is more nefarious?
If those parents’ photos of their kids in bathing suits aren’t stored in a database of already seized images that are needed for a match, they won’t be getting a knock on the door.
It seems these people commenting are only reading half the info then making assumptions.
No one has an issue with what Apple are doing with regard to child protection. The technology and its potential future misuse or misunderstanding are the concern.
Anyone concerned by this has a serious lack of understanding around just how much we as citizens can be surveilled by the authorities if they need to. It takes the police what, 20 minutes, to obtain phone records after a car accident to see if someone was texting whilst driving?
There is no privacy when it comes to technology, so you have to either suck it up or stop using it if it bothers you. Me personally, I don’t care.
Not in the Western World.
Apple have stated that they will have images reviewed by humans and reported to the authorities as appropriate.
And who at Apple gets the fun job of deciding whether it's a family pic of the kids on the beach in bathing suits or child porn? If it's child porn, then who wants a job at Apple looking at it? Sounds like a pervert's dream job has just opened at Apple. And when did Apple suddenly become part of the police department? This is a terrible idea on so many levels.
You need to read the article before making uninformed comments. But hey, uninformed comments, it’s the internet way.
I don't like the idea of Apple controlling my data. Not that I have images to hide, but there's no guarantee the software will be without mistakes. Some say Apple is using people to verify matches; I didn't find confirmation of that. They put the odds of it going wrong at 1 in 1,000,000,000 to the power of two; that's next to nothing, but it's not zero. And when it goes wrong? You lose access to your data, without being able to know what's going on and with no way of recovering it. And what's happening with Apple may happen with Amazon, Google, Microsoft, and many others in the cloud. It's already going on!
It'll be AI-driven, without human control, and mistakes and bugs will happen.
When I put data in the cloud, it'll be double encrypted; that's for certain. No company should act as police investigator, judge, or prosecutor (Apple is taking on all of those roles)! It's a simple rule, and I'd like to keep it that way. So Apple stays out of my house.
Do I have something to hide? Not that I'm aware of. But I'd rather not be in a defenseless position, that's for sure. Every company deserves your maximum distrust; that's the first lesson in economics class. And it's one that many won't learn.
A fair summary of the main criticisms of this announcement and the principal concerns. Thank you.
I think it is unfair to target Apple in this manner. If you read the article, you'll know that Apple isn't playing the role of police investigator, judge, or prosecutor. Apple and every other company have the right to prohibit users who violate their user agreement. Apple and every other company have the right to pass on such degrading images to law enforcement. Facebook and Google have already been scanning your photos for years; are you bothered about that too?
Do you use the internet? Then the state and others know what sites you visit.
Do you use a smartphone? Then the state and others already know where you are and where you go.
Do you use a Chinese phone? Where do you think your usage data is going?
Do you use the internet in the UK? Then it all goes through GCHQ.
Do you have an Amazon Echo, Google Home? Then Amazon and Google know what you say, how you say it, and please don't have one in your bedroom.
Do you use Facebook, Amazon or Google? Then those companies know your comments, what you buy, when you buy it, and a shed load of other data you GIVE them for FREE.
Do you have a smart doorbell? Who do you think knows who calls on you?
Do you have a smart fridge? Then they know what you eat and how fat you are, and they will probably try to sell you medical insurance for that stroke/heart attack you're going to have.
You have already lost control of your data. You never had control anyway. There is no privacy. If Apple stays out of your house, then, to be fair, every other company has to stay out as well. They want your data.
I don't agree with it either.
PS. The use of you, your, you're is not necessarily referring to you.
Some loss of privacy is inevitable in the internet and credit card age, where data is collected by service/goods providers. However, the best thing to do is to limit what you have made available. Don't have Facebook. Don't store images with Google. Don't have an Amazon Echo or similar device. Don't have a smart doorbell. In short, try to remember a time (not that long ago) when these devices weren't here to "improve" our lives. Turn on your own lights and answer your own door. It is bizarre how people will give up their private information in exchange for a gadget or service that isn't needed. Give the inventors of this stuff credit: they have created solutions in search of a problem and managed to convince people that they can't live without these things. No need to become a cave dweller - just limit the number of privacy-stealing devices and services you have.
Neither I nor anyone I know wants child pornography anywhere, anytime. That issue is settled in the USA. But have Apple stick their nose in this slime? NOPE. Who owns my phone? I do. I don't want granddaddy Apple looking at my photos. I own my photos. I suggest that if Apple is going this route, it do a deep dive in its HR dept and see how many pedophiles are hidden in the company, if any. Scan every photo of employees at Apple. See if your "house" is clean before you Big Brother us. My rant for the day.
As the article says, Google and Facebook have been scanning uploaded images for this type of content for a number of years. Are you equally outraged by that? Or are you just trolling Apple?
Here's the one difference between this Apple news and Google/Facebook: Google/Facebook are scanning content that is uploaded onto their servers, which in my opinion they have every right to do. They are legally responsible for the content they allow to be stored on their servers, whereas Apple is stating that not only are they going to scan the content sent through their servers, but they're also going to scan content on privately owned devices. The key difference is the ownership of where the content is being stored.
The police would not serve Apple a warrant for seizure of property to seize your phone, but they would serve Google/Facebook a warrant for seizure of their servers.
What would you suggest to stop these perverted bastards generating and sharing these vile images?
"Apple also said its anti-CSAM tool will not allow the company to see or scan a user's photo album. It will only scan photos that are shared on iCloud."
Taken from the BBC article on the subject.
Apple have confirmed that scanning will be done on-device, but only for images that are backed up to iCloud.