Recently, Apple announced new child safety features coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. While the public generally seems pleased that Apple is taking steps to protect children online, there have been concerns about user privacy. In a recent interview, Erik Neuenschwander, head of privacy at Apple, clarified a few points.
In this video, Sam from iUpdate breaks down the important new information from Apple regarding their CSAM detection, why they are implementing it now, and why they chose to use on-device scanning in addition to server-side checks.
As previously discussed, this topic is polarizing. While no one is advocating the production or sharing of inappropriate images of children, a number of people claim that these new features go against Apple's previous position that "Privacy is a Human Right."
My own personal concerns about the way Apple makes compromises with some governments in order to trade in their territories have been somewhat eased, at least in the short-term, as these features will initially only be rolling out to the USA.
Apple previously conceded to the FBI and agreed not to end-to-end encrypt all iCloud backups, as reported by Reuters. This amounts to a de facto backdoor to iCloud backups: law enforcement can request access to your data if you're suspected of a crime. Apple also changed the way it stores and manages data in China at the request of the Chinese government.
There's certainly an argument to be made that if you store your data on Apple's servers, why shouldn't they have access to it? I've also seen more than one person suggest that if you have nothing to hide, then it won't be an issue.
Both of these points are good starting topics for any discussion about privacy rights in the modern world, but far too broad to give adequate attention to in this short follow-up to Apple's latest announcements.
What we know for sure is that iOS will be scanning your images on your device. iOS already scans photos for objects and text, enabling contextual searches based on image content and letting you copy text out of images. As part of the new features, Apple will compute hashes of your photos and compare them against a database of hashes of known CSAM images. Flagged matches can be manually reviewed and reported to the authorities when appropriate. Even though the scanning happens on the device, Apple will only have access to data backed up to iCloud.
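To make the idea concrete, here is a minimal sketch of what hash-based matching looks like in principle. This is purely illustrative: the function names and placeholder image bytes are hypothetical, and Apple's actual system uses a perceptual hash ("NeuralHash") plus cryptographic techniques like private set intersection and threshold secret sharing, not a plain SHA-256 lookup like this.

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Stand-in for a perceptual hash. A real perceptual hash tolerates
    # resizing and re-encoding of the image; SHA-256 does not.
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of hashes of known images (placeholder contents).
known_hashes = {image_hash(b"known-image-bytes")}

def flag_if_known(photo: bytes) -> bool:
    # On-device check: hash the photo and compare against the database.
    # Only a match is flagged; the photo itself is not inspected here.
    return image_hash(photo) in known_hashes

print(flag_if_known(b"known-image-bytes"))    # True: hash is in the database
print(flag_if_known(b"holiday-photo-bytes"))  # False: unknown image
```

The key design point is that the device never needs to "look at" the photo's content beyond computing its hash, and only matches against the known-image database are ever surfaced for review.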
My own personal concerns about the precedent this technology sets haven't been eased much. I am very excited about the quality-of-life improvements that on-device image scanning will bring, and it goes without saying that I don't have any questionable or illegal images on my device. So, I'm safe, for now. Right?
Does this interview change your mind about the new features? Will you continue to use iCloud? Would you object if Adobe started to do the same with your Creative Cloud library?
Let me know your thoughts in the comments. Remember to be nice, even if someone disagrees with you.