Apple has made a lot of noise with its camera-festooned iPhone 11 models, but beyond the lenses and hardware lies a lot of interesting software. It's arguably that software that's driving the biggest changes in photography today.
Some of that software isn’t available yet, such as “Deep Fusion,” an AI-driven feature that Apple’s marketing chief described as “computational mad science,” or marketing speak for “I’m not really sure what this is yet, but it’s got a cool name.” But one technology that is available right now is “semantic rendering,” which, according to an article from Digital Trends, is essentially an editor in your phone touching up each individual part of your image.
To put it simply: whereas older cameras couldn’t distinguish between subjects in an image and would thus apply universal color edits to a photo, the iPhone can not only pick out a human but can figure out where the eyes, the hair, and other parts of the face are, adjusting even parts of parts for optimal exposure. Since the human eye can see more dynamic range than even the best smartphone or dedicated camera can capture, this software could be a huge boon to photographers shooting people and other subjects suited to this technique.
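To get a feel for the idea, here is a minimal sketch of per-region editing, assuming the camera has already produced segmentation masks. This is an invented illustration, not Apple's implementation: the `semantic_render` function, the region names, and the exposure multipliers are all hypothetical.

```python
import numpy as np

def semantic_render(image, masks, adjustments):
    """Apply a different exposure tweak to each labeled region.

    image: float array (H, W) with values in [0, 1]
    masks: dict mapping a region name to a boolean (H, W) mask
    adjustments: dict mapping a region name to an exposure multiplier
    """
    out = image.copy()
    for region, mask in masks.items():
        gain = adjustments.get(region, 1.0)  # leave unlabeled regions alone
        out[mask] = np.clip(out[mask] * gain, 0.0, 1.0)
    return out

# Toy 2x2 "photo": the left column is a dark face, the right a bright sky.
image = np.array([[0.2, 0.9],
                  [0.2, 0.9]])
masks = {
    "face": np.array([[True, False], [True, False]]),
    "sky":  np.array([[False, True], [False, True]]),
}
# Brighten the face and pull back the sky, per region, rather than
# applying one universal edit to the whole frame.
result = semantic_render(image, masks, {"face": 1.5, "sky": 0.9})
```

The key difference from a global edit is that each mask gets its own adjustment, which is why a segmentation mistake (say, mislabeling a silhouette as an underexposed face) would change the result in ways a global curve never could.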
One disappointing thing with all of Apple’s software improvements, however, is that the user isn’t in charge of them, at least out of the box. For instance, Apple’s “Night Mode” activates automatically whether you want it or not, unlike Google’s Android equivalent, “Night Sight,” which is a toggle on phones such as the Pixel 3a XL. It’s possible to envision a scenario where semantic rendering could get confused and try to lift shadows on a photo that might intentionally be a silhouette.
Likewise, Apple’s phones lock out other advanced features, such as raw files. To get these, you need a third-party camera app such as ProCam, but it’s not clear whether options such as Deep Fusion, Night Mode, and semantic rendering will be available to third-party apps through Apple’s camera SDK. It’s an open question whether any of these adjustments could be applied to a raw file anyway; at that point, it would cease to be a raw file.
The other factor to consider is whose skin the photos are optimized for. It's well known that skin-tone rendering for film was calibrated early on using Kodak's "Shirley cards," which pictured a white woman, meaning that film stocks didn't handle darker skin tones well. If the programming doesn't account for a more diverse range of people, this could also mean the feature is useless for some ethnicities.
It may not matter much to end users anyway. As of October, a look at Flickr’s Camera Finder shows that the top cameras for the entire photo community aren’t cameras at all. They are the Apple iPhone 6, iPhone 6s, iPhone 5s, iPhone 7, and iPhone 7 Plus. People just want to point their phones at things and take pictures, it seems, and not worry about the rest. The latest technologies from Apple seem poised to help them do just that.