What Is Apple's Semantic Rendering and How Does It Affect iPhone 11 Photos?

Apple has made a lot of noise with its camera-festooned iPhone 11 models, but beyond the lenses and hardware is a lot of interesting software. It's arguably that software that's driving the biggest changes in photography today.

Some of that software isn't available yet, such as "Deep Fusion," the AI feature Apple's marketing chief described as "computational mad science," or marketing speak for "I'm not really sure what this is yet, but it's got a cool name." But one technology that is available right now is "semantic rendering," which, according to an article from Digital Trends, is essentially an editor in your phone touching up each individual part of your image.

To put it simply, whereas older cameras couldn't distinguish between subjects in an image and would thus apply universal color edits to a photo, the iPhone can not only pick out a human but can figure out where the eyes, the hair, and other parts of the face are, adjusting even parts of parts for optimal exposure. Since the human eye can see more dynamic range than even the best smartphone or dedicated camera, this software could be a huge boon to photographers shooting people and other subjects the technique is optimized for.
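Apple hasn't published its actual pipeline, but the core idea of region-specific adjustment can be sketched in a few lines. In this hypothetical illustration (the function name, mask labels, and gain values are all my own, not Apple's), each semantic mask acts as a per-pixel blend weight for a separate exposure gain:

```python
import numpy as np

def semantic_tone_map(image, masks, gains):
    """Apply a different exposure gain to each semantically labeled
    region, using each mask as a per-pixel blend weight.
    image: HxWx3 float array in [0, 1]; masks: dict of HxW float arrays."""
    out = image.copy()
    for label, mask in masks.items():
        gain = gains.get(label, 1.0)
        boosted = np.clip(image * gain, 0.0, 1.0)
        # Where the mask is 1, use the gained value; where 0, keep the original.
        out = out * (1.0 - mask[..., None]) + boosted * mask[..., None]
    return np.clip(out, 0.0, 1.0)

# Toy 2x2 "image": left column is a face in shadow, right column is background.
image = np.full((2, 2, 3), 0.2)
face_mask = np.array([[1.0, 0.0],
                      [1.0, 0.0]])
result = semantic_tone_map(image, {"face": face_mask}, {"face": 2.0})
# Face pixels are brightened to 0.4; background pixels stay at 0.2.
```

A real implementation would use smooth-edged segmentation mattes rather than hard masks, and a full tone curve rather than a flat gain, but the blend-by-mask structure is the same.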

One disappointing thing with all of Apple's software improvements, however, is that the user isn't in charge of them, at least out of the box. For instance, Apple's "Night Mode" activates automatically whether you want it or not, unlike Google's Android equivalent, "Night Sight," which is a toggle on phones such as the Pixel 3a XL. It's easy to envision a scenario where semantic rendering gets confused and lifts the shadows on a photo that is intentionally a silhouette.

Likewise, Apple's phones lock out other advanced features, such as raw files. To get these, you need a third-party camera app, such as ProCam, but it's not clear whether options such as Deep Fusion, Night Mode, and semantic rendering will be available to third-party apps through Apple's camera SDK. It's an open question whether any of these changes could be applied to a raw file anyway, at which point it would cease to be a raw file.

The iPhone 11 Pro seemed to have a bit of difficulty with this subject in the shadows in this photo from Apple's Press Kit.

The other factor to consider is whose skin the photos are optimized for. It's well known that skin-tone rendering for film was set early on by Kodak's "Shirley cards," which featured a white woman, meaning films didn't render darker skin tones well. If the programming doesn't account for a more diverse range of people, this could also mean the feature is useless for some ethnicities.

It may not matter much to end users anyway. As of October, a look at Flickr's Camera Finder shows that the top cameras for the entire photo community aren't cameras at all. They are the Apple iPhone 6, iPhone 6s, iPhone 5s, iPhone 7, and iPhone 7 Plus. People just want to point their phones at things and take pictures, it seems, and not worry about the rest. The latest technologies from Apple seem poised to help them do just that.

Wasim Ahmad is an assistant teaching professor teaching journalism at Quinnipiac University. He's worked at newspapers in Minnesota, Florida and upstate New York, and has previously taught multimedia journalism at Stony Brook University and Syracuse University. He's also worked as a technical specialist at Canon USA for Still/Cinema EOS cameras.


Software in camera and in Photoshop and Lightroom will change everything in photo. The US government is at least 10 years ahead in AI image enhancement.
Manufacturers need to catch up and incorporate the cell phone aggressiveness in their larger cameras and editing.

I'm confused

I have the iPhone 11 Pro and only upgraded because of the camera capabilities. I had the iPhone X and after doing in-store comparisons, I was sold.
I probably have a differing opinion from most on this site when it comes to this, but I have a Sony a99II if I want to take more control over my photos. My phone is for snapshots, and it produces very high-quality images for those types of photos, but there is rarely a case where I need to make fine-tuned adjustments when I'm trying to get a picture of my dog mid-yawn. An advanced "auto" is perfect for that type of stuff. I'm not looking for, or expecting, my phone to take professional-level images, but it is more than capable in the right hands / situations.

Love my iPhone 7

And all that is great, but would it hurt in some way to give users a choice?

No, but it's like buying a hammer and wanting the choice to turn it into a drill. It's just not the right tool if you want manual control over your settings. It's made to be easy, fast, auto, and high quality. That ticks the boxes for 99% of people taking photos.

"The other factor to consider is whose skin the photos are optimized for." I don't think it's a matter of skin tone, it's simply a dark/light (highlights/shadows - aka dynamic range) thing.

So is an iPhone racist?