Apple’s New Depth Control Is Much More Important Than Merely Tweaking Bokeh

With the iPhone XS, XS Max, and XR, Apple introduced the ability to control the amount of background blur in its Portrait Mode photos. What seems like a minor tweak actually takes what was a gimmick and turns it into a pro feature.

Since Apple introduced dual cameras with the iPhone 7 Plus, Portrait Mode has always been a bit of a gimmick. To the untrained eye (and that is most eyes), it probably produced photos that passed as professional shots. But for most of the photographers reading here on Fstoppers, the feature too often failed to live up to the hype, giving itself away by blurring stray hairs and the edges of faces. Occasionally, you might get a really fantastic result where everything was just right. But that was rare.

Now, the iPhone XR brings this feature without the need for dual lenses, but that’s not the only amazing thing about Portrait Mode across this year’s lineup of super-bling phones. The new depth adjustment, which Apple calls Depth Control, turns Portrait Mode into something genuinely impressive: what it always should have been.

While some will appreciate simply being able to turn off or lessen the effect in post, the true value of Depth Control was on display in Apple’s keynote earlier this month, where the company showed off the new effect for the first time. As Apple’s Phil Schiller dialed the effect in and pushed up the intensity, everything actually looked pretty darn good. The transition was smooth, the blur looked real, and it was all done in true Apple fashion until the last second. Right at the end, the Portrait Mode algorithm pulled the edge of the model’s shirt, otherwise perfectly crisp against the background, completely out of focus. This is exactly the kind of failure we’re used to in any company’s depth-effect features. Watch below on the way back down from f/16 to f/1.4.

But the important part is what happened right before that light shirt edge lifted off the model’s shoulder and smudged into the background. Because right before that happened, the background looked gorgeous. And right before that happened, the model looked sharp. And right before that happened, you could now stop increasing the intensity of the effect. Look at the examples side by side below, where the left image at f/1.4 shows that glaring sign of Portrait Mode in use, while the second image on the right, with the simulated f-stop backed off by just half a stop, keeps that shirt edge crisp along the right side of the frame.

This is what Apple’s new Depth Control gives users: the ability to create truly realistic Portrait Mode photos. And this is why the little things really do matter. The feature is the next best thing (and probably better, for ease of use) to being able to mask out an area for the algorithm to ignore. Having that power in a pocket-sized camera that’s always on you is incredible, now that it actually works. Thank you, Apple.
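To make the idea concrete, here is a minimal toy sketch of how an adjustable depth effect like this can work. This is my own illustration, not Apple’s actual pipeline: the function names, the crude box blur, and the single `intensity` slider are all assumptions. The core idea is to blend a sharp frame with a blurred copy, weighting the blur by each pixel’s distance from the focal plane, so that backing off the slider (as with that half-stop adjustment) pulls problem edges back toward sharpness.

```python
import numpy as np

def box_blur(image, radius):
    """Naive box blur: a stand-in for the lens-shaped bokeh kernels a
    real portrait pipeline would use."""
    h, w, _ = image.shape
    out = np.zeros_like(image)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            out[y, x] = image[y0:y1, x0:x1].mean(axis=(0, 1))
    return out

def depth_control(image, depth, focus_depth, intensity):
    """Blend a sharp image with a blurred copy, weighting the blur by
    each pixel's distance from the focal plane. `intensity` plays the
    role of the aperture slider: 0.0 leaves the whole frame sharp,
    1.0 applies full blur to the most out-of-focus regions."""
    blurred = box_blur(image, radius=3)
    # Per-pixel blur weight: farther from the focal plane -> more blur.
    weight = np.clip(np.abs(depth - focus_depth) * intensity, 0.0, 1.0)
    return image * (1.0 - weight[..., None]) + blurred * weight[..., None]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((16, 16, 3))
    depth = np.ones((16, 16))   # background sits at depth 1.0
    depth[6:10, 6:10] = 0.0     # subject sits on the focal plane
    sharp = depth_control(img, depth, focus_depth=0.0, intensity=0.0)
    print(np.allclose(sharp, img))  # prints True: zero intensity changes nothing
```

Because the weight is continuous, the slider can stop at any point short of the artifact, which is exactly the escape hatch Depth Control now offers.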


Adam works mostly across California on all things photography and art. He can be found at the best local coffee shops, at home scanning film in for hours, or out and about shooting his next assignment. Want to talk about gear? Want to work on a project together? Have an idea for Fstoppers? Get in touch! And, check out film rentals!


How is this any different than the adjustable "bokeh" found in Huawei phones since the P10?

They did it best? Maybe with more fanfare :-)

My next phone is the P20 Pro. I saw travel photos taken with it, and they are absolutely amazing.

I had not thought of it this way -- the ability to creep up to the maximum bokeh you can get without any artifacts (and back off one notch once any artifact reveals itself).

I'm still in love with my fingerprint reader because I access my phone 90% of the time while it is sitting on my desk, so I'm stuck on the 8 Plus until I cave in 2021 and get Face ID.

The adjustment is really not bad at all. I have had the phone now for three days and I have made the switch. I actually like the new method.

I don't want to get too OT on this, but you still have to lift the phone to activate it with your face, right? I like leaving it on my desk which I can do with the fingerprint. Still I know I'll have to switch some time so I'm expecting it.

I agree with you. I prefer touch ID!

What would be neat is if they allow people to create bokeh effects. Imagine being able to recreate the feel of some vintage lenses.

Samsung has had this after-focus bokeh feature for a fairly long while...

They may have had it for a while, but Samsung is not as good at marketing as Apple is. Apple is never first but still does it better.

Sorry, but in which world, or parallel universe, will you get glows around her yellow shirt with real bokeh, like in the f/1.4 example?

Exactly. But that's why being able to adjust the effect is important. Obviously those algorithms aren't infallible. But it's great to finally see a way to adjust it on iOS.

This might work okay for photos, but not for video. DSLRs and mirrorless cameras still have a critical upper hand when it comes to video.

DSLRs and <holding my nose> MILCs have a critical upper hand across the board.

How soon will we have depth (distance from image sensor) for each pixel? Being able to create depth based masks or custom bokeh curves will be interesting. Lytro?

Unfortunately, I think Lytro proved it's not really going to happen (at least not in that form). Light field cameras didn't seem to take off the way they'd hoped, and I doubt that tech will find its way into mobile devices anytime soon. But I could be completely wrong, of course ;-) I think most depth-effect features over the next decade (at least for mobile) will rely on some combination of the differences between two lenses, comparing slight changes in motion from a single lens, and dedicated depth-sensing techniques such as Apple's dot projection for Face ID.

I don't think dot projection is far from pixel-based depth data, though?

This is most certainly the future of photography. Those who think it's a gimmick are honestly naive.

Once processing tech gets fast enough to run continuous depth mapping, it'll make huge, heavy glass obsolete, and you can just do it computationally. Which will be awesome, because then cameras won't be limited by the physics of their lenses.

But we're still probably decades away from that... because it would take a massive amount of computational power.

I also wonder why the heck camera makers haven't put built-in SSDs in their cameras to increase read/write speeds.

Actually, I'm pretty sure only the XS and XS Max will have the DOF adjustment feature.