Is LiDAR the Master Plan for Apple Smartphone Cameras?

September is fast approaching, which means we’ll soon be getting a new iPhone announcement. It’s a safe assumption that Apple will announce a faster processor and a better camera. But how can smartphone cameras continue to improve when there’s no room to fit larger lenses and larger sensors?
In this video, Max Tech goes through the history of the LiDAR scanner on the iPad Pro and the iPhone and how it has already made some significant changes to smartphone photography. Max also explains exactly how LiDAR technology works and what it can do in an iPhone or iPad. Finally, he shares his thoughts and speculation on Apple's Master Plan for the iPhone 13 / iPhone 12S (whatever they call it), which is rumored to include a second-generation LiDAR scanner that enables important new features on Apple smartphones.

While I’m inclined to believe that the next-generation Apple smartphone will be called the iPhone 12S rather than the iPhone 13, I do agree with quite a lot of Max Tech’s speculation around Apple's use of the LiDAR scanner. We can be fairly certain that Apple won’t want to make their phones much thicker, even if they do make the screens larger, so we can’t expect any huge increase in sensor size over the slightly larger sensor we saw last year in the iPhone 12 Pro Max. Apple doesn’t publish these kinds of specs, but we can work them out: 12 megapixels at a 4:3 ratio is a 4,000 × 3,000 pixel grid, and with 1.7µm pixels, that comes to 6.8 mm × 5.1 mm. That’s around the size of a 1/2" sensor (8.5 mm diagonal). We are also unlikely to see Apple start using periscope lenses this year. So, how can they make a significant improvement to the images the phone takes, one that consumers will actually care about?
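If you want to sanity-check that arithmetic, here’s a quick Swift sketch of the same calculation. The 4,000 × 3,000 grid and 1.7µm pitch come straight from the figures above; the rest is just geometry.

    import Foundation

    // Sensor size implied by the specs above:
    // 12 MP at 4:3 is a 4,000 x 3,000 pixel grid; with a 1.7 µm
    // pixel pitch, the physical dimensions follow directly.
    let pixelsWide = 4_000.0
    let pixelsHigh = 3_000.0
    let pitchMicrons = 1.7

    let widthMM = pixelsWide * pitchMicrons / 1_000.0   // 6.8 mm
    let heightMM = pixelsHigh * pitchMicrons / 1_000.0  // 5.1 mm
    let diagonalMM = (widthMM * widthMM + heightMM * heightMM).squareRoot()

    print(String(format: "%.1f mm x %.1f mm, %.1f mm diagonal",
                 widthMM, heightMM, diagonalMM))
    // Prints: 6.8 mm x 5.1 mm, 8.5 mm diagonal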

Computational Photography

Computational photography is the umbrella term for digital image capture and processing techniques that use digital computation instead of optical processes. Advances in computational photography have improved smartphone imaging significantly over the years. Smartphone processors are now so fast that the phone can detect what you’re photographing and process the captured image accordingly within seconds, so that when you open your phone gallery, there’s a very pleasing image (on a small screen) for you to admire while you marvel at your wonderful photography skills.
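To make that less abstract, here’s a minimal Swift sketch of one classic computational photography building block: averaging several aligned exposures to reduce noise, the core idea behind smartphone night modes. This is my own simplified illustration, not Apple’s actual pipeline, which aligns frames, rejects ghosting, and runs on the GPU.

    // Averages N aligned grayscale frames, pixel by pixel.
    // Frames are flat arrays of luminance values for simplicity.
    func stackFrames(_ frames: [[Float]]) -> [Float] {
        guard let first = frames.first else { return [] }
        var sums = [Float](repeating: 0, count: first.count)
        for frame in frames {
            for (i, value) in frame.enumerated() {
                sums[i] += value
            }
        }
        // Random sensor noise falls roughly with the square root of
        // the frame count: 4 frames halve it, 16 frames quarter it.
        let count = Float(frames.count)
        return sums.map { $0 / count }
    }

    // Example: three noisy readings of the same two-pixel scene.
    let result = stackFrames([[0.9, 0.2], [1.1, 0.1], [1.0, 0.3]])
    print(result) // [1.0, 0.2], approximately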

Now, with LiDAR, the phone can instantly and accurately detect the distance to an object, as well as its shape and size in the frame. This has already enabled very fast, accurate autofocus, even at night, in the iPhone 12 Pro and Pro Max. This year, with a faster processor, a larger sensor, and an updated LiDAR sensor, it’s rumored that we will see highly accurate portrait mode video, with realistic background blur that can be edited and adjusted on your phone after capture.
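Developers can already read that depth data today through ARKit’s sceneDepth frame semantic, which Apple introduced alongside the LiDAR scanner. Here’s a minimal sketch; sampling the center pixel as a subject-distance estimate is my own toy simplification, not how Apple’s camera app actually focuses.

    import ARKit
    import CoreVideo

    // Reads the LiDAR depth map each frame and samples the distance
    // to whatever sits at the center of the image.
    final class DepthReader: NSObject, ARSessionDelegate {
        let session = ARSession()

        func start() {
            // sceneDepth needs LiDAR hardware (2020 iPad Pro, iPhone 12 Pro and later).
            guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
            let config = ARWorldTrackingConfiguration()
            config.frameSemantics.insert(.sceneDepth)
            session.delegate = self
            session.run(config)
        }

        func session(_ session: ARSession, didUpdate frame: ARFrame) {
            guard let depth = frame.sceneDepth else { return }
            let map = depth.depthMap // 32-bit float CVPixelBuffer, meters from the camera

            CVPixelBufferLockBaseAddress(map, .readOnly)
            defer { CVPixelBufferUnlockBaseAddress(map, .readOnly) }

            guard let base = CVPixelBufferGetBaseAddress(map) else { return }
            let width = CVPixelBufferGetWidth(map)
            let height = CVPixelBufferGetHeight(map)
            let rowBytes = CVPixelBufferGetBytesPerRow(map)

            // Sample the center pixel as a crude subject-distance estimate.
            let centerRow = base.advanced(by: (height / 2) * rowBytes)
                .assumingMemoryBound(to: Float32.self)
            let meters = centerRow[width / 2]
            print("Center of frame is \(meters) m away")
        }
    }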

This isn’t the first time we’ve seen portrait mode video implemented on a smartphone; Apple is rarely the first to implement a technology. What we’ve come to expect is Apple doing it well, with highly optimized hardware and software designed to complement each other.

I believe that we won’t see much of an increase in smartphone camera sensor sizes in the near future, but I do believe that additional sensors, such as LiDAR, combined with faster processors will give the device more data on what’s being captured and allow for significantly improved image quality through advanced computational photography.

There’s even a suggestion of an astrophotography mode coming to the iPhone in the near future. That would be incredibly impressive for a 1/2" sensor.

What do you think we’ll see coming to smartphone cameras to improve image quality? Let me know in the comments.


Brad Wendes is a British photographer and travel lover.
He began photographing parkour and acrobatics in 2010 and has since taken to portraiture and fitness photography.
Brad is a self-confessed geek, Star Wars fan, tech enthusiast, cat lover and recently converted Apple Fanboy.
