Can You Get Google Pixel's Camera Tech in Your iPhone?

Google’s Pixel camera has been hailed as one of the best on the market; some even call it the best. I’m not saying the iPhone’s camera is lackluster, but is there something missing?

Cortex, available in the App Store, is being touted as the iPhone’s answer to Google’s Pixel camera. The promise: better low-light shots, less noise, and more dynamic range. I’m not so sure the last claim holds true, but there’s something to be said for what it’s capable of. The app has been out for a couple of years now, and there’s a striking similarity between how Cortex works and how the Pixel’s “best smartphone camera” gets its image.

So What Is It Doing?

The Google Pixel takes 10 underexposed images and layers them on top of one another. The resulting image is bright, clear, and has much less noise in the shadows. This is exactly what Cortex does, although, to its credit, it gives you more manual control over what’s going on under the hood: you can go from 10 exposures all the way up to 99.
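At its core, this kind of burst stacking is just averaging: random sensor noise cancels out across frames while the scene itself stays put. A minimal sketch in Python with NumPy, using made-up noise numbers rather than anything measured from Cortex or the Pixel:

```python
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical "true" scene: a dim, flat patch, as in an underexposed frame.
true_scene = np.full((100, 100), 20.0)

def noisy_frame():
    # Each capture picks up random sensor noise (std dev 5, an arbitrary choice).
    return true_scene + rng.normal(0, 5, true_scene.shape)

single = noisy_frame()
stacked = np.mean([noisy_frame() for _ in range(10)], axis=0)

# Averaging N frames cuts random noise by roughly sqrt(N).
print(round(np.std(single - true_scene), 2))   # ≈ 5
print(round(np.std(stacked - true_scene), 2))  # ≈ 1.6, i.e. 5 / sqrt(10)
```

With 10 frames the shadow noise drops to roughly a third of a single shot, which matches the cleaner shadows the Pixel (and Cortex) produce.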

Testing it against a Google Pixel and the iPhone’s stock app showed that it held up well. Below is a low-light scene I set up after a shoot, using an iPhone 6S. As you can see, there’s certainly a difference between Cortex and the iPhone’s built-in camera. Honestly, though, I’m not seeing major differences here. Sure, there’s slightly less noise in the Cortex version, but all three are pretty much on par.

There wasn't much light for this photo: only a softbox 20 ft away.

You can see plenty of great examples of where Cortex works on the App Store. However, there are plenty of situations where it doesn't. By shooting so many frames in one go, the merged result is effectively exposed for too long; you could achieve much the same thing by slowing down your shutter speed. Note that I used the app in its automatic mode, since both the iPhone and Pixel were shot in their automatic modes. Perhaps with some tinkering I could have achieved a better result.
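The reason a big burst behaves like one long exposure: the effective capture time is the sum of the individual shutter times. A quick back-of-the-envelope calculation, using a hypothetical per-frame shutter speed (Cortex's actual timing isn't documented here):

```python
# Assumed per-frame shutter speed, purely for illustration.
frame_shutter = 1 / 10  # seconds per frame

# Cumulative capture time grows linearly with the frame count
# (10-99 frames, per Cortex's range), so anything moving during
# the burst smears just as it would in a single long exposure.
for frames in (10, 50, 99):
    print(f"{frames} frames x {frame_shutter:.2f}s = {frames * frame_shutter:.1f}s total")
# 10 frames x 0.10s = 1.0s total
# 50 frames x 0.10s = 5.0s total
# 99 frames x 0.10s = 9.9s total
```

Even a modest burst adds up to seconds of capture time, which is exactly why handheld shots and moving subjects, like the boats below, come out blurred.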

Here are some shots I took on the Hudson River: again, the iPhone as normal, then the Cortex app, and finally the Google Pixel. Unfortunately, Cortex falls apart here. The regular iPhone and Pixel images are far sharper; even if they’re noisier, the stock apps are producing a much better image. The boats and the water they’re resting in are a blurred mess!

[Pixel on the left, Cortex on the right]

At only $2.99, it can’t hurt to check out anyway. It would obviously be nice to see this sort of tech come to the iPhone’s stock camera app, where it could be paired with better automatic control of shutter speed. If you’re willing to keep your phone still, it's a sweet app for nighttime shooting. For regular smartphone use, however, the automatic mode just can't keep up.

[via Cortex]


5 Comments

Looks like camera shake from a longer exposure.


Funny, the blurred boats make me focus more on the building and the sunset!

Have a look at Hydra. It too layers exposures on top of each other, but is much sharper and cleaner. The only downside is the lack of manual control. However, I have been told by the developer that manual control is on the way.

Thanks for the heads-up. Have bought and downloaded this and will try it out. One of the problems for me with these apps is that I have quite a shaky hand and it is quite difficult to keep still while it takes 50 images. There also used to be an app similar to Cortex and Hydra called Clear Cam, but they don't sell it anymore. From my testing, I think Cortex was better than Clear Cam anyway.

I have Cortex and it is pretty good. I think that it works best on relatively close-up shots, e.g. taken in the confines of a room or at a similar distance, as opposed to landscape shots. Definitely has less grain. I'd use something like Pro HDR X for landscape.