When Google announced that its Night Sight mode would extend to astrophotography, many — myself included — were skeptical. Google has now written a blog post unpacking the new feature.
I love astrophotography, but I rarely get the chance to do it. Still, having shot many night skies, I am familiar with the difficulties that stand in the way of stunning starscapes. So when Google said its Night Sight mode would not only shoot in low light as before, but also capture the stars at a high level, I was filled with doubt.
I have a Pixel 3 XL; before that I had a Pixel 2, and before that a Pixel. It's safe to say I'm a huge fan of the phones, and it's usually risky to doubt Google's capabilities, but this seemed to me a step beyond mobile cameras. Well, I was wrong. On Google's blog this week, two of the software engineers responsible for the astrophotography capability of the revised Night Sight mode fleshed out a little more detail on how it works and its limitations. If you're interested in the nuts and bolts, I suggest reading the full article, but I'll summarize it here.
You will need to approach astrophotography much as you would with a DSLR or similar camera, and that should come as no surprise. You'll need a dark sky and your phone mounted on a tripod or, at the very least, held perfectly steady. However, the Pixel's software automatically does much of what astrophotographers normally do in post-production. That is, it stacks a series of short exposures for tack-sharp stars, reduces noise, increases contrast in the sky, tidies up hot pixels, aids composition, and even uses machine learning for localized noise reduction on the sky. Furthermore, it allows for scene composition and autofocus after a brief one-second exposure, which would otherwise leave you composing on a blank screen.
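To make the stacking idea concrete, here is a minimal sketch of the general technique in Python with NumPy: average a series of aligned short exposures to reduce noise, then flag and repair hot pixels. This is an illustration of the classic approach, not Google's actual implementation; the function name, threshold, and median-based detection are my own assumptions.

```python
import numpy as np

def stack_frames(frames, hot_pixel_threshold=5.0):
    """Average-stack aligned exposures and suppress hot pixels.

    frames: list of 2-D arrays (grayscale frames, already aligned).
    Illustrative sketch only, not Google's implementation.
    """
    stack = np.stack(frames).astype(np.float64)
    # Averaging N frames reduces random noise by roughly sqrt(N).
    mean = stack.mean(axis=0)
    # A hot pixel stays bright in every frame, so it survives averaging;
    # flag pixels far above the image median (robust MAD scale).
    median = np.median(mean)
    mad = np.median(np.abs(mean - median)) + 1e-9
    hot = (mean - median) / mad > hot_pixel_threshold
    # Replace each flagged pixel with its 3x3 neighborhood median.
    out = mean.copy()
    for y, x in zip(*np.nonzero(hot)):
        y0, y1 = max(0, y - 1), min(mean.shape[0], y + 2)
        x0, x1 = max(0, x - 1), min(mean.shape[1], x + 2)
        out[y, x] = np.median(mean[y0:y1, x0:x1])
    return out
```

A real pipeline would also align the frames before stacking (stars drift between exposures) and apply tone mapping, but the noise-averaging core is the same idea.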
This is impressive stuff, doing much of what dedicated astrophotography software does. The limitations they admit to are largely ones all cameras suffer from (extreme contrast between points of light and dark skies, dark foregrounds, light pollution, and so on). There aren't quite enough images showing what this can do, but what there is looks promising. Google, if you're reading this, pop a Pixel 4 in my hand and send me to Namibia to review it!
I'm very grateful to Google for pioneering new computational techniques that Apple can copy one to two years later (stacking, Night Sight, astrophotography, et al.). For the first time this year, I signed up for Apple's upgrade program, because I expect Apple will introduce astrophotography in next year's models and I really want it.