Developer Interview: Writing a Camera App for Android, Part One

In an earlier article I bemoaned the state of Android camera apps and in particular, how they lacked the features that even moderately specced cameras have. Find out what one developer has to say about the prospect of Android camera apps.

Instant photo gratification and a sop to my global social media dependency are secondary to the real reason I want to shoot with a smartphone camera. And that is because it fits in my pocket and inevitably means I have a camera rather than not having one. Add to this the fact that my LG G5 allows me to shoot wide, 11mm wide, without having to lug a large lump of glass around, and I feel I'm on to a winner. It may not be the camera of choice for production work, but if it's on me, then I'll use it. More to the point, having an 11mm lens on me means I'm shooting wide more than I ever did. In fact, I rarely use the normal lens now.

This state of play naturally led me to wanting to mimic the way I shoot with a mirrorless or DSLR. That is to say, I want to produce a single raw exposure that will maximize my dynamic range, enabling the greatest latitude in post-production. If I need to shoot multiple images for HDR, panoramas, or focus stacking, then this becomes part of the workflow. On my Nikon D700, my setup includes autofocus, raw imagery, aperture priority, and auto ISO. The new breed of smartphone cameras support autofocus, raw imagery, and manual control. With a fixed aperture, the obvious alternative is shutter priority. I was therefore surprised that while most apps have manual camera control, no one appears to support shutter priority. I'm not an app developer, so I thought it would be useful to see where two apps have come from, where they're going, and what their developers think about programming for Android.

For part one of this doubleheader, meet Martin Johnson, developer of the popular Snap HDR camera app and lecturer in computer science at Massey University, New Zealand. He's a programmer by trade and has been programming in one form or another for over 40 years, finding the process fun and creative. When I asked him why he had developed Snap, I was expecting something like "becoming an app zillionaire!" But no, because he teaches app programming, it was "to show my students that it wasn't all just talk!"

Snap comprises about 280,000 lines of code (seriously!), split between about 200,000 lines of Java and 80,000 lines of C++ and Halide. Much of the former is libraries/Google code, but most of the latter was written from scratch by Martin. To give an idea of the scope of adding new features, the camera style dial in the UI is about 1,000 lines of code, took a month to get to a beta version, and then six months to fully release.

For Martin, Snap's current standout feature is the HDR algorithm.

I wrote it completely from scratch in a language called Halide, and it can align and merge seven frames very reliably and quickly.
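To make the idea of an exposure merge concrete, here is a minimal sketch of one classic approach, assuming a linear sensor response: each bracketed frame's pixel is divided by its exposure time to estimate scene radiance, then the estimates are blended with a triangle weight that trusts mid-tones over clipped shadows and highlights. The class and method names are illustrative; this is not Snap's actual Halide pipeline, which also has to handle alignment.

```java
// Illustrative HDR merge for one pixel across bracketed frames,
// assuming a linear sensor response (values normalized to 0..1).
// Not Snap's actual algorithm; a textbook-style sketch only.
public class HdrMerge {
    // Triangle weight: 1 at mid-grey, falling to 0 at the clipped extremes.
    static double weight(double v) {
        return 1.0 - Math.abs(v - 0.5) * 2.0;
    }

    // pixels[i] is the same pixel in frame i; exposures[i] is that
    // frame's shutter time in seconds. Returns a relative radiance.
    static double mergePixel(double[] pixels, double[] exposures) {
        double num = 0, den = 0;
        for (int i = 0; i < pixels.length; i++) {
            double w = weight(pixels[i]);
            num += w * pixels[i] / exposures[i]; // per-frame radiance estimate
            den += w;
        }
        return den > 0 ? num / den : 0;
    }

    public static void main(String[] args) {
        // The same scene metered at 1/100s (0.25) and 1/50s (0.5)
        // implies the same radiance; the merge should agree.
        double r = mergePixel(new double[]{0.25, 0.5},
                              new double[]{0.01, 0.02});
        System.out.println(r); // 25.0 in relative radiance units
    }
}
```

The weighting is what buys dynamic range: a pixel blown out in the long exposure contributes nothing there, but its short-exposure sibling still carries usable data.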

Current features he is working on include denoising for low-light (and indeed, bright-light) photography, along with real-time HDR shaders to give a better idea of the end result. Camera apps are also unusual in that they tend to break the design ethos of a smartphone by introducing their own UI, something that is often a blend of camera and computer. This is an area where I think Snap succeeds by being both minimalist and flexible. Martin comments:

It's tricky to make everything accessible. My thinking is that people want a UI they can customize and that gives them lots of choices which the stock camera doesn't.

Perhaps unsurprisingly for a camera app developer, he enjoys photography, shooting with a Canon 7D and GoPro, although most of his snapping is with his phone (stand by the product you develop). And his favorite photographer? André Kertész.

I asked Martin about the trend toward both multi-camera (e.g., Huawei P20 Pro) and single-camera (e.g., Google Pixel 2) phones. He was upbeat, because both these trends require lots of computational photography, and that's good for programmers!

The only thing that's bad for camera app developers is when manufacturers hide the more advanced camera APIs. Google is good, but some manufacturers only let their own apps take good photos.

This links in closely to the Camera2 API, something Martin wishes manufacturers would support across all their devices. Talking of the Camera2 API, I had to ask where Shutter Priority Mode was.

That's a decision Google made with the Camera2 API. Exposure is either fully automatic or fully manual, and all you can do is give hints to the algorithm with scene modes. In theory, you could switch to manual and use your own exposure algorithm, but this would be a lot of work. I might give it a try in the future though.
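That do-it-yourself route is worth unpacking. With Camera2's auto-exposure disabled (`CONTROL_AE_MODE_OFF`), an app sets `SENSOR_EXPOSURE_TIME` and `SENSOR_SENSITIVITY` directly, so a shutter-priority mode would have to meter each preview frame itself and solve for the ISO that holds the user's chosen shutter speed. The sketch below shows only that core arithmetic, under the simplifying assumption that frame brightness scales linearly with shutter time times ISO; the method names and target values are mine, not Snap's or Google's.

```java
// Sketch of the exposure arithmetic a DIY shutter-priority mode would
// need on top of Camera2's manual controls (CONTROL_AE_MODE_OFF,
// SENSOR_EXPOSURE_TIME, SENSOR_SENSITIVITY). Illustrative only;
// assumes brightness is roughly linear in shutter * ISO.
public class ShutterPriority {
    // Given a metered frame brightness (0..1) at a known shutter/ISO,
    // hold the user's shutter fixed and solve for the ISO that brings
    // the next frame to the target brightness, clamped to sensor limits.
    static int isoForShutter(double metered, double target,
                             double meteredShutter, int meteredIso,
                             double userShutter, int isoMin, int isoMax) {
        double gain = (target / metered) * (meteredShutter / userShutter);
        int iso = (int) Math.round(meteredIso * gain);
        return Math.max(isoMin, Math.min(isoMax, iso));
    }

    public static void main(String[] args) {
        // Metered 0.25 brightness at 1/50s, ISO 100; user wants 1/200s
        // and a 0.5 target: 2x for brightness, 4x for the faster shutter.
        int iso = isoForShutter(0.25, 0.5, 1.0 / 50, 100,
                                1.0 / 200, 50, 6400);
        System.out.println(iso); // 800
    }
}
```

A real implementation would also smooth the ISO over several frames to avoid flicker, and fall back to lengthening the shutter once the ISO ceiling is hit, which hints at why Martin calls it "a lot of work."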

Is that a killer feature? Well, it appears that no apps are currently offering this as an option, so for a photographer, quite possibly. For the majority of smartphone users, the ability to produce the best in-camera photo will be more significant, which is maybe why we don't see it yet.

On the future prospect of smartphone photography, Martin is a firm believer in the smartphone's ability to leverage computational photography.

I'd say it's not going to be long before smartphone cameras will be as good as DSLRs, mostly through better software and being able to combine lots of frames.

This is a key point when thinking about the quality of images smartphones create: it is less about how good the lens-sensor system is at producing a single great frame, and more about how a great image can be produced from many frames.
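The simplest multi-frame trick illustrates why this works: averaging N aligned frames of a static scene cuts random sensor noise by roughly the square root of N, which is how sixteen mediocre exposures can rival one clean one. The sketch below simulates this with synthetic noisy pixels; the numbers are illustrative, not measurements from any phone.

```java
import java.util.Random;

// Demonstration that averaging N noisy frames of a static scene cuts
// the noise standard deviation by roughly sqrt(N). Frames here are
// simulated Gaussian-noise data, purely for illustration.
public class FrameAverage {
    // Per-pixel mean across frames.
    static double[] average(double[][] frames) {
        double[] out = new double[frames[0].length];
        for (double[] f : frames)
            for (int i = 0; i < f.length; i++)
                out[i] += f[i] / frames.length;
        return out;
    }

    // RMS deviation of a frame from the known true pixel value.
    static double noise(double[] px, double truth) {
        double s = 0;
        for (double v : px) s += (v - truth) * (v - truth);
        return Math.sqrt(s / px.length);
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        int n = 16, pixels = 10000;
        double[][] frames = new double[n][pixels];
        for (double[] f : frames)
            for (int i = 0; i < pixels; i++)
                f[i] = 0.5 + rng.nextGaussian() * 0.1; // noisy mid-grey
        double single = noise(frames[0], 0.5);
        double merged = noise(average(frames), 0.5);
        // With 16 frames, expect roughly a 4x noise reduction.
        System.out.printf("single %.3f -> merged %.3f%n", single, merged);
    }
}
```

In practice, the hard part is the alignment that must happen before the average, since hands shake and scenes move, which is exactly where the computational photography Martin describes earns its keep.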

One area Martin believes holds promise is using artificial intelligence (AI) to select the best image for you from a set. Imagine this in the form of Panasonic's 4K features on steroids: shoot a three-minute 4K video and have the best 10 frames shown to you.

Finally, I asked Martin about his biggest mistake. For a guy who has produced such a popular app in a competitive market, he's very modest: "Some of the early versions of Snap had some horrible bugs, but the nice thing about software is that you can fix almost anything." Like photography, you can always iterate to improve.

Lead image courtesy of Danny Ryanto via Unsplash, used under Creative Commons.

Mike Smith is a professional wedding and portrait photographer and writer based in London, UK.
