The Inventor of the Modern Image Sensor Announces a Breakthrough for Low-Light Photography


Eric Fossum, the inventor of the CMOS image sensor, the chip found in almost every modern digital camera, has teamed up with Jiaju Ma to develop the Quanta Image Sensor (QIS). The QIS represents a significant leap forward in low-light sensitivity, with major implications for both scientific imaging and consumer electronics. Fossum and Ma note that the Quanta Image Sensor is fundamentally different from the complementary metal-oxide-semiconductor (CMOS) and charge-coupled device (CCD) sensors that have dominated the imaging industry for years (with CMOS having largely replaced CCD). Whereas traditional sensors have tens of millions of photosites, the QIS aims to put a billion pixels on a sensor of the same size. What makes these pixels special, however, is their extreme sensitivity:

Light consists of photons, little bullets of light that activate our neurons and make us see light. The photons go into the semiconductor and break the chemical bonds between the silicon atoms and when they break a bond, an electron is released. Almost every photon that comes in makes one electron free inside the silicon crystal... We were able to build a new kind of pixel with a sensitivity so high we could see one electron above all the background noise. These new pixels are able to sense and count a single electron for the first time without resorting to extreme measures.
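To get a feel for what counting "one electron above all the background noise" means, here is a rough, purely illustrative simulation (my own sketch, not Fossum and Ma's actual design): photon arrivals follow Poisson statistics, each photon frees roughly one electron, and Gaussian read noise is added before the readout is rounded to a whole number of electrons. The read-noise figures are assumptions chosen only to contrast conventional and sub-electron readout.

```python
import numpy as np

rng = np.random.default_rng(0)

def fraction_counted_exactly(mean_photons, read_noise_e, trials=100_000):
    """Poisson photon arrivals (~1 electron freed per photon) plus
    Gaussian read noise; a photon-counting pixel rounds its analog
    readout to the nearest whole electron. Returns how often that
    rounded count matches the true electron count."""
    electrons = rng.poisson(mean_photons, trials)
    readout = electrons + rng.normal(0.0, read_noise_e, trials)
    return np.mean(np.round(readout) == electrons)

# Hypothetical read-noise levels, in electrons RMS:
for label, noise in [("conventional readout", 2.0), ("sub-electron readout", 0.2)]:
    ok = fraction_counted_exactly(mean_photons=0.5, read_noise_e=noise)
    print(f"{label} ({noise} e- noise): {ok:.1%} of reads count exactly")
```

With read noise well below one electron, nearly every read lands on the true photoelectron count, which is exactly what makes single-photon counting possible.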

That's right: Fossum and Ma have developed an image sensor that can work with single photons. The best part? They developed it with industry application in mind:

We deliberately wanted to invent it in a way that is almost completely compatible with today's CMOS image sensor technology so it's easy for industry to adopt it.

To put this into perspective, consider this: in typical low-light conditions, your camera is working with thousands of photons per pixel. This new sensor can work with single photons, the smallest possible unit of light. Fossum and Ma note that at its current stage, the QIS is a proof of concept and there are other issues to contend with, such as reading out the data from a billion pixels, but even so, it represents a promising step toward future advances.
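A quick back-of-the-envelope calculation (my own numbers, not from the research) shows why photon count matters so much: for a pixel that collects N photons, shot noise alone limits the signal-to-noise ratio to roughly the square root of N, so the penalty for working at very low photon counts is steep, and any added read noise makes it worse.

```python
import math

# Shot-noise-limited SNR: Poisson noise has standard deviation
# sqrt(N), so SNR = N / sqrt(N) = sqrt(N) for N collected photons.
for n_photons in (10_000, 1_000, 100, 10, 1):
    print(f"{n_photons:>6} photons/pixel -> shot-noise-limited SNR ~ "
          f"{math.sqrt(n_photons):.1f}")
```

At a single photon, the signal is barely above its own statistical noise; a sensor that adds essentially no read noise of its own is the only kind that can operate usefully in that regime.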

[via phys.org]


Alex Cooke is a Cleveland-based portrait, events, and landscape photographer. He holds an M.S. in Applied Mathematics and a doctorate in Music Composition. He is also an avid equestrian.

7 Comments

I remember reading about this some time back. It will matter when we see it in the real world. I think photon accumulation on the sensor is in the millions, not thousands.

This is awesome. No doubt, cameras will have to become mini supercomputers and/or find a way to throw away 90% of their data before handing it off to processors and subsequent memory storage devices. But still... first step, check. This is great news.

One other note... low-light photography will be awesome, then, but what happens when you want to take a photo at f/2 of a model in the sun? Can shutter speeds get fast enough? Will electronic shutters take over 100%? I guess companies that make ND filters will be loving the new business... buy their stock now! Ha...

I would say they would build in another method of working with ISO, perhaps a low-light versus normal-lighting mode that you could toggle. That would change the sensitivity of the sensor so it responds differently to different amounts of light, perhaps by reverting to behavior closer to a CMOS or CCD.

Luminance is relative, so shutter speeds and aperture values won't be an issue; it would work effectively the same way as current sensors. The pixel with the lowest number of photons striking it would be black, and the pixel with the most photons hitting it would be white (provided the scene doesn't exceed the dynamic range of the camera and the photo's not over/underexposed). Every other pixel would have its luminance fall somewhere between white and black depending on how many photons stimulated it.

If you were to photograph your example, the camera would interpret, say, 1,000 photons hitting a pixel as black, and the exposure would be exactly the same as the same photo taken by a CMOS sensor, since it's effectively discarding 1,000 photons from each pixel. The advantage comes when you photograph something very dark, where only, say, 30 photons hit each pixel. Whereas a CMOS sensor might record only 10 of those photons, the QIS might record 25. If there's background radiation equivalent to 15 photons, the QIS then has a 25/15 signal-to-noise ratio versus the CMOS sensor's 10/15, meaning there's more good information relative to garbage, and you get a more accurate image (less grain/noise).

Of course, all the numbers here are purely made up, and this is greatly simplified, but hopefully it gets the idea across. The basic point is that there's no need for ND filters or faster shutters; the only difference is the signal-to-noise ratio. If you're interested in this at all, look up how CCDs work and quantum efficiency. CMOS and QIS operate on the same basic premise: photons carry energy that can be converted into an electrical signal and recorded.
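For anyone who wants that arithmetic spelled out, here is the same toy comparison as a few lines of Python; every figure is the invented one from the comment above (quantum efficiencies implied by 10-of-30 versus 25-of-30 recorded photons), not a real sensor spec:

```python
# Toy SNR comparison using the comment's made-up numbers.
incident_photons = 30   # photons reaching each pixel (hypothetical)
noise_floor = 15        # background noise, in photon-equivalents

for name, recorded in [("CMOS-like", 10), ("QIS-like", 25)]:
    qe = recorded / incident_photons  # implied quantum efficiency
    snr = recorded / noise_floor
    print(f"{name}: QE ~ {qe:.0%}, SNR = {recorded}/{noise_floor} = {snr:.2f}")
```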

Ha. Duh... I was just thinking this today while working after I posted my question. Thanks :-) I suppose there could be many more fine variations in tonality, then, perhaps... Whether or not we would notice the difference (probably not) is another issue. But it's all exciting stuff, that's for sure...

Well, of course. Any idiot knows THAT, right?

Little Photie the Photon says Yeah Man!