Camera sensors have evolved at a breakneck pace since the advent of the digital era, but many photographers lament the loss of a sensor type from the early days of the technology, often calling its color rendition "magic." Is it nostalgia, or is there really something different about the way it renders images? This awesome video takes a look.
Coming to you from Robin Wong, this great video takes a look at cameras with CCD sensors. In the early days of digital cameras, CCDs (charge-coupled devices) were the dominant type of sensor. A CCD uses a grid of metal-oxide-semiconductor capacitors to collect photons and generate an electrical signal. The two main drawbacks of CCD sensors were their relatively slow readout speed and their poor noise performance. CMOS (complementary metal-oxide-semiconductor) sensors improved on both of these issues while also offering higher dynamic range, better quantum efficiency, and lower voltage and current requirements, and they are now the sensor of choice in the vast majority of modern cameras. While the CMOS sensor is superior in many ways, the one thing a lot of photographers miss about CCD sensors is their color rendition, which is often considered more film-like. Is it really that much better? Take a look at the video above to find out.
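To make those tradeoffs a little more concrete, here is a rough, hypothetical sketch of a single-pixel noise model in Python. The quantum efficiency and read-noise figures are invented illustrative values, not measurements of any real CCD or CMOS sensor; the point is only to show how higher quantum efficiency and lower read noise translate into a cleaner signal in dim light.

```python
import numpy as np

# Rough illustrative noise model for a single pixel exposure.
# The quantum efficiency (QE) and read-noise numbers used below are made-up
# examples, not measured values for any particular CCD or CMOS sensor.

def pixel_snr(photons, quantum_efficiency, read_noise_e):
    """Return the signal-to-noise ratio for one pixel.

    photons            -- photons arriving at the pixel during the exposure
    quantum_efficiency -- fraction of photons converted to electrons (0..1)
    read_noise_e       -- readout noise in electrons (RMS)
    """
    signal_e = photons * quantum_efficiency      # collected electrons
    shot_noise_e = np.sqrt(signal_e)             # Poisson (shot) noise
    total_noise_e = np.sqrt(shot_noise_e ** 2 + read_noise_e ** 2)
    return signal_e / total_noise_e

# Hypothetical comparison at a dim exposure of 200 photons per pixel:
print("older-style sensor:", pixel_snr(200, quantum_efficiency=0.35, read_noise_e=15.0))
print("newer-style sensor:", pixel_snr(200, quantum_efficiency=0.60, read_noise_e=3.0))
```

Under these made-up numbers, the second sensor comes out well ahead in low light, which is roughly what the dynamic range and quantum efficiency claims above cash out to in practice.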
They need to make this available in the USA... as well as the Leica phone.
Give me a break with the "film-like" nonsense. Nobody said that when these were the "it" sensors of the day, so now that we have more advanced sensors as well as better post-processing tools, why the nostalgia? "Film-like" is always a red flag when it gets mentioned. Now watch the prices increase on eBay for such cameras lol
I actually have a stash of CCD cameras I'm trying to unload on eBay and I'm trying to drive up the price.
I'm sure it generated comments on his channel.
The Leaf DCB2 had a large CCD with only 4 MP, a built-in sensor cooling system, and no low-pass filter. We printed a 4 x 4 foot laminated poster on a Roland from an image I shot for a client and had it mounted in the prepress shop. Visitors assumed it was shot on 8x10 sheet film. All my captures were separated in Linocolor by our in-house scanning and profiling guru, from whom I learned to understand colors in the printing world. When I bought my 3 MP Canon D30, I tested and compared. Despite six years of technological difference, the Canon colors were nowhere near good enough to print anything remotely close to the DCB2. For us, all colors were about neutral for reproduction; that's what I aimed for, and we had our own system to achieve it. That said, when websites started showing images using filters of all kinds, it took me a little time to rethink and apply "non-neutral" looks to my personal work. When I started shooting cars and races, I got in trouble for suggesting to someone that the color in their picture didn't reflect anything close to the actual color of a car I knew well. I knew it was not intentional, but from that point on, I no longer attempted to give tips except to a few good friends.
I think most devices today are good, but only so many people really care about learning color, let alone try to memorize what they see when they shoot. For example, a neutral white can carry a little yellow or a slight hint of cyan, but any bit of magenta will clash unless something in the red family is supposed to reflect onto the white. On phones at 72 dpi it doesn't matter, but my point is that one can ignore the specific color tendency of a brand or a sensor as long as they understand colors and how to correct them.
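As a rough illustration of the kind of correction being described, here is a minimal sketch that neutralizes a cast by scaling the channels until a patch the photographer knows should be neutral actually reads neutral. The patch values and image are invented placeholders, not real camera data.

```python
import numpy as np

# Minimal sketch: neutralize a colour cast using a patch the photographer
# knows should be neutral (a grey card or white wall in the frame).
# The patch and image values below are invented examples, not real data.

def neutralize(image_rgb, neutral_patch_rgb):
    """Scale R, G, B so the chosen patch becomes neutral (R = G = B)."""
    patch = np.asarray(neutral_patch_rgb, dtype=float)
    gains = patch.mean() / patch            # per-channel correction gains
    corrected = image_rgb * gains           # broadcast over the colour axis
    return np.clip(corrected, 0.0, 1.0)

# A slightly magenta-leaning "white" (linear RGB, 0..1): green is depressed.
patch = [0.82, 0.78, 0.82]
image = np.random.rand(4, 4, 3)             # stand-in for a real photograph
balanced = neutralize(image, patch)

print("gains applied:", np.asarray(patch).mean() / np.asarray(patch, dtype=float))
```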
I would add that if we are going to start talking about how people perceive and remember colour, we're going to dive pretty deep into psychology (and potentially philosophy).
That notwithstanding, the only way to maximise fidelity is by means of rigorous colour management.
Color management is definitely part of it.
Most cameras shoot in the sRGB colour space by default. The leading "s" is for "stupid" — that colour space was developed to fit within the gamut of a VGA colour cathode ray tube display! Modern displays have a much wider gamut.
I once had a Roland Hi-Fi Jet. Its Hexachrome™ inkset could exceed even the Adobe RGB gamut. Images in sRGB would come out posterized, especially in the greens, as the colour management struggled to stretch sRGB to fit the Hexachrome™ inkset.
No matter how "rigorous" your colour management is, the source colour space can play havoc with the final display.
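To put a number on the gamut point, here is a small sketch that pushes a fully saturated Adobe RGB (1998) green through the commonly published D65 conversion matrices into linear sRGB; the negative channel that falls out is a colour the wider space can describe but sRGB cannot. The matrix values are rounded and meant purely for illustration.

```python
import numpy as np

# Small illustration of the gamut point above: a fully saturated Adobe RGB
# (1998) green has no valid sRGB representation. The matrices are the commonly
# published D65 ones, rounded, and are meant for illustration only.

ADOBE_RGB_TO_XYZ = np.array([[0.5767, 0.1856, 0.1882],
                             [0.2974, 0.6273, 0.0753],
                             [0.0270, 0.0707, 0.9911]])

XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                        [-0.9689,  1.8758,  0.0415],
                        [ 0.0557, -0.2040,  1.0570]])

adobe_green = np.array([0.0, 1.0, 0.0])                 # linear Adobe RGB
srgb_linear = XYZ_TO_SRGB @ ADOBE_RGB_TO_XYZ @ adobe_green

print("linear sRGB values:", srgb_linear)
# A negative channel means the colour lies outside the sRGB gamut and has to
# be clipped or compressed before it can be encoded or displayed.
print("outside sRGB gamut:", bool(((srgb_linear < 0) | (srgb_linear > 1)).any()))
```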
I guess it just depends on preference. I don't like that filmic, grainy look. In my opinion, it completely negates the purpose of buying a better camera. For example, I have every Fuji X-T camera ever made. Looking back at my X-T1 and comparing it with my X-T2, 3, 4, and 5, the image quality is garbage. I don't even see the purpose of editing to make things look nostalgic unless it's for a very specific purpose.
There is a nostalgia here, and terms get invented to try to justify it. The latest good cameras, the ones that don't overcook images as some do, produce accurate colors. I find I really only need to add a bit of contrast to raw files and all is good with my camera. Perhaps people are getting tired of colors so outlandishly vivid that they look cartoonish, including the SOOC images from some cameras.
So yes, there is a desire for more "Natural" colors vs the cartoonish photos that have been in vogue.
Look at the real world, then really compare your colors to it, and you'll see what I mean.
Image colour is more dependent on the pigments used in the CFA than on the underlying technology that converts light-intensity readings into a readable signal...
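A quick sketch of that idea: most of a camera's colour character lives in the CFA dyes and the 3x3 matrix that maps camera RGB into a standard space. Both matrices below are invented stand-ins rather than profiles of real sensors; the same raw reading comes out as a noticeably different colour depending on which one is applied.

```python
import numpy as np

# Sketch of the point above: much of a camera's "colour character" comes from
# the CFA dyes plus the 3x3 matrix used to map camera RGB to a standard space.
# Both matrices below are invented stand-ins, not profiles of real sensors.
# Each row sums to 1.0 so that a neutral input stays neutral.

CAMERA_A_TO_SRGB = np.array([[ 1.70, -0.52, -0.18],
                             [-0.28,  1.45, -0.17],
                             [ 0.02, -0.60,  1.58]])

CAMERA_B_TO_SRGB = np.array([[ 1.90, -0.70, -0.20],
                             [-0.35,  1.60, -0.25],
                             [ 0.05, -0.75,  1.70]])

camera_rgb = np.array([0.42, 0.35, 0.20])   # same raw reading from one scene patch

print("rendered with matrix A:", CAMERA_A_TO_SRGB @ camera_rgb)
print("rendered with matrix B:", CAMERA_B_TO_SRGB @ camera_rgb)
# Same raw numbers, noticeably different output colour: the difference lives
# in the CFA and matrix, not in whether the chip reads out as CCD or CMOS.
```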
Some may be jumping on the 'nostalgia bandwagon' of CCD today, but it was not nostalgia all those years ago, when many complained about camera companies abandoning CCD and going CMOS.
For sure, some of us can see a certain 'quality' in CCD images that seems to be missing in CMOS.
It has nothing to do with resolution (as the newer CMOS cameras are way better for image detail), but something else I cannot 'scientifically' explain. Maybe 'density' and 'vibrancy' are the closest words I can find to describe it?