Google Says That It Is Improving How Its Phones Photograph Black Skin

Google is changing the way that its Android smartphone cameras process darker skin tones in order to address historic problems relating to how people of color are portrayed in photographs.

Google’s Android VP Sameer Samat announced that the company is working to improve the way its phones render darker skin tones and different types of hair, drawing on a diverse range of experts as part of the development process.

“As part of our ongoing commitment to product inclusion, we’re working to make technology more accessible and equitable,” Samat explained. The changes are expected to appear on phones later this year.

In the past, photographic processes have been geared towards lighter skin tones, as evidenced by the Shirley Card developed by Kodak, which was used to calibrate colors when processing images. More recently, even hugely respected photographers have come under criticism for struggling to portray dark skin, as with Annie Leibovitz’s photograph of Simone Biles published on the cover of Vogue magazine last year.

Even in the digital era, technology has often failed to cope. Early webcams failed to track the faces of people with darker skin, and Twitter’s thumbnail algorithm continues to favor white faces over Black ones.

In a tweet, YouTuber Marques Brownlee welcomed this development, explaining that these changes had the potential to improve every smartphone camera.

Andy Day is a British photographer and writer living in France. He began photographing parkour in 2003 and has been doing weird things in the city and elsewhere ever since. He's addicted to climbing and owns a fairly useless dog. He has an MA in Sociology & Photography which often makes him ponder what all of this really means.

14 Comments

That "Shirley Card" story is a myth. Color film technology preceded the creation of the Shirley Card by a half century. Color emulsion scientists have always pursued /accurate/ color, not Caucasian skin tones. Portraits weren't even the primary color film market in its first half century. Physicists, the military, horticulturalists, and biologists were some of the early critical markets, and they wanted accurate reproduction of all colors.

Color film was being produced by multiple companies in the US, Europe, and Japan, and for sure Fuji emulsion scientists were not using Shirley Cards. The Shirley Card was created by Kodak only after Congress forced Kodak to break up its virtual monopoly on color print processing in the 50s. The Shirley Card helped ensure Kodacolor print processing by non-Kodak labs would match that produced by Kodak's own labs.

The most important part of the Shirley Card was not the Caucasian skin, it was the gray and color patches that could be accurately read by a densitometer.
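For context on what “accurately read by a densitometer” means (my addition, not part of the original comment): a densitometer reports the optical density of a processed patch, a purely numeric measure of how much light the patch blocks, so processing drift shows up as a shift in D regardless of the subject photographed. The 18% example value below is illustrative.

```latex
% Optical density as read by a densitometer. T is the fraction of light
% transmitted by the patch (or reflected, for prints); I_0 is incident
% light and I is what gets through.
\[
D = \log_{10}\!\left(\frac{I_0}{I}\right) = \log_{10}\!\left(\frac{1}{T}\right)
\]
% Example: a mid-gray patch passing 18% of the light has
% D = log10(1/0.18) ≈ 0.74. A lab whose chemistry drifts will see this
% number move, with no reference to skin tone required.
```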

This should be an article.

It is good that AI can do this now, but "racially inclusive" sounds like Google advertising. Phones, or other cameras, are not biased. They are dumb tools. Most people don't know they can use exposure compensation and other settings to get photos that are more representative of what the eye/mind sees. They use their phones as point-and-shoot devices.
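To illustrate the commenter's point about exposure compensation (a sketch of the general idea, not anything from Google's announcement): in linear light, each stop of EV compensation simply doubles or halves the recorded exposure, which is why a manual +1 EV nudge can lift detail that an auto-metered shot renders too dark. The function name and sample values below are illustrative assumptions.

```python
import numpy as np

def apply_exposure_compensation(linear_rgb: np.ndarray, ev: float) -> np.ndarray:
    """Scale linear-light RGB values by 2**ev, mimicking EV compensation.

    linear_rgb: float array in [0, 1], linear (not gamma-encoded) values.
    ev: exposure compensation in stops (+1.0 doubles the exposure).
    """
    return np.clip(linear_rgb * (2.0 ** ev), 0.0, 1.0)

# Illustrative example: a patch metered too dark at 0.09 in linear light.
patch = np.full((2, 2, 3), 0.09)
brighter = apply_exposure_compensation(patch, ev=1.0)  # roughly 0.18 linear
print(brighter[0, 0])  # [0.18 0.18 0.18]
```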

While the new technology is good, tying it to racial issues is dumb.

An inanimate object cannot be biased. However, technology is always a product of society, and the social processes that shape its creation can contain biases.

The "bias" has always been toward achieving accurate color, not good Caucasian skin tones. Imaging scientists have never set out to create film or sensors that were biased toward or against skin colors. They use carefully controlled color patches, they have always done so...and you know that.

They didn't set out to create a bias, but then you don't always have to, do you? These things 'just happen' because people aren't consciously being inclusive.

I reckon Marques Brownlee would know what he's talking about, given that he is dark-skinned and has tested probably every smartphone camera to come out in the past decade or so.

It has nothing to do with being "inclusive." It has to do with creating accurate color and dynamic range. Sensor scientists should be aiming for accurate color and greatest dynamic range, not "inclusivity."

And what would that mean? Human beings come in a great variety of skin tones. Black people are not one color, or even one shade. We can be a variety of shades with yellow, red, or even purple undertones. "Inclusiveness" is absurd. Accuracy should be the goal.

The issue has really been with photographers, since we have been able to simultaneously capture details in a bride's white dress and the groom's black tuxedo for quite a while now. The failure to do the same thing with black and white people in the same image has been the photographer's lack of skill. I've been photographing black people for 50 years, and I've been able to do it. And I've been both black and photographing black people longer than Marques Brownlee has been alive.

This is about the auto mode on smartphones, not teaching professional photographers how to properly expose an image. Did you read the article?

The first sentence of the article reads: "...in order to address historic problems relating to how people of color are portrayed in photographs." It further says, "...In the past, photographic processes have been geared towards lighter skin tones...."

Those assertions are specious. The "historic problems" have not been with some kind of implicit bias engineered into the technology against people of color, as the article suggests. The historic problems have been with the capabilities of early color technology (primarily dynamic range) and with photographers who were unable or unwilling to manage the technology's shortfalls in ways that were available and well known.

If they can further advance the technology in color accuracy and dynamic range to overcome lack of skill on the part of photographers...well that's what photo-technologists have been striving for all along. It has nothing to do with some newly awakened social consciousness to become "inclusive."

I agree to a point. But in the film days, it wasn't always about accuracy. That might have been true for specialized film for scientific use, but other than that, Kodak, Fuji, Agfa, etc. created unique films. Most if not all companies made multiple color films, all with different looks. Well-known examples were Kodachrome with its strong reds and yellows, Velvia biased toward greens and blues, and Ektachrome, which had four different ISO 100 emulsions.

Mind you, I'm not saying they were biased in the sense this article implies about phones. Those looks were intended for general use. While they gave people of all appearances a distinctive look, they did the same for every other subject that was photographed. And like today, if you didn't use a point-and-shoot or disposable camera, you could control exposure, light, etc. to achieve a more or less representative look of the subject.

I remember when Fuji was making its first foray into the American market, trying to wedge that green onto the shelves between all that yellow, and their advertising had the catchphrase, "The Japanese see color differently." Wow, that wouldn't even be PC today.

But the fact is, Mike, you and I can look at an object and agree, "That's red," but we have no way of knowing if your brain actually perceives exactly the same color that my brain perceives. We've just learned to apply the same label to what each of our brains perceives.

Maybe, to take an example from engineering, what we're really talking about is precision rather than accuracy. If it's "off," at least it should be consistently off so that we can apply a consistent factor to correct it to our own perception.
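As a loose illustration of that precision-versus-accuracy point (my sketch, not anything the commenters wrote, with made-up readings): if a camera renders a known gray patch consistently dark by the same factor, a single measured gain corrects every shot; an inconsistent error cannot be fixed that way.

```python
import numpy as np

# Hypothetical readings of a reference gray patch whose true value is 0.50.
true_value = 0.50
consistent = np.array([0.40, 0.41, 0.40, 0.39, 0.40])  # precise but inaccurate
erratic    = np.array([0.30, 0.55, 0.45, 0.62, 0.38])  # neither precise nor accurate

# A consistent error yields a stable correction factor...
gain = true_value / consistent.mean()
print(np.round(consistent * gain, 3))  # every reading lands near 0.50

# ...while erratic readings stay scattered even after the same kind of correction.
gain_erratic = true_value / erratic.mean()
print(np.round(erratic * gain_erratic, 3))  # still spread widely around 0.50
```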