[News] An Algorithm That Tells Which Pics Suck and Which Rock?

Xerox has a new tool in development called Aesthetic Image Search that supposedly can judge whether a photo's aesthetic is good or bad. Can't wait to see what you guys think of this one.
Check it out and let us know by leaving your comments below.

Many methods for image classification are based on recognition of parts -- if you find some wheels and a road, the picture is more likely to contain a car than a giraffe. But what about quality? What is it about a picture of a building or a flower or a person that makes the image stand out from the hundreds that are taken with a digital camera every day? Here we tackle the difficult task of trying to learn automatically what makes an image special and makes photo enthusiasts mark it as high quality.
Experiment with our newest application (still in the alpha stages!), which retrieves a group of images from a class and then tries to predict which ones are ordinary and which ones are of high quality. Do you agree with the system?

via [Gizmodo]
From Kenn:
Do you like what we are doing? Then show us some love. Tweet and Like your favorite articles, and be sure to leave your comments below. Heck, leave a comment even if you don't like what we are doing. We can take it. ;)

If you want to receive the best of the month's posts in a convenient newsletter, then don't forget to subscribe now.
And don't be shy. I could use some more friends these days, so hit me up on Twitter and Facebook.



I wonder if this is similar to how cameras "recognize" scenes. It's putting some pictures into categories they don't belong to, but processing them as such. It's like the algorithm is missing perspective clues in its attempt to recognize a scene. Very interesting nonetheless!

Jens Marklund:

Sooo... if you shoot with a black backdrop and/or make it B&W, it's a good photo?

It relies heavily on rules, like my old nemesis the Rule of Thirds.

"The Xerox algorithm understands good beach images as simple photos with intense colours or dramatic black and white clouds. The algorithm also detects images with silky waves captured with long camera exposure."

Also, since you can't click on any of the images, and they have all been cropped into square format, you just have to take their word for it that these images are good or bad. In my opinion, many good images break the rules. I'm guessing that the rule breakers would end up in the "bad" column. I'm not surprised that someone is trying this, and I'm only slightly appalled. Any art buyer who can't tell for him/herself what makes an image good or bad is not an art buyer who I would ever want to work with.

Peter Dowell:

If you read "how" the algorithm defines "good" images, it essentially consists of the "rules" of photography: rule of thirds, good saturation, tonal range, etc. What it doesn't account for is any kind of emotional reaction provoked by a photograph, or anything that breaks those "rules". Not that I'd expect a computer to easily define or find such things!
It would be interesting to pit this thing against Flickr's "interestingness" algorithm, which presumably harvests user-generated data rather than anything like this.
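To make the comment above concrete, here is what one of those "rules" could look like as code. This is purely an illustrative guess at the general shape of a rule-based scorer, not Xerox's actual method: a naive rule-of-thirds check that rewards images whose brightness centroid sits near one of the four third-line intersections.

```python
import numpy as np

def thirds_score(gray):
    # gray: (H, W) array of luminance values.
    # Returns 1.0 when the brightness centroid sits exactly on a
    # rule-of-thirds intersection, falling off toward 0 with distance.
    h, w = gray.shape
    total = gray.sum()
    if total == 0:
        return 0.0  # empty frame: nothing to score
    ys, xs = np.indices((h, w))
    cy = (ys * gray).sum() / total
    cx = (xs * gray).sum() / total
    # Normalised distance to the nearest of the four intersections
    points = [(h * a, w * b) for a in (1 / 3, 2 / 3) for b in (1 / 3, 2 / 3)]
    d = min(np.hypot((cy - py) / h, (cx - px) / w) for py, px in points)
    return 1.0 - min(d / 0.5, 1.0)

# A bright blob near the upper-left third intersection scores higher
# than the same blob dead centre.
img = np.zeros((90, 90))
img[28:32, 28:32] = 1.0          # centroid near (30, 30) = (h/3, w/3)
centre = np.zeros((90, 90))
centre[43:47, 43:47] = 1.0       # centroid near the middle
print(thirds_score(img) > thirds_score(centre))  # True
```

Of course, this is exactly the kind of mechanical check that would file the rule-breaking keepers under "bad".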

The human eye likes "contrast" so I bet the program looks for luminance contrast and color contrast to judge photos.
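For the curious, that guess is easy to test in code. Here is a minimal sketch of the kind of measurements such a program might start from; the specific metrics (RMS luminance contrast and an opponent-colour colourfulness statistic) are my assumptions, not anything Xerox has published:

```python
import numpy as np

def luminance(rgb):
    # Rec. 709 luma weights; rgb is an (H, W, 3) float array in [0, 1]
    return rgb @ np.array([0.2126, 0.7152, 0.0722])

def rms_contrast(rgb):
    # RMS contrast: standard deviation of the luminance channel
    return float(luminance(rgb).std())

def colorfulness(rgb):
    # Opponent-colour colourfulness statistic: spread plus offset of
    # the red-green and yellow-blue difference channels
    rg = rgb[..., 0] - rgb[..., 1]
    yb = 0.5 * (rgb[..., 0] + rgb[..., 1]) - rgb[..., 2]
    return float(np.hypot(rg.std(), yb.std())
                 + 0.3 * np.hypot(rg.mean(), yb.mean()))

# A flat mid-grey frame vs. a black-and-white checkerboard
flat = np.full((8, 8, 3), 0.5)
checker = (np.indices((8, 8)).sum(axis=0) % 2).astype(float)
checker = np.repeat(checker[:, :, None], 3, axis=2)

print(rms_contrast(flat))     # 0.0
print(rms_contrast(checker))  # 0.5 -- high luminance contrast, zero colour
```

Whether the eye's preference for contrast actually predicts which column a photo lands in is, of course, exactly the open question.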

As opposed to the anthropomorphic robots who currently decide what looks awesome or not?  I'm in.