Netflix Personalizes the Images of the Movies You're Browsing

Netflix is using AI to follow your viewing habits. The system then chooses the image it thinks will best advertise each movie to you, making sure every title puts its best foot forward based on your preferences. If you lean toward action movies, it will pick stills that highlight a film's action side. If you're one for romantic films, it will show images that capture the emotions you'd experience watching it.
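To make the idea concrete, here is a minimal, hypothetical sketch of that selection step in Python. It is not Netflix's actual system (the Netflix Technology Blog post linked below describes a far more sophisticated contextual-bandit approach); the themes, weights, and function names are all invented for illustration. The assumption is that each title has several artwork variants tagged by theme, and that each viewer has a profile of theme affinities inferred from their watching habits.

```python
# Illustrative sketch only, not Netflix's real system. Assumes each title
# carries several artwork variants tagged by theme, and each viewer has a
# profile of theme affinities learned from their viewing habits.
from dataclasses import dataclass


@dataclass
class Artwork:
    image_url: str
    theme: str  # e.g. "action", "romance"


def pick_artwork(variants: list[Artwork], affinities: dict[str, float]) -> Artwork:
    """Return the variant whose theme scores highest for this viewer.

    Unknown themes default to 0.0, so a neutral image can still win
    for viewers we know little about.
    """
    return max(variants, key=lambda art: affinities.get(art.theme, 0.0))


# The same film gets a different "cover" depending on who is browsing.
variants = [
    Artwork("fight_scene_still.jpg", "action"),
    Artwork("kiss_scene_still.jpg", "romance"),
]
action_fan = {"action": 0.9, "romance": 0.2}
romance_fan = {"action": 0.1, "romance": 0.8}

print(pick_artwork(variants, action_fan).image_url)   # fight_scene_still.jpg
print(pick_artwork(variants, romance_fan).image_url)  # kiss_scene_still.jpg
```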

We know Netflix has been personalizing your feed almost since its inception, but this is a new level of customization, offering you options that better suit your preferences. This is still early technology, but it might become important for photographers to shoot more looks per assignment, so that the best side of a product or service can be displayed based on the viewer's preferences.

I can imagine visiting a site for arranging a wedding and, depending on whether a man or a woman is visiting, being shown different information. I suspect well-developed sites already do this. Imagine a clothing brand's site that knows your height and waist size so it can show clothing that best suits your body type.

What this means for photographers is that there's more work to be done per shoot, and more money to be made. Every image used should be paid for over its duration of use, and it might even lead to several shoots instead of just one to capture a specific brand's range of clothing.

[via Netflix Technology Blog]

Wouter is a portrait and street photographer based in Paris, France. He's originally from Cape Town, South Africa. He retouches images for clients in the beauty and fashion industries and enjoys how technology makes new kinds of photography possible.

6 Comments

While I understand the reasons for doing this, overall it also tends to create unintentional echo chambers in which people are fed only the information that reinforces their prejudices.

So as a gross example, if the algorithm deduces I'm a portrait photographer, all it shows me wherever I look is what is of interest to a portrait photographer, and I miss out on what some great landscape artists are doing--leading me to believe that there isn't much of a world of landscape photography going on.

Is "based on whether it's a man or woman visiting" really the only narrow-minded example you could come up with? Can't think of how that would improve browsing through someone's wedding pictures....

It's merely an example. For wedding sites, if the visitor is male, maybe it'll show images of the groom and his groomsmen getting ready for the ceremony, and if it's the bride, maybe of her? It's not really gender-specific, just images that the viewer can relate to best. Again, it was merely an example.

Well, I think even if it is merely an example, it is quite a dangerous one. You did write "based on whether it's a man or woman", and that's an explicitly gender-specific statement. Algorithms are not some black magic, and it lies in our hands to make sure we understand what they do, so that we can use them consciously (and I don't think my browser showing me pictures based on the user's gender is very progressive). I'm sorry to pick on you; I don't think you had bad intentions when writing this article! But it's just a thoughtless comment I've come across on the internet multiple times, and I think it's important to call it out when examples like that are thrown into the world without some reflection.

Tessy, noted, and you are right: we are in control of what data technology/AI will use in the future.

Sometimes movie trailers already do this. I recall the very disparate trailers for the movie "Swing Kids." One trailer was about a bunch of high-spirited German kids in the 1930s having a good time dancing to bouncy American swing music. The other trailer for the same movie was all about Nazis and arrests and imprisonment.