Nipples Are Banned, but Animal Abuse and Brutal Violence Are OK: Instagram Is Broken

The moderation of social media platforms has fallen under intense scrutiny in recent months. A recent study suggests that while some content on Instagram is removed almost as soon as it is reported, other violations are allowed to stay online indefinitely, raising difficult questions about how the company is dealing with a growing problem.

Instagram’s parent company, Facebook, has not had a good year. A few weeks ago, the British parliament seized internal documents following Mark Zuckerberg’s refusal to appear before MPs to answer questions. This follows a series of scandals: Cambridge Analytica, alleged attempts to undermine George Soros, and revealing users’ private photographs to app developers, to name but a few.

Writing earlier this year, Mason Gentry investigated how effectively Instagram responded to reports of various types of content violation and came away with some worrying results. While he acknowledges the unscientific nature of his research, it raises concerns about how the platform deals with posts that breach its terms and conditions. Pornography, it seems, is dealt with almost immediately; by contrast, violence, gore, and animal cruelty can stay online indefinitely, though sometimes hidden behind a warning. 

In June this year, Instagram passed one billion users. Moderating Facebook’s two billion users poses a huge challenge, and Instagram seems no different, though Instagram has yet to be accused of contributing to a genocide. For some types of content, it can be understandably tricky to know where to draw the line. For example, the #gore hashtag (link deliberately not included) contains lots of incredible work with prosthetics and fake blood, some of which is so realistic that it's not clear what is real and what has been created. Elsewhere, however, the line is pretty obvious; it took me only a few clicks to find myself watching content so violent that I don't want to describe it. Lots of it.

As a photographer absorbed in curating my profile and admiring the work of some amazing artists, it’s not always apparent how much of Instagram is filled with truly terrible things. I’ve written before about how Instagram is a cesspit of populist content that is driven by clicks as opposed to quality. I’ve also complained at length about Instagram’s clear reluctance to combat freebooting on its platform, happy to see content stolen as long as users stay in the app, consuming its adverts. What I failed to realize was how much of Instagram is violent, graphic, and seemingly free of moderation. Around the world, thousands of 13-year-olds will be receiving new electronic devices this Christmas, many of them no doubt opening new Instagram accounts. Terrifyingly, those children, with all the parental controls in place, could in just a few clicks be watching footage of animals being abused or, as I just discovered, people being executed. In Gentry’s experience, reporting this content seems to make little difference.

Why does pornography get removed so much more quickly? My guess is that finding human moderators willing to sit and look for sexual content is much easier than finding those happy to watch violence. Given that Facebook has established a reputation for failing to give its own moderators the support they need to deal with mental trauma, this would not be surprising.

As I noted in my rant against freebooting, you have to question whether there’s any incentive for Instagram to address this problem. Clicks are clicks, and clicks are ad revenue. For companies that are worth such vast amounts of money, we need to increase the pressure on them to be more accountable and to invest some real resources in moderation rather than the token efforts that have been put in place so far. If you have billions of users creating billions of dollars in profit, moderating their content comes with the territory.


32 Comments

revo nevo's picture

sure this is ok
https://www.instagram.com/p/BqInYMPBGax/?utm_source=ig_web_copy_link

but some more artistic nude portrait is banned in seconds

Fritz Asuro's picture

I know, right? But technically, it's not showing any nudity/nipples and will be considered "dancing".
I think there should be an NSFW or mature content mode for Instagram where only users of legal age can view nude posts, as long as the content can't be considered pornography. Though this won't stop younger audiences from viewing the content, at least it could save Instagram from lawsuits somehow.

Motti Bembaron's picture

Won't do a thing. Kids can watch anything they want if they want. How about trying to prevent those kids from playing violent games instead :-)

Fritz Asuro's picture

Like I said, it won't stop younger audiences from accessing such content.

About violent games, I think they're no different from any other type of media, like films and shows with mild violence. It's the parents'/guardians' duty to clarify that what kids see in games and shows is not something to do in the real world.
I, my brothers, my cousins, and a lot of my friends grew up playing violent games and watching films with violence, but we always knew it wasn't something to do in real life. So far, we've all turned out fine.

Motti Bembaron's picture

It seems OK to watch and play violent games that aren't the real world, but not OK to watch REAL naked bodies.

Why is it up to the likes of Instagram to control nudity for under age but it's up to the parents when it comes to violent games or movies?

Mauro Scattolini's picture

Motti, the correlation between violence and video games has yet to be proven. As far as I know, and from the research papers I've read, there is no confirmation of that phenomenon. I might not be 100% up to date, but that's where it stands. What's truly dangerous is a culture that promotes violence by having laws such as 'Stand your ground'. But I agree with you about the absurdity of banning a body while having kids watch violent things. I guess that if we are not talking about snuff content, with the right parenting, a kid could basically watch anything. As I said, with the right parenting.

David Pavlich's picture

I remember when 'Mortal Kombat' came on the scene. My son played it and played it well...."Flawless Victory". :-) And now, he's a professional wedding photographer. :-)

Their issue is not a moral one but a money one. They fear advertisers don't want their ads next to explicit content. Same with YouTube demonetizing loads of videos and channels, and Tumblr removing porn. That doesn't explain why nudity is a problem but violence seems to be OK. My bet is that it's easier for machine learning algorithms to flag images with nipples... But America's glorification of violence and guns and aversion to nudity definitely predates social media.

Motti Bembaron's picture

"...But America's glorification of violence and guns and aversion to nudity definitely predates social media..."

Very true.

Motti Bembaron's picture

Violence is OK, sex is a sin... a cultural oddity.

Julian Ray's picture

Instagram. Someone has to keep us safe from the female nipple!

Rifki Syahputra's picture

maybe it's time to try the male nipple dude..

Color Thief's picture

What if we covered the female nipple by photoshopping a male nipple over it? Would that be okay?

David Pavlich's picture

As long as the hair is left out. :-)

David Love's picture

In short, Instagram sucks, Facebook sucks and Mark Zuckerberg is the devil. Done.

Rob Mitchell's picture

It's a free service run by a self-righteous person; how can it be any other way?
I posted a playful summer photo of my 2-year-old lounging 'topless' in the sun like a proper diva. WHAMMO: immediately deleted, with a warning that it contained prohibited content. Eh? After swearing at the screen and cursing the morally messed-up situation in the US of A, I shrugged it off. You get what you pay for.

David Pavlich's picture

It used to be okay to take pictures of other parents' kids. In most cases, it's still 'legal', but because of social media and pedophilia being spread so easily via social media, a youngster even approaching nudity is not going to be acceptable. Right or wrong, welcome to the 21st Century.

A lot is justified, but there are 'helicopter parents' that would send their kids out the front door with three layers of bubble wrap on them if they could. I'm an official geezer. When we left the house in the morning, there were many days that we'd not come home until dinner. If that happened today, there'd be SWAT teams out looking for me. ;-)

Rob Mitchell's picture

Luckily I don't live in the UK anymore where smiling at a kid is even frowned upon in certain areas.
Like you, I was out after breakfast, home before dark and got a good hiding if I was late. No mobile phones and no worries. If we did see a flasher at the park we just chased him off on our bikes.
I've done my best to bring up my kids the same way and as I say, luckily, here in Belgium, there is some common sense left. Some.

David Pavlich's picture

About the only thing that changed from the way I was as a kid to how I raised my son is that we wanted him to tell us if he was leaving his original destination. He would head out on his bicycle and do what kids did. The world has changed, indeed.

Jason Lorette's picture

" I'm an official geezer. When we left the house in the morning, there were many days that we'd not come home until dinner. If that happened today, there'd be SWAT teams out looking for me. ;-)"

This...sooo much this. I would leave the house and no one would have a clue where I was until I came in for lunch or dinner, out until the street lights came on. People would lose their minds if a parent allowed that now.

David Pavlich's picture

Street lights?! :-) I lived in the sticks when I was a kid. When we disappeared, we were in the woods or fishing or some such thing. Skin a knee? Wash it in the creek or lake. Now, we go into a grocery store and the first thing you see is a sanitizing station for the shopping buggy handles. ;-)

Rob Mitchell's picture

Social sharing was bragging to your pals at school on Monday about the jumps you made on the bike and how high you got in the famous tree at the park.
No selfies needed to prove you were cheating death or a good face-plant.

David Pavlich's picture

Or when you're fishing with your friend and you're behind him changing lures, and he gets ready to cast and sinks a green and white Daredevil in the side of your head! I had to poke the hook through and have him cut off the barbed end so I could get it out.

Did I go home to bandage it up? Not a chance, the bass were biting!

Robert Nurse's picture

You're lucky they didn't label it child porn and sic the feds on you. smh

user-165452's picture

Instagram's core direction and ethics have gone down the drain. I no longer believe that it has its primary user base in mind. I report posts on a daily basis; it's not my job to police Instagram. Thanks for the article; it's good that people voice their concerns, and if they don't change soon, I suggest everyone hold onto their ethics and leave.

Rob Mitchell's picture

I've had a massive thin-out. People I knew, normal people, have suddenly become effluentcers, doing giveaways and paid posts. All binned. The look-at-me brigade, with the personal boyfriend photographer: all binned. Every five posts, an ad pops up. There's lots I like about it, people I like on there, people I would never have found if it wasn't for the app, interesting things I'd have missed out on. For now, I try to look through my fingers and remind myself it's free. It's crap, of course it's crap.

Sophia Loren once said:
Violent films can have terrible consequences.
But what can happen after sex films?
People only have more children.

Jason Lorette's picture

The world is so backwards...on sooo many things.