On January 17, 2023, the Oversight Board issued a decision advising Meta to revise its nudity and sexual solicitation policies. Here's what that decision actually says.
The Guardian, the New York Post, Glamour Magazine, and a host of other news outlets are reporting that Instagram might finally adjust its policies to allow posting nipples regardless of gender. The news comes after the Oversight Board, an advisory body for Instagram and Facebook, issued a decision on an appeal of two removed posts and suggested that Meta adjust its nudity policies to allow topless photos.
What Is the Oversight Board?
I wrote about the Oversight Board in a previous article. In some cases, if you have had a post removed, you receive a reference number, and with that number, you can appeal your case to the Board. The Oversight Board is an independent committee of academics, politicians (such as the former Prime Minister of Denmark), ethics specialists (including a Nobel Peace Prize laureate), and others who advise Meta on its content moderation policies. The Board primarily deals with matters of global significance, such as Meta's responsibilities in handling misinformation related to elections or the pandemic.
Almost Everyone Else Is Wrong About What Started This Decision
Here is the January 17, 2023 decision in its entirety online. It was about 30 pages when I printed it to a PDF and highlighted the relevant points, so here is a quick summary.
I've seen countless posts and story shares with clickbait titles about how Instagram is now going to free the nipple, citing this decision. Basically, the version you'll see shared is that a non-binary couple posted some topless photos, the photos were taken down for violating the nudity policy, and the Oversight Board ordered that they be restored and that Instagram allow all nipples. That's about 20% accurate.
It is true that the images in question featured a transgender, non-binary couple who posted two photos with a caption about raising money on GoFundMe for top surgery (to flatten the chest) because they were having issues with insurance coverage. The images were flagged by Meta's automated systems and found not to be in violation. They were then reported by users and reviewed by human moderators, who again found no violation. The images were reported once more, and that time, the human moderator found them in violation of the community guidelines and removed them.
Here's where everyone gets it wrong. First, the nipples in the images in question were completely covered.
The second, and more egregious, thing everyone gets wrong is that the pictures were not removed for violating the nudity policy; they were removed because Instagram thought the couple was trying to engage in prostitution!
Third, the Board did not demand that Instagram restore the photos. Instagram had already done that on its own before the case was decided. Meta also acknowledged that the posts were removed in error.
The Incredibly Insightful Analysis of Instagram's Moderation Policies
The decision takes a deep look at how and why Instagram implements its moderation policies. As I suspected, the process is completely backwards, and the Oversight Board tore Meta to shreds over it. Here's what was uncovered.
Instagram Has Secret Moderation Policies in Addition to the Public-Facing Guidelines
You can find Instagram's Community Guidelines here. That is what the public has access to, but it is just the tip of the iceberg compared to the complete set of rules. The decision evaluated and referenced these hidden rules. For example, there are 18 additional rules for nipples that are available only to human reviewers.
For sexual solicitation, there are additional guidelines that include lists of poses purportedly used by prostitutes to implicitly offer sex in exchange for money.
The "Known Questions" refers to a list of internal-only secret guidelines that the reviewers use to moderate content.
The Oversight Board Found These Hidden Rules Absolutely Ridiculous
The Oversight Board asked Meta why its automated system would flag these images as a violation, and Meta didn't know. The Board asked why a human reviewer would think this was sexual solicitation, and Meta didn't know. The Board pointed out that the rules are not just vague but inconsistent.
Meta's Policies Disproportionately Affect Women and the LGBTQI+ Community
So, bare female breasts are not allowed under the nudity policy, but covered female breasts can be removed under the sexual solicitation policy. Basically, just don't have breasts in any photo, covered or uncovered. The Oversight Board noted several other instances where the rules are overly broad and disproportionately affect women and the LGBTQI+ community.
Meta has a website here where it tracks statistics related to the enforcement of its nudity and sexual activity policies. The decision found a high number of false positives in applying the overly broad and arbitrary rules to the regulation of female bodies, with only 21% of removed images being restored. The decision also cites a study that found 22% of the images of women's bodies removed from Instagram were false positives.
The Rationale Behind Instagram's Nudity and Sexual Solicitation Policies
The Board looked at Meta's reasoning behind its policies for censoring nudity. Meta stated that there are two main issues it wants to avoid: 1) the sharing of nude photos of underage girls, and 2) the non-consensual sharing of nude photos. It is impossible to look at a bare breast and determine whether it belongs to a 17-year-old or an 18-year-old, so Meta decided to simply disallow female breasts. Likewise, it has no way of knowing whether there was consent to share an image, so it doesn't allow those images to be published at all. Meta cites a survey that shows "90% of those victimized by non-consensual digital distribution of intimate images are women." Because the issue overwhelmingly affects images of women, female nipples are banned while male nipples are unregulated.
The Oversight Board's Recommendations
The Oversight Board did not request that Meta revise its policies to free the nipple. It gave three recommendations. First, Meta should define clear criteria that ensure all users are treated consistently. Second, Meta should provide users with more explanation of the criteria for sexual solicitation. Third, Meta should revise its internal reviewer guidance as it relates to sexual solicitation.
This decision has almost nothing to do with displaying nipples, as that was not even an issue in the underlying images. It has much more to do with Meta's practice of scanning images and arbitrarily deciding whether the women in them are offering sex in exchange for money based on how they are posing or what they are wearing. That is an abhorrent policy that should never have existed in the first place, and it is great to see it put under a microscope and dismantled.
Goodness, I made it through about half of their guidelines before giving up and just being thankful I don't photograph anything that shows a nipple!
If the issue is the sexualization of the nipple, then why does it matter if the person in the photo is binary or non-binary? Wouldn't it only matter if the person viewing it finds it sexually appealing? And if we take that stance, couldn't one argue that male nipples could be just as sexualized as female nipples?
None of it makes any sense.
As my prestigious lawyer brother-in-law would say when reviewing a legal case like this, the note he would write in the margin: "slubberdegullion."
Oooooh! Nipples... I would not be without mine.
So glad I don't shoot model photography. Trying to work out what Instagram considers acceptable (art) or inappropriate seems to be a total minefield.
It's highly unpredictable and counterintuitive what they allow and don't allow. Very frustrating.
Can't say w*tback on FB or Twitter. I left... no big loss to me.
I appreciate Fstoppers keeping us abreast of this situation.