Instagram’s Freebooting Moneyspinner Could Be About to End Thanks to Article 13

Earlier this week, the European Parliament voted to make huge changes to copyright legislation. The new directive could mean that Instagram will finally have to address the vast amount of freebooted content that proliferates on its platform.

The European Union Directive on Copyright in the Digital Single Market contains one particular detail — Article 13 — that has drawn strong criticism for threatening to undermine the internet’s democratic principles. The changes mean that platforms will have to take greater responsibility for any content that is uploaded without the permission of the copyright holder. While critics argue that this has implications for memes, satire, and user-generated content, it could be good news for creators who are frustrated at seeing their copyright ignored and profited from by the likes of Instagram.

If EU member states go on to approve the directive, changes could come into force in around two years. Platforms will have to filter content before it is uploaded as they will now be responsible for any copyright infringements, rather than relying on copyright holders to randomly stumble upon their content and file a complaint. YouTube already scans content to check whether it matches with existing videos, allowing creators to block or monetize the use of their videos and music. The Google-owned platform warns that increasing these measures will involve blocking and removing vast amounts of content and could have implications for citizens in EU countries accessing its site.

Will Instagram Start Paying Content Creators?

One company that should be particularly worried is Instagram, as much of its ad revenue is generated by users viewing freebooted content. Not only has it consistently ignored available technology that would alert users to copyright infringements, it actively promotes freebooted photos and videos. If I head to the search tab on Instagram, typically three of the first eight recommended results are copyright infringements. In addition, I gave up following the #parkour hashtag as more than half of the posts appearing in my feed were freebooted. Instagram generates a huge amount of its advertising revenue by serving illegal content, and the EU has just taken a huge step toward bringing that to an end.

The "top posts" for the #parkour hashtag on Instagram. Five of the eight posts are freebooted.

Under current legislation, liability for copyright infringement lies with the perpetrator, and Instagram conveniently evades any responsibility, ignores its own terms and conditions, and profits accordingly. Under the new rules, Instagram will suddenly be accountable for these infringements and seems to have two options: it can prevent infringing content from being uploaded, or it can pay creators for their permission. If the photographs and videos stay online, the copyright holder needs to receive a licensing fee. Instead of Instagram making huge amounts of money from your freebooted images, some of that revenue would come to you.

A Threat to Freedom of Expression

Opponents fear that new levels of automated filtering to prevent uploads would be disastrous for the internet, stifling its organic evolution and restricting freedom of expression. An open letter signed by the likes of Wikipedia founder Jimmy Wales and Web inventor Sir Tim Berners-Lee argued that the internet “could not have developed as it has if Article 13 had been in effect 25 years ago.” It also notes that fallacious copyright claims have been used as a tactic for silencing critical content.

The full implications of Article 13 remain to be seen. If you have further thoughts on how the changes could affect photographers and filmmakers, please make sure to comment below.


Andy Day is a British photographer and writer living in France. He began photographing parkour in 2003 and has been doing weird things in the city and elsewhere ever since. He's addicted to climbing and owns a fairly useless dog. He has an MA in Sociology & Photography which often makes him ponder what all of this really means.


Wow, that would be a game changer!

Oh man, I hope this happens. I know this is a pretty big move in controlling people, whether for good or bad, but if this happens we might be able to use the discover function and find ACTUAL artists and content producers. Right now, if I use the discover/search function, 90% of the content and profiles I find will be freebooted content, promoted garbage, and spam profiles. I never use Instagram because of this. When I go to discover new stuff, I want to see the creators and THEIR work, not some 16-year-old kid who figured out how to download or screenshot images from Google and social accounts and then post them on their shell accounts for likes and follows. This is but one problem with all social media.

Beyond the implications for freedom of speech that the Wiki foundation and others have expressed, the bigger problem is that this kind of content filtering is nigh impossible even for massive companies like Facebook or Google. Ever found a clip from your favorite movie on YouTube but the image is flipped, scaled, color shifted, or there's some weird artifact like snow? This is done to beat their automated content filters and it works pretty well. Image recognition is already a tricky challenge when the images aren't actively trying to not be recognized. There's even a class of algorithms called GANs that are specifically designed to beat recognition algorithms.
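The flip trick described above defeats even the simplest perceptual fingerprint. As a minimal, deliberately naive sketch (this is illustrative only, not how YouTube's or Instagram's actual filters work), an "average hash" fingerprints an image by thresholding each pixel against the mean brightness — and a plain horizontal mirror changes every bit:

```python
# Naive "average hash" perceptual fingerprint on a toy grayscale image,
# showing how a simple left-right flip defeats exact-hash matching.
# All values here are illustrative.

def average_hash(pixels):
    """Return a bit string: 1 where a pixel is above the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(a, b):
    """Count differing bits between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

# A tiny 4x4 grayscale "image" (0-255 brightness values).
original = [
    [200, 180,  30,  20],
    [190, 170,  25,  15],
    [ 60,  50, 210, 220],
    [ 55,  45, 215, 225],
]
# The same image mirrored left-to-right -- a common freebooting trick.
flipped = [list(reversed(row)) for row in original]

print(hamming(average_hash(original), average_hash(flipped)))  # 16: every bit differs
```

Real systems use far more robust fingerprints (frequency-domain hashes, learned embeddings) precisely because trivial edits like this break naive matching — and determined uploaders keep finding transformations that slip through anyway.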

This will drastically add to the difficulty of starting a content sharing platform as a small company. If you're excitedly waiting for interesting alternatives to Instagram or YouTube this legislation will make you wait a lot longer. They'll have to purchase a content filter from one of the big players and incur much higher costs, put in a ton of work to develop their own, or just not make their product available to Europe. Even Fstoppers will need to find a way to be compliant with their image hosting forums.

Also, I don't expect this to benefit small content creators and photographers very much. Google and Facebook already don't care about reposting unless companies like Warner Brothers send copyright notices. Their content filter needs to know what content is copyrighted so unless you're submitting their images to their database regularly they won't filter for them. Sure there's legislation now but regulators aren't going to be sorting through every image getting posted and enforcing this law. The best bet will be using this as a legal precedent for a class action lawsuit and that would take years and a lot of money. So if you want to support this because you're sick of your images getting reposted I think you'll be disappointed.

Interesting thoughts. Thanks for posting.

If I link to Sky News Ireland, is Fstoppers going to write the cheque??

There’s a huge difference between sharing content via acceptable means like links to the site in question or using the share feature vs stealing content and posting it on your own page.

For example, sharing a link requires you to click on the link to view the content and thus visit the creator's site. If, however, someone took an article from Sky News and copied it onto their own website, then that's not sharing; that's stealing.

I think you’re severely confusing sharing and stealing here, they’re different.

No, it's Article 11.

So if you grab the lead image, title, and description from the page, you'd have to license it.

So... it's not stealing, but it also requires payment to the owners of the content.

The article does contain lots of exemptions, specifically for news, critique, and so on. The article itself is not the problem; the problem is the massive platforms that don't want to do anything to improve their content filters or take on any responsibility.

The above screenshot sounds like scaremongering nonsense.

You have to license images if you want to use them in articles regardless; this was true well before Article 13. On Fstoppers, if we want to use an image, we need to have permission; we can't just take it.

Only if we have a license can we use them; otherwise, it's a no-go.

There are exemptions as described above but that hasn't changed.

Quoting the article: "Wikipedia founder Jimmy Wales and Web inventor Sir Tim Berners-Lee argued that the internet 'could not have developed as it has if Article 13 had been in effect 25 years ago.'"

So these individuals feel it's perfectly ok to steal content from others in order to promote the internet? They don't own the content but they give permission to steal it. THEY should be locked up. I have lost a tremendous amount of respect for Mr. Berners-Lee. As far as Mr. Wales is concerned... certainly a grey area at best.

Acknowledging that the internet would not be the same if the circumstances were different says nothing about whether they feel it's "perfectly ok" to do anything. It's just an observation that things would be different if the rules had been different. Sir Tim Berners-Lee never passed any judgment over whether it was right or wrong, merely pointed out that we would not have the same internet if those rules had been in place from the beginning.

It's actually already affecting people's accounts.

I think he's lying. Where is this supposed 300k channel?

Never happen. Instagram is too big, with too much money. This would threaten their business model. If it goes through though, holy shit, that will be HUGE

It's a disaster. Rather than risk financial exposure, public platforms will severely limit individual posting because, realistically, there is NO practical way to know if something is legitimate or not.

It will affect the open source software business, bloggers, and even artists who want to get their work out there.

And yes, it will become a censorship tool (tying into the vague 'hate speech') to prevent dissent. It's much easier to censor corporations than individuals.

The censorship tool is probably the point.

This will actually make sure new image-based startups can't make it. So in the end, all the big boys will continue thriving while startups won't be able to keep up.

If I were to start a social photography platform, then before letting a user publish a photo, I would have to validate it against millions of images uploaded elsewhere. I'd either have to buy an expensive service from a rich company that maintains a database of original works and integrate with it to run the check, or build all of that myself. Either way, the cost of the startup goes up (APIs are charged per call, and image recognition APIs in particular are expensive).
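To make the per-call cost point concrete, here is a hypothetical back-of-envelope calculation; the price and upload volume below are invented for illustration, not quoted rates from any real API:

```python
# Illustrative filtering-cost estimate for a small photo platform.
# Both figures are hypothetical assumptions, not real pricing.
uploads_per_day = 50_000      # assumed daily upload volume
price_per_check = 0.001       # assumed USD cost per image-match API call
monthly_cost = uploads_per_day * 30 * price_per_check
print(f"${monthly_cost:,.0f}/month")  # $1,500/month at these assumptions
```

Even at these modest made-up numbers, mandatory pre-upload checking becomes a recurring cost that scales with growth — exactly the kind of overhead an incumbent absorbs easily and a startup cannot.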

Corporations might win, or can at least afford to take the risk at times, thanks to their enormous cash flow.

This also raises another question: if I take a picture at the same spot, inspired by another picture, and despite putting lots of my own effort into producing it, it has only slight differences from the original (sunlight, post-processing, etc.), does it get blocked because the composition is very similar at the same location? Creative ideas can occur to many people at the same time. I guess we can't post online anymore. Most photos taken at iconic locations end up being very similar, and so do most of the poses people strike. This is like patenting (or copyrighting) operating systems or source code.

Once again, governments have played into corporate hands, and the corporations win!

A content-hosting site can only be exempt from using upload filters if it has been available for less than three years, has a yearly turnover below €10 million, and has fewer than five million unique monthly visitors — that leaves a lot of websites out in the cold.

Ignoring the greater implications for a second, Instagram can and should do it the way Facebook has forever. That is, make it easy for any account to embed a linked and clickable version of someone else’s post. That way the original creator gets the exposure and traffic, while whatever aggregator account maintains its reputation as a hub for that sort of content. Also it would take a tremendous load off Instagram’s servers, since they’d only need to host the original image rather than dozens or hundreds of copies. It’s so simple and obvious that really they should have to answer for not doing this already.

Of course that also means you won't be able to post any of your photos from "well-known" locations on any "compliant" service any more because they will be automatically filtered for infringing on some random other depiction of the same subject.

Enjoy your increased exposure.

If this gets rid of reaction videos, I'm all for it.

Ideally, this works as planned with no unintended consequences. If so, I'm all for it. Be original. Be creative. Or find a way to make money that isn't theft and misrepresentation.
Interestingly, now that buildings (and even their lighting) and landmarks are being copyrighted, this could be disastrous...
Perhaps a step in the process would be to actually use EXIF info and tie it to the account. I know it can be faked, but it would be a good first step to weed out the illegitimate uploads.
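The EXIF idea above could look something like this minimal sketch, with a plain dictionary standing in for parsed EXIF data (real code would use a library such as Pillow or exifread; the account registry and the specific checks here are hypothetical):

```python
# Naive first-pass check of an upload's EXIF fields against data the
# account holder registered in advance. As noted above, EXIF is easily
# stripped or faked, so this could only ever be a weak first filter.

registered_accounts = {
    # account name -> details the photographer registered (illustrative)
    "andy_day": {"camera_serial": "123456", "artist": "Andy Day"},
}

def upload_looks_legitimate(account, exif):
    """Return True if the upload's EXIF matches the account's registration."""
    expected = registered_accounts.get(account)
    if expected is None:
        return False
    return (exif.get("BodySerialNumber") == expected["camera_serial"]
            or exif.get("Artist") == expected["artist"])

print(upload_looks_legitimate("andy_day", {"Artist": "Andy Day"}))      # True
print(upload_looks_legitimate("andy_day", {"Artist": "Someone Else"}))  # False
```

Freebooted copies downloaded from the web typically arrive with EXIF stripped, so even this crude check would flag a lot of the screenshot-and-repost traffic — while legitimate uploads from a registered camera would pass.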

The opportunity here is for the big players, I’m looking at you Adobe, to implement embedded copyright protection in their software. Making it more difficult to remove the digital signatures. Link that to a cloud based storage system that can automatically crawl the web for violations using AI and alert the image owners.

Of course it wouldn’t be perfect. But what if it reduced copyright infringement by 10%. It’s a good start.
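As a toy illustration of embedded marking, here is a least-significant-bit watermark. This is the textbook starting point, not what Adobe or IMATAG actually ship; production watermarks must survive resizing, cropping, and recompression, which LSB embedding does not:

```python
# Least-significant-bit watermarking on a toy 8-pixel grayscale image:
# hide an owner ID in the low bit of each pixel, invisibly to the eye.

def embed(pixels, bits):
    """Overwrite the low bit of each pixel with one watermark bit."""
    return [(p & ~1) | b for p, b in zip(pixels, bits)]

def extract(pixels, n):
    """Read back the first n watermark bits from the low bits."""
    return [p & 1 for p in pixels[:n]]

owner_id_bits = [1, 0, 1, 1, 0, 0, 1, 0]      # e.g. a short owner code
image = [200, 131, 54, 77, 90, 13, 240, 66]   # toy grayscale pixel values

marked = embed(image, owner_id_bits)
print(extract(marked, 8) == owner_id_bits)  # True: watermark recoverable
```

Each pixel changes by at most one brightness level, so the mark is imperceptible — but a single round of JPEG recompression destroys it, which is why commercial systems use far more robust, redundant embedding.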

Agreed. IMATAG has shown how the technology can work effectively. There might now be the incentive for it to finally be implemented.

I’m all for article 13. I think platforms should be held responsible for the content they host I mean they’re making money from it.

Companies have been using stolen content on Facebook and Instagram to generate income via sponsored posts. Even after you file a takedown request, Facebook has already profited from the content, and the creator gets nothing.