Photographers are increasingly concerned about AI's impact on their art. While arguments rage about AI and copyright theft, artists are adopting two new tools designed to make their work useless to the models that scrape it. It's time photographers started employing them, too.
AI scraping is the parasitic practice of extracting data, including music, text, and images, from online sources to train generative AI models. Much of what it feeds on is copyrighted, and it has already prompted several copyright infringement lawsuits against AI companies. Understandably, the big tech companies, such as xAI, OpenAI, Google, and Meta, all withhold the data sources they used to train their models. Nevertheless, it is widely acknowledged that they have gathered a significant amount of material for free from the public internet.
Most artists lack the resources to file a lawsuit when their work is stolen in this way. Therefore, we are always looking for ways to prevent this parasitic scraping from happening. Whenever there is a need, something will come along to meet it.
What Are Glaze and Nightshade?
Glaze is a tool designed by Shawn Shan and a small team at the University of Chicago's SAND Lab. It is a free download that makes changes to the art that, in theory, the human eye cannot perceive but that befuddle an AI's perception of the image's style. Meanwhile, Nightshade, produced by the same team, prevents the software from correctly recognizing the content of the photo.
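Roughly speaking, the team's published work describes computing these changes by optimizing them against an AI model's internal representations, which is well beyond the scope of a blog post. But the constraint both tools work within is easy to illustrate: each pixel may only shift by a tiny, bounded amount, so the edit stays invisible at normal viewing sizes. Here is a minimal Python sketch of that "perturbation budget" idea. To be clear, the random noise it adds is purely illustrative and would not fool a model the way Glaze's optimized perturbations are intended to; the file names are hypothetical.

```python
# Illustrative only: Glaze/Nightshade optimize their perturbations against
# a model's feature representations. This sketch just shows the "budget"
# idea: shifts of a few intensity levels per channel are invisible to us
# at normal viewing sizes.
import numpy as np
from PIL import Image

def perturb(src_path: str, dst_path: str, budget: int = 3, seed: int = 0) -> None:
    """Apply a bounded, reproducible pixel perturbation of +/- `budget` levels."""
    img = np.asarray(Image.open(src_path).convert("RGB"), dtype=np.int16)
    rng = np.random.default_rng(seed)
    noise = rng.integers(-budget, budget + 1, size=img.shape, dtype=np.int16)
    out = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(out).save(dst_path, quality=95)

if __name__ == "__main__":
    perturb("photo.jpg", "photo_perturbed.jpg")  # hypothetical file names
```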
Installing and Using Glaze
Glaze is designed to stop AI from copying an artistic style. It is available as a free, non-commercial download and works with Windows and macOS 13.0 and higher. The team is working to get it functional on older Mac machines.
The download is quite large, at 2.5 GB, and you need at least 4 GB of free storage. Once it has downloaded, unzip the folder and launch the app by opening the program file inside. I unzipped it to the Program Files folder on my C drive and created a shortcut to the program file. Note that on Windows you need to run the program as an administrator.
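If you prefer to script the extraction rather than do it by hand, a few lines of Python will unpack the archive; this is purely optional housekeeping. The paths below are assumptions, so adjust them to wherever your download landed, and remember that writing into Program Files requires an elevated (administrator) prompt.

```python
# Optional: unpack the Glaze download by script instead of by hand.
# Paths are hypothetical; writing to Program Files needs admin rights.
import zipfile
from pathlib import Path

archive = Path.home() / "Downloads" / "Glaze.zip"  # assumed archive name
target = Path(r"C:\Program Files\Glaze")

target.mkdir(parents=True, exist_ok=True)
with zipfile.ZipFile(archive) as zf:
    zf.extractall(target)
    print(f"Extracted {len(zf.namelist())} files to {target}")
```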
![](https://cdn.fstoppers.com/styles/full/s3/media/2025/01/28/_1010062new_year_before_dawn_fast_low-protected-intensity-low-v2.jpg)
The Glaze notes say that Microsoft has not yet registered the app as safe to run because it is slow at issuing its signature keys, so you might get a Microsoft warning when you try to install it. I installed it with no warnings, although it was temporarily blocked by my security software.
The program ran using my GPU. After allowing it to download extra resources, you select a photo and set the Intensity and Render Quality sliders. Intensity affects the magnitude of the changes added to your art, with higher values more likely to leave visible artifacts. Increasing the Render Quality slider raises the protection but also lengthens the time it takes to render the image. Finally, you click the purple "Run Glaze" button.
Although the program estimates a rendering time of around two minutes at default values, it took my computer about a minute to “glaze” a 20-megapixel JPEG. The minimalist image I tried first was left with visible artifacts in the areas of negative space, so I reduced the Intensity to Low to lessen the visible changes. I also increased the Render Quality, which should give better protection; that run took about four minutes.
Sadly, the artifacts were still visible, especially in the negative space. Nevertheless, in busier areas of the picture, they were not noticeable at all. Likewise, after I ran it on a photo with more features, I could not see any artifacts in the result.
![](https://cdn.fstoppers.com/styles/full/s3/media/2025/01/28/_c310007old_year_sunrise_2024-protected-intensity-default-v2.jpg)
Installing and Running Nightshade
Nightshade acts as a poison, corrupting the image data that the AI relies on. Whereas Glaze hides the style of the image, Nightshade confuses the AI into thinking the subject is something entirely different.
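Nightshade's actual method is subtle: the published work describes shifting an image's features toward a different concept, so the picture still looks right to us but teaches the model the wrong thing. The blunt version of data poisoning is easy to demonstrate, though. The toy sketch below, which has nothing to do with Nightshade's real algorithm, simply mislabels a growing slice of a training set and shows a classifier's accuracy falling as a result:

```python
# Toy poisoning demo: mislabeling part of the training set hurts the model
# that trains on it. Nightshade aims for a similar effect covertly, by
# shifting image features rather than flipping labels outright.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for poison_rate in (0.0, 0.2, 0.4):
    y_poisoned = y_tr.copy()
    n = int(len(y_poisoned) * poison_rate)
    idx = np.random.default_rng(0).choice(len(y_poisoned), size=n, replace=False)
    y_poisoned[idx] = (y_poisoned[idx] + 1) % 10  # systematically wrong labels
    model = LogisticRegression(max_iter=5000).fit(X_tr, y_poisoned)
    print(f"poisoned {poison_rate:.0%} of training labels -> "
          f"test accuracy {model.score(X_te, y_te):.2f}")
```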
Like Glaze, Nightshade is a big program and takes a while to download. I again unzipped it into my Program Files folder and created a shortcut to the .exe file on my desktop. This time, I did get a Microsoft warning, and I also got a report from my security software.
Rendering is much slower with this software. The program estimates five minutes for the fastest results and up to 20 minutes for a high-quality render, although both ran considerably faster on my machine.
I tested the rendering on the same minimalist image I used with Glaze. At default settings, Nightshade also left significant visible artifacts on the photo, and again they were most apparent in the negative space.
![](https://cdn.fstoppers.com/styles/full/s3/media/2025/01/28/_7081110arctic_tern-nightshade-intensity-low-v1.jpg)
The visible artifacts diminished with each successive increase in render quality but were still apparent even at the highest setting. Again, they became unnoticeable in the busy areas of the photos I tested it on.
What I Liked and What Could Be Improved Next Time
What I Liked
- It’s early days, but any step towards stopping the unwanted misuse of our photographs must be a good thing.
- Rendering was faster than the suggested times, but I have a powerful machine and graphics card.
- The programs work better in busy areas of a photo.
- Artifacts were less visible when the images were reduced in size (see the sketch after this list).
- It gives an extra layer of protection to photographers.
- It's free.
- It's great that some academics are on the side of creative artists.
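On the point about reduced sizes: since the artifacts become harder to see in smaller images, one practical workflow is to keep the full-resolution file offline and only glaze a web-sized copy for posting. A minimal Pillow sketch, with hypothetical file names:

```python
# Make a web-sized copy for glazing and posting; keep the full-resolution
# original offline. Artifacts are less noticeable at smaller sizes.
from PIL import Image

def web_copy(src_path: str, dst_path: str, max_px: int = 1600) -> None:
    img = Image.open(src_path).convert("RGB")
    img.thumbnail((max_px, max_px), Image.Resampling.LANCZOS)  # keeps aspect ratio
    img.save(dst_path, quality=90)

web_copy("eagle_full.jpg", "eagle_web.jpg")  # hypothetical file names
```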
What Could Be Improved Next Time
- Both programs left visible artifacts on the images. Hopefully, this will improve with later releases as photographic work is taken into account.
- It will be good when these programs develop into plugins for raw development and editing software packages.
- A proper installer would make the program more accessible.
- Faster, more lightweight programs would speed up the workflow. These tools are not currently suitable for bulk processing, although some of the preparation can be scripted (see the sketch after this list).
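Until batch support arrives, the before-and-after housekeeping can at least be scripted. As a sketch of the idea, assuming a folder layout of my own invention, this gathers web-sized copies of a portfolio into one staging folder so the glazing itself, still one image at a time in the GUI, goes as quickly as possible:

```python
# Stage resized copies of a whole folder for manual glazing; the GUI
# still has to be driven by hand for each image.
from pathlib import Path
from PIL import Image

SRC = Path("portfolio")  # hypothetical source folder
DST = Path("to_glaze")
DST.mkdir(exist_ok=True)

for path in sorted(SRC.rglob("*.jpg")):
    img = Image.open(path).convert("RGB")
    img.thumbnail((1600, 1600), Image.Resampling.LANCZOS)
    img.save(DST / path.name, quality=90)
    print("staged", path.name)
```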
In Conclusion
In 1996, John Perry Barlow's "A Declaration of the Independence of Cyberspace" heralded the Internet as a revolutionary democratizing force. It gave us easy access to information, promoted freedom of expression, encouraged global connectivity, and circumvented censorship. Sadly, it has also led to the opposite: falsehoods spread easily, views have become more polarized, and the Internet has brought cyber warfare, the spread of extremism, and the wholesale theft of copyrighted work.
With the advent of AI, we've seen each of those problems exacerbated. So, it is inevitable that ordinary, moderate-minded people are starting to resist.
I don't think there is any difference between us using tools that stop AI programs from pirating our work and photographers placing enormous copyright watermarks across their photos to prevent humans from stealing them. However, it's been widely reported that OpenAI, the maker of ChatGPT, has called these programs "abuse." That from a $157 billion company, one of many allegedly training their models on our art for free.
Without a doubt, there will be ongoing friction between creative artists and the AI companies that make huge profits by using our work without permission. Consequently, photographers and other artists will continue to find ways to protect what we do. It's an uphill battle, as we are up against super-rich parasites trying to undermine our creativity and livelihoods for their own benefit.
![](https://cdn.fstoppers.com/styles/full/s3/media/2025/01/28/golden_eagle_1_of_1.jpg)
Moreover, AI is censored. Try asking Microsoft's Copilot to create a satirical cartoon of your least favorite politician. Or ask China's DeepSeek app, which recently knocked around $500 billion off Nvidia's market value, a question about the 1989 Tiananmen Square massacre, and you'll find that freedom of information is fine so long as it suits whoever owns the AI.
The battles will go beyond the courtrooms and legislation, especially as governments side with businesses instead of the population; the UK government is currently consulting on whether to override our copyright in favor of AI training, and I am sure similar discussions are ongoing elsewhere in the world.
If the wholesale theft of our images continues, no doubt we will see more of these kinds of programs appearing. We may also see good images hidden behind paywalls to keep them away from AI. That's not a bad thing; after all, it wasn't that long ago that we bought books and magazines to view top-quality photography.
This rebellious software is in its infancy and has a way to go before it’s perfect for photographers. Nevertheless, I hope it will evolve and grow to protect us from the super-rich parasites that try to suck the blood from our art. Thank you to the University of Chicago. Keep up the good work.