

Yet it is “simplistic to think that if you have a real security problem in the wild and you’re trying to design a protection tool, the answer should be it either works perfectly or don’t deploy it,” Zhao says, citing spam filters and firewalls as examples. Defense is a constant cat-and-mouse game. And he believes most artists are savvy enough to understand the risk. 

Offering hope

The fight between creators and AI companies is fierce. The current paradigm in AI is to build bigger and bigger models, and there is, at least currently, no getting around the fact that they require vast data sets hoovered from the internet to train on. Tech companies argue that anything on the public internet is fair game, and that it is “impossible” to build advanced AI tools without copyrighted material; many artists argue that tech companies have stolen their intellectual property and violated copyright law, and that they need ways to keep their individual works out of the models—or at least receive proper credit and compensation for their use. 

So far, the creatives aren’t exactly winning. A number of companies have already replaced designers, copywriters, and illustrators with AI systems. In one high-profile case, Marvel Studios used AI-generated imagery instead of human-created art in the title sequence of its 2023 TV series Secret Invasion. In another, a radio station fired its human presenters and replaced them with AI. The technology has become a major bone of contention between unions and film, TV, and creative studios, most recently leading to a strike by video-game performers. There are numerous ongoing lawsuits by artists, writers, publishers, and record labels against AI companies. It will likely take years until there is a clear-cut legal resolution. But even a court ruling won’t necessarily untangle the difficult ethical questions created by generative AI. Any future government regulation is not likely to either, if it ever materializes. 

That’s why Zhao and Zheng see Glaze and Nightshade as necessary interventions—tools to defend original work, attack those who would help themselves to it, and, at the very least, buy artists some time. Having a perfect solution is not really the point. The researchers need to offer something now. The breakneck speed at which the AI sector moves, Zheng says, means that companies are ignoring very real harms to humans. “This is probably the first time in our entire technology careers that we actually see this much conflict,” she adds.

On a much grander scale, she and Zhao tell me they hope that Glaze and Nightshade will eventually have the power to overhaul how AI companies use art and how their products produce it. It is eye-wateringly expensive to train AI models, and it’s extremely laborious for engineers to find and purge poisoned samples in a data set of billions of images. Theoretically, if there are enough Nightshaded images on the internet and tech companies see their models breaking as a result, it could push developers to the negotiating table to bargain over licensing and fair compensation. 

That’s, of course, still a big “if.” MIT Technology Review reached out to several AI companies, such as Midjourney and Stability AI, which did not reply to requests for comment. A spokesperson for OpenAI, meanwhile, did not confirm any details about encountering data poison but said the company takes the safety of its products seriously and is continually improving its safety measures: “We are always working on how we can make our systems more robust against this type of abuse.”

In the meantime, the SAND Lab is moving ahead and looking into funding from foundations and nonprofits to keep the project going. They say there has also been interest from major companies looking to protect their intellectual property (though they decline to say which), and Zhao and Zheng are exploring how the tools could be applied in other industries, such as gaming, video, or music. They plan to keep updating Glaze and Nightshade to be as robust as possible, working closely with the students in the Chicago lab—where, on another wall, hangs Toorenent’s Belladonna. The painting has a heart-shaped note stuck to the bottom right corner: “Thank you! You have given hope to us artists.”
