AI Poison Pill App Nightshade Has 250K Downloads in 5 Days

Nightshade, a tool that lets artists fight back against AI copyright infringement, was downloaded 250,000 times within days of its January release, exceeding the expectations of its creators in the computer science department at the University of Chicago. Nightshade allows artists to prevent AI models from scraping and training on their work without consent. The Bureau of Labor Statistics counts more than 2.67 million artists working in the U.S., but social media feedback indicates the downloads have come from around the world. One of the coders says cloud mirror links had to be added to avoid overwhelming the University of Chicago’s web servers.

Ben Zhao, the University of Chicago computer science professor described by VentureBeat as the leader of the project, told the outlet “the response is simply beyond anything we imagined.”

“Nightshade seeks to ‘poison’ generative AI image models by altering artworks posted to the web, or ‘shading’ them on a pixel level, so that they appear to a machine learning algorithm to contain entirely different content — a purse instead of a cow,” VentureBeat explains. The intent is that, after training on enough “shaded” images scraped from the web, AI models will subsequently generate incorrect images in response to user prompts.
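The pixel-level "shading" idea can be illustrated with a toy sketch. This is not Nightshade's actual algorithm (which targets model feature space with optimized perturbations); it is only a minimal, hypothetical illustration of the general concept of nudging an image's pixels slightly toward decoy content while keeping the change small enough to be imperceptible to a human viewer. The `shade` function, the flat pixel lists, and the `epsilon` budget are all invented for illustration.

```python
# Illustrative-only sketch (NOT Nightshade's real method): shift each pixel
# of the original image a tiny, bounded step toward a "decoy" image, so the
# visible picture barely changes while its data drifts toward other content.

def shade(original, decoy, epsilon=2):
    """Perturb `original` toward `decoy` by at most `epsilon` per value.

    Images are represented here as flat lists of 0-255 intensity values.
    """
    shaded = []
    for orig_px, decoy_px in zip(original, decoy):
        # Step at most `epsilon` in the direction of the decoy pixel,
        # then clamp to the valid 0-255 range.
        delta = max(-epsilon, min(epsilon, decoy_px - orig_px))
        shaded.append(max(0, min(255, orig_px + delta)))
    return shaded

# A "cow" image drifts imperceptibly toward a "purse" image.
cow = [120, 130, 140, 150]
purse = [200, 90, 140, 10]
print(shade(cow, purse))  # [122, 128, 140, 148]
```

Each output value differs from the original by at most 2, so a person sees essentially the same image; the point of the real attack is that repeated exposure to such perturbed images during training skews what the model associates with a label.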

Zhao — along with colleagues Shawn Shan, Wenxin Ding, Josephine Passananti and Heather Zheng — “developed and released the tool to ‘increase the cost of training on unlicensed data, such that licensing images from their creators becomes a viable alternative,’” VentureBeat says, citing the Nightshade project page.

AI companies themselves have created opt-out requests that ostensibly prevent unauthorized scraping, but these “rely on AI companies to engage in good faith,” reports TechCrunch, which notes that “those motivated by profit over privacy can easily disregard such measures.”

Zhao and his team aren’t trying to destroy Big AI; rather, they want to ensure tech giants pay for licensed work, as companies operating in broad daylight must do, or face legal consequences. Because AI companies’ web-crawling spiders harvest data in ways that usually go undetected, they have essentially had a license to steal, Zhao explains.

“Nightshade shows that these models are vulnerable and there are ways to attack,” Zhao tells TechCrunch, adding that “what it means is that there are ways for content owners to provide harder returns than writing Congress or complaining via email or social media.”

VentureBeat reports that Glaze, another of the team’s apps protecting against AI infringement, has received 2.2 million downloads since its April 2023 release. Glaze hampers AI’s ability to “learn” from an artist’s signature style by altering pixels.
