AI Poison Pill App Nightshade Has 250K Downloads in 5 Days

Nightshade, a tool designed to counter AI copyright infringement, logged 250,000 downloads shortly after its January release, exceeding the expectations of its creators in the computer science department at the University of Chicago. Nightshade lets artists prevent AI models from scraping and training on their work without consent. The Bureau of Labor Statistics counts more than 2.67 million artists working in the U.S., but social media feedback indicates the downloads have come from around the world. One of the coders says cloud mirror links had to be added to avoid overwhelming the University of Chicago’s web servers.

Nightshade Data Poisoning Tool Targets AI to Protect Artist IP

A new tool called Nightshade offers creators a way to fend off artificial intelligence models attempting to train on visual artwork without permission. Created by a University of Chicago team led by Professor Ben Zhao, Nightshade embeds “invisible pixels” in an image that act as a poisoned instruction set, causing AI models that scrape the work without authorization to “break.” As a result, popular AI models including DALL-E, Midjourney and Stable Diffusion subsequently render erratic results, turning dogs into cats, cars into cows, and so forth.
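
To illustrate the general idea of an imperceptible pixel-level change, here is a minimal sketch that nudges each pixel of an image by at most a few intensity values. This is not Nightshade’s actual poisoning algorithm, which computes targeted, optimized perturbations designed to mislead model training; the function name, the use of random noise, and the epsilon bound are illustrative assumptions only.

```python
# Illustrative sketch only: adds a small, visually imperceptible perturbation
# to an image. Nightshade itself optimizes targeted perturbations against
# specific models; this example uses simple random noise to show the concept.
import numpy as np
from PIL import Image


def add_imperceptible_perturbation(path_in: str, path_out: str, epsilon: int = 2) -> None:
    """Shift each RGB pixel by at most +/- epsilon (on a 0-255 scale) and save the result."""
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    # Random perturbation bounded by epsilon; a real poisoning tool would
    # optimize this perturbation so that a model training on the image
    # learns incorrect feature associations.
    noise = np.random.randint(-epsilon, epsilon + 1, size=img.shape, dtype=np.int16)
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)


if __name__ == "__main__":
    # Hypothetical file names for demonstration.
    add_imperceptible_perturbation("artwork.png", "artwork_shaded.png")
```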