
New data poisoning tool would punish AI for scraping art without permission



    The tool is currently in the research stage, but the team plans to integrate it with its existing artist protection tools.



    Researchers at the University of Chicago have developed a tool that gives artists the ability to “poison” their digital art in order to stop developers from training artificial intelligence (AI) systems on their work. 

    Named “Nightshade” after the family of plants, some of which are known for their poisonous berries, the tool modifies images in such a way that including them contaminates the data sets used to train AI with incorrect information.

    According to a report from MIT Technology Review, Nightshade changes the pixels of a digital image in order to trick an AI system into misinterpreting it. As an example, Tech Review mentions convincing the AI that an image of a cat is a dog, and vice versa.

    In doing so, the tool would theoretically damage the AI’s ability to generate accurate and coherent outputs. To continue the example above: if a user requested an image of a “cat” from the tainted AI, they might instead get a dog labeled as a cat, or an amalgamation of all the “cats” in the AI’s training set, including those that are actually Nightshade-modified images of dogs.
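    Nightshade’s exact method is not detailed in the report, but the general family of techniques it belongs to can be sketched in code. The following is a minimal, illustrative example of a gradient-based, pixel-level poisoning attack in PyTorch; the function name, the classifier and every parameter here are assumptions made for illustration, not Nightshade’s actual implementation.

```python
# Illustrative sketch only: NOT Nightshade's actual algorithm.
# Idea: nudge an image's pixels, within a small budget, so a model
# "sees" it as a different class (e.g. a cat that reads as a dog),
# while a human still sees the original picture.
import torch
import torch.nn.functional as F

def poison_image(model, image, target_class, epsilon=0.03, steps=100, lr=0.01):
    """Return a copy of `image`, perturbed within an L-infinity budget of
    `epsilon`, that `model` classifies as `target_class` (hypothetical helper).

    model:        any PyTorch classifier mapping (N, C, H, W) images to logits
    image:        a (1, C, H, W) tensor with values in [0, 1]
    target_class: the wrong label we want the model to learn, e.g. "dog"
    """
    model.eval()
    delta = torch.zeros_like(image, requires_grad=True)  # hidden perturbation
    optimizer = torch.optim.Adam([delta], lr=lr)
    target = torch.tensor([target_class])

    for _ in range(steps):
        optimizer.zero_grad()
        logits = model(image + delta)
        # Push the model's prediction toward the wrong (target) class.
        loss = F.cross_entropy(logits, target)
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            # Keep the perturbation small and pixel values valid, so the
            # poisoned image still looks unchanged to a human viewer.
            delta.clamp_(-epsilon, epsilon)
            delta.data = (image + delta).clamp(0, 1) - image

    return (image + delta).detach()
```

    The key design point the sketch captures is the bounded perturbation: the change is large enough to mislead a model trained on the image, but small enough to be invisible to the artist’s audience.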

    Related: Universal Music Group enters partnership to protect artists’ rights against AI violations

    One expert who viewed the work, Vitaly Shmatikov, a professor at Cornell University, opined that researchers “don’t yet know of robust defenses against these attacks” — the implication being that even robust models such as OpenAI’s ChatGPT could be at risk.

    The research team behind Nightshade is led by Ben Zhao, a professor at the University of Chicago. The new tool is actually an expansion of their existing artist protection software called Glaze. In their previous work, they designed a method by which an artist could obfuscate, or “glaze,” the style of their artwork.

    A charcoal portrait, for example, could be glazed so that it appears to an AI system as modern art.

    Examples of non-glazed and glazed AI art imitations. Image source: Shan et al., 2023.
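    Glaze’s style cloaking can be sketched along the same lines, with one key difference: rather than pushing an image toward a wrong class label, the perturbation pushes the image’s features, as seen by a model’s feature extractor, toward those of a reference work in another style. Again, the feature extractor, function name and parameters below are stand-ins assumed for illustration, not Glaze’s actual implementation.

```python
# Illustrative sketch only: NOT Glaze's actual algorithm.
import torch
import torch.nn.functional as F

def glaze_image(feature_extractor, artwork, style_reference,
                epsilon=0.03, steps=100, lr=0.01):
    """Perturb `artwork`, within an L-infinity budget of `epsilon`, so its
    features resemble those of `style_reference` (hypothetical helper)."""
    delta = torch.zeros_like(artwork, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    with torch.no_grad():
        # Features of a work in the decoy style, e.g. modern art.
        target_features = feature_extractor(style_reference)

    for _ in range(steps):
        optimizer.zero_grad()
        features = feature_extractor(artwork + delta)
        # Minimize the feature-space distance to the decoy style, so an AI
        # system "sees" modern art where a human sees a charcoal portrait.
        loss = F.mse_loss(features, target_features)
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)
            delta.data = (artwork + delta).clamp(0, 1) - artwork

    return (artwork + delta).detach()
```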

    Per Technology Review, Nightshade will ultimately be implemented into Glaze, which is currently available free for web use or download.


