New Tool Empowers Artists Against AI

Artists have found a way to challenge artificial intelligence (AI) by using a new tool called Nightshade. Developed by computer science professor Ben Zhao and his team at the University of Chicago, Nightshade makes subtle changes to images that trick machine-learning models into misinterpreting them. When these modified images are used as training data for AI models, they cause the models to malfunction. For example, an image of a dog could be mistaken for a cat, or a hat could be misidentified as a toaster. The implications of Nightshade are significant, as it gives artists the means to combat the indiscriminate scraping of their visual work by technology companies without their explicit consent.
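The idea behind such a perturbation attack can be sketched in miniature. The toy code below is not Nightshade's actual algorithm (which targets large generative models); it only illustrates the general principle that repeated small nudges to an input, aimed against a model's decision boundary, can flip the model's prediction. The linear "dog vs. cat" classifier, its weights, and the feature values are all hypothetical.

```python
# Toy illustration (NOT Nightshade's real method): small nudges to an
# input, directed against a model's weights, flip its predicted label.
# The classifier, weights, and "image" features are all hypothetical.

def predict(weights, x):
    # Linear score: positive -> "dog", negative -> "cat"
    return sum(w * xi for w, xi in zip(weights, x))

def perturb(weights, x, step=0.01, max_iters=1000):
    # Repeatedly nudge x in the direction that lowers the score,
    # stopping as soon as the predicted class flips.
    x = list(x)
    for _ in range(max_iters):
        if predict(weights, x) < 0:
            break
        for i, w in enumerate(weights):
            x[i] -= step * w
    return x

weights = [0.6, -0.2, 0.4]   # hypothetical "dog vs. cat" classifier
image = [1.0, 0.5, 0.8]      # hypothetical pixel features, scored "dog"

poisoned = perturb(weights, image)
print(predict(weights, image) > 0)     # original image scored "dog"
print(predict(weights, poisoned) < 0)  # perturbed image scored "cat"
```

When many such perturbed images end up in a model's training set, the model learns the wrong associations, which is why a poisoned model might label a dog as a cat.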

The imbalance of power between technology companies and artists has long been a point of contention. Many artists feel violated when their work is used without permission, and they are demanding a shift from opt-out mechanisms to a system that requires consent and compensation. Nightshade addresses this by discouraging companies from scraping artists’ work in the first place, since doing so now risks breaking their AI models. While some companies have offered to let artists opt out, there is currently no way to enforce those promises; Nightshade could provide that enforcement, giving artists greater control over how their work is used in AI training. The tool could thus rebalance power in favor of artists and anyone else whose digital content is freely available online.

Source: This new tool could give artists an edge over AI
