New data poisoning tool helps artists in the fight against generative AI

NBC News' Brian Cheung talks to artist and advocate Karla Ortiz, who is part of a class action lawsuit alleging copyright infringement by generative artificial intelligence.

Connect with NBC News Online!

#Data #ArtificialIntelligence #Artists
Comments

I think generative AI companies that scrape data off the internet should pay a fee for every image taken from an artist, with that fee going toward a UBI fund for artists so they get compensation. Ideally, $60 per image scraped.

hotshot-texw

Opt-Out, aka, having to ask a company to "please not steal my stuff".

Matok

The AI companies shouldn’t have an opt-out policy, they should have an opt-in policy. Don’t steal!

DagNeb_It

Great job Dr. Zhao and his team of PhD students from the University of Chicago.

jamewoods

Is Nightshade purposely inserting cat images as the noise? What's better than the ultimate troll? 😺

kerikah

If you can program artificial intelligence to steal large amounts of data from particular sources, then you can program it to provide a digital bibliography of its primary source materials.

alphabrainwave
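The comment above sketches an idea worth making concrete: if a scraper records where each training image came from, the same records can emit a "bibliography" of sources. This is a toy illustration only; the class and names are hypothetical, not any real pipeline's API.

```python
# Hypothetical sketch: log each scraped image's source, then emit a
# per-artist "bibliography" from the same records.
from collections import defaultdict


class ProvenanceLog:
    def __init__(self):
        # Maps artist name -> list of image URLs scraped from them.
        self._sources = defaultdict(list)

    def record(self, artist: str, url: str) -> None:
        """Log one scraped image against its source artist."""
        self._sources[artist].append(url)

    def bibliography(self) -> list[str]:
        """Return one citation line per artist, with their image count."""
        return [f"{artist}: {len(urls)} image(s), e.g. {urls[0]}"
                for artist, urls in sorted(self._sources.items())]


log = ProvenanceLog()
log.record("Karla Ortiz", "https://example.com/art/1.png")
log.record("Karla Ortiz", "https://example.com/art/2.png")
print(log.bibliography()[0])
```

The hard part in practice isn't the bookkeeping shown here; it's that current training pipelines discard this provenance, which is exactly the commenter's point.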

Finally! A way for artists to fight back.

amaduddin

I support the artists, whatever the medium.
The 'creators' of AI-generated media just sit back and let something else do all the work.
Where's the creativity in that?

southwestsearch

This "opt out" nonsense by these companies needs to stop. By taking that stance they are, in effect, placing all the burden on the victims that are being impacted by their efforts. This is a great opportunity for laws to be put in place that places the burden back on those doing the damage. It should always be "opt in".

This is same as placing the recycling burden on consumers instead of the producers using more environmentally friendly packaging and processes.

zerkol

Poisoning the well for AI bots is smart, but I feel these companies will find a way around it. 😮‍💨

iamdetour

That OR the government could just regulate AI. Just like they should have been regulating US social media platforms for years now

hansolo

I believe it's going to be a cat-and-mouse game between the AI and anti-AI researchers

containedhurricane

People forget that Photoshop has announced they copy and record artists' drawing styles for AI, and PS is something most artists use! So when you paint, it'll all be for AI.

williss

I do not know enough about tech or AI to understand the logistics and programming behind this, but I get the concept, and it's awesome they are fighting back. I think AI is incredible and useful in countless applications; art, however, isn't really one of those spaces.

That said, I've seen so much AI-generated drone footage of actual places, and it looks completely real. For movies, that's a great way to cut costs. Sending out a drone pilot just to shoot a five-second clip zooming into a location so the audience knows where the scene takes place is inefficient, time-wise and logistically. But I think that's where the line needs to be drawn for the time being, until we learn more. It should be a tool for cutting time and costs on the kinds of shoots that really don't require going out to a location to film. AI is so advanced and accesses so much data. If you tell it to pan in from the top left of the Denver metro area, it will do it accurately, basing the generated content on actual footage of the Denver metro area that already exists.

That, I see no problem with. But using it as a full-on replacement for a film crew, screenwriters, and even actors is, for me, where I have to object. Every new idea, story, song, or movie has some influence from past creators' work. At its core, AI isn't really all that different from us; it just does everything billions of times faster with its learning model. The crediting and compensation of actual human artists, though, is a huge area of concern.

In theory, all artists take inspiration or techniques from the artists whose work they study or learn from. That's how we grow. That isn't plagiarism in my mind. But this is so accelerated and hard to monitor that it's going to take a lot of collaboration and cooperation from leaders in tech and regulation to figure out where this is going and how it's going to fit into our society moving forward. It's scary, too, with the deep fakes and the US election coming up...

willcookmakeup

FYI, Nightshade does NOT work on any model other than the one they developed it for. They can demonstrate that it works on their own version of that model, but it has no impact on the actual generative AI models out there. And the images are easy to filter out of training data.

GrumpDog

There's also screenshots, scans, right-click save-as... Not sure how this is really going to help if you can put a couple hundred humans in the middle to filter these out.

kleroterion

AWESOME! Can they do that for music, also?

rupertpupkin

Assume Nightshade takes the amplitudes of the Fourier transform of the image bytes and adds Gaussian noise. The relevant fact is that the distribution of the sum of two independent random variables is the convolution of their distributions, and the Fourier transform turns convolution into a pointwise product. So in the frequency domain you end up multiplying two factors, one the spectrum of the Gaussian noise and the other the spectrum of the image bytes. In other words, you can scale the two together. The result is that the human just sees whatever the image is, while the computer mostly sees the Gaussian noise.

(This is just a wild guess — don’t take this too seriously if I’m totally wrong.)

posthocprior
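For what the commenter's guess would look like in code: a minimal sketch of a frequency-domain perturbation, adding small Gaussian noise to an image's FFT magnitudes and inverting. This is NOT how Nightshade actually works; the function and parameters here are illustrative assumptions only.

```python
# Hypothetical frequency-domain perturbation, in the spirit of the
# commenter's guess: jitter FFT magnitudes with Gaussian noise, keep
# the phase, and invert. Small noise leaves the image visually similar.
import numpy as np


def perturb_fourier(image: np.ndarray, noise_scale: float = 0.01,
                    seed: int = 0) -> np.ndarray:
    """Add proportional Gaussian noise to a grayscale image's FFT magnitudes."""
    rng = np.random.default_rng(seed)
    spectrum = np.fft.fft2(image)
    magnitude = np.abs(spectrum)
    phase = np.angle(spectrum)
    # Scale the noise to each coefficient's magnitude so the perturbation
    # stays proportionally small across the whole spectrum.
    noisy_magnitude = magnitude * (1 + noise_scale * rng.standard_normal(magnitude.shape))
    perturbed = noisy_magnitude * np.exp(1j * phase)
    # Independent magnitude jitter breaks conjugate symmetry, so drop the
    # small imaginary residue of the inverse transform.
    return np.real(np.fft.ifft2(perturbed))


# Demo on a synthetic gradient "image": the per-pixel change is tiny
# relative to the 0-255 pixel range.
img = np.linspace(0, 255, 64 * 64).reshape(64, 64)
out = perturb_fourier(img)
print(float(np.abs(out - img).mean()))
```

Note the gap between this sketch and a real poisoning attack: uniform spectral noise degrades an image for humans and models alike, whereas Nightshade's perturbations are optimized to mislead a model's feature extractor specifically.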

What about the potential medical molecules that generative AI can produce?

Matt.garrow

Waste of time. AI will simply filter out these altered images. 🙄

rescuegirl