Artists Are Fighting AI With AI

In this video I discuss techniques artists are using to protect their artwork from being "stolen" by people training generative image models.

My merch is available at

₿💰💵💲Help Support the Channel by Donating Crypto💲💵💰₿

Monero
45F2bNHVcRzXVBsvZ5giyvKGAgm6LFhMsjUUVPTEtdgJJ5SNyxzSNUmFSBR5qCCWLpjiUjYMkmZoX9b3cChNjvxR7kvh436

Bitcoin
3MMKHXPQrGHEsmdHaAGD59FWhKFGeUsAxV

Ethereum
0xeA4DA3F9BAb091Eb86921CA6E41712438f4E5079

Litecoin
MBfrxLJMuw26hbVi2MjCVDFkkExz8rYvUF
Comments

I like how the solution to constantly shooting ourselves in the foot is more bulletproof boots.

rustymustard

Artists wouldn't need to put "malware" in their work if these companies weren't using their work without permission.

vnty

I feel like treating Nightshade as an illegal piece of malware is like saying home security cameras should be illegal because you're taking away the house robbers' main source of income.

Bioniclethanok

If you eat someone else's lunch from the breakroom fridge, don't be surprised if it gives you explosive diarrhea.

terig

Well, if a company wants to use my artwork to train their network, I reckon they can always pay me for an un-poisoned copy.

arc

What took them so long? As an "artist" whose drawings were only ever good enough to make the neural networks worse, I've been doing this for years

theaudiocrat

AI as we were promised: "We're taking the hard labor jobs you don't want, freeing you up to do more art!"

AI we're getting: "We're taking the creation of art away so people can have more time for manual labor."

TheZeroNeonix

Nightshade seems like a dye pack in a wad of bills, and its loudest detractors just sound like they're crying "you ruined what I stole"

stupidweasels

The poisoning reminds me of the AdNauseam extension: not only does it block ads, it also clicks them so the advertiser has to pay, with the added bonus of ruining your advertising profile.

holdenwinters

I find it funny how companies get mad at people for pirating their software, yet those same companies get to steal artists' work without even asking permission, and face no consequences.

CyberMutoh

Calling Nightshade malware is like calling some software malware because your poor attempt to crack it caused your computer to shit itself.

markush

My biggest issue with neural network training is that it should be opt IN, not opt OUT.

If they want to include an artist's works, then they should contact the artist and get their permission, not just use the works until the artist finds out and asks them to remove them

jayvee

Everyone parroting "They'll just use AI to work around Nightshade" is missing the point. The point of glazing images is to make it JUST annoying enough that data scrapers don't bother to circumvent the cloaking. It's the same logic as getting a big, scary padlock for your property. It's not supposed to stop the expert thief who is after you specifically, it's just supposed to keep your common everyday thief out. Glazing your images is likely to stop data scrapers from going after you because circumventing a glaze takes more time and effort than it takes to just find an unglazed image elsewhere, much like how most thieves see an industrial padlock and just leave to look for unsecured goods somewhere else instead.

kaijuultimax

Eh... as someone once said... the cycle continues:
Ad > Adblock > Adblock Blocker > Anti Adblock Blocker > ...
DRM > Cracks > New DRM > New Crack > ...
Cloaked Image > Image Uncloaker > Anti DeCloak > ...

Saturn

The issue with Glaze, unfortunately, is that it is *really* visible on more cartoony art styles, or ones with lots of flat colors. But we've seen AI start to inbreed as more AI-generated images make it to places typically scraped *by* AI. Artists have taken inspiration and iterated off each other since the dawn of human history; meanwhile, AI can't make it past two years without it becoming glaringly apparent that it cannot create, only make shittier copies of human work

Furufoo

"That photo filter should be illegal because when I harvested the picture without the creator's consent for a commercial purpose it didn't do any harm to my actual property, but it meant that I couldn't generate profitable output from the other pictures I harvested without consent!"

That's like a mugger suing a victim that ran away from them, because the mugger accidentally dropped their favourite illegally purchased weapon in the gutter while chasing them.

Ivytheherbert

This reminds me of something people were doing years ago: adding subtle noise maps to images to make earlier AI misidentify them. For instance, of two seemingly identical images of a penguin, the one with the added noise map might be identified as something totally wrong, like a pizza or a country flag. That might even be exactly what evolved into image glazing.

KazmirRunik
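For the curious, here is a minimal sketch of the noise-map idea described in the comment above, using the classic Fast Gradient Sign Method (FGSM) rather than whatever Glaze or Nightshade actually do (their methods are more sophisticated). The PyTorch model choice, the epsilon value, and the helper name are illustrative assumptions, not anything from the video.

```python
# Minimal FGSM sketch (an illustration, not Glaze/Nightshade): add a small,
# gradient-guided "noise map" to an image so a classifier misreads it while
# a human still sees essentially the same picture.
import torch
import torch.nn.functional as F
import torchvision.models as models

# Stock pretrained classifier; input normalization is omitted for brevity.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()

def add_noise_map(image: torch.Tensor, true_label: int,
                  epsilon: float = 0.03) -> torch.Tensor:
    """Return a copy of `image` (a 3xHxW tensor in [0, 1]) nudged by
    epsilon * sign(gradient) away from its correct label."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image.unsqueeze(0)),
                           torch.tensor([true_label]))
    loss.backward()
    # The "noise map": tiny per-pixel steps in whichever direction
    # most increases the model's loss on the correct label.
    noisy = image + epsilon * image.grad.sign()
    return noisy.clamp(0.0, 1.0).detach()
```

A larger epsilon makes the perturbation harder to circumvent but also more visible, which is the same trade-off glazing tools have to navigate.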

In a year it'd be funny if the largest models got poisoned; then you'd need to hire a translator: "Hi, I'd like to make a dog driving a car." "OK, computer, generate cat plonking the cow."

eeeguba

"Egad! This package I stole from your porch contained a venemous snake! I'm suing you for damages!"

That's what the AI guys sound like.

MawdyDev

Considering how most of the images that an AI is trained on are stolen without the permission of the person who created or uploaded them, it's really the company's own fault if something they stole broke their program. This is a really cool tool to protect your work.

texanrattler