The company says it’s proof that quality AI models don’t have to be trained on controversial copyrighted content. Adobe trained Firefly on content that had an explicit license allowing AI training, which means the bulk of the training data comes from Adobe’s library of stock photos, says Greenfield. The company offers creators extra compensation when their material is used to train AI models, he adds.
This is in contrast to the status quo in AI today, where tech companies scrape the web indiscriminately and have a limited understanding of what the training data includes. Because of these practices, the AI datasets inevitably include copyrighted content and personal data, and research has uncovered toxic content, such as child sexual abuse material.
Who trusts Adobe not to exploit them if/when they so choose to? What so many here are clamoring for is exactly what Adobe wants. Funny that. Let’s hope that this makes people think.
I sell photos on Adobe Stock. They never properly asked my permission to train on my photos. The whole thing was opt-out, not opt-in. Leaving Adobe Stock is not an option either, as Adobe, like most stock services, continues to sell your images even if you purge and delete your account. Getting them properly removed is an agonizing, teeth-pulling experience with customer support, and not even guaranteed to work.
So Adobe can take their “ethical, non-exploitative” AI and shove it up their ass. As always, artists continue to be exploited and abused for profit.
Did they offer you any compensation for training on your data, like the article says?
They have not, no.
I’ll believe Adobe made something non-exploitative when I see it.
They’ll let you see it for $59.99 / month.
And then sue you for letting your kid look at it, too.
And when you try to cancel it, you’ll see it was actually $59.99/month with a minimum term of 12 months.