While the motives may be noble (regulating surveillance), models like Stable Diffusion might get caught in the regulatory crossfire, to the point where using the original models becomes illegal and new models get castrated until they are useless. Further, this might make it impossible for individuals or smaller startups to train open source models (maybe even LoRAs). Adobe and the large corporations would rule the market.
Full ban on Artificial Intelligence (AI) for biometric surveillance, emotion recognition, and predictive policing
Generative AI systems like ChatGPT must disclose that content was AI-generated. Those rules are the concerning ones, because they already use it to some degree themselves, but you're still the bad civilian for accessing "military grade" AI tech to generate anime pictures. What a bunch of hypocrites. AI is already being used by militaries, and I saw some news that it has already taken lives.
Yeah, flagging AI-generated content as such and strict demands to announce when AI is used against your privacy would be something I’d agree with. But stifling AI itself is just moronic, because it will make the EU fall behind those who still use it or don’t have such regulations. Let’s hope it doesn’t come to anything like that.
The cat’s out of the bag though. They can’t delete all the models I’m hoarding, I already have them. The repos I need to run them are cloned. The software that runs almost all of this is open source.
Yes, but if they make it contraband then the open sharing will stop. If they tell the general public it is used to create illegal porn then it will be seen as nothing else. It still feels like sorcery to me sometimes; these mundies will certainly call it witchcraft and sharpen their pitchforks. If possession is made illegal then very few people will dare to share it. Being labeled a pirate at least has honor, being labeled a pervert is something else.
My point is: I think it is not enough that we have the models and code; it has to become part of many more people’s jobs now, so they want to actively protect it and lobby for it.
SD is an amazing tool for basic photo editing, restoration, indie games and simply art. But most non-techie artists I talk to either cry about stolen copyright or some artisan fantasy, claiming real art must be created through diligence. (And then tell me a week later how they just looove Adobe’s Firefly and how magical it is.)
Tbh, I’m kinda here for black market models. I don’t WANT AI art to be banned, I want it to flourish like everyone else here (I imagine anyway). But if AI art is driven underground, torrenting will allow us to share models, and crypto will allow us to fund training. And the big upshot is that big corpos like OpenAI won’t be able to have closed-source models that they neuter and then dole out to the peasants for a fee.
I2P will get more users
One solution is to train it on public images; then lots of volunteers could help it improve by rating images, with reinforcement learning on top.
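A minimal sketch of what that volunteer-rating idea could look like: fit a simple linear “reward model” to good/bad ratings with logistic regression. Everything here (the two image features, the labels, the training setup) is a hypothetical stand-in; real preference-tuning pipelines are far more involved.

```python
import math
import random

random.seed(0)

# Hypothetical setup: each generated image is reduced to two features,
# and volunteers rate it good (1.0) or bad (0.0). Here the ratings
# secretly track the first feature, so there is something to learn.
ratings = []
for _ in range(200):
    feats = [random.gauss(0, 1), random.gauss(0, 1)]
    label = 1.0 if feats[0] > 0 else 0.0
    ratings.append((feats, label))

# Fit a linear reward model with plain logistic-regression SGD.
w = [0.0, 0.0]
for _ in range(50):
    for feats, label in ratings:
        p = 1.0 / (1.0 + math.exp(-(w[0] * feats[0] + w[1] * feats[1])))
        grad = p - label  # derivative of log-loss w.r.t. the logit
        w[0] -= 0.1 * grad * feats[0]
        w[1] -= 0.1 * grad * feats[1]

def reward(feats):
    """Higher score = more likely to be volunteer-preferred."""
    return w[0] * feats[0] + w[1] * feats[1]
```

A real system would then plug a score like this into a reinforcement-learning step to steer the generator toward what raters prefer.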
being labeled a pervert is something else.
I will embrace it and generate more images of otherwise fully clothed anime girls begrudgingly showing me their armpits.
o.o ookay senpai *blush* you can look *explosion of underarm hair*
Honestly I might’ve been caught up in some fear mongering but at this point I don’t feel like AI can be safe in the hands of capital owners. The huge possibilities for harming our society are too great.
Some good, some bad. I really hope they won’t turn this against open source genAI.
They will just claim it “can be used to create illegal pornography” and ban it entirely. And “fix faces” will be a high-risk AI because it can detect faces. 🤦
Only idea I’ve got to combat this is to make it available to as many people as possible so they can speak in favor of it. Other ideas welcome.
I can use a pencil and paper to draw whatever I want. Will drawing be illegal just because anybody can draw prohibited stuff? I’m just saying that if this is the only argument they can find, it’s dumb.
A difficult comparison - the pencil is more like photoshop or krita. Drawing a (photorealistic) image from scratch requires mechanical skill. SD is something else, like a magical canvas that guides the pencil while you draw…
They could still try to enforce safeguards against generating with certain tokens, or only allow models with certain training data. Or the almighty copyright-claim-banhammer…
While we here know it is futile to censor a model without breaking it, they do not understand that.
But this happens on the COMPUTER, which we don’t understand, so we have to regulate the scary thing so nobody else can use it either.
Corps passing their stuff off as “technically open source” would also be a problem. Google controls a lot of the web through open source Chrome; Microsoft controls dev IDEs through open source VS Code. I’m sure OpenAI would find more ways to pretend to be “open” again if it were more profitable than saying they have scary monster AIs they can’t release publicly.
Open source exceptions would have to be in tandem with breaking them up somehow and setting some limits to their activities.