i think it’s ‘barrier to entry’
photoshop took skills that not everyone has/had, keeping the volume low.
these new generators require zero skill or technical ability so anyone can do it
Ehhhh, I like to think that eventually society will adapt to this. When everyone has nudes, nobody has nudes.
Unfortunately, I doubt it will be everyone. It will primarily be young women, because we hyper-sexualize those…
You might think so, but I don’t hold as much hope.
Not with the rise of holier-than-thou moral crusaders who try to slut-shame anyone who shows any amount of skin.
I like to be optimistic, eventually such crusaders will have such tools turned against them and that will be that. Even they will begin doubting whether any nudes are real.
Still, I’m not so naive that I think it can’t turn out any other way. They might just do that thing they do with abortions, the line of reasoning that goes “the only acceptable abortion is my abortion”, now changed to “the only fake nudes are my nudes”.
So it’s perfectly ok as long as you’re cool?
imho, not dissimilar to model planes>drones.
To operate a model plane, there was a not-small amount of effort you needed to work through (building, specialist components, local club, access to a proper field, etc.).
This meant that by the time you were flying, you probably had a pretty good understanding of being responsible with the new skill.
In the era of self-stabilising GPS guided UAVs delivered next-day ready-to-fly, the barrier to entry flew down.
And it took a little while for the legislation to catch up from “the clubs are usually sensible” to “don’t fly a 2 kg drone over a crowd of people at head height with no experience or training”.
Scale, too: you can create nudes of everyone on Earth in a fraction of the time it would take with Photoshop. All for the lowly cost of electricity.
I mean it’s as easy as cut and paste.
What if my goal is to avoid being led around by the media every decade to fear things needlessly, when they use the same lazy appeals every time?
The right have immigrant headlines. The left seem to hate AI and technology now.
What’s mind-blowing is how the same headlines are used for both.
Have you tried to get consistent, goal-oriented results from these AI tools?
To reliably generate a person you need to configure many components, fiddle with the prompts, and constantly tweak.
Doing this well is, in my eyes, a fair bit harder than learning how to use the magic wand in Photoshop.
I mean, inpainting isn’t particularly hard to make use of. There are also tools built specifically for generating “deepfake” nudes. The barrier to entry is much, much lower.
You could also just find the prompts online and paste them in.
It would also take a lot more effort to get something even remotely believable in Photoshop. You would need to go through thousands of body and face photos to get a decent match, and then put in some effort pairing the two photos together. A decent “nude” photo of a celebrity would probably take at least a day the first time.
so anyone can do it, and the victim can be your next-door neighbor, not some celebrity whose case you can internally normalize with “well, it’s the price of fame”
unfortunately, this list is only going to grow: https://en.wikipedia.org/wiki/List_of_suicides_attributed_to_bullying
When Photoshop first appeared, image manipulations that would seem obvious and amateurish by today’s standards were considered very convincing—the level of skill needed to fool large numbers of people didn’t increase until people became more familiar with the technology and more vigilant at spotting it. I suspect the same process will play out with AI images—in a few years people will be much more experienced at detecting them, and making a convincing fake will take as much effort as it now does in Photoshop.
Nope, the AI will continue to get better, and soon spotting the fakes will be nearly impossible.