A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, underscoring the danger and growing prevalence of generative AI being used for nefarious purposes.
Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as officers led the uniformed McCorkle out of the theater in handcuffs.
Are you stupid? Something has to be in the training data for any generation to be possible. This is just a new way to re-victimize kids.
Not necessarily; AI can do wild things by combining attributes.
That said, I do feel very uncomfortable with the amount of defense this guy is getting, since he was distributing this to people. If he were just generating fake images of fake people using legal training data in his own house for his own viewing, that would be a different story. The number of people jumping to his defense when we don't really know the details is the larger problem.
So are you suggesting they can get an unaltered facial I.D. of the kids in the images? Because that would make it regular CSAM with a specific victim (as mentioned), not an AI-generated illustration.
No, I am telling you CSAM images can't be generated by an algorithm that hasn't been trained on CSAM.
That’s patently false.
I'm not going to continue to entertain this discussion; instead, I'll just direct you to the multiple other people who have already effectively disproven this argument and similar ones elsewhere in this post's discussion. Enjoy.
Also, if you'd like to see why the corn dog comment is absurd and wrong, go look up my comment.
Sure thing bud. Sure thing 🙄