A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious purposes.

Phillip Michael McCorkle was arrested last week while he was working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, producing dramatic footage of law enforcement leading McCorkle, still in his work uniform, out of the theater in handcuffs.

  • spicystraw@lemmy.world · +25/−16 · 4 months ago

    I must admit, the number of comments defending AI images as not being child porn is truly shocking.

    In my book, sexual images of children are not okay, AI generated or otherwise. Pedophiles need help, counseling and therapy. Not images that enable something I think is not acceptable in society.

    I truly do believe that AI images should be subject to the same standards as regular images in terms of what content we deem appropriate or not.

    Yes, this can be used to wrongfully prosecute innocent people, but it does not mean that we should freely allow AI-CP.

    • Landless2029@lemmy.world · +6/−6 · 4 months ago

      Can't speak for others, but I agree that AI-CP should be illegal.

      The question is, how do we define the crime under our current laws? It does seem like we need a new law to address AI images, covering things like AI-CP, revenge porn, and slanderous/misleading photos. (The Communist Harris and Trump-with-black-people photos)

      Where do we draw the line?
      How do we regulate it?
      Forced watermarks/labels on all tools?
      Jail time? Fines?
      Forced correction notices? (Doesn’t work for the news!)

      This is all a slippery slope, but what I can say is I hope this goes to court. He loses. Appeals. Then it goes all the way up to the federal level so we can have a standard to point to.

      The shit's wrong.
      Step one in fixing shit.

    • WormFood@lemmy.world · +9/−13 · 4 months ago

      the number of people willing to go to bat for this on Lemmy is truly disturbing. what do they think these ai models are trained on?

      • ZeroHora@lemmy.ml · +12/−1 · 4 months ago

        It's not necessarily trained on CP; it could be trained on images of children (already fucked up, who gave them that permission?) plus pornography.

        • kaffiene@lemmy.world · +5/−2 · 4 months ago

          The article pointed out that Stable Diffusion was trained using a dataset containing CSAM.

    • OutsizedWalrus@lemmy.world · +5/−4 · 4 months ago

      You’re not kidding.

      The only possible way I could see a defense is if it were something like "AI CSAM results in a proven reduction of actual CSAM."

      But. The defenses aren’t even that!

      They’re literally saying that CSAM is okay. I’m guessing a lot of these same comments would argue that deepfakes are okay as well. Just a completely fucked up perspective.

    • 31337@sh.itjust.works · +13 · 4 months ago

      I generally think if something is not causing harm to others, it shouldn’t be illegal. I don’t know if “generated” CSAM causes harm to others though. I looked it up and it appears the research on whether CSAM consumption increases the likelihood of a person committing child abuse is inconclusive.

    • tron@midwest.social · +14/−13 · 4 months ago

      Pedophiles need help, counseling and therapy. Not images that enable something I think is not acceptable in society.

      I mean, 30-40 years ago you could replace the word pedophile with homosexual and the vast majority of people would agree. I'm not defending pedophilia here, but it's important to remember these people are born the way they are. Nothing is going to change that; new pedophiles are born every day, and they will never go away, the same way you can't change gay or transgender people. Repressing sexual desire never works: look at the priests in the Catholic Church. A healthy outlet such as AI-generated porn could save a lot of actual children from harm. I think that should be looked into.

      • spicystraw@lemmy.world · +5/−1 · 4 months ago

        Look, I get what you are saying and I do agree. However, I don't think that comparing pedophilic relations to LGBTQ struggles is fair. One is a consensual relationship between consenting adults; the other is exploitation, with a high probability of setting up lifelong mental struggles from a young age.

      • nickiwest@lemmy.world · +1/−2 · 4 months ago

        I would like to know what source you have for claiming that pedophiles are “born the way they are.”

        We understand some of the genetic and intrauterine developmental reasons for homosexuality, being trans, etc. That has scientific backing, and our understanding continues to grow and expand.

        Lumping child predators in with consenting adults smacks of the evangelical slippery slope argument against all forms of what they consider to be “sexual deviance.” I’m not buying it.

    • filcuk@lemmy.zip · +9/−3 · edited · 4 months ago

      Agreed, especially considering that such images will eventually become indistinguishable from real ones.