Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

  • LostXOR@fedia.io · 1 day ago

    Do deepfake explicit images created from a non-explicit image actually qualify as CSAM?

    • Lka1988@lemmy.dbzer0.com · edited 18 hours ago

      I would consider that as qualifying, because it’s targeted harassment in a sexually explicit manner. All the girl would have to do is claim it’s her.

      Source: I’m a father of teenage daughters. I would pursue the individual(s) who started it and make them regret their choices.

    • lath@lemmy.world · 1 day ago

      I don’t know personally. The admins of the fediverse likely do, since it’s something they’ve had to deal with from the start, so they could answer much better than I can.

    • surewhynotlem@lemmy.world · 1 day ago

      Drawing a sexy cartoon that looks like an adult, with a caption that says “I’m 12”, counts. So yeah, probably.