Rep. Joe Morelle, D-N.Y., appeared with a New Jersey high school student who was the victim of nonconsensual sexually explicit deepfakes to discuss a bill stalled in the House.

  • TwilightVulpine@lemmy.world · 8 months ago

    I’m as suspicious of “think of the children” stuff as anyone here, but I don’t see how we’re fighting for people’s rights by defending non-consensual deepfake porn impersonation, whether of children or of anyone else.

    If someone makes deepfake porn of my little cousin or of Emma Watson, there’s no scenario where it isn’t a shitty thing to do to a person, and I don’t see how the masses are being oppressed by banning it. What, do we need to deepfake Joe Biden getting it on to protest against the government?

    Not only does the harassment of being subjected to something like this seem horrible, it’s also reasonable to say that people ought to have rights over their own likeness, no? It’s not even a matter of journalistic interest, because it’s something completely made-up.

    • General_Effort@lemmy.world · 8 months ago

      We’re not talking about whether people should make fakes. We’re talking about whether people who do should be prosecuted, i.e. physically overpowered by police officers, restrained with handcuffs, and locked up in a prison cell. Some empathy?

      If some classmate of your little cousin makes a fake, should the police come and drag them out of school and throw them in prison? You think that would help?

      Realistically, that’s about as likely as kids who “get into fights” being prosecuted for assault. Kids tell mean lies about each other, but that isn’t resolved through civil defamation suits. Even between adults, that’s not the usual outcome.

      Civil suits under this bill would mainly be aimed at internet services, because that’s where the money is. And it would largely be used over celebrity fakes, which make up the overwhelming share of fakes out there; celebrities have the money to splurge on suing people who can’t pay. It would be wealthy, powerful people using it against horny teens.

      Also, this bill is ripe for industrial abuse. Insert a risqué scene into a movie, and suddenly “pirates” who share it can be prosecuted under this.

      • wildginger@lemmy.myserv.one · 8 months ago

        If my little cousin makes AI child porn of anyone at all, let alone of a classmate he knows in real life, I don’t think he should be allowed to kick his feet and go about his day.

        Like… making kiddie porn of your classmates is not excusable because you’re a horny teen. Sorry, bud, it’s fucking not.

        • General_Effort@lemmy.world · 8 months ago

          If two 14-year-olds get it on, they should both be prosecuted for child abuse? That is what you are actually saying?

          • wildginger@lemmy.myserv.one · 8 months ago (edited)

            The only way you can fuck is by creating AI porn of the person you’re trying to have sex with, against their will? Are you a robot?

            The people who think creating non-consensual AI child porn is equivalent to sex need to spend time outside.