• GregorGizeh@lemmy.zip

    My personal belief still is that the prohibitive approach is futile and ultimately more harmful than the alternative: embrace the technology, promote it and create deepfakes of everyone.

    Soon the taboo will be gone, the appeal as well, and everyone will have plausible deniability too, because if there are dozens of fake nudes of any given person then who is to say which are real, and why does it even matter at that point?

    This would be a great opportunity to advance our societal values and morals beyond prudish notions, but instead we double down on them.

    E: just to clarify, I do not at all want to endorse creating nudity of minors here. I’m just pointing out that the girl in the article wouldn’t have to humiliate herself trying to do damage control in the above scenario, because it would be entirely unimportant.

    • Chozo@fedia.io

      While I think removing the stigma associated with having deepfakes made of you is important, I don’t think that desensitization through exposure is the way to go about it. That will cause a lot of damage leading up to the point you’re trying to reach.

      • wewbull@feddit.uk

        I don’t see how else you do it.

        “Removing the stigma” is desensitizing by definition. So you want to desensitize through… what? Education?

        • Chozo@fedia.io

          I dunno, but preferably some method which doesn’t involve a bunch of children committing suicide in the meantime.

          • Instigate@aussie.zone

            As a child protection caseworker, I’m right here with you. Self-harm and suicidal ideation over this stuff are distressingly common among the children and young people I work with. Sadly, it’s almost all girls who are targeted by this, and it’s just another way to push misogyny into the next generation. Desensitisation isn’t the way; it will absolutely cause too much harm before it equalises.

      • Mango@lemmy.world

        Ever seen a deepfake nude of someone ugly? People make them because they wanna see you naked. Can’t see how that’s an insult.

    • madcaesar@lemmy.world

      I second this motion. People also need to stop posting images of themselves all over the web, especially of their own kids. Parents plastering their kids’ images all over social media should not be condoned.

      And on a related note we need much better sex-education in this country and a much healthier relationship with nudity.

    • jsomae@lemmy.ml

      This sounds like a cool idea because it is a novel approach, and it appeals to my general heuristic of the inevitability of technology and freedom. However, I don’t think it’s actually a good idea. People are entitled to privacy, on this I hope we agree, and I believe this stems from something more fundamental: people are entitled to dignity. If you think we’ll reach a point in this lifetime where deepfakes are too commonplace to be a threat to someone’s dignity, I just don’t agree.

      Not saying the solution is to ban the technology though.

      • fatalError@lemmy.sdf.org

        When you put out photos of yourself on the internet, you should expect anyone to find them and do whatever they want with them. If you aren’t expecting that, then you aren’t educated enough about how the internet works, and that’s what we should be working on. Social media is really bad for privacy, and many people are not aware of it.

        Now, if someone took a picture of you and then edited it without your consent, that is a different action and a much more serious offense.

        Either way, deepfakes are just an evolution of something that already existed before and isn’t going away anytime soon.

        • jsomae@lemmy.ml

          Yeah, I mean, it’s basically just an easier-to-use Photoshop.

          I agree people need to understand better the privacy risks of social media.

          When you put out photos of yourself on the internet you should expect anyone to find them and do whatever they want with them.

          Expect, yeah, I guess. Doesn’t mean we should tolerate it. I expect murder to happen on a daily basis. People editing images of me on their own devices and keeping that to themselves, that’s their business. But if they edit photos of me and spread them, I think it becomes my business. Fortunately, there are no photos of me on the internet.

          Edit: I basically agree with you regarding text content. I’m not sure why I feel differently about images of me. Maybe because a face is like a fingerprint. I don’t mind so much people editing pictures I post that don’t include my face. Hmm.

          • wewbull@feddit.uk

            Yeah I mean it’s basically just an easier-to-use Photoshop.

            Photoshop has the same technology baked into it now. Sure, it has “safeguards,” so it may not generate nudes, but it would have no trouble depicting someone “having dinner with Bill Cosby” or whatever you feel is reputation-destroying.

            • gaylord_fartmaster@lemmy.world

              Pretty sure they’re talking about generative-AI deepfakes being easier than manually cutting out someone’s face and pasting it onto a photo of a naked person, not comparing Adobe’s AI to a different model.

    • PopOfAfrica@lemmy.world

      It’s also worth noting that too many people put far too much imagery of themselves online. People have to start expecting that anything they put out in public effectively becomes public domain.