• Hawke@lemmy.world

    Better title: “Photographers complain when their use of AI is identified as such”

    • CabbageRelish@midwest.social

      People are complaining that an advanced fill tool that’s mostly used to remove a smudge or something is automatically marking the whole image as an AI creation. As it stands, if someone actually wants to bypass this “check”, all they have to do is strip the image’s metadata before uploading it.
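
      For illustration, a minimal sketch of that bypass in Python with Pillow; the filenames are placeholders, and it assumes the AI marker lives in the file’s embedded metadata (EXIF/XMP) rather than in the pixels themselves:

      ```python
      # Copy only the pixel data into a brand-new image object; nothing from the
      # original file’s EXIF/XMP blocks (where provenance tags live) is carried over.
      from PIL import Image

      with Image.open("edited_photo.jpg") as src:       # hypothetical input file
          clean = Image.new(src.mode, src.size)
          clean.putdata(list(src.getdata()))
          clean.save("edited_photo_clean.jpg")          # written with no metadata
      ```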

    • Valmond@lemmy.world

      “It was just a so little itsy bitsy teeny weeny AI edit!!”

      Please don’t flag AI please!

  • TastyWheat@lemmy.world

    Hey guys, I cheated in my exam using AI but I was the one who actually wrote down the answer. Why did I fail?

    • WolfLink@sh.itjust.works

      It’s exaggerated but it gets the point across: I too would like to know if AI tools were used to make even part of the image.

      There’s a reason many photography contests ban any editing at all.

      If they want to make a distinction between “made using AI” and “entirely AI generated”, sure. But “made using AI” is a perfectly accurate description of an image where AI was used to generate replacements for parts of the original photo that were inconvenient.

  • IIII@lemmy.world

    Can’t wait for people to deliberately add the metadata to their images as a meme, so that a legit photograph with no AI involved gets stuck with the unremovable “Made with AI” tag.
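
    Entirely plausible, since the tag seems to be driven by embedded provenance metadata rather than by analyzing the pixels. A rough sketch in Python shelling out to exiftool; both the specific field (the IPTC Extension “digital source type”) and the idea that this is what actually trips the label are my assumptions, not documented behaviour:

    ```python
    # Stamp an ordinary, untouched photo with metadata claiming it is a composite
    # made with a trained algorithmic model. Requires exiftool to be installed.
    import subprocess

    subprocess.run(
        [
            "exiftool",
            "-XMP-iptcExt:DigitalSourceType="
            "http://cv.iptc.org/newscodes/digitalsourcetype/compositeWithTrainedAlgorithmicMedia",
            "untouched_photo.jpg",   # hypothetical, completely AI-free photo
        ],
        check=True,
    )
    ```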

  • Uncaged_Jay@lemmy.world

    I saw this coming from a mile away. We will now have to set standards for what’s considered “made by AI” versus “made with AI”.

    • Thorny_Insight@lemm.ee

      I don’t think that’s fair. AI won’t turn a bad photograph into a good one. It’s a tool that quickly and automatically does something we’ve been doing by hand until now. That’s kind of like saying a photoshopped picture isn’t “good” or “real”. They’re all photoshopped. Not a single serious photographer releases unedited photos, except perhaps the ones shooting on film.

      • Zelaf@sopuli.xyz

        Even film photographers touch up their photos, either during development, by adjusting how long the film sits in each of the chemical baths, or by using different shaking/mixing methods and techniques.

        If they enlarge their negatives onto photo paper, they often have tools to add lightness and darkness to different areas of the print to help with exposure, contrast and subject highlighting, a.k.a. dodging and burning, which is also available in most photo editing software today.

        There are loads of things you can do to improve developed photos, and it has always been something that photographers/developers do. People who still push the “don’t edit photos” BS are usually not very well informed about photo history or the techniques of the photographers who inspire them.

  • Zelaf@sopuli.xyz

    As a photographer I’m a bit torn on this one.

    I believe AI art should definitely be labeled to minimize people being misled about the source of the art. But at the same time, the OP of the Adobe forums post did say they used it like any other tool, for touching up and fixing inconsistencies.

    If I were to, for example, arrange a photoshoot with a model and they happened to have a zit on their forehead that day, of course I’m gonna edit that out. Or if an assistant got into the shot and I don’t want to crop in, making the background and feel of the photo tighter, I would gladly remove them too. Sure, Adobe already has the patch, clone and even magic eraser tools (which also use AI, and might or might not mark photos) to do these fix-ups, but if I can use AI that I hope is trained on data they’re actually allowed to train on, I think I would prefer that: if I’m gonna spend 10 to 30 minutes fixing blemishes, zits and whatnot, I’d much prefer to use the AI tools to get the job done quicker.

    If the tools were, however, used to extensively change, modify and edit the scene and subject, then for sure, it might be best to add the label.

    Wouldn’t it be better not to discourage the use of editing tools when those tools are used in a way that just makes one’s job quicker? If I were to use Lightroom’s quick subject selection, should the label be slapped on then? Or if I were to use an editing preset created with AI that automatically adjusts the basic settings of an image, and I continue my editing from there, should the label be applied then? Or what if I have a flat white background with some tapestry pattern, don’t want to spend hours getting the alignment of the pattern just right while fixing a minor aspect ratio issue or getting a bit more breathing room around the subject, and I use the AI tool mentioned in the OP?

    The things OP mentioned in his post and the scenarios I mentioned are all things you can do without AI anyway; it just takes a lot longer sometimes, and there’s no cheating in using the right tool for the right job, IMO. I don’t think it’s too far off from a clay sculptor using a ridged ice cream scoop to create texture or a Dremel to touch up and fix corners, or a painter using different brushes and scrapers to finish their painting.

    Perhaps a better idea, if we want to make the labels “fair”, would be to also have a label saying the photo has been manipulated by a program in general, or maybe a percentage indicator showing how much of the image was edited specifically with AI. Slapping an “AI” label on someone because they decided to get the same result by using another tool for normal touch-ups could be damaging to one’s career and credibility when it doesn’t say how much of the image was AI or to what extent, because now there’s the chance that someone looking for their next wedding photographer gets discouraged by the bad rep around AI.
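
    Just to illustrate the percentage idea: one naive way to get such a number is to diff the original frame against the edited export and count the pixels that changed. A real provenance system would record this at edit time instead of diffing after the fact, and the filenames and threshold below are placeholders:

    ```python
    # Report what percentage of pixels differ between an original photo and its edit.
    import numpy as np
    from PIL import Image

    def percent_changed(original_path: str, edited_path: str, tolerance: int = 8) -> float:
        a = np.asarray(Image.open(original_path).convert("RGB"), dtype=np.int16)
        b = np.asarray(Image.open(edited_path).convert("RGB"), dtype=np.int16)
        if a.shape != b.shape:
            raise ValueError("images must have the same dimensions")
        changed = np.abs(a - b).max(axis=-1) > tolerance   # per-pixel channel delta
        return 100.0 * changed.mean()

    # e.g. a label could then read: “Made with AI (3.2% of pixels generated)”
    ```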

    • parody@lemmings.worldOP

      trained on data they’re actually allowed to train on

      That’s the ticket. For touchups, certainly, that’s the key: did theft help, or not?

      • Zelaf@sopuli.xyz

        Indeed, if the AI was trained on theft, it’s neither right on their part nor ethical on mine.

        I did some searching but sadly don’t have time to look into it more. There were some concerning articles suggesting that they either used shady practices to get their training data, or that users have to manually check an opt-out box in the app settings.

        I can’t form an opinion on that before looking into it more, but my core argument about using AI in this manner, even if the model were your own, trained on your own data or other allowed resources, still somewhat holds, I believe.

  • hperrin@lemmy.world

    The label is accurate. Quit using AI if you don’t want your images labeled as such.

  • conciselyverbose@sh.itjust.works

    This isn’t really on Facebook. This is Adobe not drawing a distinction between smart pattern recognition for backgrounds/textures and real image generation of primary content.

  • pyre@lemmy.world

    or… don’t use generative fill. if all you did was remove something, regular methods do more than enough. with generative fill you can just select a part and say now add a polar bear. there’s no way of knowing how much has changed.

    • thedirtyknapkin@lemmy.world

      there’s a lot more than generative fill.

      ai denoise, ai masking, ai image recognition and sorting.

      hell, every phone is using some kind of “ai enhanced” noise reduction by default these days. these are just better versions of existing tools that have been used for decades.

    • BigPotato@lemmy.world

      Right? I thought I was going crazy when I got to “I just used Generative Fill!” Like, he didn’t just auto-adjust the exposure and black levels! C’mon!

  • nutsack@lemmy.world

    I saw a video posted by someone who claimed to have taught their cat how to skateboard, and at the bottom it was tagged “Made with AI”.

    meta w

    • parody@lemmings.worldOP

      Did they just, e.g., remove a passing car from the background*? And will tags on some images lead to untagged fake images being trusted more? Oh, this fun new world we’re in.

      *As someone else pointed out, even if it was a minor edit, was the underlying technology trained on legit data or on unlicensed stuff?