After months of complaints from the Authors Guild and other groups, Amazon.com has started requiring writers who want to sell books through its e-book program to tell the company in advance that their work includes artificial intelligence material.

The Authors Guild praised the new regulations, which were posted Wednesday, as a “welcome first step” toward deterring the proliferation of computer-generated books on the online retailer’s site. Many writers feared computer-generated books could crowd out traditional works and would be unfair to consumers who didn’t know they were buying AI content.

In a statement posted on its website, the Guild expressed gratitude toward “the Amazon team for taking our concerns into account and enacting this important step toward ensuring transparency and accountability for AI-generated content.”

  • dethb0y@lemmy.world · +16 · 1 year ago

    Considering what a total wasteland Amazon’s self-published section is, I don’t know that it could get much worse.

    Of course, any author with an IQ over 70 would have the good sense to never disclose they were using AI.

    • Natanael@slrpnk.net · +10 · 1 year ago

      What’s worse is AI-generated books on mushrooms, etc., which can be literally deadly (and yes, such books have already been published!)

      • dethb0y@lemmy.world · +1/−13 · 1 year ago

        Darwin in action. Anyone who’d use a guidebook to figure out what mushrooms to eat is gonna have a bad time regardless; it’s not really something you can sum up in a book in a safe way.

    • Flying Squid@lemmy.world · +7 · 1 year ago

      My mother was originally going to self-publish her first novel on Amazon, but she realized what a scam it was. I’m glad she found a real publisher. She’s on her sixth book now. They aren’t bestsellers or anything, but people actually buy and read them. eBooks and physical copies.

  • donuts@kbin.social · +14 · 1 year ago

    This is absolutely a good move, though I don’t know how effective it’ll be on its own. Unfettered AI garbage “content” is soon going to flood every storefront and service around, and the only way to really solve it is to close things down and move to more highly curated platforms. I wish that wasn’t the case, but I can imagine a future where it’s hard to find anything worthwhile in a sea of AI-generated junk.

  • Jax@sh.itjust.works · +10/−9 · 1 year ago (edited)

    Good. Unless you’re using an AI you yourself trained on your own creations, you’re plagiarizing with extra steps.

        • Chozo@kbin.social · +10/−5 · 1 year ago

          You joke, but you bring up an excellent point as to why I dislike the “AI is plagiarism” argument that I see a lot these days.

          Everything is plagiarized in some way. No thought is truly original. Unless you spend your whole life with zero contact to anyone or anything and consume zero media of any form (in which case, have fun conveying your original thoughts with the language you’ve had to invent for yourself that nobody else could possibly translate), then every idea is based on another idea before it. Every single thought has an inspiration behind it. LLMs aren’t just copy/pasting content; the actual logic behind generative content production is incredibly similar to how people form thoughts and ideas of their own.

          That said, if you’re writing a book using AI, I’d argue a case for laziness more than plagiarism. Though I don’t see an inherent problem in using AI to help write a story. But if the whole book is AI-generated, I can’t imagine it will be good enough to sell well enough to justify the time and effort it takes to produce that amount of text and have it published, so I wouldn’t foresee it being a very widespread problem just yet.

      • Jax@sh.itjust.works · +5/−7 · 1 year ago

        Taking inspiration from something is different from creating a Frankenstein’s monster. AIs replicate; they do not create.

        • Chozo@kbin.social · +7/−4 · 1 year ago

          That’s not actually how generative AI works. LLMs don’t copy/paste material unless deliberately instructed to. And even then, most are built so that they still won’t reproduce their training material word for word.

          • Jax@sh.itjust.works · +4/−1 · 1 year ago

            Yes, change a few words here and there: it totally isn’t plagiarism!

            I’m not arguing this with people who have likely never created anything other than code.

            • Chozo@kbin.social · +2/−2 · 1 year ago (edited)

              Again, not how LLMs work. Maybe before you decide who you do and don’t argue with, you should decide if you even should argue something you don’t understand in the first place.

    • iforgotmyinstance@lemmy.world · +6/−6 · 1 year ago

      Overly simplistic outlook.

      If you provide the sources and direct the LLM to use those sources, and then proofread the damn thing and cite the sources, it flat out is not plagiarism.

      It’s as much plagiarism as using a calculator is to find square roots.

  • rivermonster@sh.itjust.works · +4/−7 · 1 year ago (edited)

    Here comes the chorus of people who know nothing about how AI/machine learning/generative models work.

    The problem is capitalism. There are no safe jobs, and we need a UBI and shouldn’t have to work with AI productivity gains.

    The bitching about AI, and the stupid statements from those ignorant of how it works, all come out of a fear of lost income, a fear instilled by zero-sum capitalism.

    Build guillotines, make all productivity gains from AI 100% taxed, and relax.

    • treefrog@lemm.ee · +5 · 1 year ago

      The article really isn’t about this, so this comment probably would have been better as a response to some of the bitching above.

      If AI writes something, it should be flagged as AI. Right now there are AI-written mushroom-foraging books on Amazon. If those books weren’t proofread by a mycologist or skilled forager and someone trusts that information, an AI hallucination could get them killed.