After months of complaints from the Authors Guild and other groups, Amazon.com has started requiring writers who want to sell books through its e-book program to tell the company in advance that their work includes material generated by artificial intelligence.

The Authors Guild praised the new regulations, which were posted Wednesday, as a “welcome first step” toward deterring the proliferation of computer-generated books on the online retailer’s site. Many writers feared computer-generated books could crowd out traditional works and would be unfair to consumers who didn’t know they were buying AI content.

In a statement posted on its website, the Guild expressed gratitude toward “the Amazon team for taking our concerns into account and enacting this important step toward ensuring transparency and accountability for AI-generated content.”

  • Jax@sh.itjust.works

    Taking inspiration from something is different from creating a Frankenstein's monster. AIs replicate; they do not create.

    • Chozo@kbin.social

      That’s not actually how generative AI works. LLMs don’t copy and paste material unless deliberately instructed to, and even then, most are built so that they still won’t reproduce their training material word for word.

      • Jax@sh.itjust.works

        Yes, change a few words here and there: it totally isn’t plagiarism!

        I’m not arguing this with people who have likely never created anything other than code.

        • Chozo@kbin.social

          Again, that’s not how LLMs work. Maybe before you decide who you will and won’t argue with, you should decide whether you should be arguing about something you don’t understand in the first place.