Code used in the analysis is here

  • gedaliyah@lemmy.world · 8 months ago

    The courts have already established that the user is still responsible for the tool, even if the tool is very sophisticated.

    • CameronDev@programming.dev · 8 months ago

      Have they? There's the Air Canada case, but that was kind of a different situation: the chatbot was explicitly acting for the company and made direct claims on the company's behalf.

      IANAL, but proving discrimination was already hard, and now they can just point at the black box and blame it, so it's going to get harder?

      • T156@lemmy.world · 8 months ago · edited

        > IANAL, but proving discrimination was already hard, and now they can just point at the black box and blame it, so it's going to get harder?

        Especially if it gets rolled into other checks, like a police check or a “personality fit” assessment, which makes it more ambiguous.