Dutch lawyers increasingly have to convince clients that they can’t rely on AI-generated legal advice because chatbots are often inaccurate, the Financieele Dagblad (FD) found when speaking to several law firms. A recent survey by Deloitte showed that 60 percent of law firms see clients trying to perform simple legal tasks with AI tools, hoping to achieve a faster turnaround or lower fees.

  • ToTheGraveMyLove@sh.itjust.works
    2 hours ago

So you found five examples in the history of human aviation, how often do you think AI hallucinates information? Because I can guarantee you it’s a hell of a lot more frequently than that.

    • Lurking Hobbyist🕸️@lemmy.world
      2 hours ago

You should check out Air Crash Investigation, amigo, all 26 seasons, you’d be surprised what humans in metal life support machines can cause when systems break down.

      • ToTheGraveMyLove@sh.itjust.works
        27 minutes ago

I’m not watching 26 seasons of a TV show ffs, I’ve got better things to do with my time. Skimming the IMDb though, I’m seeing a lot of different causes for the crashes, from bad weather, to machine failure, to running out of fuel, improper maintenance, pilot errors, etc. Remember, my point had nothing to do with mechanical failure. Any machine can fail. My point was that airplanes don’t routinely spit out false information in the day-to-day function of the machine like AI does. You’re getting into strawman territory mate.