• Diabolo96@lemmy.dbzer0.com · 1 year ago

    For consolation: at least an AI wouldn’t crack a joke with its buddies about scoring a strike after bombing a whole neighborhood, a school, and a hospital.

    Notice: I didn’t read the article. My internet is slower than a snail going backwards up a ramp, so I couldn’t read it.

    • joelfromaus@aussie.zone · 1 year ago

      Turns out they used existing pilot information to train the AI so it’ll still bomb a supermarket and make a “fire sale” joke.

      P.S. I also didn’t read the article.

    • Anony Moose@lemmy.ca · 1 year ago

      Reminded me of the very campy movie Stealth, where an Air Force AI plane/drone goes rogue. I hope they have lightning strike protection on these things!

  • colournoun@beehaw.org · 1 year ago

    Well, time to go watch Black Mirror again. You know, the one with the robot dog that hunts you, or the one with the quadcopters that kamikaze-tap you on the head with explosives?

  • Send_me_nude_girls@feddit.de · 1 year ago

    I’m pro-AI; I see a lot of potential in it. I actively use AI daily by now, be it the YT algorithm, AI image generation, or chatbots for a quick search. But fully automated AI with access to weapons should never be a thing. It’s like giving a toddler a sharp knife: you never know who gets stabbed. Sure, humans err, but humans are also more patient and will stop if they aren’t sure. It’s better to have a pilot not bomb the house than to accidentally bomb a playground full of kids because the AI had a hiccup.

  • khalic@beehaw.org · 1 year ago

    I see the arms industry is the latest to use the AI frenzy to sell some more products.