McDonald’s is removing artificial intelligence (AI) powered ordering technology from its drive-through restaurants in the US, after customers shared its comical mishaps online.

A trial of the system, which was developed by IBM and uses voice recognition software to process orders, was announced in 2019.

It has not proved entirely reliable, however, resulting in viral videos of bizarre misinterpreted orders ranging from bacon-topped ice cream to hundreds of dollars’ worth of chicken nuggets.

  • DrCake@lemmy.world

    Wasn’t this just voice recognition for orders? We’ve been doing this for years without it being called AI, but I guess now the marketing people are in charge.

      • daddy32@lemmy.world

        Voice recognition is “AI”*: it even uses the same technical architecture as the most popular applications of AI, namely artificial neural networks.

        * Depending on the definition, of course.

        • sugar_in_your_tea@sh.itjust.works

          Well, given that we’re calling pretty much anything AI these days, it probably fits.

          But I honestly don’t consider static models to be “AI”; I only consider something “AI” if it actively adjusts the model as it operates. Everything else is some specific field, like NLP, ML, etc. If it’s not “learning,” it’s just a statistical model that gets updated periodically.

    • brianorca@lemmy.world

      It’s more than voice recognition, since it must also parse a wide variety of sentence structures into a discrete order, as well as answer questions.

      • sugar_in_your_tea@sh.itjust.works

        Honestly, it doesn’t need to be that complex:

        • X <menu item> [<a la carte | combo meal>]
        • extra <topping>
        • <size> <soda brand>

        There are probably a dozen or so more, but it really shouldn’t need to understand natural language; it can just work off keywords.
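
        Very rough Python sketch of what I mean, just to make it concrete (the menu items, toppings, sizes, and regex patterns here are all made-up placeholders, not anything McDonald’s actually uses):

            import re

            # Toy keyword grammar for drive-through orders. Everything in these
            # sets is an illustrative placeholder.
            MENU_ITEMS = {"big mac", "cheeseburger", "mcnuggets", "fries"}
            TOPPINGS = {"bacon", "cheese", "pickles"}
            SIZES = {"small", "medium", "large"}
            SODAS = {"coke", "sprite", "fanta"}

            PATTERNS = [
                # "2 cheeseburger combo meal", "1 big mac a la carte"
                (re.compile(r"(\d+) (.+?)( a la carte| combo meal)?$"), "item"),
                # "extra bacon"
                (re.compile(r"extra (\w+)$"), "topping"),
                # "large coke"
                (re.compile(r"(\w+) (\w+)$"), "drink"),
            ]

            def parse_line(line):
                """Return a structured order entry, or None if nothing matches."""
                line = line.strip().lower()
                for pattern, kind in PATTERNS:
                    m = pattern.match(line)
                    if not m:
                        continue
                    if kind == "item" and m.group(2) in MENU_ITEMS:
                        return {"kind": "item", "qty": int(m.group(1)), "item": m.group(2),
                                "style": (m.group(3) or "a la carte").strip()}
                    if kind == "topping" and m.group(1) in TOPPINGS:
                        return {"kind": "topping", "topping": m.group(1)}
                    if kind == "drink" and m.group(1) in SIZES and m.group(2) in SODAS:
                        return {"kind": "drink", "size": m.group(1), "soda": m.group(2)}
                return None  # didn't fit the grammar -> ask the customer to repeat

            print(parse_line("2 cheeseburger combo meal"))
            print(parse_line("extra bacon"))
            print(parse_line("large coke"))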

        • brianorca@lemmy.world

          You can do that kind of imposed structure if it’s an internal tool used by employees. But if the public is using it, it had better be able to parse whatever the consumer is saying. Somebody will say “I want a burger and a coke, but hold the mustard. And add some fries. No, make it two of each.” And it won’t fit your predefined syntax.
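
          For instance, feed that sentence into the kind of keyword parser sketched above (reusing the hypothetical parse_line() from that comment) and nothing matches:

              # Reusing the hypothetical parse_line() from the sketch above:
              # a real customer utterance doesn't decompose into those patterns.
              utterance = ("I want a burger and a coke, but hold the mustard. "
                           "And add some fries. No, make it two of each.")

              for chunk in utterance.split("."):
                  if chunk.strip():
                      print(repr(chunk.strip()), "->", parse_line(chunk.strip()))
              # Every chunk comes back as None, so the system would either have to
              # keep saying "I didn't understand" or start guessing.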

          • sugar_in_your_tea@sh.itjust.works

            Idk, you could probably just show the grammar on the screen, and also allow manual entry (if ordering inside) or fall back to a human.

            That way you’d get errors (“sorry, I didn’t understand that”) instead of wrong orders, with a pretty high degree of confidence. As long as there’s a fallback, it should be fine.
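
            Rough sketch of what I mean, again leaning on the hypothetical parse_line() from above (the prompt wording and retry count are made up):

                def take_order_line(max_attempts=2):
                    """Ask, re-ask once, then give up instead of guessing."""
                    for _ in range(max_attempts):
                        entry = parse_line(input("Order (e.g. '2 cheeseburger combo meal'): "))
                        if entry is not None:
                            return entry  # structured, unambiguous order entry
                        print("Sorry, I didn't understand that.")
                    return None  # None = stop and hand the order to a human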

            Anyway, that’s my take. I’m probably wrong though since I don’t deal with retail customers.

    • exu@feditown.com

      New stuff gets called AI until it is useful; then we call it something else.