• webadict@lemmy.world · 4 hours ago

    Holy shit. This is the craziest article to write about one of the shittiest videos I have ever seen.

    That video is glazing the fuck out of LLMs, and the creator knows jackshit about how AIs or even computers work. What a fucking moron.

    So, like, the point of the experiment is that LLMs will generate outputs based on their inputs, and then those outputs are interpreted by an intermediary program to do things in games. And the video is trying to pretend that this is LITERALLY a new intelligent species emerging because you never told it to do anything other than its initial goal! Which… Isn’t impressive? LLMs generate outputs based on their datasets, like, that’s not in question. That isn’t intelligence, because it is just one giant mathematics problem.
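
    For anyone who hasn’t watched it, the plumbing is roughly this (a minimal sketch; the prompt format, the parser, and the canned reply are all made up, since the video doesn’t publish its code):

    ```python
    import re

    # Rough sketch of the "LLM plays a game" loop described above. Everything
    # here is hypothetical: the prompt format, the parse rule, and the canned
    # model reply are stand-ins, not the video's actual setup.

    VALID_ACTIONS = {"move", "attack", "gather", "build"}

    def llm_complete(prompt: str) -> str:
        # Stand-in for a real model call; returns a canned reply so the
        # sketch runs end to end.
        return "I think the best plan is to stock up.\nACTION: gather wood"

    def parse_action(text: str):
        # The "intermediary program": scan the model's free text for a
        # recognizable command and throw everything else away.
        match = re.search(r"ACTION:\s*(\w+)\s*(.*)", text)
        if match and match.group(1).lower() in VALID_ACTIONS:
            return match.group(1).lower(), match.group(2).strip()
        return None  # model rambled; do nothing this tick

    def game_tick(game_state: str) -> None:
        prompt = (
            "You are an agent in a game.\n"
            f"State: {game_state}\n"
            "Reply with ACTION: <verb> <target>."
        )
        action = parse_action(llm_complete(prompt))
        if action:
            print("executing", action)  # a real harness would call the game API

    game_tick("low on wood")
    ```

    Swap llm_complete for a real API call and that loop is the entire ‘new species’.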

    This article is a giant pile of shit.

    • bleistift2@sopuli.xyz · 2 hours ago

      If you argue like that, then neither intelligence nor societies exist. At the fundamental level, every neuron just computes its output from its inputs, quite predictably even. That doesn’t mean emergent behaviours cannot exist.
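
      To make that concrete (a toy unit, not a claim about biological neurons), a single neuron is nothing but arithmetic:

      ```python
      import math

      def neuron(inputs, weights, bias):
          # One unit: weighted sum of inputs squashed through a sigmoid.
          # Fully deterministic; the same inputs always give the same output.
          z = sum(x * w for x, w in zip(inputs, weights)) + bias
          return 1.0 / (1.0 + math.exp(-z))

      print(neuron([0.5, 1.0], [0.8, -0.3], 0.1))  # same number every run
      ```

      Predictable parts don’t rule out interesting wholes.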

      • webadict@lemmy.world · 1 hour ago

        A brain is not a giant statistics problem, but an LLM is nothing more than one, and that is why LLMs are not intelligent. An LLM is basically one large math problem that takes what you put into it and calculates the remainder of the text. That isn’t an emergent behavior. That isn’t intelligence at all.
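
        That ‘giant math problem’ is literally this, just scaled up to billions of weights (toy numbers, a four-word vocabulary):

        ```python
        import math

        # Toy next-token step. In a real model the scores come out of a huge
        # matrix multiply; here they're made up. Either way it's only arithmetic.
        vocab = ["cat", "sat", "mat", "hat"]
        logits = [1.2, 3.4, 0.7, 2.1]

        # Softmax: turn scores into probabilities.
        exps = [math.exp(score) for score in logits]
        probs = [e / sum(exps) for e in exps]

        next_word = vocab[probs.index(max(probs))]
        print(next_word)  # 'sat', every time, given the same scores
        ```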

        If I type 20*10 into a calculator and it gives me 200, is that a sign of intelligence, that the calculator can do math? I never programmed it to know what 10 or 20 or 200 were. I did make it know what multiplication is and what digits and numbers are, but those particular numbers it totally created on its own after that!!!
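
        Spelled out as code (a trivial sketch), the multiply routine is a rule somebody wrote down, and 200 just falls out of it:

        ```python
        def multiply(a: int, b: int) -> int:
            # Repeated addition: the machine never "learned" what 200 is;
            # it just follows the rule a programmer wrote.
            result = 0
            for _ in range(b):
                result += a
            return result

        print(multiply(20, 10))  # 200, every single time
        ```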

        When you type a sentence into an LLM and it returns an approximation of what a response sounds like, you should treat it the same way. People programmed these things to do the things that they are doing, so what behavior is fucking emergent?