I’m interested in developing the skill to estimate probabilities for real-world situations. What are the best ways to learn this systematically? Are there books, courses, or exercises that teach probabilistic thinking, Bayesian reasoning, or practical forecasting skills?

I want to check this math and see if the AI is gaslighting me.
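For context, the kind of calculation I mean is a basic Bayes-rule update. Here is a minimal sketch in Python, with every number made up purely for illustration:

```python
# Toy Bayes-rule update: how much should one piece of evidence shift a belief?
# All numbers below are made up for illustration, not real statistics.

prior = 0.01                 # P(hypothesis) before seeing the evidence (1%)
p_evidence_if_true = 0.90    # P(evidence | hypothesis true)
p_evidence_if_false = 0.05   # P(evidence | hypothesis false)

# Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E)
p_evidence = p_evidence_if_true * prior + p_evidence_if_false * (1 - prior)
posterior = p_evidence_if_true * prior / p_evidence

print(f"posterior = {posterior:.3f}")  # ~0.154: strong evidence, but the low base rate still dominates
```

That jump from 1% to roughly 15% is exactly the sort of result I want to be able to sanity-check myself instead of trusting the AI's arithmetic.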

  • Phoenixz@lemmy.ca · 3 days ago

    see if the AI is gaslighting me.

    Aaahhwww crap, another one? Yes, the AI is likely wrong, because no, it’s not intelligent at all. It’s a distance-measuring database that spent a long time learning to put bytes in a certain order so that it looks like intelligence to us. It is not. It has no concept of anything; it’s literally going “with this input, and with the previous word being ‘car’, the next likely word is ‘drives’.”

    It has no concept of what a car is; it doesn’t know what driving is. It could be an alien language for all it knows; it just knows that those words probably follow one another. With that, it just as easily dreams up nonsense, and most of the “facts” it gives you will be factually wrong.
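    To make that concrete, here’s a deliberately toy sketch of the next-word idea (a real model conditions on the whole context with a learned network, but the output is still just a probability distribution over tokens):

    ```python
    import random

    # Toy "language model": for each previous word, a made-up distribution over
    # likely next words. No meaning anywhere, only co-occurrence statistics.
    next_word_probs = {
        "car": {"drives": 0.6, "stops": 0.3, "sings": 0.1},
        "dog": {"barks": 0.7, "sleeps": 0.25, "drives": 0.05},
    }

    def predict_next(prev_word: str) -> str:
        """Sample the next word in proportion to its (made-up) probability."""
        dist = next_word_probs[prev_word]
        words, probs = zip(*dist.items())
        return random.choices(words, weights=probs, k=1)[0]

    print(predict_next("car"))  # usually "drives" -- no concept of a car involved
    ```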

    • dysprosium@lemmy.dbzer0.com · 3 days ago

      Understanding and being correct are two different things, my lord. Generative AI can be correct most of the time while understanding 0%.

        • Paragone@lemmy.world · 1 day ago

        It can also be INcorrect most of the time!!

        ANY mind “educated” by ingesting the internet isn’t going to be high on accuracy!!

        People NEED to understand that getting parts of reddit out of LLMs proves their untrustworthiness!!

        _ /\ _

          • dysprosium@lemmy.dbzer0.com · 23 hours ago

          The internet is way bigger than social media sites, FYI

          People NEED to understand that getting parts of reddit out of LLMs proves their untrustworthiness!!

          I’ve no idea what you’re trying to say with this.