• bionicjoey@lemmy.ca · 1 year ago

ChatGPT doesn’t understand the things it says, and it shouldn’t be treated as a source of truth. It can be tripped up by nuance, or by questions that require an understanding of syntax itself. For example, if you ask it for the longest 5-letter word (a nonsensical question, since every 5-letter word is the same length by definition), it will confidently give you an answer.
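
    This is easy to try for yourself programmatically. Below is a minimal sketch using the OpenAI Python client (v1+); the model name and prompt wording are just assumptions for illustration, not a claim about how anyone else tested this:

    ```python
    # Minimal sketch: ask an LLM a nonsensical syntax question and see how
    # confidently it answers. Assumes the official `openai` Python package (v1+)
    # and an OPENAI_API_KEY in the environment; the model name is an assumption.
    from openai import OpenAI

    client = OpenAI()

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical model choice for illustration
        messages=[
            {"role": "user", "content": "What is the longest 5-letter word?"},
        ],
    )

    # Every 5-letter word has exactly five letters, so any reply that names a
    # specific "longest" word is the model answering confidently without
    # understanding the question.
    print(response.choices[0].message.content)
    ```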