Just in case someone doesn’t know, LLM in this case means “Large Language Model”, which is just the technical term for things like ChatGPT.
It’s like searching for a picture of Prague, seeing a drawing of Delhi, and then concluding you’ve been there. It’s not about laziness. It’s about accuracy.
Yeah, we’re not there yet, but the way things are going, I don’t see it being THAT far off. Maybe within 5 years it’ll be as accurate as anything else.
Yes, I think if we can get an LLM to work while providing high quality, real world sources it will be a game changing technology across domains. As it stands though, it’s like believing a magician really does magic. The tricks they employ are incredibly useful in a magic show, but if you expect them to really cast a fireball in your defense, you’ll be sorely mistaken.
Huh, I was under the impression that ChatGPT cited its sources. I know others do it.
Did you read my comment at all? I was replying to a comment about the level of effort, which is what my analogy addresses.
Your hyperbole notwithstanding, if the accuracy isn't good enough for you, don't use it. Lots of people find that LLMs are useful even in their current state of imperfect accuracy.
Did you read mine? If you wanted a depiction of a city, it's more than good enough. In fact, it's amazing what it can do in that respect. My point is: it gets major details wrong in a way that feels right. That's where the danger lies.
If your GPS consistently brought you to the wrong place, but you thought it was the right place, do you not think that might be a problem? No matter how many people found it useful, it could be dangerously wrong in some cases.
My worry stems precisely from how useful people find it for "looking things up", paired with its tendency to wildly construct "information" that feels true. That's a real, serious problem people need to understand when using it that way.