• wiegell@feddit.dk

    My naive hope is that local models, or maybe workplace-distributed clusters, catch up and the cloud-based bubble bursts. My impression is that right now, whether a tool works well or not has less to do with the LLM itself and more to do with how well all the software around it is constructed. E.g. for discovery, being able to quickly ingest a URL and query a web index is a big strength of the cloud-based providers at the moment. And for coding, a lot of it comes down to quickly searching and finding the relevant parts of the codebase and evaluating whether the LLM has all the information it needs to perform the task correctly.
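
    To make the coding part concrete, here is a rough sketch of the kind of plumbing I mean; the keyword scoring, the context budget, and the file filter are just my own placeholder assumptions, not how any particular tool actually does it:

    ```python
    # Rough sketch (my own assumptions, not any specific tool's API): naive keyword
    # retrieval over a local repo, plus a check that the gathered context fits a
    # local model's window before handing the task over.
    from pathlib import Path

    CONTEXT_BUDGET_CHARS = 24_000  # stand-in for a local model's context window

    def find_relevant_files(repo_root: str, keywords: list[str], top_n: int = 5):
        """Score source files by how often the query keywords appear in them."""
        scored = []
        for path in Path(repo_root).rglob("*.py"):
            text = path.read_text(errors="ignore")
            score = sum(text.lower().count(k.lower()) for k in keywords)
            if score:
                scored.append((score, path, text))
        scored.sort(key=lambda item: item[0], reverse=True)
        return scored[:top_n]

    def build_prompt(task: str, repo_root: str, keywords: list[str]):
        """Assemble a prompt, stopping when the context budget is reached."""
        context, used = [], 0
        for _score, path, text in find_relevant_files(repo_root, keywords):
            snippet = f"# file: {path}\n{text}\n"
            if used + len(snippet) > CONTEXT_BUDGET_CHARS:
                break
            context.append(snippet)
            used += len(snippet)
        if not context:
            return None  # the model likely lacks the info to do the task correctly
        return f"{task}\n\nRelevant code:\n" + "\n".join(context)

    # prompt = build_prompt("Fix the retry logic", ".", ["retry", "backoff"])
    # if prompt is not None: send it to a local model (ollama, llama.cpp, etc.)
    ```

    The point being that none of this requires a frontier model, just decent retrieval and a sanity check on whether the model has enough context, which is exactly the kind of glue the cloud providers currently do better.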