Not sure if you really want to know, but transformers (the backbone of LLMs) were first introduced in a 2017 Google paper, "Attention Is All You Need." Google initially used transformers for translation and eventually search, while OpenAI experimented with them for text generation (GPT-1 onward), eventually leading to ChatGPT.

What’s hilarious is that it all feels like bottom-of-the-barrel stuff, like things we should just be doing by default.