Disclaimer: I am not informed about the development of AI, LLMs, etc. or what we were doing with them pre-covid.
I remember them exploding when everyone was doing at-home/hybrid classes, and I wonder how much of an effect on things that really had.
Without spending a lot of time digging into it, I don’t think that the pandemic was a major enabling factor.
Like, work on larger, hardware-driven neural nets and on image generation had been going on well before the pandemic. I remember articles about Google researchers working on them quite some time back.
A couple of quick searches turn up:
https://en.wikipedia.org/wiki/DeepDream
https://research.google/blog/inceptionism-going-deeper-into-neural-networks/
Like, if you’ve used a diffusion model like Stable Diffusion and run low-iteration stuff in Automatic1111 or ComfyUI or something, that looks pretty familiar.
And that was released back in 2015, and it was based on pre-existing work.
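If you want to see that low-step look for yourself with a modern diffusion model, here's a minimal sketch using Hugging Face's diffusers library; the model ID, prompt, and step count are just examples I picked, not anything from the linked posts:

```python
# Minimal sketch: generate an image with a deliberately low number of
# denoising steps to see the rough, half-formed output that low-iteration
# sampling produces. Assumes a CUDA GPU plus the diffusers and torch packages.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example checkpoint; any SD model works
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# Very few steps -> noisy, dreamlike output; ~25-50 steps is more typical.
image = pipe("a photo of a park full of dogs", num_inference_steps=4).images[0]
image.save("low_steps.png")
```

At a handful of steps you tend to get the swirly, half-resolved imagery that's reminiscent of the DeepDream stuff above.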
I think that the “why now” question mostly just has to do with hardware reaching the point where you can do some significantly-more-interesting things with neural nets, coupled with some mostly-iterative software improvements.
I don’t think that this was a “people are staying inside due to the pandemic and that drove a lot of change” situation, in the sort of sense that, say, the pandemic had an impact on video game sales, something that I was talking about on here recently.