I’m not sure where you got that from, but the only thing from Hugging Face is the dataset used to train it.
Dubbed CraftGPT, this version of the Large Language Model is decidedly compact as far as these things go, featuring only about 5 million parameters (as that’s all the creator’s “poor old laptop” could handle). However, translating that into Minecraft blocks took up a considerable amount of space.
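For scale, here’s a back-of-the-envelope parameter count for a small GPT-style transformer. The dimensions are my own assumptions chosen to land near the ~5 million figure, not CraftGPT’s actual architecture, which isn’t specified here:

```python
def transformer_params(vocab: int, d_model: int, n_layers: int, d_ff: int) -> int:
    """Rough parameter count for a GPT-style transformer
    (ignoring biases and layer norms, which are comparatively tiny)."""
    embed = vocab * d_model                  # token embedding table
    per_layer = (
        4 * d_model * d_model                # Q, K, V, and output projections
        + 2 * d_model * d_ff                 # feed-forward up- and down-projections
    )
    return embed + n_layers * per_layer

# Hypothetical dimensions that happen to total roughly 5 million parameters:
print(transformer_params(vocab=4000, d_model=240, n_layers=6, d_ff=960))  # → 5107200
```

Even at this toy scale, most of the parameters sit in the per-layer attention and feed-forward matrices, which is why encoding them as redstone circuitry balloons into hundreds of millions of blocks.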
Featuring 439 million blocks, this build required the Distant Horizons mod to keep everything on screen and operational. Beyond that, sammyuri claims that this build was made using only vanilla Minecraft’s own redstone mechanics. Built over many months, it’s a creative use of time that I find still preferable to all the hours I spend doomscrolling.
It was trained with Python; it’s running entirely in Minecraft, though. If it takes hours to get a response, the training would probably have taken years or longer.
There’s even a whole video linked showing it in action: https://www.youtube.com/watch?v=VaeI9YgE1o8