• ☆ Yσɠƚԋσʂ ☆@lemmy.mlOP
    10 days ago

    I’m not sure where you got that from; the only thing from Hugging Face is the dataset used to train it.

    Dubbed CraftGPT, this version of the Large Language Model is decidedly compact as far as these things go, featuring only about 5 million parameters (as that’s all the creator’s “poor old laptop” could handle). However, translating that into Minecraft blocks took up a considerable amount of space.

    Featuring 439 million blocks, this build required the Distant Horizons mod to keep everything on screen and operational. Beyond that, sammyuri claims that this build was made using only vanilla Minecraft’s own redstone mechanics. Built over many months, it’s a creative use of time I find still preferable to all the hours I spend doomscrolling.
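The two figures quoted above imply a fun back-of-envelope ratio; a minimal sketch (both numbers are the approximate values from the article, not exact counts):

```python
# Back-of-envelope from the figures quoted above (both approximate).
parameters = 5_000_000   # ~5 million parameters claimed for CraftGPT
blocks = 439_000_000     # ~439 million Minecraft blocks in the build

blocks_per_param = blocks / parameters
print(f"~{blocks_per_param:.0f} blocks per parameter")  # prints "~88 blocks per parameter"
```

So each trained weight costs on the order of ninety blocks of redstone machinery to store and use.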

    There’s even a whole video linked showing it in action: https://www.youtube.com/watch?v=VaeI9YgE1o8

    • NichtElias@sh.itjust.works
      8 days ago

      It was trained with Python, but it runs entirely in Minecraft. If it takes hours to get a response, training it in-game would probably have taken years or longer.
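      The "years or longer" intuition checks out with rough arithmetic: a training step costs roughly a forward pass plus a backward pass (often estimated at ~3× a forward pass), and training visits far more tokens than one response contains. A hedged sketch, where every number is an assumption chosen only for illustration:

      ```python
      # Rough estimate; all numbers below are illustrative assumptions,
      # not figures from the CraftGPT project.
      hours_per_response = 2        # assume one in-game response takes ~2 hours
      tokens_per_response = 64      # assume ~64 tokens generated per response
      hours_per_token = hours_per_response / tokens_per_response

      training_tokens = 10_000_000  # assume a small ~10M-token training corpus
      backprop_factor = 3           # forward + backward ≈ 3x a forward pass

      training_hours = training_tokens * hours_per_token * backprop_factor
      print(f"~{training_hours / 24 / 365:.0f} years of in-game training")
      # prints "~107 years of in-game training"
      ```

      Even with these generous assumptions (a tiny corpus, a single epoch), in-game training lands in the century range, which is why doing it in Python first is the only practical option.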