Meanwhile inzoi is busy shoving AI into as many holes as they can find
I like inzoi’s use of AI. I think it’s really cool, and it’s much needed to fill the content gap compared to the decade-old Sims franchise until the inzoi community can step in and create some content.
Well, I must say it is probably one of the applications where it makes the most sense; hallucinations don’t matter if it’s in a game, and it makes characters more lifelike and less NPC. I can get behind that!
How do you think hallucinations would impact a game? It doesn’t mean that characters in a game would react in novel and unpredictable ways; it means the game is less stable, more prone to issues.
I think they were implying LLMs being used for the behavior/responses of game characters rather than vibe-coded game programming.
Exactly. A specifically trained model could even run locally on the GPU, with no need to be always online, but at the cost of increased hardware requirements, especially VRAM.
Basic tasks like that aren’t super resource-heavy, so I don’t think it would impact hardware requirements all that much.
Training the models is the most resource-intensive part.
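For a sense of what that would look like in practice, here is a minimal sketch of fully local NPC dialogue generation using Hugging Face transformers. The model (Qwen2.5-0.5B-Instruct), the persona, and the prompt are illustrative assumptions, not anything inZOI or any other game actually ships.

```python
# Minimal sketch: generate one line of NPC dialogue from a small
# instruction-tuned model held entirely in local VRAM. Model name,
# persona, and prompt are illustrative, not from any real game.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL = "Qwen/Qwen2.5-0.5B-Instruct"  # ~0.5B params; roughly 1 GB of weights at fp16

tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForCausalLM.from_pretrained(MODEL, torch_dtype=torch.float16).to("cuda")

def npc_reply(persona: str, player_line: str) -> str:
    """One dialogue line, generated fully offline on the local GPU."""
    messages = [
        {"role": "system", "content": f"You are a character in a life-sim game. Persona: {persona}"},
        {"role": "user", "content": player_line},
    ]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(
        input_ids,
        max_new_tokens=60,
        do_sample=True,
        temperature=0.8,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Strip the prompt tokens and return only the newly generated reply.
    return tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True)

print(npc_reply("grumpy neighbor who hates loud music", "Sorry about the party last night!"))
```

Even a model this small keeps roughly a gigabyte of weights resident in VRAM for the whole session, on top of whatever the game itself needs.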
There are ultra-low-powered LLMs, but even then you’re looking at at least 2 GB, and the most typical graphics card has 8 GB, so there’s going to be at least some impact. Their intelligence/capability also scales hard with memory usage, so for most things you might want to use an LLM for, the smallest models likely wouldn’t be good enough.
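Those numbers follow almost directly from parameter count times bytes per weight. A back-of-the-envelope sketch (the 0.5 GB overhead for KV cache and activations is an assumption, not a measurement):

```python
# Back-of-the-envelope VRAM estimate: weights = params x bytes per weight,
# plus a rough allowance for KV cache and activations.
def estimate_vram_gb(params_billion: float, bits_per_weight: int, overhead_gb: float = 0.5) -> float:
    weights_gb = params_billion * (bits_per_weight / 8)  # 1B params at 8 bits ~= 1 GB of weights
    return weights_gb + overhead_gb

print(estimate_vram_gb(1.0, 16))  # ~2.5 GB: a ~1B model at fp16, the "at least 2 GB" ballpark
print(estimate_vram_gb(1.0, 4))   # ~1.0 GB: the same model squeezed down with 4-bit quantization
print(estimate_vram_gb(7.0, 4))   # ~4.0 GB: a noticeably smarter 7B model already eats half an 8 GB card
```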
For example, there’s a RimWorld mod that adds locally generated flavor-text dialogue using such a low-powered model. But it’s a really simple feature that doesn’t affect actual gameplay at all. Games where the gameplay interacts with LLM output in any way are going to have higher hardware requirements, to the point where they’ll need to use the graphics card more for that and less for the actual graphics; it’s enough of a bottleneck that anyone wanting to do this will basically need to design the game around it.
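To make the “simple feature that doesn’t affect gameplay” distinction concrete, here is a sketch of the pattern such a mod can use: generation runs on a worker thread and the game loop never waits on it, falling back to canned lines when the model isn’t done. `generate_line()` is a hypothetical stand-in for a call into whatever local model is used.

```python
# Sketch of keeping LLM flavor text off the critical path: generation runs on
# a worker thread, and the game loop never blocks on it. generate_line() is a
# hypothetical stand-in for a call into a local model.
import random
import time
from concurrent.futures import ThreadPoolExecutor

CANNED_LINES = ["Nice weather today.", "Back to work, I suppose."]

def generate_line(context: str) -> str:
    # Pretend this is a local-model call that ties up the GPU for a while.
    time.sleep(2.0)
    return f"(generated chatter about: {context})"

executor = ThreadPoolExecutor(max_workers=1)
pending = executor.submit(generate_line, "colonist finished hauling steel")

def flavor_text() -> str:
    # Use the generated line only if it is already finished; otherwise fall
    # back to canned text so the frame never stalls on the model.
    return pending.result() if pending.done() else random.choice(CANNED_LINES)

for frame in range(4):          # stand-in for a few game-loop ticks
    print(f"tick {frame}: {flavor_text()}")
    time.sleep(1.0)
executor.shutdown(wait=True)
```

Once gameplay logic actually depends on the model’s output (a decision, a quest state, a social interaction), that fallback trick stops working, which is where the higher hardware requirements and the “design the game around it” cost come in.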
The coding isn’t AI; it’s stuff like textures and apparently dialogue?
It just means the death of offline gaming