Researchers at MIT decided to run a wild experiment by dropping 1,000 AI agents into Minecraft and giving them a simple goal: build a community. What followed feels like science fiction edging into reality. The AI didn’t just stack resources and wander around aimlessly. They organized and formed societies.
Just as a brain is not a giant statistics problem, LLMs are not intelligent. LLMs are basically large math problems that take what you put into them and calculate what comes next. That isn’t emergent behavior. That isn’t intelligence at all.
If I type 20*10 into a calculator and it gives me 200, is that a sign of intelligence, that the calculator can do math? I never programmed it to know what 10 or 20 or 200 were, though I did make it know what multiplication is and what digits and numbers are, so those particular results it totally created on its own after that!
When you type a sentence into an LLM and it returns an approximation of what a response sounds like, you should treat it the same way. People programmed these things to do exactly what they are doing, so what behavior here is actually emergent?
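To make the "it's just a large math problem" claim concrete, here is a toy sketch of what a single next-token step boils down to. Everything here is made up for illustration (the tiny vocabulary, the two-number "input features", the weight values): a real model has billions of parameters, but the shape of the computation is the same, dot products followed by a softmax, with the "response" being whichever token scores highest.

```python
import math

# Made-up three-word vocabulary and made-up weights, purely illustrative.
# One row of "parameters" per vocabulary token.
vocab = ["yes", "no", "maybe"]
weights = [[0.2, -0.1], [0.5, 0.3], [-0.4, 0.8]]

def next_token(features):
    # Logits: plain dot products of the input against each row. Just arithmetic.
    logits = [sum(w * x for w, x in zip(row, features)) for row in weights]
    # Softmax: rescales the logits into a probability distribution.
    exps = [math.exp(l) for l in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # The "answer" is the highest-probability entry. No understanding required.
    return vocab[probs.index(max(probs))]

print(next_token([1.0, 2.0]))  # prints "maybe"
```

Whether you call the behavior of stacking billions of these steps "emergent" is exactly the argument above; the sketch just shows that each individual step is ordinary arithmetic.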