Yes, because then your car won’t start
There is a TV film, I don’t know the title, about this topic.
The plot was:
a group of scientists made a living simulation, and go in the simulation to operate fixes and prevent making simulation. One day, one of the scientists was killed and left a message in the simulation for his coworkers. The message was: “take a road and follow no direction”. A guy in the simulation followed the instruction and discovered that he was in a simulation, but the message was for the scientists, who were in a simulation too.
If someone can find the movie, that would be great.
and go in the simulation to operate fixes and prevent making simulation
Pardon?
In 2000, The Thirteenth Floor was nominated for the Saturn Award for Best Science Fiction Film but lost to The Matrix.
Yeah that’s a pretty good sell. I’ll check it out.
I liked it, but I like almost everything, so that’s not much of a sell.
Found God’s account
Instead of a Dark Lord, you would have a queen, not dark but beautiful and terrible as the dawn! Tempestuous as the sea, and stronger than the foundations of the earth! All shall love me and despair!
The fun thing about ethics is that not everyone shares the same rules. Personally, I would probably say it is. However, others may say those beings aren’t real, only an illusion manufactured by the simulation, so it’s fine. There are other arguments I’m sure someone could make too. It’s up to you to decide what your ethics are, not others. There is no universal code of ethics, just as there is no universal morality.
You can “simulate” life inside your brain, too.

[Alt text: this is Bob. Bob is a figment of your imagination. When you leave, Bob will leave too. “Don’t leave,” says Bob]The Bob in your head is intelligent; it can communicate in English. Is it unethical to stop thinking about Bob? Was it unethical of me to show you this picture, creating a “Bob” in your head? Is any story unethical to tell?
Hmm. I also imagined a new Bob that says, “it’s ok to leave, I will be ok”.
Just turn down the simulation speed real low and run it at one tick per 20 years, then you can technically keep it going without such great expense. The people inside won’t notice the difference.
If you take the limit of that, you’ll realize that people won’t notice if you turn it off either.
“Now playing human music, on Earth Radio”
Hahahaha. Eat apple!
I’d imagine there could be an ethical way to do so through a sunset protocol similar to the concept of rapture (the religious kind, not the BioShock city): freeze the simulation, move all the beings’ minds to “heaven”, shut down the physical universe simulation (lowering operating costs by at least five orders of magnitude, I’d imagine), and let them enjoy the afterlife until they get tired of existing, reach nirvana, or something like that.
That reminds me, I should really get back into AI research.
only way to know would be to enter the simulation and see for yourself… wait a minute…
Did you just watch “Plaything” on Black Mirror?
Sometimes I think of that episode when I play Rimworld and start harvesting blood and organs from a prisoner
I was thinking the microverse battery from Rick and Morty!
Intelligence isn’t the important factor there - consciousness is. Does it feel like something to be those entities in the simulation? If yes, then I’d argue that ending the simulation is like killing a person painlessly in their sleep.
I personally don’t think ending the simulation is even the most troubling part. We could unintentionally create a simulation that’s effectively a hell and then populate it with entities that have subjective experiences we don’t realize exist. The only thing worse than ending a life is creating one just for it to suffer through its entire existence.
We could unintentionally create a simulation that’s effectively a hell and then populate it with entities that have subjective experiences we don’t realize exist. The only thing worse than ending a life is creating one just for it to suffer through its entire existence.
And this is basically the plot of the TV series Severance. Has me wondering how they intend to address it.
USS Callister
Didn’t scientists train brain cells to exclusively play Doom? It’s like their whole consciousness is stuck in a video-game version of hell through a brain-in-a-vat experience.
Not really. It’s not nearly enough cells to have any kind of consciousness as we know it. A few neurons learning to play a game is a far cry from tying a being into a simulation of hell.
I dunno. Some life forms have only a few brain cells. It could be their whole world for those little cells, couldn’t it?
It is definitely their entire world, but the point is it takes far more than a few cells to create actual human-relatable sentience.
That’s coming from someone who fully understands and knows that many more animals than most humans care to admit also have sentience.
Those petri dishes are not sentient nor conscious.
The only thing worse than ending a life is creating one just for it to suffer through its entire existence.
Antinatalism entered the chat
Or maybe just well reasoned morality?
Only if they’re conscious of the simulation.
Username checks out.
It would be unethical to start the simulation in the first place…
If this is a way for our simulation’s creator to decide to pull the plug without guilt, I guess just go ahead and do it. I was holding out hope that this was all real, but it has been getting clearer that it’s not.
Couldn’t they just make us all infertile and let us die naturally or something?
This is a tough question. I think to answer it, you have to know whether those simulated beings have actual consciousness/sapience or if that is just simulated.