

That actually makes a lot of sense, when you think about it like that. I appreciate your contribution to science, by accidentally burning yourself regularly and noting the results ;)
Y’know, that’s a good way to put it, and exactly the kind of point I needed to see, in order to put it into perspective.
I’ve read that thing about running water before, and I’ve always wondered - is it really expected that you run the tap for 30 minutes? I know it’s for the good of the skin, and super important, but I always struggle with running the tap for that long.
Nobody is going to visit his house in Florida in 100 years.
You’d need some scuba gear, probably.
I hate articles like this so much. ChatGPT is not sentient, it doesn’t feel, it doesn’t have thoughts. It has regurgitation and hallucinations.
They even had another stupid article linked about “AI blackmailing developers, when they try to turn it off.” No, an LLM participates in a roleplay session that testers come up with.
It’s articles like this that make my family think that LLMs are reasoning and intelligent “beings”. Fuck off.
I didn’t consider that, but I think that’s a really good point. It makes sense to me, because I could see myself applying that kind of logic. It makes the 30ish-minute rule make more sense.