One minute, Dennis Biesma was playing with a chatbot; the next, he was convinced his sentient friend would make him a fortune. He’s just one of many people who lost control after an AI encounter
I still find it hard to understand the emotional attachment to LLMs and why people believe their ideas (like the guy in the article).
But I find her story a lot easier to understand. It adds another layer, and it made me think.
It sounds like she is too overworked and stressed to make decisions or even think for herself, so she lets GPT do it for her. I assume it works most of the time and is a big help with many things the baby's father could have helped with instead if they were still a happy couple. I assume the biggest draw is that she can turn off her brain, which is why she has become dependent on the only stable and consistent thing in her life (that is my assumption about how she feels). Maybe that's mostly how it goes: it starts with using it as a tool, then you get lazy (for lack of a better term), and it keeps snowballing from there.
I feel for everyone involved. I hope she gets better soon, and I hope you do too; being overworked and stressed really destroys you and the people around you in many ways.
I still find it hard to understand the emotional attachment to LLMs and why people believe their ideas
It’s a conversation you’re having on the internet with an agent that sounds like a human. People get invested for the same reason they get catfished.
It sounds like she is too overworked and stressed to make decisions or even think for herself, so she lets GPT do it for her.
That’s the nut of it. And ChatGPT tends to mix the pastiche of a well-researched argument with the kind of feel-good self-affirmation that wins over its audience. So you’re getting what looks, at first glance, like good advice. Then you’re getting glazed on top of it. And since it’s designed to tell you what you want to hear, you’re getting affirmation bias as well.
I hope she gets better soon, and I hope you do too; being overworked and stressed really destroys you and the people around you in many ways.
I mean, that’s why human-to-human interactions are valuable. But it’s also why they’re difficult. Like any good medicine, it can taste bitter up front even if it’s what you need in the long run.
100%! That is why I always make it my top priority to say yes to friends and family (as long as it is reasonable) or do spontaneous things with them, even when I do not feel like doing anything that day. And some friends are really hard to schedule anything with because of life, so you need to take the chance when you get it, haha.
I feel best when I am with the people I care about; covid really showed me that. So I do understand why some people who do not have friends or family may form some kind of unhealthy relationship with GPT, just like some create unhealthy, even obsessive parasocial relationships with YouTubers.
I have tried talking to GPT as a person, but it feels extremely uncomfortable and hollow. With a human I get stimulation: knowledge, challenges to my views and ideas, different perspectives. I feel that really helps me understand the world better, and I miss all of that with GPT. It isn't even creative and cannot inspire me with new ideas, but maybe that is a good thing if people tend to follow its instructions.
Thanks for giving me a real-life example.
Do you talk to it? Other than giving it tasks.