An AI can autocomplete text in a variety of ways, including what a conversation with a wife might look like. For some people, that can be entertaining, or funny, or cool. But some people think there is something more going on there. And that is sad.
I worked in juvenile detention (in a, let’s say, probationary setting) in a step-down unit set up kind of like a halfway house. You get out of jail but are still supervised 24/7 and subject to the same rules, except you live in a house, can get a job, have family visits, etc. One client I had, we almost had to recommit to jail because he lost phone privileges over something stupid and was freaking out because he couldn’t talk to his AI girlfriend. And this was like 2019, so it wasn’t even really AI like we know it currently, it was just a chatbot. The guy was so deep into the weeds he was convinced it was developing feelings and divulging “personal” details to him. It would be funny if not for how serious he was, and I’m sure it has only gotten worse.
That’s really sad. I hope the kid got away from that shit.
Honestly, I doubt it. We didn’t have many successful outcomes, because we were only there until they were released or they aged out of the system. And that doesn’t kill gang ties, or easy money. But there were a few that I think got out and did well, and those are the ones I think about from time to time.
Sounds like a tough job.