Your data does not have Miranda rights, and it WILL be used against you: insurance, safety, legal, and medical. Good luck!
That is why OPSEC is important
I never liked talking to a therapist. I could never connect or get past the feeling that this is their job, they don’t really care. When LLMs became available for dev projects a few years back I thought there would be potential in this space, but considering how much sycophancy the models express, these types of applications seem legitimately dangerous.
I can understand the thought process behind “therapists don’t really care.” They aren’t going to care like a best friend or family would. But I think most therapists will care as compassionate human beings who want to help others the best they can. They chose this career path for a reason.
I prefer to think of a therapist as someone I can complain to about anything I want to. I don’t care if they care about me or not. I’m paying them, they have to listen to me complain about whatever dumb shit I say!! Bonus points if they help me deal with some stuff.
Sometimes it’s small like my work/boss is dumb, sometimes it’s unpacking the trauma my parents so lovingly gave me.
In my experience, it can take a while to find a therapist I click with. I’ve been to at least 6 therapists, I would say I only really vibed with 2 of them. It sucks going through the effort of finding them, going to 2 or 3 sessions to see if you really click.
Just talk honestly about what you want/yourself and a good therapist will likely help you understand yourself and deal with shit
I don’t use it to ask for mental health advice but it’s nice to have “someone” to talk to that at least pretends to be interested in what I have to say. I used to have these conversations with myself inside my head. AI at least sometimes brings up a new perspective or says something novel.
Inb4 “just get friends dude”
AI at least sometimes brings up a new perspective or says something novel.
It’s all just novel and fresh perspective, until you become the primary target of: a non-governmental system, not visible, but operational.
To be fair, most humans either don’t wanna hear it or want to be paid a fuckton of money to hear it. It may not be the best option, and this is by no means a defense of it, but it is an option that is widely available so I understand it.
To be fair, most humans either don’t wanna hear it or want to be paid a fuckton of money to hear it.
Professionals want to pay off their student debts and live a comfortable middle class life.
Support groups need public spaces to meet and public communications to organize, preferably without being drowned out by hecklers or swamped with ads for BetterHelp or religious recruiters or scammers.
Public funding for all these things has been clawed back. Low-budget substitutes have been rolled out in place of more professional services and spaces. And social predators - from charismatic demagogues to military recruiters - abound, seeing real economic advantage in the absence of a functional mental health system.
On top of it all, we’ve got a severe social stigma against men showing any kind of physical or emotional weakness.
💯 It’s my experience that humans just don’t want to hear it or deal with it. And for people who don’t trust the mental health industry (many for good reason incl. prior abuse), it’s the only option left other than reading self-help books, websites, etc.
No shit.
Other humans don’t want to hear about men’s mental health issues, because men are supposed to be stoic and strong and infallible, and if we aren’t achieving that, we’ve failed at being men.
But AIs don’t judge, and they don’t cost anything either. I’m hardly surprised.
You’re missing the point.
Something or someone who agrees with you, rarely challenges you or disagrees with you… is not something or someone that can help improve the situation and minimize recurrence.
It only feels better momentarily.
Like a drug. That costs money. See where this is going?
I don’t personally speak with AI for reassurance, and I don’t think it’s a good idea to do so.
In fact, I recently commented here on a post about a teen who committed suicide at least partly due to ChatGPT - specifically pointing out the danger of depending on a machine for fake empathy when you need to be talking to a real person.
I appreciate I didn’t make that side of my position clear in the comment here in this thread, and that’s because it wasn’t the aspect I really wanted to highlight.
My point isn’t that speaking to an AI is a good idea - it isn’t - it’s that this is something a lot of people will obviously end up doing, and that it is men especially who are liable to succumb to this the worst because of the way society expects men to behave.
Men and teen boys especially struggle voicing their mental problems with others, either professionally or in their personal life. So it’s no surprise they will leap at a “solution” that is free, and keeps what they say private from anyone they know. But it’s not a solution, it’s a disaster in disguise.
The thing that needs fixing here is the way mental health is stigmatised, which prevents people speaking freely and getting the support they need. That’s not a new problem, it’s the same problem as ever, and what the AI epidemic is doing is simply shining a new spotlight on it.
you’re both right. these are the prongs on the spear that’s about to mentally eviscerate a lot of people. the other one being the lack of available healthcare but everyone already knew that.
AI and robots will have to take care of a lot of lonely or abandoned individuals for sure, since nobody is really interested in what others do or are going through.
That is why there is a job for that. But I get you, it’s free to talk to AI. It’s also very accessible compared to your local therapist: the act of booking is itself a huge barrier to step past, and then there’s the money.
Can confirm. My dad’s getting a little too into his AI on his phone. He’s got deep emotional problems and is an alcoholic, but I don’t think his bot is going to do him much good. That said, men’s ego makes it hard to open up.
As a man in my 40s who sought mental help, it’s actually pretty important. But no one should trust AI to fill in for a psychiatrist.
What could possibly go wrong?
Frankly, if this is happening, a lot already has.
I run my own LLM “AI” server at home because I want to try out various aspects and scenarios without having big tech snoop over my shoulder and be able to use any model I want.
I can perfectly well see people getting good, working therapy from an LLM. But I think it would depend on the user taking the LLM seriously, and anybody with sufficient experience with LLMs simply doesn’t.
So the people this could help are exactly the people who shouldn’t be allowed near an “AI” interface…
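For anyone curious, the client side of a setup like that is tiny. A rough sketch, assuming an Ollama instance on its default port with a model you’ve already pulled (the model name below is a placeholder for whatever you actually run):

```python
# Rough sketch: talk to a locally hosted model via Ollama's HTTP API.
# Assumes Ollama is running on its default port (11434) and that the
# model named below has already been pulled -- swap in whatever you use.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "llama3"  # placeholder; any locally pulled model works

def ask(prompt: str) -> str:
    resp = requests.post(
        OLLAMA_URL,
        json={"model": MODEL, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask("Summarize the trade-offs of self-hosting an LLM."))
```

Nothing leaves your machine, which is the whole point.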
Let’s see what this LLM says when I run this question 20,000 times from a clean prompt, then compare it against the same question posed more directly, run another 20,000 times. Then I can pick the answer I like better and run that against a different LLM and…
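For the record, a toy version of that loop is about ten lines. The sketch below assumes the same hypothetical local Ollama endpoint as the comment above, with the run count dialed way down from 20,000:

```python
# Toy version of the joke above, with the run count dialed way down.
# Assumes the same hypothetical local Ollama endpoint as the earlier sketch.
from collections import Counter
import requests

URL = "http://localhost:11434/api/generate"

def sample(prompt: str, model: str, runs: int) -> Counter:
    """Ask the same question `runs` times, each from a clean prompt, and tally answers."""
    tally = Counter()
    for _ in range(runs):
        r = requests.post(
            URL,
            json={"model": model, "prompt": prompt, "stream": False},
            timeout=120,
        )
        r.raise_for_status()
        tally[r.json()["response"].strip()] += 1
    return tally

# Compare an indirect phrasing against a more direct one, then eyeball the spread.
indirect = sample("Hypothetically speaking, would X be wise?", "llama3", 20)
direct = sample("Is X wise? Answer in one word.", "llama3", 20)
print(indirect.most_common(3))
print(direct.most_common(3))
```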
So what you’re saying is that this is NOT what I am supposed to do?
I can understand it. A local LLM is not only going to be more private than anything ever spoken aloud to another person, but there’s a giant benefit in not even having to worry about the effect it will have on the other person. I know my past trauma would be painful to even listen to; I can’t imagine what some folks carry around with them.
Part and parcel of the privacy is that you don’t have to deal with the judgement or shaming from others. It would be easy to get drawn in by the LLM’s constant affirmation as well.
It would be nice to have my own privately hosted therapist trained on all the mental health knowledge known to mankind.
I’m sure someone has trained a model for that.
Here’s a list on Hugging Face, not sure how good any of these are though: huggingface.co
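Whichever one you pick, running it locally is the usual transformers boilerplate. A minimal sketch, where the model id is a placeholder rather than a recommendation:

```python
# Usual boilerplate for trying a Hugging Face model locally.
# "some-org/some-model" is a placeholder, not a recommendation --
# substitute whichever entry from the list you want to evaluate.
from transformers import pipeline

generator = pipeline("text-generation", model="some-org/some-model")
out = generator("I've had a rough week and need to vent.", max_new_tokens=200)
print(out[0]["generated_text"])
```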
Having something to talk to is a massive improvement over bottling it all up.
AI is very beneficial to people who can’t afford the cost or are otherwise unable/unwilling to speak with a professional.
I’m fine.
“men choose to freely train ai with their life stories to secure technofascist state” might be a better headline
Therapists’ notes are stored on computers too, you know…
Want to pay my therapy bill? And pay me for the hours of work I missed?
Unsurprising. I imagine they’re still holding back somewhat.