I know it’s not even close there yet. It can tell you to kill yourself or to kill a president. But what about when I finish school in like 7 years? Who would pay for a therapist or a psychologist when you can ask a floating head on your computer for help?
You might think this is a stupid and irrational question. “There is no way AI will do psychology well, ever.” But I think in today’s day and age it’s pretty fair to ask when you are deciding about your future.
deleted by creator
Eh, I give it 5 years.
Never say never, because everything is possible given enough time. The only question being how much time.
deleted by creator
I won’t trust a tech company with my most intimate secrets. Human therapists won’t be fully replaced by AI.
It’s just like with programming: The people who are scared of AI taking their jobs are usually bad at them.
AI is incredibly good at regurgitating information and at translation, but not at understanding. Programming can be viewed as translation, which is why LLMs are good at it. LLMs on their own won’t get much better at understanding; we’re at a point where they are already trained on all the good data from the internet. Now we’re starting to let AIs collect data directly from the world (ChatGPT being public is largely a play to collect more data), but that’s much slower.
I am not a psychologist yet. I only have a basic understanding of the job description but it is a field that I would like to get into.
I guess you are right. If you are good at your job, people will find you just like with most professions.
I slightly disagree. In general I think you’re on point, but artists especially are actually being fired and replaced by AI, and that trend will continue until there’s a major lawsuit because someone used a trademarked thing from another company.
No, it won’t. I don’t think I would have made it here today alive without my therapist. There may be companies that have AI agents doing therapy sessions but your qualifications will still be priceless and more effective in comparison.
Given how little we know about the inner workings of the brain (I’m a materialist, so to me the mind is the result of processes in the brain), I think there is still ample room for human intuition in therapy. Also, I believe there will always be people who prefer talking to a human over a machine.
Think about it this way: Yes, most of our furniture is mass-produced by IKEA and others like it, but there are still very successful carpenters out there making beautiful furniture for people.
That’s a fair point.
deleted by creator
That’s a great answer. Thank you.
I don’t think the AI everyone is so buzzed about today is really a true AI. As someone summed it up: it’s more like a great autocomplete feature but it’s not great at understanding things.
It will be great to replace Siri and the Google assistant but not at giving people professional advice by a long shot.
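The “great autocomplete” framing can be made concrete with a toy sketch: a bigram model that predicts the next word purely from counted co-occurrences in text it has seen. The corpus and function names here are made up for illustration; real LLMs are vastly larger, but the principle of predicting the next token from observed patterns, with no model of meaning, is the same.

```python
from collections import Counter, defaultdict

# Toy "autocomplete": count which word follows which in a tiny corpus,
# then always predict the most frequent follower. No understanding,
# just echoing observed patterns.
corpus = "the cat sat on the mat the cat ate the fish".split()

followers = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    followers[prev][nxt] += 1

def autocomplete(word):
    """Return the word most often seen after `word`, or None."""
    counts = followers.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(autocomplete("the"))  # "cat" follows "the" most often in this corpus
```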
Not saying an LLM should substitute a professional psychological consultant, but that someone is clearly wrong and doesn’t understand current AI. Just FYI
Care to elaborate?
It’s an oversimplified statement from someone (sorry I don’t have the source) and I’m not exactly an AI expert but my understanding is the current commercial AI products are nowhere near the “think and judge like a human” definition. They can scrape the internet for information and use it to react to prompts and can do a fantastic job to imitate humans, but the technology is simply not there.
Even if AI did make psychology redundant in a couple of years (which I’d bet my favourite blanket it won’t), what are the alternatives? If AI can take over a field that is focused more than most others on human interaction, personal privacy, thoughts, feelings, and individual perceptions, then it can take over almost any other field before that. So you might as well go for it while you can.
Psychotherapy is about building a working relationship. Transference is a big part of this relationship. I don’t feel like I’d be able to build the same kind of therapeutic relationship with an AI that I would with another human. That doesn’t mean AI can’t be a therapeutic tool. I can see how it could be beneficial with things like positive affirmations and disrupting negative thinking patterns. But this wouldn’t be a substitute for psychotherapy, just a tool for enhancing it.
All my points have already been (better) covered by others in the time it took me to type them, but instead of deleting will post anyway :)
If your concerns are about AI replacing therapists & psychologists why wouldn’t that same worry apply to literally anything else you might want to pursue? Ostensibly anything physical can already be automated so that would remove “blue-collar” trades and now that there’s significant progress into creative/“white-collar” sectors that would mean the end of everything else.
Why carve wood sculptures when a CNC machine can do it faster & better? Why learn to write poetry when there’s LLMs?
Even if there was a perfect recreation of their appearance and mannerisms, voice, smell, and all the rest – would a synthetic version of someone you love be equally as important to you? I suspect there will always be a place and need for authentic human experience/output even as technology constantly improves.
With therapy specifically, there are probably elements an AI can [semi-]uniquely deal with, just because a person might not feel comfortable being completely candid with another human; I believe that’s what using puppets or animals as intermediaries is for. Supposedly even something as basic as ELIZA was able to convince some people it was intelligent; they opened up to it and possibly found some relief from it, and it had nothing close to what is currently possible with AI. I can envision a future scenario where a person just needs to vent, and having a floating head compassionately listen and offer suggestions will be enough; but I think most(?) people would prefer/need an actual human when the stakes are higher than that; otherwise the suicide hotlines would already just be pre-recorded positive affirmation messages.
You still had some good/new points in the last paragraph. Thx
By the way, if you want to try ELIZA, you can telnet into telehack.com and run the command eliza to launch it.
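For a sense of how little machinery ELIZA actually needed, here is a minimal ELIZA-style responder sketched in Python: a few regex patterns with canned reflection templates. The rules below are illustrative, not Weizenbaum’s original script.

```python
import re

# A handful of pattern -> response templates, in the spirit of ELIZA.
# {0} echoes back the captured part of the user's input. Illustrative only.
RULES = [
    (r"i feel (.*)", "Why do you feel {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r".*\bmother\b.*", "Tell me more about your family."),
]

def eliza_reply(text):
    """Return a canned reflection for the first matching rule."""
    text = text.lower().strip(".!?")
    for pattern, template in RULES:
        m = re.fullmatch(pattern, text)
        if m:
            return template.format(*m.groups())
    return "Please, go on."

print(eliza_reply("I feel anxious"))  # Why do you feel anxious?
```

That a loop this simple got people to open up says more about human willingness to be heard than about machine understanding.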
I think it is one of those things that AI can never make redundant.
The caring professions are often considered to be among the safest professions. The “human touch” is very important in therapy.
Hey, maybe your background in psychology will help with unfucking an errant LLM or actual AI someday :P
Given the vast array of existing pitfalls in AI, not to mention the outright biases and absence of facts, AI psychology would be deeply flawed and would be more likely to get people killed.
Person: I’m having unaliving thoughts, I feel like it’s the only thing I can do
AI: Ok do it then
That alone is why it’ll never happen.
Also we need to sort out how to house, heal and feed our people before we start going and replacing masses of workforce.
The level of liability you’d expose yourself to by actively advertising it as some sort of mental health product is insane.
I do believe someone will be dumb enough, but it’s a truly terrible, insanely unsafe idea with anything resembling current tech in any way.
Many valid points here, but here is a slightly different perspective. Let’s say for the sake of discussion AI is somehow disruptive here. So?
You cannot predict what will happen in this fast-moving space, and you should not attempt to do so in a way that compromises your path toward your interests.
If you like accounting or art or anything else that AI may disrupt… so what? Do it because you are interested. It may be hyper important to have people who did so in any given field, no matter how unexpected. And most importantly, doing what interests you is always at least part of a good plan.