Therapist vs. Artificial Intelligence - I answer your questions #chatgpt #mentalhealth



Can ChatGPT handle complex questions about mental health? Could Artificial Intelligence do a therapist's job someday?
Let’s find out.

Artificial Intelligence and ChatGPT have taken over, and it’s pretty incredible. So far ChatGPT has helped me figure out Excel formulas, helped my assistant write emails, summarized videos, and suggested topics for future videos. It’s a super powerful tool, and my husband is obsessed with it.

I’ve asked you all to submit some questions, and I’m going to try to answer them; then we’ll ask ChatGPT for its opinion. Today I’ve got my friend, colleague, and boss, Monica Blume, LCSW, with me, and she’s going to judge which of us did a better job and throw out her own suggestions for what the treatment would be.
For the sake of time, we’ll just outline a treatment approach if the full answer to a question is too long for this video.

Therapy in a Nutshell and the information provided by Emma McAdam are solely intended for informational and entertainment purposes and are not a substitute for advice, diagnosis, or treatment regarding medical or mental health conditions. Although Emma McAdam is a licensed marriage and family therapist, the views expressed on this site or any related content should not be taken for medical or psychiatric advice. Always consult your physician before making any decisions related to your physical or mental health.
In therapy I use a combination of Acceptance and Commitment Therapy, Systems Theory, positive psychology, and a bio-psycho-social approach to treating mental illness and other challenges we all face in life. The ideas from my videos are frequently adapted from multiple sources. Many of them come from Acceptance and Commitment Therapy, especially the work of Steven Hayes, Jason Luoma, and Russ Harris. The sections on stress and the mind-body connection derive from the work of Stephen Porges (Polyvagal Theory), Peter Levine (Somatic Experiencing), Francine Shapiro (EMDR), and Bessel van der Kolk. I also rely heavily on the work of the Arbinger Institute for my overall understanding of our ability to choose our life's direction.

Copyright Therapy in a Nutshell, LLC
Comments

In all seriousness, a machine mind needs to be told “you’re a specialist and very empathetic” on top of the questions. The more you feed it, the better the response you receive.

florentinog.
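
The persona framing this commenter describes can be sketched in code. Below is a minimal, hypothetical example assuming the OpenAI Python SDK (v1.x); the model name, persona wording, and sample question are illustrative assumptions, not anything shown in the video:

```python
# Toy sketch: give the model an explicit persona before the question.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PERSONA = (
    "You are a specialist mental-health educator. Be warm and empathetic: "
    "acknowledge the person's feelings before offering any practical suggestions."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name; substitute whatever you have access to
    messages=[
        {"role": "system", "content": PERSONA},
        {"role": "user", "content": "I dread going to work every morning. What can I do?"},
    ],
)
print(response.choices[0].message.content)
```

Putting the persona in the system role, rather than re-typing it with every question, is roughly what "on top of the questions" amounts to in API terms.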

The sad point is that, based on what I've seen of therapists, sometimes you would have been better off with ChatGPT.

chessdad

I've been using ChatGPT as therapy for a while now.
I went to this behavioral psychotherapist for a few years, and ChatGPT effectively seems to follow the same script he did.

Even if it doesn't follow some long term treatment plan, the space it gives you for reflection and bouncing thoughts around with 24/7 access at no monetary cost is nuts.

It's extremely knowledgeable, so it can pull advice on virtually any topic from many different resources. If you ask it to respond to your problem like a cognitive behavioral therapist, it'll suggest exercises used in CBT or use CBT concepts to approach your topic.
It helps you approach issues from angles or perspectives you may not have come up with on your own, so it feels more engaging than journaling.

If models similar to this were designed specifically to be used instead of traditional psychotherapy, I could see them being an amazing resource for mental health care.

I understand that it's scary for psychotherapists to be one of the first professions tremendously affected by AI, but really this should be lauded as a massive achievement in mental health care, since this can make standardized mental health care accessible to people who may have no available alternatives.

katakis
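
The "respond like a cognitive behavioral therapist" idea in the comment above can likewise be sketched as a small helper that swaps the therapeutic framing in the system prompt. This is a hypothetical illustration assuming the OpenAI Python SDK (v1.x); the style descriptions, model name, and example problem are made up and not a treatment tool:

```python
# Toy sketch: ask the same question under different therapeutic framings.
from openai import OpenAI

client = OpenAI()

STYLES = {
    "cbt": (
        "Respond like a cognitive behavioral therapist: name the automatic thought, "
        "point out the distortion it may involve, and suggest one CBT exercise, "
        "such as a thought record."
    ),
    "act": (
        "Respond in the spirit of Acceptance and Commitment Therapy: emphasize "
        "acceptance, defusion, and values-based action rather than disputing thoughts."
    ),
}

def ask_in_style(problem: str, style: str) -> str:
    """Send one problem description with the chosen therapeutic framing."""
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[
            {"role": "system", "content": STYLES[style]},
            {"role": "user", "content": problem},
        ],
    )
    return reply.choices[0].message.content

print(ask_in_style("I keep procrastinating on work I'm afraid to get wrong.", "cbt"))
```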

The only thing you need to change is prompting the AI to be more compassionate and warm, and telling it to act as a mental health professional. And voilà - it's gonna be as good at those aspects as the actual therapist.

delevoxdg

You could have asked ChatGPT to incorporate more emotion or otherwise influenced its responses, just btw. You could ask it to embody certain personality traits or to respond as though it were pretending to be a real person.

brianlauren

Really interesting experiment! As far as I’m aware, providing empathy and emotional support is not the intended function of GPT… so this is kinda comparing apples with oranges. It was a great way to highlight the difference between information and support.

Celeste-in-Oz

The other thing to mention is that you are dealing with a first question and response, not an interaction. With a therapist there would naturally be listening time and they would respond to people's facial expressions, verbal tone, etc. You would have to try to test this software in an interaction mode.

tonydare

0:00: The video discusses the capabilities of artificial intelligence, specifically ChatGPT, in various tasks and its potential impact on jobs.
4:29: 💼 Ergophobia, also known as work phobia, is an anxiety disorder characterized by excessive fear or avoidance of work-related situations.
7:55: ❌ AI diagnosis can be inaccurate and potentially harmful, leading to misdiagnosis and identity issues.
12:10: 💡 Perfectionism can stem from childhood experiences, trauma, social comparison, and genetic factors, and its treatment approach includes mindfulness, challenging negative thoughts, setting realistic goals, seeking support, practicing self-compassion, and celebrating progress and successes.
15:33: 💬 The video discusses the importance of human connection in therapy and how a therapist can provide personalized advice and support compared to a TV show therapist.
Recap by Tammy AI

lilytea

While ChatGPT is interesting, I am really finding value in the dialogue between you two therapists. Your two different ways of conceptualizing things and communicating thoughts have been very helpful to me and my own anxiety. Thank you for making this video!

bobbiejeanesser

I think ChatGPT's role is, at most, to support the therapist in his/her response.

microfoneman

@15:23 Of course it sounds vague. ChatGPT is giving you an overview (what overview isn't vague?), and it's up to the individual user to ask follow-up questions and ask it to expand on those points. It remembers roughly the last 4,000 tokens (a few thousand words) of the conversation, so you can easily refer back to previous questions and ask/tell it to "Expand more on number 4" or "Give me a detailed explanation of how to practice self-compassion and set realistic goals," and then ask more follow-up questions from those results. Overall, you'll get much more information on these topics when you ask good questions (instead of the vague ones in this video), followed up by good follow-up questions. I've learned many things using ChatGPT by having an hour-long conversation with it and asking probing questions.

martini
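
The follow-up pattern described above boils down to keeping the running conversation in one message list, so that later questions like "Expand more on number 4" have earlier answers to refer back to. A minimal sketch, again assuming the OpenAI Python SDK (v1.x) with an illustrative model name and prompts:

```python
# Toy sketch: a shared history list gives follow-up questions their context.
from openai import OpenAI

client = OpenAI()
history = [
    {"role": "system", "content": "You are a knowledgeable, compassionate mental-health educator."}
]

def ask(question: str) -> str:
    """Append the question, get a reply, and keep both in the shared history."""
    history.append({"role": "user", "content": question})
    reply = client.chat.completions.create(model="gpt-4o-mini", messages=history)
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    return answer

print(ask("Give me an overview of how perfectionism is usually treated."))
print(ask("Expand more on number 4."))  # works only because the earlier answer is still in `history`
```

Once the model's context limit is reached, older turns have to be summarized or dropped, which is where the context-length caveat in the comment comes from.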

15:47 You don't know how to use ChatGPT for therapeutic purposes. The analogy to the TV therapist is just rubbish, because you can literally write 10 paragraphs describing your unique life scenario in exacting detail and ask ChatGPT to help you come up with a plan to dig yourself out of your situation, which is analogous to what happens in a clinical therapy session. I know because I've literally done this, and not only did it understand the entirety of my scenario, it offered great advice.

martini

6:48 - Just an observation, and I used to do the same thing: all of my life I've been dealing with gynecomastia (male breasts). There's no way of knowing if this boy was dealing with this, and perhaps the mom didn't know that this was something he was fearing, especially in young boys, because that's how I felt. The methods the mother described resonated with me, as that's something I used to do to cope with how the shirt would fit me: I'd pull at my shirt and readjust it until I felt comfortable, though it was rare to find comfort, especially if whatever I was wearing accentuated the size of my breasts, and especially when I was in my teen years and around my high school peers. As I've gotten older, I've accepted them, and throughout my middle-aged life I've almost worn it as a badge of confidence, and some partners have expressed their attraction to this somewhat unfixable body trait... so gynecomastians unite! There are dividends later in life if you put up with it :)

Nothingreallyexists

13:19 I find it fascinating that Monica referred to GPT as "they," not "it"… I wonder how many people will perceive electronic intelligence as an entity, either consciously or unconsciously 🤔

Celeste-in-Oz

As someone who has had several therapists over the years, I would say that the A.I. would be equivalent to an advanced Google search, whereas with a therapist, I get the information to understand what is happening and why, but ALSO I feel that a PERSON understands me and can empathize and or sympathize. A.I. will NEVER be able to REPLACE a human and their lifetime of experiences.

lillybits

The "perfectionism" example is so telling. Clearly a therapist's creativity, interpretation of a given situation, and overall thesis stem from experience and context, and can vary from expert to expert, with multiple theses being *equally plausible. In contrast, GPT is literally generating a response based on case data, almost like case law. It's a litany of facts, in a way. So GPT is acting as a good data analyst in some way, not a predictor of a person's specific prognosis (even though it 'sounds' like doing that). As such, GPT is a good tool *for therapists in unknown scenarios (e.g., the ergophobia example); and *maybe their clients *after the therapist has advised the client. I think GPT has a much greater risk of abuse by self-diagnosis vs. a search engine, given the personalized and suggestive nature of GPT outputs. Great session Emma & Monica. Thank you and your volunteering clients for sharing.
**PS: Now, if we can feed sentiment analysis of GPT's response back into the model and have it improve its emotional content so that it doesn't sound fake, does it get better and better over time...? Food for thought. Or thought for Snoop. : )

KashifAli-ghpy
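
The sentiment-feedback idea in the PS above could, in principle, be prototyped by scoring a draft reply with an off-the-shelf sentiment analyzer and asking the model to rewrite when the score reads flat. A toy sketch assuming the OpenAI Python SDK (v1.x) and the vaderSentiment package; the threshold, model name, and prompts are invented, and a compound sentiment score is only a crude proxy for warmth:

```python
# Toy sketch: score a draft reply, and if it reads flat, feed the score back
# and ask for a warmer rewrite. Not a validated method.
from openai import OpenAI
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

client = OpenAI()
analyzer = SentimentIntensityAnalyzer()

def warm_reply(question: str, min_warmth: float = 0.5, max_rounds: int = 3) -> str:
    messages = [
        {"role": "system", "content": "You are a compassionate mental-health educator."},
        {"role": "user", "content": question},
    ]
    for _ in range(max_rounds):
        draft = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
        text = draft.choices[0].message.content
        score = analyzer.polarity_scores(text)["compound"]  # -1 (negative) .. +1 (positive)
        if score >= min_warmth:
            return text
        # Feed the score back in and request a warmer rewrite.
        messages += [
            {"role": "assistant", "content": text},
            {"role": "user", "content": f"That scored {score:+.2f} on a warmth check. "
                                        "Please rewrite it with more empathy and less jargon."},
        ]
    return text  # best effort after max_rounds

print(warm_reply("I've been feeling like a failure at work lately."))
```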

Let's be fair now... you can't fault the computer for doing exactly as you asked when you asked it to answer in a kind, loving way.

Journey_with_Dawn

You can request that ChatGPT answer you in a less formal way by saying 'answer me as though I were a young teenager'. You can ask it to answer in a variety of personas. Try it.

mlytle

Thank you for answering my question. Much appreciated. Cheers!

dekar

Fascinating; however, if you had prompted ChatGPT to provide answers in the style of an empathetic and knowledgeable therapist, the answers would have been better. They would still be found lacking, but better.

ryantaylor