Could AI Revolutionize Mental Health Care and Diagnose a Personality Disorder?



In this video, we explore the possibility that artificial intelligence could revolutionize mental health care and diagnose a personality disorder.

Daniel J. Fox, Ph.D., is a licensed psychologist in Texas, international speaker, and a multi-award-winning author. He has been specializing in the treatment and assessment of individuals with personality disorders for over 20 years in the state and federal prison system, universities, and in private practice. His specialty areas include personality disorders, ethics, burnout prevention, and emotional intelligence.

He has published several articles in these areas and is the author of:

Complex Borderline Personality Disorder: How Coexisting Conditions Affect Your BPD and How You Can Gain Emotional Balance. Available at:

Thank you for your attention. I hope you enjoy my videos, find them helpful, and subscribe. I always welcome topic suggestions and comments.
00:00 Introduction
01:09 How to diagnose a personality disorder
01:50 Initial evaluation
02:10 Diagnostic criteria from DSM-5
02:46 Clinical interviews
03:20 Observational history
04:25 Differential diagnosis
05:50 Symptom duration and impact
06:54 Clinical judgment
07:20 ChatGPT and personality disorders
08:06 Can AI provide therapy?
09:09 Non-AI diagnostic criteria
Comments

Dear Dr. Fox, AI will never be able to explain things as well as you do. Thanks for all you do! Annerien from South Africa

annerienvandenberg

I pray this never happens. It would be like going on Google to get answers. That is a rabbit hole I'm not going down. You cannot replace human interaction, IMHO.

KatBlack

It'll never replace great doctors like Dr. Fox. Please spread this man's work.

derrick

I don’t think AI is going to ever replace mental health workers but I can totally see it becoming a useful tool for them.

qazplmm

Hard pass on AI for me personally. I need to feel a human connection with a therapist. An AI therapy session would feel like talking to Alexa about my life issues. 🤪

kellyarmstrong

I see AI more in an assistant role. For instance, if someone is really distressed and they have access to AI programs, the AI could engage them with coping strategies. I think that would be helpful, because when I'm really upset I know I forget coping strategies. AI could also assess risk and call 911.

jackiegrice

I'm so glad diagnosis is staying with human professionals. HOWEVER, that doesn't help with the large number of short-term online counseling licenses like we have in my area.

dazie

A robot could never replace a warm, empathic human with a listening ear and human validation. A robot could never fulfill my needs like my therapist does.

kellybilotas

The only possible use for AI is to observe a person's behavior 24/7 and categorize incidents with audio snippets for download and evaluation by their mental health professional. This would allow the professional to see their patients' real-world interactions with others, to see if the patterns fit what the patient tells them… or maybe even fill in gaps the patient forgets or doesn't want to disclose.

Davidjune

Dr Fox - How does therapy connect with the client’s spiritual nature for healing and wholeness? Do you believe it’s necessary? What is your approach?

MarleyLeMar

This is frightening... not only for mental health clients (who desperately need human interaction) but for therapists too. I'd rather not get my PhD in psychology than get it and not be able to find a position because a robot has replaced me...

breeisme

I can't really see an AI being able to accurately read a person beyond putting them in a box and giving them a label. Plus, every box that the AI can place someone in is a box that was coded into its diagnostic lexicon by a person... unless it's like a machine learning AI. I don't think we're super close to this development.

Why not push for more investment in creating and maintaining interest in psych careers? Where I live, the mental health system is basically a rotting carcass and its patients are flies subsisting on what morsels they can find.

I see a psychiatrist on a computer screen at the mental health institute in my city, after going through quite a few different clinicians over the course of a good decade or so and I finally feel like I'm receiving the right care and am improving.

If it was all AI, how would one receive a second opinion?

EllieBearHasACat

Do you think that AI will be able to interpret different kinds of body scans or brain scans to make these diagnoses?

JT.Pilgrim

Crazy, me and my partner were just talking about our difference of opinion on AI in school (how people use it to write papers). I believe if we don't have rules on its use we will become lazy lol... But with this I kind of feel the same way: it's (in my opinion) unethical to rely on AI as the sole treatment in psychology, even if it could do it, but that doesn't mean we couldn't use it as an assistive device for diagnostics and even treatment planning. You made a point about them being really good at mimicking emotions... I can see AI being good at helping train new psychologists in what to look for when observing emotional reactions in session, by actually being able to show a large group what it might look like. But I definitely don't believe it's ethical to toss us to AI and assume we will get just as much help as we would with a real human.

danielhernandez-fomj

It's that human beings in mental health are from their 'tribe' and they have their agenda. So I can be BPD, BP, ADHD, CPTSD, a narcissist, autistic, an alcoholic, etc., depending on who I am talking to or which YouTube video I am watching. But can ChatGPT break this down objectively? That is something I don't trust it to do. It slips into generic apology mode, creating caveats upon caveats… and so I doubt it. Not yet, anyway.

mebeasensei

I actually had a conversation with AI, and it seems like its programmers won't allow it to answer moral questions like the trolley problem, even though it said it theoretically could "count" morals and decide on the "right" thing based on math. I think they wouldn't allow it anywhere near diagnosing people.

seeit

Sounds like a very dangerous idea 💡 to use AI.
There's so much more to it than just reading symptoms.

hawaiigirl

OK, so I have a number of thoughts on this and related topics.

Firstly, I think it's obvious that ChatGPT cannot diagnose anything. I feel like that's not the right question to ask. ChatGPT should *never* be thought of or relied upon as a source of information/knowledge. Any "facts" it tells you are your responsibility to verify. Super important.

Where I think ChatGPT *does* have value — besides entertainment — is as a tool to help you mull over and clarify your own thoughts and ideas. As such, I wonder if it can be helpful for making interpersonal decisions. Let's say I have a friend who is struggling with a personality disorder or mental illness, and I want to know what I can do for them in light of a particular situation that's happening. Maybe they're upset with me and I want some help in deciding what I can do to reconcile and show that I care. Maybe I already have an idea and I want to run it by someone and ask for a second opinion. I would *never* treat ChatGPT as an oracle, but as long as I think critically about its answers and accept responsibility for whatever I decide to do, maybe it can help, at least to the degree of being better than nothing. Your thoughts?

An important caveat: ChatGPT does collect data from your conversations, so there's the question of whether it's ever OK to mention confidential information in a chat.

I have discussed psychological topics with ChatGPT a few times, but this tends to be on an abstract level rather than concrete advice. For example, one conversation was about whether a science fiction story about a stranded time traveller who accidentally causes the Permian Extinction (when most life on Earth was wiped out 250 million years ago) could be used as a metaphor for mental illness. I have also had at least one conversation analysing the lyrics of songs that express a protagonist's psychological state. I always interact with it in a conversational way, not just asking questions but bringing my own perspectives to the table.

apm

There's no way I'd feel comfortable with artificial intelligence. I don't even like that idea in any way 😢

fionahenderson

I want to know if it can cure it; there aren't many technological advancements in the field.

JeffreyJorge-pp