Adam O'Neill

Did artificial intelligence take my job?

Updated: Aug 3, 2023



The artificial intelligence (AI) chatbot known as ChatGPT has gained traction recently, and for good reason. The AI language model can produce stunningly human-like responses to questions in a way never seen before. The technology can not only answer questions posed to it (much like Google Assistant and Amazon Alexa devices) but also remember previous conversations, reference them, and adjust its responses accordingly.


Its human-like responses don't end with declarative statements. ChatGPT can compose poetry, write music (complete with chord charts), and even communicate the gospel. Its implications are far-reaching and its applications seemingly endless, which leads to the primary question of this article: Has AI taken my job?


The application of AI to the mental health field can influence care in a variety of ways. From initial patient screening to tracking the most up-to-date treatment protocols and drafting chart notes, AI can free up providers' time for what they prefer: spending time with their patients. And because these models' language has improved, speaking with a chatbot today feels more like talking to a real human.


Yet psychiatry (and even more so, Christian psychiatry) is not a series of checkboxes or protocols. It is not even just a conversation, but a complex interaction of physical and nonphysical elements, moral and ethical dilemmas, and questions about the meaning and purpose of life, human existence, and the presence of good and evil, sickness and health. These are not questions a computer can answer intrinsically. That is not to say it cannot search for and repeat someone else's answer, but the assessment is not its own. Behind its answers are original thoughts and ideas devised by human minds.


No matter how far AI progresses, it will not have an innate sense of right and wrong (conscience), only programmed guidelines of right and wrong. It does not have a sense of "self" (consciousness), and it cannot "feel" emotions as we do (sentience). Simply put, it is not made imago Dei (in the image of God).


With an irony that is not lost on me, I asked ChatGPT what its limitations were in providing mental health care. What follows was written entirely by a computer, without edits.

 

While AI has the potential to play a valuable role in the treatment of mental illness, there are several limitations to be aware of:


Lack of human interaction: Mental health treatment often involves a strong therapeutic relationship between the patient and the therapist. AI-powered mental health tools may not be able to fully replace the human connection and empathy that are critical to effective treatment.


Limited ability to interpret emotions: AI technology may struggle to understand the nuanced and complex emotional experiences of patients with mental illness. As a result, AI-powered tools may not be able to provide the same level of empathy and understanding as a human therapist.


Bias and inaccuracies: AI systems can be prone to bias, especially if they are trained on biased data. This can lead to inaccurate diagnoses and treatments, and may perpetuate existing inequalities in mental health care.


Limited scope: AI technology may be able to provide limited support for specific mental health conditions, but may not be equipped to address more complex or multi-faceted mental health issues.


Legal and ethical concerns: There are also legal and ethical considerations surrounding the use of AI in mental health treatment, such as privacy and confidentiality, informed consent, and accountability.


In summary, while AI technology has the potential to play a role in the treatment of mental illness, it is important to understand its limitations and to approach its use with caution. It is important to emphasize that AI should not be used as a replacement for human therapists, but rather as a tool to support and augment their work.


 

I'd have to agree with the computer's assessment, though it is notable that, aside from a brief mention under "limited scope," it did not address the metaphysical qualities discussed above: conscience, consciousness, and sentience.


When patients come to a Christian counselor or psychiatrist, they present some of the deepest parts of themselves for consideration: how they think and feel, what they desire and despise, how they've erred or achieved. What they need is a response that comes from the heart as much as it comes from the head.

So for now, it seems my job is safe, though I look forward to working alongside my computer colleagues to support the mental health of patients for years to come.
