Therapy By AI
We should be concerned that therapy is now being taken over by an AI and the session does not seem to be monitored.
11/19/2024 · 1 min read
That incessant need for instant gratification, combined with a lack of human workers, has led us to this moment where an AI takes that precious call and offers a ‘listening ear’. But if we are suffering from mental health issues, how can we judge the effectiveness of the transaction?
I suspect those who use this service, offered by health service organizations, will feel better knowing such companies are behind it. The assumption would be that they modelled the tool to assist callers and serve as backup support.
I assume this means the content of the talks is recorded and assessed, and that there is a follow-up call from a registered therapist. But how do we know how the person actually feels after that interaction, or whether the AI did anything productive towards relieving their issues?
From what I understand, the calls are not directly monitored, which I find troubling. If we are being fooled into thinking a machine cares about us and understands our particular problem, that sets a dangerous precedent in my mind.
As it is, we already have trouble knowing what information online is real and what is fake. We also cannot tell whether content was created by an AI or provided by a human, or at the very least supervised by one.
So can you see a theme here? We would not be able to recognize whether the ‘therapy’ we are getting is appropriate or misleading for the particular issues we are suffering from. Simply feeding an AI parameters about a subject does not mean it can reason about what a caller actually needs.
I know from providing counselling on a crisis line, after mental health training, that the first connection between therapist and caller is vital and that mistakes can be made. It strikes me that we need to slow down and study this much more before letting go of the reins and treating AI as the solution to all our needs. Humans are still vital in this formula.