Can AI Ever Replace Therapists?
Some of the world’s brightest minds, such as Elon Musk of Tesla and the physicist Stephen Hawking, consider artificial intelligence (AI), left unchecked, to be a serious threat to human civilisation. Yet AI seems more likely to work alongside psychologists, enhancing their impact on clients by automating menial tasks.
Part of what practising psychologists do, such as administering certain types of psychological tests, assessments, and questionnaires, can be automated — the technical capabilities exist today. And the technology is growing exponentially more sophisticated. For example, researchers at MIT have created an artificial neural network computer model that can detect depression from natural conversation.

IBM Watson
IBM’s Watson matches patients with potential online counsellors and even makes treatment suggestions. Just like a real person, the underlying AI can get a read on people by analysing their movement and speech, determining mood, tone, inflexion, and so forth.
In the mid-1960s at MIT, Joseph Weizenbaum created a computer program called ELIZA. Its responses were based on Rogerian principles of reflecting back to the user what they had said. The set-up was a simple text terminal where the user could type in what was bothering them, and the computer would respond. Participants in the study began to attribute human emotions to the computer, even though Weizenbaum insisted it was merely following a programmed algorithm.
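The reflection technique ELIZA used can be sketched in a few lines. This is a toy illustration, not Weizenbaum’s original script: the patterns, templates, and word list are invented for the example.

```python
import re

# Swap first- and second-person words so a reflected fragment reads naturally.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# Each rule pairs a pattern with a Rogerian response template
# (hypothetical rules for illustration only).
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    """Turn 'my job' into 'your job', 'i am sad' into 'you are sad', etc."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(user_input: str) -> str:
    """Match the input against the rules and reflect it back to the user."""
    for pattern, template in RULES:
        match = pattern.search(user_input)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please, go on."  # default open-ended prompt
```

A conversation is just repeated calls to `respond`: the program has no understanding of the content, which is exactly why the emotional attachment participants formed surprised Weizenbaum.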
A similar programme is Woebot, a chatbot therapist that helps with depression, anxiety, relationship problems and more. It uses Facebook Messenger to administer a very common form of therapy called Cognitive Behavioural Therapy (CBT), which, as Megan Molteni explains it, "asks people to recast their negative thoughts in a more objective light." As users respond, the chatbot processes their negative thoughts, learning to recognise patterns and triggers and trying to interrupt them. With continued use, the app remembers the user’s responses and “gets to know” them over time. It recognises changes in mood and can tailor suggestions, the same way a real therapist might.
According to Deloitte, 83 per cent of organisations that are early adopters of AI have already reported moderate or substantial economic benefits, which in turn may reinforce further investment in AI automation projects. By one forecast, AI automation will replace 16 per cent of U.S. jobs by 2025; both white-collar and blue-collar jobs will be eliminated. According to the American Psychological Association, job loss “can be devastating, putting unemployed workers at risk for physical illness, marital strain, anxiety, depression and even suicide.”
The World Health Organisation (WHO) estimates that depression affects over 300 million people, and that nearly 800,000 people die by suicide each year. This could cut two ways. Demand for clinical and counselling psychologists may grow as awareness of mental health spreads, because the conscious, individual connection therapy depends on cannot be replicated by robots. Or psychologists may increasingly be displaced by AI: apps such as Woebot, Calm, Simple Habit and many more earned $27 million in global revenue in the first quarter of 2018 alone.
All of this being said, robots cannot form genuine connections with clients the way therapists do, the kind of connections that help people thrive. Yet building an innovative app to replace psychologists seems to be a question not of ‘if’ but ‘when’, so maybe we do have something to worry about. Let’s ask Siri.