A look at the use of AI in digital mental health
Digital access to mental health interventions is becoming more widely available, and demand is predicted to keep growing. Among the reasons for its popularity are ease of access, avoiding the stigma of seeking psychological help, and its documented success rate. One explanation may be the honesty a person is willing to offer when not confronted by an actual human, along with cleverly designed programs that assess problems and their severity, suggest suitable interventions to ameliorate symptoms, and teach psychological skills.
In Sweden, the ACT (Acceptance and Commitment Therapy) program, based on Cognitive Behaviour Therapy and mindfulness, has been offered online by prescription for some years now and has been evaluated as successful at treating, among other conditions, mild depression, GAD (Generalised Anxiety Disorder) and stress-related issues.
I recently listened to an interview with psychologist Silja Litvin, the founder of PsycApps, a company producing apps for what they call emotional fitness. They are pioneering a market of digital tools for psychological wellbeing. Through trial and error they discovered that the best way to encourage commitment to the app was gamification: games are designed both to identify what ails the user and to practise tools for coping and thriving. The issues dealt with are subclinical and include, among others, stress, relationship issues, social anxiety and mild depression.
Using big data, they aim to gather information in classified systems to be analyzed by AI, in order to gain more insight and develop more effective interventions. Issues of integrity and ethics are very important to the company, and the algorithms are to be designed by humans so as to avoid AI suggestions that are ethically or morally dubious. Even so, they see the possibility of AI coming up with solutions far more innovative and useful than anything a human brain could think up.
I find this endeavour very positive and helpful for many, and I was pleased to hear that the mental health apps they develop are meant as a complement to meeting a therapist in real life, and also as a tool for practising skills that an IRL therapist can suggest.
Studies at various universities across the globe have shown what AI is capable of. This is a double-edged sword that calls for caution, and further fruitful discussion on ethics and regulation among the international community, across many different fields, will be very important. Here are some sample results from recently published studies:
· AI is faster than a human professional at detecting, among other conditions, current schizophrenia or depression, through voice analysis alone
· AI can predict a person's future vulnerability to depression by going through the pictures on their personal Instagram account
· AI can correctly determine whether a person's sexual orientation is towards the same gender, simply by analyzing their Facebook profile picture
· AI can determine a person's personality traits from their eye movements alone, correctly pinpointing four of the five major personality traits: neuroticism, extroversion, agreeableness and conscientiousness
That was some news from the mental health and neuroscience sphere that I hope might be of interest. Digital tools for mental health can be of great value to many, particularly since mental illness is on the rise, but in the case of AI the question of whether we should do something just because we can begs to be answered: used with the wrong intention, it can certainly harm more than heal.