How AI Knows You Have Depression From Your Voice
As a precursor to this article: clinical depression can be crippling for an individual, and it can quickly spiral out of control. Although depression is increasingly accepted and discussed, if you feel you may be depressed, please feel free to use the test (click here) at the bottom of the ‘How to tell if you have depression’ section, and/or consult your GP if you feel you are affected (remember that an online questionnaire will not come close to an experienced health professional!). Healthcare isn’t just for the body; the mind can be in just as much turmoil and pain. I urge you to help yourself: the earlier you receive a diagnosis for clinical depression, the quicker you can be treated, and the quicker you can return to normal life. Thank you.
Anxiety and depression, known collectively as ‘internalising disorders’ (i.e. the individual internalises their problems), may affect roughly 20% of young children. The figure is surprising, as one might assume there are few stimuli in a child’s life to induce such behavioural and mental responses. It also represents a large problem: children struggle to convey emotional turmoil the way an adult would, and modern life leaves children increasingly isolated from their parents and family life, allowing these internalised disorders to develop into acute mental disorders that could have been prevented. Causes of this failure include long waiting lists for psychologists and parents failing to spot the key symptoms that could hint at disorders of the mind. As Ellen McGinnis, a clinical psychologist at the University of Vermont Medical Center, puts it, “We need quick, objective tests to catch kids when they are suffering. The majority of kids under eight are undiagnosed."
However, a (somewhat) recent publication in the Journal of Biomedical and Health Informatics, from May this year, describes a machine learning algorithm that can detect the symptoms of anxiety and depression in young children’s speech, which would help to quickly and easily diagnose mental disorders that might otherwise be missed or overlooked by parents. Speed is a key factor here: early diagnosis is of enormous importance, as children’s ongoing brain development means they respond strongly to treatment. Currently, diagnosis is made via a semi-structured interview that generally lasts around an hour (or two, as the interview can be varied to probe for different elements of the conditions, or particular strains of them), conducted by a psychologist with a primary caregiver in attendance. Artificial intelligence is believed to hold the key to making this process much faster and more reliable, which matters because increasing levels of childhood depression have been linked to increased drug abuse and suicide rates in later life. As a result, Ellen McGinnis, whom we met earlier, joined forces with Ryan McGinnis (a biomedical engineer at the University of Vermont) to assess how machine learning could be employed in this particular field of healthcare.
For the basis of their study, the researchers employed an adapted mood induction task known as the Trier Social Stress Task, which is deliberately designed to induce feelings of stress and anxiety in participants. A cohort of 71 children, ranging in age from three to eight, were instructed to improvise a three-minute story and told they would be judged on it directly (the source of the induced anxiety and stress); one judge was consistently stern and never gave a positive response. The children therefore had to cope with negative criticism and the tension of awaiting a verdict, alongside a buzzer sounded at timed intervals to keep them under constant pressure. On top of this task, the children were also assessed via the semi-structured interview (the standard practice we discussed earlier). With this, the researchers could establish which children would be diagnosed with depression from the interview, then look for shared vocal patterns in those children’s story recitals, producing a machine-learning-powered tool that detects signs of depression in children by analysing vocal pitch and patterns in audio recordings taken from a clinical setting. The result might surprise you: the algorithm turned out to be very successful. As Ryan McGinnis commented, "The algorithm was able to identify children with a diagnosis of an internalising disorder with 80% accuracy, and in most cases that compared really well to the accuracy of the parent checklist". An excellent feat, given that a standard interview takes at least an hour, whilst the algorithm presents a result in mere seconds! The algorithm uncovered three key features in the audio of children who had been diagnosed with depression, namely:
- Generally low-pitched, rather monotone voices
- Repeated speech content and inflections
- A high-pitched short-term response to the intermittent buzzers (which remind the child that time is running out)
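To make the first of those features concrete, here is a minimal, illustrative sketch of how a pitch-based feature might be extracted from audio. It is not the researchers' actual pipeline: the zero-crossing pitch estimator and the frame sizes are simplifying assumptions (real systems use more robust methods such as autocorrelation or pYIN), and the synthetic tone stands in for a genuine recording.

```python
import math

def estimate_pitch(samples, sample_rate):
    """Estimate fundamental frequency (Hz) by counting upward zero
    crossings. Crude but dependency-free; shown only to illustrate
    the idea of turning raw audio into a pitch number."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a < 0 <= b)
    duration = len(samples) / sample_rate
    return crossings / duration  # upward crossings per second ~ F0

def extract_features(frames, sample_rate):
    """Summarise a recording as (mean pitch, pitch variability).
    A low mean combined with low variability would correspond to the
    'low-pitched, monotone' marker the study reports."""
    pitches = [estimate_pitch(f, sample_rate) for f in frames]
    mean = sum(pitches) / len(pitches)
    variability = sum((p - mean) ** 2 for p in pitches) / len(pitches)
    return mean, variability

# Synthetic check: one second of a 150 Hz sine tone, split into frames.
sr = 8000
tone = [math.sin(2 * math.pi * 150 * t / sr) for t in range(sr)]
frames = [tone[i:i + 2000] for i in range(0, sr, 2000)]
mean_f0, f0_var = extract_features(frames, sr)
print(round(mean_f0))  # roughly 150 for a 150 Hz tone
```

In a real classifier, features like these (plus measures of repetition and of the response to the buzzer) would be fed to a model trained on the interview-derived diagnoses.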
Not only would analysing audio files be much easier to deploy in a clinical scenario than other methods, such as motion analysis during ‘the fear task’; the researchers also hope that the speech analysis algorithm can be built into a universal clinical tool for depression screening, perhaps even one available to the public, like Babylon Health’s GP At Hand app that the NHS now encourages patients to use to help avoid wasteful spending of hospital resources. The technology is here; now innovation is needed to bring it into practice.
This could very well happen soon. Woochan Hwang, a final-year medical student at Imperial College London, Alice Tang, a newly appointed junior doctor, and Dr Wun Wong, a data scientist, have come together to form a ‘medtech’ venture known as Affect.AI, which hopes to address a major gap in the care of clinical depression: patients and clinicians currently have no reliable technology for monitoring how the condition progresses. Similar to the McGinnises’ technology, Affect.AI will take observations from an existing data set and apply them in a broader clinical environment, examining how features of a patient’s speech can be linked to components of depression. A patient’s progression can then be tracked, as the appearance or loss of certain patterns in their speech correlates with the worsening or easing of clinical depression. In other words, phonic data taken from a patient may indeed reveal key details of the patient’s state of mind and its workings, which does boggle the mind - maybe your voice has changed slightly after reading that!
In summary, Affect.AI will combine machine learning with formal assessments: patients are clinically diagnosed first, and the ML algorithm then deciphers the audio features present in those clinically diagnosed patients. This will give clinicians a tool to assess how a patient’s emotional distress changes over time, simply by analysing the patient’s voice at regular intervals. If deployed in a public app, patients could monitor their own progress and make their clinicians aware of any trends, enabling near-immediate treatment if needed. Astounding! Alice Tang and Woochan Hwang are currently in talks to begin trialling their technology with patients clinically diagnosed with major depression; the venture will be showcased at an event organised by the MedTech SuperConnector accelerator programme, and the technology has been reviewed and improved under the advice of other researchers at Imperial College London. As with all artificial intelligence, further research and testing will be required before these tools can be reliably brought into clinical practice - fingers crossed from me! As of now, Affect.AI is an entrepreneurial feat, but in a world as demanding as ours, innovation is key to meeting those demands!
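The monitoring idea above can be sketched very simply: record a vocal feature at each check-in and flag when it trends in the direction the research associates with worsening symptoms. The numbers, the feature choice (mean pitch), and the threshold below are all hypothetical placeholders for illustration; none of this is Affect.AI's actual method or any clinical criterion.

```python
def slope(values):
    """Least-squares slope of equally spaced observations: a simple
    summary of whether a vocal feature is trending up or down."""
    n = len(values)
    x_mean = (n - 1) / 2
    y_mean = sum(values) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(values))
    den = sum((x - x_mean) ** 2 for x in range(n))
    return num / den

def flag_trend(sessions, threshold=1.0):
    """Flag a patient whose mean pitch is falling across check-ins,
    since the study links lower, flatter pitch to internalising
    disorders. The threshold is an illustrative placeholder, not a
    clinical value."""
    return "review" if slope(sessions) < -threshold else "stable"

# Hypothetical weekly mean-pitch readings (Hz) from app check-ins.
weekly_pitch = [182, 179, 175, 171, 166]
print(flag_trend(weekly_pitch))  # falling ~4 Hz/week, so flagged
```

A real system would of course track many features at once and leave the interpretation to a clinician; the point is only that a regular voice sample yields a time series that simple statistics can already summarise.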
A huge thank you to Ellen and Ryan McGinnis, Woochan Hwang, Alice Tang, the University of Vermont, Imperial College London and all other persons involved in either project. Absolutely fantastic work, and a clear showing of how incredible minds can bring forth equally incredible ideas to make the world a better place. Little makes me as upset as the idea of childhood depression, a tragic reality behind our playground image of childhood, and I think I speak for all our readers when I say I hope other clinicians show the boldness of these individuals to bring about massive change in their industry, improving the lives of fellow clinicians and, ultimately, patients.
If you are currently tackling depression, please follow the advice at the start of this article, and know that the majority (roughly 60%) of people suffering from depression do not receive treatment; a consultation with your local GP could be the first step to changing your life! Thank you.
Article thumbnail credit: ‘HIV care worsened with chronicity of depression’, Clinical Advisor [click here for page]. Note that this article has not contributed to any of the written content of my article, and that you click the above link at your own discretion, as the page has not been checked by our team. Thank you to our readers for your continued support, and to everyone on our fantastic team who keeps our services running.