Are chatbots really the answer for young people's mental health?
22 May 2019

Artificial intelligence (AI) has its limitations, writes psychotherapist Adele Braun.
As a child and adolescent psychotherapist, I am aware that mental health needs among children and teenagers are ever growing. But professional services are stretched by the legacy of austerity, and there can be an agonising wait for families to receive the help they need. For this and other reasons, there is a growing trend of turning to AI-based therapy chatbots and other online resources. But are these digital services really an effective contribution to the field of mental health care?
"UK law states that appropriate actions must be taken if a young person discloses a significant risk of harm to themselves or others."
For young people in distress, it can be incredibly hard to know what kind of support they need and to seek it out from a helpful adult. They might not feel able to approach a member of their family or school to ask for help, for fear of not being understood or, worse, judged; because they do not want to burden those they care about; or because they are unsure whether the adult would even know how to help them. Accessing professionals in Child and Adolescent Mental Health Services (CAMHS) or voluntary-sector provisions often requires a referral from a GP or school, and few services allow teenagers to refer themselves. Even when a young person has managed to seek out help in this way, there can be a long wait for therapy. So young people will often turn to the internet in search of a way to help themselves.
Can AI really step in to fill this gap in an effective and safe way? Some services in this growing online industry are endorsed by mental health professionals; others have little or no clinical foundation. The appeal of therapy chatbots is clear: they are available 24 hours a day and have the potential to reach far more people than existing services currently can.
There is a benefit in having an accessible prompt to engage in a helpful activity (e.g. yoga or deep breathing for those suffering from panic attacks), which some of these digital services offer. But those offering AI therapeutic ‘conversations’ reach their limits in clearer ways. The human experience is incredibly nuanced and complex. Therapy chatbots largely respond in formulaic ways and simply cannot distinguish between what they are told in the way that an empathic, helpful and trained human being can. Many AI services clarify that they are not an alternative to therapy; even so, there is something tantalising about a system that engages with a vulnerable person as though it were human, and it can be disturbing when the replies are not in fact sensitive and responsive to their needs.
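To make the point about formulaic responses concrete, here is a deliberately simplified, hypothetical sketch of keyword-based reply selection. It is not the code behind Wysa, Woebot or any real chatbot; the keywords and canned replies are invented for illustration. It shows why crude matching cannot read nuance: very different messages can trigger the same generic reply.

```python
# A hypothetical, deliberately simplified sketch of keyword-based
# response selection -- NOT the code behind any real chatbot.
# All keywords and replies here are invented for illustration.

RULES = [
    (("panic", "anxious", "anxiety"),
     "Try a slow breathing exercise: in for four, out for six."),
    (("sad", "low", "down"),
     "I'm sorry you're feeling low. Would keeping a journal help?"),
]
DEFAULT = "Tell me more about how you're feeling."

def respond(message: str) -> str:
    """Return the first canned reply whose keywords appear in the message."""
    text = message.lower()
    for keywords, reply in RULES:
        if any(word in text for word in keywords):
            return reply
    return DEFAULT

# Both messages contain the word "down", so both get the identical
# canned reply -- nothing here weighs context, meaning or risk.
print(respond("I feel a bit down about my exam"))
print(respond("I've felt down for months and can't see a way out"))
```

An empathic adult would respond to those two messages very differently; a pattern-matcher of this kind cannot, which is precisely the limitation described above.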
More worryingly, several mental health chatbots have been in the public eye for failing to detect and respond appropriately to children’s disclosures of sexual abuse, eating disorders and drug use. This emerged through a BBC investigation into two mental health chatbot apps, which found that a young person using one of these apps would receive neither an immediate warning about the seriousness of their situation nor an appropriately careful and thoughtful plan for getting more help. UK law states that appropriate actions must be taken if a young person discloses a significant risk of harm to themselves or others. But in tests conducted by the BBC, neither Wysa nor Woebot told an apparent victim to seek emergency help. The Children's Commissioner for England said the flaws meant these chatbots were not currently "fit for purpose", as "they should be able to recognise and flag for human intervention a clear breach of law or safeguarding of children".
Get serious
Fortunately, this month the UK government introduced world-first online safety laws, which state that social media firms must abide by a mandatory “duty of care” to protect users and could face heavy fines if they fail to deliver. It is critically important that digital businesses offering to support young people at their most vulnerable take this responsibility seriously.
So where does this leave the role of AI in the world of children’s mental health care? Currently, therapy chatbots may hold some value as dynamic self-help books. But that therapeutic function can never replace the value of a helpful human relationship, and that is where enthusiasm, innovation and funding really need to be directed. The government is finally driving moves to increase children’s access to therapy in schools and professional services, but these need to be far more ambitious and expansive in their reach: current targets would reach just 35 per cent of children with diagnosable conditions. In the same way that children cannot be adequately parented by an AI system, they cannot be adequately helped by one to work through the complexities of their emotional difficulties.
- See also: It's exam time - but university students need support all year long
- See also: Here's why young people are drawn to online counselling