When Emilia Molimpakis’ friend attempted suicide a few years ago, she wondered what might’ve happened had her friend gotten mental health help sooner.
“I actually just could not understand or fathom then why her psychiatrist couldn’t actually see this coming — and he had actually just seen her two days before,” she told The Current guest host Nahlah Ayed.
That traumatic experience led to a burst of creativity. Motivated to fill the gaps in mental health treatment using technology, the postdoctoral neuroscience researcher left her position at University College London.
Soon after, she co-founded Thymia, which uses video games to gather data about how people interact with their screens — and what that could say about whether they have depression.
“The concept of Thymia is these video games that we design, they are based entirely on classic experimental protocols that have been tested and validated in thousands of clinical trials and research trials,” she said.
“Every game is basically a scientific experiment, and we’ve just put a very beautiful layer of graphics on top of it, and we get patients to engage with it.”
Thymia is part of a larger movement in the mental health industry to deploy artificial intelligence. Advocates say this movement could revolutionize how society diagnoses and treats conditions like depression and psychosis, especially following the COVID-19 pandemic.
“I think people became much more open to that idea, and I think providers became more open as well on their end to using technology to facilitate those interactions,” said Dr. Sean Kidd, senior scientist at the Centre for Addiction and Mental Health and co-founder of the schizophrenia-focused mobile app App4Independence.
“We have both a greater openness to the use of technology … and greater need across many communities.”
An additional tool
Kidd’s App4Independence, also known as A4i, is a joint venture between CAMH and AI-driven patient engagement platform MEMOTEXT.
Kidd said it’s an evidence-based app that helps patients connect virtually and anonymously to health care providers, while also providing tools to reduce isolation and offer support.
“Only a very small percentage of people with psychosis ever get, for example, individual or group in-person cognitive behavioural therapy for psychosis,” he said. “With that challenge of access, tools like this can provide [cognitive behavioural therapy-based] prompts and suggestions to the individual.”
This is especially key now, when resources available to people with mental health issues are still pretty scarce, according to Kidd.
“Even if a person did have access to an individual or group psychotherapy and frequent contact with a psychiatrist … there still would be gaps and times in between those contacts when you would want to be knowing how a person’s doing,” he said.
One such gap is the general lack of tools available to psychiatrists to diagnose mental illnesses, according to Molimpakis.
“They don’t really have many existing tools other than these questionnaires that they typically use,” she said.
“So as an example, if a clinician suspects you have depression, they may say, ‘On a scale of 1 to 4, can you tell me how suicidal you felt in the past two weeks?’ Which is quite a leading question.”
Molimpakis said technologies like Thymia can also help expand a physician’s observational capacity and keep a clearer, more objective track of changes in a person’s behavioural pattern — such as their speed of talking or their twitching, which might be missed by a clinician.
“Thymia is just measuring those things in a lot more objective way and [saying] ‘This is the speech rate for this person. Let’s compare it to how they were maybe a month ago,'” she said. “‘And this is their facial expression in terms of a range of emotions. Let’s see if this is different to what it was a few weeks ago.'”
That said, Molimpakis makes it clear that tools like Thymia are not meant to replace clinicians or their existing questionnaires — nor are they trying to claim they can do the clinicians’ jobs better than them. Rather, they’re “clarifying more objectively those measures that clinicians inherently know are important,” she said.
Dr. David Gratzer, a psychiatrist at CAMH, compares these tools to some of the assist tools found in modern cars.
“[It’s] the same way some cars now can tell you if there’s another car in your blind spot when you’re about to make a turn or … you’re going too fast or that the conditions are slippery,” he said. “You as the driver continue to drive the car, but you’re helped along.”
Respecting patient privacy
But just like how a car can be wrong about another vehicle being in its blind spot, Gratzer said these tools can also be inaccurate or faulty, “which is why it’s so important that we be cautious about these experiments.”
One caution he says should be taken seriously is privacy. According to a study published in the Journal of the American Medical Association, 29 of 36 apps helping people with depression and smoking cessation sold patient data to third parties.
“This is an unregulated field. The promises are big. To be blunt, the patient need, and sometimes desperation, is great. We have to be cautious here,” he said.
Kidd agrees with Gratzer about a need for rigour and care with how apps such as A4i use patient data.
That’s why Kidd and his team made it an objective to be transparent with users — people with psychotic illnesses, their family members and care providers — about what kind of data is collected and how it’s being used.
Furthermore, the A4i app has been reviewed by CAMH’s privacy office, as well as a third party.
Still, Gratzer said there are a lot of different people and companies involved in this movement, and people who turn to technology to assist in their care need to be cautious about who they’re giving their personal information to.
“Some are interested, perhaps, in a quick profit. Others are goodwill, but maybe not as rooted in evidence as we’d hoped,” he said.
Despite the bad apples, Gratzer believes these technologies have “tremendous potential.”
“As a health-care provider, I’m looking forward to having more and better information for my patients, so that we can make better decisions together,” he said. “If some of that can be borne of A.I., great. But we need to be cautious.”
If you or someone you know is struggling, here’s where to get help:
This guide from the Centre for Addiction and Mental Health outlines how to talk about suicide with someone you’re worried about.
Written by Mouhamad Rachini. Produced by Alison Masemann.