How AI-equipped technology could help clinicians better diagnose mental health issues


When Emilia Molimpakis’ close friend attempted suicide a number of years ago, she wondered what might have happened had her friend gotten mental health help sooner.

“I really just couldn’t understand or fathom then why her psychiatrist couldn’t actually see this coming — and he had actually just seen her two days before,” she told The Current guest host Nahlah Ayed.

That traumatic experience led to a burst of creativity. Motivated to fill the gaps in mental health treatment using technology, the postdoctoral neuroscience researcher left her position at University College London.

Soon after, she co-founded Thymia, which uses video games to gather data about how people interact with their screens — and what that could say about whether they have depression.

“The concept of Thymia is these video games that we design, they’re based entirely on classic experimental protocols that have been tested and validated in thousands of clinical trials and research trials,” she said.

“Every game is basically a scientific experiment, and we’ve just put a really beautiful layer of graphics on top of it, and we get patients to engage with it.”

Emilia Molimpakis, co-founder of Thymia, said her company uses video games equipped with artificial intelligence to help clinicians diagnose mental health issues like depression. (Submitted by Emilia Molimpakis)

Thymia is just one part of a larger movement in the mental health industry to deploy artificial intelligence to address mental health. Advocates say this movement could revolutionize how society diagnoses and treats disorders like depression and psychosis, especially following the COVID-19 pandemic.

“I think people became much more open to that idea, and I think providers became more open as well on their end to using technology to facilitate these interactions,” said Dr. Sean Kidd, senior scientist at the Centre for Addiction and Mental Health and co-founder of the schizophrenia-focused mobile app App4Independence.

“We now have both a greater openness to the use of technology … and greater need across many communities.”

WATCH: Playing video games focused on mental health

Let’s Play: Video games focused on mental health

Sea of Solitude, Hellblade: Senua’s Sacrifice and Celeste are among a growing genre of games that experts say can be therapeutic for players struggling with depression and anxiety. Narrative designer Kaitlin Tremblay and game design professor Sandra Danilovic join CBC’s Jonathan Ore to give them a try.

An additional tool

Kidd’s App4Independence, also known as A4i, is a joint venture between CAMH and AI-driven patient engagement platform MEMOTEXT.

Kidd said it’s an evidence-based app that helps patients connect virtually and anonymously to health care providers, while also providing tools to reduce isolation and offer support.

“Only a very small percentage of people with psychosis ever get, for example, individual or group in-person cognitive behavioural therapy for psychosis,” he said. “With that challenge of access, tools like this can provide [cognitive behavioural therapy-based] prompts and strategies to the user.”

That’s especially key now, when resources available to people with mental health issues are still quite scarce, according to Kidd.

“Even if a person did have access to individual or group psychotherapy and frequent contact with a psychiatrist … there still would be gaps and times in between those contacts when you would want to know how a person’s doing,” he said.

Dr. Sean Kidd, clinical psychologist and scientist at CAMH, says App4Independence is an evidence-based app that helps patients connect virtually and anonymously to health care providers, while also providing tools to reduce isolation and offer support. (CAMH)

One such gap is the general lack of tools available to psychiatrists to diagnose mental illnesses, according to Molimpakis.

“They don’t really have many existing tools other than these questionnaires that they typically use,” she said.

“So for instance, if a clinician suspects you have depression, they might say, ‘On a scale of 1 to 4, can you tell me how suicidal you’ve felt in the past two weeks?’ Which is quite a leading question.”

Molimpakis said technologies like Thymia can also help broaden a physician’s observational capacity and keep a clearer, more objective track of changes in a person’s behavioural patterns — such as their speed of talking or their twitching, which can be missed by a clinician.

“Thymia is just measuring these things in a much more objective way and [saying] ‘This is the speech rate for this person. Let’s compare it to how they were maybe a month ago,’” she said. “‘And this is their facial expression through a range of emotions. Let’s see if this is different to what it was a few weeks ago.’”
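In practice, the kind of comparison Molimpakis describes can be as simple as checking a new measurement against a person's own history: record something like words per minute each session, then flag sessions that drift far from that individual's baseline. The Python sketch below is a hypothetical illustration of the idea only, not Thymia's actual method; every name and number in it is invented.

```python
from dataclasses import dataclass
from statistics import mean, stdev


@dataclass
class SpeechSample:
    """One recorded session: how many words were spoken, and over how long."""
    word_count: int
    duration_s: float

    @property
    def words_per_minute(self) -> float:
        return self.word_count / (self.duration_s / 60)


def speech_rate_flagged(history: list[SpeechSample],
                        latest: SpeechSample,
                        z_threshold: float = 2.0) -> bool:
    """True if the latest session's speech rate sits more than
    `z_threshold` standard deviations from this person's own baseline."""
    rates = [s.words_per_minute for s in history]
    mu, sigma = mean(rates), stdev(rates)
    if sigma == 0:
        return False  # no variation in the baseline to compare against
    return abs(latest.words_per_minute - mu) / sigma > z_threshold


# Invented numbers: a month of prior sessions, then a noticeably slower one.
history = [SpeechSample(540, 180.0), SpeechSample(525, 175.0),
           SpeechSample(550, 182.0), SpeechSample(515, 178.0)]
today = SpeechSample(390, 180.0)

print(speech_rate_flagged(history, today))  # True -> worth a clinician's attention
```

A real system would track many more signals than one, but the principle is the one Molimpakis describes: comparing a patient against their own earlier baseline rather than against a population norm.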

What we’re doing is actually … clarifying more objectively these measures that clinicians inherently know are important.-Emilia Molimpakis

That said, Molimpakis makes it clear that tools like Thymia aren’t meant to replace clinicians or their existing questionnaires — nor are they trying to claim they can do the clinicians’ jobs better than them. Rather, they’re “clarifying more objectively these measures that clinicians inherently know are important,” she said.

Dr. David Gratzer, a psychiatrist at CAMH, compares these tools to some of the assistance features found in modern cars.

“[It’s] the same way some cars now can tell you if there’s another car in your blind spot when you’re about to make a turn or … you’re going too fast or that the conditions are slippery,” he said. “You as the driver continue to drive the car, but you’re helped along.”

Though he thinks there’s potential in technological advances for monitoring someone’s mental health, Dr. David Gratzer says users must be careful about which tools they give their private and medical information to. (Talia Ricci/CBC)

Respecting patient privacy

But just like how a car can be wrong about another vehicle being in its blind spot, Gratzer said these tools can also be inaccurate or faulty, “which is why it’s so important that we be careful about these experiments.”

One caution he says should be taken seriously is privacy. According to a study published in the Journal of the American Medical Association, 29 of 36 apps helping people with depression and smoking cessation sold patient data to third parties.

“This is an unregulated field. The promises are big. To be blunt, the patient need, and sometimes desperation, is great. We have to be careful here.”

Some are interested, perhaps, in a quick profit. Others are goodwill, but maybe not as rooted in evidence as we’d hoped.-Dr. David Gratzer

Kidd agrees with Gratzer about a need for rigour and care in how apps such as A4i use patient data.

That’s why Kidd and his team made it an objective to be transparent with users with psychotic illnesses, their family members and care providers about what kind of data is collected and how it’s being used.

Additionally, the A4i app has been reviewed by CAMH’s privacy office, as well as a third party.

Still, Gratzer said there are a number of different people and companies involved in this movement, and people who turn to technology to assist in their care need to be careful about who they’re giving their personal information to.

“Some are interested, perhaps, in a quick profit. Others are goodwill, but maybe not as rooted in evidence as we’d hoped,” he said.

Despite the bad apples, Gratzer believes these technologies have “huge potential.”

“As a health-care provider, I’m looking forward to having more and better information for my patients, so that we can make better decisions together,” he said. “If some of that can be borne of A.I., great. But we need to be careful.”


If you or someone you know is struggling, here’s where to get help:

This guide from the Centre for Addiction and Mental Health outlines how to talk about suicide with someone you’re worried about.


Written by Mouhamad Rachini. Produced by Alison Masemann.
