
'We want to change lives for the better': U of T researcher probes the mind for signs of cognitive decline

Frank Rudzicz, an associate professor in U of T's department of computer science, is behind a research-focused web portal that gathers linguistic data through an array of cognitive tasks performed by participants (photo by Johnny Guatto)

Language, in the form of a speech disorder, offers a lens into the minds of patients with cerebral palsy, Parkinson’s disease and multiple sclerosis. Disordered speech is also one of the telltale symptoms of Alzheimer’s disease, and one of the first signs of cognitive decline as a person ages.

As North America’s population gets older – by 2030, fully a quarter of Canadians will be over 65 years of age, up from 16 per cent in 2014 – there is a growing need for early detection, accurate diagnosis and improved outcomes in patient care, and language has the potential to aid in all three.

That’s the thinking behind Talk2Me, a web portal developed by University of Toronto and other researchers that gathers linguistic data through an array of cognitive tasks performed by participants. The tool is described in a paper published in PLOS ONE.

“Talk2Me will help enable a community of people to solve problems related to neuro-degenerative issues, cognitive issues and psychiatry,” says Frank Rudzicz, an associate professor in the department of computer science in U of T’s Faculty of Arts & Science.

“It’s a common, open platform to help solve these problems.”

In a diagnostic setting, the Talk2Me portal, which also runs on a tablet, replaces the typical assessment scenario conducted with pen and paper between a physician and their patient – a scenario that can be imprecise and vulnerable to bias. Researchers can access the gathered data for their own studies, or use the portal to collect new data from selected participants.

Almost 10 per cent of the North American population has some form of speech disorder, including 7.5 million individuals with disorders caused by cerebral palsy, Parkinson’s or multiple sclerosis.

Talk2Me was developed by a team that includes: Rudzicz, who is also affiliated with the Vector Institute for Artificial Intelligence and the International Centre for Surgical Safety and the Li Ka Shing Knowledge Institute at St. Michael’s Hospital; Daniyal Liaqat, a PhD candidate in the department of computer science; and researchers from Carleton University, St. Michael’s Hospital and the National Research Council.

Talk2Me collects data using tasks similar to those used in standard assessments of cognition, with participants inputting their responses by typing or speaking.

For example, in the picture-description task, participants describe images like the “cookie theft” illustration that portrays a woman and two children in a kitchen. The woman is washing dishes while two children take cookies from a cookie jar. In the task, participants typically respond with varying degrees of detail and inference. Some identify the woman as the mother even though the relationship is not explicitly portrayed. Similarly, the motivation of the children is unclear, but some say they are stealing cookies behind their mother’s back.
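One common way such descriptions can be assessed automatically is by counting how many expected content words from the scene a participant mentions. The sketch below is an illustration only, with a hypothetical keyword list and simplified plural handling – it is not Talk2Me’s actual scoring method.

```python
# Illustrative sketch only: count how many expected content words ("information
# units") from the cookie-theft scene appear in a participant's description.
# The keyword list and plural handling are simplified assumptions.
EXPECTED_UNITS = {
    "woman", "mother", "children", "boy", "girl",
    "cookie", "jar", "stool", "sink", "dishes", "kitchen",
}

def information_unit_score(description: str) -> int:
    """Return the number of distinct expected content words mentioned."""
    found = set()
    for raw in description.lower().split():
        word = raw.strip(".,!?\"'")
        candidates = {word, word[:-1]} if word.endswith("s") else {word}
        found |= EXPECTED_UNITS & candidates
    return len(found)

print(information_unit_score(
    "A mother is washing dishes while the children take cookies from the jar."
))  # -> 5 (mother, dishes, children, cookie, jar)
```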

In the Winograd schema task, participants are given a statement like: “The trophy could not fit into the suitcase because it was too big.” They are then asked: “What was too big, the trophy or the suitcase?” Responding that the suitcase was too big could be a sign that a person’s executive function – the set of mental skills used to plan, reason and manage tasks – is impaired. If a person’s ability to answer correctly declines over time, it could indicate the onset of age-related dementia.
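As a rough illustration of how a Winograd-style item could be scored automatically, the sketch below compares a typed answer against the intended referent. The item structure and scoring rule are assumptions for illustration, not the portal’s actual code.

```python
# Illustrative sketch: score a Winograd-style item by checking whether the
# participant named the intended referent. Structure and scoring are assumptions.
from dataclasses import dataclass

@dataclass
class WinogradItem:
    statement: str
    question: str
    intended_referent: str  # what the pronoun actually refers to

item = WinogradItem(
    statement="The trophy could not fit into the suitcase because it was too big.",
    question="What was too big, the trophy or the suitcase?",
    intended_referent="trophy",
)

def is_correct(item: WinogradItem, response: str) -> bool:
    """True if the response names the intended referent."""
    return item.intended_referent in response.lower()

print(is_correct(item, "The trophy"))    # True
print(is_correct(item, "The suitcase"))  # False
```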

Other tasks require participants to type as many words as possible that fit a given category – like “fruit.” In another, they are asked to re-tell a short story they have just read. And in the word-colour Stroop task, participants are shown the name of a colour spelled out in coloured letters – for example, the word “green” spelled out in a red typeface. Participants are asked to say the colour of the letters – in this case, “red.”
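A category-fluency response like this could, in principle, be scored by counting the distinct valid items named. The small “fruit” vocabulary below is a hypothetical stand-in, not the portal’s actual word list.

```python
# Illustrative sketch: score a category-fluency response as the number of
# distinct in-category words, ignoring case, repeats and off-category entries.
# The vocabulary is a small hypothetical stand-in.
FRUIT = {"apple", "banana", "orange", "pear", "grape", "mango", "peach", "plum"}

def fluency_score(typed_words: list[str], category_vocab: set[str]) -> int:
    return len({w.strip().lower() for w in typed_words} & category_vocab)

print(fluency_score(["Apple", "banana", "apple", "carrot", "pear"], FRUIT))  # -> 3
```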

Different tasks involve different mental processes, and the responses contain different features, or measurable units of language. Talk2Me’s natural language processing software analyzes text and audio for these features, which include: the number of words used to describe something; the number of syllables in words; grammatical complexity; the frequency of speech fillers like “uh” and “um”; pitch, pauses, loudness and more.
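Some of the text-side features are straightforward to compute from a transcript. The sketch below derives a few of them – word count, filler rate and a simple lexical-diversity measure – under simplified assumptions; the real pipeline extracts many more, including acoustic features like pitch and loudness that require audio analysis not shown here.

```python
# Illustrative sketch: a few text-based features of the kind described above.
# Simplified tokenization; the real system computes many more features,
# including acoustic ones (pitch, pauses, loudness) not shown here.
FILLERS = {"uh", "um", "er", "ah"}

def text_features(transcript: str) -> dict:
    words = [w.strip(".,!?\"'").lower() for w in transcript.split()]
    words = [w for w in words if w]
    n = len(words)
    return {
        "word_count": n,
        "filler_rate": sum(w in FILLERS for w in words) / n if n else 0.0,
        "type_token_ratio": len(set(words)) / n if n else 0.0,  # lexical diversity
        "mean_word_length": sum(len(w) for w in words) / n if n else 0.0,
    }

print(text_features("Um the woman is uh washing the dishes while the kids take cookies"))
```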

Rudzicz’s broad focus is to apply natural language processing and machine learning to health care, and Talk2Me is just one way in which he and his collaborators are studying cognitive health through the lens of language.

In 2015, Rudzicz co-founded WinterLight Labs along with computer scientists Katie Fraser, Liam Kaufman and Maria Yancheva. WinterLight is a U of T start-up designing tools to track, screen for and predict the onset of dementia and psychiatric illness. Its first product, which runs on a tablet or computer, gathers input from a patient and, like Talk2Me, analyzes the data to help diagnose and predict Alzheimer’s disease.

With Talk2Me, and their work at WinterLight and other institutes, Rudzicz and his colleagues exploit the lens of language and continue to sharpen its focus to see more clearly into the human mind.

“It’s an exciting time, now, where artificial intelligence can make a real impact in health care,” Rudzicz says. “And my colleagues and I want to have an impact beyond publishing papers and academic output.

“We want to change lives for the better and improve outcomes.”

The research received support from the Natural Sciences and Engineering Research Council and the Canadian Institutes of Health Research.

 
