A Novel Approach to Fluency Data Analysis to Improve Dementia Diagnosis
In 2018, an estimated 50 million people worldwide were living with dementia. However, there is currently no single, universal cognitive assessment for diagnosing dementia. Many of the tests and metrics in use are either too simplistic or take too long to administer in a clinical setting, and some common diagnostic screenings are further hindered by questionable validity and low sensitivity. These shortcomings cause problems in clinical trials because they blur the line between the actual effects of a treatment and the artifacts of an unreliable or poorly quantified test. I will be working on a more robust screening process for dementia, focused on quantifying memory decline. This project includes analyzing the psychometric properties of current diagnostic tests for dementia to improve their efficacy. In particular, verbal fluency tests, in which a patient names as many items from a given category as possible within a time limit, are commonly used to assess disorders affecting memory. Drawing on concepts from bioinformatics and recurrence quantification analysis, I will be researching a more comprehensive metric for fluency data, to see whether these additional analyses can better track the progression of memory loss in patients.
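To give a sense of how recurrence quantification analysis might apply to fluency data, the sketch below computes one basic categorical RQA measure, the recurrence rate, over a word list: the fraction of response pairs that repeat an item (e.g., a perseveration). This is only a minimal illustration of the general idea, not the project's actual metric; the function name and the example transcript are invented for this sketch.

```python
from itertools import combinations

def recurrence_rate(responses):
    """Fraction of off-diagonal response pairs (i, j) that match.

    In categorical RQA applied to a fluency transcript, a
    "recurrence" is simply a repeated item, so a higher rate
    suggests more perseveration. (Hypothetical helper for
    illustration only.)
    """
    n = len(responses)
    if n < 2:
        return 0.0
    pairs = list(combinations(range(n), 2))
    repeats = sum(1 for i, j in pairs if responses[i] == responses[j])
    return repeats / len(pairs)

# Example animal-fluency transcript with one repetition ("dog"):
# 10 pairs total, 1 recurrent pair -> rate 0.1
transcript = ["dog", "cat", "horse", "dog", "lion"]
print(recurrence_rate(transcript))  # -> 0.1
```

Richer RQA measures (determinism, laminarity) extend the same pairwise recurrence structure, which is why even this simple rate hints at how a recurrence-based fluency metric could be built up.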
Message to Sponsor
- Major: Cognitive Science; Minor: Bioengineering
- Sponsor: Pease Fund
- Mentor: Ming Hsu