Towards Multimodal Dialog-Based Speech & Facial Biomarkers of Schizophrenia

Abstract

We present a scalable multimodal dialog platform for the remote digital assessment and monitoring of schizophrenia. Patients diagnosed with schizophrenia and healthy controls interacted with Tina, a virtual conversational agent, as she guided them through a brief set of structured tasks, while their speech and facial video were streamed in real time to a back-end analytics module. Patients were concurrently assessed by trained raters on validated clinical scales. We find that multiple speech and facial biomarkers extracted from these data streams show significant differences (as measured by effect sizes) between patients and controls, and that machine learning models built on such features can classify patients and controls with high sensitivity and specificity. Finally, using correlation analyses between the extracted metrics and standardized clinical scales for the assessment of schizophrenia symptoms, we investigate how such speech and facial biomarkers can provide further insight into schizophrenia symptomatology.
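The analysis described above reduces to two standard steps: per-feature effect sizes comparing patients and controls, and a cross-validated classifier evaluated by sensitivity and specificity. The sketch below illustrates those two steps on synthetic data; the feature matrix, the choice of logistic regression, and all values are hypothetical stand-ins and do not reflect the features or models used in the paper.

```python
# Illustrative sketch only: the paper does not specify its features or models,
# so the data, feature layout, and classifier here are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)

# Hypothetical per-session feature matrix (rows: participants; columns: speech
# and facial metrics) and binary labels (1 = patient, 0 = healthy control).
X = rng.normal(size=(60, 8))
y = np.array([1] * 30 + [0] * 30)
X[y == 1, 0] -= 0.8  # simulate a group difference on one feature

def cohens_d(a, b):
    """Cohen's d effect size between two groups, using the pooled standard deviation."""
    pooled_sd = np.sqrt(((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                        / (len(a) + len(b) - 2))
    return (a.mean() - b.mean()) / pooled_sd

# Per-feature effect sizes between patients and controls.
effect_sizes = [cohens_d(X[y == 1, j], X[y == 0, j]) for j in range(X.shape[1])]

# Cross-validated patient-vs-control classification on the same features.
model = make_pipeline(StandardScaler(), LogisticRegression())
y_pred = cross_val_predict(model, X, y,
                           cv=StratifiedKFold(5, shuffle=True, random_state=0))
tn, fp, fn, tp = confusion_matrix(y, y_pred).ravel()

print("effect sizes:", np.round(effect_sizes, 2))
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```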

Publication
ICMI 2022 Workshop on Social Affective Multimodal Interaction for Health