Multimodal Dialog Based Speech and Facial Biomarkers Capture Differential Disease Progression Rates for ALS Remote Patient Monitoring

Voice Activity Detection Considerations in a Dialog Agent for Dysarthric Speakers

Conversational dialog technology is increasingly being recognized as a useful means of automating remote patient monitoring and diagnostics for dysarthric speakers at scale. However, the characteristics of dysarthric speech introduce multiple …

Multimodal Conversational Technology for Remote Assessment of Symptom Severity in People with Schizophrenia

Investigating the Interplay Between Affective, Phonatory and Motoric Subsystems in Autism Spectrum Disorder Using a Multimodal Dialogue Agent

We explore the utility of an on-demand multimodal conversational platform in extracting speech and facial metrics in children with Autism Spectrum Disorder (ASD). We investigate the extent to which these metrics correlate with objective clinical …

Investigating the Utility of Multimodal Conversational Technology and Audiovisual Analytic Measures for the Assessment and Monitoring of Amyotrophic Lateral Sclerosis at Scale

We propose a cloud-based multimodal dialog platform for the remote assessment and monitoring of Amyotrophic Lateral Sclerosis (ALS) at scale. This paper presents our vision, technology setup, and an initial investigation of the efficacy of the …

Emotion Intensity and Gender Detection via Speech and Facial Expressions

Human emotion detection has received increasing attention over the last decades for a variety of applications and systems. However, detecting the intensity of the expressed emotion has not been investigated as much as detecting the type of the …

Toward Remote Patient Monitoring of Speech, Video, Cognitive and Respiratory Biomarkers Using Multimodal Dialog Technology

We demonstrate a multimodal conversational platform for remote patient diagnosis and monitoring. The platform engages patients in an interactive dialog session and automatically computes metrics relevant to speech acoustics and articulation, oro-motor …

Toward a Reinforcement Learning Based Framework for Learning Cognitive Empathy in Human-Robot Interactions

Observing another’s affective state and adjusting one’s behavior in response is the basic functionality of empathy. To enable robots to do this, they need a mechanism to learn how to provide the most appropriate empathic behavior through …

Unsupervised Online Grounding of Natural Language during Human-Robot Interaction

Allowing humans to communicate through natural language with robots requires connections between words and percepts. The process of creating these connections is called symbol grounding and has been studied for nearly three decades. Although many …

On the Utility of Audiovisual Dialog Technologies and Signal Analytics for Real-time Remote Monitoring of Depression Biomarkers

We investigate the utility of audiovisual dialog systems combined with speech and video analytics for real-time remote monitoring of depression at scale in uncontrolled environments. We collected audiovisual conversational data from …