Multimodal wearable sensor assessment of child biobehavioral responses for early detection of mental health disorders
Abstract
Can wearable sensors reveal what caregiver reports miss? In a study of 104 children (ages 4–8), we captured biobehavioral signals—heart rate, electrodermal activity, temperature, movement, and speech—across multiple body locations during emotion-eliciting tasks. Machine learning models predicted ADHD, anxiety, and depression with strong discriminative performance (AUCs: 0.76–0.84), identifying up to 3× more clinically diagnosed children than caregiver report alone. Each condition showed distinct physiological response patterns. These findings highlight the potential of multimodal sensing to improve early identification of mental health concerns that often go unrecognized in young children.
Primary Faculty Mentor Name
Yuri Hudak
Status
Graduate
Student College
College of Engineering and Mathematical Sciences
Program/Major
Data Science
Primary Research Category
Engineering and Math Science