UbiStroke: Multimodal Computational Assessment of Stroke
A human-centered AI system that uses multiple sensors to evaluate body posture, facial appearance, pupil response, and speech patterns to identify patient-specific signatures of stroke-related deficits and assist clinical diagnosis.
The Problem
Stroke is a leading cause of death and disability, and the only available therapy (thrombolysis) works within a narrow 3-hour window from symptom onset. Despite this urgency, the therapy is used in fewer than 5% of acute cases. Current assessment relies on subjective human analysis and imaging, which often predict outcomes poorly. The gap between the time-sensitivity of stroke treatment and the difficulty of rapid, accurate diagnosis represents a critical healthcare challenge.
Our Approach
UbiStroke takes a human-centered AI approach to stroke assessment. Rather than replacing clinical judgment, the system supplements the standard NIHSS (National Institutes of Health Stroke Scale) clinical exam with computational analysis from multiple sensor modalities. The system evaluates:
- Body posture: Detecting asymmetries and motor deficits through depth sensing
- Facial appearance: Identifying facial drooping and asymmetry using computer vision
- Pupil response: Measuring pupillary reactions that may indicate neurological damage
- Speech patterns: Analyzing speech for slurring, word-finding difficulty, and other stroke indicators
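As a minimal sketch of the facial-asymmetry idea above: one simple way to quantify drooping is to mirror landmarks from one side of the face across the vertical midline and measure how far they fall from their counterparts on the other side. The landmark coordinates and midline here are illustrative inputs, not part of the actual UbiStroke pipeline, which would obtain landmarks from a computer-vision model.

```python
# Hypothetical facial-asymmetry score from paired 2D landmarks.
# 0.0 means perfect left/right symmetry; larger values mean more asymmetry.

def asymmetry_score(left_pts, right_pts, midline_x):
    """Mean distance between each left landmark and its right counterpart
    mirrored across the vertical facial midline at x = midline_x."""
    assert len(left_pts) == len(right_pts), "landmarks must be paired"
    total = 0.0
    for (lx, ly), (rx, ry) in zip(left_pts, right_pts):
        mx = 2 * midline_x - rx          # mirror the right point across the midline
        total += ((lx - mx) ** 2 + (ly - ry) ** 2) ** 0.5
    return total / len(left_pts)

# Symmetric mouth corners score 0; a drooping left corner raises the score.
symmetric = asymmetry_score([(40, 100)], [(60, 100)], midline_x=50)
droop = asymmetry_score([(40, 108)], [(60, 100)], midline_x=50)
```

A real system would compute this over dozens of landmark pairs (eyes, brows, mouth) and normalize by face size so the score is comparable across patients.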
By combining these signals, UbiStroke generates a comprehensive, patient-specific multimodal signature of stroke-related deficits, providing clinicians with quantitative data to support faster and more confident diagnostic decisions.
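To make the fusion step concrete, here is a minimal sketch of how per-modality deficit scores might be combined into a single patient signature. The modality names, the [0, 1] score convention, and the weights are all assumptions for illustration; a deployed system would calibrate scores against clinical ground truth and likely learn the combination from data rather than fix it by hand.

```python
# Hypothetical multimodal fusion: each sensing modality contributes a
# normalized deficit score in [0, 1]; the signature keeps the per-modality
# profile and a weighted composite. Weights are illustrative only.
from dataclasses import dataclass, field

MODALITIES = ("posture", "face", "pupil", "speech")

# Illustrative fixed weights (sum to 1.0); a real system would learn these.
WEIGHTS = {"posture": 0.30, "face": 0.30, "pupil": 0.15, "speech": 0.25}

@dataclass
class StrokeSignature:
    scores: dict                      # per-modality deficit scores in [0, 1]
    composite: float = field(init=False)

    def __post_init__(self):
        missing = set(MODALITIES) - set(self.scores)
        if missing:
            raise ValueError(f"missing modalities: {sorted(missing)}")
        self.composite = sum(WEIGHTS[m] * self.scores[m] for m in MODALITIES)

# Example: strong facial asymmetry and slurred speech, milder elsewhere.
sig = StrokeSignature({"posture": 0.2, "face": 0.8, "pupil": 0.1, "speech": 0.7})
```

Keeping the full per-modality profile alongside the composite matters clinically: two patients with the same composite score can have very different deficit patterns, and the profile is what lets a clinician map the output back onto NIHSS exam items.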
HoloStroke Spinoff
UbiStroke also spawned HoloStroke, a related initiative exploring how mixed reality technology could enable remote neurologist consultations for stroke diagnosis — allowing stroke specialists to virtually examine patients in emergency departments that lack on-site neurology expertise.
My Contribution
I contributed to the sensor integration and data processing pipeline for UbiStroke, working on the system architecture that combines inputs from multiple sensing modalities into a unified assessment framework. My experience with mixed reality systems also informed the HoloStroke spinoff work.
Team
- Nadir Weibel — Principal Investigator, Professor of CSE, UC San Diego
- Vish Ramesh — NIH NLM Postdoctoral Fellow
- Danilo Gasques — Ph.D. Candidate
- Steven Rick — Ph.D. Candidate
- Gert Cauwenberghs — Professor, Bioengineering
- Brett Meyer
- Kunal Agrawal
- Andrew Nguyen
- Erik Goron
Collaborations & Funding
- UCSD Stroke Center
- UC San Diego Department of Bioengineering
- Institute of Neural Computation
- HomniHealth
Funded by the National Science Foundation (NSF), NSF I-Corps, NIH National Library of Medicine, IBM Research, and UC San Diego.