WiDS Puget Sound is independently organized by Diversity in Data Science.
Tuesday, May 14 • 3:15pm - 3:40pm
Data expansion to improve accuracy and availability of digital biomarkers for human health and performance



Advances in deep learning and sparse sensing have emerged as powerful tools to enable and expand human motion tracking. Motion tracking and analysis are essential for monitoring disease progression, guiding rehabilitation treatment, evaluating sports performance, and informing assistive device design. Biomechanists traditionally characterize motion, such as gait, by measuring biomechanical variables like joint kinematics, kinetics, and spatio-temporal parameters. Certain biomechanical variables have been established as biomarkers that correlate with meaningful outcomes, such as knee adduction angle for ACL injury or step width variability for aging and fall risk. In the US, where 1 in 7 individuals has a mobility disability and 1 in 2 adults lives with a musculoskeletal condition, monitoring human motion 'in the wild' is vital for observing individuals' natural functionality and lifestyle. For motion to be observed in natural or uncontrolled environments, sensing devices must be portable, unobtrusive, reliable, and accurate. However, for sensing data to be meaningful, measurements must be converted to and contextualized as personalized biomechanical outcomes, a challenge not yet overcome in natural environments.

Here, we present a deep learning algorithm, originally developed for full state-space reconstruction of complex dynamical systems, for personalized human motion tracking. Using this algorithm, we learn a mapping that transforms a low-dimensional sensor input into the full state-space dataset. With as few as one sensor, we demonstrate that it is possible to reconstruct a comprehensive set of measures that are important for tracking and informing mobility-related health outcomes. As a concrete example, most smartwatches and smartphones contain an IMU (inertial measurement unit) that monitors movement and is currently used for simple measures like daily step count or gesture control. We have demonstrated that our deep learning algorithm can use this single sensor to reconstruct not only the motion of the body segment where the sensor is worn, but the motion of the full body and, in some cases, its physiological state. The key premise that makes this transformation possible is leveraging the time history of sensor measurements to inform the mapping from low- to high-dimensional data.

By expanding our datasets to unmeasured or unavailable quantities, this work can impact clinical trials, robotic and device control, and human performance. This methodology may also enable more efficient and cost-effective remote monitoring of patients, reducing the need for frequent visits to clinical settings. Overall, our work represents a major advance in personalized human motion sensing and has the potential to transform the way we monitor and manage movement-related health outcomes.
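The core idea, learning a map from the time history of a single sensor channel to many unmeasured channels, can be illustrated with a minimal sketch. Everything here is an illustrative assumption, not the speakers' actual method: the gait-like signals are synthetic phase-shifted sinusoids, the delay length is arbitrary, and ordinary least squares stands in for the deep network described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def delay_embed(signal, lags):
    """Stack each sample with its `lags - 1` predecessors (a time-history feature row)."""
    n = len(signal)
    return np.column_stack([signal[i : n - lags + i + 1] for i in range(lags)])

# Toy "full state": four gait-like channels sharing one underlying phase
# (stand-ins for joint angles; purely synthetic data for illustration).
t = np.linspace(0, 20 * np.pi, 2000)
full_state = np.column_stack([np.sin(t + p) for p in (0.0, 0.8, 1.6, 2.4)])

# Single wearable channel (e.g., one IMU axis), with measurement noise.
sensor = full_state[:, 0] + 0.01 * rng.standard_normal(len(t))

lags = 50
X = delay_embed(sensor, lags)   # (n_samples, lags) time-history features
Y = full_state[lags - 1:]       # full-state targets aligned to each row's last sample

# Least-squares map from time history to full state; the talk uses a deep
# network here, but the low-to-high-dimensional mapping idea is the same.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)
Y_hat = X @ W

err = np.linalg.norm(Y - Y_hat) / np.linalg.norm(Y)
print(f"relative reconstruction error: {err:.3f}")
```

Because each delayed copy of a sinusoid is a linear combination of the underlying sine and cosine, the time history carries enough phase information to recover all four channels from one measurement, which is the intuition behind using measurement histories rather than single samples.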

Speakers

Megan Ebers

Postdoctoral scholar, University of Washington
I am a postdoctoral scholar in Applied Mathematics with the NSF AI Institute in Dynamic Systems at the University of Washington. My postdoctoral research focuses on data-driven and reduced-order methods for complex systems. In my PhD research, I developed and applied machine learning...


Tuesday May 14, 2024 3:15pm - 3:40pm PDT
Room 130, Student Center