We introduce a new framework that combines deep learning and top-down optimization to predict lower extremity joint kinematics directly from inertial data, without relying on magnetometer readings.
Accurate estimation of joint kinematics remains a critical barrier to the widespread use of motion-tracking sensors in biomechanics. Traditional sensor-fusion filters rely heavily on magnetometer readings, which can be disturbed in uncontrolled environments. They also require careful sensor-to-segment alignment and calibration, which burdens users and introduces further error in out-of-laboratory settings. Our framework addresses both limitations by predicting lower extremity joint angles from accelerometer and gyroscope data alone.
Data: Sample data and code for a demo run are provided. This includes pre-trained models and single-subject sample IMU data for each joint (hip, knee, and ankle) and activity (walking and running). To run the demo:
1. Download demo.zip from the Downloads module and unzip it
2. Unzip data.zip and models.zip in the same directory
3. Install the required dependencies listed in the requirements.txt file
4. Run demo.py, specifying your target joint and activity
(e.g., python3 demo.py 'Hip' 'Walking')
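The steps above can be sketched as the following shell session. This is a minimal sketch, assuming demo.zip has already been downloaded from the Downloads module and that data.zip, models.zip, requirements.txt, and demo.py sit at the top level of the archive; the exact layout may differ.

```shell
# Unpack the demo archive into its own directory (assumed layout)
unzip demo.zip -d demo
cd demo

# Unpack the sample IMU data and pre-trained models alongside demo.py
unzip data.zip
unzip models.zip

# Install the Python dependencies listed by the authors
pip3 install -r requirements.txt

# Run the demo for one joint/activity pair, e.g. hip kinematics during walking
python3 demo.py 'Hip' 'Walking'
```

Other joint/activity pairs (e.g., 'Knee' 'Running') can be substituted in the final command.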
Code: https://github.com/CMU-MBL/JointAnglePrediction_JOB
Citation: Eric Rapp*, Soyong Shin*, Wolf Thomsen, Reed Ferber, and Eni Halilaj. "Estimation of kinematics from inertial measurement units using a combined deep learning and optimization framework." Journal of Biomechanics 116 (2021): 110229.
* equal contribution