IMU Fitness

Random Motion Dataset

An open-source synthetic IMU dataset of random movements to make models robust to non-target movements.

The IMU Fitness Random Motion dataset contains 400 random motion examples with paired video and IMU (3D sensor orientation) recordings. The movement is designed to be temporally and spatially coherent, and is bound by physical constraints such as natural arm flexion and extension. This dataset pairs well with our other open-source IMU dataset (IMU Fitness Exercises), and may help your models distinguish between real signal and random movements.


<li>400 paired IMU (3D sensor orientation) samples + videos.</li>
<li>Physics-constrained, temporally and spatially coherent random movements.</li>
<li>Serves as negative data for ML models (e.g., rep counting).</li>
<li>Recorded at 20 fps (both video and IMU).</li>
<li>Variation in IMU sensor locations (left and right wrists) and crown orientations (proximal and distal).</li>


Random, but kinematically constrained motions


The paired IMU (3D sensor orientation) and video data is accompanied by a rich set of per-frame perfect labels and metadata. For more precise descriptions, please visit our <a href="" target="_blank">README</a>.
A few highlights:
<li>IMU sensor readings (3D sensor orientation represented by rotation matrices) at 20 fps.</li>
<li>Wrist location and crown orientation of the IMU sensor on the avatar.</li>
<li>Relative xy reference and wrist rotation of the IMU sensor.</li>


480 × 480 MP4 videos with paired IMU (3D sensor orientation) readings, both at 20 fps
Total size: 450 MB


This dataset is licensed under a <a href="" target="_blank">Creative Commons Attribution 4.0 International License</a>.
Both academic and commercial applications are allowed.


At Infinity AI, we use rotation matrices to represent IMU sensor orientations in 3D space. If you’re not familiar with rotation matrices, they can easily be converted to Euler angles or quaternions using SciPy.
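As a quick sketch of that conversion, here is how a single 3×3 rotation matrix can be turned into a quaternion or Euler angles with SciPy's `scipy.spatial.transform.Rotation` (the identity matrix below is just a placeholder; substitute a matrix loaded from the dataset):

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Placeholder 3x3 rotation matrix (identity = no rotation);
# in practice this would come from the dataset's IMU readings.
R_mat = np.eye(3)

rot = Rotation.from_matrix(R_mat)
quat = rot.as_quat()                        # quaternion in (x, y, z, w) order
euler = rot.as_euler("xyz", degrees=True)   # Euler angles in degrees

print(quat)   # [0. 0. 0. 1.]
print(euler)  # [0. 0. 0.]
```

Note that SciPy returns quaternions in scalar-last `(x, y, z, w)` order, which differs from some other libraries.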
This synthetic IMU (3D sensor orientation) data has been validated against Apple CoreMotion attitude measurements (e.g., from the Apple Watch).
Note: Raw accelerometry and gyroscope data streams are not provided, only 3D sensor orientation data (also referred to as angular position, rotation vector, “quaternions,” etc.).
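If your pipeline expects a gyroscope-like signal, one common workaround is to approximate angular velocity from consecutive orientation samples via finite differences. A minimal sketch, assuming orientations are given as 3×3 rotation matrices sampled at 20 fps (the function name and matrices below are illustrative, not part of the dataset API):

```python
import numpy as np
from scipy.spatial.transform import Rotation

FPS = 20  # dataset sample rate

def angular_velocity(R_prev, R_curr, fps=FPS):
    """Approximate angular velocity (rad/s) from two consecutive
    3x3 orientation matrices via a finite difference."""
    # Relative rotation taking the previous frame to the current one.
    delta = Rotation.from_matrix(R_curr @ R_prev.T)
    # Rotation vector (axis * angle) scaled by the sample rate.
    return delta.as_rotvec() * fps

# Example: 9 degrees about z per frame at 20 fps -> 180 deg/s about z.
R0 = np.eye(3)
R1 = Rotation.from_euler("z", 9, degrees=True).as_matrix()
omega = angular_velocity(R0, R1)
print(np.degrees(omega))  # approximately [0, 0, 180]
```

This is only an approximation of what a real gyroscope would report, since it ignores sensor noise and any motion between samples.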


<li>Github <a href="" target="_blank">README</a>: full dataset and annotation descriptions</li>
<li>Demo <a href="" target="_blank">Jupyter notebook</a></li>

Questions? We’re happy to chat asynchronously via email or hop on a call. Just send us a note at (this goes to all of the Infinity AI founders).
