
IMU Fitness


Generate custom synthetic IMU datasets for remote fitness and PT applications

Regular price $0.00

The IMU Fitness API generates synthetic data for applications at the intersection of computer vision and wearable sensors. The API allows users to generate IMU datasets (3D sensor orientation) from a wrist-worn wearable sensor (such as the Apple Watch) on avatars performing a variety of exercises. Users can generate a single collection or thousands. The API adds realistic kinematic variation to every movement so that no two reps are ever performed in the same way. Everything in the datasets is controllable, from number of reps to avatar body shapes to watch position and more.


The API gives users full programmatic control over the generated synthetic datasets, including:
<li>29 different exercises (plus random motion options)</li>
<li>Kinematic variation control</li>
<li>1-20 reps per video</li>
<li>Watch and wrist position control</li>
<li>Wide range of rep speeds and cadences</li>
<li>20-40 fps</li>
<li>and more!</li>
For a full set of API parameter options, visit the <a href="" target="_blank">README</a>.
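As a sketch of what programmatic control looks like, a job request could be assembled as a JSON payload along these lines. The parameter names below are illustrative placeholders, not the actual API schema; the real parameter options are documented in the README.

```python
import json

# Hypothetical job parameters (names are illustrative; see the README
# for the actual API schema)
job_params = {
    "exercise": "bicep_curls",    # one of the supported exercises
    "num_reps": 10,               # 1-20 reps per video
    "frame_rate": 30,             # 20-40 fps
    "kinematic_variation": 0.5,   # amount of rep-to-rep variation
    "watch_location": "left_wrist",
}

# Serialize for submission to the API
payload = json.dumps(job_params)
```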


The API includes over 25 of the most common fitness and PT exercises. See GIFs of all exercises <a href="" target="_blank">here</a>.
<li>Arm raise (with dumbbell)</li>
<li>Bear crawl</li>
<li>Bicep curls</li>
<li>Deadlift (with dumbbell)</li>
<li>Downward dog</li>
<li>Split squat</li>
<li>Tricep kickback</li>
<li>And more!</li>
Need a different exercise? <a href="">Get in touch</a>. Any motion can easily be added to a custom API.


Every paired IMU (3D sensor orientation) and video dataset is accompanied by a rich set of per-frame perfect labels and metadata. For more precise descriptions, please visit our <a href="" target="_blank">README</a>. A few highlights:
<li>Frame-specific rep counts</li>
<li>IMU sensor readings (3D sensor orientation represented by rotation matrices)</li>
<li>Wrist location and crown orientation of the IMU sensor on the avatar</li>
<li>Relative xy reference and wrist rotation of the IMU sensor</li>
<li>Amount of kinematic variation and speed variation injected into each rep</li>


Dataset size depends on parameter choices such as fps and number of reps. Each .csv dataset is accompanied by an .mp4 video showing the avatar's motion and a .json file with the API job parameters.
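The per-frame .csv can be loaded with standard tools such as pandas. The snippet below uses an inline CSV with hypothetical column names suggested by the label list above; the actual column names are documented in the README.

```python
import io
import pandas as pd

# Hypothetical per-frame labels (column names are illustrative;
# see the README for the actual .csv schema)
csv_text = """frame,rep_count,wrist_x,wrist_y
0,0,0.10,0.52
1,0,0.11,0.53
2,1,0.12,0.55
"""
df = pd.read_csv(io.StringIO(csv_text))

# Frame-specific rep counts make the total rep count trivial to recover
total_reps = df["rep_count"].max()
```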


The API and the synthetic data generated by the API are licensed under Infinity AI’s <a href="" target="_blank">Terms and Conditions</a>.


Additional API parameters and customizations can be added in custom APIs. Get in touch to discuss your needs (<a href=""></a>).
At Infinity AI, we use rotation matrices to represent IMU sensor rotations in 3D space. If you're not familiar with rotation matrices, they can easily be converted to Euler angles or quaternions using scipy (example notebook, scipy reference).
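For example, a minimal conversion with scipy's `Rotation` class looks like this (using the identity matrix, i.e. no rotation, purely for illustration):

```python
import numpy as np
from scipy.spatial.transform import Rotation

# A 3x3 rotation matrix (identity = no rotation, for illustration)
rot_matrix = np.eye(3)

r = Rotation.from_matrix(rot_matrix)
euler = r.as_euler("xyz", degrees=True)  # Euler angles in degrees
quat = r.as_quat()                       # quaternion as [x, y, z, w]
```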
This synthetic IMU (3D sensor orientation) data has been validated against that produced by the Apple CoreMotion attitude measurements (e.g. Apple Watch).
Note: Raw accelerometry and gyroscope data streams are not provided, only 3D sensor orientation data (also referred to as angular position, rotation vector, “quaternions”, etc.).


<li><a href="" target="_blank">README</a></li>
<li>Demo <a href="" target="_blank">Jupyter notebooks</a>: 5 demo notebooks are available.</li>
<li><a href="" target="_blank">Blog post</a></li>

Questions? We’re happy to chat asynchronously via email or hop on a call. Just send us a note at (this goes to all of the Infinity AI founders).
