On hearing about local marathoner, record holder and fundraiser Martin Parnell’s ambitious attempt to complete the 150km ultra at the Calgary Marathon weekend at the end of May 2017, I couldn’t help but see an opportunity to collect some pretty unique gait analysis data. Just a week before the event Martin and I got together and came up with a basic plan to kit him out with sensors to collect data on his gait characteristics. My main objective, as with every runner I work with, was to ensure that the data collection did nothing to interfere with his main task: in this case, running a demanding ultra-style race for up to 20 hours.
My plan with Martin was simple: 2 watches, a chest strap sensor, left and right tibial sensors, and 4 sensors on the shoes, 2 on the backs and 2 on the laces. Finally, I attached a receiving head unit, or RFD, to his waist belt. Measuring data from around his hips was never an option, as Martin needed his waist belt at all times for carrying fluids.
From this array of tech we would be able to record the following metrics:
GPS track and elevation (m), heart rate (bpm), pace (min/km), cadence (spm), bulk power (W) and vertical displacement (cm), plus the following bilateral metrics: power (W), force (N), peak acceleration (g), impact shock (G), foot strike character, ankle pronation (deg), ankle stability (deg/s), ground contact time (ms) and overall asymmetry (%).
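To give a feel for what the bilateral stream looks like once it comes off the sensors, here is a minimal sketch of a per-stride record and a left/right asymmetry calculation. The field names and the `asymmetry_pct` helper are my own illustrations, not any vendor’s API, and the sample values are invented.

```python
from dataclasses import dataclass

@dataclass
class StrideMetrics:
    """Hypothetical per-stride record from one foot's sensors."""
    power_w: float            # power (W)
    force_n: float            # force (N)
    peak_accel_g: float       # peak acceleration (g)
    impact_shock_g: float     # impact shock (G)
    pronation_deg: float      # ankle pronation (deg)
    stability_deg_s: float    # ankle stability (deg/s)
    contact_time_ms: float    # ground contact time (ms)

def asymmetry_pct(left: float, right: float) -> float:
    """Percent difference between sides, relative to their mean."""
    mean = (left + right) / 2
    return abs(left - right) / mean * 100

# Invented sample strides for illustration only
left = StrideMetrics(210.0, 1500.0, 9.8, 11.2, 8.5, 320.0, 255.0)
right = StrideMetrics(200.0, 1480.0, 10.1, 11.6, 9.0, 310.0, 262.0)
print(round(asymmetry_pct(left.contact_time_ms, right.contact_time_ms), 2))
```

Any of the bilateral metrics can be fed through the same comparison, which is essentially what an overall asymmetry percentage summarises.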
A few days shy of the event I performed a quick test run on a treadmill with the sensors to make sure everything was working fine and to give Martin some insight into what exactly was being recorded. Luckily for me Martin was positively surprised at how ‘non-annoying’ the sensors were and things looked good for a fantastic experiment.
At this point I should explain why I was so keen to run this data collection project during Martin’s ultra marathon.
Using wearable technology to perform gait and biomechanical analysis is still relatively new. In fact, from my own perspective, I’d say it’s only been possible to do with any validity and depth for the last 2 years. As far as I know from my own networks and the published information, no one has ever collected a full-suite gait analysis dataset on a runner over 150km and 19-20 hours of running. So Martin’s unique challenge would provide a brilliant opportunity to push the technology to its limits in terms of recording time and data quantity.
On the day, however, things (as is sometimes the case) did not unfold quite as we had expected. First, a heavy rain storm developed during the first 30 minutes of the run; second, the Garmin Fr735XT we were depending on stopped recording correctly*; and third, the sheer size of the task of completing the 150km within the tight timeline set by the organisers put an unrealistic squeeze on Martin’s pacing. So by 8am the following morning, after 14 hours on the move, Martin’s race reached its conclusion, with around 96 kilometres done and 12 hours of running data collected.
Although it wasn’t the full 19-hour event we had hoped for, and although some of the data collection was compromised by the faulty watch, we were able to collect a pretty insightful and unique set of data over a 12-hour period, which may still be the longest continuous gait analysis of its type performed to date.
The funny thing was that as we sat down several days later to analyse what we had recorded, we both arrived at the same high-level conclusion: now that we’ve done this and learned so much about how to perform such an experiment, why park the learnings rather than develop them further? So. Next up, Martin’s next idea: the 85km Golden Ultra trail race in Golden, BC. Watch this space. Calgary 150 was just the start.
*The Fr735XT that was meant to be Martin’s primary GPS watch during the event was set up in UltraTrac mode to save enough battery for the watch to last the maximum requirement of 20 hours of continuous recording. Unfortunately this mode didn’t record data in the manner it was designed to, and the resulting dataset ultimately wasn’t usable. In future events the main GPS watch will run in Smart mode (which offers only minor battery savings) and will be swapped for a fresh watch at predetermined points, with the data spliced together afterwards.
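The splice itself is simple in principle. Here is a minimal sketch, assuming each watch exports its track as a time-ordered list of (timestamp, latitude, longitude) samples; the function name and the sample data are my own, not part of any Garmin export format.

```python
def splice_tracks(track_a, track_b):
    """Join two GPS tracks in time order, dropping any samples from the
    second watch that overlap the first. Timestamps are assumed to be
    epoch seconds; overlaps are resolved in favour of the first watch."""
    if not track_a:
        return list(track_b)
    cutoff = track_a[-1][0]  # last timestamp recorded by the first watch
    return list(track_a) + [s for s in track_b if s[0] > cutoff]

# Invented example: the second watch was started before the first stopped,
# so the two recordings overlap briefly at the handover point.
a = [(0, 51.000, -115.000), (60, 51.001, -115.001), (120, 51.002, -115.002)]
b = [(90, 51.0015, -115.0015), (150, 51.003, -115.003)]
merged = splice_tracks(a, b)
print([t for t, _, _ in merged])  # -> [0, 60, 120, 150]
```

Starting the second watch before the first is retired gives a deliberate overlap, so no samples are lost at the handover; the duplicated window is simply discarded from the second recording.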