Sometimes your fitness tracker lies – a lot
You did it -- 10,000 steps!
Yep, 10,000 steps in a single day.
It's now a daily accomplishment you celebrate along with millions of other people who track their activity with a wearable fitness monitor.
But is your fitness tracker really giving you accurate data – or just telling you what you want to hear?
To answer that question, 13 Investigates splurged a little.
We bought a bunch of fitness trackers and tested them so you don't have to.
Steps, distance, calories, heart rate – we looked at all of it.
While the devices are sometimes on target, our test results show every one of the activity trackers produced calculations that were vastly inaccurate and, in some cases, potentially dangerous to your health.
What we tested
In the past three years, the fitness tracker market has exploded worldwide. It's now a multi-billion dollar industry dominated by big names like Fitbit, Jawbone and Garmin.
WTHR's test included those popular brands, as well as activity trackers from lesser-known companies that are sold online and at local retailers.
The specific models 13 Investigates tested are:
- Fitbit Charge HR
- Fitbit Zip
- Jawbone UP3
- Garmin Vivosmart HR
- iFit Vue
- Misfit Flash
Some of the fitness trackers are worn like a wristwatch, while others are designed to be worn on your hip. They all have wireless capability to send data directly to a smartphone. The tested models range in price from $50 to $180.
WTHR bought all of the fitness trackers new and, yes, we read pages and pages of small print to make sure we followed all of the manufacturer instructions.
We let actual experts handle the testing.
Inside the lab
Ball State University's Human Performance Laboratory is a fitness tracker testing wonderland. Dr. Alex Montoye oversees the lab, where he's been testing activity trackers for the past two years.
"They're popular because the information they provide, it's easily accessible data about yourself," Montoye said. "As far as accuracy, I think people tend to put a lot more stock in the accuracy of these than maybe they should. I think they probably would be surprised."
Hang on. We'll get to the surprise part in just a second. Promise.
First let me explain how we did the nifty-but-kind-of-geeky test.
Using a team of student researchers from Ball State's Clinical Exercise Physiology Program (a fancy way of saying college kids who really like to study exercise), Montoye conducted a thorough set of tests on each of the six activity trackers provided by WTHR.
The professor likes to test the fitness trackers not only during traditional exercise, but also during other activities that reflect how real people live their actual lives.
So test volunteers Chad Balilo and Alexis Sutter – both Ball State graduate students – participated in the following series of closely-monitored activities:
- Walking (both on a treadmill and in a hallway)
- Jogging
- Climbing stairs
- Typing at a computer
- Sweeping the floor
"Most people spend the majority of their day sedentary, sitting at a desk writing or typing or reading on a computer, so having an idea how the monitors work during those activities is really important," Montoye explained. "And a lot of people spend at least part of their day doing lifestyle or chore activities. Light intensity activities like that do have health benefits we want to track and be aware of."
(If you're a movie star and your butler does all of your sweeping, your publicist types all of your party invitations and your new mansion has only an elevator instead of stairs, you can stop reading here.)
The actual test is a combination of modern technology and science mixed with old-fashioned mathematics and sweat.
The research team first recorded Alexis and Chad's height, weight and other terribly personal information to enter into the fitness monitors. The volunteers then strapped and clipped on the gadgets – all six at once – to begin their hour-long test regimen.
As the fitness monitors collected and reported data during each activity, Montoye and his team did the same thing.
They used manual tally counters – click, click, click – to record every step Chad and Alexis took. A calibrated treadmill measured the precise distance they traveled. Portable medical devices with intimidating names like metabolic analyzer and pulse oximeter constantly calculated how many calories they burned and how fast their hearts were beating.
In other words, the researchers measured everything the fitness trackers were trying to measure. And they methodically recorded the data before, during and after each activity.
The results provide a snapshot of each fitness tracker's accuracy -- how the information reported by each gadget compares to the data scientifically measured in the lab.
Steps: Putting their best foot forward
The best news for all of the fitness trackers we tested is they're solid step counters.
For example, manual tally counters showed Alexis took 567 steps while walking on the treadmill. Each of the fitness trackers reported numbers between 554 and 585 – less than 20 steps off.
Chad took 748 steps during his run, according to undergrad interns who manually counted every one of them. Most of the activity trackers were within five steps of that number.
Overall, most of the monitors were accurate within 2 or 3% while counting steps during walking and running. Montoye says that level of accuracy is outstanding.
"We kind of consider 10% as a threshold. If you can get within 10% of the actual steps taken or calories burned, you're doing pretty well," he said.
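That 10% threshold is easy to apply yourself. Here's a minimal sketch of the percent-error math, using Alexis' treadmill numbers from above (the three tracker readings are illustrative values spanning the 554-585 range WTHR reported, not a full data set):

```python
def percent_error(measured, actual):
    """Percent error of a tracker reading against the lab's manual count."""
    return abs(measured - actual) / actual * 100

# Alexis' treadmill walk: manual tally counters recorded 567 steps.
actual_steps = 567

# Illustrative tracker readings spanning the 554-585 range WTHR reported.
for reading in (554, 567, 585):
    err = percent_error(reading, actual_steps)
    print(f"{reading} steps -> {err:.1f}% error (within 10%: {err < 10})")
```

Even the worst reading in that range comes out around 3% off – comfortably inside Montoye's threshold, which is why the step results earn good marks.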
Most of the fitness trackers also did pretty well counting steps on stairs – with an average error rate around 8%.
And they were all perfect during the typing activity, with each activity tracker recording zero.
"The monitors are usually worn on a wrist, and your wrists are probably moving as you're typing. In that situation, the monitors worked well as far as not picking up steps," explained Montoye.
The activity trackers were far less accurate in counting steps while Alexis and Chad were sweeping.
According to the manual tally counters, the grad students took more than 350 combined steps while they swept the lab. But the Fitbit Zip counted, well, … zip. The activity tracker didn't detect a single step. The Fitbit Charge HR was at the other extreme. It over-counted steps – by a lot.
For all of the fitness trackers tested, the average step error rate during the sweeping portion of the test was nearly 60%.
"That's not very good," Montoye told WTHR. "Sweeping involves a little more back and forth motion, and the steps are not quite as well defined as what we see with walking and running, so that's probably why the accuracy goes way down. With lifestyle and household chore activities, we tend to see that across the board."
Still, overall step counting was quite good for most of the monitors, and Montoye says that's important.
"A step is the first thing the monitors are designed to detect and the other variables follow from that," he said. "If the steps are wrong, the calories and distance are certainly going to be wrong because you're not even starting with the right data."
Distance: Falling a little short
Calculating distance was not as strong.
Some of the fitness trackers estimated too high, others too low. Among all devices tested, the calculated distances were off by about 14%. That means if you walk seven miles, your fitness tracker might tell you that you walked eight miles -- or maybe only six -- instead.
That error rate isn't horrible, but the research team at Ball State says it's not great, either.
"Ideally, we'd like to be a little bit closer than that," said Montoye. "Anything more than 10% and I'd say they can do better."
Unless you're training for a marathon (where your 26.2 miles could quickly turn into either 22.6 or 29.8 miles based upon a 14% miscalculation), chances are the distance tracking shortcomings on your activity monitor won't drastically impact your life.
Calories: Just a few dozen Big Macs off
OK, here's where your fitness tracker is probably messing with you big time -- and where you'll get that surprise I promised earlier.
The results – for those who truly rely on their activity tracker for accurate calorie information – are downright ugly.
That's my take, and Dr. Montoye has my back on this one.
"I absolutely agree with you. The numbers aren't even close," he said.
Most of the activity monitors tested for WTHR grossly over-reported the number of calories actually burned.
The metabolic analyzer apparatus strapped to Chad's chest, Velcro'd around his head, and pushed tightly against his nose and mouth to monitor his oxygen consumption throughout the test (I'm so sorry, Chad!) showed he burned 55.2 calories during his walks. The fitness monitors all said he burned a lot more than that. The Misfit Flash, for example, reported Chad burned 98 calories during his walking activities – an over-estimation of 77%. The Misfit recorded an identical calorie error rate following Chad's run.
Alexis got similar lousy results. She burned 44.5 calories during her walks, according to the metabolic analyzer. The Fitbit Charge HR claimed she burned 97 calories – a 122% error.
On average, even the best fitness tracker tested for WTHR was off by nearly 30% on calories burned, while three of the devices had an average error rate of 44% or more.
Those dismal numbers for all of the fitness monitors render their calorie information practically useless.
"I don't see a lot of value in it," Montoye said after reviewing the results.
"Don't put your money on it. Don't bet on the calories," agreed graduate student Josh Bock, who's tested dozens of fitness trackers inside the Ball State lab and who recorded all of the data during WTHR's tests. "We see errors all over the place. The calories are pretty much a joke."
The consequences may not be funny at all.
Let me put the calorie error rate into perspective for you.
According to professor Montoye, an average adult burns about 2,000 calories a day. If your fitness tracker overestimates calories by 30%, it's telling you that you burned an extra 600 calories today that you didn't actually burn. That's 18,000 phantom calories -- or roughly 5 pounds, at about 3,500 calories per pound of body fat -- every month!
That means your trusted little activity tracker is trying to convince you that you burned off 33 Big Macs this month when … c'mon! … we all know you really didn't.
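The back-of-the-envelope math above checks out in a few lines. (The 3,500-calories-per-pound and 550-calorie Big Mac figures are common approximations, not lab measurements from this test.)

```python
# Back-of-the-envelope check of the calorie-error math above.
# Assumptions (common approximations, not measured in this test):
CALORIES_PER_POUND = 3500   # rough energy content of a pound of body fat
BIG_MAC_CALORIES = 550      # approximate calories in one Big Mac

daily_burn = 2000           # average adult, per Montoye
error_rate = 0.30           # ~30% overestimate for the best tracker tested

phantom_per_day = daily_burn * error_rate      # 600 phantom calories a day
phantom_per_month = phantom_per_day * 30       # 18,000 phantom calories a month

print(phantom_per_month / CALORIES_PER_POUND)  # ~5 "pounds" a month
print(phantom_per_month / BIG_MAC_CALORIES)    # ~33 Big Macs a month
```

Swap in your own tracker's error rate and the phantom-calorie total scales accordingly – which is the professor's point about small daily errors compounding over time.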
"The fact that it doesn't work well is problematic if people are using these devices to try to measure how many calories they're burning so they know how much to eat," Montoye told 13 Investigates. "In general, I think we need to be really careful of the calorie counts. If someone's off every day in how much they think they should be eating to lose weight or maintain weight, over a period of time, that can translate into huge weight gain or weight loss."
Heart rate: Bordering on dangerous
Not all of the fitness trackers tested for WTHR offer the ability to track heart rate. Only two of the six devices – the Fitbit Charge HR and the Garmin Vivosmart HR – had on-demand heart rate readings that allowed Ball State researchers to test their accuracy. (The Jawbone UP3 measures heart rate, but its readings occur at unpredictable times and, therefore, we could not accurately record a corresponding heart rate with lab equipment.)
The box for the Fitbit Charge HR says "every beat counts." Despite what the package says, the tracking device inside missed lots of them.
For example, when the Fitbit detected Alexis' heart rate at 68 beats per minute, the portable pulse oximeter showed her real heart rate was actually much higher at 91.
The Garmin fitness tracker frequently overcounted heart rate – showing 96 beats per minute when it was really 69.
Calculating a heart rate that's off by 20 or 30 beats per minute can be dangerous -- especially for people at high risk of heart disease.
"That's too high to be acceptable to us," Montoye said. "Heart rate is a measure of exercise intensity. Small changes in intensity can affect the benefit you'll receive, but they also increase your risk associated with the activity. That risk can be very real … so the heart rate has to be accurate."
Unlike steps, distance and calories -- which the professor prefers to see with error rates of less than 10% -- Montoye says the desired error for heart rate should be less than 5% or within 5 beats per minute.
The average heart rate error for the Garmin Vivosmart HR was about 10% and the Fitbit Charge HR averaged an even worse 14%. Falling within 5 beats per minute of the actual reading didn't happen frequently for either one.
The bottom line
Let's summarize, shall we? The six fitness monitors tested for WTHR include well-known brands and easy-to-find models sold at popular retailers and online (if you don't own one already). But the big names didn't necessarily perform better than their lower-profile rivals, and paying big bucks didn't guarantee more accuracy than some of the cheaper options.
As a group, they all tracked steps pretty darn well -- unless you spend the majority of your free time sweeping floors.
Distance tracking was, overall, not bad. But if you want to walk a mile in someone else's shoes, and by a mile you mean approximately 5,280 feet, you'll likely need something more precise than these fitness trackers.
Heart rate was not good enough. Realistically, you probably won't face sudden cardiac arrest triggered by a fitness tracker error. But an "oops" of 20-plus beats per minute on something so important just isn't cool.
And calories? What else can we possibly say about calories? If the image of 5 mysterious pounds of monthly belly fat and 33 Big Macs doesn't stick with you, there's not much more I can say to illustrate the travesty of the calorie error rate for all of the fitness trackers we tested.
"Yeah, I think people have to be a little more careful than to just trust whatever these monitors say," Montoye told us, before we parted ways at the lab.
And yet, despite all their shortcomings – and we've certainly uncovered a few – the professor pointed out any of these fitness trackers can be a great investment – if it motivates you to be more active.
"I like the idea people can track their own activity, see changes over time, see where they are in a day," he said. "If you have a goal and you have a way to measure progress towards that goal, I think that's really beneficial. I think that's where the real value of these monitors is."
The fitness tracker companies fight back. Well, not really.
WTHR contacted Fitbit, Jawbone, Garmin, iFit and Misfit to get their responses and perfectly plausible explanations to help us understand why their products performed so brilliantly in some tests and so horribly in others.
Jawbone and Misfit have, so far, been silent. No reply.
Garmin tells 13 Investigates it is reviewing our request.
Fitbit sent a written reply that makes no attempt to address its woeful test results. In an e-mail, a company spokeswoman wrote:
"Fitbit trackers are designed to provide meaningful data to our users to help them reach their health and fitness goals, and are not intended to be scientific or medical devices. [Sorry to interrupt, Fitbit, but since when does a wristband accelerometer with a built-in heartbeat monitor not qualify as a scientific or medical device?] Overall, the success of Fitbit products comes from empowering people to see their overall health and fitness trends over time — it's these trends that matter most in achieving their goals."
Of all the fitness tracker companies we reached out to, only Utah-based iFit called me to discuss the results. Company director Mark Watterson could not have been nicer while sharing his disappointment about the iFit Vue's 34.7% calorie error rate in our recent test.
"That really surprises me because we take great pride in accuracy," Watterson said. "That's unusual, and we want to look into that more to understand how that happened. Our goal is to get that more accurate."
He also gave us a fascinating scoop on why his company's product had a whopping 95% error rate when trying to calculate steps during the sweeping activities. The iFit Vue detected only 18 of Alexis and Chad's 358 steps as they swept the floor in the lab.
OK, get this… Watterson says that's exactly how the fitness tracker is designed and how he wants it.
"Sweeping isn't going to help you achieve your fitness goals, so we made a conscious decision in our algorithm to tune out motions that aren't going to help you get healthy," he explained. "We've spent years developing our algorithms to detect walking and running – and to tune out other motions – because we don't want to give people false positives for things that are not going to get them healthy. When we talk about 10,000 steps, we're not talking about getting in 10,000 steps while sweeping. That won't get it done."
Then it got even better, as the iFit executive shed light on what he describes as the very different philosophy of his biggest competitor. (He wouldn't mention any names, but I suspect the company he's referring to might rhyme with Zitbit.)
"When we tested the leading brand, they gave us over 100 steps for eating cereal. We were getting 1,000 steps by shaving, brushing our teeth, doing things like combing our hair. Creating lots of false positives, false impressions that I'm going to buy one of these and all of a sudden I'm going to be fit and healthy because I'm getting to 10,000 steps [per day] so quickly and easily, that's not what we do. Others do, but we don't set up our algorithm that way."
So he's saying the iFit wasn't simply missing steps during the sweeping test. The fitness tracker was ignoring them because, according to the company's director, it wants to – for your own good. And it was very accurate in counting steps during jogging and walking.
Food for thought as you're pondering all those previously-mentioned Big Macs.
By the way, if any of the other companies provide information or reply to our questions, we'll pass along their insight and comments in this very spot.
[Table: % error for steps, calories and distance, by fitness tracker model]
[Table: % error for heart rate, by fitness tracker model]
NA: Fitness tracker model doesn't calculate heart rate.
NA*: Fitness tracker model does not offer an on-demand heart rate reading and, therefore, heart rate could not be verified during testing.