Self-tracking using a wearable device can be fascinating. It can drive you to exercise more, make you reflect on how much (or little) you sleep, and help you detect patterns in your mood over time. But something else is happening when you use a wearable device, something that is less immediately apparent: You are no longer the only source of data about yourself. The data you unconsciously produce by going about your day is accumulating over time in the hands of one or more entities. And now it could be used against you in court.

The first known court case using Fitbit activity data is underway. A law firm in Canada is using a client’s Fitbit history in a personal injury claim. The plaintiff was injured four years ago when she was a personal trainer, and her lawyers now want to use her Fitbit data to show that her activity levels remain lower than the baseline for someone of her age and profession, and that she therefore deserves compensation.

As an additional twist, it is not the raw Fitbit data that will be used in the courtroom. The lawyers are relying on an analytics company called Vivametrica, which compares individual data to the general population by using “industry and public research.” Vivametrica claims that they “define standards for how data is managed, bringing order to the chaos of the wearable.” In other words, they specialize in taking a single person’s data, and comparing it to the vast banks of data collected by Fitbits, to see if that person is above or below average.

Vivametrica says that they are doing more than just enabling consumers to get access to their own data. They are also working with wearable tech companies and healthcare providers, and seeking to “reimagine employee health and wellness programs.” But what happens when the interests of individuals who want to monitor data about their own bodies conflict with those of employers, wearable manufacturers, healthcare providers, and now the law?

Vivametrica isn’t the only company vying for control of the fitness data space. There is considerable power in becoming the default standard-setter for health metrics. Any company that becomes the go-to data analysis group for brands like Fitbit and Jawbone stands to make a lot of money. But setting standards isn’t as simple as it may seem.

Medical research on the relationship between exercise, sleep, diet, and health is moving extremely rapidly. The decisions about what is “normal” and “healthy” that these companies come to depend on which research they’re using. Who is defining what constitutes the “average” healthy person? This contextual information isn’t generally visible. Analytics companies aren’t required to reveal which data sets they are using and how they are being analyzed.

The current lawsuit is an example of Fitbit data being used to support a plaintiff in an injury case, but wearables data could just as easily be used by insurers to deny disability claims, or by prosecutors seeking a rich source of self-incriminating evidence. As the CEO of Vivametrica, Dr. Rich Hu, told Forbes, insurers can’t force claimants to wear Fitbits. But they can request a court order from anyone who stores wearable data to release it. Will it change people’s relationship to their wearable device when they know that it can be an informant? These devices can give their own interpretation of your daily activity, sleep, and moods, and that analysis may be seen to carry more evidentiary weight than the owner’s experience.

The law provides very few answers to these questions. In America, the Fifth Amendment protects the right against self-incrimination and the Sixth Amendment provides the right in criminal prosecutions “to be confronted with the witnesses” against you. Canadian courts have similar safeguards. Yet with wearables, who is the witness? The device? Your body? The service provider? Or the analytics algorithm operated by a third party? It’s unclear how courts will handle the possibility of quantified self-incrimination.

This becomes significantly more complex considering the variability of data with wearable trackers. The Jawbone UP, Nike Fuelband, Fitbit, and Withings Pulse all have their own peculiarities in how they work: Some will count moving your arms around as walking (which is great if you want writing to count as exercise), others can’t easily register cycling as activity. The sleep-tracking functions deploy relatively crude methods to determine the division between light and deep sleep. This “chaos of the wearable” might be merely amusing or frustrating when you’re using the data to reflect on your own life. But it can be perilous when that data is used to represent objective truth for insurers or courtrooms. And now that data is being further abstracted by analytics companies that create proprietary algorithms to analyze it and map it against their particular standard of the “normal” healthy person.

At one level, this shouldn’t really surprise us. The legal system already draws on a range of technological self-tracking devices as forms of evidence. GPS devices and apps for tracking bike rides like Strava have been used in court proceedings around cycling accidents, and of course, there are multiple forms of remote tracking used by the police, like Automatic License Plate Readers (ALPR). The difference is that wearable devices are elective: people choose to wear them. And when they make that choice, they are effectively splitting their daily record into two streams: experience and data. These may converge or diverge for reasons to do with the fallibility of human memory, or the fallibility of data-tracking systems.

This shared fallibility is what courtrooms should keep in mind. Courts have experience with this. They know that eyewitnesses can’t always be trusted, even if they were there to witness the crime. They understand that doctors and other witnesses have expertise, but they aren’t all-knowing beings. There are expert witnesses for each side, and judges and juries can consider the general range of human bias and inaccuracy. When large data sets are brought to bear, they should be treated the same way.

Ultimately, the Fitbit case may be just one step in a much bigger shift toward a data-driven regime of “truth.” Prioritizing data—irregular, unreliable data—over human reporting means putting power in the hands of an algorithm. These systems are imperfect—just as human judgments can be—and it will be increasingly important for people to be able to see behind the curtain rather than accept device data as irrefutable courtroom evidence. In the meantime, users should think of wearables as partial witnesses, ones that carry their own affordances and biases.