PRIM&R Member Newsletter 

May 25, 2023


Article Review: Does Deidentification of Data from Wearable Devices Give Us a False Sense of Security?

Editor's Note: We rely on experts in the PRIM&R community to provide insight and commentary on substantial issues and changes within the research ethics and oversight field. If you would like to volunteer to be on our source list for future articles, please fill out this form.

Wearable digital devices, which measure a broad range of human activities including steps, heart rate, breathing, pulse, and brain activity, are part of a rapidly growing market of tools that collect massive amounts of personal health data. The results of a systematic review recently published in the April 2023 edition of The Lancet, “Does deidentification of data from wearable devices give us a false sense of security?,” indicate that merely deidentifying or anonymizing digital data from wearable devices and sensors is inadequate to protect the privacy of individuals whose data are included in the datasets. One of the review’s major findings was that recordings of extremely short duration, between one and 300 seconds, were sufficient to enable reidentification.

“This discovery is concerning since publicly available data is becoming increasingly abundant, especially given data sharing advocacy and policy by influential bodies, such as the FDA and NIH,” the article says.

The NIH has adopted policies encouraging extensive data-sharing practices, including, most recently, the NIH-wide Data Management and Sharing Policy, which went into effect in January of this year. But, according to The Lancet authors, “Although data sharing provides tremendous benefits, it also poses many crucial questions around privacy risks to patients and study participants that remain unanswered.”

“For example, could machine-learning algorithms be applied to public datasets or data shared through third-party data-sharing agreements to enable reidentification? Is there an opportunity for data misuse by governments, corporations, or individuals? If so, how significant is this risk, and is there a way to mitigate it?”

The authors go on to say, “Advances in machine learning have made it possible to infer sensitive information about individuals, such as their medical diagnoses, mental health, personality traits, and emotions, thus making it possible to learn information that an individual has not directly shared.”

PRIM&R reached out to experts throughout the country to get their take on The Lancet article.

“Once again, we are reminded that de-identification isn’t a magic wand that makes a study ethically responsible. The participants who provide data must retain some sense of control over the process. They need to see the research as legitimate and beneficial,” said Jonathan Herington, PhD, Assistant Professor of Philosophy and Health Humanities and Bioethics at the University of Rochester, when asked for his thoughts on the issues raised in the article.

The wearable device industry is expected to more than double in size over the next three years, raising substantial questions about how these data should be collected, stored, and used.

The Road Ahead

Megan Doerr, MS, LGC, is the Director of Applied Ethical, Legal, and Social Research at Sage Bionetworks, a Seattle-based nonprofit organization that develops, builds, and shares tools to conduct dynamic, large-scale, collaborative biomedical research. Doerr sat down with PRIM&R earlier this month to discuss the article. She was not surprised by the findings and says that we, as a research community, have ritualized a “theater of anonymity” around participant data, the futility of which this study clearly communicates.
