The Risks to Patient Privacy from Publishing Data from Clinical Anesthesia Studies

Anesthesia & Analgesia: June 2016 – Volume 122 – Issue 6 – p 2017–2027

Authors: O’Neill, Liam PhD, et al.

From the *Department of Health Management and Policy, School of Public Health, University of North Texas–Health Science Center, Fort Worth, Texas; Division of Management Consulting, Department of Anesthesia, University of Iowa, Iowa City, Iowa; and Department of Computer Science, George Washington University, Washington, DC.

Accepted for publication February 24, 2016.

Funding: National Science Foundation, grant no. 1064628 and 1343976.

The authors declare no conflicts of interest.

Reprints will not be available from the authors.

Address correspondence to Liam O’Neill, PhD, School of Public Health, UNT-HSC, 3500 Camp Bowie Blvd., Fort Worth, TX 76107. Address e-mail to liam.oneill@unthsc.edu.

Abstract

In this article, we consider the privacy implications of posting data from small, randomized trials, observational studies, or case series in anesthesia from a few (e.g., 1–3) hospitals. Before publishing such data as supplemental digital content, the authors remove attributes that could be used to re-identify individuals, a process known as “anonymization.” Posting health information that has been properly “de-identified” is assumed to pose no risks to patient privacy. Yet, computer scientists have demonstrated that this assumption is flawed. We consider various realistic scenarios of how the publication of such data could lead to breaches of patient privacy. Several examples of successful privacy attacks are reviewed, as well as the methods used. We survey the latest models and methods from computer science for protecting health information and their application to posting data from small anesthesia studies. To illustrate the vulnerability of such published data, we calculate the “population uniqueness” for patients undergoing one or more surgical procedures using data from the State of Texas. For a patient selected uniformly at random, the probability that an adversary could match this patient’s record to a unique record in the state external database was 42.8% (SE < 0.1%). Although 42.8% is already an unacceptably high level of risk, it underestimates the risk for patients from smaller states or provinces. We propose an editorial policy that greatly reduces the likelihood of a privacy breach, while supporting the goal of transparency of the research process.
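The “population uniqueness” measure discussed in the abstract can be illustrated with a short sketch: group records by their combination of quasi-identifiers (attributes such as age band, sex, and procedure code that could be matched against an external database) and compute the fraction of records whose combination occurs exactly once. The field names and toy data below are hypothetical, not taken from the study.

```python
from collections import Counter

def population_uniqueness(records, quasi_identifiers):
    """Return the fraction of records whose quasi-identifier
    combination is unique within the dataset (i.e., re-identifiable
    by exact matching against an external database)."""
    keys = [tuple(r[q] for q in quasi_identifiers) for r in records]
    counts = Counter(keys)
    unique = sum(1 for k in keys if counts[k] == 1)
    return unique / len(keys)

# Hypothetical toy records; real quasi-identifiers would come from
# a discharge database such as the Texas inpatient public use file.
records = [
    {"age": "60-64", "sex": "F", "proc": "47.09"},
    {"age": "60-64", "sex": "F", "proc": "47.09"},
    {"age": "35-39", "sex": "M", "proc": "81.54"},
    {"age": "70-74", "sex": "F", "proc": "36.15"},
]
print(population_uniqueness(records, ["age", "sex", "proc"]))  # 0.5
```

Here two of the four records share a quasi-identifier combination and are therefore not unique, while the other two are, giving a uniqueness of 0.5; the 42.8% figure reported in the abstract is the analogous proportion computed over the Texas statewide data.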
