Current EHR de-identified data policies can affect decades of jealously guarded patient medical data
The strict confidentiality to which medical service providers are subject has always been fundamental to patient health records, and this data is jealously guarded in what is commonly called the patient’s paper chart or medical history. With the universal emergence of electronic health records (EHRs), defined as “a digital version of a patient’s paper chart. EHRs are real-time, patient-centered records that make information available instantly and securely to authorized users”, several measures have been implemented to maintain this confidentiality. Nevertheless, in 2019, we published an article about Google’s “Project Nightingale”, in which we discussed the massive violation of patients’ right to data privacy. The news “caused great distrust in the American health system…” (Read the full article HERE)
Now, a recent article published in the American journal NEJM (June 5, 2021), written by Kenneth Mandl, director of the computational health informatics program at Boston Children’s Hospital, and Eric Perakslis, who has similar experience in the field, reports that “The permissible sharing of patient data among health care organizations and their business associates for treatment, payment, and operations purposes has led to a torrent of electronic health records (EHR) data flowing out of health care provider silos”.
What financial interest does access to these records hold for third parties? The files with detailed patient data show the patient’s illness, how they were treated, and the results obtained with different treatments and care. Taken as a whole, this data is a valuable resource for medical research. With EHRs now widespread and the current practice of selling digital data, certain provisions of the Health Insurance Portability and Accountability Act (HIPAA) allow health care organizations to give or sell this data to research partners after removing 18 specific identifiers (age, phone number, sex, names, ethnicity, civil status and so on). They are then allowed to use this “de-identified” data without patient consent, or even without informing patients at all.
In this respect, the aforementioned article says “Organizations that don’t qualify as business associates under HIPAA may also gain access to and use de-identified data sets. Such policies have enabled the rise of a multibillion-dollar industry comprising dozens of health-data aggregation companies and hundreds more companies producing tools and technologies that aggregate, link and monetize EHR data […] It is ironic that although patients (and their physicians) still have difficulty obtaining complete medical record information in a timely fashion, the HIPAA Privacy Rule permits massive troves of patients’ digital health data to traverse the medical-industrial complex unmonitored and unregulated”. The rule is founded on the principle of serving the patient’s interest and the public benefit of supporting medical research.
The authors go on to say that assembling vast data sets can help support the public good, but markets for the secondary use of patient data do not always follow these principles. “For example, a data-aggregation company might target physicians and patients for pharmaceutical detailing, which could drive up drug prices and result in overprescribing”. They also highlight the real risk that the personal data of de-identified patients could be recovered using digital means. This possibility alone should require the informed consent of each patient.
We transcribe the authors’ qualified opinion on this particular issue, which has clear bioethical implications.
“Moreover, although the deidentification process is often treated as infallible, it is not. Nor is a particular method required for monitoring the success of deidentification efforts. Even after many deidentification-related processes, individual patients can potentially be reidentified on the basis of only a handful of attributes (read the study cited HERE). Deidentification technologies relying on encryption could be vulnerable to future advances in computing. In the absence of contractual controls governing data produced by a covered entity and shared with a business associate, if something goes wrong, only patients are harmed; the United States doesn’t have a comprehensive data privacy law, and none of the various privacy-related laws or regulations protects patients from the potentially harmful use of deidentified data. There is no duty to report instances in which data have been reidentified or linked to external data sources, such as financial records, and patients have little or no opportunity for redress in cases of reidentification (read the study cited HERE)”.
The article continues, reporting that research involving de-identified data is generally conducted without institutional review board oversight.
EHR de-identified data policies can put the credibility of the health care system in danger
The authors end this interesting article by suggesting some technical measures to avoid the aforementioned risks for patients and by proposing an update to the HIPAA Privacy Rule, which was drafted before the current development of digital medicine and EHRs.
From the perspective of personalist bioethics, the current and future problems detailed in the article, which threaten patients’ right to privacy, must be prevented through adequate patient consent. The US Administration must change its policies to protect patient privacy in such a delicate area.