If health care reform depends on patient engagement and the mining of public health data, it depends equally on protecting the patient’s privacy. Moreover, real-life stories from victimized patients show that privacy is caught up with issues of security, clinical decision-making, mobile health, and medical errors. After the patient access summit and the health data initiative forum, therefore, it was supremely appropriate for me to attend the second annual health privacy summit, which I helped to organize.
Joy Pritts and others on panel.
The conference this year offered even more detail and nuance than the one I reported on last year. Last year’s summit put a valuable stake in the ground by acknowledging the importance of privacy in health policy, and this year we took off from that point. Two leading members of the Office of the National Coordinator at the Department of Health and Human Services came to speak–National Coordinator Farzad Mostashari and Chief Privacy Officer Joy Pritts–and Patient Privacy Rights, the conference organizers, created a new Louis D. Brandeis privacy award that was accepted by Congressmen Joe Barton and Ed Markey, world-renowned security expert Ross Anderson, and long-term privacy researcher Alan Westin.
About 150 people came to the conference, which took place Wednesday and Thursday last week. Hundreds more followed webcasts live, and these will be posted online.
Scope of the privacy debate
The health care field is divided between those who think privacy is pretty good already and should not suck up resources that could go into other reforms, and those who insist on reviewing all changes to practices and technology. The latter sometimes say that it need not be a “zero-sum game” (in fact, Mostashari stated that in his keynote). On the contrary, they suggest that a patient’s trust in privacy protection is actually a prerequisite to data sharing and good medical care, because a patient will just keep embarrassing information secret if she is afraid it will fall into the wrong hands.
The debate can get complicated because it involves laws that have changed over time and vary from state to state, common practices that undermine stated commitments to following the law (such as taking data home on unencrypted laptops), ignorance on many sides, and bad actors who are not dissuaded by even the best regulations and institutional practices. Because the debate was covered in my article from last year’s conference, I’ll just update it to say that more speakers this year affirmed a tension between privacy and the kind of data sharing needed to improve patient care. I heard several statements along the lines of one by Ann Freeman Cook, a psychology professor and ethics researcher, who found that IRBs struggle, often without success, to reconcile patient privacy with the needs of researchers and the public.
A number of heart-rending stories from patients were shared at the beginning of the summit. If one examined them carefully, one could cavil over whether each story really represented a privacy breach. Some of the stories were more about errors or about poorly recorded decisions (often in EHRs that were too rigid to accurately represent patient complaints). And the privacy breaches were sometimes just bad luck–more the result of a malicious actor bypassing safeguards than a lack of safeguards.
Nevertheless, I accepted that all of them fell under the umbrella of “privacy protections.” Privacy is about the right of the patient to control his data, and it involves all these things. So the topics at this conference are relevant to all the issues health care advocates talk about regularly: data exchange and ACOs, clinical research, the use of apps on mobile devices, the Quantified Self movement, and social networking in patient empowerment.
Here are some of the interesting topics mentioned at the conference.
Leading privacy researcher Latanya Sweeney showed off her Data Map, which charts all the places patient data gets sent in the normal course of treatment, payment, public health, and research. She is soliciting suggestions for additions to the map.
Built-in privacy: Mostashari pointed out that a concern for privacy led the group designing the Direct project to make sure that the middleman routing data never knows who is sending or receiving. Identities are buried in the encrypted body of the message.
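The design principle can be sketched in a few lines: the sender and recipient identities travel inside the encrypted payload, so the relay sees only opaque bytes. This is a toy illustration, not the Direct protocol itself (Direct uses S/MIME over secure transport); the XOR "cipher" below is a labeled stand-in for real encryption, and the addresses are invented.

```python
import hashlib
import json

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stand-in for real encryption (Direct actually uses S/MIME).
    # XORs data against a SHA-256-derived keystream; the same call
    # both encrypts and decrypts.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

def seal(sender: str, recipient: str, body: str, key: bytes) -> dict:
    # Identities go *inside* the encrypted payload; the outer envelope
    # that the relay handles carries only ciphertext.
    inner = json.dumps({"from": sender, "to": recipient, "body": body}).encode()
    return {"payload": keystream_xor(key, inner)}

def open_envelope(envelope: dict, key: bytes) -> dict:
    return json.loads(keystream_xor(key, envelope["payload"]))

key = b"shared-secret"
env = seal("alice@clinic.example", "bob@lab.example", "Lab results attached", key)
# The relay never sees who is communicating:
assert b"alice" not in env["payload"]
```

The point is architectural rather than cryptographic: whatever cipher is used, routing metadata that stays outside the encrypted body is visible to every intermediary, and Direct's designers chose to leave nothing identifying there.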
Ross Anderson delivers keynote.
Security expert Ross Anderson, who has studied health care systems all over Europe, suggested a number of measures to protect patient privacy. Some are standard security measures: keep information scattered in different repositories (this would mandate HIEs in the US that query doctors for information instead of uploading it to their own servers); don’t give central authorities automatic access to data; use role-based access (but that’s hard to do properly). Another safeguard is to let the patients audit their own data. Anderson pointed out that longitudinal data–which researchers value highly–is impossible to de-identify because there is too much data snoopers can use to link the data with other sources about the patient. He also said problems arise when the government tries to move fast and throws a lot of money at a problem, which sounds uncomfortably like the meaningful use payments.
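Two of Anderson's safeguards, role-based access and letting patients audit their own data, fit naturally together, since every access decision can be logged where the patient can review it. Here is a minimal sketch; the role names, permission sets, and record fields are all invented for illustration, and real role-based access (as Anderson noted) is much harder to get right.

```python
# Hypothetical role-to-permission mapping; a real system would need far
# finer-grained rules, which is part of why RBAC is hard to do properly.
ROLE_PERMISSIONS = {
    "treating_physician": {"demographics", "medications", "psych_notes"},
    "billing_clerk": {"demographics"},
    "researcher": {"medications"},
}

audit_log = []  # a patient-visible trail: who asked for what, and whether it was allowed

def read_field(user: str, role: str, patient_id: str, field: str) -> str:
    allowed = field in ROLE_PERMISSIONS.get(role, set())
    # Log every attempt, including denials, so the patient can audit it.
    audit_log.append((user, role, patient_id, field, allowed))
    if not allowed:
        raise PermissionError(f"{role} may not read {field}")
    return f"<{field} of {patient_id}>"

note = read_field("dr_smith", "treating_physician", "p1", "psych_notes")
```

A billing clerk requesting the same field would raise PermissionError, and both the grant and the denial would appear in the patient's audit trail.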
Three companies were chosen for the best health privacy technologies of 2012:
Trend Micro wins technology award.
Jericho Systems captures patient consents and translates them into technological controls. A patient can see in his PHR who is making a request for his data, for instance.
Trend Micro’s Deep Security incorporates the standard security protections for a networked environment (virus scanner, firewall, file integrity checker, etc.) into a cloud solution. Thus, even if the server is breached, the system may be able to prevent data from being extracted.
Segmented data, which means the ability to share certain specific information while hiding other, more sensitive information, came up several times. The field is nowhere near ready, technically or organizationally, to support something like sharing information about your broken arm while hiding your psychiatric records. But several institutions are working on standards.
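In miniature, segmentation amounts to tagging each entry in a record with a sensitivity category and filtering on consent before sharing. The categories and entries below are invented for illustration; real standards efforts define coded vocabularies and have to handle the much harder problem of sensitive information leaking through supposedly innocuous entries (a medication list can reveal a psychiatric diagnosis).

```python
# Hypothetical sensitivity categories; real segmentation standards
# use formal coded vocabularies rather than bare strings.
SENSITIVE_CATEGORIES = {"psychiatry", "substance_abuse", "genetics"}

record = [
    {"category": "orthopedics", "entry": "Fractured left radius, cast applied"},
    {"category": "psychiatry", "entry": "Started sertraline 50mg"},
]

def segment(entries: list, consented: set) -> list:
    # Sensitive categories are hidden by default and shared only with
    # explicit consent; everything else flows through.
    return [e for e in entries
            if e["category"] in consented
            or e["category"] not in SENSITIVE_CATEGORIES]

shared = segment(record, consented=set())   # broken arm yes, psych notes no
```

Even this toy version shows why the organizational side is as hard as the technical side: someone has to decide which categories are sensitive, and every system touching the record has to honor the tags.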
Several panelists called for privacy by default: it isn’t fair to present a complex document to a patient and expect her to understand all the implications (which no one can do anyway). Maneesha Mithal reported a policy at the Federal Trade Commission that the most important privacy impacts must be highlighted, not buried in an inscrutable policy. Information technology researcher Andrew Dillon suggested that, instead of educating patients about the awful forms they sign, we should improve the forms (and by implication, the policies they define).
A couple doctors spoke up to say that they felt uneasy entering information into records (particularly psychiatric information) because they didn’t know who would end up seeing it.
A lot of discussion covered who should explain privacy policies to the patient. Handing them a form at the start of a visit is not an effective way to get meaningful consent. Some said the doctor herself should ideally explain the privacy implications of the visit, although this eats into the severely restricted time that the doctor has with the patient.
Two speakers–EPIC representative Lillie Coney and re-identification expert Daniel Barth-Jones–reported that, luckily, it’s quite hard to re-identify patient data that has been de-identified for the purposes of research and public health. Barth-Jones doubted that anyone has performed any actual re-identifications, other than researchers proving that re-identification is theoretically possible.
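The "theoretically possible" re-identifications Barth-Jones referred to typically work by linking a de-identified dataset to a public one on quasi-identifiers such as ZIP code, birth date, and sex, the combination Sweeney famously showed to be nearly unique for most Americans. This toy sketch, with entirely fabricated data, shows the mechanics of such a linkage attack:

```python
# All records here are fabricated for illustration.
deidentified = [  # "anonymous" medical rows, names stripped
    {"zip": "02138", "dob": "1945-07-21", "sex": "F", "dx": "hypertension"},
    {"zip": "73301", "dob": "1980-01-02", "sex": "M", "dx": "asthma"},
]
voter_roll = [  # a public dataset that still carries names
    {"name": "J. Doe", "zip": "02138", "dob": "1945-07-21", "sex": "F"},
]

def link(medical: list, public: list) -> list:
    # Join the two datasets on the quasi-identifiers; a unique match
    # re-attaches a name to a "de-identified" diagnosis.
    matches = []
    for m in medical:
        candidates = [p for p in public
                      if (p["zip"], p["dob"], p["sex"]) ==
                         (m["zip"], m["dob"], m["sex"])]
        if len(candidates) == 1:
            matches.append((candidates[0]["name"], m["dx"]))
    return matches

hits = link(deidentified, voter_roll)
```

Barth-Jones's point was that, outside demonstrations like this one, such attacks on properly de-identified data appear to be rare in practice; Anderson's earlier point was that rich longitudinal data offers so many quasi-identifiers that the join becomes easy.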
Ann Freeman Cook pointed out that people often agree to share data, tissues, and other samples with researchers in order to get free care. Therefore, the poor and uninsured are more likely to relinquish privacy safeguards. And these samples are kept for a long time, so it’s impossible to know how they’ll be used.
The ONC’s Standards & Interoperability Framework got contrasting reviews. On the one hand, it is hard to understand because it refers to so many technologies and standards. On the other hand, these references root it firmly in state-of-the-art practices and make implementation feasible.
Last week’s series of conferences in Washington–of which I attended maybe half–was the most intense concentration of health care events I’ve seen. A few people got to bounce around and experience everything, and that elite tends to be the only group that puts in the research to really understand all the facets of patient engagement, data sharing, application development, business opportunities, privacy issues, and the points of leverage for institutional change that will really improve our health care system and lower costs. I think that most providers, administrators, and researchers stumble along with good intentions but without a full vision.
We can fix our health care systems if we educate doctors and patients to work together; create teams that have incentives to deliver the best care; open up data about the health care industry; incorporate low-cost devices into patient-centered medical homes, and incorporate the best research into clinical decision support. I’m sure readers could suggest other related elements of a solution. A crucial background role will be played by technological improvements and standards. All this is extremely hard to explain in a single coherent vision, although numerous books about radical reform to the health care system have come out over the past couple years. Those with expertise in a particular area of technology or organizational development must do their best to educate themselves with the wider vision, and then act locally to make it happen.