Not so fast: assessing achievements and barriers at a Massachusetts Health IT conference

The state of electronic health records and electronic health data exchange offers plenty to celebrate along with plenty to make one grind one’s teeth (not medically recommended). Both the bright lights of success and the mire of gridlock were held up for examination this week at the conference Health Information Technology Improving Healthcare and the Economy in Worcester, Massachusetts. The contrasts were so great that I felt as if I were attending two conferences jammed together, one ceremonially congratulatory and the other riddled with anger but also determined to make things better. I think most of the press attended only the first conference, but I was present for the whole thing and will try to integrate the insights I got into a view of health IT in the United States today.

This conference was a follow-on to last year’s Governors Conference, which I covered at the time. The first half of this year’s conference, like last year’s, dazzled the audience with a cornucopia of distinguished keynoters: Governor Deval Patrick himself; David Blumenthal, who has returned to Massachusetts after serving as National Coordinator for Health Information Technology at the Health and Human Services Department; JudyAnn Bigby, Secretary of the Massachusetts Executive Office of Health and Human Services; and Sachin Jain of the Centers for Medicare and Medicaid Services (CMS). An impressive 400 people turned up even though the conference was held an hour outside Boston (granted, Worcester itself is something of a medical center and home to the University of Massachusetts Medical School). But while the DCU Center is a pleasant and serviceable enough conference space, it did not achieve the pomp of the first conference in the series.

I can’t count how many times the speakers reminded us of the progress in universal coverage in Massachusetts. It bears repeating, however, because it is quite remarkable: 98% of the population covered with some form of insurance, including 99.8% of the state’s children. These figures cannot be matched by any other state, and are just one sign of the leading role Massachusetts has played in US health care. I reported last month on other elements of Massachusetts’ success. And the Boston Globe just reported that a coalition has been launched to find and sign up the rest of the children.

Governor Deval Patrick and JudyAnn Bigby at health care conference

While sticking to the theme of acknowledging success, Blumenthal delivered a fact-packed and deep keynote that laid out both how far we have come since the passage of the stimulus bill that started the current reform (long before the better-known and controversial Affordable Care Act) and how much is left for us to accomplish. He boldly claimed that the dominant EHR of the future will be “different from what we have today.”

David Blumenthal at health care conference

Blumenthal acknowledged that the government has stirred the pot, opening a period of uncertainty and costly experimentation in the field of EHRs. But he assured us this is a good thing, saying that we have “revitalized a relatively sleepy side of American technology” that was “slow to innovate.” Already, over 700 products and modules have been certified for meaningful use, most of them produced by companies with fewer than 15 employees.

He also suggested that ultimately there would be a shake-out and consolidation around three or four products. I silently cheered when I heard this, because consolidation is a stage on the way to commodification, and commodification in the field of software is a boost to open source. As I suggested over a year ago, free software possesses many traits qualifying it as a natural cure for the current ills of the EHR industry. And it’s a cure many profitable companies can participate in.

Jain touted the CMS Innovation Center, which not only solicits research on lowering health care costs, but can quickly spread findings throughout the country by requiring changes of Medicaid and Medicare providers. He pointed out the need for sophisticated tracking: providers often claim to have reduced costs when they merely shift them to another provider. I’m not convinced that the Innovation Center will be the driving force Jain thinks it is, but it is a welcome island of disruption in Blumenthal’s “sleepy” technology.

How deep and lasting is the progress in meaningful use?

Meaningful use–a very young term, coined by the HITECH Act as part of the stimulus package–is one of those many-faceted concepts that can be defined extremely narrowly or quite broadly. Like an Impressionist painting, it appears completely different when viewed up close versus at a distance.

In its most narrow form, meaningful use is laid out by documents from CMS. For instance, Stage 1 defines 15 things a medical practice should be able to do, and requires each practice to demonstrate a certain number of these things in order to receive government payments (for instance, placing a certain number of pharmacy orders electronically). Stage 2 will add requirements, and Stage 3 still more.

Already the narrowness of this definition creates problems for doctors attracted by the promise of payments for adopting electronic health records. One audience member pointed out that an EHR vendor may implement a subset of meaningful use requirements in order to allow a doctor to get Stage 1 payments. But a specialist might not do all the things that the vendor has implemented. This specialist might need other, unimplemented requirements in order to earn a payment. In effect, the specialist has bought a system that both the vendor and the government (indirectly, through the certification process) have promised will make her eligible for a payment, only to find herself cheated.
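To make the arithmetic behind that complaint concrete, here is a schematic sketch; the objective names and the required count are invented for illustration and do not come from the actual Stage 1 rules.

```python
# Schematic illustration (invented objective names and counts) of the mismatch
# described above: certification guarantees only that the vendor implemented
# enough objectives for some practice, not for this particular specialist.
REQUIRED = 3  # hypothetical number of objectives a practice must demonstrate

vendor_implements = {"e-prescribing", "problem list", "smoking status", "vital signs"}
specialist_can_use = {"problem list", "clinical summaries", "lab results", "smoking status"}

# The specialist can attest only to objectives that are both implemented by
# the product and applicable to her practice.
attestable = vendor_implements & specialist_can_use
print(attestable)                   # {'problem list', 'smoking status'}
print(len(attestable) >= REQUIRED)  # False: a certified product, but no payment
```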

This, and other gripes, flowed freely at a workshop led by two representatives from the Office of the National Coordinator, Fadesola Adetosoye and Jim Daniel. Adetosoye placated the audience with the suggestion that lapses in EHRs were caused by vendor ignorance of doctors’ needs, not a malicious strategy. She recommended meetings between doctors and vendors, and suggested that doctors band together to present a more impressive front. Daniel said HHS has organized forums of vendors and doctors to cut down on miscommunication during the design of EHRs.

Meaningful use at its broadest is a handle for all the advances in delivering health care that are supposed to eliminate the 100,000 unnecessary hospital deaths each year, reduce the 10% or 20% jumps in annual insurance costs to the level of regular cost-of-living increases, and bring the doctor into the home of the patient. Once we raise our eyes to this horizon, we see more barriers along the way, many embedded in EHRs.

No one delivered stronger blows to current EHR vendors than the chair of the non-profit that has certified most of them over the years, the Certification Commission for Health Information Technology. Karen Bell presented a modest list of transactions and activities that an EHR should permit–coordinated care, integration of functionalities, communication between different parts of the EHR–and warned the audience that certification means only that an EHR can do the handful of things required for Stage 1 meaningful use payments. Some EHRs include the more sophisticated features that facilitate cost savings and improvements of care, but many do not. Her overall message was “buyer beware.” I found her presentation courageous, and it greatly enhanced my impression of CCHIT.

The stimulus to better and cheaper care: data sharing

Most public health research in the United States today uses administrative data that is easy to get, because it has been collected for years for critical purposes unrelated to research (mostly to bill insurers, including Medicare and Medicaid). Starting as a combined effort among employers to cut costs by tracking diseases across a large population, the collection of such administrative data has gradually evolved into a government function. The source of most research data now is each state’s All-Payer Claims Database (APCD). The federal government will probably begin its own national project, either starting from scratch or trying to combine and harmonize the data from the states. Jo Porter of the University of New Hampshire explored different uses for administrative data in a workshop on secondary data use.

Workshop on secondary use of data

As one audience member pointed out, “Claims data is a proxy, not a real measure.” For instance, the data can tell us whether a patient took a lab test, but not what the results were. Meaningful Use begins the process of collecting clinical data with baby steps, such as reporting the number of smokers at each medical practice. But still, a lot of useful things turn up with administrative data. Even a logistical question such as how far patients travel to get care has clinical implications.

Data from private health insurers cannot always produce accurate results when combined with Medicare and Medicaid data, because the populations are so different. And when it does make sense to combine them, the resulting statistics still leave out the uninsured, who frequent community treatment centers. Porter said that Maine gives identity cards to the uninsured for use when they visit these centers, so that data on their treatment can be factored into the statistics in that state. The Department of Veterans Affairs has indicated that it would like to add its considerable patient data to the statistics too. The patient-centered medical home also has the potential to generate enormous amounts of useful research data.

In addition to Porter’s examples of data aggregation, I saw some interesting displays at the booth of JEN Associates, who provide tools to CMS as well as private organizations.

Another problem in trying to extract long-term information from administrative data is tracking a patient as she moves from one insurance provider to another. Some states collect more identifying data than others. For a long time, providers simply identified each patient by her social security number. This was not considered a privacy risk because the number was encrypted before being shared. Because each insurer used the same encryption algorithm, a patient could be tracked, without being identified, as she moved from one insurer to another.
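For the curious, here is a minimal sketch of how that kind of linkage works, assuming a keyed hash stands in for whatever encryption the insurers actually applied; the key, identifiers, and claim counts are invented for illustration.

```python
import hashlib
import hmac

# Hypothetical shared key. In the arrangement described above, every insurer
# applies the same transformation, so the same patient yields the same token
# in every dataset without the raw number being shared.
SHARED_KEY = b"example-key-distributed-out-of-band"

def pseudonymize(ssn: str) -> str:
    """Return a stable, hard-to-reverse token for an identifier such as an SSN."""
    return hmac.new(SHARED_KEY, ssn.encode(), hashlib.sha256).hexdigest()

# Two insurers' claim files, each keyed by the pseudonymous token (invented data).
insurer_a = {pseudonymize("123-45-6789"): {"year": 2009, "claims": 12}}
insurer_b = {pseudonymize("123-45-6789"): {"year": 2010, "claims": 7}}

# A researcher can follow the same patient across insurers without ever
# handling the underlying social security number.
print(len(insurer_a.keys() & insurer_b.keys()))  # 1
```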

Given the well-known problems with using social security numbers as universal identifiers, however, insurers are moving away from them. We are famously a country opposed to universal identifiers and ID cards. So as each insurer adopts its own way of identifying patients, and as hospitals use multiple demographic markers (name, age, etc.), tracking patients through their lifetimes becomes harder.

States informally share information and suggestions for using administrative data through the APCD Council. One of the current issues is whether to charge for this data. It is clearly of great value to companies in various parts of the health care industry. But charging for data, while fair in relation to companies using it for product development and marketing, obviously puts a crimp in research. Also, Porter said that collection and use of this data was still so new that it’s hard to establish its commercial value.

In addition to fueling research, data is critical for better patient care. For instance, you obviously need it for the integrated care that lies at the core of modern cost-saving initiatives, not to mention patient-centered care. Strangely enough, one of the biggest buzzwords in American health care today, Accountable Care Organizations, got relatively little attention at this conference. Sachin Jain even explicitly distanced himself from the term, claiming that it was “just one model” and might be a transitional stage in the evolution of providers.

Whatever one’s treatment model, the inconsistencies and incompatibilities of EHRs have made sharing data between health care providers both costly and cumbersome. John Halamka, a leading CIO in healthcare and advisor to federal efforts, spent several minutes on his panel listing the various standards that the government was developing, most of them to be released for review during the summer and finalized in the Fall. A few examples include standards for:

  • Submitting meaningful use data to CMS (so manual data entry will no longer be necessary)

  • Metadata to represent privacy preferences

  • A provider directory

  • Simplified data on doctor quality (always a sensitive measure that scares the doctors being monitored)

  • Transition of care documents

He predicted that Stage 2 would be split into multiple stages to give vendors time to produce conforming systems. But Blumenthal warned earlier that delaying stages would play havoc with the schedule of payments, which Congress laid out year by year rather than stage by stage.

The ONC is also working on a Standards and Interoperability (S&I) Framework that addresses many of the problems discussed in this section.

Formats and data exchange were also the topic of a workshop on Health Information Exchanges, led by Richard Shoup of the organization that put together the conference, the Massachusetts e-Health Institute.

Neither existing standards (based mostly on a very old and complex set of formats called HL7, which have only gotten harder to implement and parse as they have evolved) nor recent government efforts are enough to produce data that can easily be shared between EHRs. Massachusetts has led the way in forming a consortium of states and EHR vendors to fill the gap. In his panel, Halamka expressed the wish that standards for data exchange had been codified before doctors were asked to buy EHR systems, because the ability of those systems to exchange data could then have been verified.
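To give non-specialists a flavor of what implementers face, here is a toy fragment of an HL7 version 2 message and a naive parse of it; real messages add repetitions, escape sequences, and site-specific variations that this sketch ignores entirely.

```python
# A toy HL7 v2 fragment: each segment is one line, fields are separated by "|",
# components within a field by "^", and segments by carriage returns.
message = (
    "MSH|^~\\&|SENDING_APP|SENDING_FACILITY|RECEIVING_APP|RECEIVING_FACILITY|"
    "201105311200||ADT^A01|MSG00001|P|2.3\r"
    "PID|1||12345^^^HOSPITAL^MR||DOE^JANE||19700101|F\r"
)

for segment in filter(None, message.split("\r")):
    fields = segment.split("|")
    if fields[0] == "PID":
        # PID-5 holds the patient name as "family^given" components.
        family, given = fields[5].split("^")[:2]
        print(given, family)  # JANE DOE
```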

Although formats and standards excite technical specialists, much more is required to make exchanges possible while preserving the rights of the patient. Exchanges require trust between institutions, and this trust goes down rapidly as the distance between them increases. As Shoup said, governance for data must be a topic of the multi-state consortium.

And one audience member claimed that the scenario so often cited to justify national health exchanges (the patient on vacation who goes to an emergency room complaining of chest pains) is actually extremely rare. Very few patients have to seek care outside their states, at least for symptoms where their prior conditions are an important diagnostic factor. This calls into question the value of spending the huge sums of money that would be required to create a universal exchange.

However, Shoup said that 15% of patients who enter an emergency room are treated in the absence of information needed to treat them properly, and that 15% of admissions from the ER to the hospital could be avoided if the medical records were available. This is not an interstate problem, but one many of us have right at home.

All data sharing initiatives raise questions of patient privacy. Someone asked Shoup “Who owns the data?” He said it was a difficult question to answer (perhaps one that cannot be answered, and therefore one we should stop asking). But throughout the conference, speakers acknowledged the importance of patient consent and preserving privacy. Certainly, the two big ONC projects–CONNECT and Direct–center on the assurance of privacy during data exchange. But they do not solve problems of consent and trust, only authorization and secure data transfer.

Jobs and balance sheets

The biggest contradiction I’ve found during my coverage of health IT is the oft-cited prediction that we’ll have to hire 40,000 to 50,000 new staff in the field to deal with IT changes, with no one explaining how we’ll pay all those people while we’re also cutting costs. Already there are anecdotal reports of IT staff demanding six-figure salaries and of difficulties finding trained staff at any price.

One answer, which came up during a presentation by Lynn Nicholas, President of the Massachusetts Hospital Association, is that a lot of existing staff will lose their jobs. Hopefully, as hospitals upgrade from routine clinical functions to more sophisticated data processing, the staff can get training to do these better jobs instead of just receiving pink slips.

Another promising way to cut costs lies in telemedicine, introduced by Dr. Joseph Kvedar with examples from the Center for Connected Health at Massachusetts’ largest (and sometimes most resented) health provider, Partners HealthCare. The Center for Connected Health has pioneered projects in the leading health epidemics of our time: diabetes, hypertension, and congestive heart failure.

Telemedicine could be as simple as sending a text message to remind someone of an appointment or the time to take a pill. (In fact, one could argue about whether this is telemedicine, because it uses automation rather than human intervention.) But Partners goes much further as well, giving patients devices that let them upload statistics such as blood pressure to a server at the hospital, where software can determine whether an event requiring a doctor’s intervention has occurred.
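As a rough illustration of the server-side piece (my own simplified sketch, not the Center for Connected Health’s actual software), such a system might flag any uploaded reading that crosses a clinician-set threshold.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    patient_id: str
    systolic: int
    diastolic: int

# Hypothetical cutoffs; a real program would rely on individualized care plans,
# trends over time, and more signals than a single reading.
SYSTOLIC_LIMIT = 160
DIASTOLIC_LIMIT = 100

def needs_intervention(reading: Reading) -> bool:
    """Decide whether an uploaded reading should be routed to a clinician."""
    return reading.systolic >= SYSTOLIC_LIMIT or reading.diastolic >= DIASTOLIC_LIMIT

uploads = [
    Reading("patient-1", 128, 82),
    Reading("patient-2", 172, 104),  # would trigger a follow-up call
]
for r in uploads:
    if needs_intervention(r):
        print(f"Flag {r.patient_id} for clinician review")
```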

Partners has given pedometers to Boston school students and installed sensors in the schools that are triggered when the students walk by, so that the schools can measure how much the students walk and encourage them to do more. (This must be a change from when I went to public school, when monitors were always telling us to get out of the halls.)

Strangely, people with chronic illness use technology less than the average person, according to Kvedar, and studies show that this link is independent of the usual factors for explaining such differences (age, socio-economic status, and so forth). Health care costs follow a very strong power law, meaning that a tiny percentage of the population (3%) accounts for a huge proportion of costs (40%), so we have to engage the chronic patient somehow, and telemedicine seems to be a key part of the solution.
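A bit of back-of-the-envelope arithmetic on the figures Kvedar cited shows just how skewed the distribution is.

```python
# If 3% of patients account for 40% of spending, how much costlier is the
# average patient in that group than everyone else? (The 3% and 40% come from
# the talk; the arithmetic is my own rough calculation.)
top_people, top_cost = 0.03, 0.40
avg_top = top_cost / top_people               # relative spending per person, top group
avg_rest = (1 - top_cost) / (1 - top_people)  # relative spending per person, everyone else
print(round(avg_top / avg_rest, 1))           # about 21.6 times as much
```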

Telemedicine also includes interactions between doctors, such as remote monitoring of ICUs. One study found that sending the information from cameras and monitors to remote sites can reduce fatalities by 20%. Kvedar says this is an example of when telemedicine seems to be indicated: when outcomes are determined by low-frequency but high-impact events. Massachusetts General Hospital now keeps doctors on call who can examine video images of stroke patients from home, so that they can help the on-site doctor make the right determination of how to treat the stroke.

Blumenthal started his keynote–which, as I said, included much that is worthy of celebration–by listing four factors that make it hard for physicians to adopt and use EHRs:

  • The “paralysis of uncertainty” created by having too many systems to choose from, along with a grab-bag of worries ranging from whether they’ll stay up to whether they’ll meet clinicians’ needs

  • A basic psychological barrier stemming from habits of recording information, which go back to one’s first medical training and involve the visceral activities of using a pen

  • The ongoing lack of technical and cultural foundations for exchanging data

  • The fear of data breaches and violations of patient privacy

All the other complaints and admonishments in the conference could probably fit into one of those categories. Solutions are available, but because data exchange and research are fundamental to change, these solutions have to be discovered and adopted by the field as a whole. Most doctors who adopt electronic systems are ultimately happy they did so–a finding that was not true just a few years ago–but the process is still expensive and painful for those who go ahead of their peers.
