August 09 2012

Five elements of reform that health providers would rather not hear about

The quantum leap we need in patient care requires a complete overhaul of record-keeping and health IT. Leaders of the health care field know this and have been urging the changes on health care providers for years, but the providers are having trouble accepting the changes for several reasons.

What’s holding them back? Change certainly costs money, but the industry is already groaning its way through enormous paradigm shifts to meet the current financial and regulatory climate, so the money might as well be directed to things that work. Training staff to handle patients differently is also difficult, but the staff on the floor of these institutions are experiencing burn-out and can be inspired by a new direction. The fundamental resistance seems to lie in the expectations of health providers and their vendors about the control they need to conduct their business profitably.

A few months ago I wrote an article titled Five Tough Lessons I Had to Learn About Health Care. Here I’ll delineate some elements of a new health care system that are promoted by thought leaders, that echo the evolution of other industries, that will seem utterly natural in a couple of decades–but that providers are loath to consider. I feel that leaders in the field are not confronting that resistance with an equivalent sense of conviction that these changes are crucial.

1. Reform will not succeed unless electronic records standardize on a common, robust format

Records are not static. They must be combined, parsed, and analyzed to be useful. In the health care field, records must travel with the patient. Furthermore, we need an explosion of data analysis applications in order to drive diagnosis, public health planning, and research into new treatments.

Interoperability is a common mantra these days in talking about electronic health records, but I don’t think the power and urgency of record formats can be conveyed in eight-syllable words. It can be conveyed better by a site that uses data about hospital procedures, costs, and patient satisfaction to help consumers choose a desirable hospital. Or an app that might prevent a million heart attacks and strokes.

Data-wise (or data-ignorant), doctors are stuck in the 1980s, buying proprietary record systems that don’t work together even between different departments in a hospital, or between outpatient clinics and their affiliated hospitals. Now the vendors are responding to pressures from both government and the market by promising interoperability. The federal government has taken this promise as good coin, hoping that vendors will provide windows onto their data. It never really happens. Every baby step toward opening up one field or another requires additional payments to vendors or consultants.

That’s why exchanging patient data (health information exchange) requires a multi-million dollar investment, year after year, and why most HIEs go under. And that’s why the HL7 committee, putatively responsible for defining standards for electronic health records, keeps on putting out new, complicated variations on a long history of formats that were not well enough defined to ensure compatibility among vendors.
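
To see why loosely defined formats doom compatibility, consider a sketch (Python; both vendor export formats are invented, not drawn from any real product) of the shim code needed just to reconcile one blood-pressure reading between two systems:

```python
import json

# Two invented examples of how different vendors might export the
# same blood-pressure reading. The field names, nesting, and units
# are hypothetical, but the mismatch is typical.
vendor_a = json.loads('{"obs": {"type": "BP", "sys": 142, "dia": 91}}')
vendor_b = json.loads('{"vitals": [{"code": "blood_pressure", "value": "142/91 mmHg"}]}')

def normalize_a(record):
    obs = record["obs"]
    return {"systolic": obs["sys"], "diastolic": obs["dia"]}

def normalize_b(record):
    for v in record["vitals"]:
        if v["code"] == "blood_pressure":
            sys_val, dia_val = v["value"].split()[0].split("/")
            return {"systolic": int(sys_val), "diastolic": int(dia_val)}
    return None

# Without a shared, rigorously defined format, every pairing of
# systems needs its own shim like the two above, for every field.
reading_a = normalize_a(vendor_a)
reading_b = normalize_b(vendor_b)
```

Multiply those two functions by hundreds of fields and dozens of vendors and you get the recurring consulting bills, and the failing HIEs, described above.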

The Direct project and perhaps the nascent RHEx RESTful exchange standard will let hospitals exchange the limited types of information that the government forces them to exchange. But it won’t create a platform (as suggested in this PDF slideshow) for the hundreds of applications we need to extract useful data from records. Nor will it open the records to the masses of data we need to start collecting. It remains to be seen whether Accountable Care Organizations, which are the latest reform in U.S. health care and are described in this video, will be able to use current standards to exchange the data that each member institution needs to coordinate care. Shahid Shaw has laid out in glorious detail the elements of open data exchange in health care.

2. Reform will not succeed unless massive amounts of patient data are collected

We aren’t giving patients the most effective treatments because we just don’t know enough about what works. This extends throughout the health care system:

  • We can’t prescribe a drug tailored to the patient because we don’t collect enough data about patients and their reactions to the drug.

  • We can’t be sure drugs are safe and effective because we don’t collect data about how patients fare on those drugs.

  • We don’t see a heart attack or other crisis coming because we don’t track the vital signs of at-risk populations on a daily basis.

  • We don’t make sure patients follow through on treatment plans because we don’t track whether they take their medications and perform their exercises.

  • We don’t target people who need treatment because we don’t keep track of their risk factors.
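
To make the monitoring point concrete, here is a minimal sketch (Python; the readings, threshold, and window are invented for illustration and are not clinical guidance) of flagging a sustained rise in an at-risk patient’s daily blood-pressure readings before it becomes a crisis:

```python
# Hypothetical daily systolic readings for one at-risk patient.
daily_systolic = [131, 134, 138, 151, 149, 155, 158]

def flag_sustained_rise(readings, threshold=140, run_length=3):
    """Flag when `run_length` consecutive readings exceed `threshold`.

    The threshold and window are illustrative, not clinical guidance.
    """
    run = 0
    for day, value in enumerate(readings):
        run = run + 1 if value > threshold else 0
        if run >= run_length:
            return day  # index of the day the alert fires
    return None

alert_day = flag_sustained_rise(daily_systolic)
```

The logic is trivial; what the health care system lacks is not the algorithm but the daily data to feed it.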

Some institutions have adopted a holistic approach to health, but as a society we could do far more in this area. O’Reilly is hosting a conference called Strata Rx on this subject.

Leaders in the field know what health care providers could accomplish with data. A recent article even advises policy-makers to focus on the data instead of the electronic records. The question is whether providers are technically and organizationally prepped to accept it in such quantities and variety. When doctors and hospitals think they own the patients’ records, they resist putting in anything but their own notes and observations, along with lab results they order. We’ve got to change the concept of ownership, which strikes deep into their culture.

3. Reform will not succeed unless patients are in charge of their records

Doctors are currently acting in isolation, occasionally consulting with the other providers seen by their patients but rarely sharing detailed information. It falls on the patient, or a family advocate, to remember that one drug or treatment interferes with another or to remind treatment centers of follow-up plans. And any data collected by the patient remains confined to scribbled notes or (in the modern Quantified Self equivalent) a web site that’s disconnected from the official records.

Doctors don’t trust patients. They have some good reasons for this: medical records are complicated documents in which a slight rewording or typographical error can change the meaning enough to risk a life. But walling off patients from records doesn’t insulate them against errors: on the contrary, patients catch errors entered by staff all the time. So ultimately it’s better to bring the patient onto the team and educate her. If a problem with records altered by patients–deliberately or through accidental misuse–turns up down the line, digital certificates can be deployed to sign doctor records and output from devices.
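
As a sketch of that tamper-evidence idea: real deployments would use public-key certificates, but a keyed hash shows the principle in a few lines (Python standard library; the key and note are invented, and HMAC stands in for a real signature scheme):

```python
import hmac
import hashlib

# A stand-in for real digital certificates: a keyed hash (HMAC) that
# lets anyone holding the clinic's key verify that a record entry has
# not been altered. The key and record text are invented.
CLINIC_KEY = b"demo-key-not-for-production"

def sign_entry(entry: str) -> str:
    return hmac.new(CLINIC_KEY, entry.encode(), hashlib.sha256).hexdigest()

def verify_entry(entry: str, signature: str) -> bool:
    return hmac.compare_digest(sign_entry(entry), signature)

doctor_note = "2012-08-09: lisinopril 10mg daily"
sig = sign_entry(doctor_note)

# The patient can annotate the record freely; any change to the signed
# original, deliberate or accidental, fails verification.
tampered = doctor_note.replace("10mg", "100mg")
```

The patient keeps full control of the record, while the doctor’s entries and device output stay independently verifiable.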

The amounts of data we’re talking about get really big fast. Genomic information and radiological images, in particular, can occupy dozens of gigabytes of space. But hospitals are moving to the cloud anyway. Practice Fusion just announced that they serve 150,000 medical practitioners and that “One in four doctors selecting an EHR today chooses Practice Fusion.” So we can just hand over the keys to the patients and storage will grow along with need.

The movement for patient empowerment will take off, as experts in health reform told US government representatives, when patients are in charge of their records. To treat people, doctors will have to ask for the records, and the patients can offer the full range of treatment histories, vital signs, and observations of daily living they’ve collected. Applications will arise that can search the data for patterns and relevant facts.

Once again, the US government is trying to stimulate patient empowerment by requiring doctors to open their records to patients. But most institutions meet the formal requirements by providing portals that patients can log into, the way we can view flight reservations on airlines. We need the patients to become the pilots. We also need to give them the information they need to navigate.

4. Reform will not succeed unless providers conform to practice guidelines

Now that the government is forcing doctors to release information about outcomes, patients can start to choose doctors and hospitals that offer the best chances of success. The providers will have to apply more rigor to their activities, using checklists and more, to bring up the scores of the less successful providers. Medicine is both a science and an art, but many providers lag on the science–that is, doing what has been statistically proven to produce the best likely outcome–even at prestigious institutions.

Patient choice is restricted by arbitrary insurance rules, unfortunately. These also contribute to the utterly crazy difficulty of determining what a medical procedure will cost, as reported by e-Patient Dave and WBUR radio. Straightening out this problem goes way beyond the doctors and hospitals, and settling on a fair, predictable cost structure will benefit them almost as much as patients and taxpayers. Even some insurers have started to see that the system is reaching a dead end and are setting up new payment mechanisms.

5. Reform will not succeed unless providers and patients can form partnerships

I’m always talking about technologies and data in my articles, but none of that constitutes health. Just as student testing is a poor model for education, data collection is a poor model for medical care. What patients want is time to talk intensively with their providers about their needs, and providers voice the same desires.

Data and good record keeping can help us use our resources more efficiently and deal with the physician shortage, partly by spreading out jobs among other clinical staff. Computer systems can’t deal with complex and overlapping syndromes, or persuade patients to adopt practices that are good for them. Relationships will always have to be in the forefront. Health IT expert Fred Trotter says, “Time is the gas that makes the relationship go, but the technology should be focused on fuel efficiency.”

Arien Malec, former contractor for the Office of the National Coordinator, used to give a speech about the evolution of medical care. Before the revolution in antibiotics, doctors had few tools to actually cure patients, but they lived with their patients in the same community and knew their needs through and through. As we’ve improved the science of medicine, we’ve lost that personal connection. Malec argued that better records could help doctors really know their patients again. But conversations are necessary too.

August 08 2012

Technical requirements for coordinating care in an Accountable Care Organization

The concept of an Accountable Care Organization (ACO) reflects modern hopes to improve medicine and cut costs in the health system. Tony McCormick, a pioneer in the integration of health care systems, describes what is needed on the ground to get doctors working together.

Highlights from the full video interview include:

  • What an Accountable Care Organization is. [Discussed at the 00:19 mark]
  • Biggest challenge in forming an ACO. [Discussed at the 01:23 mark]
  • The various types of providers who need to exchange data. [Discussed at the 03:08 mark]
  • Data formats and gaps in the market. [Discussed at the 03:58 mark]
  • Uses for data in ACOs. [Discussed at the 05:39 mark]
  • Problems with current Medicare funding and solutions through ACOs. [Discussed at the 07:50 mark]

You can view the entire conversation in the following video:

July 25 2012

Democratizing data, and other notes from the Open Source convention

There has been enormous talk over the past few years of open data and what it can do for society, but proponents have largely come to admit: data is not democratizing in itself. This topic is hotly debated, and a nice summary of the viewpoints is available in this PDF containing articles by noted experts. At the Open Source convention last week, I thought a lot about the democratizing potential of data and how it could be realized.

Who benefits from data sets

At a high level, large businesses and other well-funded organizations have three natural advantages over the general public in the exploitation of data sets:

  • The resources to gather the data
  • The resources to do the necessary programming to crunch and interpret the data
  • The resources to act on the results

These advantages will probably always exist, but data can be useful to the public too. We have some tricks that can compensate for each of the large institutions’ advantages:

  • Crowdsourcing can create data sets that can help everybody, including the formation of new businesses. OpenStreetMap, a crowdsourced mapping project built on open source software, is a superb example. Its maps have been built up through years of contributions by people trying to support their communities, and it supports interesting features missing from proprietary map projects, such as tools for laying out bike paths.

  • Data-crunching is where developers, like those at the Open Source convention, come in. Working at non-profits, during weekend challenges, or just on impulse, they can code up the algorithms that make sense of data sets, along with apps that visualize the results and accept interaction from people with less technical training.

  • Some apps, such as reports of neighborhood crime or available health facilities, can benefit individuals, but we can really drive progress by joining together in community organizations or other associations that use the data. I saw a fantastic presentation by high school students in the Boston area who demonstrated a correlation between funding for summer jobs programs and lowered homicides in the inner city–and they won more funding from the Massachusetts legislature with that presentation.

Health care track

This year was the third in which the Open Source convention offered a health care track. IT plays a growing role in health care, but a lot of the established institutions are creaking forward slowly, encountering lots of organizational and cultural barriers to making good use of computers. This year our presentations clustered around areas where innovation is most robust: personal tracking, using data behind the scenes to improve care, and international development.

Open source coders Fred Trotter and David Neary gave popular talks about running and tracking one’s achievements. Bob Evans discussed a project named PACO that he started at Google to track productivity by individuals and in groups of people who come together for mutual support, while Anne Wright and Candide Kemmler described the ambitious BodyTrack project. Jason Levitt presented the science of sitting (and how to make it better for you).

In a high-energy presentation, systems developer Shahid Shah described the cornucopia of high-quality, structured data that will be made available when devices are hooked together. “Gigabytes of data is being lost every minute from every patient hooked up to hospital monitors,” he said. DDS, HTTP, and XMPP are among the standards that will make an interconnected device mesh possible. Michael Italia described the promise of genome sequencing and the challenges it raises, including storage requirements and the social impacts of storing sensitive data about people’s propensity for disease. Mohamed ElMallah showed how it was sometimes possible to work around proprietary barriers in electronic health records and use them for research.
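
As a thought experiment on what that interconnected device mesh implies, here is a sketch (Python; the message fields, patient ID, and timestamp are all invented, and no real protocol such as DDS is modeled) of the kind of structured reading a bedside monitor could emit instead of discarding it:

```python
import json
from datetime import datetime, timezone

# A guess at what one structured message from a bedside monitor might
# look like if devices published readings instead of dropping them.
# The field names are invented, not drawn from any real standard.
def monitor_message(patient_id, heart_rate, spo2):
    return json.dumps({
        "patient_id": patient_id,
        "recorded_at": datetime(2012, 8, 9, tzinfo=timezone.utc).isoformat(),
        "observations": {"heart_rate_bpm": heart_rate, "spo2_percent": spo2},
    })

# One reading, serialized for transport and parsed back by a consumer.
msg = monitor_message("pt-0042", heart_rate=72, spo2=97)
payload = json.loads(msg)
```

One such message per monitor per second, across a hospital, is exactly the "gigabytes lost every minute" that Shah describes.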

Representatives from OpenMRS and IntraHealth International spoke about the difficulties and successes of introducing IT into very poor areas of the world, where systems need to be powered by their own electricity generators. A maintainable project can’t be dropped in by external NGO staff, but must cultivate local experts and take a whole-systems approach. Programmers in Rwanda, for instance, have developed enough expertise by now in OpenMRS to help clinics in neighboring countries install it. Leaders of OSEHRA, which is responsible for improving the Department of Veterans Affairs’ VistA and developing a community around it, spoke to a very engaged audience about their work untangling and regularizing twenty years’ worth of code.

In general, I was pleased with the modest growth of the health care track this year–most sessions drew about thirty people, and several drew a lot more–and with both the energy and the expertise of the people who came. Many attendees play an important role in furthering health IT.

Other thoughts

The Open Source convention reflected much of the buzz surrounding developments in computing. Full-day sessions on OpenStack and Gluster were totally filled. A focus on developing web pages came through in the popularity of talks about HTML5 and jQuery (now a platform all its own, with extensions sprouting in all directions). Perl still has a strong community. A few years ago, Ruby on Rails was the must-learn platform, and knock-off derivatives appeared in almost every other programming language imaginable. Now the Rails paradigm has been eclipsed (at least in the pursuit of learning) by Node.js, which was recently ported to Microsoft platforms, and its imitators.

No two OSCons are the same, but the conference continues to track what matters to developers and IT staff and to attract crowds every year. I enjoyed nearly all the speakers, who often pump excitement into the driest of technical topics through their own sense of continuing wonder. This is an industry where imagination’s wildest thoughts become everyday products.

May 06 2012

The state of health IT according to the American Hospital Association

Last week, the American Hospital Association released a major document. Framed as comments on a major federal initiative, the proposed Stage 2 Meaningful Use criteria by the Centers for Medicare & Medicaid Services (CMS), the letter also conveys a rather sorrowful message about the state of health IT in the United States. One request--to put brakes on the requirement for hospitals to let patients see their own information electronically--has received particularly strong coverage and vigorous responses from e-Patient Dave deBronkart, Regina Holliday, Dr. Adrian Gropper, Fred Trotter, the Center for Democracy and Technology, and others.

I think the AHA has overreached in its bid to slow down patient access to data, which I'll examine later in this article. But to me, the most poignant aspect of the AHA letter is its careful accumulation of data to show the huge gap between what health care calls for and what hospitals, vendors, standards bodies, and even the government are capable of providing.

Two AHA staff were generous enough to talk to me on very short notice and offer some clarifications that I'll include with the article.

A survey of the U.S. health care system

According to the AHA (translated into my own rather harsh words), the state of health IT in American hospitals is as follows:

  • Few hospitals and doctors can fulfill basic requirements of health care quality and cost control. For instance, 62% could not record basic patient health indicators such as weight and blood pressure (page 51 of their report) in electronic health records (EHRs).

  • Many EHR vendors can't support the meaningful use criteria in real-life settings, even when their systems were officially certified to do so. I'll cite some statements from the AHA report later in the article. Meaningful use is a big package of reforms, of course, promulgated over just a few years, but it's also difficult because vendors and hospitals had long been heading in the opposite direction: toward closed, limited functionality.

  • Doctors still record huge globs of patient data in unstructured text format, where they are unavailable for quality reporting, tracking clinical effectiveness, etc. Data is often unstructured because humans are complex and their symptoms don't fit into easy categories. Yet doctors have learned to make diagnoses for purposes of payment and other requirements; we need to learn what other forms of information are worth formalizing for the sake of better public health.

  • Quality reporting is a mess. The measures currently being reported are unreliable, and standards have not been put in place to allow valid comparisons of measures from different hospitals.

  • Government hasn't stepped up to the plate to perform its role in supporting electronic reporting. For instance, the Centers for Medicare & Medicaid Services (CMS) wants the hospitals to report lots of quality measures, but its own electronic reporting system is still in the testing stages, so hospitals must enter data through a cumbersome and error-prone manual "attestation." States aren't ready to accept electronic submissions either. The Direct project is moving along, but its contribution to health data exchange is still very new.

There's no easy place to assign blame for a system that is killing hundreds of thousands of people a year while sticking the US public with rising costs. The AHA letter constantly assures us that the association approves of the meaningful use objectives, but says their implementation in a foreseeable time frame is unfeasible. "We can envision a time when all automated quality reporting will occur effortlessly in a reliable and valid fashion. However, we are not there yet." (pp. 42-43)

So the AHA's petition to CMS can be summarized overall as, "Slow everything down, but keep the payments coming."

AHA staff referred to the extensively researched article, A Progress Report On Electronic Health Records In U.S. Hospitals. It corroborates observations that adoption of EHRs has vastly increased between 2010 and 2011. However, the capabilities of the EHRs and hospitals using them have not kept up with meaningful use requirements, particularly among small rural hospitals with few opportunities to hire sophisticated computer technicians, etc. Some small hospitals have trouble even getting an EHR vendor to talk to them.

Why all this matters

Before looking at some details, let me lay out some of the reasons that meaningful use criteria are so important to patients and the general public:

  • After treatment, data must be transferred quickly to patients and the next organizations treating them (such as rehab centers and visiting nurses) so that the patients receive proper care.

  • Quality measures are critical so that hospitals can be exposed to sunshine, the best disinfectant, and be shamed into lowering costs and reducing errors.

  • Data must be collected by public agencies so that data crunchers can find improvements in outreach and treatment. Hospitals love to keep their data private, but that gives them relatively tiny samples on which to base decisions, and they often lack the skills to analyze the data.

No one can predict what will break logjams and propel health care forward, but patient engagement seems crucial because most health care problems in developed countries involve lifestyle issues such as smoking and body weight. Next, to provide the kind of instant, pervasive patient engagement that can produce change, we need electronic records that are open to innovative apps, that can accept data from the patient-centered medical home, and that link together all care-givers.

The state of electronic health records

The EHR industry does not come out well in the AHA list of woes. The letter cites "unworkable, but certified, vendor products" (p. 3) and says, "Current experience is marked by limited vendor and workforce capacity." (p. 7) The latter complaint points to one of the big hurdles facing health care reform: we don't have enough staff who understand computer systems and who can adapt their behavior to use them effectively.

Functionality falls far short of real hospital needs:

...one hospital system spent more than $1 million on a quality reporting tool from its vendor that was, for the most part, an unwieldy data entry screen. Even medication orders placed using CPOE [computerized physician order entry] needed to be manually re-entered for the CQM [clinical quality measure] calculation. Even then, the data were not reliable, despite seven months of working with the vendor to attempt to get it right. Thus, after tremendous investment of financial and human resources, the data are not useful. (p. 45)

The AHA claims that vendors were lax in testing their systems, and that the government abetted the omission: "the proposals within the certification regulation require vendors to incorporate all of the data elements needed to calculate only one CQM. There is no proposal to require that certified EHRs be capable of generating all of the relevant CQMs proposed/finalized by CMS." (p. 41) With perhaps a subtle sarcasm, the AHA proposes, "CMS should not require providers to report more e-measures than vendors are required to generate." (p. 36)

Vendors kind of take it on the chin for fundamental failures in electronic capabilities. "AHA survey data indicate that only 10 percent of hospitals had a patient portal of any kind in Fall 2011. Our members report that none had anywhere near the functionality required by this objective. In canvassing vendors, they report no technology companies can currently support this volume of data or the listed functions." (p. 26)

We can add an observation from the College of Healthcare Information Management Executives (CHIME):

...in Stage 1, some vendors were able to dictate which clinical quality measures providers chose to report--not based on the priorities of the provider, but based on the capabilities of the system. Subsequently, market forces corrected this and vendors have gone on to develop more capabilities. But this anecdote provides an important lesson when segmenting certification criteria--indeed for most technologies in general--flexibility for users necessitates consistent and robust standards for developers. In short, the 2014 Edition must require more of the vendor community if providers are to have space to pursue meaningful use of Meaningful Use. (p. 2)

Better standards--which take time to develop--could improve the situation, which is why the Office of the National Coordinator (ONC) has set up a Health IT Standards Committee. For instance, the AHA says, "we have discovered that vendors needed to program many decisions into EHRs that were not included in the e-specifications. Not only has this resulted in rampant inconsistencies between different vendors, it produced inconsistent measure results when the e-measures are compared to their counterparts in the Inpatient Quality Reporting (IQR) Program." (p. 35)

The AHA goes so far as to say, "The market cannot sustain this level of chaos." (p. 7) They conclude that the government is pushing too hard. One of their claims, though, comes across as eccentric: "Providers and vendors agree that the meaningful use program has stifled innovation in the development of new uses of EHRs." (p. 9)

To me, all the evidence points in the opposite direction. The vendors were happy for decades to push systems that performed minimal record-keeping and modest support such as formularies at huge costs, and the hospitals that adopted EHRs failed to ask for more. It wasn't a case of market failure because, as I have pointed out (and others have too), health care is not a market. But nothing would have changed had not the government stepped in.

Patient empowerment

Now for the point that has received the most press, the AHA's request to weaken the rules giving patients access to their data. Once again, the AHA claims to favor patient access--and actually, it has helped hospitals over the years to give patients summaries of care, mostly on paper--but it is passing on the evidence it has accumulated from its members that the systems will not be in place to support electronic distribution for some time. I won't repeat all the criticisms of the experts mentioned at the beginning of this article, but will provide some perspective about patient engagement.

Let's start with the AHA's request to let the hospital choose the format for patient data (pp. 25-26). So long as hospitals can do that, we will be left with formats that are not interoperable. Many hospitals will choose formats that are human-readable but not machine-readable, so that correlations and useful data cannot be extracted programmatically. Perhaps the technology lags in this area--but if the records are not in structured format already, hospitals themselves lose critical opportunities to check for errors, mine data for trends, and perform other useful tasks with their records.
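
The difference matters in practice. A sketch (Python; both the JSON layout and the free-text sentence are invented) of pulling one lab value out of a machine-readable record versus scraping it from prose:

```python
import json
import re

# The same potassium result, once as structured data and once as the
# kind of free-text summary a hospital might post instead. Both the
# JSON layout and the sentence are invented for illustration.
structured = json.loads(
    '{"labs": [{"test": "potassium", "value": 4.1, "unit": "mmol/L"}]}'
)
free_text = "Labs within normal limits; potassium 4.1, will recheck at follow-up."

def potassium_from_structured(record):
    for lab in record["labs"]:
        if lab["test"] == "potassium":
            return lab["value"]
    return None

def potassium_from_text(note):
    # Fragile: breaks on rewording, abbreviations ("K 4.1"), or typos.
    match = re.search(r"potassium\s+(\d+(?:\.\d+)?)", note)
    return float(match.group(1)) if match else None

from_structured = potassium_from_structured(structured)
from_text = potassium_from_text(free_text)
```

The text scraper happens to work on this one sentence; against thousands of differently worded notes it silently misses results, which is exactly what error-checking and trend-mining cannot tolerate.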

The AHA raises alarms at the difficulties of providing data. They claim that for each patient who is treated, the hospital will have to invest resources "determining which records are relevant and appropriate." (p. 26) "It is also unclear whether a hospital would be expected to spend resources to post information and verify that all of the data listed are available within 36 hours." (p. 27)

From my perspective, the patient download provisions would simply require hospitals to clean up their ways of recording data so that it is in a useable and structured format for all, including their own staff. Just evaluate what the AHA is admitting to in the following passage: "Transferring these clinical observations into a structured, coded problem list in the EHR requires significant changes to work flows and training to ensure accuracy. It also increases time demands for documentation by physicians who already are stretched thin." (p. 27)

People used to getting instant information from commercial web sites find it very hard to justify even the 36-hour delay offered by the Stage 2 meaningful use guidelines. Amazon.com can provide me with information on all my current and recent orders. Google offers each registered user a dashboard showing everything the company tracks about that user, including all web searches going back to mid-2006. They probably do this to assure people that they are not the egregious privacy violators they are regularly accused of being. Nevertheless, it shows that sites collecting data can make it available to users without friction, and with adequate security to manage privacy risks.

The AHA staff made a good point in talking to me. The CMS "transmit" requirement would let a patient ask the hospital to send his records to any institution or individual of his choice. First of all, this would assume that the recipient has encrypted email or access to an encrypted web site. And it could be hard for a hospital to make sure both the requester and the intended recipient are who they claim to be. "The transmit function also heightens security risks, as the hospital could be asked to send data to an individual with whom it has no existing relationship and no mechanism for authentication of their identity." (p. 27) Countering this claim, Gropper and the Society for Participatory Medicine offer the open OAuth standard to give patients easy and secure access. But while OAuth is a fairly stable standard, the AHA's concerns are justified because it hasn't been applied yet to the health care field.
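
To make the OAuth suggestion concrete, here is a sketch (Python standard library; the hospital endpoint and token are invented, and the request is built but never sent) of what a patient-authorized fetch with a bearer token might look like:

```python
from urllib.request import Request

# What a patient-authorized fetch might look like if hospitals exposed
# records behind OAuth: the patient's app presents a bearer token that
# the hospital issued after the patient granted access. The endpoint
# and token are invented; the request is constructed but not sent.
def build_records_request(token: str) -> Request:
    return Request(
        "https://hospital.example/api/patient/records",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/json",
        },
    )

req = build_records_request("patient-granted-demo-token")
```

The point of the pattern is that the hospital never needs to authenticate the third-party recipient at all; it only needs to verify a token that the patient authorized, which is the part of the AHA's objection OAuth is designed to dissolve.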

Unfortunately, allowing a patient to send his or her data to a third party is central to Accountable Care Organizations (ACOs), which hold the promise of improving patient care by sharing data among cooperating health care providers. If the "transmit" provision is delayed, I don't see how ACOs can take off.

The AHA drastically reduces the information hospitals would have to give patients, at least for the next stage of the requirements. Among the material they would remove are diagnoses, the reason for hospitalization, providers of care during hospitalization, vital signs at discharge, laboratory test results, the care transition summary and plan for next provider of care, and discharge instructions for patient. (p. 27) All this vastly reduces the value of data for increasing quality care. For instance, removing lab test results will lead to expensive and redundant retesting. (However, the AHA staff told me they support the ability of patients to get results directly from the labs.)

I'll conclude this section with the interesting observation that the CHIME comments on meaningful use I mentioned earlier say nothing about the patient engagement rules. In other words, the hospital CIOs in CHIME don't back up the hospitals' own claims.

Some reasonable AHA objections

Now I'm happy to turn to AHA proposals that leave fewer impediments to the achievement of better health care. Their 49-page letter (plus appendices) details many aspects of Stage 2 that seem unnecessarily burdensome or of questionable value.

It seems reasonable to me to ask the ONC, "Remove measures that make the performance of hospitals and EPs contingent on the actions of others." (p. 2) For instance, to engage in successful exchanges of patient data, hospitals depend on their partners (labs, nursing homes, other hospitals) to have Stage 2 capabilities, and given the slow rate of adoption, such partners could be really hard to find.

The same goes for patient downloads. Not only do hospitals have to permit patients to get access to data over the Internet, but they have to get 10% of the patients to actually do it. I don't think the tools are in place yet for patients to make good use of the data. When data is available, apps for processing the data will flood the market and patients will gradually understand the data's value, but right now there are few reasons to download it: perhaps to give it to a relative who is caring for the patient or to a health provider who doesn't have the technical means to request the data directly. Such uses may allow hospitals to reach the 10% required by the Stage 2 rule, but why make them responsible?

The AHA documents a growing digital divide among hospitals and other health care providers. "Rural, smaller and nonteaching hospitals have fewer financial and technical resources at their disposal. They also are starting from a lower base of adoption." (p. 59) The open source community needs to step up here. There are plenty of free software solutions to choose from, but small providers can't use them unless they become as easy to set up and configure as MySQL or even LibreOffice.

The AHA is talking from deep experience when it questions whether patients will actually be able to make use of medical images. "Images are generally very large files, and would require that the individual downloading or receiving the file have specialized, expensive software to access the images. The effort required to make the images available would be tremendous." (p. 26) We must remember that parts of our country don't even have high-speed Internet access.

The AHA's detailed comments about CMS penalties for the slow adoption of EHRs (pp. 9-18) also seem to reflect the hard realities out in the field.

But their attitude toward HIPAA is unclear. They point out that Congress required meaningful use to "take into account the requirements of HIPAA privacy and security law." (p. 25) Nevertheless, they ask the ONC to remove its HIPAA-related clauses from meaningful use because HIPAA is already administered by the Office for Civil Rights (OCR). It's reasonable to remove redundancy by keeping regulations under a single agency, but the AHA admits that the OCR proposal itself is "significantly flawed." Their staff explained to me that their goal is to wait for the next version of the OCR's own proposal, which should be released soon, rather than create a new requirement now that could well prove redundant or conflicting.

Unless we level the playing field for small providers, an enormous wave of buy-outs and consolidation will occur. Market forces and the push to form ACOs are already causing such consolidation. Maybe it's even a good thing--who feels nostalgic for the corner grocery? But consolidation will make it even more important to empower patients with their data, in order to counterbalance the power of the health care institutions.

A closing note about hospital inertia

The AHA includes in its letter some valuable data about difficulties and costs of implementing new systems (pp. 47-48). They say, "More than one hospital executive has reported that managing the meaningful use implementation has been more challenging than building a new hospital, even while acknowledging the need to move ahead." (p. 49)

What I find particularly troublesome about their report is that the AHA offers no hint that the hospitals spent all this money to put in place new workflows that could improve care. All the money went to EHRs and the minimal training and installation they require. What will it take for hospitals to make the culture changes that reap the potential benefits of EHRs and data transfers? The public needs to start asking tough questions, and the Stage 2 requirements should be robust enough to give these questions a basis.

March 26 2012

Five tough lessons I had to learn about health care

Working in the health care space has forced me to give up many hopes and expectations that I had a few years ago. Forgive me for being cynical (it's an easy feeling to have following the country's largest health IT conference, as I reported a month ago), and indeed some positive trends do step in to shore up hope. I'll go over the redeeming factors after listing the five tough lessons.

1. The health care field will not adopt a Silicon Valley mentality

Wild, willful, ego-driven experimentation--a zeal for throwing money after intriguing ideas with minimal business plans--has seemed to work for the computer field, and much of the world is trying to adopt a "California optimism." A lot of venture capitalists and technology fans deem this attitude the way to redeem health care from its morass of expensive solutions that don't lead to cures. But it won't happen, at least not the way they paint it.

Health care is one of the most regulated fields in public life, and we want it that way. From the moment we walk into a health facility, we expect the staff to be following rigorous policies to avoid infections. (They don't, but we expect them to.) And not just anybody can hang a shingle outside the door and call themselves a doctor. In the nineteenth century it was easier, but we don't consider that a golden age of medicine.

Instead, doctors go through some of the longest and most demanding training that exists in the world today. And even after they're licensed, they have to regularly sign up for continuing education to keep practicing. Other fields in medicine are similar. The whole industry is constrained by endless requirements that make sure the insiders remain in their seats and no "disruptive technologies" raise surprises. Just ask a legal expert about the complex mesh of Federal and state regulations that a health care provider has to navigate to protect patient privacy--and you do want your medical records to be private, don't you?--before you rave about the Silicon Valley mentality. Also read the O'Reilly book by Fred Trotter and David Uhlman about the health care system as it really is.

Nor can patients change treatments with the ease of closing down a Facebook account. Once a patient has established a trust relationship with a doctor and obtained a treatment plan, he or she won't say, "I think I'll go down the road to another center that charges $100 less for this procedure." And indeed, health reform doesn't prosper from breaking down treatments into individual chunks. Progress lies in the opposite direction: the redemptive potential of long-term relationships.

2. Regulations can't force change

I am very impressed with the HITECH act (a product of the American Recovery and Reinvestment Act, more than the Affordable Care Act) that set modern health reform in motion, as well as the efforts of the Department of Health and Human Services to push institutions forward. But change in health care, like education, boils down to the interaction in a room between a professional and a client. Just as lesson plans and tests can't ensure that a teacher inspires a child to learn, regulations can't keep a doctor from ordering an unnecessary test to placate an anxious patient.

We can offer clinical decision support to suggest what has worked for other patients, but we can't keep a patient from asking for an expensive procedure that has a 10% chance of making him better (and a 20% chance of making him worse), nor can we make the moral decision about what treatment to pursue, for the patient or the doctor. Each patient is different, anyway. No one wants to be a statistic.

3. The insurance companies are not the locus of cost and treatment problems

Health insurers are a favorite target of hatred by Americans, exemplified by Michael Moore's 2007 movie Sicko and more surprisingly in the 1997 romantic comedy As Good as It Gets, where I saw an audience applaud as Helen Hunt delivered a rant against health maintenance organizations. A lot of activists, looking at other countries, declare that our problems would be solved (well, would improve a lot) if we got private insurers out of the picture.

Sure, there's a lot of waste in the current insurance system, which deliberately stretches out the task of payment and makes it take up the days of full-time staff in each doctor's office. But that's not the cause of the main problems in either costs or treatment failures. The problems lie with the beloved treatment staff. We can respect their hard work and the lives they save, but we don't have to respect them for releasing patients from hospitals without adequate follow-up, or for ordering unnecessary radiation that creates harm for patients, or for the preventable errors that still (after years of publicity) kill 90,000 to 100,000 patients a year.

4. Doctors don't want to be care managers

The premise of health reform is to integrate patients into a larger plan for managing a population. A doctor is supposed to manage a case load and keep his or her pipeline full while not spending too much. The thrust of various remuneration schemes, old and new, that go beyond fee for service (capitation, global payment systems) is to reward a doctor for handling patients of a particular type (for instance, elderly people with hypertension) at a particular cost. But doctors aren't trained for this. They want to fix the immediate, presenting complaint and send the patient home until they're needed again. Some think longitudinally, and diligently try to treat the whole person rather than a symptom. But managing their treatment options as a finite resource is just not in their skill set.

The United Kingdom--host of one of the world's great national care systems--is about to launch a bold new program where doctors have to do case management. The doctors are rebelling. If this is the future of medicine, we'll have to find new medical personnel to do it.

5. Patients don't want to be care managers

Now that the medical field has responded superbly to acute health problems, we are left with long-term problems that require lifestyle and environmental changes. The patient is even more important than the doctor in these modern ills. But the patients who cost the most and need to make the most far-ranging changes are demonstrating an immunity to good advice. They didn't get emphysema or Type 2 diabetes by acting healthily in the first place, and they aren't about to climb out of their condition voluntarily either.

You know what the problem with chronic disease is? Its worst effects are not likely to show up early in life when lifestyle change could make the most difference. (Serious pain can come quickly from some chronic illnesses, such as asthma and Crohn's disease, but these are also hard to fix through lifestyle changes, if by "lifestyle change" you mean breathing clean air.) The changes a patient would have to make to prevent smoking-related lung disease or obesity-related problems would require a piercing re-evaluation of his course of life, which few can do. And incidentally, they are neither motivated nor trained to store their own personal health records.

Hope for the future

Despite the disappointments I've undergone in learning about health care, I expect the system to change for the better. It has to, because the public just won't tolerate more precipitous price hikes and sub-standard care.

There's a paucity of citations in my five lessons because they tend not to be laid out bluntly in research or opinion pieces; for the most part, they emerged gradually over many hallway conversations I had. Each of the five lessons contains a "not," indicating that they attack common myths. Myths (in the traditional sense) are in fact very useful constructs, because they organize an understanding of the world that societies have trouble articulating in other ways. We can realize that myths are historically inaccurate while finding positive steps forward in them.

The Silicon Valley mentality will have some effect through new devices and mobile phone apps that promote healthy activity. They can help with everything from basic compliance--remembering to take prescribed meds--to promoting fitness crazes and keeping disabled people in their homes. A lecture given once a year in the doctor's office doesn't lead to deep personal change, but having a helper nearby (even a digital one) can impel a person to act better, hour by hour and day by day. Psychologists have shown this over and over: motivation is best delivered in small, regular doses (a theme found in my posting from HIMSS).

Because the most needy patients are often the most recalcitrant ones, personal responsibility has to intersect with professional guidance. A doctor has to work with the patient, and other staff can shore up good habits as well. This requires the doctors' electronic record systems to accept patient data, such as weight and mood. Projects such as Indivo X support these enhancements, which traditional electronic record systems are ill-prepared for.

Although doctors eschew case management, there are plenty of other professionals who can help them with it, and forming Accountable Care Organizations gives the treatment staff access to such help. Tons of potential savings lie in the data that clinicians could collect and aggregate. Still more data is being loaded by the federal government regularly at Health.Data.Gov. ACOs and other large institutions can hire people who love to crunch big data (if such staff can be found, because they're in extremely high demand now in almost every industry) to create systems that slide seamlessly into clinical decision support and provide guidelines for better treatment, as well as handle the clinic's logistics better. So what we need to do is train a lot more experts in big data to understand the health care field and crunch its numbers.

Change will be disruptive, and will not be welcomed with open arms. Those who want a better system need to look at the areas where change is most likely to make a difference.

February 29 2012

Report from HIMSS 12: wrap-up of the largest health IT conference

This is a time of great promise in health care, yet an oppressive atmosphere hung over much of HIMSS. All the speakers--not least the government representatives who announced rules for the adoption of electronic health records--stressed commendable practices such as data exchange, providing the patient with information, and engaging with the patient. Many berated hospitals, doctors, and vendors for neglecting the elements that maintain health. But the thrust of most sessions was on such details as how to convert patient records to the latest classification of diseases (ICD-10).

Intelligent Hospital pavilion shows off tempting technology.

I have nothing against ICD-10 and I'm sure adopting it is a big headache that deserves attention at the conference. The reason I call the atmosphere oppressive is that I felt stuck among health care providers unable to think long-term or to embrace the systems approach that we'll need to cure people and cut costs. While some health care institutions took the ICD-10 change-over seriously and put resources into meeting the deadline, others pressured the Department of Health and Human Services to delay implementation, and apparently won a major reprieve. The health IT community, including HIMSS, criticized the delay. But resistance to progress usually does not break out so overtly; it remains ingrained in day-to-day habits.

But ICD-10 is a sideline to the major issue of Stage 2 meaningful use. Why, as I reported on Wednesday, were so many of the 35,000 HIMSS attendees wrapped up in the next step being forced on them by the federal government? The scandal is that these meaningful use concepts (using data to drive care, giving care-givers information that other care-givers have collected about the patient) have to be forced on them. Indeed, institutions like Kaiser Permanente that integrated their electronic records years ago and concentrated on the whole patient had relatively little work to do to conform to Stage 1, and probably have the building blocks for Stage 2 in place. And of course these things are part of the landscape of health care in other countries. (The proposed regulations were finally posted last Thursday.)

Recipients of Regina Holliday jackets record patient involvement stories.

Haven't our providers heard that an ounce of prevention is worth a pound of cure? Don't well-educated and well-paid executives invest in quality measures with the expectation that they'll pay off in the long run? And aren't we all in the field for the good of the patients? What is that snickering I hear?

Actually, I don't accept the premise that providers are all in it for the money. If so many are newly incentivized to join the government's program for a mere $15,000 per doctor (plus avoiding some cuts in Medicare payments), which is a small fraction of the money they'll have to spend implementing the program, they must know that it's time to do the right thing. Meaningful use can be a good framework to concretize the idealistic goals of health care reform, but I just wish the vendors and doctors would keep their eyes more on the final goal.

Redwood MedNet in Northern California is an example of a health information exchange that adopted standards (CONNECT, before the Direct project was in place) to simplify data exchange between health providers. Will Ross of Redwood MedNet told me that qualifying for Stage 2 would be simple for them, "but you won't hear that from many vendors in this exhibit hall."

Annual surveys by Family Practice Management journal about their readers' satisfaction with EHRs, reviewed in one HIMSS session, showed widespread dissatisfaction that doesn't change from year to year. For instance, 39% were dissatisfied with support and training, although a few vendors rated quite high. Still, considering that doctors tend to veer away from open source solutions and pay big bucks for proprietary ones out of a hope of receiving better support and training, they deserve better. It's worth noting that the longer a practice uses its system, the more they're likely to express satisfaction. But only 38% of respondents would purchase the same systems now if they weren't already locked in.

That's the big, frustrating contradiction at HIMSS. The vendors have standards (HL7 and others), they've been setting up health information exchanges (under various other names) for years, they have a big, popular interoperability lab at each conference--and yet most patients still have to carry paper records and CDs with images from one doctor to another. (A survey of HIMSS members showed that one-quarter allowed access by patients to their data, which is an advance but still just a start.) The industry as a whole has failed to make a dent in the 90,000 to 100,000 needless deaths that occur in treatment facilities each year. And (according to one speaker) 20% of patients hospitalized under Medicare have to return to the hospital shortly after discharge.

Omens of change

Suffice it to say that by my fourth day at HIMSS I was not happy. Advances come, but slowly. Examples of developments I can give a thumbs-up to at HIMSS were data sharing among physicians who use Practice Fusion, a popular example of a growing move to web services for electronic records, and a CardioEngagement Challenge funded by Novartis to encourage at-risk patients to take more interest in their health. The winner was a Sensei mobile app that acts as an automated coach. Sensei CEO Robert Schwarzberg, a cardiologist, told me he had put together phone-in coaching services for heart patients during the years before mobile apps, and was frustrated that these coaches were available less than once a week when what patients needed was round-the-clock motivation. Sensei Wellness is one of the many mobile apps that make both patients and doctors more connected, and HIMSS quite properly devoted a whole section of the exhibit floor to them.

Talking about Sensei Wellness with Dr. Robert Schwarzberg.

I dropped by the IBM booth for the obligatory demo of Watson's medical application, and some background from Dr. Josko Silobrcic. I also filled in some of this report from an earlier conversation with tech staff.

Medical diagnosis involves more structured data than solving Jeopardy riddles, structure that appears mostly in the form of links between data sets. For instance, medicines are linked to diagnoses, to lab results, and to other medicines (some drugs are contraindicated when the patient is taking other drugs). Watson follows these relationships.

But because Watson is a natural language processing application--based on UIMA, which IBM donated to the Apache Foundation--it doesn't try to do much reasoning to pick out the best diagnosis or treatment, both of which are sometimes requested of it. Instead, it dumps huge indexes of medical articles into its data stores on one side, and takes in the text about the patient's complaint and doctor's evaluation on the other. Matching them up is not so different from a Jeopardy question, after all. Any possible match is considered and kept live until the final round of weighing answers, even if the chance of matching is near zero.
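That keep-everything-alive matching style can be illustrated in miniature (a toy sketch with an invented three-entry "corpus" of term sets; Watson's actual pipeline scores candidates with far richer evidence than word overlap):

```python
# Toy candidate-and-score pipeline: every article-derived hypothesis
# stays live with a score, and only the final ranking step decides.
corpus = {
    "myocardial infarction": {"chest", "pain", "radiating", "arm", "sweating"},
    "gerd": {"chest", "pain", "after", "meals", "burning"},
    "anxiety": {"chest", "tightness", "worry"},
}

def rank_hypotheses(complaint):
    words = set(complaint.lower().split())
    scored = []
    for diagnosis, evidence in corpus.items():
        # crude evidence score: fraction of the candidate's terms
        # found in the complaint text; near-zero matches are kept too
        overlap = len(words & evidence) / len(evidence)
        scored.append((overlap, diagnosis))
    return sorted(scored, reverse=True)

ranking = rank_hypotheses("chest pain radiating to left arm with sweating")
print(ranking[0][1])  # best-supported hypothesis
```

The design choice worth noticing is that nothing is pruned early: weak candidates survive until the final weighing, where accumulated evidence sorts them out.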

Dr. Josko Silobrcic before Watson demonstration.

Also because of the NLP basis for matching, there is rarely a need to harmonize disparate data taken in from different journals or medical sources.

I assumed that any processing that uses such a large data set and works so fast must run on a huge server farm, but the staff assured me it's not as big as one would think. For production use, of course, they'll need to take into account the need to scale. The medical informatics equivalent of a Christmas rush on sales would be an epidemic where everybody in the region is urgently hitting Watson for critical diagnoses.

Coming to peace

Healing came to me on my last day at HIMSS, at two related conferences off to the side of the main events: a meeting of Open Health Tools members and the eCollaboration forum, run by health activists who want to break down barriers to care. Both groups have partnerships with HIMSS.

Open Health Tools positions itself as an umbrella organization for projects making free software for a lot of different purposes in health care: recording, treatment, research, and more. One illustrative project I got to hear about at their meeting was the Medical Imaging Network Transport (MINT), which Johns Hopkins is working on in coordination with other teams.

MINT cuts down on the transfers of huge images by doing some processing in place and transferring only portions of the data. Switching to modern storage formats (XML and JSON) and better methods of data transfer also reduces waste. For instance, current DICOM vendors transmit images over TCP, which introduces more overhead than necessary when handling the packet losses engendered by transmitting files that are several gigabytes in size. MINT allows UDP and other protocols that are leaner than TCP.
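The gain from separating metadata from bulk pixel data and transferring only what's needed can be sketched in a few lines (a hypothetical in-memory store, not the actual MINT wire protocol, which is considerably more involved):

```python
import json

# Toy image store: study metadata kept as a small JSON manifest, bulk
# pixel data addressable in per-frame chunks, so a viewer can pull
# just the frames it needs instead of the whole multi-gigabyte study.
metadata = {"study": "CT-CHEST-001", "frames": 4, "frame_bytes": 1024}
pixel_data = bytes(metadata["frames"] * metadata["frame_bytes"])

def get_frame(n):
    """Return only frame n rather than shipping the whole study."""
    size = metadata["frame_bytes"]
    return pixel_data[n * size:(n + 1) * size]

manifest = json.dumps(metadata)   # the lightweight manifest travels first
frame = get_frame(2)              # then individual frames on demand
print(len(frame), "bytes instead of", len(pixel_data))
```

The same partial-retrieval idea is what makes leaner transports attractive: when only a slice of the data crosses the wire, the cost of each lost packet shrinks accordingly.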

Best of all, MINT DICOM images can be displayed through HTML5, which means any browser can view them in good resolution, there is no need to install a specialized viewer at each location where the doctor is checking the image, and dependence on proprietary software is reduced. (The same reliance on standard browsers is also claimed by eMix in a recent interview.)

At the eCollaboration forum, e-Patient Dave deBronkart reported that being an engaged patient is still swimming upstream. It's hard to get one's records, hard to find out what treatments will cost, and hard to get taken seriously as an adult interested in monitoring one's own care. Meg McCabe of Aetna said that insurers need to offer more sophisticated guidance to patients trying to choose a health provider--simple lists of options are confusing and hard to choose from.

One speaker warned providers that if they try to open their data for collaborative care, they may find themselves hampered by contracts that maintain vendor ownership of EHR data. But speakers assured us vendors are not evil. The issue is what the providers ask for when they buy the EHR systems.

Here's the strange thing about the eCollaboration forum: they signed up enough people to fill the room ahead of time and left many potential attendees lamenting that they couldn't get in. Yet on the actual day of the event, there were about eight empty seats for every attendee. Maybe HIMSS attendees felt they had to devote all their time to the Stage 2 regulations mentioned earlier. But I take the disappointing turn-out as a sign of the providers' and vendors' lack of commitment to change. Shown a dazzling roster of interesting talks about data exchange, open record sharing, and patient engagement, they're quick to sign up--but they don't show up when it counts.

As members of the general public, we can move the health care field forward by demanding more from our providers, at the point where we have some influence. Anyone looking for concrete guidance for increasing their influence as a patient can try e-Patients Live Longer: The Complete Guide to Managing Health Care Using Technology, by Nancy B. Finn.

Public attention and anger have been focused on insurers, who have certainly engaged in some unsavory practices to avoid paying for care--but nothing as destructive as the preventable errors and deaths caused by old-fashioned medical practices. And while economists complain about the 30 cents out of every dollar wasted in the American hodge-podge of payment systems, we know that unnecessary medical procedures or, conversely, preventative steps that were omitted, also suck up a lot of money. One speaker at the eCollaboration forum compared the sky-rocketing costs of health care and insurance to a financial bubble that can't last. Let's all take some responsibility for instituting better medical and reporting systems so the costs come down in a healthy manner.

Other articles about HIMSS were posted last Tuesday and Wednesday.

February 23 2012

Report from HIMSS 2012: toward interoperability and openness

I was wondering how it would feel to be in the midst of 35,000 people whose livelihoods are driven by the decisions of a large institution at the moment when that institution releases a major set of rules. I didn't really find out, though. The 35,000 people I speak of are the attendees of the HIMSS conference and the institution is the Department of Health and Human Services. But HHS just sort of half-released the rules (called Stage 2 of meaningful use), telling us that they would appear online tomorrow and meanwhile rushing over a few of the key points in a presentation that drew overflow crowds in two rooms.

The reaction, I sensed, was a mix of relief and frustration. Relief because Farzad Mostashari, National Coordinator for Health Information Technology, promised us the rules would be familiar and would hew closely to what advisors had requested. Frustration, however, at not seeing the details. The few snippets put up on the screen contained enough ambiguities and poorly worded phrases that I'm glad there's a 60-day comment period before the final rules are adopted.

There isn't much one can say about the Stage 2 rules until they are posted and the experts have a chance to parse them closely, and I'm a bit reluctant to throw onto the Internet one of potentially 35,000 reactions to the announcement, but a few points struck me enough to be worth writing about. Mostashari used his pulpit for several pronouncements about the rules:

  • HHS would push ahead on goals for interoperability and health information exchange. "We can't wait five years," said Mostashari. He emphasized the phrase "standard-based" in referring to HIE.

  • Patient engagement was another priority. To attest to Stage 2, institutions will have to allow at least half their patients to download and transfer their records.

  • They would strive for continuous quality improvement and clinical decision support, key goals enabled by the building blocks of meaningful use.

Two key pillars of the Stage 2 announcement are requirements to use the Direct project for data exchange and HL7's consolidated CDA for the format (the only data exchange I heard mentioned was a summary of care, which is all that most institutions exchange when a patient is referred).

The announcement demonstrates the confidence that HHS has in the Direct project, which it launched just a couple of years ago and which exemplifies a successful joint government/private-sector project. Direct will allow health care providers of any size and financial endowment to use email or the Web to share summaries of care. (I mentioned it in yesterday's article.) With Direct, we can hope to leave the cumbersome and costly days of health information exchange behind. The older and more complex CONNECT project will be an option as well.

The other half of that announcement, regarding adoption of the CDA (incarnated as a CCD for summaries of care), is a loss for the older CCR format, which was an option in Stage 1. The CCR was the Silicon Valley version of health data, a sleek and consistent XML format used by Google Health and Microsoft HealthVault. But health care experts criticized the CCR as not rich enough to convey the information institutions need, so it lost out to the more complex CCD.

The news on formats is good overall, though. The HL7 consortium, which has historically funded itself by requiring organizations to become members in order to use its standards, is opening some of them for free use. This is critical for the development of open source projects. And at an HL7 panel today, a spokesperson said they would like to head further in the direction of free licensing, but have to determine whether they can survive financially while doing so.

So I'm feeling optimistic that U.S. health care is moving "toward interoperability and openness," the phrase I used in the title of this article and also in a posting from HIMSS two years ago.

HHS allowed late-coming institutions (those who began the Stage 1 process in 2011) to continue at Stage 1 for another year. This is welcome because they have so much work to do, but it means that providers who want to demonstrate Stage 2 information exchange may have trouble, because they can't do it with other providers who are ready only for Stage 1.

HHS endorsed some other standards today as well, notably SNOMED for diseases and LRI for lab results. Another nice tidbit from the summit includes the requirement to use electronic medication administration (for instance, bar codes to check for errors in giving medicine) to foster patient safety.

February 22 2012

Report from HIMSS: health care tries to leap the chasm from the average to the superb

I couldn't attend the session today on StealthVest--and small wonder. Who wouldn't want to come see an Arduino-based garment that can hold numerous health-monitoring devices in a way that is supposed to feel like a completely normal piece of clothing? As with many events at the HIMSS conference, which has registered over 35,000 people (at least four thousand more than last year), the StealthVest presentation drew an overflow crowd.

StealthVest sounds incredibly cool (and I may have another chance to report on it Thursday), but when I gave up on getting into the talk I walked downstairs to a session that sounds kind of boring but may actually be more significant: Practical Application of Control Theory to Improve Capacity in a Clinical Setting.

The speakers in this session, from Banner Gateway Medical Center in Gilbert, Arizona, laid out a fairly standard use of analytics to predict when the hospital units are likely to exceed their capacity, and then to reschedule patients and provider schedules to smooth out the curve. The basic idea comes from chemical engineering, and requires them to monitor all the factors that lead patients to come in to the hospital and that determine how long they stay. Queuing theory can show when things are likely to get tight. Hospitals care a lot about these workflow issues, as Fred Trotter and David Uhlman discuss in the O'Reilly book Beyond Meaningful Use, and they have a real effect on patient care too.

The reason I find this topic interesting is that capacity planning leads fairly quickly to visible cost savings. So hospitals are likely to do it. Furthermore, once they go down the path of collecting long-term data and crunching it, they may extend the practice to clinical decision support, public health reporting, and other things that can make a big difference to patient care.
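The queuing-theory calculation behind this kind of capacity planning can be sketched in a few lines. The session didn't present code; the following Erlang C computation is a standard M/M/c queuing result, with the arrival rate, length of stay, and bed count below chosen arbitrarily for illustration:

```python
from math import factorial

def erlang_c(arrival_rate, service_rate, servers):
    """Probability that an arrival has to wait in an M/M/c queue
    (Erlang C formula). Here 'servers' might be staffed beds."""
    offered_load = arrival_rate / service_rate          # A = lambda / mu
    if offered_load >= servers:
        return 1.0                                      # unstable: the queue grows without bound
    top = (offered_load ** servers / factorial(servers)) * (servers / (servers - offered_load))
    bottom = sum(offered_load ** k / factorial(k) for k in range(servers)) + top
    return top / bottom

# Illustrative numbers: 10 admissions/day, average stay 2 days
# (service rate 0.5/day), 25 staffed beds -> offered load of 20 beds.
print(erlang_c(arrival_rate=10, service_rate=0.5, servers=25))
```

Running the forecasted admission rate through a formula like this, day by day, is what lets a hospital see a capacity crunch coming early enough to reschedule elective cases.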

A few stats about data in U.S. health care

Do we need a big push to do such things? We sure do, and that's why meaningful use was introduced in the HITECH sections of the American Recovery and Reinvestment Act. HHS released mounds of government health data on Health.data.gov hoping to serve a similar purpose. Let's just take a look at how far the United States is from using its health data effectively.

  • Last November, a CompTIA survey (reported by Health Care IT News) found that only 28% of providers have comprehensive EHRs in use, and another 17% have partial implementations. One has to remember that even a "comprehensive" EHR is unlikely to support the sophisticated data mining, information exchange, and process improvement that will eventually lead to lower costs and better care.

  • According to a recent Beacon Partners survey (PDF), half of the responding institutions have not yet set up an infrastructure for pursuing health information exchange, although 70% consider it a priority. The main problem, according to a HIMSS survey, is budget: HIEs are shockingly expensive. There's more to this story, which I reported on from a recent conference in Massachusetts.

Stats like these had to be kept in mind as HIMSS board chair Charlene S. Underwood extolled the organization's achievements in the morning keynote. HIMSS has promoted good causes, but only recently has it addressed the cost, interoperability, and open source issues that could allow health IT to break out beyond the elite of institutions large or sophisticated enough to adopt the right practices.

As signs of change, I am particularly happy to hear of HIMSS's new collaboration with Open Health Tools and its acquisition of the mHealth summit. These should guide the health care field toward more patient engagement and adaptable computer systems. HIEs are another area crying out for change.

An HIE optimist

With the flaccid figures for HIE adoption in mind, I met Charles Parisot, chair of interoperability standards and testing for EHRA, HIMSS's Electronic Health Records Association. The biggest EHR vendors and HIEs come together in this association, and Parisot was just stoked with positive stories about their advances.

His take on the cost of HIEs is that most of them just do it in a brute force manner that doesn't work. They actually copy the data from each institution into a central database, which is hard to manage from many standpoints. The HIEs that have done it right (notably in New York state and parts of Tennessee) are sleek and low-cost. The solution involves:

  • Keeping the data at the health care providers, and storing in the HIE only some glue data that associates the patient and the type of data to the provider.

  • Keeping all metadata about formats out of the HIE, so that new formats, new codes, and new types of data can easily be introduced into the system without recoding the HIE.

  • Breaking information exchange down into constituent parts--the data itself, the exchange protocols, identification, standards for encryption and integrity, etc.--and finding standard solutions for each of these.
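The record-locator idea in the first bullet can be sketched in a few lines of Python. The class and field names here are my own illustration, not EHRA's actual design: the exchange stores only pointers that associate a patient and a type of data with a provider, and the requester then fetches the documents directly from the providers that hold them.

```python
class RecordLocator:
    """Toy HIE index: stores *where* data lives, never the data itself."""

    def __init__(self):
        self._index = {}   # (patient_id, data_type) -> set of provider endpoints

    def register(self, patient_id, data_type, provider_endpoint):
        """A provider announces that it holds this type of data for this patient."""
        self._index.setdefault((patient_id, data_type), set()).add(provider_endpoint)

    def locate(self, patient_id, data_type):
        """Answer 'who holds lab results for this patient?' -- the requesting
        provider pulls the actual documents from those endpoints itself."""
        return sorted(self._index.get((patient_id, data_type), set()))

hie = RecordLocator()
hie.register("patient-42", "labs", "https://hospital-a.example/fhir")
hie.register("patient-42", "labs", "https://clinic-b.example/fhir")
print(hie.locate("patient-42", "labs"))
```

Because the index holds no clinical data and no format metadata, a new document type or code set can flow through the system without any change to the HIE itself.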

So EHRA has developed profiles (also known by their ONC term, implementation specifications) that indicate which standard is used for each part of the data exchange. Metadata can be stored in the core HL7 document, the Clinical Document Architecture, and differences between implementations of HL7 documents by different vendors can also be documented.

A view of different architectures in their approach can be found in an EHRA white paper, Supporting a Robust Health Information Exchange Strategy with a Pragmatic Transport Framework. As testament to their success, Parisot claimed that the interoperability lab (a huge part of the exhibit hall floor space, and a popular destination for attendees) could set up the software connecting all the vendors' and HIEs' systems in one hour.

I asked him about the simple email solution promised by the government's Direct project, and whether that may be the path forward for small, cash-strapped providers. He accepted that Direct is part of the solution, but warned that it doesn't make things so simple. Unless two providers have a pre-existing relationship, they need to be part of a directory or even a set of federated directories, and assure their identities through digital signatures.

And what if a large hospital receives hundreds of email messages a day from various doctors who don't even know to whom their patients are being referred? Parisot says metadata must accompany any communications--and he's found that it's more effective for institutions to pull the data they want than for referring physicians to push it.

Intelligence for hospitals

Finally, Parisot told me EHRA has developed standards for submitting data to EHRs from 350 types of devices, and has 50 manufacturers working on devices with these standards. I visited the booth of iSirona as an example. They accept basic monitoring data such as pulses from different systems that use different formats, and translate over 50 items of information into a simple text format that they transmit to an EHR. They also add networking to devices that communicate only over cables. Outlying values can be rejected by a person monitoring the data. The vendor pointed out that format translation will be necessary for some time to come, because neither vendors nor hospitals will replace their devices simply to implement a new data transfer protocol.
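iSirona's product is proprietary, but the translation step the vendor described--many device formats in, one simple text format out, with outliers flagged for human review--can be sketched generically. The field names, formats, and thresholds below are invented for illustration:

```python
def normalize(reading, fmt):
    """Translate a device-specific reading into a simple key=value text line."""
    if fmt == "vendor_a":          # e.g. {"HR": 72} from one monitor's protocol
        pulse = reading["HR"]
    elif fmt == "vendor_b":        # e.g. ("pulse", 72) from another's
        _, pulse = reading
    else:
        raise ValueError(f"unknown format: {fmt}")
    # Outlying values are not silently dropped; they are flagged so the
    # person monitoring the feed can accept or reject them.
    flagged = pulse < 30 or pulse > 220
    return f"pulse={pulse} flagged={str(flagged).lower()}"

print(normalize({"HR": 72}, "vendor_a"))     # pulse=72 flagged=false
print(normalize(("pulse", 250), "vendor_b"))  # pulse=250 flagged=true
```

The design point is that each device format needs only one small translator, and the EHR sees a single uniform stream regardless of how many vendors feed it.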

For more about devices, I dropped by one of the most entertaining parts of the conference, the Intelligent Hospital Pavilion. Here, after a badge scan, you are somberly led through a series of locked doors into simulated hospital rooms where you get to watch actors in nursing outfits work with lifesize dolls and check innumerable monitors. I think the information overload is barely ameliorated and may be worsened by the arrays of constantly updated screens.

But the background presentation is persuasive: by attaching RFIDs and all sorts of other devices to everything from people to equipment, and basically making the hospital more like a factory, providers can radically speed up responses in emergency situations and reduce errors. Some devices use the ISM "junk" band, whereas more critical ones use dedicated spectrum. Redundancy is built in throughout the background servers.

Waiting for the main event

The US health care field held its breath most of last week, waiting for Stage 2 meaningful use guidelines from HHS. The announcement never came, nor did it come this morning as many people had hoped. Because meaningful use is the major theme of HIMSS, and many sessions were planned on helping providers move to Stage 2, the delay in the announcement put the conference in an awkward position.

HIMSS is also frustrated by a delay in another initiative, the adoption of a new standard for the classification of diseases and procedures. ICD-10 is actually pretty old, having been standardized in the 1980s, and the U.S. lags decades behind other countries in adopting it. Advantages touted for ICD-10 are:

  • It incorporates newer discoveries in medicine than the dominant standard in the U.S., ICD-9, and therefore permits better disease tracking and treatment.

  • Additionally, it's much more detailed than ICD-9 (with an order of magnitude more classifications). This allows the recording of more information but complicates the job of classifying a patient correctly.
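The granularity difference is easy to see in code. ICD-9-CM has on the order of 14,000 diagnosis codes while ICD-10-CM has roughly 68,000, so a single ICD-9 code typically fans out to many ICD-10 codes that add laterality, encounter type, and other detail. The entries below illustrate the pattern only; a real crosswalk would come from the official General Equivalence Mappings (GEMs) published by CMS.

```python
# One coarse ICD-9 code fans out into several finer ICD-10 candidates.
# Codes shown are illustrative of the structure, not an authoritative mapping.
icd9_to_icd10 = {
    "813.81": [              # a radius-fracture code, ICD-9 style
        "S52.501A",          # right radius, initial encounter
        "S52.502A",          # left radius, initial encounter
        "S52.501D",          # right radius, subsequent encounter
        "S52.502D",          # left radius, subsequent encounter
    ],
}

def icd10_candidates(icd9_code):
    """Return the ICD-10 codes a coder must choose among for an old ICD-9 code."""
    return icd9_to_icd10.get(icd9_code, [])

print(len(icd10_candidates("813.81")))  # one old code, four candidate new codes
```

That one-to-many fan-out is exactly why ICD-10 both records more information and complicates the job of classifying a patient correctly.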

ICD-10 is rather controversial. Some people would prefer to base clinical decisions on SNOMED, a standard described in the Beyond Meaningful Use book mentioned earlier. Ultimately, doctors lobbied hard against the HHS timeline for adopting ICD-10 because providers are so busy with meaningful use. (But of course, the goals of adopting meaningful use are closely tied to the goals of adopting ICD-10.) It was this pushback that led HHS to accede and announce a delay. HIMSS and many of its members were disappointed by the delay.

In addition, there is an upcoming standard, ICD-11, whose sandal some say ICD-10 is not even worthy to lace. A strong suggestion that the industry just move to ICD-11 was aired in Government Health IT, and the possibility was raised in Health Care IT News as well. In addition to reflecting the newest knowledge about disease, ICD-11 is praised for its interaction with SNOMED and its use of Semantic Web technology.

That last point makes me a bit worried. The Semantic Web has not been widely adopted, and if people in the health IT field think ICD-10 is complex, how are they going to deal with drawing up and following relationships through OWL? I plan to learn more about ICD-11 at the conference.

February 06 2012

Small Massachusetts HIT conference returns to big issues in health care

I've come to look forward to the Massachusetts Health Data Consortium's annual HIT conference because--although speakers tout the very real and impressive progress made by Massachusetts health providers--you can also hear acerbic and ruthlessly candid critiques of policy and the status quo. Two notable take-aways from last year's conference (which I wrote up at the time) were the equivalence of old "managed care" to new "accountable care organizations" and the complaint that electronic health records were "too expensive, too hard to use, and too disruptive to workflow." I'll return to these claims later.

The sticking point: health information exchange

This year, the spears were lobbed by Ashish Jha of Harvard Medical School, who laid out a broad overview of progress since the release of meaningful use criteria and then accused health care providers of undermining one of its main goals, the exchange of data between different providers who care for the same patient. Through quantitative research (publication in progress), Jha's researchers showed a correlation between fear of competition and low adoption of HIEs. Hospitals with a larger, more secure position in their markets, or in more concentrated markets, were more likely to join an HIE.

The research bolsters Jha's claim that the commonly cited barriers to using HIEs (technical challenges, cost, and privacy concerns) are surmountable, and that the real problem is a refusal to join because a provider fears that patients would migrate to other providers. It seems to me that the government and public can demand better from providers, but simply cracking the whip may be ineffective. Nor should it be necessary. An urgent shortage of medical care exists everywhere in the country, except perhaps a few posh neighborhoods. There's plenty of work for all providers. Once insurance is provided to all the people in need, no institution should need to fear a lack of business, unless its performance record is dismal.

Jha also put up some research showing a strong trend toward adopting electronic health records, although the small offices that give half the treatment in the United States are still left behind. He warned that to see big benefits, we need to bring in health care institutions that are currently given little attention by the government--nursing homes, rehab facilities, and so forth--and give them incentives to digitize. He wrapped up by quoting David Blumenthal, former head of the ONC, on the subject of HIEs. Blumenthal predicted that we'd see EHRs in most providers over the next few years, and that the real battle would be getting them to adopt health information exchange.

Meanwhile, meaningful use could trigger a shake-out in the EHR industry, as vendors who have spent years building silo'd products fail to meet the Stage 2 requirements that fulfill the highest aspirations of the HITECH act that defined meaningful use, including health information exchange. At the same time, a small but steadily increasing number of open source projects have achieved meaningful use certification. So we'll see more advances in the adoption of both EHRs and HIEs.

Low-hanging fruit signals a new path for cost savings

The big achievement in Massachusetts, going into the conference today, was a recent agreement between the state's major insurer, Blue Cross Blue Shield, and the 800-pound gorilla of the state's health care market, Partners HealthCare System. The pact significantly slows the skyrocketing costs that we've all become accustomed to in the United States, through the adoption of global payments (that is, fixed reimbursements for treating patients in certain categories). That two institutions of such weight can relinquish the old, imprisoning system of fee-for-service is news indeed.

Note that the Blue Cross/Partners agreement doesn't even involve the formation of an Accountable Care Organization. Presumably, Partners believes it can pick some low-hanging fruit through modest advances in efficiency. Cost savings you can really count will come from ACOs, where total care of the patient is streamlined through better transfers of care and intensive communication. Patient-centered medical homes can do even more. So an ACO is actually much smarter than old managed care. But it depends on collecting good data and using it right.

The current deal is an important affirmation of the path Massachusetts took long before the rest of the country in aiming for universal health coverage. We all knew at the time that the Massachusetts bill was not addressing costs and that these would have to be tackled eventually. And at first, of course, health premiums went up because a huge number of new people were added to the rolls, and many of them were either sick or part of high-risk populations.

The cost problem is now being addressed through administrative pressure (at one point, Governor Deval Patrick flatly denied a large increase requested by insurers), proposed laws, and sincere efforts at the private level such as the Blue Cross/Partners deal. I asked a member of the Patrick administration whether the problem could be solved without a new law, and he expressed the opinion that there's a good chance it could be. Steven Fox of Blue Cross Blue Shield said that 70% of their HMO members go to physicians in their Alternative Quality Network, which features global payments. And he said these members have better outcomes at lower costs.

ACOs have a paradoxical effect on health information exchange. Jha predicted that ACOs will greatly streamline the exchanges between their member organizations, because those exchanges save money, but will resist exchanging data with outside providers, because keeping patients is even more important for ACOs than for traditional hospitals and clinics. Only by keeping a patient can the ACO reap the benefits of the investments it makes in long-term patient health.

As Doris Mitchell received an award for her work with the MHDC, executive director Ray Campbell mentioned the rapid growth and new responsibilities of her agency, the Group Insurance Commission, which negotiates all health insurance coverage for state employees, as cities and towns have been transferring their municipal employees to it. A highly contentious bill last year that allowed the municipalities to transfer their workers to the GIC was widely interpreted as a blow against unionized workers, when it was actually just a ploy to save money through the familiar gambit of combining the insured into a larger pool. I covered this controversy at the time.

A low-key conference

Attendance was down at this year's conference, with about half the attendees and vendors of last year's. Lowered interest also seemed reflected in the fact that none of the three CEOs receiving awards turned up to represent their institutions (the two institutions mentioned earlier for their historic cost-cutting deal--Blue Cross Blue Shield and Partners HealthCare--along with Steward Health Care).

The morning started with a thoughtful look at the requirements for ACOs by Frank Ingari of Essence Healthcare, who predicted a big rise in investment by health care institutions in their IT departments. Later speakers echoed this theme, saying that hospitals should invest less in state-of-the-art equipment that leads to immediately billable activities, and more in the underlying IT that will allow them to collect research data and cut down waste. Some of the benefits available through this research were covered in a talk at the Open Source convention a couple years ago.

Another intriguing session covered technologies available today that could be more widely adopted to improve health care. Videos of robots always draw an enthusiastic response, but a more significant innovation ultimately may be a database McKesson is developing that lets doctors evaluate genetic tests and decide when such tests are worth the money and trouble.

The dozen vendors were joined by a non-profit, Sustainable Healthcare for Haiti. Their first project is one of the most basic health interventions one can make: providing wells for drinkable water. They have a local sponsor who can manage their relationship with the government, and an ambitious mission that includes job development, an outpatient clinic, and an acute care children's hospital.

September 21 2011

David Blumenthal lauds incrementalism at forum on electronic health records

Anyone who follows health issues in the U.S. has to be obsessed with the workings of the Office of the National Coordinator (ONC). During the critical early phases of implementing HITECH and meaningful use, the National Coordinator himself was Dr. David Blumenthal, who came to speak yesterday in the Longwood medical area in Boston.

A long-time Bostonian who moved up from being a primary care physician, Blumenthal is now back at Mass General and Harvard Business School. Most of his speech yesterday was a summary of the reasoning behind meaningful use, but some off-the-cuff remarks at the end, as well as vigorous discussion during a following panel, provided some interesting perspectives. Best of all was hearing a lot of facts on the ground. These helped explain the difference between EHRs in theory and in practice.

Which comes first, electronic records or standard formats?

There were a lot of complaints at the forum about the lack of interoperability between electronic health records. Blumenthal declared twice that pushing doctors to adopt EHRs was a good idea because we have to have our information digitized before we can think of interchanging it. Coming from the perspective of having seen systems and standards develop--and having seen the mess that results from products out of sync with standards in areas ranging from CORBA to browsers--I disagree with this claim. Luckily, Blumenthal's actual work didn't match the simplistic "digitize first" approach. The ONC built some modest requirements for interoperability into the first stage of meaningful use and plans to ramp these requirements up quickly. Furthermore, they're engaging in intensive negotiations with industry players over EHR standards (see, for instance, my write-up of a presentation by John Halamka last May) and worked quite early on the ground-breaking CONNECT and Direct projects for information exchange.

I understand that an ideal standard can't be expected to spring from the head of Zeus. What perhaps the standards proponents should have worked on is a separation of formats from products. Most EHRs reflect an old-fashioned design that throws together data format, architecture, and user interface. Wouldn't it be great to start the formats off on their own course, and tell EHR vendors to design wonderful interfaces that are flexible enough to adapt to format changes, while competing on providing clinicians with the best possible interface and workflow support? (Poor workflow was another common complaint at last night's forum.) That's the goal of the Indivo project. I interviewed Daniel Haas from that project in June.

Incrementalism in EHRs: accepting imperfection

Perhaps Blumenthal's enthusiasm for putting electronic records in place and seeking interoperability later reflects a larger pragmatism he brought up several times yesterday. He praised the state of EHRs (pushing back against members of the audience with stories to tell of alienated patients and doctors quitting the field in frustration), pointing to a recent literature survey in which 92% of studies found improved outcomes in patient care, cost control, or user satisfaction. And he said we would always be dissatisfied with EHRs because we compare them to some abstract ideal.

I don't think his assurances or the literature survey can assuage everyone's complaints. But his point that we should compare EHRs to paper is a good one. Several people pointed out that before EHRs, doctors simply lacked basic information when making decisions, such as what labs and scans the patient had a few months ago, or even what diagnosis a specialist had rendered. How can you complain that EHRs slow down workflow? Before EHRs there often was no workflow! Many critical decisions were stabs in the dark.

Too much content, too much discontent

Even so, it's clear that EHRs have to get better at sifting and presenting information. Perhaps even more important, clinicians have to learn how to use them better, so they can focus on the important information. One member of the audience said that after her institution adopted EHRs, discharge summaries went from 3 pages to 10 pages in average length. This is probably not a problem with EHRs, but with clinicians being lazy and taking advantage of the cut-and-paste function.

The computer was often described as a "third person in the room" during patient visits, and even, by panelist and primary care physician Gerard Coste, as a two-year-old who takes up everybody's attention. One panelist, law professor and patient representative Michael Meltsner, suggested that medical residents need to be trained about how to maintain a warm, personal atmosphere during an interview while looking up and entering data. Some people suggested that better devices for input and output (read: iPads) would help.

Blumenthal admitted that electronic records can increase workloads and slow doctors down. "I've said that the EHR made me a better doctor, but I didn't say it made me a faster one." He used this as a lead-in to his other major point during the evening, which is that EHRs have to be adopted in conjunction with an overhaul of our payment and reward system for doctors. He cited Kaiser Permanente (a favorite of health care reformers, even though doctors and patients in that system have their share of complaints) as a model because they look for ways to keep patients healthy with less treatment.

While increasing workloads, electronic records also raise patient expectations. Doctors are really on the hook for everything in the record, and have to act as if they know everything in it. Similar expectations apply to coordination of care. Head nurse Diane L Gilworth said, "Patients think we talk to each other much more than we do." The promise of EHRs and information interchange hasn't been realized.

New monitoring devices and the movement for a patient centered medical home will add even more data to the mix. I didn't ask a question during the session (because I felt it was for clinicians and they should be the ones to have their say), but if I could have posed a question, it would be this: one speaker reminded the audience that the doctor is liable for all the information in the patient's record. But the patient centered medical home requires the uploading of megabytes of data that is controlled by the patient, not the doctor. Doctors are reluctant to accept such data. How can we get the doctor and patient to collaborate to produce high-quality data, and do we need changes in regulations for that to happen?

A plea for an old-fashioned relationship

One theme bubbled up over and over at yesterday's meeting: the clinicians don't want to be dazzled by more technology. They just want more time to interview patients and a chance to understand them better. Their focus is not on meaningful use but on meaningful contact. If EHRs can give them and their patients that experience, EHRs are useful and will be adopted enthusiastically. If EHRs get in the way, they will be rejected or undermined. This was an appropriate theme for a panel organized by the Schwartz Center for Compassionate Healthcare.

That challenge is harder to deal with than interchange formats or better I/O devices. It's at the heart of complaints over workflow and many other things. But perhaps it should be at the top of the EHR vendors' agendas.

July 30 2011

Report from Open Source convention health track, 2011

Open source software in health care? It's limited to a few pockets of use--at least in the United States--but if you look at it a bit, you start to wonder why any health care institution uses any proprietary software at all.

What the evidence suggests

Take the conference session by University of Chicago researchers commissioned to produce a report for Congress on open source in health care. They found several open source packages that met the needs for electronic records at rural providers with few resources, such as safety-net providers.

They found that providers who adopted open source started to make the changes that the adoption of electronic health records (or any major new system) is supposed to bring about, but rarely does in proprietary health settings.

  • They offer the kinds of extra attention to patients that improve their health, such as asking them questions about long-term health issues.

  • They coordinate care better between departments.

  • They have improved their workflows, saving a lot of money.

And incidentally, deployment of an open source EHR took an estimated 40% of the cost of deploying a proprietary one.

Not many clinics of the type examined--those in rural, low-income areas--have the time and money to install electronic records, and far fewer use open source ones. But the half-dozen examined by the Chicago team were clear success stories. They covered a variety of areas and populations, and three used WorldVistA while three used other EHRs.

Their recommendations are:

  • Greater coordination between open source EHR developers and communities, to explain what open source is and how it benefits providers.

  • Forming a Community of Practice on health centers using open source EHRs.

  • Greater involvement from the Federal Government, not to sponsor open source, but to make communities aware that it's an option.

Why do so few providers adopt open source EHRs? The team attributed the problem partly to prejudice against open source. But I picked up another, deeper concern from their talk. They said success in implementing open source EHRs depends on a "strong, visionary leadership team." As much as we admire health providers, teams like that are hard to form and consequently hard to find. But of course, any significant improvement in work processes would require such a team. What the study demonstrated is that it happens more in the environment of an open source product.

There are some caveats to keep in mind when considering these findings--some limitations to the study. First, the researchers had very little data about the costs of implementing proprietary health care systems, because the vendors won't allow customers to discuss it, and just two studies have been published. Second, the sample of open source projects was small, although the consistency of positive results was impressive. And the researchers started out sympathetic to open source. Despite the endorsement of open source represented by their findings, they recognized that it's harder to find open source and that all the beneficial customizations take time and money. During a Birds-of-a-Feather session later in the conference, many of us agreed that proprietary solutions are here for quite some time, and can benefit by incorporating open source components.

The study nevertheless remains important and deserves to be released to Congress and the public by the Department of Health and Human Services. There's no point to keeping it under wraps; the researchers are proceeding with phase 2 of the study with independent funding and are sure to release it.

So who uses open source?

It's nice to hear about open source projects (and we had presentations on several at last year's OSCon health care track) but the question on the ground is what it's like to actually put one in place. The implementation story we heard this year was from a team involving Roberts-Hoffman Software and Tolven.

Roberts-Hoffman is an OSCon success story. Last year they received a contract from a small health care provider to complete a huge EHR project in a crazily short amount of time, including such big-ticket requirements as meeting HIPAA requirements. Roberts-Hoffman knew little about open source, but surmised that the customization it permitted would let them achieve their goal. Roberts-Hoffman CEO Vickie Hoffman therefore attended OSCon 2010, where she met a number of participants in the health care track (including me) and settled on Tolven as their provider.

The customer put some bumps in the road to the open source approach. For instance, they asked with some anxiety whether an open source product would expose their data. Hoffman had a little educating to do.

Another hurdle was finding a vendor to take medication orders. Luckily, Lexicomp was willing to work with a small provider and showed a desire to have an open source solution for providers. Roberts-Hoffman ended up developing a Tolven module using Lexicomp's API and contributing it back to Tolven. This proprietary/open source merger was generally quite successful, although it was extra work providing tests that someone could run without a Lexicomp license.

In addition to meeting what originally seemed an impossible schedule, Tolven allowed an unusual degree of customization through templating, and ensured the system would work with standard medical vocabularies.

Why can't you deliver my data?

After presentations on health information exchanges at OSCON, I started to ruminate about data delivery. My wife and I had some problems with appliances this past Spring and indulged in some purchases of common household items, a gas grill from one company and a washing machine from another. Each offered free delivery. So if low-margin department stores can deliver 100-pound appliances, why can't my doctor deliver my data to a specialist I'm referred to?

The CONNECT Gateway and the Direct project aim to solve that problem. CONNECT is the older solution, with Direct offering an easier-to-implement system that small health care providers will appreciate. Both have the goal of allowing health care providers to exchange patient data with each other, and with other necessary organizations such as public health agencies, in a secure manner.

David Riley, who directed the conversion of CONNECT to an open-source, community-driven project at the Office of the National Coordinator in the Department of Health and Human Services, kicked off OSCon's health care track by describing the latest developments. He had led off last year's health care track with a perspective on CONNECT delivered from his role in government, and he moved smoothly this time into covering the events of the past year as a private developer.

The open-source and community aspects certainly proved their value when a controversy and lawsuit over government contracts threatened to stop development on CONNECT. Although that's all been resolved now, Riley decided in the spring to leave government and set up an independent non-profit foundation, Alembic, to guide CONNECT. The original developers, notably Brian Behlendorf, moved over to Alembic, and a number of new companies and contributors came along. Most of the vendors who had started out on the ONC project stayed with the ONC, and were advised by Riley to do so until Alembic's course was firm.

Lots of foundations handle open source projects (Apache, etc.), but Riley and Behlendorf decided none of them was right for a government-centric health care project. CONNECT demanded a unique blend of sensitivity to the health care field and experience dealing with government agencies, which have special contract rules and have trouble dealing with communities. For instance, government agencies are tasked by Congress with developing particular solutions in a particular time frame, and cannot cite as an excuse that some developer had to take time off to get a full-time job elsewhere.

Riley knows how to handle the myriad pressures of these projects, and has brought that expertise to Alembic. CONNECT software has been released and further developed under a BSD license as the Aurion project. Now that the ONC is back on track and is making changes of its own, the two projects are trying to heal the fork and are following each other's changes closely. Because Aurion has to handle sensitive personal data deftly, Riley hopes to generalize some of the software and create other projects for handling personal data.

Two Microsoft staff came to OSCon to describe Direct and the open-source .NET libraries implementing it. It turned out that many in the audience were uninformed about Direct (despite an intense outreach effort by the ONC) and showed a good deal of confusion about it. So speakers Vaibhav Bhandari and Ali Emami spent the whole time allotted (and more) explaining Direct, with time for just a couple slides pointing out what the .NET libraries can do.

Part of the problem is that security is broken down into several different functions in ONC's solution. Direct does not help you decide whether to trust the person you're sending data to (you need to establish a trust relationship through a third party that grants certificates) or find out where to send it (you need to know the correspondent's email address or another connection point). But two providers or other health care entities who make an agreement to share data can use Direct to do so over email or other upcoming interfaces.
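To make that separation concrete, here is a minimal sketch in Python of the division of labor Direct assumes: trust is established out of band by anchoring a correspondent's certificate through a third party, and transport refuses to proceed without an anchor. All names are hypothetical and the certificate bytes are stand-ins; a real deployment uses X.509 certificates and S/MIME-encrypted email, not these toy structures.

```python
import hashlib

def fingerprint(cert_bytes: bytes) -> str:
    """SHA-256 fingerprint of a certificate, hex-encoded."""
    return hashlib.sha256(cert_bytes).hexdigest()

class TrustAnchors:
    """Certificates vouched for by an authority the practice trusts."""
    def __init__(self):
        self._anchored = set()

    def anchor(self, cert_bytes: bytes):
        self._anchored.add(fingerprint(cert_bytes))

    def is_trusted(self, cert_bytes: bytes) -> bool:
        return fingerprint(cert_bytes) in self._anchored

def send_direct_message(anchors, recipient_addr, recipient_cert, payload):
    # Direct itself does not decide whom to trust; it relies on a
    # prior agreement established through a certificate authority.
    if not anchors.is_trusted(recipient_cert):
        raise PermissionError(f"No trust anchor for {recipient_addr}")
    # A real implementation would S/MIME-encrypt the payload to the
    # recipient's certificate and hand it to an SMTP gateway.
    return {"to": recipient_addr, "encrypted_payload": payload}

anchors = TrustAnchors()
specialist_cert = b"--- specialist certificate bytes ---"
anchors.anchor(specialist_cert)
msg = send_direct_message(anchors, "dr.smith@direct.example.org",
                          specialist_cert, b"<summary-of-care document>")
```

The point of the sketch is the failure mode: two providers who have made no trust agreement simply cannot exchange data over Direct, no matter how good the transport is.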

There was a lot of cynicism among attendees and speakers about whether government efforts, even with excellent protocols and libraries, can get doctors to offer patients and other doctors the necessary access to data. I think the reason I can get a big-box store to deliver an appliance but I can't get my doctor to deliver data is that the big-box store is part of a market, and therefore wants to please the customer. Despite all our talk of free markets in this country, health care is not a market. Instead, it's a grossly subsidized system where no one has choice. And it's not just the patients who suffer. Control is removed from the providers and payers as well.

The problem will be solved when patients start acting like customers and making appropriate demands. If you could say, "I'm not filling out those patient history forms one more time--you just get the information where I'm going," it might have an effect. More practically speaking, let's provide simple tools that let patients store their history on USB keys or some similar medium, so we can walk into a doctor's office and say "Here, load this up and you'll have everything you need."
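The USB-key idea is simple enough to sketch. This minimal example assumes a plain JSON file rather than a standard clinical document format such as CCD, and every field name is illustrative:

```python
import json
import tempfile
from pathlib import Path

# A self-contained record the next practice can load instead of
# asking the patient to fill out history forms again. The fields
# are illustrative, not a standard format.
history = {
    "name": "Jane Example",
    "allergies": ["penicillin"],
    "medications": [{"name": "lisinopril", "dose_mg": 10}],
    "conditions": ["hypertension"],
}

def export_history(record: dict, mount_point: Path) -> Path:
    """Write the record to removable media as pretty-printed JSON."""
    out = mount_point / "patient_history.json"
    out.write_text(json.dumps(record, indent=2))
    return out

def import_history(path: Path) -> dict:
    """Load the record at the receiving practice."""
    return json.loads(path.read_text())

usb = Path(tempfile.mkdtemp())     # stand-in for the USB mount point
path = export_history(history, usb)
assert import_history(path)["allergies"] == ["penicillin"]
```

The hard part, of course, is not the file format but getting every practice to read and write the same one; that is exactly the standardization argument made above.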

What about you, now?

Patient control goes beyond data. It's really core to solving our crisis in health care and costs. A lot of sessions at OSCon covered things patients could do to take control of their health and their data, but most of them were assigned to the citizen health track (I mentioned them at the end of my preview article a week ago) and I couldn't attend them because they were concurrent with the health care track.

Eri Gentry delivered an inspiring keynote about her work in the biology start-up BioCurious, Karen Sandler (who had spoken in last year's health care track) scared us all with the importance of putting open source software in medical devices, and Fred Trotter gave a brief but riveting summary of the problems in health care. Fred also led a session on the Quantified Self, which was largely a discussion with the audience about ways we could encourage better behavior in ourselves and the public at large.

Guaranteed to cause meaningful change

I've already touched on the importance of changing how most health care institutions treat patients, and how open source can help. David Uhlman (who has written a book for O'Reilly with Fred Trotter) covered the complex topic of meaningful use, a phrase that appeared in the Recovery Act of 2009 and that drives just about all the change in current U.S. institutions. The term "meaningful use" implies that providers do more than install electronic systems; they use them in ways that benefit the patients, the institutions themselves, and the government agencies that depend on their data and treatments.

But Uhlman pointed out that doctors and health administrators--let alone the vendors of EHRs--focus on the incentive money and seem eager to do the minimum that gets them a payout. This is self-defeating, because the government will raise the requirements for meaningful use over the years, overwhelming quick-and-dirty implementations that fail to solve real problems. Of course, the health providers keep pushing the more stringent requirements back to later years, but they'll have to face the music someday. Perhaps the delay will be good for everyone in the long run, because it will give open source products a chance to demonstrate their value and make inroads where they are desperately needed.

As a crude incentive to install electronic records, meaningful use has been a big success. Before the Recovery Act was passed, 15%-20% of U.S. providers had EHRs. Now the figure is 60% or 70%, and by the end of 2012 it will probably be 90%. But it remains to be seen whether doctors use these systems to make better clinical decisions, follow up with patients so they comply with treatments, and eliminate waste.

Uhlman said that technology accounts for about 20% of the solution. The rest is workflow. For instance, every provider should talk to patients on every visit about central health concerns, such as hypertension and smoking. Research has suggested that this will add 30% more time per visit. If it reduces illness and hospital admissions, of course, we'll all end up paying less in taxes and insurance. His slogan: meaningful use is a payout for quality data.

It may be surprising--especially to an OSCon audience--that one of the biggest hurdles to achieving meaningful use is basic computer skills. We're talking here about typing information in correctly, knowing that you need to scroll down to look at all the information on the screen, and the like. All the institutions Uhlman visits think they're in fine shape and everybody has the basic skills, but every examination he's done shows that 20%-30% of the staff are novices in computer use. And of course, facilities are loath to spend extra money to develop these skills.

Open source everywhere

Open source has image and marketing problems in the health care field, but solutions are emerging all over the place. Three open source systems are currently certified for meaningful use: ClearHealth (Uhlman's own product), CareVue from MedSphere, and WorldVistA. OpenEMR is likely to join them soon, having completed the testing phase. vxVistA is certified but may depend on some proprietary pieces (the status was unclear during the discussion).

Two other intriguing projects presented at OSCon this year were popHealth and Indivo X. I interviewed architects from Indivo X and popHealth before they came to speak at OSCon. I'll just say here that popHealth has two valuable functions. It helps providers improve quality by providing a simple web interface that makes it easy for them to view and compare their quality measures (for instance, whether they offered appropriate treatment for overweight patients). Additionally, popHealth saves a huge amount of tedious manual effort by letting them automatically generate reports about these measures for government agencies. Indivo fills the highly valued space of personal health records. It is highly modular, permitting new data sources and apps to be added; in fact, speaker Daniel Haas wants it to be an "app store" for medical applications. Both projects use modern languages, frameworks, and databases, facilitating adoption and use.
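As a rough illustration of what a quality measure boils down to (this is not popHealth's actual code, and the field names are hypothetical), consider the overweight example: a measure is essentially a numerator counted over a denominator of qualifying patients.

```python
# Illustrative quality measure in the spirit of popHealth: of adult
# patients with BMI >= 25, what fraction have a documented follow-up
# plan? The records and field names are fabricated for illustration.

patients = [
    {"id": 1, "age": 52, "bmi": 31.0, "followup_plan": True},
    {"id": 2, "age": 44, "bmi": 27.5, "followup_plan": False},
    {"id": 3, "age": 61, "bmi": 23.0, "followup_plan": False},  # excluded
    {"id": 4, "age": 38, "bmi": 29.1, "followup_plan": True},
]

def overweight_followup_measure(records):
    # Denominator: adults who qualify for the measure at all.
    denominator = [p for p in records if p["age"] >= 18 and p["bmi"] >= 25]
    # Numerator: those for whom the desired action was documented.
    numerator = [p for p in denominator if p["followup_plan"]]
    return {"numerator": len(numerator),
            "denominator": len(denominator),
            "rate": len(numerator) / len(denominator) if denominator else None}

result = overweight_followup_measure(patients)
```

A reporting layer would then render results like this on the web for the provider and serialize them in whatever format an agency requires, which is the tedious manual work popHealth automates.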

Other health care track sessions

An excellent and stimulating track was rounded out with several other talks.

Shahid Shah delivered a talk on connecting medical devices to electronic record systems. He adroitly showed how the data collected from these devices is the most timely and accurate data we can get (better than direct reports from patients or doctors, and faster than labs), but we currently let it slip away from us. He also went over standard pieces of the open source stacks that facilitate the connection of devices, talked a bit about regulations, and discussed the role of routine engineering practices such as risk assessments and simulations.

Continuing on the quality theme, David Richards mentioned some lessons he learned designing a clinical decision support system. It's a demanding discipline. Accuracy is critical, but results must be available quickly so the doctor can use them to make decisions during the patient visit. Furthermore, the suggestions returned must be clear and precise.

Charlie Quinn talked about the collection of genetic information to achieve earlier diagnoses of serious conditions. I could not attend his talk because I was needed at another last-minute meeting, but I sat down for a while with him later.

The motto at his Benaroya Research Institute is to have diagnosis be more science, less art. With three drops of blood, they can do a range of tests on patients suspected of having particular health conditions. Genomic information in the blood can tell a lot about health, because blood contains viruses and other genomic material besides the patient's own genes.

Tests can compare the patients to each other and to a healthy population, narrowing down comparisons by age, race, and other demographics. As an example, the institute took samples before a vaccine was administered, and then at several frequent intervals in the month afterward. They could tell when the vaccine had the most powerful effect on the body.

The open source connection here is the institute's desire to share data among multiple institutions so that more patients can be compared and more correlations can be made. Quinn said it's hard to get institutions to open up their data.

All in all, I was energized by the health care track this year, and really impressed with the knowledge and commitment of the people I met. Audience questions were well-informed and contributed a lot to the presentations. OSCon shows that open source health care, although it hasn't broken into the mainstream yet, already inspires a passionate and highly competent community.

July 11 2011

popHealth open source software permits viewing and reporting of quality measures in health care

A couple weeks ago I talked to two members of the popHealth project, which culls quality measures from electronic health records and formats them either for convenient display--so providers can review their quality measures on the Web--or for submission to regulators who require reports on these measures. popHealth was produced by the MITRE Corporation under a grant from the Office of the National Coordinator at the US Department of Health and Human Services. One of my interviewees, Andrew Gregorowicz, will speak about popHealth at the Open Source convention later this month.

Videos of the interviews follow.

Lisa Tutterow: The importance of quality measures in health care, and the niche filled by open source popHealth

Lisa Tutterow: How popHealth improves the reporting of quality measures in health care

Andrew Gregorowicz: popHealth's extendability and use of RESTful interfaces

Andrew Gregorowicz: popHealth's use of standard information from electronic health records, the goals of making it open source, and technical information

Andrew Gregorowicz: The relation of popHealth to standards, and the related hData project

Useful links:

Two other interviews with speakers at OSCon's health care track this year include Shahid N. Shah on medical devices and open source and Indivo X personal health record: an interview with Daniel Haas of Children's Hospital.

June 07 2011

Indivo X personal health record: an interview with Daniel Haas of Children's Hospital

I recently interviewed Daniel Haas from the Intelligent Health Lab (IHL) of the Children's Hospital Informatics Program (CHIP), within the Children's Hospital Boston, about Indivo, a project he will speak about at the O'Reilly Open Source convention.

Indivo is an open-source Personal Health Record (PHR) system. It is in use at Children's Hospital as well as several other institutions, it supports a wide range of applications through a RESTful interface, and it was the architectural and conceptual inspiration for a variety of systems in the Personally Controlled Health Record space, including Microsoft HealthVault.

The first video (below) describes how a PHR gives patients more control over data, where Indivo came from, and the purpose of releasing Indivo under the GPL.

The second video describes the difference between Indivo and open source EHRs such as VistA, the goal of portability, the RESTful interface of the new Indivo X project, and the SMART Platform for standard health data exchange.

The third video touches on application development for Indivo, the goal of substitutable apps created by a wide range of developers, the Indivo users meeting (held before the release of this video), and the development of a community to maintain and develop Indivo.

The final video describes some of Indivo's partners and derivative products, and privacy issues in relation to health records.

We hope you can come to OSCON and catch Daniel there.

Health IT at OSCON 2011 — The conjunction of open source and open data with health technology promises to improve creaking infrastructure and give greater control and engagement for patients. These topics will be explored in the healthcare track at OSCON (July 25-29 in Portland, Ore.)

Save 20% on registration with the code OS11RAD

Related:

Algorithms are the new medical tests

Predictive Medical Technologies claims that it can use real-time, intensive care unit (ICU) monitoring data to predict clinical events like cardiac arrest up to 24 hours ahead of time. Effectively, the startup's algorithms are new types of medical tests that an ICU doctor can take into consideration when deciding on a course of treatment.

Predictive Medical Systems is based in the University of Utah's medical accelerator, which is attached to a hospital. The system will soon be tested on a trial basis with real patients and ICU physicians.

I recently talked to CEO Bryan Hughes about using data in diagnosis. Our interview follows.


What kinds of data is already available from hospital electronic medical records (EMR) and patient monitoring systems?

Bryan Hughes: We require that a hospital be at a certain technological level, in particular that the hospital has an EMR solution that is at minimum classified as Stage 4, or a Computerized Physician Order Entry system. Only about 100 hospitals in the U.S. are at this stage right now.

Once a hospital has achieved this stage, we can integrate with their computer systems and extract the raw data coming from the monitors, lab reports and even nursing notes. We can then perform real-time patient data mining and data analytics.

Our system works behind the scenes, constantly analyzing the raw patient data coming in from a variety of sources like chemistry panels, urinalysis, microbiology, respiratory and bedside monitors. We attempt to alert the doctor early to an adverse event such as cardiac arrest, or to warn that a patient might be trending toward an arrhythmia or pneumonia.
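As a toy illustration of this kind of early warning (not Predictive Medical's actual method; the thresholds, window size, and data are arbitrary), one could flag a sustained upward trend in a single vital sign over a sliding window:

```python
from collections import deque

def trending_up(window, min_slope=1.0):
    """Least-squares slope over the window, compared to a threshold."""
    n = len(window)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(window) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, window))
    den = sum((x - mean_x) ** 2 for x in xs)
    return (num / den) > min_slope

# Fabricated heart-rate stream (beats per minute), sampled over time.
stream = [78, 79, 80, 82, 85, 88, 92, 97, 103, 110]

window = deque(maxlen=10)
alerts = []
for t, hr in enumerate(stream):
    window.append(hr)
    # Only evaluate once the window is full, and alert on a steep climb.
    if len(window) == window.maxlen and trending_up(window, min_slope=2.0):
        alerts.append(t)
```

A production system would of course fuse many such signals (labs, notes, monitors) into a trained model rather than a single hand-set threshold, which is precisely the distinction Hughes draws next.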


How does the system integrate into an ICU doctor's existing routine?

Bryan Hughes: Depending on the technological development of a hospital, doctors either do their rounds in the ICU using a piece of paper or using a bedside computer terminal. Older systems might employ a COW (Computer On Wheels).

For hospitals that are still paper based, they have to first get to the EMR stage. It is surprising that health care, the largest and quintessential information-based industry, has failed to harness modern information exchange for so long. The oral tradition and handwritten manuscripts remain prevalent throughout most of the sector.

For hospitals that have an EMR, there are still several fundamental problems. The single most daunting problem facing modern doctors is the overwhelming amount of data. Unfortunately, especially with the growing adoption of electronic medical records, this information is disparate and not immediately available. The ability for a clinician to practice medicine is rooted in the ability to make sound decisions on reliable information.

Disparate information makes it hard to connect the dots. Massive amounts of disparate information turns the dots into a confusing sea of blobs. The dots must be connected in a manner that allows the doctor to make immediate and intelligent decisions.

We look at the current trends and progressions of disease states in the present, and then look at what may be happening in the next 24 hours. We then push this information to a mobile device such as an iPad, allowing the doctor to see the clinically relevant dots and make better decisions in a timely manner.

Eventually we hope to expand to the entire hospital. But for now, the ICU is a big enough problem and a great starting point.

How do you use data to predict outcomes like cardiac arrest?

Bryan Hughes: We have two first-generation models: cardiac arrest and respiratory failure. We plan to apply our novel techniques to modeling sepsis, renal failure and re-intubation risk.

Without giving away too much of our secret sauce, we use non-hypothesis machine learning techniques, which have proven very promising so far. This approach allows us to eliminate any human "expert" bias from the models. The key then is to ensure that the data we use for development and training is clean. It is only now that medical data is in electronic and structured form that this is becoming readily available.

What kinds of data mining techniques do you use in the product?

Bryan Hughes: We use a variety of techniques. Again, without giving too much away, our approach is to use transparent algorithms rather than a black box approach. We have a patent strategy that allows us to effectively place a white fence around our technology while allowing the academic and medical community to review our results.

How do you judge the accuracy of the algorithms?

Bryan Hughes: To date, our results have been proven using retrospective models (historical ICU monitoring data and outcomes). Our next step is to deploy our technology into a validation trial, which produces evidence that a test or treatment yields a clinical benefit. That trial is about to start at the University of Utah Medical Center in Salt Lake City.

Once the integration is completed in the next several weeks, we will be running a double-blind, prospective study with patient data. While this is only a validation trial, we are following the FDA guidance. Once the trial is up and running, we plan on expanding the validation trial to include several more hospitals. It will be at least 12 months before we start any formal FDA trial.

How is the system updated over time?

Bryan Hughes: We have developed a unique architecture that reduces the experiment-to-validation cycle to 8 to 10 months. Typically in the medical community, a hypothesis is developed, a model is built and tested, and, if valid, a paper is published for peer review. Once the model is accepted, it can have a life span of several years of adoption and application, which is bad because, as we know, information and knowledge change as we learn and understand more. Models need to be consistently re-evaluated and re-examined.

Are any similar systems available?

Bryan Hughes: None in the ICU, or even dealing with patient care, that we have found to date. In other industries, predictive analysis and modeling are pretty commonplace. Even your spam filter employs many of the techniques that the most sophisticated risk analysis system might use.

Photo: ekg by krzakptak, on Flickr





May 11 2011

Not so fast: assessing achievements and barriers at a Massachusetts Health IT conference

The state of electronic health records and electronic health data exchange offers plenty to celebrate along with plenty to make one grind one's teeth (not medically recommended). Both the bright lights of success and the mire of gridlock were held up for examination this week at the conference Health Information Technology Improving Healthcare and the Economy in Worcester, Massachusetts. The contrasts were so great that I felt as if I were attending two conferences jammed together, one ceremonially congratulatory and the other riddled with anger but also a determination to make things better. I think most of the press attended only the first conference, but I was present for the whole thing and will try to integrate the insights I gained into a view of health IT in the United States today.

This conference was a follow-on to last year's Governors Conference, which I covered at the time. The first half of this year's conference, like last year, dazzled the audience with a cornucopia of distinguished keynoters: Governor Deval Patrick himself; David Blumenthal, who has returned to Massachusetts after serving as National Coordinator for Health Information Technology at the Health and Human Services Department; JudyAnn Bigby, Secretary of the Massachusetts Executive Office of Health and Human Services; and Sachin Jain of the Centers for Medicare and Medicaid Services (CMS). An impressive 400 people turned up even though the conference was held an hour outside Boston (granted, Worcester itself is something of a medical center and home to the University of Massachusetts Medical School). But while the DCU Center is a pleasant and serviceable enough conference space, it did not achieve the pomp of the first conference in the series.


I can't count how many times the speakers reminded us of the progress in universal coverage in Massachusetts. It bears repeating, however, because it is quite remarkable: 98% of the population covered with some form of insurance, including 99.8% of the state's children. These figures cannot be matched by any other state, and are just one sign of the leading role Massachusetts has played in US health care. I reported last month on other elements of Massachusetts success. And the Boston Globe just reported that a coalition has been launched to find and sign up the rest of the children.

Governor Deval Patrick and JudyAnn Bigby at health care conference

While sticking to the theme of acknowledging success, Blumenthal delivered a fact-packed and deep keynote that laid out both how far we have come since the passage of the stimulus bill that started the current reform (long before the better-known and controversial Affordable Health Care for America Act) and how much is left for us to accomplish. He boldly claimed that the dominant EHR of the future will be "different from what we have today."

David Blumenthal at health care conference

Blumenthal acknowledged that the government has stirred up a pot, opening a period of uncertainty and costly experimentation in the field of EHRs. But he assured us this is a good thing, saying that we have "revitalized a relatively sleepy side of American technology" that was "slow to innovate." Already, over 700 products and modules have been certified for meaningful use, most of them produced by companies with fewer than 15 employees.

He also suggested that ultimately there would be a shake-out and consolidation around three or four products. I silently cheered when I heard this, because consolidation is a stage on the way to commodification, and commodification in the field of software is a boost to open source. As I suggested over a year ago, free software possesses many traits qualifying it as a natural cure for the current ills of the EHR industry. And it's a cure many profitable companies can participate in.

Jain touted the CMS Innovation Center, which not only solicits research on lowering health care costs, but can quickly spread findings throughout the country by requiring changes of Medicaid and Medicare providers. He pointed out the need for sophisticated tracking: providers often claim to have reduced costs when they merely shift them to another provider. I'm not convinced that the Innovation Center will be the driving force Jain thinks it is, but it is a welcome island of disruption in Blumenthal's "sleepy" technology.

How deep and lasting is the progress in meaningful use?

Meaningful use--a very young term, coined by the HITECH Act as part of the stimulus package--is one of those many-faceted concepts that can be defined extremely narrowly or quite broadly. Like an Impressionist painting, it appears completely different when viewed up close versus at a distance.

In its most narrow form, meaningful use is laid out by documents from CMS. For instance, for Stage 1 it defines 15 things a medical practice should be able to do, and requires each practice to demonstrate a certain number of these things in order to receive government payments (for instance, placing a certain number of pharmacy orders electronically). Stage 2 will add requirements, and Stage 3 still more.

Already the narrowness of this definition creates problems for doctors attracted by the promise of payments for adopting electronic health records. One audience member pointed out that an EHR vendor may implement a subset of meaningful use requirements in order to allow a doctor to get Stage 1 payments. But a specialist might not do all the things that the vendor has implemented. This specialist might need other, unimplemented requirements in order to earn a payment. In effect, the specialist has bought a system that both the vendor and the government (indirectly, through the certification process) have promised will make her eligible for a payment, only to find herself cheated.

This, and other gripes, flowed freely at a workshop led by two representatives from the Office of the National Coordinator, Fadesola Adetosoye and Jim Daniel. Adetosoye placated the audience with the suggestion that lapses in EHRs were caused by vendor ignorance of doctors' needs, not a malicious strategy. She recommended meetings between doctors and vendors, and suggested that doctors band together to present a more impressive front. Daniel said HHS has organized forums of vendors and doctors to cut down on miscommunication during the design of EHRs.

Meaningful use at its broadest is a handle for all the advances in delivering health care that are supposed to eliminate the 100,000 unnecessary hospital deaths each year, reduce the 10% or 20% jumps in annual insurance costs to the level of regular cost-of-living increases, and bring the doctor into the home of the patient. Once we raise our eyes to this horizon, we see more barriers along the way, many embedded in EHRs.

No one delivered stronger blows to current EHR vendors than the chair of the non-profit that has certified most of them over the years, the Certification Commission for Health Information Technology. Karen Bell presented a modest list of transactions and activities that an EHR should permit--coordinated care, integration of functionalities, communication between different parts of the EHR--and warned the audience that certification means only that an EHR can do the handful of things required for Stage 1 meaningful use payments. Some EHRs include the more sophisticated features that facilitate cost savings and improvements of care, but many do not. Her overall message was "buyer beware." I found her presentation courageous, and it greatly enhanced my impression of CCHIT.

The stimulus to better and cheaper care: data sharing

Most public health research in the United States today uses administrative data that is easy to get, because it has been collected for years for critical purposes unrelated to research (mostly to bill insurers, including Medicare and Medicaid). Starting as a combined effort among employers to cut costs by tracking diseases across a large population, the collection of such administrative data has gradually evolved into a government function. The source of most research data now is each state's All-Payer Claims Database (APCD). The federal government will probably begin its own national project, either starting from scratch or trying to combine and harmonize the data from the states. Jo Porter of the University of New Hampshire explored different uses for administrative data in a workshop on secondary data use.

Workshop on secondary use of data

As one audience member pointed out, "Claims data is a proxy, not a real measure." For instance, the data can tell us whether a patient took a lab test, but not what the results were. Meaningful Use begins the process of collecting clinical data with baby steps, such as reporting the number of smokers at each medical practice. But still, a lot of useful things turn up with administrative data. Even a logistical question such as how far patients travel to get care has clinical implications.

Data from private health insurers cannot always produce accurate results when combined with Medicare and Medicaid data, because the populations are so different. And when it does make sense to combine them, the resulting statistics still leave out the uninsured, who frequent community treatment centers. Porter said that Maine gives identity cards to the uninsured for use when they visit these centers, so that data on their treatment can be factored into the statistics in that state. The Department of Veterans Affairs has indicated that it would like to add its considerable patient data to the statistics too. The patient-centered medical home also has the potential to generate enormous amounts of useful research data.

In addition to Porter's examples of data aggregation, I saw some interesting displays at the booth of JEN Associates, who provide tools to CMS as well as private organizations.

Another problem in trying to extract long-term information from administrative data is tracking a patient as he moves from one insurance provider to another. Some states collect more identifying data than other states. For a long time, providers simply identified each patient by her social security number. This was not a privacy risk because the number was encrypted before being shared. Because each insurer used the same encryption algorithm, a patient could be tracked without being identified as he moved from one insurer to another.

Given the well-known problems with using social security numbers as universal identifiers, insurers are moving away from the practice. We are famously a country opposed to universal identifiers and ID cards. So as each insurer adopts its own way of identifying patients, and as hospitals use multiple demographic markers (name, age, etc.), tracking patients through their lifetimes becomes harder.
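To make the old arrangement concrete, here is a minimal Python sketch of linkage through a shared one-way transformation. The key, function names, and SSN values are all invented for illustration, not drawn from any insurer's actual practice:

```python
import hmac
import hashlib

# Hypothetical: if every insurer derives a token from the SSN with the
# same keyed hash, records can be linked across insurers without the
# number itself circulating. The shared key is an assumption here.
SHARED_KEY = b"key-agreed-on-by-all-insurers"

def linkage_token(ssn):
    """Derive a stable, hard-to-reverse token from an identifier."""
    return hmac.new(SHARED_KEY, ssn.encode(), hashlib.sha256).hexdigest()

# The same patient yields the same token at two different insurers,
# so longitudinal tracking works as long as everyone uses one scheme.
token_a = linkage_token("123-45-6789")
token_b = linkage_token("123-45-6789")
assert token_a == token_b
```

Once each insurer adopts its own scheme, exactly this property is lost.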

States informally share information and suggestions for using administrative data in an APCD Council. One of the current issues is whether to charge for this data. It is clearly of great value to companies in various parts of the health care industry. But charging for data, while fair in relation to companies using it for product development and marketing, obviously puts a crimp in research. Also, Porter said that collection and use of this data was still so new that it's hard to establish its commercial value.

In addition to fueling research, data is critical for better patient care. For instance, you obviously need it for the integrated care that lies at the core of modern cost-saving initiatives, not to mention patient-centered care. Strangely enough, one of the biggest buzzwords in American health care today, Accountable Care Organizations, got relatively little attention at this conference. Sachin Jain even explicitly put distance between himself and the term, claiming that it was "just one model" and might be a transitional stage in the evolution of providers.

Whatever one's treatment model, the inconsistencies and incompatibilities of EHRs have made sharing data between health care providers both costly and cumbersome. John Halamka, a leading CIO in healthcare and advisor to federal efforts, spent several minutes on his panel listing the various standards that the government was developing, most of them to be released for review during the summer and finalized in the Fall. A few examples include standards for:

  • Submitting meaningful use data to CMS (so manual data entry will no longer be necessary)

  • Metadata to represent privacy preferences

  • A provider directory

  • Simplified data on doctor quality (always a sensitive measure that scares the doctors being monitored)

  • Transition of care documents

He predicted that Stage 2 would be split into multiple stages to give vendors time to produce conforming systems. But Blumenthal warned earlier that delaying stages would play havoc with the schedule of payments, which Congress laid out year by year rather than stage by stage.

The ONC is also working on a Standards and Interoperability (S&I) Framework that addresses many of the problems discussed in this section.

Formats and data exchange were also the topic of a workshop on Health Information Exchanges, led by Richard Shoup of the organization that put together the conference, the Massachusetts e-Health Institute.

Neither existing standards (based mostly on a very old and complex set of formats called HL7, which have only gotten harder to implement and parse as they have evolved) nor recent government efforts are enough to produce data that can easily be shared between EHRs. Massachusetts has led the way in forming a consortium of states and EHR vendors to fill the gap. In his panel, Halamka expressed the wish that standards for data exchange had been codified before doctors were asked to buy EHR systems, because the ability of those systems to exchange data could then have been verified.

Although formats and standards excite technical specialists, much more is required to make exchanges possible while preserving the rights of the patient. Exchanges require trust between institutions, and this trust goes down rapidly as the distance between them increases. As Shoup said, governance for data must be a topic of the multi-state consortium.

And one audience member claimed that actually, the scenario so often reported to justify national health exchanges (the patient on vacation who goes to an emergency room citing chest pains) is actually extremely rare. Very few patients have to seek care outside their states, at least for symptoms where their prior conditions are an important diagnostic factor. This calls into question the value of spending the huge sums of money that would be required to create a universal exchange.


However, Shoup said that 15% of patients who enter an emergency room are treated in the absence of information needed to treat them properly, and that 15% of admissions from the ER to the hospital could be avoided if the medical records were available. This is not an interstate problem, but one many of us have right at home.

All data sharing initiatives raise questions of patient privacy. Someone asked Shoup "Who owns the data?" He said it was a difficult question to answer (perhaps one that cannot be answered, and therefore one we should stop asking). But throughout the conference, speakers acknowledged the importance of patient consent and preserving privacy. Certainly, the two big ONC projects--CONNECT and Direct--center on the assurance of privacy during data exchange. But they do not solve problems of consent and trust, only authorization and secure data transfer.

Jobs and balance sheets

The biggest contradiction I've found during my coverage of health IT is the oft-cited prediction that we'll have to hire 40,000 to 50,000 new staff in the field to deal with IT changes, with no one explaining how we'll pay all those people while cutting costs. Already there are anecdotal reports of IT staff demanding six-figure salaries and of difficulties finding trained staff at any price.

One answer, which came up during a presentation by Lynn Nicholas, President of the Massachusetts Hospital Association, is that a lot of existing staff will lose their jobs. Hopefully, as hospitals upgrade from routine clinical functions to more sophisticated data processing, the staff can get training to do these better jobs instead of just receiving pink slips.

Another promising way to cut costs lies in telemedicine, introduced by Dr. Joseph Kvedar with examples from the Center for Connected Health at Massachusetts' largest (and sometimes most resented) health provider, Partners HealthCare. The Center for Connected Health has pioneered projects in the leading health epidemics of our time: diabetes, hypertension, and congestive heart failure.

Telemedicine could be as simple as sending a text message to remind someone of an appointment or the time to take a pill. (In fact, one could argue whether this is telemedicine, because it uses automation rather than human intervention.) But Partners goes much farther as well, giving patients devices that let them upload statistics such as blood pressure to a server at the hospital, where software can determine whether an event requiring a doctor's intervention has occurred.
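The server-side check can be sketched in a few lines of Python; the thresholds and field names below are invented for illustration and are not Partners' actual rules:

```python
# Hypothetical alert thresholds -- real clinical rules would be set
# by the care team, not hard-coded like this.
ALERT_SYSTOLIC = 180
ALERT_DIASTOLIC = 120

def needs_intervention(reading):
    """Flag a blood-pressure upload that may warrant a doctor's attention."""
    return (reading["systolic"] >= ALERT_SYSTOLIC
            or reading["diastolic"] >= ALERT_DIASTOLIC)

uploads = [
    {"patient": "A", "systolic": 128, "diastolic": 82},
    {"patient": "B", "systolic": 190, "diastolic": 110},
]
flagged = [r["patient"] for r in uploads if needs_intervention(r)]
assert flagged == ["B"]  # only patient B's reading triggers review
```

The point is that software does the first pass, and a clinician intervenes only on the exceptions.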

They have given pedometers to Boston school students and installed sensors in the schools that are triggered when the students walk by, so that schools can measure how much walking the students do and encourage them to increase it. (This must be a change from when I went to public school, when monitors were always telling us to get out of the halls.)


Strangely, people with chronic illness use technology less than the average person, according to Kvedar, and studies show that this difference is independent of the usual explanatory factors (age, socio-economic status, and so forth). Health care costs follow a very strong power law: a tiny percentage of the population (3%) accounts for a huge proportion of costs (40%). So we have to engage chronic patients somehow, and telemedicine seems to be a key part of the solution.

Telemedicine also includes interactions between doctors, such as remote monitoring of ICUs. One study found that sending the information from cameras and monitors to remote sites can reduce fatalities by 20%. Kvedar says this is an example of when telemedicine seems to be prescribed: when outcomes are determined by low-frequency but high-impact events. Massachusetts General Hospital now keeps doctors on call to examine video images of stroke patients at home so that they can help the on-site doctor make the right determination of how to treat the stroke.

Blumenthal started his keynote--which, as I said, included much that is worthy of celebration--by listing four factors that make it hard for physicians to adopt and use EHRs:

  • The "paralysis of uncertainty" created by having too many systems to choose from, along with a grab-bag of worries ranging from whether they'll stay up to whether they'll meet clinicians' needs

  • A basic psychological barrier stemming from habits of recording information, which go back to one's first medical training and involve the visceral activity of using a pen

  • The ongoing lack of technical and cultural foundations for exchanging data

  • The fear of data breaches and violations of patient privacy

All the other complaints and admonishments in the conference could probably fit into one of those categories. Solutions are available, but because data exchange and research are fundamental to change, these solutions have to be discovered and adopted by the field as a whole. Most doctors who adopt electronic systems are ultimately happy they did so--a finding that was not true just a few years ago--but the process is still expensive and painful for those who go ahead of their peers.

May 10 2011

Interview: Protecting patient privacy rights in a wired world

In this podcast, Andy Oram interviews Dr. Deborah Peel of the Patient Privacy Rights Coalition about Getting IT Right: Protecting Patient Privacy Rights in a Wired World, a preconference to be held in conjunction with the illustrious Computers, Freedom, and Privacy conference this year.

Topics covered in the interview include:

  • The evolution of patient privacy.
  • Weaknesses in the current privacy regime for health care.
  • Goals, structure, and intended outcomes for the conference.
  • A look at featured speakers and attendees, including: Joy Pritts, ONC, Chief Privacy Officer; Jessica Rich, Deputy Director, FTC Bureau of Consumer Protection; Stephania Griffin, VHA Privacy Officer; AZ Senator Nancy Barto, Chairman of the Senate Healthcare and Medical Liability Reform Committee; Stephanie Perrin, Canadian privacy expert; Ross Anderson, Cambridge University, UK; Latanya Sweeney, Harvard, MIT, Carnegie Mellon; Helen Nissenbaum, Professor of Media, Culture and Communication, and Computer Science, New York University; Lee Tien, EFF.

April 13 2011

Four short links: 13 April 2011

  1. Web Ecology Project -- Researching quantized social interaction. Most recent work was a competition to write social bots that would be followed/friended on social networks--essentially scoring 51% on the Turing test. There are privacy implications (often social network buddies see profile information that strangers can't). (via The Atlantic)
  2. We Need to Stop Google's Exploitation of Open Communities (Mikel Maron) -- much as Google's ill-fated Knol smelled like an attempt to sidestep Wikipedia, their MapMaker is directly modelled on OSM [OpenStreetMap], but with a restrictive data license, where you can not use the data as you see fit. Mikel argues passionately and pointedly about this. Also interesting: how quickly OSM's own community is turning against itself on licensing issues. Nothing else divides open communities as much as the license that makes them possible, not even big companies' dickish behaviour.
  3. A Truly Open VistA -- the Veterans Administration attempts to build an open source community (instead of simply releasing the source code). This article by Red Hat's Chief Technology Strategist outlines some of the challenges they're facing: obscure source and bureaucracy. The obscure source is a significant impediment: it's written in MUMPS which predates C and combines the elegance of roadkill with all the capability for abstract expression of a brick. Existing businesses aren't an impediment, though: Linux has shown that deforking (aka "contributing") makes sound business sense once the momentum of new features builds up in the commons. (via Glyn Moody)
  4. Rare Javascript Operators (Timmy Willison) -- enlightening, but reminds me of the important gulf between "can" and "should": Tilde is useful! We can use it for any function that returns -1:
    // ~(-1) === 0, which is falsy, so the branch runs only
    // when checkFoo (say, an indexOf result) found a match
    var checkFoo = "foo".indexOf("oo");
    if ( ~checkFoo ) {
        // found
    }
    (via Javascript Weekly)

March 23 2011

SMART challenge and P4: open source projects look toward the broader use of health records

In a country where doctors are still struggling to transfer basic
patient information (such as continuity of care records) from one
clinic to another, it may seem premature to think about seamless data
exchange between a patient and multiple care organizations to support
such things as real-time interventions in patient behavior and better
clinical decision support. But this is precisely what medicine will
need for the next breakthrough in making patients better and reducing
costs. And many of the building blocks have recently fallen into
place.

Two recent open source developments have noticed these opportunities
and hope to create new ones from them. One is the SMART Apps for
Health contest at Challenge.gov
(http://challenge.gov/challenges/134), based on the SMART Platform
(http://www.smartplatforms.org/) that is one of the darlings of
Federal CTO Aneesh Chopra
(http://www.whitehouse.gov/blog/2011/03/10/smart-prize-patients-physicians-and-researchers)
and other advocates for health care innovation. The other development
is P4 (http://healthurl.com/www/P4.html), the brainchild of a
physician named Adrian Gropper who has recognized the importance of
electronic records and made the leap into technology.

SMART challenge: Next steps for a quickly spreading open source API

I'm hoping the SMART Platform augurs the future of health IT: an open
source project that proprietary vendors are rushing to adopt. The
simple goal of SMART is to pull together health data from any
appropriate source--labs, radiology, diagnoses, and even
administrative information--and provide it in a common,
well-documented, simple format so any programmer can write an app to
process it. It's a sign of the mess electronic records have become
over the years that this functionality hasn't emerged till now. And
it's a sign of the tremendous strides health IT has made recently that
SMART (and the building blocks on which it is based) has become so
popular.

SMART has been released under the GPL, and is based on two other
important open source projects: the INDIVO health record system
(http://indivohealth.org/) and the I2B2 informatics system. Like
INDIVO, the SMART project was largely developed by Children's Hospital
Boston, and was presented at a meeting I attended today by Dr. Kenneth
D. Mandl, a director of the Intelligent Health Laboratory at the
hospital and at Harvard Medical School. SMART started out with the
goal of providing a RESTful API into data. Not surprisingly, as Mandl
reported, the team quickly found itself plunged into the task of
developing standards for health-related data. Current standards either
didn't apply to the data they were exposing or were inappropriate for
the new uses to which they wanted to put it.

Health data is currently stored in a Babel of formats. Converting them
all to a single pure information stream is hopeless; to make them
available to research one must translate them on the fly to some
universally recognized format. That's one of the goals of the report
on health care
(http://www.whitehouse.gov/sites/default/files/microsites/ostp/pcast-health-it-report.pdf)
released in December 2010 by the President's
Council of Advisors on Science and Technology. SMART is developing
software to do the translation and serve up data from whatever desired
source in "containers." Applications can then query the containers
through SMART's API to retrieve data and feed to research and clinical
needs.
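The "containers" idea can be illustrated with a toy Python adapter layer. The source layouts and field names below are invented for illustration, not SMART's actual schemas:

```python
# Each source keeps its native record layout; a thin adapter serves
# every record in one common shape that any app can consume.
def from_lab_system(rec):
    # hypothetical lab feed with its own field names
    return {"type": "lab", "code": rec["TEST_CD"], "value": rec["RESULT"]}

def from_billing_system(rec):
    # hypothetical administrative feed: diagnoses, no result values
    return {"type": "diagnosis", "code": rec["icd"], "value": None}

ADAPTERS = {"lab": from_lab_system, "billing": from_billing_system}

def serve(source, rec):
    """Translate a record on the fly into the common format."""
    return ADAPTERS[source](rec)

common = [
    serve("lab", {"TEST_CD": "LDL", "RESULT": 131}),
    serve("billing", {"icd": "I10"}),
]
# Every record now has the same keys, whatever system it came from.
assert all(r.keys() == {"type", "code", "value"} for r in common)
```

An application written against the common format never needs to know which Babel dialect the data started in.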

Justifying SMART, Mandl presented solid principles of modern data
processing that will be familiar to regular Radar readers:

Data as a platform

Storage should be as flexible and free of bias as possible, so that
innovators can easily write new applications that do surprising and
wonderful things with it. This principle contrasts starkly with most
current health records, which make the data conform to a single
original purpose and make it hard to extract the data for any other
use, much less keep it clean enough for unanticipated uses. (Talk to
doctors about how little the diagnoses they enter for billing purposes
have to do with the actual treatments patients need.)

An "Appstore for health"

New applications should be welcome from any quarter. Mandl is hoping
that apps will eventually cost just a few dollars, like a cell phone
app. (Note to Apple: Mandl and the audience tended to use the terms
"iPhone" and "Appstore" in a casual manner that slid from metaphors to
generic terms for mobile devices and program repositories.) Mandl said
that his team's evaluation of apps would be on the loose side, more
like Android than iPhone, but that the environment would not be a
"Wild West." At each hospital or clinic, IT staff could set up their
own repositories of approved apps, and add custom-built ones.

A "learning health system"

Data should be the engine behind continuous improvement of our health
care system. As Mandl said, "every patient should be an opportunity to
learn."

Open source and open standards

As we've seen, standards are a prerequisite for data as a platform.
Open source has done well for SMART and the platforms on which it is
based. But the current challenge, notably, allows proprietary as well
as open source submissions. This agnosticism about licensing is a
common factor across Challenge.gov. Apparently the sponsors believe
they will encourage more and better submissions by allowing the
developers to keep control over the resulting code. But while most
Challenge.gov rules at least require some kind of right to use the
app, the SMART challenge is totally silent on rights. The danger, of
course, is that the developers will get tired of maintaining an app
or will add onerous features after it becomes popular.

An impressive list of electronic record vendors have promised support
for SMART or integrated it into products in some way: Cerner, Siemens,
Google, Microsoft, General Electric, and more. SMART seems to be on
its way to a clean sweep of the electronic health care record
industry. And one of its projects is aimed at the next frontier:
integrating devices such as blood glucose readers into the system.

P4: Bringing patients into the health record and their own treatment

SMART is a widely championed collaboration among stellar institutions;
P4 is the modest suggestion of a single doctor. But I'm including P4
in this blog because I think it's incredibly elegant. As you delve
into it, the concept evolves from seeming quite clever to completely
natural.

The project aims to create a lightweight communication system based on
standards and open source software. Any device or application that the
patient runs to record such things as blood pressure or mood could be
hooked into the system. Furthermore, the patient would be able to
share data with multiple care providers in a fine-grained way--just
the cholesterol and blood pressure readings, for example, or just
vaccination information. (This was another goal of the PCAST report
mentioned in the previous section.)

Communicating medical records is such a central plank of health care
reform that a division of Health and Human Services called the Office
of the National Coordinator created two major open source projects
with the help of electronic health record vendors: CONNECT
(http://www.connectopensource.org/) and Direct
(http://wiki.directproject.org/). The latter is more
lightweight, recently having released libraries that support the
secure exchange of data over email.

Vendors will jump in now and produce systems they can sell to doctors
for the exchange of continuity of care records. But Gropper wants the
patients to have the same capabilities. To do that, he is linking up
Direct with another open source project developed by the Markle
Foundation for the Veterans Administration and Department of Defense:
Blue Button.

Blue Button is a patient portal with a particularly simple interface.
Log in to your account, press the button, and get a flat file in an
easy-to-read format. Linked Data proponents grumble that the format is
not structured enough, but like HTML it is simple to use and can be
extended in the future.
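To see why a flat file is so approachable, here is a parser for a simplified, hypothetical layout (section headers followed by key: value lines). This is not the VA's actual Blue Button format, just an illustration of the style:

```python
def parse_flat_file(text):
    """Parse a flat record of '-- SECTION --' headers and key: value lines."""
    sections, current = {}, None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("--") and line.endswith("--"):
            # a header like '-- MEDICATIONS --' opens a new section
            current = line.strip("- ").title()
            sections[current] = {}
        elif ":" in line and current:
            key, _, value = line.partition(":")
            sections[current][key.strip()] = value.strip()
    return sections

record = parse_flat_file(
    "-- MEDICATIONS --\nname: lisinopril\ndose: 10 mg\n"
    "-- ALLERGIES --\nname: penicillin\n"
)
assert record["Medications"]["dose"] == "10 mg"
```

A dozen lines suffice--simplicity that more heavily structured formats rarely match.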

Blue Button is currently only a one-way system, however. A veteran can
look at his health data but can't upload new information. Nor can
multiple providers share the data. P4 will fix all that by using a
Direct interface to create two-way channels. If you are recovering
from a broken leg and want to upload your range-of-motion progress
every day, you will be able to do this (given that a format for the
data is designed and universally recognized) with your orthopedic
surgeon, your physical therapist, and your primary care provider. P4
will permit fine-grained access, so you can send out only the data you
think is relevant to each institution.

Gropper is aiming to put together a team of open source coders to
present this project to a VA challenge. Details can be found on the
P4 web page (http://healthurl.com/www/P4.html).

February 05 2011

Report from Massachusetts Health IT forum

To talk of a "revolution" in health care would be demeaning to the
thousands of people staking their lives on real revolutions right now
in various countries, but there is no doubt that the confluence of
out-of-control health care costs, fancy new technologies, and various
government mandates (not only from the US government, but from many
states including Massachusetts) has forced doctors, vendors, and
other people in the health care field to scramble and order changes
throughout their organizations. A couple hundred of these people came
to the "Tools for Meaningful and Accountable Care" conference held
yesterday by the Massachusetts Health Data Consortium.

I didn't interview many participants (the ones I talked to were very
happy with the presentations) but I wonder whether all of them got
what they came for. They may well be haggling over questions such as
"How many prescriptions do we need to order online in 2011 in order to
qualify for the first stage of Meaningful Use booty?" or "How do I get
an image from the radiologist down the street while satisfying HIPAA
privacy regulations?" What they got, however, was a broad look at the
needs of health care and a set of projections by various speakers that
congealed into what I find to be a coherent vision for health care in
the future.

And I think the communication of this vision is important. Costs will
continue to rise and reform will fail if doctors, vendors, and IT
staffs simply race to meet each stage of regulations and act in an ad
hoc manner without an overall coordination of effort. Just how broad
this coordination of effort must be--we're not talking here just about
gathering an entire hospital around a program, or even a whole
consortium such as Partners HealthCare, the biggest Massachusetts
provider--will come out during this article.

Capitation versus Clinical Effectiveness Research

One of the big changes in the Massachusetts health care scene went
oddly unmentioned during the whole day of talks. I'm referring to the
diktat
(http://www.boston.com/business/healthcare/articles/2011/01/23/blue_cross_ceo_says_providers_must_control_health_care_costs_or_else/)
from Blue Cross/Blue Shield of Massachusetts announcing that they
will change from fee-for-service to a "global payment plan." This
mirrors recent plans from the state government to pressure the health
care insurers and providers to pay for outcome rather than for
individual procedures. But imposed on the current delivery system,
such a "global payment plan" is just a repackaging of old-fashioned
capitation.

Nobody seems to want to admit this, just as people are reluctant to
announce the return of "managed care" and prefer to assign the new,
as-yet untainted term "accountable care organization." It was up to
the CEO of a leading ACO--Dr. Craig Samitt of Dean Healthcare in
Wisconsin--to display a slide in his keynote with an equal sign
connecting "managed care" and "accountable care organization." He said
this moment is our chance to do managed care right.

(It's also sobering that in Massachusetts, world center for health
care, the two organizations singled out at this conference for their
achievements in bringing to life the potential in health care IT both
lay outside the state: Wisconsin's Dean Healthcare and central Texas's
Scott & White Health Plan. Furthermore, the individuals who
traveled here to describe their work had both spent long careers in
Massachusetts health care organizations before traveling elsewhere to
lead these advances.)

Payments for outcome and ACOs can work: they can lower costs while
simultaneously improving health care. But by themselves they're like a
meringue fashioned out of only sugar and cornstarch. The egg that will
make them work is clinical effectiveness research, a topic excellently
covered in talks by two doctors, Harold C. Sox and Michael Fischer.

CER is a medical application of the quality control routinely done in
other industries; it perhaps has its origin in time-and-motion
studies. It asks tough questions such as why one surgeon has far
greater success on the same patient population as another--not in
order to reward or punish, but to convey the best practices from one
clinic and region to another. CER should overcome the enormous
disparities that we all know exist between doctors, between hospitals,
between patient populations (such as differences in outcome by race)
and between different parts of the country.

Dr. Sox pointed out that CER was being tried as early as the 1960s,
but took a great leap in the mid-1990s and continues to make advances
despite such cynical political pot-shots as raising the fear of death
panels. (I highly recommend Atul Gawande's New Yorker article
(http://www.newyorker.com/reporting/2010/08/02/100802fa_fact_gawande)
for a sweeping overview of the real
purpose and effect of end-of-life decisions.) CER is now formalized by
the Federal Government in several initiatives that are not likely to
go away.

Dr. Fischer said that CER required big changes in education and in
how results are delivered. Crude impressions like "death panels" have
to be fought with better outreach to the public. Continuing medical
education (which has been impolitely referred to as "broken") needs to be
more hands-on and to demonstrate that doctors understand the material
they've been given. And EMRs have to become much more sophisticated at
delivering information.

Currently, doctors using EMRs are pelted with notorious "alerts" that
resemble the warnings web browsers give all of us when we visit web
sites with outdated security certificates. Most doctors treat the
alerts like web users treat the security dialog boxes--they ignore
them and click through. And that's because there are just too darned
many alerts. Every medication has some potential impact on something
(a rare food allergy, for instance) and the computer systems can't
figure out what doctors really need to know. Furthermore, if a system
displays an alert and the patient experiences a problem later, the
doctor's liability is increased. If a doctor dismisses an alert, he or
she had better type in a reason for doing so.

Making CER work will require vendors to design more flexible systems,
and the IT staff at each institution to choose the alerts that can
actually affect medical decisions. Some of the enforcement can also be
spread around: nurses and other staff can be involved in CER.
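Here is a minimal Python sketch of that site-level filtering; the severity labels and policy table are invented for illustration:

```python
# Hypothetical site policy: IT staff (with clinicians) decide which
# alert severities are actually shown at their institution.
SITE_POLICY = {"contraindicated": True, "major": True,
               "moderate": False, "minor": False}

def alerts_to_show(alerts):
    """Suppress severities this institution chose not to display."""
    return [a for a in alerts if SITE_POLICY.get(a["severity"], True)]

raw = [
    {"drug": "warfarin", "severity": "major"},
    {"drug": "antacid", "severity": "minor"},
]
shown = alerts_to_show(raw)
assert [a["drug"] for a in shown] == ["warfarin"]
```

Unknown severities default to being shown, on the theory that it's safer to annoy the doctor than to hide a real interaction.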

All Together Now

The value that comes from aggregating results of procedures and
treatments raises several questions. One is the effects on patient
privacy, because it's well-known that anonymized data can often be
de-anonymized, and we're talking here of widely shared data being
crunched by dozens or hundreds of organizations. (I'm on the planning
committee for an upcoming conference on health data privacy,
http://www.utexas.edu/lbj/healthprivacy.) But a deeper question
concerns the ability
of many forces to work together to make change.

A small medical practice can't internally collect enough data to
determine statistically what works and what doesn't. Unless someone
coordinates these small practices, they will fall behind and lose both
money and patients. But even a large institution has limited access to
data. Michael Lee, a director at the fairly large Atrius Health
group, said they wished they could see data on their patients after
they move on to other institutions. Better care and lower costs will
require massive coordination across the country.

The Direct Project at Health and Human Services, which reached a major
milestone last week with the announcement of some successful data
transfers, should become a platform for such exchange and coordination
(and they're taking privacy quite seriously). But it's just a
platform--echoing a point made by Joel Vengco of GE--whose value
emerges only through the proper encoding, exchange, and interpretation
of data, followed ultimately by the dissemination of results and their
use by providers. (Whew.)

This is perhaps why Micky Tripathi, president of the Massachusetts
eHealth Collaborative, stressed that doctors have to move from just
entering data into their EHRs to entering accurate data, and using
structures that allow it to be parsed and crunched. It was also
pointed out that many of the requirements for receiving meaningful use
payments depend on multiple institutions: specialists, labs,
pharmacies, and other institutions the doctor deals with have to be
set up to send and receive the communications for which the government
rewards the individual doctor.

It used to be that doctors would practice and health care researchers
would do research (with some overlap, of course, at teaching hospitals
and major research centers). Practice and research are now
intertwining and perhaps merging.

All these ways in which health reform becomes a group effort show why
a high-level vision is important. And someone at the top must firmly
hold on to this vision. That's why I had a second surprise yesterday
at a news item that went unmentioned: the upcoming departure
(http://www.healthcareitnews.com/news/blumenthal-leave-onc)
of David Blumenthal, National Coordinator for Health Information
Technology. I trust that the ONC is suffused with enough smart
people holding firm to their goals for it to continue to make change.
If Blumenthal's departure slows down implementation, though, maybe it
will give us a welcome breathing space to re-evaluate our tools and
what we need them to accomplish.

Too expensive, too hard to use, and too disruptive to workflow

That was the three-sided accusation delivered to vendors of EHRs by
Dr. Marylou Buyse, the chief medical director of Scott & White
Health Plan, who spoke at yesterday's conference and won an
achievement award there. Nobody blinked when she delivered the
judgment, and indeed it's one I've heard repeatedly. Dr. Buyse should
have added that vendors' proprietary formats and imprecise
implementations of standards throw up enormous barriers to the data
exchange required for meaningful use, as I discussed in a report from
last year's HIMSS conference
(http://radar.oreilly.com/2010/03/report-from-himms-health-it-co-1.html).

Few speakers picked up this theme, perhaps because many vendors were
present, and perhaps because the speakers figured we had to soldier on
with whatever we had. My third surprise of the day, regarding
unmentioned news, was the alarming report by the President's Council
of Advisors on Science and Technology
(http://www.whitehouse.gov/sites/default/files/microsites/ostp/pcast-health-it-report.pdf)
expressing concern about the ability of current EHRs to carry out the
basic data exchanges required for improvements in health care.

Maybe health care in the US is so far behind the digital age that any
upgrade would produce benefits. Paul Grabscheid of InterSystems
reminded the audience of a recent study showing that two-thirds of
doctors still use fax machines to send medical records out, and the
next biggest medium in use is snail mail. Adoption of EHRs is rising
in this country (it may be up to 20%, depending on how it's counted)
but remains very low. Nevertheless, most observers don't
call for moving full-speed ahead with existing computer systems and
workflows. Before making investments, it's important to be smart.

Better standards, as the PCAST report called for, are important, and open source systems would address interoperability. (I feel justified in inserting a plug here for the health care track at O'Reilly's Open Source convention.) But most of all, we've all got to work together. Spent forces with nothing new to offer must be pushed out of the way, while the vast majority of people have to set aside maneuvering for short-term advantage and focus on a common goal. To return to the stirring words of keynoter Dr. Samitt, insurers and providers (and he could have added patients and politicians) have to "work together to drive change."

Maybe that's the key to any revolution.

October 19 2010

Nationwide Health Information Network hackathon: Direct Project reaches milestone

When the Direct Project (a component of the Nationwide Health Information Network) announced its first hackathon yesterday, I felt personally gratified as well as excited to see this milestone reached. The hackathon will take place on October 27 and 28 and welcomes participation from any programmers working in Java or C#.

The Nationwide Health Information Network is the U.S. government's
major open source initiative in health care. You could argue that
VistA, from the U.S. Department of Veterans Affairs, is more important
(it certainly is a vastly bigger code base), but VistA so far is
sparsely adopted outside the government, whereas the Nationwide Health
Information Network is positioned to become the platform for all
hospitals, clinics, doctors' offices, and other health care
institutions to exchange information throughout the country.

The basic goal of this network is to allow health care providers to
exchange data on patients who move from one institution to another.
This could be an everyday occurrence such as a referral, or an
emergency situation such as a patient's ER visit during travel. The
network will also facilitate the delivery of data to government
agencies that do important work in health care statistics and
evidence-based medicine. What makes this all hard is the strict
privacy requirements that call for careful authentication and secure
data transfer; that's why special code is necessary. Intermediaries
are also required to help health care providers authenticate each
other, and help providers that don't have special security-enhanced
software to encrypt their data exchanges.
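The authentication side of that job usually comes down to certificate checking. As a minimal sketch (this is not code from the actual network, whose reference implementation is in Java and C#, and the file names are placeholders), an intermediary relaying traffic over TLS could refuse any peer that fails to present a certificate:

```python
import ssl

# Minimal sketch of a mutually-authenticated TLS setup: the sort of
# channel an intermediary could use so providers can verify each other.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.verify_mode = ssl.CERT_REQUIRED   # refuse peers without a certificate

# In a real deployment you would also load the intermediary's own
# identity and the trust anchors used to validate peers
# (hypothetical paths):
# ctx.load_cert_chain("intermediary.pem", "intermediary.key")
# ctx.load_verify_locations("trusted_provider_cas.pem")
```

The point of the sketch is simply that "careful authentication" is a configuration of the transport layer, not something each doctor's office has to build itself.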

The original network rested on a complex SOAP implementation that saw
only scattered deployment and required most institutions to hire
consultants. The Direct Project will reimplement the security and
authentication through simpler protocols, starting with garden-variety
email.

Medical practices span a wide range of technical capability. Some are
barely storefront operations that consider themselves lucky to have PCs with
consumer-grade email clients. Others can afford special software that
supports S/MIME for encryption. The Direct Project has to encompass
all these participants. So the interface presented to health care
providers will be as simple as possible, but the implementations have
to be sophisticated and flexible.
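On the wire, that sophistication mostly means S/MIME: the clinical payload is encrypted for the recipient and wrapped in an `application/pkcs7-mime` part addressed to an ordinary-looking mail address. As a rough sketch (not the Direct reference implementation, which is in Java and C#; the addresses are hypothetical and the encryption step is stubbed out), the envelope a full-featured client would hand to SMTP looks something like this:

```python
# Sketch of the S/MIME envelope used for Direct-style exchange.
# A real client would first encrypt the clinical document with the
# recipient's X.509 certificate; here the ciphertext is a placeholder.
from email.message import EmailMessage

def wrap_direct_message(ciphertext: bytes, sender: str, recipient: str) -> EmailMessage:
    """Wrap already-encrypted CMS data as an S/MIME mail message."""
    msg = EmailMessage()
    msg["From"] = sender          # a Direct address looks like ordinary email
    msg["To"] = recipient
    msg["Subject"] = "Referral summary"
    msg.set_content(ciphertext,
                    maintype="application",
                    subtype="pkcs7-mime",
                    cte="base64",
                    filename="smime.p7m",
                    params={"smime-type": "enveloped-data",
                            "name": "smime.p7m"})
    return msg

fake_ciphertext = b"...CMS EnvelopedData would go here..."  # placeholder
msg = wrap_direct_message(fake_ciphertext,
                          "dr.jones@direct.example.org",
                          "dr.smith@direct.example.org")
print(msg.get_content_type())  # application/pkcs7-mime
```

To the storefront practice with a consumer-grade mail client, this is just a message with an attachment; the intermediary or the better-equipped peer does the cryptographic heavy lifting.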

This project has been conducted with the highest degree of openness
from the start. Anyone who's interested can join a working group (I
dropped in on the Documentation and Testing group to review documents)
and a wide range of volunteers from major health care providers and
EHR vendors have been collaborating. From the conference calls and
email I've been on, things look very collegial and orderly. The
upcoming hackathon is the natural next stage in this open process.

The Nationwide Health Information Network has held hackathons before,
but this one is the first for the Direct subproject and shows that
it's reaching a viable stage. A reference implementation for the
platform is nearly ready, but that's only one node in a fairly
complicated architecture. For doctors to connect to the network,
client software and other mediators are needed.

So if you're a programmer with an interest in health care, check out
the hackathon. It's a chance to see where health care is going in the
United States, and help make it happen.
