
June 12 2012

Data in use from public health to personal fitness

Back in 2010, the first health data initiative forum by the Dept. of Health and Human Services introduced the public to the idea of an agency releasing internal data in forms easy for both casual viewers and programmers to use. The third such forum, which took place last week in Washington, DC, was so enormous (1,400 participants) that it had to be held in a major convention center. Todd Park, who as CTO made HHS a leader in the open data movement, has moved up to take a corresponding role for the entire federal government. Open data is a world movement, and the developer challenges that the HDI forum likes to highlight are standard strategies for linking governments with app programmers.

Todd Park on main stage.

Following my attendance at a privacy access summit the previous day, the HDI forum made me think of a government bent on reform and an open-minded public joining hands over the heads of the hidebound health institutions that blunder onward without the benefits of tapping their own data. I am not tossing all hospitals, doctors, and clinics into this category (in fact, I am constantly talking to institutions that work with available data to improve care), but the way health care records and stores information generally holds back anyone interested in change.

The "datapalooza" was already covered on Radar by Alex Howard, so here I'll list some of the observations I made during the parts I attended.

Health and Human Services chooses torrents over leaks

Able to attend the forum only on the first day, I spent a lot of it in a session on HHS data sets at Healthdata.gov because I wanted to know exactly what the department has to offer and how the data is being used.

HHS staff at break-out session.

Several things impressed me about the procession of HHS staff that crossed the stage to give five- or ten-minute presentations on data sets. First was the ethos of data sharing that the department heads have instilled. Each staff person showed visible pride in finding data that could be put on the Web. A bit of competitive spirit drives different departments that may have more or fewer resources, and data that comes naturally in a more structured or less structured form. One person, for instance, said, "We're a small division and don't have the resources of the others, but we managed to release several data sets this year and one has an API."

Second, the department is devoting resources to quality. I've heard several complaints in the field about lack of consistency and other problems in public health data. One could hardly avoid such issues when data is being collected from hundreds of agencies scattered across the country. But the people I talked to at the HHS forum had ways of dealing with it, such as by requiring the researchers who collect data to submit it (so that trained professionals do the data entry), and running it through quality checks to look for anomalies.

Third, the department knows that outside developers coming to their site will need extra help understanding the data being collected: what the samples represent, what the scope of collection was, and so forth. In addition to a catalog powered by a Solr search engine, HHS provides direct guidance to the perplexed for those developing apps. They are also adding Linked Data elements to help developers combine data sets.
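
As a rough illustration of how a developer might query a Solr-backed catalog like this (the endpoint and field names below are hypothetical placeholders, not the actual Healthdata.gov schema), a data set search can be as simple as:

```python
# Minimal sketch of querying a Solr-style catalog over its JSON API.
# The base URL and the "title" field are hypothetical placeholders,
# not the real Healthdata.gov schema.
import json
import urllib.parse
import urllib.request

SOLR_BASE = "https://catalog.example.gov/solr/datasets"  # hypothetical endpoint

def search_datasets(keyword, rows=10):
    params = urllib.parse.urlencode({
        "q": keyword,   # free-text query, e.g. "hospital readmission"
        "wt": "json",   # ask Solr for a JSON response
        "rows": rows,
    })
    with urllib.request.urlopen(f"{SOLR_BASE}/select?{params}") as resp:
        result = json.load(resp)
    # Each "doc" in the response describes one data set in the catalog.
    return [doc.get("title", "(untitled)") for doc in result["response"]["docs"]]

if __name__ == "__main__":
    for title in search_datasets("emergency department visits"):
        print(title)
```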

A few examples of data sets include:

  • The Centers for Medicare & Medicaid Services offers aggregate data on emergency visits, hospital readmission rates (a major source of waste in health costs), and performance measurement.

  • The Administration for Children and Families has a Head Start locator that helps parents find services, aggregate data on people who apply for Low Income Home Energy Assistance, etc.

  • The Agency for Healthcare Research and Quality has longitudinal data about spending on health care and its effect on outcomes, based on an annual survey, plus a service offering statistics on hospital treatments, morbidity, etc.

  • The Assistant Secretary for Planning and Evaluation tracks workforce development, particularly in health IT, and measures the affordability of health care reflected in costs to employers, patients, and the government.

Recently, HHS has intensified its efforts by creating a simple Web interface where its staff can enter data about new data sets. Data can be uploaded automatically from spreadsheets. And a new Data Access and Use Committee identifies data sets to release.

So now we have public health aids like the Community Indicators Data Portal, which maps the use of Medicaid services to poverty indicators, infant mortality, etc.

HealthMap, created by Children's Hospital Boston, is used by a fascinating range of projects. The team scoops in huge amounts of data--mostly from news sites, but also blogs and social networks--in multiple languages around the world, and applies a Bayesian filter to determine which items are possible reports of a recent disease outbreak. After a successful flu-tracking program based on accepting reports from the public, they ran a dengue-tracking program and, in Haiti, a cholera-tracking program.
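
HealthMap's actual classifier isn't described in detail here, but as a toy sketch of the general idea (a Bayesian text filter that flags items likely to be outbreak reports), something like the following could be trained on labeled headlines. The headlines, labels, and library choice (scikit-learn) are my own illustration, not HealthMap's pipeline:

```python
# Toy sketch of a Bayesian filter for outbreak-like news items.
# This is NOT HealthMap's system; the training headlines are invented and
# far too few to be meaningful. They only illustrate the mechanics.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

headlines = [
    "Cholera cases surge in coastal district after flooding",
    "Officials confirm dengue outbreak in northern province",
    "Hospital opens new cardiology wing downtown",
    "City council debates parking fees near the clinic",
]
labels = [1, 1, 0, 0]  # 1 = possible outbreak report, 0 = not

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(headlines)
model = MultinomialNB().fit(X, labels)

new_items = ["Dozens report fever and vomiting after local festival"]
probs = model.predict_proba(vectorizer.transform(new_items))
print(f"P(outbreak report) = {probs[0][1]:.2f}")
```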

But valuable as HHS data is to public health, most of it is not very sexy to the ordinary patient or consumer. If you're curious how your Medicare charges compare with average payments for your county, go ahead and mine the data. But what about something immediately practical, such as finding the best hospital for a procedure?

Recently, it turns out, HHS has been collecting and releasing data on that level, such as comparative information on the quality of care at hospitals. So a datapalooza like the HDI forum really takes on everyday significance. HHS also provides the Healthcare.gov site, with services such as finding insurance plans for individuals and small groups.

Other jurisdictions are joining the health data movement. Many countries have more centralized systems and therefore can release large amounts of data about public health. The United Kingdom's National Health Service was featured at the HDI forum, where they boasted of posting 3,000 health indicators to their web site.

The state of Louisiana showed off a cornucopia of data, ranging from user restaurant ratings to ratings of oyster beds. Pregnancy risk factors, morbidity rates, etc. are broken down by race, sex, and other demographics. The representative freely admitted that the state has big health problems, and urgently called on developers to help it mine its data. The state recently held a "Cajun codefest" to kick off its effort. HHS also announced five upcoming local datapaloozas in other states around the U.S.

I talked to Sunnie Southern, a cofounder of a Cincinnati incubator called Innov8 for Health. They offer not only challenges for new apps, but guidance to help developers turn the apps into sustainable businesses. The organization also signs up local hospitals and other institutional users to guarantee a market to app developers. Southern describes Innov8 for Health as a community-wide initiative to support local developers and attract new ones, while maintaining deep roots among stakeholders across the health care, university, startup, investor, and employer communities. In the inaugural class, which just took place, eight companies were chosen to receive intensive mentoring, introductions and connections to potential customers and investors, and $20,000 to start their companies in 12 weeks. Health data is a core element.

How far can a datapalooza take the health care field?

Health apps are a fast-growing segment of mobile development, and the government can certainly take some of the credit, along with VC and developer recognition that there's a lot of potential money to be made fixing health care. As Todd Park said, "The health innovation ecosystem is beautifully chaotic, self-propelled, and basically out of control." That means the toothpaste can't be put back in the tube, which is a good thing.

The HDI forum is glitzy and exciting--everybody in health care reform shows up, and the stage show is slickly coordinated--but we must remember the limits of apps in bringing about systemic change. It's great that you can use myDrugCo$ts.com to find a discount drug store near you. Even better, if your employer hooks you up to data sets provided by your insurer, myDrugCo$ts.com can warn you about restrictions that affect costs. But none of this will change the crazy pricing in the insurance plans themselves, or the overuse of drugs in medicine, or the inefficient development and testing methods that lead to high medication prices in the first place.

Caucus of the Society for Participatory Medicine and friends.

Transparency by one department on one level can lead to expectations of transparency in other places too. As pricing in health care becomes more visible, it will become less defensible. But this requires a public movement. We could do great things if we could unlock the data collected by each hospital and insurance agency, but they see that data as their competitive arsenal and we are left with a tragedy of the anti-commons. It would be nice to say, "You use plenty of public data to aid your decision-making, now reciprocate with some of your own." This can be a campaign for reformers such as the Society for Participatory Medicine.

At the HDI forum, United Healthcare reported that they had enough data to profile patients at risk for diabetes and brought them in for a diabetes prevention program. This is only a sample of what can be done with data that is not yet public.

Aetna presenter shows CarePass on the main conference stage.

Aetna is leading the way with a service called CarePass, for which it is currently holding a developer challenge. CarePass offers Aetna's data through an API, and they partner with other major data centers (somewhat as Microsoft does with HealthVault) to hook up data. Practice Fusion is also offering some data to researchers.

Even the bright-faced entrepreneurs launching businesses around data from HHS and elsewhere give me pause. Certainly their success is one of the goals of the open data movement, but I worry that they will recreate the silos of the health care field in the area of patient data. What are they collecting on us as we obsessively enter our personal statistics into those devices? Who will be able to use the aggregate data building up on their servers?

So there are hints of a qualitative change that can come from quantitative growth in the release and reuse of health care data. The next step involves the use of personal data, which raises its own litany of issues in quality and privacy. That will be the subject of the last posting in this series.

June 11 2012

Health reform leaders focus on patient access to records as key barrier

A convocation of trend-setters and organizational leaders in U.S. health care was called together in Washington last Monday, June 4. The attendees advised two government organizations driving health reform--the Office of the National Coordinator at the Dept. of Health and Human Services, and the Dept. of Veterans Affairs--how to push forward one of their top goals, patient engagement.

The results of the meeting, to me, demonstrated mostly the primitive state of communications and coordinated care in the U.S. health system. In an earlier posting I discussed the sorry state of health data exchange, and Monday's patient access summit centered on the same factors of siloing and data hoarding as barriers to patient engagement.

Farzad Mostashari, the National Coordinator for Health Information Technology, tried to set the scope of the meeting as an incubator to suggest practical ways patients could use the data they get from health providers. (As I'll explain later, we also touched on data patients generate themselves.) His reasoning, which I endorse, is that patients currently can't do much with data except keep it somewhere and pass it to other health providers, so in order to engage them we need to provide tools for them to improve their health with this data.

But the pulse of the 75 or so attendees gave quite a different message: that we're nowhere near ready to discuss uses of data, and that our efforts at patient engagement should start with getting the data to the patients in the first place.

Several attendees have already blogged about various aspects of the meeting:

  • Brian Ahier summarizes the purpose and outcomes.

  • Dave Chase urges the government to create an environment that encourages the release of data to the patient.

  • Keith Boone focuses on some interesting statements and ideas aired at the meeting.

In this posting, I'll discuss:

  • Why patient access is so important, and why it doesn't happen

  • The major topics of debate at the summit

  • The three action items that came out of the meeting

  • What the patient access summit accomplished

Why patient access is so important, and why it doesn't happen

The notions of patients poring over doctors' notes, correlating their own test results, and making demands on their care providers may carry a faint whiff of utopianism, but thousands of patients do these things every day--and do them even when deprived of the electronic aids that could make these activities natural. The people in the room for the patient access summit were by no means utopians. They are intense movers in the health care field with deadlines to meet and budgets to allocate. So when they call for patient access to data, it's because they all see it as critical to solving the quality and cost problems their own organizations face.

Patient engagement is critical because most health care takes place outside the doctor's office or operating room. Patients need to take control of their own lifestyles for the problems that put a lot of strain on our health care system, such as obesity. They need to follow through on post-release instructions and monitor themselves for symptoms.

And in the siloed state of today's health system, the patients need to make sure their data gets to health providers. We heard over and over at the patient access summit how patients have entered treatment centers without the information needed to treat them, how doctors would refuse point-blank (in violation of the law) to give patients their folders, and how patients received inadequate care because of the lack of information.

Patient participation in health care is not only good for the individuals who do it, but is also crucial for prying open the system as a whole. The providers, vendors, and insurers are moving too slowly. Their standards and electronic health records lack fields for all the data people are generating through their Fitbits and Zeos, and they don't have pathways for continuously uploading patient-generated data. This lapse can be turned into a plus: device manufacturers and programmers out in the field will develop new, more flexible, more robust standards that will become the next generation of EHRs and personal health records. A strong push from empowered patients can really change the way doctors work, and the associated costs.

Major topics of debate

Opinions differ about the roles of electronic records, interchange systems, culture, and business models in the recalcitrance of doctors to release patient data, which I'll discuss in the last section of the article. Getting the answers to these questions right should determine the strategy that government and consumers use to breach the silos. But the consensus at the patient access summit was that we need to pursue these strategies fast, and that the fate of the rest of health care reform will rest on our success.

The first half of the Washington meeting meandered through various classic areas under constant debate in the health care field. This seemed necessary so that the participants in the summit could feel each other out, untangle some of their differences and ultimately come to a position of trust so they could agree on the topics in the previous section. I noted the following topics that threaded through the debate without resolution.

Technology versus culture

Whenever organizational change is on the agenda, debates come up about the importance of the technologies people use versus their workflows, attitudes, and willingness to change. I find these discussions silly because people usually find themselves pushed to an either-or position, and that just doesn't make sense. Of course technology can facilitate change, and of course the technology will be a big waste of time and money if the human participants fail to understand the behavior changes they need to make along the way.

But the Washington attendees raised these issues as part of the strategy-setting I mentioned earlier. Certainly, the government would prefer to avoid creating or mandating the use of certain technologies. The question is whether the ONC and VA can set goals and leave it up to the market to find the way.

Sometimes the health care field is so distorted and dysfunctional that the government feels it has to step in, such as when HHS created CONNECT and then Direct. Without these, the health care providers and health information exchanges (HIEs) would claim that exchanging patient data was an expensive or intractable problem. One might also interpret the release of VistA and BlueButton to the general public as the VA's statements about how health care should be conducted.

So Mostashari's original call for actions that patients could take fits into the technology end of the debate. By suggesting technological paths forward, we can effect cultural change. For instance, if a patient uses an app or web site to view all the potential reactions between the drugs she takes (and I heard one estimate this week that people in their 80s take between five and eight medications), she can warn her own doctor about an adverse reaction.
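
As a minimal sketch of what such a drug-interaction check involves (the two-entry interaction table and the medication names below are invented for illustration; a real app would query a licensed drug-interaction database), the core logic is just a pairwise lookup:

```python
# Sketch of the kind of pairwise check a drug-interaction app performs.
# The interaction table is invented for illustration only; a real app
# would consult a curated, licensed drug database.
from itertools import combinations

INTERACTIONS = {
    frozenset({"warfarin", "aspirin"}): "increased bleeding risk",
    frozenset({"lisinopril", "spironolactone"}): "risk of high potassium",
}

def check_interactions(medications):
    """Return a warning for every interacting pair in a patient's med list."""
    warnings = []
    for a, b in combinations(medications, 2):
        note = INTERACTIONS.get(frozenset({a, b}))
        if note:
            warnings.append(f"{a} + {b}: {note}")
    return warnings

print(check_interactions(["warfarin", "aspirin", "metformin"]))
# ['warfarin + aspirin: increased bleeding risk']
```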

Ultimately, the working groups that today's meeting settled on included a lot of technological innovation.

The need for standards

Standard setting is another perennial area for disagreement, because premature standard-setting, like premature optimization, can have an effect opposite to what you want. If we took all the efforts that companies put into standards that bombed in the marketplace and devoted those resources over the decades to competition between innovations, we might have an explosion of new technologies. So even if you accept the value of technology to effect culture change, you can ask where and when governments and standards committees can intervene positively.

And this caution applies to health care too. The old guard of EHRs and HIEs suffers from a lack of (useful) standards. But as I mentioned earlier, an exciting explosion of patient-centered apps and devices is developing in the absence of standards. The Washington meeting ended up endorsing many standard-setting efforts, although these applied mostly to mature fields such as EHRs.

Transfer standards versus data format standards

Mixed up in the debate over the timing of standards was a distinction between standards used for sending data around and standards used to represent the data. The former are called protocols in the communications field. HTTP is a transfer standard, for instance, whereas HTML is a data format standard. Both are needed to make the World Wide Web operate. And both ended up part of the action items from the patient access summit.
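
To make the distinction concrete, here is a minimal sketch (the URL is a hypothetical placeholder) in which HTTP does the transferring and JSON is the data format being interpreted; the same division of labor holds between a transport such as Direct and a content format such as CDA or BlueButton:

```python
# Transfer standard vs. data format standard, in miniature:
# HTTP moves the bytes; JSON tells the program what the bytes mean.
# The URL is a hypothetical placeholder, not a real service.
import json
import urllib.request

url = "https://records.example.org/patient/123/summary"  # hypothetical

with urllib.request.urlopen(url) as resp:   # HTTP = transfer standard
    payload = resp.read()

summary = json.loads(payload)               # JSON = data format standard
print(summary.get("medications", []))
```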

Privacy versus data availability

As I reported from the first health privacy conference, health care advocates argue over the importance of privacy. At the patient access summit, everybody who spoke on this topic prioritized the exchange of data. Privacy concerns are the magic amulet that providers wave at patients to ward off their requests for data. But in fact, the much-derided Health Insurance Portability and Accountability Act (HIPAA) requires providers to give patients data: that's what the terms Portability and Accountability in the name refer to. The providers are required to take reasonable steps to preserve privacy--and the Direct project aims to simplify these--but the patient can waive even these modest safeguards if he or she is anxious to get the data quickly.

Given our skepticism toward claims of security concerns, a bit of security theater we encountered as we entered the conference center is illustrative. We were warned ahead of time that the facility was secure and told to bring a government-issued photo ID. Indeed, the guard checked my ID and looked at my face when I entered, but nobody checked my name against a list to see whether I was actually supposed to be there.

A later article in this series will explore the relationships between privacy, security, patient access, accuracy, and accountability that create a philosophy of control.

Motivations for doctors versus patients

Another topic at the patient access summit that reflected a dilemma in the health care field is how much effort to aim at the doctors versus the patients, when trying to change the behavior of both. Many patients try to engage as adults in their own care and are stymied by resistant doctors. And as I pointed out in an earlier posting, the patients who need the most lifestyle changes ignore their own perilous conditions. So these considerations would suggest focusing on motivations for doctors to change.

But a market approach would suggest that, when enough patients want to have a say in their care, and have the means to choose their doctors, change will reach the examination rooms. The conclusions of the patient access summit did not reflect any particular positions along this spectrum. Participants pointed out, however, that institutions such as Kaiser Permanente that wanted patients to use their portals invested a lot in advertising them.

Pushing versus pulling data

Telephone calls, email, and online chats are push technology, in that the person sending them decides when (approximately) they are delivered. The web is a pull technology, because the recipient visits the site at his or her choosing. In health exchange, one doctor may push a patient's records to the next provider, or the next provider can pull them when the patient is due to arrive. Sometimes articulated unhelpfully as a battle for push versus pull, our discussion revealed that each had its uses.

The issue is especially salient when a patient has records stored by multiple institutions. Currently, a patient can pull records from each and (if they use a common format such as BlueButton) combine them. In fact, a mobile app named iBlueButton allows a patient to show data from providers to a doctor during a visit. But it would be much better for each institution to push information to the patient as it's added to the institution's record. This would bring us closer to the ideal situation where records are stored by a site on behalf of the patient, not the doctor.

Three action items from today's meeting

Now we get to the meat of the summit. Leaders asked participants to define areas for research and to make commitments to incorporate the results of the research teams into their products and activities. Three action items were chosen, and two were excluded from consideration at this round.

Automated downloads

A number of organizations, such as Aetna Health Plans, have adopted the BlueButton format created at the VA. In the line-up of data formats available for storing health information, BlueButton is shockingly casual. But its list of plain-text fields is easy to read and unfrightening for patients. It is also undeniably popular, as the number of VA patients downloading their data approaches one million. So the immediate impetus for the first goal of the patient access summit, dubbed "automating BlueButton," is to keep patients' records up to date and integrated by pushing data to them from institutional EHRs.

But BlueButton can be massaged into other formats that are easier for programs to manipulate, so the "automating BlueButton" task really refers to the entire movement to empower patients who want control over their records. One way to state the principle is that every action in a hospital's or doctor's EHR will be accompanied by an update to the patient's copy of the data. Hopefully this movement will soon lead to simple but program-friendly XML formats, robust transfer standards such as Direct, and universal integration of hospital and clinic EHRs with patient health records.
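
As a rough illustration of that "massaging" step, here is a sketch that turns a BlueButton-style plain-text download into a structure a program can work with. The sample text and section layout are simplified inventions for illustration, not the exact format the VA produces:

```python
# Sketch of converting a BlueButton-style plain-text record into JSON.
# The sample record and its section layout are simplified inventions,
# not the actual VA download format.
import json

SAMPLE = """\
MEDICATIONS
Lisinopril 10 mg daily
Metformin 500 mg twice daily

LAB RESULTS
HbA1c: 6.8% (2012-05-01)
"""

def parse_bluebutton(text):
    sections, current = {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.isupper():          # treat all-caps lines as section headers
            current = line.title()
            sections[current] = []
        elif current:
            sections[current].append(line)
    return sections

print(json.dumps(parse_bluebutton(SAMPLE), indent=2))
```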

Identification and access technologies

Congress has ruled out a single nation-wide ID for patients, thanks to worries from privacy advocates that the system could facilitate identity theft and commercial data mining. Some have proposed a Voluntary Universal Healthcare Identifier (VUHID), but that's encumbered with the same problems. Identification systems used nowadays for HIE are cumbersome and error-prone, and revolve around cooperating health care institutions rather than individual patients with few resources. Individual hospitals can verify patients' email addresses and passwords when they come in for treatment, but in-person authentication doesn't scale to data exchange.

A more rational solution revolves around certificates and digital signatures, which security-conscious institutions in government and industry have used for years. The approach has gotten a bit of a bad rep because it has been poorly implemented on the Web (where browsers trust too many certificate authorities, and system administrators fail to keep accurate signatures), but the health care system is quite capable of implementing it properly. The Direct Trust project is creating a set of practices and hopefully will stimulate the industry to create such a system. In fact, I think Direct Trust is already addressing the issues listed under this task. OAuth and the National Strategy for Trusted Identities in Cyberspace were also mentioned repeatedly at the summit.
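
As a minimal sketch of the certificate-and-signature building block (assuming an RSA certificate in PEM form and the third-party Python cryptography package; real Direct deployments also validate the certificate chain against a trust bundle and check revocation), verification looks roughly like this:

```python
# Minimal sketch of certificate-based signature verification, the kind of
# building block Direct-style exchange relies on. Assumes an RSA certificate
# in PEM form and the third-party "cryptography" package; real deployments
# also validate the full certificate chain and check revocation.
from cryptography import x509
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding

def sender_is_trusted(cert_pem: bytes, message: bytes, signature: bytes) -> bool:
    cert = x509.load_pem_x509_certificate(cert_pem)
    public_key = cert.public_key()
    try:
        # Raises InvalidSignature if the message was not signed by the
        # private key matching this certificate.
        public_key.verify(signature, message, padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False
```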

The questions of identifying oneself and of authorizing access to data are linked, so they were combined in a single working group even though they are somewhat distinct technically.

Standards for content

The final task approved at the patient access summit was to work further on data standards. It was late in the day and the task was defined only in a very broad manner. But I think it's an important leg of the patient access stool because current standards for patient data, such as HL7's CDA, were meant for communicating the results of clinical interventions. They'll be hard to use when patients generate and store their own data, both because they lack the appropriate fields and because they aren't designed for continuous uploads of data. Segmented access (allowing providers to see certain records while withholding records that the patient considers sensitive) was also mentioned.

Patient-generated data

I mentioned at the summit that patients are starting to generate data that could be invaluable in their treatment, and that the possession of this data gives them leverage. Doctors who are serious about treating common chronic issues such as hypertension, or any condition that can be improved through careful monitoring, will want the patient data. And patients can use their leverage to open up doctors' EHRs. As patients get more involved in their care, the very term "provider" (meaning a doctor or other professional who provides diagnosis and treatment) will become obsolete. Patients will be co-providers along with their professional team.

Patient-generated data got some attention during the day, but the attendees concluded that not enough time had been spent on it to turn it into an action item.

Privacy

The final issue on the agenda for the day was privacy. I estimate that we spent a full half-hour on it at one point, and it came up at other times as well. Because I am covering privacy in the third article of this series, I'll simply say here that the attendees were most concerned with removing privacy as an excuse for blocking data exchange, and did not treat risks to privacy as a problem to be fixed.

What did the patient access summit accomplish?

I'm proud that the ONC and VA created a major discussion forum for patient access. I think the issues that came up were familiar to all participants in the meeting, and that ONC together with industry partners is already moving forward on them. The summit provided affirmation that the health care field as a whole takes the issues seriously, and the commitments that will arise from the meeting will lend more weight to government efforts.

And a lot of the time, knowledgeable patients need to know that progressive health care leaders and the government have "got their back" as they demand their rights to know what's going on in their bodies. The Office of Civil Rights has publicly championed the patients' right to their data (in fact, the biggest fine they've levied for a HIPAA violation concerns a refusal to release data to a patient), and the initiatives we all supported last Monday will give them more tools to use it.

Regulations can make a difference. A representative from Practice Fusion told me they offered a patient download option on their EHR service years ago, but that most doctors refused to allow it. After the ONC's meaningful use regulations required patient access, adoption by doctors went up 600%.

Even as we lay the groundwork for patient access, we can look forward to the wonderful things patients and providers will be able to do with data. That will be the subject of my next article in the series, which will cover the health data initiative forum I attended the next day.

February 29 2012

Report from HIMSS 12: wrap-up of the largest health IT conference

This is a time of great promise in health care, yet an oppressive atmosphere hung over much of HIMSS (http://www.himssconference.org/). All the speakers--not least the government representatives who announced rules for the adoption of electronic health records--stressed commendable practices such as data exchange, providing the patient with information, and engaging with the patient. Many berated hospitals, doctors, and vendors for neglecting the elements that maintain health. But the thrust of most sessions was on such details as how to convert patient records to the latest classification of diseases (ICD-10).

Intelligent Hospital pavilion shows off tempting technology.

I have nothing against ICD-10 and I'm sure adopting it is a big headache that deserves attention at the conference. The reason I call the atmosphere oppressive is that I felt stuck among health care providers unable to think long-term or to embrace the systems approach that we'll need to cure people and cut costs. While some health care institutions took the ICD-10 change-over seriously and put resources into meeting the deadline, others pressured the Dept. of Health and Human Services to delay implementation, and apparently won a major reprieve. The health IT community, including HIMSS, criticized the delay. But resistance to progress usually does not break out so overtly, and remains ingrained in day-to-day habits.

But ICD-10 is a sideline to the major issue of Stage 2 meaningful use. Why, as I reported on Wednesday, were so many of the 35,000 HIMSS attendees wrapped up in the next step being forced on them by the federal government? The scandal is that these meaningful use concepts (using data to drive care, giving care-givers information that other care-givers have collected about the patient) have to be forced on them. Indeed, institutions like Kaiser Permanente that integrated their electronic records years ago and concentrated on the whole patient had relatively little work to do to conform to Stage 1, and probably have the building blocks for Stage 2 in place. And of course these things are part of the landscape of health care in other countries. (The proposed regulations were finally posted last Thursday.)

Recipients of Regina Holliday jackets record patient involvement stories.

Haven't our providers heard that an ounce of prevention is worth a pound of cure? Don't well-educated and well-paid executives invest in quality measures with the expectation that they'll pay off in the long run? And aren't we all in the field for the good of the patients? What is that snickering I hear?

Actually, I don't accept the premise that providers are all in it for the money. If so many are newly incentivized to join the government's program for a mere $15,000 per doctor (plus avoiding some cuts in Medicare payments), which is a small fraction of the money they'll have to spend implementing the program, they must know that it's time to do the right thing. Meaningful use can be a good framework to concretize the idealistic goals of health care reform, but I just wish the vendors and doctors would keep their eyes more on the final goal.

Redwood MedNet in Northern California is an example of a health information exchange that adopted standards (CONNECT, before the Direct project was in place) to simplify data exchange between health providers. Will Ross of Redwood MedNet told me that qualifying for Stage 2 would be simple for them, "but you won't hear that from many vendors in this exhibit hall."

Annual surveys by the journal Family Practice Management about its readers' satisfaction with EHRs, reviewed in one HIMSS session, showed widespread dissatisfaction that doesn't change from year to year. For instance, 39% were dissatisfied with support and training, although a few vendors rated quite high. Still, considering that doctors tend to veer away from open source solutions and pay big bucks for proprietary ones in the hope of receiving better support and training, they deserve better. It's worth noting that the longer a practice uses its system, the more likely it is to express satisfaction. But only 38% of respondents would purchase the same systems now if they weren't already locked in.

That's the big, frustrating contradiction at HIMSS. The vendors have standards (HL7 and others), they've been setting up health information exchanges (under various other names) for years, they have a big, popular interoperability lab at each conference--and yet most patients still have to carry paper records and CDs with images from one doctor to another. (A survey of HIMSS members showed that one-quarter allowed access by patients to their data, which is an advance but still just a start.) The industry as a whole has failed to make a dent in the 90,000 to 100,000 needless deaths that occur in treatment facilities each year. And (according to one speaker) 20% of patients hospitalized under Medicare have to return to the hospital shortly after discharge.

Omens of change

Suffice it to say that by my fourth day at HIMSS I was not happy. Advances come, but slowly. Examples of developments I can give a thumbs-up to at HIMSS were data sharing among physicians who use Practice Fusion, a popular example of a growing move to web services for electronic records, and a CardioEngagement Challenge funded by Novartis to encourage at-risk patients to take more interest in their health. The winner was a Sensei mobile app that acts as an automated coach. Sensei CEO Robert Schwarzberg, a cardiologist, told me he had put together phone-in coaching services for heart patients during the years before mobile apps, and was frustrated that these coaches were available less than once a week when what patients needed was round-the-clock motivation. Sensei Wellness is one of the many mobile apps that make both patients and doctors more connected, and HIMSS quite properly devoted a whole section of the exhibit floor to them.

Talking about Sensei Wellness with Dr. Robert Schwarzberg.

I dropped by the IBM booth for the obligatory demo of Watson's medical application, and some background from Dr. Josko Silobrcic. I also filled in some of this report from an earlier conversation with tech staff.

Medical diagnosis involves more structured data than solving Jeopardy riddles, structure that appears mostly in the form of links between data sets. For instance, medicines are linked to diagnoses, to lab results, and to other medicines (for example, some drugs are contraindicated when the patient is taking other drugs). Watson follows these relationships.

But because Watson is a natural language processing application--based on UIMA, which IBM donated to the Apache Foundation--it doesn't try to do much reasoning to pick out the best diagnosis or treatment, both of which are sometimes requested of it. Instead, it dumps huge indexes of medical articles into its data stores on one side, and takes in the text about the patient's complaint and doctor's evaluation on the other. Matching them up is not so different from a Jeopardy question, after all. Any possible match is considered and kept live until the final round of weighing answers, even if the chance of matching is near zero.

Dr. Josko Silobrcic before Watson demonstration.

Also because of the NLP basis for matching, there is rarely a need to harmonize disparate data taken in from different journals or medical sources.
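
To give a flavor of that candidate-and-weighing style of matching (this is a toy illustration with invented article snippets, not IBM's actual pipeline or anything near its scale), ranking an index of texts against a patient description by similarity might look like this:

```python
# Toy illustration of "match the patient text against an index of articles
# and weigh every candidate." This is not Watson's real pipeline; the
# article snippets and patient note are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

articles = [
    "Management of type 2 diabetes with metformin and lifestyle changes",
    "Community-acquired pneumonia in adults: cough, fever, and lung findings",
    "Drug interactions between warfarin and common antibiotics",
]
patient_note = "62-year-old with cough, fever, and crackles heard over the right lung"

vectorizer = TfidfVectorizer(stop_words="english")
doc_matrix = vectorizer.fit_transform(articles + [patient_note])
article_vecs = doc_matrix[: len(articles)]
patient_vec = doc_matrix[len(articles)]

# Every candidate stays alive; they are simply ranked at the end.
scores = cosine_similarity(patient_vec, article_vecs)[0]
for score, title in sorted(zip(scores, articles), reverse=True):
    print(f"{score:.2f}  {title}")
```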

I assumed that any processing that uses such a large data set and works so fast must run on a huge server farm, but the staff assured me it's not as big as one would think. For production use, of course, they'll need to take into account the need to scale. The medical informatics equivalent of a Christmas rush on sales would be an epidemic where everybody in the region is urgently hitting Watson for critical diagnoses.

Coming to peace

Healing came to me on my last day at HIMSS, at two related conferences off to the side of the main events: a meeting of Open Health Tools members and the eCollaboration forum, run by health activists who want to break down barriers to care. Both groups have partnerships with HIMSS.

Open Health Tools positions itself as an umbrella organization for projects making free software for a lot of different purposes in health care: recording, treatment, research and more. One illustrative project I got to hear about at their meeting was the Medical Imaging Network Transport (MINT), which Johns Hopkins is working on in coordination with other teams.

MINT cuts down on the transfers of huge images by doing some processing in place and transferring only portions of the data. Switching to modern storage formats (XML and JSON) and better methods of data transfer also reduces waste. For instance, current DICOM vendors transmit images over TCP, which introduces more overhead than necessary when handling the packet losses engendered by transmitting files that are several gigabytes in size. MINT allows UDP and other protocols that are leaner than TCP.
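
MINT's actual API isn't reproduced here, but as a rough sketch of the "transfer only the portion you need" idea, a partial fetch over HTTP looks like this (the URL is a hypothetical placeholder):

```python
# Rough sketch of "transfer only the portion you need": an HTTP Range
# request for the first megabyte of a multi-gigabyte study. The URL is a
# hypothetical placeholder, and this is not MINT's actual API.
import urllib.request

url = "https://imaging.example.org/studies/42/series/1/pixels"  # hypothetical

req = urllib.request.Request(url, headers={"Range": "bytes=0-1048575"})
with urllib.request.urlopen(req) as resp:
    chunk = resp.read()                      # at most 1 MiB of pixel data
    print(resp.status, len(chunk), "bytes")  # expect 206 Partial Content
```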

Best of all, MINT DICOM images can be displayed through HTML5, which means any browser can view them in good resolution, there is no need to install a specialized viewer at each location where the doctor is checking the image, and dependence on proprietary software is reduced. (The same reliance on standard browsers is also claimed by eMix in a recent interview.)

At the eCollaboration forum, E-patient Dave DeBronkart reported that being an engaged patient is still swimming upstream. It's hard to get one's records, hard to find out what treatments will cost, and hard to get taken seriously as an adult interested in monitoring one's own care. Meg McCabe of Aetna says that insurers need to offer more sophisticated guidance to patients trying to choose a health provider--simple lists of options are confusing and hard to choose from.

One speaker warned providers that if they try to open their data for collaborative care, they may find themselves hampered by contracts that maintain vendor ownership of EHR data. But speakers assured us vendors are not evil. The issue is what the providers ask for when they buy the EHR systems.

Here's the strange thing about the eCollaboration forum: they signed up enough people to fill the room ahead of time and left many potential attendees lamenting that they couldn't get in. Yet on the actual day of the event, there were about eight empty seats for every attendee. Maybe HIMSS attendees felt they had to devote all their time to the Stage 2 regulations mentioned previously. But I take the disappointing turn-out as a sign of the providers' and vendors' lack of commitment to change. Shown a dazzling roster of interesting talks about data exchange, open record sharing, and patient engagement, they're quick to sign up--but they don't show up when it counts.

As members of the general public, we can move the health care field forward by demanding more from our providers, at the point where we have some influence. Anyone looking for concrete guidance for increasing their influence as a patient can try e-Patients Live Longer: The Complete Guide to Managing Health Care Using Technology, by Nancy B. Finn.

Public attention and anger have been focused on insurers, who have certainly engaged in some unsavory practices to avoid paying for care--but nothing as destructive as the preventable errors and deaths caused by old-fashioned medical practices. And while economists complain about the 30 cents out of every dollar wasted in the American hodge-podge of payment systems, we know that unnecessary medical procedures or, conversely, preventative steps that were omitted, also suck up a lot of money. One speaker at the eCollaboration forum compared the sky-rocketing costs of health care and insurance to a financial bubble that can't last. Let's all take some responsibility for instituting better medical and reporting systems so the costs come down in a healthy manner.

Other articles about HIMSS were posted last Tuesday and Wednesday.

February 23 2012

Report from HIMSS 2012: toward interoperability and openness

I was wondering how it would feel to be in the midst of 35,000 people whose livelihoods are driven by the decisions of a large institution at the moment when that institution releases a major set of rules. I didn't really find out, though. The 35,000 people I speak of are the attendees of the HIMSS conference and the institution is the Department of Health and Human Services. But HHS just sort of half-released the rules (called Stage 2 of meaningful use), telling us that they would appear online tomorrow and meanwhile rushing over a few of the key points in a presentation that drew overflow crowds in two rooms.

The reaction, I sensed, was a mix of relief and frustration. Relief because Farzad Mostashari, National Coordinator for Health Information Technology, promised us the rules would be familiar and would hew closely to what advisors had requested. Frustration, however, at not seeing the details. The few snippets put up on the screen contained enough ambiguities and poorly worded phrases that I'm glad there's a 60-day comment period before the final rules are adopted.

There isn't much one can say about the Stage 2 rules until they are posted and the experts have a chance to parse them closely, and I'm a bit reluctant to throw onto the Internet one of potentially 35,000 reactions to the announcement, but a few points struck me enough to be worth writing about. Mostashari used his pulpit for several pronouncements about the rules:

  • HHS would push ahead on goals for interoperability and health information exchange. "We can't wait five years," said Mostashari. He emphasized the phrase "standard-based" in referring to HIE.

  • Patient engagement was another priority. To attest to Stage 2, institutions will have to allow at least half their patients to download and transfer their records.

  • They would strive for continuous quality improvement and clinical decision support, key goals enabled by the building blocks of meaningful use.

Two key pillars of the Stage 2 announcement are requirements to use the Direct project for data exchange and HL7's consolidated CDA for the format (the only data exchange I heard mentioned was a summary of care, which is all that most institutions exchange when a patient is referred).

The announcement demonstrates the confidence that HHS has in the Direct project, which it launched just a couple of years ago and which exemplifies a successful joint government/private sector effort. Direct will allow health care providers of any size and financial endowment to use email or the Web to share summaries of care. (I mentioned it in yesterday's article.) With Direct, we can hope to leave the cumbersome and costly days of health information exchange behind. The older and more complex CONNECT project will be an option as well.

The other half of that announcement, regarding adoption of the CDA (incarnated as a CCD for summaries of care), is a loss for the older CCR format, which was an option in Stage 1. The CCR was the Silicon Valley version of health data, a sleek and consistent XML format used by Google Health and Microsoft HealthVault. But health care experts criticized the CCR as not rich enough to convey the information institutions need, so it lost out to the more complex CCD.

The news on formats is good overall, though. The HL7 consortium, which has historically funded itself by requiring organizations to become members in order to use its standards, is opening some of them for free use. This is critical for the development of open source projects. And at an HL7 panel today, a spokesperson said they would like to head more in the direction of free licensing and have to determine whether they can survive financially while doing so.

So I'm feeling optimistic that U.S. health care is moving "toward interoperability and openness," the phrase I used in the title of this article and also used in a posting from HIMSS two years ago.

HHS allowed late-coming institutions (those who began the Stage 1 process in 2011) to continue at Stage 1 for another year. This is welcome because they have so much work to do, but means that providers who want to demonstrate Stage 2 information exchange may have trouble because they can't do it with other providers who are ready only for Stage 1.

HHS endorsed some other standards today as well, notably SNOMED for diseases and LRI for lab results. Another nice tidbit from the announcement is the requirement to use electronic medication administration (for instance, bar codes to check for errors in giving medicine) to foster patient safety.

July 30 2011

Report from Open Source convention health track, 2011

Open source software in health care? It's limited to a few pockets of use--at least in the United States--but if you look at it a bit, you start to wonder why any health care institution uses any proprietary software at all.

What the evidence suggests

Take the conference session by University of Chicago researchers commissioned to produce a report for Congress on open source in health care. They found several open source packages that met the needs for electronic records at rural providers with few resources, such as safety-net providers.

They found that providers who adopted open source started to make the changes that the adoption of electronic health records (or any major new system) is supposed to bring about, but rarely does in proprietary health settings:

  • They offer the kinds of extra attention to patients that improve their health, such as asking them questions about long-term health issues.

  • They coordinate care better between departments.

  • They have improved their workflows, saving a lot of money.

And incidentally, deployment of an open source EHR took an estimated 40% of the cost of deploying a proprietary one.

Not many clinics of the type examined--those in rural, low-income areas--have the time and money to install electronic records, and far fewer use open source ones. But the half-dozen examined by the Chicago team were clear success stories. They covered a variety of areas and populations, and three used WorldVistA while three used other EHRs.

Their recommendations are:

  • Greater coordination between open source EHR developers and communities, to explain what open source is and how it benefits providers.

  • Forming a Community of Practice on health centers using open source EHRs.

  • Greater involvement from the Federal Government, not to sponsor open source, but to make communities aware that it's an option.

Why do so few providers adopt open source EHRs? The team attributed the problem partly to prejudice against open source. But I picked up another, deeper concern from their talk. They said success in implementing open source EHRs depends on a "strong, visionary leadership team." As much as we admire health providers, teams like that are hard to form and consequently hard to find. But of course, any significant improvement in work processes would require such a team. What the study demonstrated is that it happens more in the environment of an open source product.

There are some caveats to keep in mind when considering these findings--some limitations to the study. First, the researchers had very little data about the costs of implementing proprietary health care systems, because the vendors won't allow customers to discuss it, and just two studies have been published. Second, the sample of open source projects was small, although the consistency of positive results was impressive. And the researchers started out sympathetic to open source. Despite the endorsement of open source represented by their findings, they recognized that it's harder to find open source and that all the beneficial customizations take time and money. During a Birds-of-a-Feather session later in the conference, many of us agreed that proprietary solutions are here for quite some time, and can benefit by incorporating open source components.

The study nevertheless remains important and deserves to be released to Congress and the public by the Department of Health and Human Services. There's no point to keeping it under wraps; the researchers are proceeding with phase 2 of the study with independent funding and are sure to release it.

So who uses open source?

It's nice to hear about open source projects (and we had presentations on several at last year's OSCon health care track) but the question on the ground is what it's like to actually put one in place. The implementation story we heard this year was from a team involving Roberts-Hoffman Software and Tolven.

Roberts-Hoffman is an OSCon success story. Last year they received a contract from a small health care provider to complete a huge EHR project in a crazily short amount of time, including such big-ticket requirements as meeting HIPAA requirements. Roberts-Hoffman knew little about open source, but surmised that the customization it permitted would let them achieve their goal. Roberts-Hoffman CEO Vickie Hoffman therefore attended OSCon 2010, where she met a number of participants in the health care track (including me) and settled on Tolven as their provider.

The customer put some bumps in the road to the open source approach. For instance, they asked with some anxiety whether an open source product would expose their data. Hoffman had a little educating to do.

Another hurdle was finding a vendor to take medication orders. Luckily, Lexicomp was willing to work with a small provider and showed a desire to have an open source solution for providers. Roberts-Hoffman ended up developing a Tolven module using Lexicomp's API and contributing it back to Tolven. This proprietary/open source merger was generally quite successful, although it was extra work providing tests that someone could run without a Lexicomp license.

In addition to meeting what originally seemed an impossible schedule, Tolven allowed an unusual degree of customization through templating, and ensured the system would work with standard medical vocabularies.

Why can't you deliver my data?

After presentations on health information exchanges at OSCON, I started to ruminate about data delivery. My wife and I had some problems with appliances this past Spring and indulged in some purchases of common household items, a gas grill from one company and a washing machine from another. Each offered free delivery. So if low-margin department stores can deliver 100-pound appliances, why can't my doctor deliver my data to a specialist I'm referred to?

The CONNECT Gateway and Direct project hopefully solve that problem. CONNECT is the older solution, with Direct offering an easier-to-implement system that small health care providers will appreciate. Both have the goal of allowing health care providers to exchange patient data with each other, and with other necessary organizations such as public health agencies, in a secure manner.

David Riley, who directed the conversion of CONNECT to an open-source, community-driven project at the Office of the National Coordinator in the Department of Health and Human Services, kicked off OSCon's health care track by describing the latest developments. He had led off last year's health care track with a perspective on CONNECT delivered from his role in government, and he moved smoothly this time into covering the events of the past year as a private developer.

The open-source and community aspects certainly proved their value when a controversy and lawsuit over government contracts threatened to stop development on CONNECT. Although that's all been resolved now, Riley decided in the Spring to leave government and set up an independent non-profit foundation, Alembic, to guide CONNECT. The original developers moved over to Alembic, notably Brian Behlendorf, and a number of new companies and contributors came along. Most of the vendors who had started out on the ONC project stayed with the ONC, and were advised by Riley to do so until Alembic's course was firm.

Lots of foundations handle open source projects (Apache, etc.) but Riley and Behlendorf decided none of them were proper for a government-centric health care project. CONNECT demanded a unique blend of sensitivity to the health care field and experience dealing with government agencies, who have special contract rules and have trouble dealing with communities. For instance, government agencies are tasked by Congress with developing particular solutions in a particular time frame, and cannot cite as an excuse that some developer had to take time off to get a full-time job elsewhere.

Riley knows how to handle the myriad pressures of these projects, and has brought that expertise to Alembic. CONNECT software has been released and further developed under a BSD license as the Aurion project. Now that the ONC is back on track and is making changes of its own, the two projects are trying to heal the fork and are following each other's changes closely. Because Aurion has to handle sensitive personal data deftly, Riley hopes to generalize some of the software and create other projects for handling personal data.

Two Microsoft staff came to OSCon to describe Direct and the open-source .NET libraries implementing it. It turned out that many in the audience were uninformed about Direct (despite an intense outreach effort by the ONC) and showed a good deal of confusion about it. So speakers Vaibhav Bhandari and Ali Emami spent the whole time allotted (and more) explaining Direct, with time for just a couple slides pointing out what the .NET libraries can do.

Part of the problem is that security is broken down into several different functions in ONC's solution. Direct does not help you decide whether to trust the person you're sending data to (you need to establish a trust relationship through a third party that grants certificates) or find out where to send it (you need to know the correspondent's email address or another connection point). But two providers or other health care entities who make an agreement to share data can use Direct to do so over email or other upcoming interfaces.

There was a lot of cynicism among attendees and speakers about whether government efforts, even with excellent protocols and libraries, can get doctors to offer patients and other doctors the necessary access to data. I think the reason I can get a big-box store to deliver an appliance but I can't get my doctor to deliver data is that the big-box store is part of a market, and therefore wants to please the customer. Despite all our talk of free markets in this country, health care is not a market. Instead, it's a grossly subsidized system where no one has choice. And it's not just the patients who suffer. Control is removed from the providers and payers as well.

The problem will be solved when patients start acting like customers and making appropriate demands. If you could say, "I'm not filling out those patient history forms one more time--you just get the information where I'm going," it might have an effect. More practically speaking, let's provide simple tools that let patients store their history on USB keys or some similar medium, so we can walk into a doctor's office and say "Here, load this up and you'll have everything you need."

What about you, now?

Patient control goes beyond data. It's really core to solving our crisis in health care and costs. A lot of sessions at OSCon covered things patients could do to take control of their health and their data, but most of them were assigned to the citizen health track (I mentioned them at the end of my preview article a week ago) and I couldn't attend them because they were concurrent with the health care track.

Eri Gentry delivered an inspiring keynote about her work in the biology start-up BioCurious, Karen Sandler (who had spoken in last year's health care track) scared us all with the importance of putting open source software in medical devices, and Fred Trotter gave a brief but riveting summary of the problems in health care. Fred also led a session on the Quantified Self, which was largely a discussion with the audience about ways we could encourage better behavior in ourselves and the public at large.

Guaranteed to cause meaningful change

I've already touched on the importance of changing how most health care institutions treat patients, and how open source can help. David Uhlman (who has written a book for O'Reilly with Fred Trotter) covered the complex topic of meaningful use, a phrase that appeared in the Recovery Act of 2009 and that drives just about all the change in current U.S. institutions. The term "meaningful use" implies that providers do more than install electronic systems; they use them in ways that benefit the patients, the institutions themselves, and the government agencies that depend on their data and treatments.

But Uhlman pointed out that doctors and health administrators--let alone the vendors of EHRs--focus on the incentive money and seem eager to do the minimum that gets them a payout. This is self-defeating, because the government will raise the requirements for meaningful use over the years and will overwhelm quick-and-dirty implementations that fail to solve real problems. Of course, the health providers keep pushing the more stringent requirements back to later years, but they'll have to face the music someday. Perhaps the delay will be good for everyone in the long run, because it will give open source products a chance to demonstrate their value and make inroads where they are desperately needed.

As a crude incentive to install electronic records, meaningful use has been a big success. Before the Recovery Act was passed, 15%-20% of U.S. providers had EHRs. Now the figure is 60% or 70%, and by the end of 2012 it will probably be 90%. But it remains to be seen whether doctors use these systems to make better clinical decisions, follow up with patients so they comply with treatments, and eliminate waste.

Uhlman said that technology accounts for about 20% of the solution. The rest is workflow. For instance, every provider should talk to patients on every visit about central health concerns, such as hypertension and smoking. Research has suggested that this will add 30% more time per visit. If it reduces illness and hospital admissions, of course, we'll all end up paying less in taxes and insurance. His slogan: meaningful use is a payout for quality data.

It may be surprising--especially to an OSCon audience--that one of the biggest hurdles to achieving meaningful use is basic computer skills. We're talking here about typing information in correctly, knowing that you need to scroll down to see all the information on the screen, and the like. All the institutions Uhlman visits think they're in fine shape and that everybody has the basic skills, but every assessment he's done shows that 20%-30% of the staff are computer novices. And of course, facilities are loath to spend extra money to develop these skills.

Open source everywhere

Open source has image and marketing problems in the health care field, but solutions are emerging all over the place. Three open source systems right now are certified for meaningful use: ClearHealth (Uhlman's own product), CareVue from MedSphere, and WorldVistA. OpenEMR is likely to join them soon, having completed the testing phase. vxVistA is certified but may depend on some proprietary pieces (the status was unclear during the discussion).

Two other intriguing projects presented at OSCon this year were popHealth and Indivo X. I interviewed architects from Indivo X and popHealth before they came to speak at OSCon. I'll just say here that popHealth has two valuable functions. It helps providers improve quality by providing a simple web interface that makes it easy for them to view and compare their quality measures (for instance, whether they offered appropriate treatment for overweight patients). Additionally, popHealth saves a huge amount of tedious manual effort by letting them automatically generate reports about these measures for government agencies. Indivo fills the highly valued space of personal health records. It is highly modular, permitting new data sources and apps to be added; in fact, speaker Daniel Haas wants it to be an "app store" for medical applications. Both projects use modern languages, frameworks, and databases, facilitating adoption and use.
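To make the idea of a quality measure concrete, here is a toy version of the numerator-and-denominator arithmetic that such reports boil down to. The patient records and the "adults with a documented BMI" measure are invented for illustration; this is not popHealth code, whose real measures follow detailed federal specifications.

    # Illustrative only: a toy quality measure in the spirit of what popHealth
    # automates. The patient records and the "adults with a documented BMI"
    # measure are invented; real measures follow detailed specifications.
    patients = [
        {"id": 1, "age": 52, "bmi_recorded": True},
        {"id": 2, "age": 34, "bmi_recorded": False},
        {"id": 3, "age": 71, "bmi_recorded": True},
        {"id": 4, "age": 16, "bmi_recorded": False},  # under 18: excluded
    ]

    denominator = [p for p in patients if p["age"] >= 18]       # measure applies
    numerator = [p for p in denominator if p["bmi_recorded"]]   # measure met

    print(f"BMI documented for {len(numerator)} of {len(denominator)} eligible "
          f"patients ({100 * len(numerator) / len(denominator):.0f}%)")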

Other health care track sessions

An excellent and stimulating track was rounded out with several other talks.

Shahid Shah delivered a talk on connecting medical devices to electronic record systems. He adroitly showed how the data collected from these devices is the most timely and accurate data we can get (better than direct reports from patients or doctors, and faster than labs), but we currently let it slip away from us. He also went over standard pieces of the open source stacks that facilitate the connection of devices, talked a bit about regulations, and discussed the role of routine engineering practices such as risk assessments and simulations.
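To give a sense of what device integration involves, here is a rough sketch of the kind of observation message a device gateway might hand to a record system. It is illustrative only: the segments form a bare-bones HL7 v2-style ORU message with made-up sender and patient details, and real interfaces negotiate far more fields, usually through an interface engine such as Mirth.

    from datetime import datetime

    def build_oru_message(patient_id, patient_name, systolic, diastolic):
        """Assemble a minimal, illustrative HL7 v2 ORU^R01 observation message.

        Segments and fields are simplified for readability; production
        interfaces agree on exact segments and code systems with the EHR.
        """
        ts = datetime.now().strftime("%Y%m%d%H%M%S")
        segments = [
            f"MSH|^~\\&|DEVICE_GW|CLINIC|EHR|HOSPITAL|{ts}||ORU^R01|{ts}01|P|2.5",
            f"PID|1||{patient_id}||{patient_name}",
            f"OBR|1|||BP^Blood pressure panel|||{ts}",
            f"OBX|1|NM|8480-6^Systolic blood pressure^LN||{systolic}|mm[Hg]|||||F",
            f"OBX|2|NM|8462-4^Diastolic blood pressure^LN||{diastolic}|mm[Hg]|||||F",
        ]
        return "\r".join(segments)  # HL7 v2 uses carriage returns between segments

    print(build_oru_message("12345", "DOE^JANE", 128, 82))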

Continuing on the quality theme, David Richards mentioned some lessons he learned designing a clinical decision support system. It's a demanding discipline. Accuracy is critical, but results must be available quickly so the doctor can use them to make decisions during the patient visit. Furthermore, the suggestions returned must be clear and precise.

Charlie Quinn talked about the collection of genetic information to achieve earlier diagnoses of serious conditions. I could not attend his talk because I was needed at another last-minute meeting, but I sat down for a while with him later.

The motto at his Benaroya Research Institute is to have diagnosis be more science, less art. With three drops of blood, they can do a range of tests on patients suspected of having particular health conditions. Genomic information in the blood can tell a lot about health, because blood contains viruses and other genomic material besides the patient's own genes.

Tests can compare the patients to each other and to a healthy population, narrowing down comparisons by age, race, and other demographics. As an example, the institute took samples before a vaccine was administered, and then at frequent intervals during the month afterward. They could tell when the vaccine had the most powerful effect on the body.

The open source connection here is the institute's desire to share data among multiple institutions so that more patients can be compared and more correlations can be made. Quinn said it's hard to get institutions to open up their data.

All in all, I was energized by the health care track this year, and really impressed with the knowledge and commitment of the people I met. Audience questions were well-informed and contributed a lot to the presentations. OSCon shows that open source health care, although it hasn't broken into the mainstream yet, already inspires a passionate and highly competent community.

July 22 2011

Preview of OSCON's health care track

The success of our health care track at the O'Reilly Open Source convention last year (which I covered in a series of blogs) called for a follow-up. This year we offer another impressive line-up. In fact, we had to turn away several interesting presenters, some of whom I am following up with for separate interviews or work projects. This year we're looking more at what you — patients, clinicians, and researchers — can do with the data you collect, while we continue our coverage of critical IT parts of the health care system.

The health care sessions are tucked away in a little-trafficked area of the Oregon Convention Center. To get to the room, you have to come down to the ground level and walk all the way around, away from the registration area, to the part of the building facing the Boulevard. I'm writing this blog to encourage more OSCON attendees to take the steps there.

Open source and health care go together like alkyls and hydroxyls. Open source software offers greater standards compliance, which helps institutions exchange critical data, and gives the extremely diverse field of health providers the flexibility they need to offer the kinds of interfaces and services they want.

And open source software is starting to take its rightful place in the health care field. The people that the federal government put in charge of driving improvements in health care — the Office of the National Coordinator in the Department of Health and Human Services — know the benefits of open source well, and have instigated many such projects. OSCON highlights some of the best-known, as well as some valuable ones that most people have never heard of.

Alison Muckle and Jason Goldwater will report on the state of open source software in health care. Under contract from the ONC, these researchers studied several systems and found they could provide the features needed by health care providers at a low cost.


The CONNECT Gateway was an existing government project that the ONC decided to make open source. It cut through the proprietary mess in health information exchanges that has been holding doctors back from sharing information on patients for years. Instead of individual translations from one proprietary record system to another (leading to N² translation procedures among N systems), CONNECT provided a standard protocol that vendors are adopting.
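The arithmetic behind that N² figure is worth spelling out: point-to-point translation needs a mapping for every ordered pair of systems, while a common protocol needs just one adapter per system. A quick sketch (the system counts are arbitrary):

    # Quick check of the scaling argument: point-to-point mappings grow
    # quadratically with the number of record systems, while adapters to a
    # shared protocol grow linearly. The system counts are arbitrary.
    for n in (5, 10, 50):
        pairwise = n * (n - 1)   # one directed translation per ordered pair
        shared = n               # one adapter per system to the common protocol
        print(f"{n} systems: {pairwise} pairwise translations vs. {shared} adapters")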

Drawing on the expertise in open source communities demonstrated by Brian Behlendorf (of Apache fame) and David Riley, the ONC built a robust community around CONNECT, including both commercial entities and individuals who care about health care. Behlendorf and Riley spun out the non-profit Alembic Foundation to coordinate further community efforts, and Riley is coming to the health care track to describe where the project is going.

Although CONNECT is going to turn up more and more in health information exchanges, it is SOAP-based and heavyweight for casual exchanges among small medical practices. The rural one-physician practice needs to exchange data as much as the hospitals that use CONNECT, and needs to meet the same requirements for privacy. The Direct project provides secure email and other simple channels between facilities that trust each other.

Microsoft provided an open source .NET library for creating tools that use Direct, and its many facets will be covered by Vaibhav Bhandari and Ali Emami.

As Bhandari's and Emami's talk shows, implementation is a big part of the job of making standards and open source software work. Three representatives of open source projects will discuss their collaboration on an open source health project.

Among the lesser-known beneficiaries of ONC funding is the popHealth project, run by MITRE. The vision behind this project is improving the quality of health care, which requires collecting data from each health care provider. When providers know they're being measured for quality, they do more of the right things like encouraging patients to come in for follow-up care. And when we know where we did well and not so well, we can apply resources to raise up the laggards.

So the health care reform in the stimulus bill, in calling for "meaningful use" of provider data, gives the providers incentives to send a range of data to the government, such as how many smokers and diabetics they treat. popHealth hooks into electronic health records and makes it easy to generate reports on standard measures of quality. It also provides simple Web interfaces where providers can check their data, and this encourages them to track quality themselves. Programmer Andrew Gregorowicz will speak about popHealth at OSCON. (I interviewed Gregorowicz about his presentation.)

Another intriguing presentation can be expected from David Uhlman, who covers not only the use of open source for meeting meaningful use requirements, but also how providers can use open source tools to manage the data.

David Richards also covers ways to use data for planning and quality improvement, a practice generally known as clinical decision support.

Patients can collect and use data as well, as explained by Fred Trotter in his talk on the quantified self and software.

Personal health records let patients keep track of their health information, but as the recent demise of Google Health shows, it's hard to make a system that ordinary people find useful. Institutions can use the open source Indivo system to give clients access to PHRs. Its API and uses will be described in a talk by chief architect Daniel Haas. (I interviewed Haas about his presentation.)
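For a feel of what "apps talking to a personal health record" looks like, here is a deliberately generic sketch of an app fetching documents over REST. The host, path, and token are placeholders rather than Indivo's actual API, which defines its own endpoints and uses OAuth for app authorization.

    import urllib.request

    # Hypothetical sketch of an app pulling documents from a personal health
    # record over REST. The host, path, and bearer token are placeholders,
    # not Indivo's actual API, which uses its own endpoints and OAuth.
    record_id = "example-record-id"
    url = f"https://phr.example.org/records/{record_id}/documents/"
    request = urllib.request.Request(
        url, headers={"Authorization": "Bearer EXAMPLE_TOKEN"})

    with urllib.request.urlopen(request) as response:
        print(response.status, response.read()[:200])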

Another key link in the chain of patient data is the increasing number of devices that measure blood pressure, glucose levels, and other vital signs. Shahid Shah explains why high-quality treatment depends on connecting these devices to health records. (I interviewed Shah about his presentation.)

Charlie Quinn addresses himself to researchers trying to tame the flood of data that will be generated by meaningful use and other international sources.

Several sessions in other tracks are also related to health care.

There's so much happening this year at OSCON that I wish I were multi-threaded. (But, actually, threads are on the way out this year — didya hear? Asynchronous callbacks are in.) I hope some of you can make time, by cloning yourselves if necessary, and join us at some of the talks in this blog.





July 21 2011

OSCON Preview: Interview with Greg Biggers on DIY clinical trials

Participatory, open methods are revolutionizing some of the most rigid and controlled practices of modern life. Greg Biggers (who is presenting a talk on this topic at the O'Reilly Open Source convention with co-presenter Raymond McCauley) heralds the opening of clinical trials as a way to accelerate findings, reveal more data of value to future trials, and--perhaps most important--make participants feel really good about taking part.

In this podcast interview, I talk to Greg about:

  • The benefits of participants sharing information about their experiences during clinical trials

  • Why secrecy is not necessarily a good thing in trials

  • Why researchers are warming to these kinds of trials

  • The sense of engagement and ownership created by the system

July 27 2010

Wrap-up of the health care IT track at O'Reilly's Open Source convention

The first health care track to be included in an O'Reilly conference covered all three days of sessions at last week's Open Source convention and brought us 22 talks from programmers, doctors, researchers, corporate heads, and health care advocates. We grappled throughout these three days--which included two popular and highly vocal Birds of a Feather gatherings--with the task of opening up health care.

It's not surprising that, given this was an open source conference,
the point we heard from speakers and participants over and over again
was how critical it is to have open data in health care, and how open
source makes open data possible. Like most commercial fields, health
care is replete with managers and technologists who don't believe open
source software can do the job of powering and empowering busy
clinicians in high-risk situations. Some of the speakers spent time
challenging that view.

I decided over the course of the week that the health care industry
has two traits that make it more conservative than many fields. On the
one hand, the level of regulation and certification is mind-boggling.
Hardly any technical job can be taken without a particular course of
training and a certificate. Privacy regulations--which are interpreted
somewhat differently at every clinic--get in the way of almost anyone
doing anything new. Software has to be certified too, not something
that software firms in most domains are accustomed to. All these
controls are in place for good reason, and help you feel safe
proffering your arm for a needle or popping the pills each day your
doctor told you to take.

Paradoxically, though, the health care field is also resistant to
change because the actors in it are so independent. Health care is the
most fragmented industry in the country, with 80% of medical practices
consisting of one or two physicians.

Doctors don't like to be told what to do. A lot of them are not
persuaded that they should supplement their expert opinion with the
results of evidence-based medicine and clinical decision support, the
big campaigns right now among health care researchers and leaders
within the Administration, notably the recent appointee Donald Berwick
at the Centers for Medicare and Medicaid Services.

And even medical researchers are hard to gather around one set of
standards for data, because each one is looking for new ways to cut
and crunch the results and believes his or her approach is special.

So these are the conditions that software developers and vendors have
to deal with. Beckoning us forward are the Administration's
"meaningful use" criteria, which list the things a health care record
system should do to improve health care and cut costs.

Open source definitely needs more commercial champions to bridge the
classic gap in packaging and support between the developer community
and the not-so-computer-savvy health care teams. We heard from three
such companies at the conference: Mirth, vxVistA, and Medsphere.

Of the major projects in electronic health records presented at the
conference --VistA, Tolven,
and openEMR--two were developed for
purposes outside the mainstream U.S. health care industry (VistA for
the Veterans Administration and openEMR for developing countries).
Although all these projects can point to successful installations in
mainstream organizations, they haven't hit the critical mass that
makes inherently conservative health care practices feel comfortable
adopting them.

But in this specific area of electronic records, I think the
proprietary software vendors are equally challenged to show that they
can meet the nation's needs. After some thirty years, they have become
common only in large hospitals and penetrated only a small number of
those small providers I mentioned before. The percentage of health
care providers who use electronic health records is somewhere between
18% and the low 20s.

Licensing can easily be $15,000 per year per doctor, which small
practices just don't have. I won't harp on this, because converting
old records costs more than the licenses, and converting your whole
workflow and staff behavior is harder still. More disturbing is that a
large number of providers who go through the strain of installing
electronic records find that they don't produce cost savings or other
benefits.

Electronic records have been a success at huge providers like Partners
in Massachusetts and Kaiser Permanente in California, but one speaker
reported that Kaiser had to spend one billion (yes, that's a "b")
dollars to implement the kinds of data exchange and quality control
functions specified by the meaningful use criteria.

But we have to look past the question of who would win the race to
digitize the offices of doctors in the U.S.--and around the world--and
envision a more open health care system where data can drive
high-quality care. I covered the first two days of the health care
track in the following blogs:

href="http://radar.oreilly.com/2010/07/day-one-of-the-health-care-it.html">
Day one of the health care IT track at O'Reilly's Open Source
convention

href="http://radar.oreilly.com/2010/07/vista-scenarios-and-other-cont.html">
VistA scenarios, and other controversies at the Open Source health
care track

and I'll summarize the tracks from day 3 here.

Open source for the things that keep you alive


Karen Sandler, a lawyer from the Software Freedom Law Center,
spoke
about the hundreds of thousands of devices--pacemakers,
insulin delivery devices, defibrillators, and others--that are
implanted in people's bodies each year. These devices fail sometimes,
and although reports do not classify which failures are caused by
software problems, some of them pretty clearly are.

The FDA does not audit software as part of the approval process for
devices, although it occasionally requires the manufacturer to show it
the software when failures are reported. Devices are also controlled
by unencrypted messages over ordinary wireless connections. (The
manufacturers avoid encryption in order to spare the device's
battery.) In short, software with control over life and death is being
installed in millions of people with essentially no regulation.

Sandler's key policy call is to force the source code open for
auditing purposes. She also would like to see open hardware and give
the patients the right to alter both hardware and software, although
these are more remote possibilities. Sandler's talk, based both on
careful research and painful personal health experiences, drew a
sizeable audience and excited fervent sympathy. The talk was aptly
timed just as the SFLC released a report on this issue
(http://www.softwarefreedom.org/news/2010/jul/21/software-defects-cardiac-medical-devices-are-life-/).

HealthVault and open data on the web

Two brief talks from Microsoft programmers, Vaibhav Bhandari and Teddy
Bachour, did a nice job of introducing key standards in the health
care field and showing how flexible, carefully designed tools could
turn those standards into tools for better patient and doctor control
over data.

I felt that standards were underrepresented in our health care track,
and scheduled a BOF the night before where we discussed some of the
general issues making standards hard to use. Bhandari showed a few of
the libraries that Microsoft HealthVault uses to turn standards into
useful ways to store and manipulate health data. Bachour showed the use
of Microsoft toolkits, some of them open source on CodePlex.

As an example of what programmers can do with these libraries and
toolkits, the Clinical Documentation Solution Accelerator enhances
Microsoft Word so that, as a doctor enters a report of a patient
visit, Word can prompt for certain fields and offer a selection of
valid keywords for fields such as diagnoses and medications.

Data mining with open source tools

David Uhlman, who had spoken on Thursday about VistA and his company
ClearHealth, ended the
health care track with a dazzling
tour applying neural network analysis, genetic algorithms,
visualization, and other tools to basic questions such as "How many of
my patients are likely to miss their visits today?" and common tasks
such as viewing multiple lab results together over time.

Every conference has to have a final session, of course, and every
final session suffers from decreased attendance. So did Uhlman's
scintillating talk, but I felt that his talk deserves more attention
because he goes to the heart of our job in health care IT: to take the
mounds of new data that electronic records and meaningful use will
generate and find answers to everyday problems bedeviling
practitioners.

Luckily, Uhlman's talk was videotaped--as were all the others that I
reported in my three blogs--and will be put on the Web at some point.
Stay tuned, and stay healthy.

July 23 2010

VistA scenarios, and other controversies at the Open Source health care track

The history and accomplishments attributed to VistA, the Veterans
Administration's core administrative software, mark it as one of the
most impressive software projects in history. Still, lots of smart
people in the health care field deprecate VistA and cast doubt that it
could ever be widely adopted. Having spent some time with people on
both sides, I'll look at their arguments in this blog, and then
summarize other talks I heard today at the Open Source Convention
health care track.

Yesterday, as I described in my previous blog, we heard an overview of
trends in health care and its open source side in particular. Two open
source projects offering electronic health records were presented,
Tolven and openEMR. Today was VistA day, and those who stayed all the
way through were entertained by accolades of increasing fervor from
the heads of vxVistA, Medsphere, and ClearHealth. (Anyone
who claims that VistA is cumbersome and obsolete will have to explain
why it seems to back up so many successful companies.) In general, a
nice theme to see today was so many open source companies making a go
of it in the health care field.

VistA: historical anomaly or the future of electronic medical systems?

We started our exploration of VistA with a stirring overview by
Phillip Longman, author of the popular paperback book Best Care
Anywhere: Why VA Health Care is Better Than Yours. The story of
VistA's development is a true medical
thriller, with scenes ranging from sudden firings to actual fires
(arson). As several speakers stressed, the story is also about how the
doctors at the VA independently developed the key aspects of open
source development: programming by the users of the software, loose
coordination of independent coders, freedom to fork, and so on.

Longman is convinced that VistA could and should be the basis of
universal health records in the U.S., and rains down omens of doom on
the comprehensive health care bill if it drives physicians to buy
proprietary health record systems.

VistA is much more than an electronic health record system, and even
bigger than a medical system. It is really a constellation of hundreds
of applications, including food preparation, library administration,
policing, and more.

The two main objections to VistA are:


That it is clunky old code based on an obsolete language and database technology

As a project begun by amateurs, VistA probably contains some fearsome
passages. Furthermore, it is written in MUMPS (standardized by ANSI as
simply M), a language that dates from the time of LISP and
COBOL. Predating relational databases, MUMPS contains a hierarchical
database based on a B*-tree data structure.
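For readers who have never seen MUMPS, the following Python sketch mimics what a global roughly is: a sparse tree of values addressed by a chain of subscripts rather than flat relational rows. The subscripts and values are invented; VistA's real globals are far richer.

    # A rough Python analogue of a MUMPS global: data lives in a sparse tree
    # addressed by chains of subscripts rather than in relational rows.
    # These subscripts and values are invented for illustration.
    patients = {}  # stands in for a global such as ^DPT

    patients.setdefault(1001, {})["name"] = "DOE,JANE"
    labs = patients[1001].setdefault("labs", {}).setdefault("2010-07-20", {})
    labs["K+"] = 3.4
    labs["NA+"] = 139

    # Traversal walks the subscript tree, much as MUMPS code $ORDERs through
    # subscripts; an export to MySQL would flatten each leaf into a row.
    for visit_date, results in patients[1001]["labs"].items():
        for test, value in results.items():
            print(patients[1001]["name"], visit_date, test, value)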

Supporters of VistA argue that anything qualifying as "legacy code"
can just as well be called "stable." They can also answer each of
these criticisms:

  • The code has been used heavily by the VA long enough to prove that
    it is extendable and maintainable.

  • It is strangely hypocritical to hear VistA's use of MUMPS criticized
    by proprietary vendors when so many of them are equally dependent on
    that language. Indeed, the best-known vendors of proprietary health
    care software, including Epic and InterSystems, use MUMPS. Need I
    remind readers that we put a man on the moon using 1960s-style
    FORTRAN?

    It's interesting to learn, however, that ClearHealth is migrating
    parts of VistA away from MUMPS and does most of its coding in
    higher-level languages (and many modern programmers would hardly offer
    praise for the language chosen for ClearHealth's interface, PHP).

  • Similarly, many current vendors use the Cache hierarchical
    database. Aspersions concerning pre-relational databases sound less
    damning nowadays in an age of burgeoning interest in various NoSQL
    projects. Still, Medsphere and the community-based WorldVistA project are
    creating a SPARQL interface and a mechanism for extracting data from
    VistA into a MySQL database.


That it works well only in the unique environment of the Veterans Administration

This critique seems to be easier to validate through experience. The
VA is a monolithic, self-contained environment reflected in VistA. For
instance, the critical task of ordering prescriptions in VistA depends
on the pharmacy also running VistA.

Commercial pharmacies could theoretically interact with VistA, but it
would require effort on the part of those companies, which in turn
would depend on VistA being adopted by a substantial customer base of
private hospitals.

Several successful deployments of VistA by U.S. hospitals, as well as
adoption by whole networks of hospitals in several other countries,
indicate that it's still a viable option. And the presence of several
companies in the space shows that adopters can count on support.

On the other hand, the competing implementations by vxVistA,
Medsphere, and ClearHealth complicate the development landscape. It
might have been easier if a single organization such as WorldVistA
could have unified development as the Apache or GNOME foundation does.

vxVistA has come in for particular criticism among open source
advocates. In fact, the speakers at today's conference started
out defensive, making me feel some sympathy for them.

vxVistA's developers, the company DSS, kept their version of VistA
closed for some time until they had some established customers.
Speaker Deanne Clark argued that they did this to make sure they had
enough control over their product to produce some early successes,
warning that any failure would hurt the image of the whole VistA
community. I don't know why a closed development process is necessary
to ensure quality, but I'll accept her explanation. And DSS seems to
be regarded highly for its quality work by everyone, including its
critics.

More galling to other open source advocates is that when DSS did
release vxVistA as open source, they did so under an Eclipse license
that is incompatible with the GPL used by WorldVistA.

I wouldn't dare guess whether VistA will continue as a niche product
or will suddenly emerge to eat up the U.S. market for electronic
medical systems. But I think it's definitely something to watch.

The odd position of the VA as the source for new versions of VistA, as
well as its role as VistA's overwhelmingly largest user, could also
introduce distortions into the open source development pattern outside
the VA. For instance, commercial backers of VistA are determined to
get it certified for meaningful use so that their clients can win
financial rewards from the Department of Health and Human
Services. But the VA doesn't have to be certified for meaningful use
and doesn't care about it. (As David Uhlman of ClearHealth pointed
out, nearly everything in the meaningful use criteria was done thirty
years ago by the VA using VistA.)

The VA even goes through periods of refusing bug fixes and
improvements from the outside community. Luckily, the VA lets some of
its programmers participate on WorldVistA forums, and seems interested
in getting more involved.

Other presentations

Attendance varied between 30 and 70 people for today's health care
sessions. Roni Zeiger of Google brought out a big crowd for his discussion
of Google's interest in health care, with a focus on how its API
accepts data from devices.

Zeiger pointed out that we lead most of our lives outside doctor's
offices (unless we're very unlucky) and that health information should
be drawn from everyday life as well. A wide range of devices can
measure everything from how fast we walk to our glucose levels. Even
if all you have is a smart phone, there are a lot of things you can
record. Collecting this kind of data, called Observations of Daily
Living, is becoming more and more popular.

  • One app uses GPS to show your path during a run.

  • Another app uses the accelerometer to show your elevation during a
    bike ride.

  • One researcher uses a sensor, stuck into an inhaler, to feed data to a
    phone and collect information on where and when people have asthma
    attacks. If we collect a lot of data from a lot of people over time,
    we may learn more about what triggers these attacks.

  • On the fun side, a Google employee figured out how to measure the
    rotation of bike pedals using the magnet in an Android phone. This
    lets employees maintain the right aerobic speed and record how fast
    they and their friends are pedaling.

You can set up Google Health to accept data from these
devices. Ultimately, we can also feed the data automatically to our
doctors, but first they'll need to set up systems to accept such
information on a regular basis.

Will Ross described a project to connect health care providers across
a mostly rural county in California and exchange patient data. The
consortium found that they had barely enough money to pay a
proprietary vendor of Health Information Exchange systems, and no
money for maintenance. So they contracted with Mirth Corporation to
use an open source solution. Mirth supports CONNECT, which I described
in yesterday's blog, and provides tools for extracting data from
structured documents as well as exchanging it.

Nagesh Bashyam, who runs the large consulting practice that Harris
Corporation provides to CONNECT, talked
about how it can lead to more than data exchange--it can let a doctor
combine information from many sources and therefore be a platform for
value-added services.

Turning to academic and non-profit research efforts, we also heard
today from Andrew Hart of NASA's Jet Propulsion Laboratory and some colleagues at
Children's Hospital Los Angeles. Hart described a reference
architecture that has supported the sharing of research data among
institutions on a number of large projects. The system has to be able
to translate between formats seamlessly so that researchers can
quickly query different sites for related data and combine it.

Sam Faus of Sujansky & Associates recounted a project to create a
Common Platform for sharing Observations of Daily Living between
research projects. Sponsored by the Robert Wood Johnson Foundation to
tie together a number of other projects in the health care space,
Sujansky started its work in 2006, before there were systems such as
Google Health and Microsoft HealthVault. Even after
these services were opened, however, the foundation decided to
continue and create its own platform.

Currently, there are several emerging standards for ODL, measuring
different things and organizing them in different ways. Faus said this
is a reasonable state of affairs because we are so early in the
patient-centered movement.

I talked about standards later with David Riley, the government's
CONNECT initiative lead. HHS can influence the adoption of standards
through regulation. But Riley's office has adopted a distributed and
participatory approach to finding new standards. Whenever they see a
need, they can propose an area of standardization to HHS's
specification advisory body. The body can prioritize these
requests and conduct meetings to hammer out a standard. To actually
enter a standard into a regulation, however, HHS has to follow the
federal government's rule-making procedures, which require an
eighteen-month period of releasing draft regulations and accepting
comments.

It's the odd trait of standards that discussions excite violent
emotions among insiders while driving outsiders to desperate
boredom. For participants in this evening's Birds of a Feather
session, the hour passed quickly discussing standards.

The 800-pound gorilla of health care standards is the HL7 series,
which CONNECT supports. Zeiger said that Google (which currently
supports just the CCR, a lighter-weight standard) will have to support
HL7's version of the continuity of care record, the CCD. HL7 standards have
undergone massive changes over the decades, though, and are likely to
change again quite soon. From what I hear, this is urgently
necessary. In its current version, the HL7 committee layered a
superficial XML syntax over ill-structured standards.

A major problem with many health care standards, including HL7, is the
business decision by standard-setting bodies to fund their activities
by charging fees that put standards outside the reach of open source
projects, as well as ordinary patients and consumers. Many standards
bodies require $5.00 or $10.00 per seat.

Brian Behlendorf discussed the recent decision of the NHIN Direct
committee to support both SOAP and SMTP for data exchange. Their
goal was to create a common core that lets proponents of each system
do essentially the same thing--authenticate health care providers and
exchange data securely--while also leaving room for further
development.

July 22 2010

Day one of the health care IT track at O'Reilly's Open Source convention

I think the collective awe of health care aficionados at the Open Source Convention came
to a focal point during our evening Birds of a Feather session, when
open source advocate Fred Trotter, informally stepping in as session
leader, pointed out that the leaders of key open source projects in
the health care field were in the room, including two VistA
implementors (Medsphere and WorldVistA), Tolven, and openEMR--and not
to forget two other leading health care software initiatives from the
U.S. government, CONNECT and NHIN Direct.

This meeting, which drew about 40 doctors, project leaders,
programmers, activist patients, and others, was the culmination of a
full day of presentations in the first track on health care at an
O'Reilly conference. The day's sessions unveiled the potential of open
source in health care and how dedicated implementors were making it a
reality, starting with a scene-setting talk by Tim O'Reilly that
attracted over 75 people and continuing through the next seven hours
until a dwindling hard core delayed drinks and hors d'oeuvres for half
an hour to hear a final late talk by Melanie Swan on DIYgenomics.

Nine talks representing the breadth of a vital programming area can't
be summarized in one sentence, but for me the theme of the day was
open source advocates reaching out to solve pressing problems that
proprietary vendors will not or cannot address.

Tim O'Reilly's talk laid out key elements of the health care
revolution: electronic records, the quantified self (measuring one's
bodily activities), and the Internet of things that allows one to track
behavior such as whether a patient has taken his medicine.

Talk to me

We were honored to have key leaders from Health and Human Services
speak at today's sessions about its chief open source projects. David
Riley and Brian Behlendorf (known best for his work on Apache)
came from the Office of the National Coordinator along with lead
contractor Arien Malec to show us the current status and--most
exciting--the future
plans for CONNECT and NHIN Direct, which are key pieces of the
Administration's health care policy because they allow different
health care providers to exchange patient information securely.

I have written recently about "meaningful use" for health care records
(http://radar.oreilly.com/2010/07/health-and-human-services-fina.html). Malec
provided a homespun and compelling vision of the problems with the
current health care system: in contrast to the old days where doctors
knew every patient personally, modern health care is delivered as
episodic interventions. As Fred Trotter said in his talk, we've
reached the limit of what we can achieve through clinical efforts.
Doctors can do miracles compared to former times, but the problems we
suffer from increasingly call for long-range plans. Malec said that
health care systems need to remember us. That's what
electronic health records can do, combined with the data exchange
protocols provided by NHIN.

Riley, in what is likely to be one of the most revisited talks of the
conference--yes, we recorded the sessions and will put them
online--rapidly laid out the architecture of CONNECT and what's
planned for upcoming releases. Requests between agencies for health
care data have gone from months to minutes with CONNECT. Currently
based on SOAP, it is being refactored so that in the future it can run
over REST, XMPP, and SMTP.

NHIN Direct, the newer and more lightweight protocol, is also based on
digital certificates and uses S/MIME with SMTP over TLS. Parties can
do key exchange themselves or work through a trusted third party. It
seems to me, therefore, that CONNECT and NHIN Direct will eventually
merge. It is as if the NHIN Direct project was started to take a big
step back from CONNECT, look at what it achieved for the government
agencies that produce or consume health care and how the same benefits
could be provided to health care providers all over the country, and
to formalize an architecture that would become the new CONNECT.
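To make the transport model concrete, here is a rough sketch in Python of the signing-and-sending half of that pattern, using the third-party cryptography package. The certificates, addresses, and SMTP host are placeholders; this is not the Direct reference implementation, and it omits the encryption (enveloping) step and the certificate discovery that a real deployment handles.

    import smtplib
    from cryptography import x509
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.serialization import pkcs7

    # Placeholder credentials standing in for an organization's Direct
    # certificate and private key.
    cert = x509.load_pem_x509_certificate(open("sender_cert.pem", "rb").read())
    key = serialization.load_pem_private_key(
        open("sender_key.pem", "rb").read(), password=None)

    payload = b"Content-Type: text/plain\r\n\r\nCare summary for referred patient"

    # Sign the payload as S/MIME with a detached signature. Real Direct
    # messages are also encrypted to the recipient's certificate; that step
    # is omitted here.
    signed = (
        pkcs7.PKCS7SignatureBuilder()
        .set_data(payload)
        .add_signer(cert, key, hashes.SHA256())
        .sign(serialization.Encoding.SMIME, [pkcs7.PKCS7Options.DetachedSignature])
    )

    message = (b"From: clinic@direct.example-a.org\r\n"
               b"To: specialist@direct.example-b.org\r\n"
               b"Subject: Referral summary\r\n" + signed)

    # Hand the message to an SMTP server over TLS (hostname is a placeholder).
    with smtplib.SMTP("smtp.example-hisp.org", 587) as smtp:
        smtp.starttls()
        smtp.sendmail("clinic@direct.example-a.org",
                      ["specialist@direct.example-b.org"], message)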

NHIN is an even more impressive case of open government and
collaborative development than CONNECT. The public was involved from
the earliest design stage. Some people complained that established
vendors bent the process to preserve their advantages, but they
probably had less success this way than they would have had if HHS had
followed normal government procedures. NHIN already has reference
implementations in
Java and C#. If you're inspired to help bring health records to the
public, you can read the wikis and attend some training and contribute
reference implementations in your language of choice.

In addition to supporting the NHIN Direct protocol, some of the
upcoming features in CONNECT include:

  • Identity management services. This will probably be based on a
    voluntary patient identifier.

  • Support for meaningful use criteria.

  • Support for structured data, allowing the system to accept input in
    standards such as the CCR or CCD and populate documents. One feature
    enabled by this enhancement will be the ability to recognize sensitive
    health data and remove it before sending a record. (CONNECT can be
    used for all health-related data, not just personal medical records.)

  • Moving to the Spring Framework.

Riley has done some pretty rigorous cost analysis and determined that
careful management--which includes holding costs down and bringing
multiple agencies together to work on CONNECT--has reduced development
costs from over 200 million dollars to about 13 million dollars.
Recent code sprints drew heavily from community volunteers: 4 or 5
volunteers along with 12 contractors.

In an overview talk, Deborah Bryant of the OSU Open Source Lab raised
the issue of continuity in relation to NHIN and CONNECT. Every open
source project
has to figure out how to keep a community of volunteers interested so
that the project continues to evolve and adapt to changing
circumstances. Government-backed projects, she admitted, provide
funding over a sustained period of time, but this does not obviate the
need for community management.

In addition, CONNECT is run by a consulting firm with paid contractors
who have to learn how to accept community input and communicate with
outsiders. Behlendorf said that simple things help, like putting all
the code in Subversion and all the documentation on a wiki. Consultants are
also encouraged to request feedback on designs and to talk about the
goals of sprints as far as possible in advance.

IntraHealth International manages the basic health care resource: people

The problems of the developing world were represented most directly by
the open source human resource information system from IntraHealth
International, presented by Carl Leitner. IntraHealth International
helps many Sub-Saharan and South Asian countries manage one of their
most precious and dwindling resources: health care professionals. The
system, called iHRIS, lets individual hospitals as well as whole
nations determine where their
most pressing staffing needs lie, break down staff by demographic
information such as age and gender (even language can be tracked), and
track their locations.

Training is one of the resources that must be managed carefully. If
you know there's a big gap between the professionals you need and ones
you have, you can direct scarce funding to training new ones. When
iHRIS records expenditures, what do countries often find? Some
administrator has splurged on sending himself to the same training
program over and over, just to get the per diem. Good information can
expose graft.

Open source is critical for a system like iHRIS, not just because
funds are scarce, but because localization is critical. Lots of
languages whose very existence is hidden from proprietary vendors need
to be supported. Each country also has different regulations and
conditions. IntraHealth International holds regular unconferences,
mentoring, and other forms of training in its target countries in the
hope of (in Leitner's words) putting themselves out of business. Of
course, trained IT staff tend to drift into higher-paying jobs, so the
organization tries to spread the training over many people.

OpenEMR and Tolven

The overarching challenge for any electronic health record system, if
its developers hope it will be taken seriously over the next couple of
years in the United States, is support for meaningful use criteria.
Proprietary systems have, for several decades, met the needs of large
institutions with wads of cash to throw at them. And they will gain
certification to support meaningful use as well. But smaller providers
have been unable to afford these systems.

The need for an open source solution with meaningful use certification
is pressing, and two project leaders of OpenEMR devoted their
talk to their push to make their system ready. They estimate that
they have implemented about 80% of the required functionality, but
more slowly than expected. Extraordinary measures were required on
many fronts:

  • Medical experts had to read thousands of pages of specifications as
    they came out, and follow comments and debates to determine which
    requirements would likely be dropped or postponed, so as not to waste
    development time.

  • Contractors were hired to speed up the coding. Interestingly, the
    spike in productivity created by the contractors attracted a huge
    number of new volunteers. At one point openEMR became number 37 on
    SourceForge in terms of activity, and it is still up around 190. The
    project leaders had to upgrade some of their infrastructure to handle
    an increased number of commits. They also discovered that lack of
    documentation was a hindrance. Like the CONNECT team, they found that
    maintaining a community required--well, maintenance.


  • Project leaders had to go to Washington and argue with government
    bureaucrats to change requirements that would have essentially made it
    impossible for open source projects to meet the meaningful use
    requirements. They succeeded in removing the offending clauses, and
    believe they were also responsible for winning such accomplishments as
    allowing sites to certify modules instead of entire stand-alone
    systems. Nevertheless, some aspects of certification require
    contracts with proprietary vendors, such as lab interface, which is
    done through a proprietary company, and drug-to-drug and
    drug-to-allergy interactions, which require interaction with expensive
    databases.

Tony McCormick pointed out that the goal of meaningful use
certification provided a focus that most open source projects lack.
In addition, the government provided tests (called scripts) that
served as a QA plan.

Meaningful use, as much as it represents an advance over today's
health information silos, does not yet involve the patient. The
patient came to the fore in two other talks, one by Melanie Swan on
her company DIYgenomics and the other by Tom Jones on Tolven.

Swan summarized the first two generations of DNA sequencing (which
went a bit above my head) and said we were on the verge of a third
generation that could bring full genome sequencing down to a cost that
consumers could afford. A part of the open science movement,
DIYgenomics helps patients combine with others to do research, a
process that is certainly less rigorous than controlled experiments
but can provide preliminary data that suggests future research. For
many rare conditions, the crowdsourced approach can fill a gap that
professional researchers won't fill.

In addition to providing access to studies and some other useful
apps--such as one that helps you evaluate your response to
drugs--DIYgenomics conducts its own longitudinal studies. One current
study checks for people who do not absorb vitamin B12
properly, a condition to which up to half the population is
vulnerable. Another study, for which they are seeking 10,000
participants, covers aging.

Jones's talk centered on privacy, but spread its tent to include the
broader issues of patient-centered medicine. Tolven simultaneously
supports records held by the doctor (clinical health records) and by
the patient (personal health records).

In a system designed especially for the Netherlands--where privacy
laws are much stricter and better specified than in the United
States--Tolven stores medical records in large, centralized
repositories because it's easier to ensure security that way. However,
strict boundaries between doctors prevent them from viewing each
other's data. Even more significantly, data is encrypted during both
transmission and storage, and only the patient has the key to unlock
it. Audit trails add another layer of protection.
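As a toy illustration of the patient-held-key idea (not Tolven's actual design), here is what encrypt-before-storing looks like with the Python cryptography package; the record content is invented.

    from cryptography.fernet import Fernet

    # The key is generated for, and kept by, the patient; the repository
    # never stores it. Everything here is simplified for illustration and
    # is not Tolven's implementation.
    patient_key = Fernet.generate_key()
    cipher = Fernet(patient_key)

    record = b'{"visit": "2010-07-01", "note": "blood pressure stable"}'
    stored_blob = cipher.encrypt(record)   # what the central repository holds

    # Decryption -- and hence any transfer -- happens only when the patient
    # supplies the key.
    print(Fernet(patient_key).decrypt(stored_blob))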

In this architecture, there are no release forms. Instead, the patient
explicitly approves every data transfer. (Patients can designate
special repositories to which their relatives have access, in case of
emergencies when they're not competent to make the transfer.)

That was one day of health care at OSCon--two more are coming up. We
started our evening BOF with introductions, but more and more people
kept coming in the room, and everyone was so interesting that the
introductions ended up taking the entire hour allocated for the BOF.
The sense that our health care system needs to change radically, and
the zeal expressed to take part in that change, brought energy into
the room. This was a great place to meet like-minded people.

March 04 2010

Report from HIMSS Health IT conference: toward interoperability and openness

I spent yesterday and today once again at the Healthcare Information and Management
Systems Society (HIMSS) conference in Atlanta, rushing from panel
session to vendor booth to interoperability demo and back (or
forward--I'm not sure which direction I've been going). All these
peregrinations involve a quest to find progress in the areas of
interoperability and openness.

The U.S. has a mobile population, bringing their aches and pains to a
plethora of institutions and small providers. That's why health care
needs interoperability. Furthermore, despite superb medical research,
we desperately need to share more information and crunch it in
creative new ways. That's why health care needs openness.

My blog yesterday
(http://radar.oreilly.com/2010/03/report-from-himms-health-it-co.html)
covered risk-taking; today I'll explore the reasons it's
so hard to create change.

The health care information exchange architecture

Some of the vendors I talked to boasted of being in the field for 20
years. This gives them time to refine and build on their offerings,
but it tends to reinforce approaches to building and selling software
that were prominent in the 1980s. These guys certainly know what the
rest of the computer field is doing, such as the Web, and they reflect
the concerns for interoperability and openness in their own ways. I
just feel that what I'm seeing is a kind of hybrid--more marsupial
than mammal.

Information exchange in the health care field has evolved the
following architecture:

Electronic medical systems and electronic record systems

These do all the heavy labor that make health care IT work (or fail).
They can be divided into many categories, ranging from the simple
capturing of clinical observations to incredibly detailed templates
listing patient symptoms and treatments. Billing and routine workflow
(practice management) are other categories of electronic records that
don't strictly speaking fall into the category of health records.
Although each provider traditionally has had to buy computer systems
to support the software and deal with all the issues of hosting it,
Software as a Service has come along in solutions such as Practice Fusion.

Services and value-added applications

As with any complex software problem, nimble development firms partner
with the big vendors or offer add-on tools to do what health care
providers find too difficult to do on their own.

Health information exchanges (HIEs)

Eventually a patient has to see a specialist or transfer records to a
hospital in another city--perhaps urgently. Partly due to a lack of
planning, and partly due to privacy concerns and other particular
issues caught up in health care, transfer is not as simple as querying
Amazon.com or Google. So record transfer is a whole industry of its
own. Some institutions can transfer records directly, while others
have to use repositories--paper or electronic--maintained by states or
other organizations in their geographic regions.


HIE software and Regional Health Information Organizations
(RHIOs)

The demands of record exchange create a new information need that's
filled by still more companies. States and public agencies have also
weighed in with rules and standards through organizations called
Regional Health Information Organizations.

Let's see how various companies and agencies fit into this complicated
landscape. My first item covered a huge range of products that
vendors don't like to have lumped together. Some vendors, such as the
Vocera company I mentioned in yesterday's blog and 3M, offer products
that capture clinicians' notes, which can be a job in itself,
particularly through speech recognition. Emdeon covers billing, and adds validity
checking to increase the provider's chances of getting reimbursed the
first time they submit a bill. There are many activities in a doctor's
office, and some vendors try to cover more than others.

Having captured huge amounts of data--symptoms, diagnoses, tests
ordered, results of those tests, procedures performed, medicines
ordered and administered--these systems face their first data exchange
challenge: retrieving information about conditions and medicines that
may make a critical difference to care. For instance, I saw a cool
demo at the booth of Epic, one of
the leading health record companies. A doctor ordered a diuretic that
has the side-effect of lowering potassium levels. So Epic's screen
automatically brought up the patient's history of potassium levels
along with information about the diuretic.
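
The rule behind that demo is conceptually simple, even if Epic's
implementation surely isn't. Here's a minimal sketch in Python of the
idea--the drug list, data structures, and function names are my own
invention for illustration, not Epic's code:

    # Toy decision-support rule: when a drug is ordered, surface the lab
    # values it is known to affect. The drug/lab mapping is illustrative.
    DRUG_LAB_EFFECTS = {
        "furosemide": ["potassium"],   # loop diuretics can lower potassium
    }

    def labs_to_display(drug_ordered, lab_history):
        """Return the lab results worth showing when this drug is ordered."""
        affected = DRUG_LAB_EFFECTS.get(drug_ordered.lower(), [])
        return {lab: lab_history.get(lab, []) for lab in affected}

    # Ordering a diuretic pulls up the patient's potassium history.
    history = {"potassium": [("2010-01-05", 3.9), ("2010-02-20", 3.4)]}
    print(labs_to_display("Furosemide", history))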

Since no physician can keep all the side-effects and interactions
between drugs in his head, most subscribe to databases that keep track
of such things; the most popular company that provides this data is
First DataBank (http://firstdatabank.com/). Health record
systems simply integrate the information into their user interfaces.
As I've heard repeatedly at this conference, the timing and delivery
of information is just as important as having the information; the
data is not of much value if a clinician or patient has to think about
it and go searching for it. And such support is central to the HITECH
act's meaningful use criteria, mentioned in yesterday's blog.

So I asked the Epic rep how this information got into the system. When
the physicians sign up for the databases, the data is sent in simple
CSV files or other text formats. Although different databases are
formatted in different ways, the health record vendor can easily read
it in and set up a system to handle updates.
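
To make the shape of that job concrete, here is a rough Python sketch
of reading such a file and indexing it for lookup. The column names
are invented--every vendor's feed looks different, which is exactly
why the health record vendor has to write this import layer:

    import csv

    # Hypothetical drug-knowledge feed: one row per drug/interaction pair.
    def load_drug_facts(path):
        index = {}
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                index.setdefault(row["drug_name"].lower(), []).append({
                    "interacts_with": row["interacts_with"],
                    "severity": row["severity"],
                })
        return index

    # A periodic update job can simply rebuild the index from the latest file.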

Variations on this theme turn up with other vendors. For instance,
NextGen Healthcare (http://www.nextgen.com/) contracts
directly with First DataBank so they can integrate the data intimately
with NextGen's screens and database.

So where does First DataBank get this data? They employ about 40
doctors to study available literature, including drug manufacturers'
information and medical journals. This leads to a constantly updated,
independent, reliable source for doses, side-effects,
contraindications, etc.

This leads to an interesting case of data validity. Like any
researchers--myself writing this blog, for instance--First DataBank
could theoretically make a mistake. Their printed publications include
disclaimers, and they require the companies that license the data to
reprint the disclaimers in their own literature. But of course, the
disclaimer does not pop up on every dialog box the doctor views while
using the product. Caveat emptor...

Still, decision support as a data import problem is fairly well
solved. When health record systems communicate with each other,
however, things are not so simple.

The challenges in health information exchange: identification

When a patient visits another provider who wants to see her records,
the first issue the system must face is identifying the patient at the
other provider. Many countries have universal IDs, and therefore
unique identifiers that can be used to retrieve information on a
person wherever she goes, but the United States public finds such
forms of control anathema (remember the push-back over Real ID?).
There are costs to restraining the information state: in this case,
the hospital you visit during a health crisis may have trouble
figuring out which patient at your other providers is really you.

HIEs solve the problem by matching information such as name, birth
date, age, gender, and even cell phone number. One proponent of the
federal government's Nationwide
Health Information Network
told me it can look for up to 19 fields
of personal information to make a match. False positives are
effectively eliminated by strict matching rules, but legitimate
records may be missed.
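
I don't know the NHIN's actual matching algorithm, but the
strict-match idea is easy to illustrate in a few lines of Python. The
field names here are mine, and real systems compare many more of them:

    REQUIRED_FIELDS = ["last_name", "first_name", "birth_date", "gender", "zip"]

    def normalize(value):
        return str(value or "").strip().lower()

    def is_same_patient(record_a, record_b):
        """Declare a match only when every required field agrees exactly.

        Strict matching keeps false positives near zero, at the cost of
        missing legitimate records when any field was entered differently.
        """
        return all(
            normalize(record_a.get(field)) == normalize(record_b.get(field))
            for field in REQUIRED_FIELDS
        )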

Another issue HIEs face is obtaining authorization, because health
data is among the most sensitive information in ordinary people's
lives. When requesting data from another provider, the clinician has
to log in securely and then offer information not only about who he is
but why he needs the data. The sender, for many reasons, may say no:

  • Someone identified as a VIP, such as a movie star or high-ranking
    politician, is automatically protected from requests for information.

  • Some types of medical information, such as HIV status, are considered
    especially sensitive and treated with more care.

  • The state of California allows ordinary individuals to restrict the
    distribution of information at the granularity of a single institution
    or even a single clinician, and other states are likely to do the
    same.

Thus, each clinician needs to register with the HIE that transmits the
data, and accompany each request with a personal identifier as well as
the type of information requested and the purpose. One service I
talked to, Covisint, can query
the AMA if necessary to verify the unique number assigned to each
physician in the U.S., the Drug Enforcement Administration (DEA) number.
(This is not the intended use of a DEA number, of course; it was
created to control the spread of pharmaceuticals, not data.)
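
The shape of such a request might look something like the following
sketch. The field names are hypothetical--they are not Covisint's or
anyone else's schema--but they show how identity, credential, record
type, and purpose all travel together:

    from datetime import datetime, timezone

    def build_record_request(clinician_id, dea_number, patient_demographics,
                             record_type, purpose):
        """Assemble a (hypothetical) HIE record request."""
        return {
            "requested_at": datetime.now(timezone.utc).isoformat(),
            "requester": {"clinician_id": clinician_id,
                          "dea_number": dea_number},
            "patient": patient_demographics,
            "record_type": record_type,   # e.g. "medication_history"
            "purpose": purpose,           # e.g. "treatment"
        }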

One of the positive impacts of all this identification is that some
systems can retrieve information about patients from a variety of
hospitals, labs, pharmacies, and clinics even if the requester doesn't
know where the data resides. It's still up to each data holder to
determine whether to send the data to the requester. Currently,
providers exchange a Data Use
and Reciprocal Support Agreement (DURSA) to promise that information
will be stored properly and used only for the agreed-on purpose.
Exchanging these documents is currently cumbersome, and I've been told
the government is looking for a way to standardize the agreement so
the providers don't need to directly communicate.

The challenges in health information exchange: format

Let's suppose we're at the point where the owner of the record has
decided to send it to the requester. Despite the reverence expressed
by vendors for HL7 and other
standards with which the health care field is rife, documents require
a good deal of translation before they can be incorporated into the
receiving system. Each vendor presents a slightly different challenge,
so connecting n different products requires on the order of n²
different transformations.
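
One classic way to tame that explosion--used by integration engines
generally, though I'm not describing any particular vendor's
design--is to translate every format to and from a single canonical
representation, so each product needs one importer and one exporter
instead of a translator per peer. A toy Python sketch, with all the
format names invented:

    # Each "vendor" supplies one function into and one out of a shared
    # canonical record, cutting roughly n*n translators down to about 2*n.
    TO_CANONICAL = {
        "vendor_a": lambda doc: {"patient": doc["pt"], "meds": doc["rx_list"]},
        "vendor_b": lambda doc: {"patient": doc["name"], "meds": doc["medications"]},
    }

    FROM_CANONICAL = {
        "vendor_a": lambda rec: {"pt": rec["patient"], "rx_list": rec["meds"]},
        "vendor_b": lambda rec: {"name": rec["patient"], "medications": rec["meds"]},
    }

    def translate(doc, source, target):
        return FROM_CANONICAL[target](TO_CANONICAL[source](doc))

    print(translate({"pt": "Jane Doe", "rx_list": ["furosemide"]},
                    "vendor_a", "vendor_b"))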

Reasons for this lack of interoperability lie at many levels:

Lack of adherence to standards

Many vendors created their initial offerings before applicable
standards existed, and haven't yet upgraded to the standards or still
offer new features not covered by standards. The meaningful use
criteria discussed in yesterday's blog will accelerate the move to
standards.

Fuzzy standards

Like many standards, the ones that are common in the medical field
leave details unspecified.

Problems that lie out of scope

The standards tend to cover the easiest aspect of data exchange, the
document's format. As an indication of the problem, the 7 in HL7
refers to the seventh (application) layer of the ISO OSI model. Brian
Behlendorf of Apache fame, now consulting with the federal government
to implement the NHIN, offers the following analogy. "Suppose that we
created the Internet by standardizing HTML and CSS but saying nothing
about TCP/IP and DNS."

Complex standards

As in other fields, the standards that work best in health records are
simple ones. There is currently a debate, for instance, over whether
to use the CCR or CCD exchange format for patient data. The trade-off
seems to be that the newer CCD is richer and more flexible but a lot
harder to support.
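
For a taste of what "harder to support" means, here's a small Python
sketch that digs a patient name out of a CCD-style document. CCD
builds on HL7's CDA XML (namespace urn:hl7-org:v3), and the exact
paths and required sections vary by implementation--which is the
point:

    import xml.etree.ElementTree as ET

    NS = {"hl7": "urn:hl7-org:v3"}

    def patient_name(ccd_xml):
        """Pull the patient's name from a CCD/CDA document (simplified)."""
        root = ET.fromstring(ccd_xml)
        name = root.find(
            ".//hl7:recordTarget/hl7:patientRole/hl7:patient/hl7:name", NS)
        if name is None:
            return None
        given = name.findtext("hl7:given", default="", namespaces=NS)
        family = name.findtext("hl7:family", default="", namespaces=NS)
        return (given + " " + family).strip()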

Misuse

As one example, the University of Pittsburgh Medical Center tried to
harmonize its problem lists and found that a huge number of
patients--including many men--were coded as smoking during pregnancy.
They should have been coded with a general tobacco disorder. As Dr.
William Hogan said, "People have an amazing ability to make a standard
do what it's not meant to do, even when it's highly specified and
constrained."

So many to choose from

Dell/Perot manager Jack Wankowski told me that even though other
countries have digitized their health records far more than the U.S.
has, they have a lot fewer published standards. It might seem logical
to share standards--given that people are people everywhere--but in
fact, that's hard to do because diagnosis and treatment differ
considerably from one culture to another. Wankowski says, "Unlike other
industries such as manufacturing and financial services, where a lot
can be replicated, health care is very individual on a country by
country basis at the moment. Because of this, change is a lot slower."

Encumbrances

The UPMC coded its problem lists in ICD-9-CM instead of SNOMED, even
though SNOMED was far superior in specificity and clarity. Besides
historical reasons, they avoided SNOMED because it was a licensed
product until 2003, whereas ICD-9-CM was free. As for ICD-9-CM, its
official standard is distributed as RTF documents, making correct
adoption difficult.

Here are a few examples of how vendors told me they handle
interoperability.

InterSystems is a major
player in health care. The basis of their offerings is Caché,
an object database written in the classic programming language for
medical information processing, MUMPS. (MUMPS was also standardized by
an ANSI committee under the name M.) Caché can be found in all
major hospitals. For data exchange, InterSystems provides an HIE
called HealthShare, which they claim can communicate with other
vendors' systems by supporting HL7 and other appropriate standards.
HealthShare is both communications software and an actual hub that can
create the connections for customers.
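
"Supporting HL7" sounds tidy until you look at the wire format. The
snippet below is a deliberately naive Python reading of an HL7 v2
message--segments separated by carriage returns, fields by pipes--and
ignores everything a real interface engine must handle: encoding
characters, repetitions, escapes, and version quirks:

    SAMPLE = ("MSH|^~\\&|LAB|GH|EHR|CLINIC|201003011200||ORU^R01|123|P|2.3\r"
              "PID|1||555^^^MRN||DOE^JANE")

    def parse_hl7_v2(message):
        """Split an HL7 v2 message into segments and pipe-delimited fields."""
        segments = {}
        for segment in filter(None, message.split("\r")):
            fields = segment.split("|")
            segments.setdefault(fields[0], []).append(fields)
        return segments

    msg = parse_hl7_v2(SAMPLE)
    print(msg["PID"][0][5])   # "DOE^JANE" -- the patient name field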

Medicity is another key
HIE vendor. Providers can set up their own hubs or contract with a
server set up by Medicity in their geographic area. Having a hub means
that a small practice can register just once with the hub and then
communicate with all other providers in that region.
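
The arithmetic behind the hub model is worth spelling out. A rough
back-of-the-envelope in Python (counting ordered connections, not any
vendor's actual topology):

    def point_to_point_links(n):
        return n * (n - 1)   # every system maintains a link to every other

    def hub_links(n):
        return n             # every system registers once with the hub

    for n in (5, 50, 500):
        print(n, point_to_point_links(n), hub_links(n))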

Let's turn again to Epic. Two facilities that use it can exchange a
wide range of data, including data that is not covered by
standards. A facility that uses another product can exchange a
narrower set of data with an Epic system over Care Everywhere
(http://www.epic.com/software-interoperability.php), using the
standards. The Epic rep said they will move
more and more fields into Care Everywhere as standards evolve.

What all this comes down to is an enormous redundant infrastructure
that adds no value to electronic records, but merely runs a Red
Queen's Race to provide the value that already exists in those
records. We've already seen that defining more standards has a
limited impact on the problem. But a lot of programmers at this point
will claim the solution lies in open source, so let's see what's
happening in that area.

The open source challengers

The previous sections, like acts of a play, laid out the character of
the vendors in the health care space as earnest, hard-working, and
sometimes brilliantly accomplished, but ultimately stumbling through a
plot whose bad turns overwhelm them. In the current act we turn to a
new character, one who is not so well known nor so well tested, one
who has shown promise on other stages but is still finding her footing
on our proscenium.

The best-known open source projects in health care are OpenMRS
(http://openmrs.org/), the Veterans Administration's VistA, and the
NHIN CONNECT Gateway (http://www.connectopensource.org/). I won't say
anything more about OpenMRS because it has received high praise but
has made few inroads into American health care. I'll
devote a few paragraphs to the strengths and weaknesses of VistA and
CONNECT.

Buzz in the medical world is that VistA beats commercial offerings for
usability and a general fit to the clinicians' needs. But it's
tailored to the Veterans Administration and--as a rep for vxVistA
(http://www.vxvista.org/) put it--has to be "deveteranized" for
general use. This is what vxVistA does, but it is not open source: it
makes changes to the core and contributes them back, while its own
products remain proprietary. A community project called WorldVistA
(http://www.worldvista.org/) also works on a version of VistA for the
non-government sector.

One of the hurdles of adapting VistA is that one has to learn its
underlying language, MUMPS. Most people who dive in license a MUMPS
compiler. The vxVistA rep knows of no significant users of the free
software MUMPS compiler GT.M. VistA also runs on the Caché
database, mentioned earlier in this article. If you don't want to
license Caché from InterSystems, you need to find some other
database solution.

So while VistA is a bona fide open source project with a community,
its ecosystem does not fit neatly with the habits of most free
software developers.

CONNECT is championed by the same Office of the National Coordinator
for Health Information Technology that is implementing the HITECH
recovery plan and meaningful use. A means for authenticating requests
and sending patient data between providers, CONNECT may well be
emerging as the HIE solution for our age. But it has some maturing to
do as well. It uses a SOAP-based protocol that requires knowledge of
typical SOA-based technologies such as SAML.
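
To give a flavor of what that entails, here is the rough shape of such
a request--a WS-Security header carrying a SAML assertion about the
requester, with the query in the body. This is a simplified
illustration, not CONNECT's actual message schema:

    # Skeleton of a SOAP envelope with a SAML assertion in a WS-Security
    # header. Element content is omitted; real exchanges add signatures,
    # addressing headers, and the actual query payload.
    SOAP_SKELETON = """\
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Header>
        <wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">
          <saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion">
            <!-- signed statements about who is asking and why -->
          </saml:Assertion>
        </wsse:Security>
      </soap:Header>
      <soap:Body>
        <!-- patient discovery or document query goes here -->
      </soap:Body>
    </soap:Envelope>"""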

Two free software companies that have entered the field to make
installing CONNECT easier are Axial Exchange
(http://www.axialexchange.com/), which creates open source libraries
and tools to work with the system, and the Mirth Corporation
(http://www.mirthcorp.com/). Jon Teichrow
of Mirth told me how a typical CONNECT setup at a rural hospital took
just a week to complete, and can run for the cost of just a couple
hours of support time per week. The complexities of handling CONNECT
that make so many people tremulous, he said, were actually much easier
for Mirth than the more typical problem of interpreting the hospital's
idiosyncratic data formats.

Just last week, the government announced a simpler interface to the
NHIN called NHIN Direct
(http://www.healthcareitnews.com/news/nhin-direct-launched-simpler-data-exchange).
Hopefully, this will bring in a new level of providers
who couldn't afford the costs of negotiating with CONNECT.

CONNECT has certainly built up an active community. Agilex
(http://agilex.com/) employee Scott E. Borst, who is
responsible for a good deal of the testing of CONNECT, tells me that
participation in development, testing, and online discussion is
intense, and that two people were recently approved as committers
without being associated with any company or government agency
officially affiliated with CONNECT.

The community is willing to stand up for itself, too. Borst says that
when CONNECT was made open source last year, it came with a Sun-based
development environment including such components as NetBeans and
GlassFish. Many community members wanted to work on CONNECT using
other popular free software tools. Accommodating them was tough at
first, but the project leaders listened to them and ended up with a
much more flexible environment where contributors could use
essentially any tools that struck their fancy.

Buried in a major announcement yesterday about certification for
meaningful use
(http://www.healthcareitnews.com/news/blumenthal-unveils-proposed-certification-rule-himss10)
was an endorsement by the Office of the National Coordinator for open
source. My colleague and fellow blogger Brian Ahier points out
(http://ahier.blogspot.com/2010/03/meaningful-certification.html)
that rule 4 for certification programs explicitly mentions
open source as well as self-developed solutions. This will not magically
lead to more open source electronic health record systems like
OpenMRS, but it offers an optimistic assessment that they will emerge
and will reach maturity.

As I mentioned earlier, traditional vendors are moving more toward
openness in the form of APIs that offer their products as platforms.
InterSystems does this with a SOAP-based interface called Ensemble,
for instance. Eclipsys,
offering its own SOAP-based interface called Helios, claims that it
wants an app store on top of its product--and that it will not kick
off applications that compete with its own.

Web-based Practice Fusion has an API in beta, and is also planning an
innovation that makes me really excited: a sandbox provided by their
web site where developers can work on extensions without having to
download and install software.

But to a long-time observer such as Dr. Adrian Gropper, founder of the
MedCommons storage service,
true open source is the only way forward for health care records. He
says we need to replace all those SOAP and WS-* standards with RESTful
interfaces, perform authentication over OpenID and OAuth, and use the
simplest possible formats. And only an enlightenment among the major
users--the health care providers--will bring about the revolution.
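
Gropper's preferred style is easy to sketch, which is arguably his
point. The following Python is purely hypothetical--the URL, token,
and response shape describe no real service--but it shows how little
machinery a RESTful, OAuth-style exchange needs:

    import json
    import urllib.request

    def fetch_record_summary(base_url, patient_id, access_token):
        """GET a patient summary over plain HTTPS with a bearer token."""
        req = urllib.request.Request(
            "%s/patients/%s/summary" % (base_url, patient_id),
            headers={"Authorization": "Bearer " + access_token,
                     "Accept": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)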

But at this point in the play, having explored the characters of
electronic record vendors and the open source community, we need to
round out the drama by introducing yet a third character: the patient.
Gropper's MedCommons is a patient-centered service, and thus part of a
movement that may bring us openness sooner than OpenMRS, VistA, or
CONNECT.

Enter the patient

Most people are familiar with Microsoft's HealthVault and Google
Health. Both allow patients to enter data about their own health, and
provide APIs that individuals and companies alike are using to provide
services. A Journal of Participatory Medicine has just been launched,
reflecting the growth of interest in patient-centered or participatory
medicine. I saw a book on the
subject by HIMSS itself in the conference bookstore.

The promise of personal health records goes far beyond keeping track
of data. Like electronic records in clinicians' hands, the data will
just be fodder for services with incredible potential to improve
health. In a lively session today, Patricia Brennan of Project
HealthDesign (http://www.projecthealthdesign.org/) used the metaphors
of "intelligent medicines" and "smart
Band-Aids" that reduce errors and help patients follow directions.

Project HealthDesign's research has injected a dose of realism into
our understanding of the doctor-patient relationship. For instance,
they learned that we can't expect patients to share everything with
their doctors. They get embarrassed when they lapse in their behavior,
and don't want to admit they take extra medications or do other things
not recommended by doctors. So patient-centered health should focus on
delivering information so patients can independently evaluate what
they're doing.

As critical patient data becomes distributed among a hundred million
individual records, instead of being concentrated in the hands of
providers, simple formats and frictionless data exchange will emerge
to handle them. Electronic record vendors will adapt or die. And a
whole generation of products--as well as users--will grow up with no
experience of anything but completely open, interoperable systems.
