April 19 2012

Sage Congress: The synthesis of open source with genetics

For several years, O'Reilly Radar has been covering the exciting potential that open source software, open data, and a general attitude of sharing and cooperation bring to health care. Along with many exemplary open source projects in areas directly affecting the public — such as the VA's Blue Button in electronic medical records and the Direct project (http://wiki.directproject.org/) in data exchange — the study of disease is undergoing a paradigm shift.

Sage Bionetworks stands at the
center of a wide range of academic researchers, pharmaceutical
companies, government agencies, and health providers realizing that
the old closed system of tiny teams who race each other to a cure has
got to change. Today's complex health problems, such as Alzheimer's,
AIDS, and cancer, are too big for a single team. And these
institutions are slowly wrenching themselves out of the habit of data
hoarding and finding ways to work together.

A couple weeks ago I talked to the founder of Sage Bionetworks, Stephen Friend, about recent advances in open source in this area, and the projects to be highlighted at the upcoming Sage Commons congress (http://sagecongress.org/). Steve is careful to call this a "congress" instead of a "conference" because all attendees are supposed to pitch in and contribute to the meme pool. I covered Sage Congress in a series of articles last year. The following podcast ranges over topics such as:

  • what is Sage Bionetworks [Discussed at the 00:25 mark];
  • the commitment of participants to open source software [Discussed at the 01:01 mark];
  • how open source can support a business model in drug development [Discussed at the 01:40 mark];
  • a look at the upcoming congress [Discussed at the 03:47 mark];
  • citizen-led contributions or network science [Discussed at the 06:12 mark];
  • data sharing philosophy [Discussed at the 09:01 mark];
  • when projects are shared with other institutions [Discussed at the 12:43 mark];
  • how to democratize medicine [Discussed at the 17:10 mark];
  • a portable legal consent approach where the patient controls his or her own data [Discussed at the 20:07 mark];
  • solving the problem of non-sharing in the industry [Discussed at the 22:15 mark]; and
  • key speakers at the congress [Discussed at the 26:35 mark].

Sessions from the congress will be broadcast live via webcast and posted on the Internet.

March 26 2012

Five tough lessons I had to learn about health care

Working in the health care space has forced me to give up many hopes and expectations that I had a few years ago. Forgive me for being cynical (it's an easy feeling to have following the country's largest health IT conference, as I reported a month ago), and indeed some positive trends do step in to shore up hope. I'll go over the redeeming factors after listing the five tough lessons.

1. The health care field will not adopt a Silicon Valley mentality

Wild, willful, ego-driven experimentation--a zeal for throwing money after intriguing ideas with minimal business plans--has seemed to work for the computer field, and much of the world is trying to adopt a "California optimism." A lot of venture capitalists and technology fans deem this attitude the way to redeem health care from its morass of expensive solutions that don't lead to cures. But it won't happen, at least not the way they paint it.

Health care is one of the most regulated fields in public life, and we want it that way. From the moment we walk into a health facility, we expect the staff to be following rigorous policies to avoid infections. (They don't, but we expect them to.) And not just anybody can set up a shingle outside the door and call themselves a doctor. In the nineteenth century it was easier, but we don't consider that a golden age of medicine.

Instead, doctors go through some of the longest and most demanding training that exists in the world today. And even after they're licensed, they have to regularly sign up for continuing education to keep practicing. Other fields in medicine are similar. The whole industry is constrained by endless requirements that make sure the insiders remain in their seats and no "disruptive technologies" raise surprises. Just ask a legal expert about the complex mesh of Federal and state regulations that a health care provider has to navigate to protect patient privacy--and you do want your medical records to be private, don't you?--before you rave about the Silicon Valley mentality. Also read the O'Reilly book by Fred Trotter and David Uhlman about the health care system as it really is.

Nor can patients change treatments with the ease of closing down a Facebook account. Once a patient has established a trust relationship with a doctor and obtained a treatment plan, he or she won't say, "I think I'll go down the road to another center that charges $100 less for this procedure." And indeed, health reform doesn't prosper from breaking down treatments into individual chunks. Progress lies in the opposite direction: the redemptive potential of long-term relationships.

2. Regulations can't force change

I am very impressed with the HITECH act (a product of the American Recovery and Reinvestment Act, more than the Affordable Care Act) that set modern health reform in motion, as well as the efforts of the Department of Health and Human Services to push institutions forward. But change in health care, like education, boils down to the interaction in a room between a professional and a client. Just as lesson plans and tests can't ensure that a teacher inspires a child to learn, regulations can't keep a doctor from ordering an unnecessary test to placate an anxious patient.

We can offer clinical decision support to suggest what has worked for other patients, but we can't keep a patient from asking for an expensive procedure that has a 10% chance of making him better (and a 20% chance of making him worse), nor can we make the moral decision about what treatment to pursue, for the patient or the doctor. Each patient is different, anyway. No one wants to be a statistic.

3. The insurance companies are not the locus of cost and treatment problems

Health insurers are a favorite target of hatred by Americans, exemplified by Michael Moore's 2007 movie Sicko and, more surprisingly, by the 1997 romantic comedy As Good as It Gets, where I saw an audience applaud as Helen Hunt delivered a rant against health maintenance organizations. A lot of activists, looking at other countries, declare that our problems would be solved (well, would improve a lot) if we got private insurers out of the picture.

Sure, there's a lot of waste in the current insurance system, which deliberately stretches out the task of payment and makes it take up the days of full-time staff in each doctor's office. But that's not the cause of the main problems in either costs or treatment failures. The problems lie with the beloved treatment staff. We can respect their hard work and the lives they save, but we don't have to respect them for releasing patients from hospitals without adequate follow-up, or for ordering unnecessary radiation that creates harm for patients, or for the preventable errors that still (after years of publicity) kill 90,000 to 100,000 patients a year.

4. Doctors don't want to be care managers

The premise of health reform is to integrate patients into a larger plan for managing a population. A doctor is supposed to manage a case load and keep his or her pipeline full while not spending too much. The thrust of various remuneration schemes, old and new, that go beyond fee for service (capitation, global payment systems) is to reward a doctor for handling patients of a particular type (for instance, elderly people with hypertension) at a particular cost. But doctors aren't trained for this. They want to fix the immediate, presenting complaint and send the patient home until they're needed again. Some think longitudinally, and diligently try to treat the whole person rather than a symptom. But managing their treatment options as a finite resource is just not in their skill set.

The United Kingdom--host of one of the world's great national care systems--is about to launch a bold new program where doctors have to do case management. The doctors are rebelling. If this is the future of medicine, we'll have to find new medical personnel to do it.

5. Patients don't want to be care managers

Now that the medical field has responded superbly to acute health problems, we are left with long-term problems that require lifestyle and environmental changes. The patient is even more important than the doctor in these modern ills. But the patients who cost the most and need to make the most far-ranging changes are demonstrating an immunity to good advice. They didn't get emphysema or Type 2 diabetes by acting healthily in the first place, and they aren't about to climb out of their condition voluntarily either.

You know what the problem with chronic disease is? Its worst effects are not likely to show up early in life when lifestyle change could make the most difference. (Serious pain can come quickly from some chronic illnesses, such as asthma and Crohn's disease, but these are also hard to fix through lifestyle changes, if by "lifestyle change" you mean breathing clean air.) The changes a patient would have to make to prevent smoking-related lung disease or obesity-related problems would require a piercing re-evaluation of his course of life, which few can do. And incidentally, they are neither motivated nor trained to store their own personal health records.

Hope for the future

Despite the disappointments I've undergone in learning about health care, I expect the system to change for the better. It has to, because the public just won't tolerate more precipitous price hikes and sub-standard care.

There's a paucity of citations in my five lessons because they tend not to be laid out bluntly in research or opinion pieces; for the most part, they emerged gradually over many hallway conversations I had. Each of the five lessons contains a "not," indicating that they attack common myths. Myths (in the traditional sense) are in fact very useful constructs, because they organize the understanding of the world that societies have trouble articulating in other ways. We can realize that myths are historically inaccurate while finding positive steps forward in them.

The Silicon Valley mentality will have some effect through new devices and mobile phone apps that promote healthy activity. They can help with everything from basic compliance--remembering to take prescribed meds--to promoting fitness crazes and keeping disabled people in their homes. Lectures given once a year in the doctor's office don't lead to deep personal change, but having a helper nearby (even a digital one) can impel a person to act better, hour by hour and day by day. This has been proven by psychologists over and over: motivation is best delivered in small, regular doses (a theme found in my posting from HIMSS).

Because the most needy patients are often the most recalcitrant ones, personal responsibility has to intersect with professional guidance. A doctor has to work with the patient, and other staff can shore up good habits as well. This requires the doctors' electronic record systems to accept patient data, such as weight and mood. Projects such as Indivo X support these enhancements, which traditional electronic record systems are ill-prepared for.
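
To make this concrete, here is a minimal sketch of what accepting patient-generated data could look like on the wire: a JSON observation posted to a record system's API. The endpoint, field names, and token are hypothetical, for illustration only; Indivo X and other systems define their own schemas and authentication.

    import json
    import urllib.request

    # Hypothetical endpoint and token -- real systems (Indivo X included)
    # define their own schemas and authentication.
    API_URL = "https://ehr.example.org/patients/123/observations"
    TOKEN = "hypothetical-oauth-token"

    # A patient-reported observation: weight plus a mood note.
    observation = {
        "type": "weight",
        "value": 82.5,
        "unit": "kg",
        "mood": "good",
        "recorded_at": "2012-03-26T08:15:00Z",
        "source": "patient",  # distinguish from clinician-entered data
    }

    request = urllib.request.Request(
        API_URL,
        data=json.dumps(observation).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer " + TOKEN},
    )
    with urllib.request.urlopen(request) as response:
        print(response.status)  # e.g., 201 if the record was created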

Although doctors eschew case management, there are plenty of other professionals who can help them with it, and forming Accountable Care Organizations gives the treatment staff access to such help. Tons of potential savings lie in the data that clinicians could collect and aggregate. Still more data is being loaded by the federal government regularly at Health.Data.Gov. ACOs and other large institutions can hire people who love to crunch big data (if such staff can be found, because they're in extremely high demand now in almost every industry) to create systems that slide seamlessly into clinical decision support and provide guidelines for better treatment, as well as handle the clinic's logistics better. So what we need to do is train a lot more experts in big data to understand the health care field and crunch its numbers.

Change will be disruptive, and will not be welcomed with open arms. Those who want a better system need to look at the areas where change is most likely to make a difference.

March 15 2012

Left and right and wrong

Sometimes I find a picture or a blog post that leaps off the screen at me and says "your readers must see this as it applies to health IT."

Normal Modes, a solid UX company based in Houston, sends me fairly good UX tips on a regular basis. The last one featured this photo (used with permission):

Parking lot picture from Normal Modes

Normal Modes points out, very clearly, that points of confusion like this are bad for users. They regard it as their job, as UX experts, to eliminate this kind of experience. Their analysis of how to do this is right on.

I have seen this kind of error in EHR systems and PHR systems on countless occasions. From an engineering perspective, it is really useful to take a moment and consider how something like this happens. First, you have two different "levels" of operation here. One is concerned with how traffic flows in the parking lot. The other is concerned with directions in the parking lot. For whatever reasons, these two "parking lot features" were implemented separately by people who had access to two different sets of resources. It stands to reason that the people who had access to white paint and stencils to make the sign on the right were the same people using stencils to mark the parking spots. It stands to reason that the people who had access to the professional sign-making system were somewhat removed from the people actually designing the parking lot.

In short, what you are seeing here is the artifact of a political and process disconnect. In health IT, there are constant political disconnects that cause similar issues. The EHR vendor is one political group, the insurance companies another, and the government is so large that it actually has multiple groups with different agendas. (HHS alone has so many subgroups that it's very difficult to completely follow what is happening.)

As enthusiastic as I am about the potential for meaningful use incentives, I think there will be lots of artifacts like this in EHRs that do not make much sense because the EHR vendor was pulled in a new direction by these incentives.

I have said in almost every talk about health IT I have ever given that the problems in health IT are political and not technical. I think it is my most tweeted quote. But sometimes a picture is worth a thousand words.

Meaningful Use and Beyond: A Guide for IT Staff in Health Care — Meaningful Use underlies a major federal incentives program for medical offices and hospitals that pays doctors and clinicians to move to electronic health records (EHR). This book is a Rosetta stone for the IT implementer who wants to help organizations harness EHR systems.


March 12 2012

Parts of healthcare are moving to the cloud

Healthcare providers are increasingly required to do more with less. Regulations, HIPAA, Meaningful Use, recovery audit contractor (RAC) audits and decreasing revenues are motivating providers to consider cloud computing as a solution to potentially help them cut costs, maintain quality, meet regulations, and increase productivity.

Some electronic health record (EHR) vendors are offering cloud-based solutions. This approach is intended to help providers better manage the IT investments needed to support EHR implementations. And just as we've seen in other industries, there is an ongoing debate within healthcare as to the viability of cloud-based solutions, given the care needed for patient privacy and sensitive personal information.

Providers' trust in the public cloud is still relatively weak, but increasing numbers are considering using private clouds. However, EHR applications hosted in the cloud do seem to be gaining traction.

One example of a cloud-based EHR offering is CareCloud. My fellow Radar blogger Andy Oram wrote about them two years ago at HIMSS, and they have made significant progress since then. CareCloud creates apps that help medical professionals run their businesses. Those apps include a community collaboration and communication platform to securely share patient information, a medical practice management system for billing and scheduling, and a revenue cycle management service. CareCloud also provides electronic health records. It's built with Ruby on Rails, a highly abstracted web framework quite well suited to rapid development of web applications. CareCloud was a co-winner of the IBM Global Entrepreneur Silicon Valley SmartCamp competition in 2010 (see video below).

I ran into the folks at CareCloud at the HIMSS 2012 conference and was impressed with both their use of open source and their strategy on leveraging the cloud in healthcare. Mike Cuesta, CareCloud's director of marketing and user experience, defined CareCloud's strategy as one of future survival.

"Being able to deliver the product across platforms is crucial," Cuesta said. "In healthcare there is a glaring lack of modern web apps. What we wanted to do was create an elegant and user-friendly application that is accessible anywhere. Companies have to be able to deliver a desktop-class experience that works across platforms."

CareCloud relies on open source. "I had my eyes opened to open source about eight years ago when I was looking for a project management system," said CareCloud CTO Tom Packert. "I discovered I could use something like dotproject, which is a GPL-licensed PHP-MySQL web-based project management application. It only took us a day to put it up on SUSE Linux and we didn't need SQL seat licenses. Open source allows you to scale horizontally. It's not as scary as a lot of people think it is."

Another EHR in the cloud is athenahealth. Athenahealth's co-founders Todd Park, the new U.S. chief technology officer (CTO), and Jonathan Bush purchased a birthing practice in 1997. Soon, like most medical practices, they were buried in paper and spent most of their resources trying to get paid. Searching for innovative solutions led them to create their own software. Enlisting the help of Todd's younger brother Ed, a software developer, they created an EHR and financial revenue cycle system with a rules engine driven by dynamic billing-rules data. I met Ed Park at HIMSS when I remarked that he looked a lot like Todd Park, and Jonathan introduced him to me as Todd's "younger, smarter, and much better looking brother." Apparently his programming skills are paying off ...

This year, athenahealth was named to the TR50, Technology Review's third annual list of the world's most innovative technology companies. At this year's HIMSS conference, athenahealth showed the company's plans for an iPhone app that will give its EHR users access to certain features of its athenaClinicals cloud-based platform. An iPad version of the web-based athenahealth EHR app is also currently under development and set to launch in 2013.

Being based on cloud technology makes athenahealth much more nimble in launching mobile products and services. In the video below, I discuss with Jonathan Bush how athenahealth is using the cloud in their EHR.

(Thanks to Nate DeNiro and Open Affairs Television for their assistance with this video.)


February 29 2012

Report from HIMSS 12: wrap-up of the largest health IT conference

This is a time of great promise in health care, yet an oppressive atmosphere hung over much of HIMSS (http://www.himssconference.org/). All the speakers--not least the government representatives who announced rules for the adoption of electronic health records--stressed commendable practices such as data exchange, providing the patient with information, and engaging with the patient. Many berated hospitals, doctors, and vendors for neglecting the elements that maintain health. But the thrust of most sessions was on such details as how to convert patient records to the latest classification of diseases (ICD-10).

Intelligent Hospital pavilion shows off tempting technology.

I have nothing against ICD-10 and I'm sure adopting it is a big headache that deserves attention at the conference. The reason I call the atmosphere oppressive is that I felt stuck among health care providers unable to think long-term or to embrace the systems approach that we'll need to cure people and cut costs. While some health care institutions took the ICD-10 change-over seriously and put resources into meeting the deadline, others pressured the Department of Health and Human Services to delay implementation, and apparently won a major reprieve. The health IT community, including HIMSS, criticized the delay. But resistance to progress usually does not break out so overtly; it remains ingrained in day-to-day habits.

But ICD-10 is a sideline to the major issue of Stage 2 meaningful use. Why, as I reported on Wednesday, were so many of the 35,000 HIMSS attendees wrapped up in the next step being forced on them by the federal government? The scandal is that these meaningful use concepts (using data to drive care, giving care-givers information that other care-givers have collected about the patient) have to be forced on them. Indeed, institutions like Kaiser Permanente that integrated their electronic records years ago and concentrated on the whole patient had relatively little work to do to conform to Stage 1, and probably have the building blocks for Stage 2 in place. And of course these things are part of the landscape of health care in other countries. (The proposed regulations were finally posted last Thursday.)

Recipients of Regina Holliday jackets record patient involvement stories.

Haven't our providers heard that an ounce of prevention is worth a pound of cure? Don't well-educated and well-paid executives invest in quality measures with the expectation that they'll pay off in the long run? And aren't we all in the field for the good of the patients? What is that snickering I hear?

Actually, I don't accept the premise that providers are all in it for the money. If so many are newly incentivized to join the government's program for a mere $15,000 per doctor (plus avoiding some cuts in Medicare payments), which is a small fraction of the money they'll have to spend implementing the program, they must know that it's time to do the right thing. Meaningful use can be a good framework to concretize the idealistic goals of health care reform, but I just wish the vendors and doctors would keep their eyes more on the final goal.

Redwood MedNet in Northern California is an example of a health information exchange that adopted standards (CONNECT, before the Direct project was in place) to simplify data exchange between health providers. Will Ross of Redwood MedNet told me that qualifying for Stage 2 would be simple for them, "but you won't hear that from many vendors in this exhibit hall."

Annual surveys by Family Practice Management journal about their readers' satisfaction with EHRs, reviewed in one HIMSS session, showed widespread dissatisfaction that doesn't change from year to year. For instance, 39% were dissatisfied with support and training, although a few vendors rated quite high. Still, considering that doctors tend to veer away from open source solutions and pay big bucks for proprietary ones out of a hope of receiving better support and training, they deserve better. It's worth noting that the longer a practice uses its system, the more they're likely to express satisfaction. But only 38% of respondents would purchase the same systems now if they weren't already locked in.

That's the big, frustrating contradiction at HIMSS. The vendors have standards (HL7 and others), they've been setting up health information exchanges (under various other names) for years, they have a big, popular interoperability lab at each conference--and yet most patients still have to carry paper records and CDs with images from one doctor to another. (A survey of HIMSS members showed that one-quarter allowed access by patients to their data, which is an advance but still just a start.) The industry as a whole has failed to make a dent in the 90,000 to 100,000 needless deaths that occur in treatment facilities each year. And (according to one speaker) 20% of patients hospitalized under Medicare have to return to the hospital shortly after discharge.

Omens of change

Suffice it to say that by my fourth day at HIMSS I was not happy. Advances come, but slowly. Examples of developments I can give a thumbs-up to at HIMSS were data sharing among physicians who use Practice Fusion, a popular example of a growing move to web services for electronic records, and a CardioEngagement Challenge funded by Novartis to encourage at-risk patients to take more interest in their health. The winner was a Sensei mobile app that acts as an automated coach. Sensei CEO Robert Schwarzberg, a cardiologist, told me he had put together phone-in coaching services for heart patients during the years before mobile apps, and was frustrated that these coaches were available less than once a week when what patients needed was round-the-clock motivation. Sensei Wellness is one of the many mobile apps that make both patients and doctors more connected, and HIMSS quite properly devoted a whole section of the exhibit floor to them.

Talking about Sensei Wellness with Dr. Robert Schwarzberg.

I dropped by the IBM booth for the obligatory demo of Watson's medical application, and some background from Dr. Josko Silobrcic. I also filled in some of this report from an earlier conversation with tech staff.

Medical diagnosis involves more structured data than solving Jeopardy riddles, structure that appears mostly in the form of links between data sets. For instance, medicines are linked to diagnoses, to lab results, and to other medicines (some drugs are contraindicated when the patient is taking other drugs). Watson follows these relationships.

But because Watson is a natural language processing application--based on UIMA, which IBM donated to the Apache Foundation--it doesn't try to do much reasoning to pick out the best diagnosis or treatment, both of which are sometimes requested of it. Instead, it dumps huge indexes of medical articles into its data stores on one side, and takes in the text about the patient's complaint and doctor's evaluation on the other. Matching them up is not so different from a Jeopardy question, after all. Any possible match is considered and kept live until the final round of weighing answers, even if the chance of matching is near zero.
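
To illustrate that style of matching--every candidate hypothesis stays alive, and ranking happens only in one final pass--here is a toy sketch in Python. It stands in for the general technique only; IBM's actual implementation is built on UIMA and is far more sophisticated.

    from collections import Counter

    # Toy index of article snippets keyed by diagnosis (made-up data).
    corpus = {
        "myocardial infarction": "crushing chest pain radiating to left arm sweating nausea",
        "gastric reflux": "burning chest pain after meals worse when lying down",
        "pulmonary embolism": "sudden chest pain shortness of breath rapid pulse",
    }

    patient_text = "sudden chest pain and shortness of breath"

    def overlap(query, document):
        # Bag-of-words overlap: a crude stand-in for NLP-based matching.
        q, d = Counter(query.lower().split()), Counter(document.lower().split())
        return sum((q & d).values())

    # Every hypothesis is scored and kept, even near-zero matches;
    # the weighing of answers happens in a single final ranking.
    candidates = [(overlap(patient_text, text), dx) for dx, text in corpus.items()]
    for score, dx in sorted(candidates, reverse=True):
        print(score, dx)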

Dr. Josko Silobrcic before Watson demonstration.

Also because of the NLP basis for matching, there is rarely a need to harmonize disparate data taken in from different journals or medical sources.

I assumed that any processing that uses such a large data set and works so fast must run on a huge server farm, but the staff assured me it's not as big as one would think. For production use, of course, they'll need to take into account the need to scale. The medical informatics equivalent of a Christmas rush on sales would be an epidemic where everybody in the region is urgently hitting Watson for critical diagnoses.

Coming to peace

Healing came to me on my last day at HIMSS, at two related conferences off to the side of the main events: a meeting of Open Health Tools members and the eCollaboration forum, run by health activists who want to break down barriers to care. Both groups have partnerships with HIMSS.

Open Health Tools positions itself as an umbrella organization for projects making free software for a lot of different purposes in health care: recording, treatment, research and more. One illustrative project I got to hear about at their meeting was the Medical Imaging Network Transport (MINT), which Johns Hopkins is working on in coordination with other teams.

MINT cuts down on the transfers of huge images by doing some processing in place and transferring only portions of the data. Switching to modern storage formats (XML and JSON) and better methods of data transfer also reduces waste. For instance, current DICOM vendors transmit images over TCP, which introduces more overhead than necessary when handling the packet losses engendered by transmitting files that are several gigabytes in size. MINT allows UDP and other protocols that are leaner than TCP.
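
The idea of moving only the needed portion of a huge file can be sketched with an ordinary HTTP range request. The URL below is hypothetical; MINT defines its own resource layout and, as noted, can also use leaner transports than TCP.

    import urllib.request

    # Fetch only the first megabyte of a multi-gigabyte image resource
    # instead of the whole study. Hypothetical URL, for illustration.
    request = urllib.request.Request(
        "https://imaging.example.org/studies/42/pixeldata",
        headers={"Range": "bytes=0-1048575"},
    )
    with urllib.request.urlopen(request) as response:
        chunk = response.read()
    print(len(chunk), "bytes transferred")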

Best of all, MINT DICOM images can be displayed through HTML5, which means any browser can view them in good resolution, there is no need to install a specialized viewer at each location where the doctor is checking the image, and dependence on proprietary software is reduced. (The same reliance on standard browsers is also claimed by eMix in a recent interview.)

At the eCollaboration forum, E-patient Dave DeBronkart reported that being an engaged patient is still swimming upstream. It's hard to get one's records, hard to find out what treatments will cost, and hard to get taken seriously as an adult interested in monitoring one's own care. Meg McCabe of Aetna says that insurers need to offer more sophisticated guidance to patients trying to choose a health provider--simple lists of options are confusing and hard to choose from.

One speaker warned providers that if they try to open their data for collaborative care, they may find themselves hampered by contracts that maintain vendor ownership of EHR data. But speakers assured us vendors are not evil. The issue is what the providers ask for when they buy the EHR systems.

Here's the strange thing about the eCollaboration forum: they signed up enough people to fill the room ahead of time and left many potential attendees lamenting that they couldn't get in. Yet on the actual day of the event, there were about eight empty seats for every attendee. Maybe HIMSS attendees felt they had to devote all their time to the Stage 2 regulations, previously mentioned. But I take the disappointing turn-out as a sign of the providers' and vendors' lack of commitment to change. Shown a dazzling roster of interesting talks about data exchange, open record sharing, and patient engagement, they're quick to sign up--but they don't show up when it counts.

As members of the general public, we can move the health care field forward by demanding more from our providers, at the point where we have some influence. Anyone looking for concrete guidance for increasing their influence as a patient can try e-Patients Live Longer: The Complete Guide to Managing Health Care Using Technology, by Nancy B. Finn.

Public attention and anger have been focused on insurers, who have certainly engaged in some unsavory practices to avoid paying for care--but nothing as destructive as the preventable errors and deaths caused by old-fashioned medical practices. And while economists complain about the 30 cents out of every dollar wasted in the American hodge-podge of payment systems, we know that unnecessary medical procedures or, conversely, preventative steps that were omitted, also suck up a lot of money. One speaker at the eCollaboration forum compared the sky-rocketing costs of health care and insurance to a financial bubble that can't last. Let's all take some responsibility for instituting better medical and reporting systems so the costs come down in a healthy manner.

Other articles about HIMSS were posted last Tuesday and Wednesday.

February 27 2012

Big data is the next big thing in health IT

During the 2012 HIMSS conference in Las Vegas I was invited by Dell Healthcare, along with a group of health IT experts, to discuss issues in health information technology. The session sparked some passionate discourse about the challenges and opportunities that are important to the health IT industry.

Moderator Dan Briody started the event with a question about things we had seen at HIMSS that had changed our thinking about health IT. Never being shy, I jumped right in and spoke about the issues of payment reform and how the private market is beginning to show signs of disruptive innovation. After a great deal of back and forth among the panelists it seemed we slipped into listing many of the barriers — technological, political and cultural — that health IT faces. I was hoping we would get back to sharing possible solutions, so I made the proposal that big data is the next big thing in health IT (see video below).

When I talk about "big data" I am referring to a dataset that is too large for a typical database software tool to store, manage, and analyze. Obviously, as technology changes and improves, the size of a dataset that would qualify as "big data" will change as well. Big data also differs between healthcare and other industry sectors, since different tools are available and the required datasets vary in size. Since health data is very personal and sensitive, it also has special security and privacy protections. This makes sharing, aggregating, sorting and analyzing the data sometimes challenging.

Another difficulty in making the most of big data in healthcare is that those who control different pools of data have different financial incentives. There is a lack of transparency in performance, cost and quality: the system is currently structured so that payers would gain from decreasing revenue to providers, but the providers control the clinical data that must be analyzed in order to pay for value. The payers control another pool, which includes claims data, and this is not very useful for advanced analysis that will provide real insight. But enabling transparency of the data will help to identify and analyze sources of variability as well as find waste and inefficiencies. Publishing quality and performance data will also help patients make informed health decisions.

The proliferation of digital health information, including both clinical and claims information, is creating some very large datasets. This also creates some significant opportunity. For instance, analyzing and synthesizing clinical records and claims data can help identify patients appropriate for inclusion in a particular clinical trial. These new datasets can also help to provide insight into improved clinical decision making. One great example of this is when an analysis of a database of 1.4 million Kaiser Permanente members helped determine that Vioxx, a popular pain reliever that was widely used by arthritis patients, was dangerous. Vioxx was a big moneymaker for Merck, generating about $2.5 billion in yearly sales, and there was quite a battle to get the drug off the market. Only by having the huge dataset available from years of electronic health records, and tools to properly analyze the data, was this possible.
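
The kind of join-and-filter query involved can be sketched in a few lines of Python. The records, field names, and criteria below are synthetic, purely to show the shape of the analysis; this is not a real cohort query.

    # Flag arthritis patients over 65 whose claims show a drug of interest --
    # candidates for trial inclusion or safety follow-up. Synthetic data.
    clinical = [
        {"patient_id": 1, "diagnosis": "arthritis", "age": 67},
        {"patient_id": 2, "diagnosis": "asthma", "age": 54},
        {"patient_id": 3, "diagnosis": "arthritis", "age": 71},
    ]
    claims = [
        {"patient_id": 1, "drug": "rofecoxib"},
        {"patient_id": 3, "drug": "ibuprofen"},
    ]

    on_drug = {c["patient_id"] for c in claims if c["drug"] == "rofecoxib"}
    candidates = [
        r for r in clinical
        if r["diagnosis"] == "arthritis" and r["age"] >= 65
        and r["patient_id"] in on_drug
    ]
    print(candidates)  # the joined, filtered cohort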

The big data portion of the Dell think tank discussion is embedded below. You can find video from the full session here.


February 24 2012

Top stories: February 20-24, 2012

Here's a look at the top stories published across O'Reilly sites this week.

Data for the public good
The explosion of big data, open data and social data offers new opportunities to address humanity's biggest challenges. The open question is no longer if data can be used for the public good, but how.

Building the health information infrastructure for the modern epatient
The National Coordinator for Health IT, Dr. Farzad Mostashari, discusses patient empowerment, data access and ownership, and other important trends in healthcare.

Big data in the cloud
Big data and cloud technology go hand-in-hand, but it's comparatively early days. Strata conference chair Edd Dumbill explains the cloud landscape and compares the offerings of Amazon, Google and Microsoft.

Everyone has a big data problem
MetaLayer's Jonathan Gosier talks about the need to democratize data tools because everyone has a big data problem.

Three reasons why direct billing is ready for its close-up
David Sims looks at the state of direct billing and explains why it's poised to catch on beyond online games and media.


Strata 2012, Feb. 28-March 1 in Santa Clara, Calif., will offer three full days of hands-on data training and information-rich sessions. Strata brings together the people, tools, and technologies you need to make data work. Save 20% on Strata registration with the code RADAR20.


The Direct Project in action

The Direct Project is all over HIMSS12, and really all over the country now. But it still carries controversy. When I found out that one of the Houston Health Information Exchange efforts had successfully launched a Direct pilot, I simply had to do an interview. After all, here was software that I had contributed to as an open source project that was being deployed in my own backyard.

Jim Langabeer is the CEO of the newly renamed Greater Houston Healthconnect. I caught up with Jim at Starbucks and peppered him with questions about where Health Information Exchange (HIE) is going and what HIE looks like in Houston.

What's your background?

Jim Langabeer: I have been in healthcare for a long time in the Texas Medical Center.  I started as a hospital administrator at UTMB, where my first project was to work on an IT project team developing a Human Resources Management System. It was a collaborative effort between three hospitals.

I recently led a software company in the business intelligence space, which was later acquired by Oracle. After that, I decided I wanted to come back to Houston and continue to work in healthcare, so I returned to work for MD Anderson leading project and performance management. I eventually worked with the CIO Lynn Vogel to assess the business value of information systems. I most recently taught healthcare administration at the UT School of Public Health.

Throughout my healthcare career, I have been using data to drive healthcare decisions. My PhD is in decision sciences — quantitative modeling of data for decision-making — and my research grants have all involved analyzing large datasets to make healthcare decisions better. I have also worked between organizations in a collaborative manner. Health Information Exchange was an obvious next step for me.

Where are you in the process of creating a health information exchange?

Jim Langabeer: We are in the middle stage of operations. We are finalizing our architectural vision and choosing vendors. Most importantly, we have strong community support: 41% of the doctors in the region have committed to the exchange with letters of support as well as 61 of the 117 local hospitals.

We are meeting with all of the doctors we can. We are calling them and faxing them and visiting them, with one simple message: Health IT is coming and we want you to participate.


You mentioned an "architectural vision." Can you expand on that?

Jim Langabeer: We really cannot have just one architecture, so our architectural vision really means choosing several protocols and architectures to support the various needs of our stakeholders in parallel. We need to accommodate the entire range of transactions that our physicians and hospitals perform. The numbers say that 50% of Houston docs work in small practices with only one or two doctors, and they typically do not have electronic health records (EHR). Hooking these doctors into a central hub model does not make sense, so a different model where they can use browser/view capabilities and direct connections between providers must be part of our architectural vision.

Houston also has several large hospitals using EPIC or other mature EHR systems. That means we need a range of solutions. Some docs just want to be able to share records. Some are more sophisticated and want to do full EHR linking. Some doctors just want to be able to view data on the exchange using a web portal.

We want to accommodate all of these requests. That means we want a portfolio of products and a flexible overall architectural vision. Practically, that means we will be supporting Direct, IHE and also older HL7 v2.

Some people are saying Direct is all we want. We do not want a solution that is way over what small providers can handle and then it never gets used. We are architecture- and vendor-neutral, which can be difficult because EPIC is so prevalent in Houston.

We have practices that are still on paper on one hand and very sophisticated hospitals on the other, and that is just in the central Houston area. Immediately outside of Houston, lots of rural hospitals that we plan to support have older EHR systems or home-grown systems. That means we have to work with just about every potential health IT situation and still provide value.

Recently, a JAMIA perspectives article criticized Direct as a threat to non-profit HIE efforts like yours. Do you feel that Direct is a threat?

Jim Langabeer: I do not see Direct as a threat. I hear that from lots of sources, that Direct is a distraction for health information exchanges. I disagree.

I see it as another offering. The market is obviously responding to Direct. The price point on the software for Direct is definitely a benefit to smaller docs. We see it as a parallel path.

We do not see Surescripts (which is offering Direct email addresses to doctors with the AAFP) as a threat because we see them as a collaborator. We want them, and similar companies, to be part of our network. We are also having conversations with insurance companies and others who are not typically involved in health information exchanges because we are looking for partners.

The problem in healthcare is that it has always been very fragmented; no single solution gets much penetration. So, as we consider different protocols, we have to go with what people are asking for and what is already being adopted. We have to get to a point where these technologies have a very high penetration rate.

How are you narrowing your health IT vendors?

Jim Langabeer: What we want is a vendor that is going to be with us long term, sharing our risks and making sure we are successful. The sustainability of the vendor is connected to the sustainability of our exchange, so that is really important. Our 20-county region represents 6.4 million people, and that population is larger than most states that are pursuing exchanges. Not many vendors have experience on that scale.

How important is the software licensing? Do open source vendors have an advantage?

Jim Langabeer: I am not sure they have an advantage. Of course, open source is ideal, but often proprietary vendors are ahead in terms of features. A mix in the long-term solution would be really cool.

How will you work with outside EHR vendors?


Jim Langabeer: We are trying to engage at the CIO level. We're trying to understand what solutions, standards and data they want to share. There is a core set of things everyone needs to do. Beyond that core, some people want to go with SOA; other people really want IHE or Direct. There is not much data sharing between hospitals. That is why industry standards are so important to us. They help us shorten those discussions and make a more narrow offering. So, we are focusing on protocols as a means to work with the various EHR vendors.

One CIO told us, "We do not want to exchange data at all; we just want our doctors to be able to open a browser and see your data." We may not like to hear that, but that is the reality for many organizations in Houston.

The other thing that is unique about Houston is that you are not going to see the state of Texas taking a dictatorial role. In other large exchanges, you often have a state-level government dictating HIE. In that environment, it is easier to insist on specific standards. That is not our situation in Houston, so we have to meet our constituents where they are.

I have been frustrated that the Direct Project reference implementations only come in Java and .NET at this point. I would like to see implementations in PHP, Python, Ruby, etc. — languages that are more popular with entrepreneurs. Are you concerned with issues like that?

Jim Langabeer: We're definitely thinking about things like that. We do not want to be merely business-to-business — we want to offer services to consumers. So, we care about the technology becoming accessible to consumers, which means getting to iPhones. We want to be able to offer consumers tools that will bring them value, so we certainly care about issues like implementation language because we see those issues as connected.

If I let you dictate which Houston clinic or hospital I go to, when can I go see a doctor and get my patient data sent to my HealthVault or other PHR Direct account?

Jim Langabeer: I would hope that the technology would be ready by the end of the year. What I envision is a core group of early adopters. We already have several hospitals and some physician groups that are interested in taking that role.

This interview was edited and condensed.


February 23 2012

Direct Project will be required in the next version of Meaningful Use

The Office of the National Coordinator for Health Information Technology (ONC) announced that the Direct Project would be required in Stage 2 of Meaningful Use.

As usual, the outside world knew almost instantly because of Twitter, via nearly simultaneous posts from @ahier (Brian Ahier) and @techydoc (Steven Waldren, MD). More information followed shortly after from @amalec (Arien Malec), a former leader of the Direct Project.


There are some other important announcements ahead of the official release, such as the end of support for CCR, but this requirement element has the deepest implications. This is jaw-dropping news! Meaningful Use is the standard by which all doctors and hospitals receive money for Electronic Health Record (EHR) systems from the federal government. In fact, the term "Electronic Health Record" is really just a synonym for "meaningful use software" (at least in the U.S. market). Meaningful Use is at the heart of what health IT will look like in the United States over the coming decades.

The Direct Project has a simple but ambitious goal: to replace the fax machine as the point-to-point communications tool for healthcare. That goal depends on adoption and nothing spurs adoption like a mandate. Every Health Information Exchange (HIE) in the country is going to be retooling as the result of this news. Some of them will be totally changing directions.



This mandate will make the Direct Project into the first Health Internet platform. Every doctor in the country will eventually use this technology to communicate. Given the way that healthcare is financed in the U.S., it is reasonable to say that doctors will either have a Direct email address to communicate with other doctors and their patients in a few years, or they will probably retire from the practice of medicine.
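
Under the hood, a Direct message is essentially secure email. The sketch below shows only that email backbone, sent over SMTP with transport encryption; real Direct traffic is additionally S/MIME signed and encrypted with certificates managed through each party's HISP, a layer omitted here, and every address and host below is a placeholder.

    import smtplib
    from email.mime.text import MIMEText

    # A care summary traveling between two Direct-style addresses.
    # The S/MIME signing/encryption layer required by Direct is omitted.
    msg = MIMEText("Summary of care for the referred patient is attached.")
    msg["From"] = "dr.jones@direct.clinic-a.example.org"
    msg["To"] = "dr.smith@direct.hospital-b.example.org"
    msg["Subject"] = "Referral: summary of care"

    with smtplib.SMTP("smtp.clinic-a.example.org", 587) as server:
        server.starttls()  # transport security on top of message security
        server.send_message(msg)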

It was this potential, to be the first reliable communications platform for healthcare information, that has caused me to invest so heavily in this project. This is why I contributed so much time to the Direct Project Security and Trust Working Group when the Direct Protocol was just forming. This is an Open Source project that can still use your help.



The Direct Project is extensively covered in "Meaningful Use and Beyond" (chapter 11 is on interoperability). I wrote about the advantages of the Direct Project architecture. I helped arrange talks about Direct at OSCON in 2010, and in 2011, I gave an OSCON keynote about the Health Internet, which featured Direct. I wrote a commentary for the Journal of Participatory Medicine about how accuracy is more important than privacy for healthcare records and how to use the Direct Project to achieve that accuracy. I pointed out that the last significant impact from Google Health would be to make Direct more important. I am certainly not the only person at O'Reilly who has recognized the significance of the Direct Project, but I am one of the most vocal and consistent advocates of the Direct Project technology approach. So you can see why I think this is a big announcement.

Of course, we will not know for sure exactly what has been mandated by the new revisions of Meaningful Use, but it is apparent that this is a huge victory for those of us who have really invested in this effort. My hat is off to Sean Nolan and Umesh Madan from Microsoft, to Brian Behlendorf and Arien Malec, who were both at ONC during the birth of Direct, to Dr. David Kibbe, Brett Peterson, and to John Moehrke. There are countless others who have contributed to the Direct Project, but these few are the ones who had to tolerate contributing with me, which I can assure you, is above and beyond the call of duty.

Obviously, we will be updating "Meaningful Use and Beyond" to include this new requirement as well as the other changes in the next version of Meaningful Use (which apparently will no longer be called "stage 2"). Most of the book will not change, however, since it focuses on covering what you need to know in order to understand the requirements at all. While the requirements will become more stringent as time goes on, the core health IT concepts needed to understand them will not change that much. However, I recommend that you get a digital copy of the book directly through O'Reilly, because doing so entitles you to future versions of the book for free. You can get today's version and know we will update your digital edition with the arrival of subsequent versions of the Meaningful Use standard.



I wonder what other changes will be in store in the new requirements? ONC keeps promising to release the new rule "tomorrow." Once the new rules emerge, they will be devoured instantly, and you can expect to read more about the new standards here. The new rule will be subject to a 60-day commentary period. It will be interesting to see if the most dramatic aspects of the rule will survive this commentary. Supporters of CCR will be deeply upset and there are many entrenched EHR players who would rather not support Direct. Time will tell if this is truly a mandate, or merely a strong suggestion.



Report from HIMSS 2012: toward interoperability and openness

I was wondering how it would feel to be in the midst of 35,000 people whose livelihoods are driven by the decisions of a large institution at the moment when that institution releases a major set of rules. I didn't really find out, though. The 35,000 people I speak of are the attendees of the HIMSS conference and the institution is the Department of Health and Human Services. But HHS just sort of half-released the rules (called Stage 2 of meaningful use), telling us that they would appear online tomorrow and meanwhile rushing over a few of the key points in a presentation that drew overflow crowds in two rooms.

The reaction, I sensed, was a mix of relief and frustration. Relief because Farzad Mostashari, National Coordinator for Health Information Technology, promised us the rules would be familiar and hew closely to what advisors had requested. Frustration, however, at not seeing the details. The few snippets put up on the screen contained enough ambiguities and poorly worded phrases that I'm glad there's a 60-day comment period before the final rules are adopted.

There isn't much one can say about the Stage 2 rules until they are posted and the experts have a chance to parse them closely, and I'm a bit reluctant to throw onto the Internet one of potentially 35,000 reactions to the announcement, but a few points struck me enough to be worth writing about. Mostashari used his pulpit for several pronouncements about the rules:

  • HHS would push ahead on goals for interoperability and health information exchange. "We can't wait five years," said Mostashari. He emphasized the phrase "standard-based" in referring to HIE.

  • Patient engagement was another priority. To attest to Stage 2, institutions will have to allow at least half their patients to download and transfer their records.

  • They would strive for continuous quality improvement and clinical decision support, key goals enabled by the building blocks of meaningful use.

Two key pillars of the Stage 2 announcement are requirements to use the Direct project for data exchange and HL7's consolidated CDA for the format (the only data exchange I heard mentioned was a summary of care, which is all that most institutions exchange when a patient is referred).

The announcement demonstrates the confidence that HHS has in the Direct project, which it launched just a couple years ago and that exemplifies a successful joint government/private sector project. Direct will allow health care providers of any size and financial endowment to use email or the Web to share summaries of care. (I mentioned it in yesterday's article.) With Direct, we can hope to leave the cumbersome and costly days of health information exchange behind. The older and more complex CONNECT project will be an option as well.

The other half of that announcement, regarding adoption of the CDA (incarnated as a CCD for summaries of care), is a loss for the older CCR format, which was an option in Stage 1. The CCR was the Silicon Valley version of health data, a sleek and consistent XML format used by Google Health and Microsoft HealthVault. But health care experts criticized the CCR as not rich enough to convey the information institutions need, so it lost out to the more complex CCD.
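
To give a flavor of that sleekness, here is a made-up fragment in the CCR's spirit, parsed with Python's standard library. The real schema is namespaced and much richer; this is only an illustration of why developers found the format approachable.

    import xml.etree.ElementTree as ET

    # Simplified, made-up fragment in the spirit of a CCR medication list.
    ccr_fragment = """
    <ContinuityOfCareRecord>
      <Body>
        <Medications>
          <Medication><Name>lisinopril</Name><Dose>10 mg</Dose></Medication>
          <Medication><Name>metformin</Name><Dose>500 mg</Dose></Medication>
        </Medications>
      </Body>
    </ContinuityOfCareRecord>
    """

    root = ET.fromstring(ccr_fragment)
    for med in root.iter("Medication"):
        print(med.findtext("Name"), med.findtext("Dose"))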

The news on formats is good overall, though. The HL7 consortium, which has historically funded itself by requiring organizations to become members in order to use its standards, is opening some of them for free use. This is critical for the development of open source projects. And at an HL7 panel today, a spokesperson said they would like to head more in the direction of free licensing and have to determine whether they can survive financially while doing so.

So I'm feeling optimistic that U.S. health care is moving "toward interoperability and openness," the phrase I used in the title of this article and also used in a posting from HIMSS two years ago.

HHS allowed late-coming institutions (those who began the Stage 1 process in 2011) to continue at Stage 1 for another year. This is welcome because they have so much work to do, but it means that providers who want to demonstrate Stage 2 information exchange may have trouble because they can't do it with other providers who are ready only for Stage 1.

HHS endorsed some other standards today as well, notably SNOMED for diseases and LRI for lab results. Another nice tidbit from the summit includes the requirement to use electronic medication administration (for instance, bar codes to check for errors in giving medicine) to foster patient safety.

February 22 2012

Data for the public good

Can data save the world? Not on its own. As an age of technology-fueled transparency, open innovation and big data dawns around the world, the success of new policy won't depend on any single chief information officer, chief executive or brilliant developer. Data for the public good will be driven by a distributed community of media, nonprofits, academics and civic advocates focused on better outcomes, more informed communities and the new news, in whatever form it is delivered.

Advocates, watchdogs and government officials now have new tools for data journalism and open government. Globally, there's a wave of transparency that will wash over every industry and government, from finance to healthcare to crime.

In that context, open government is about much more than open data — just look at the issues that flow around the #opengov hashtag on Twitter, including the nature of identity, privacy, security, procurement, culture, cloud computing, civic engagement, participatory democracy, corruption, civic entrepreneurship and transparency.

If we accept the premise that Gov 2.0 is a potent combination of open government, mobile, open data, social media, collective intelligence and connectivity, the lessons of the past year suggest that a tidal wave of technology-fueled change is still building worldwide.

The Economist's support for open government data remains salient today:

"Public access to government figures is certain to release economic value and encourage entrepreneurship. That has already happened with weather data and with America's GPS satellite-navigation system that was opened for full commercial use a decade ago. And many firms make a good living out of searching for or repackaging patent filings."

As Clive Thompson reported at Wired last year, public sector data can help fuel jobs, and "shoving more public data into the commons could kick-start billions in economic activity." In the transportation sector, for instance, transit data is open government fuel for economic growth.

There is a tremendous amount of work ahead in building upon the foundations that civil society has constructed over decades. If you want a deep look at what the work of digitizing data really looks like, read Carl Malamud's interview with Slashdot on opening government data.

Data for the public good, however, goes far beyond government's own actions. In many cases, it will happen despite government action — or, often, inaction — as civic developers, data scientists and clinicians pioneer better analysis, visualization and feedback loops.

For every civic startup or regulation, there's a backstory that often involves a broad set of stakeholders. Governments have to commit to opening themselves up but will, in many cases, need external expertise or even funding to do so. Citizens, industry and developers have to show up to use the data, demonstrating that there's not only demand, but also skill outside of government to put open data to work in service of accountability, citizen utility and economic opportunity. Galvanizing the co-creation of civic services, policies or apps isn't easy, but tapping the potential of the civic surplus has attracted the attention of governments around the world.

There are many challenges between that vision and reality. For one, data quality and access remain poor. Socrata's open data study identified progress, but also pointed to a clear need for improvement: only 30% of developers surveyed said that government data was available, and half of what was available was unusable.

Open data will not be a silver bullet to all of society's ills, but an increasing number of states are assembling platforms and stimulating an app economy.

Results-oriented mayors like Rahm Emanuel and Mike Bloomberg are committing to opening government data in Chicago and New York City, respectively.

Following are examples of where data for the public good is already having an impact upon the world we live in, along with some ideas about what lies ahead.

Financial good

Anyone looking for civic entrepreneurship will be hard pressed to find a better recent example than BrightScope. The efforts of Mike and Ryan Alfred are in line with traditional entrepreneurship: identifying an opportunity in a market that no one else has created value around, building a team to capitalize on it, and then investing years of hard work to execute on that vision. In the process, BrightScope has made government data about the financial industry more usable, searchable and open to the public.

Due to the efforts of these two entrepreneurs and their California-based startup, anyone who wants to learn more about financial advisers before tapping one to manage their assets can do so online.

Prior to BrightScope, the adviser data was locked up at the Securities and Exchange Commission (SEC) and the Financial Industry Regulatory Authority (FINRA).

"Ryan and I knew this data was there because we were advisers," said BrightScope co-founder Mike Alfred in a 2011 interview. "We knew data had been filed, but it wasn't clear what was being done with it. We'd never seen it liberated from the government databases."

While they knew the public data existed and had their idea years ago, Alfred said it didn't happen because they "weren't in the mindset of being data entrepreneurs" yet. "By going after 401(k) first, we could build the capacity to process large amounts of data," Alfred said. "We could take that data and present it on the web in a way that would be usable to the consumer."

Notably, the government data that BrightScope has gathered on financial advisers goes further than a given profile page. Over time, as search engines like Google and Bing index the information, the data has become searchable in places consumers are actually looking for it. That's aligned with one of the laws for open data that Tim O'Reilly has been sharing for years: Don't make people find data. Make data find the people.

As agencies adapt to new business relationships, consumers are starting to see increased access to government data. Now, more data that the nation's regulatory agencies collected on behalf of the public can be searched and understood by the public. Open data can improve lives, not least through adding more transparency into a financial sector that desperately needs more of it. This kind of data transparency will give the best financial advisers the advantage they deserve and make it much harder for your Aunt Betty to choose someone with a history of financial malpractice.

The next phase of financial data for good will use big data analysis and algorithmic consumer advice tools, or "choice engines," to make better decisions. The vast majority of consumers are unlikely to ever look directly at raw datasets themselves. Instead, they'll use mobile applications, search engines and social recommendations to make smarter choices.

There are already early examples of such services emerging. Billshrink, for example, lets consumers get personalized recommendations for a cheaper cell phone plan based on calling histories. Mint makes specific recommendations on how a citizen can save money based upon data analysis of the accounts added. Moreover, much of the innovation in this area is enabled by the ability of entrepreneurs and developers to go directly to data aggregation intermediaries like Yodlee or CashEdge to license the data.
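A toy sketch of what such a choice engine does under the hood: score each plan against the user's actual usage and recommend the cheapest. The plans and calling history below are invented, and a real service would pull usage data from a carrier or an aggregator.

    # Toy "choice engine": pick the cheapest cell phone plan for a
    # user's average monthly minutes. All figures are hypothetical.
    plans = [
        {"name": "Plan A", "monthly": 39.99, "minutes": 450, "overage": 0.45},
        {"name": "Plan B", "monthly": 59.99, "minutes": 900, "overage": 0.40},
        {"name": "Plan C", "monthly": 69.99, "minutes": float("inf"), "overage": 0.0},
    ]

    def monthly_cost(plan, minutes_used):
        over = max(0, minutes_used - plan["minutes"])
        return plan["monthly"] + over * plan["overage"]

    def recommend(usage_history):
        avg = sum(usage_history) / len(usage_history)
        return min(plans, key=lambda p: monthly_cost(p, avg))

    print(recommend([480, 510, 620, 555])["name"])  # "Plan B" for ~540 min/mo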


Transit data as economic fuel

Transit data continues to be one of the richest and most dynamic areas for co-creation of services. Around the United States and beyond, there has been a blossoming of innovation in the city transit sector, driven by the passion of citizens and fueled by the release of real-time transit data by city governments.

Francisca Rojas, research director at the Harvard Kennedy School's Transparency Policy Project, has investigated the dynamics behind the disclosure of data by transit agencies in the United States, which she calls one of the most successful implementations of open government. "In just a few years, a rich community has developed around this data, with visionary champions for disclosure inside transit agencies collaborating with eager software developers to deliver multiple ways for riders to access real-time information about transit," wrote Rojas.

The Massachusetts Bay Transit Authority (MBTA) learned from Portland, Oregon's, TriMet that open data is better. "This was the best thing the MBTA had done in its history," said Laurel Ruma, O'Reilly's director of talent and a long-time resident in greater Boston, in her 2010 Ignite talk on real-time transit data. The MBTA's move to make real-time data available and support it has spawned a new ecosystem of mobile applications, many of which are featured at MBTA.com.

There are now 44 different consumer-facing applications for the TriMet system. Chicago, Washington and New York City also have a growing ecosystem of applications.

As more sensors go online in smarter cities, tracking the movements of traffic patterns will enable public administrators to optimize routes, schedules and capacity, driving efficiency and a better allocation of resources.

Transparency and civic goods

As John Wonderlich, policy director at the Sunlight Foundation, observed last year, access to legislative data brings citizens closer to their representatives. "When developers and programmers have better access to the data of Congress, they can better build the databases and tools that let the rest of us connect with the legislature."

That's the promise of the Sunlight Foundation's work, in general: Technology-fueled transparency will help fight corruption, fraud and reveal the influence behind policies. That work is guided by data, generated, scraped and aggregated from government and regulatory bodies. The Sunlight Foundation has been focused on opening up Congress through technology since the organization was founded. Some of its efforts culminated recently with the publication of a live XML feed for the House floor and a transparency portal for House legislative documents.

There are other horizons for transparency through open government data, which broadly refers to public sector records that have been made available to citizens. For a canonical resource on what makes such releases truly "open," consult the "8 Principles of Open Government Data."

For instance, while gerrymandering has been part of American civic life since the birth of the republic, one of the best policy innovations of 2011 may offer hope for improving the redistricting process. DistrictBuilder, an open-source tool created by the Public Mapping Project, allows anyone to easily create legal districts.

"During the last year, thousands of members of the public have participated in online redistricting and have created hundreds of valid public plans," said Micah Altman, senior research scientist at Harvard University Institute for Quantitative Social Science, via an email last year.

"In substantial part, this is due to the project's effort and software. This year represents a huge increase in participation compared to previous rounds of redistricting — for example, the number of plans produced and shared by members of the public this year is roughly 100 times the number of plans submitted by the public in the last round of redistricting 10 years ago," Altman said. "Furthermore, the extensive news coverage has helped make a whole new set of people aware of the issue and has re framed it as a problem that citizens can actively participate in to solve, rather than simply complain about."

Principles for data in the public good

As a result of digital technology, our collective public memory can now be shared and expanded upon daily. In a recent lecture on public data for public good at Code for America, Michal Migurski of Stamen Design made the point that part of the global financial crisis came through a crisis in public knowledge, citing "The Destruction of Economic Facts," by Hernando de Soto.

To arrive at virtuous feedback loops that amplify the signals citizens, regulators, executives and elected leaders need to make better decisions amid a flood of information, data providers and infomediaries will need to embrace the key principles Migurski's lecture outlined.

First, "data drives demand," wrote Tim O'Reilly, who attended the lecture and distilled Migurski's insights. "When Stamen launched crimespotting.org, it made people aware that the data existed. It was there, but until they put visualization front and center, it might as well not have been."

Second, "public demand drives better data," wrote O'Reilly. "Crimespotting led Oakland to improve their data publishing practices. The stability of the data and publishing on the web made it possible to have this data addressable with public links. There's an 'official version,' and that version is public, rather than hidden."

Third, "version control adds dimension to data," wrote O'Reilly. "Part of what matters so much when open source, the web, and open data meet government is that practices that developers take for granted become part of the way the public gets access to data. Rather than static snapshots, there's a sense that you can expect to move through time with the data."

The case for open data

Accountability and transparency are important civic goods, but adopting open data requires grounded arguments for a city chief financial officer to support these initiatives. When it comes to making a business case for open data, John Tolva, the chief technology officer for Chicago, identified four areas that support the investment in open government:

  1. Trust — "Open data can build or rebuild trust in the people we serve," Tolva said. "That pays dividends over time."
  2. Accountability of the work force — "We've built a performance dashboard with KPIs [key performance indicators] that track where the city directly touches a resident."
  3. Business building — "Weather apps, transit apps ... that's the easy stuff," he said. "Companies built on reading vital signs of the human body could be reading the vital signs of the city."
  4. Urban analytics — "Brett [Goldstein] established probability curves for violent crime. Now we're trying to do that elsewhere, uncovering cost savings, intervention points, and efficiencies."

New York City is also using data internally. The city is doing things like applying predictive analytics to building code violations and housing data to try to understand where potential fire risks might exist.

"The thing that's really exciting to me, better than internal data, of course, is open data," said New York City chief digital officer Rachel Sterne during her talk at Strata New York 2011. "This, I think, is where we really start to reach the potential of New York City becoming a platform like some of the bigger commercial platforms and open data platforms. How can New York City, with the enormous amount of data and resources we have, think of itself the same way Facebook has an API ecosystem or Twitter does? This can enable us to produce a more user-centric experience of government. It democratizes the exchange of information and services. If someone wants to do a better job than we are in communicating something, it's all out there. It empowers citizens to collaboratively create solutions. It's not just the consumption but the co-production of government services and democracy."

The promise of data journalism

The ascendance of data journalism in media and government will continue to gather force in the years ahead.

Journalists and citizens are confronted by unprecedented amounts of data and an expanded number of news sources, including a social web populated by our friends, family and colleagues. Newsrooms, the traditional hosts for information gathering and dissemination, are now part of a flattened environment for news. Developments often break first on social networks, and that information is then curated by a combination of professionals and amateurs. News is then analyzed and synthesized into contextualized journalism.

Data is being scraped by journalists, generated from citizen reporting, or gleaned from massive information dumps — such as with the Guardian's formidable data journalism, as detailed in a recent ebook. ScraperWiki, a favorite tool of civic coders at Code for America and elsewhere, enables anyone to collect, store and publish public data. As we grapple with the consumption challenges presented by this deluge of data, new publishing platforms are also empowering us to gather, refine, analyze and share data ourselves, turning it into information.
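In the spirit of that scrape-store-publish loop, here is a minimal sketch using only Python's standard library: fetch a public CSV, keep the fields of interest, and store them locally. The URL and column names are placeholders rather than a real dataset.

    # Fetch a public CSV and load it into SQLite: the core loop that
    # tools like ScraperWiki wrap with storage and scheduling.
    import csv, io, sqlite3, urllib.request

    URL = "https://example.gov/spending.csv"  # placeholder dataset

    with urllib.request.urlopen(URL) as resp:
        rows = list(csv.DictReader(io.TextIOWrapper(resp, encoding="utf-8")))

    db = sqlite3.connect("public_data.db")
    db.execute("CREATE TABLE IF NOT EXISTS spending (agency TEXT, amount REAL)")
    db.executemany(
        "INSERT INTO spending VALUES (?, ?)",
        [(r["agency"], float(r["amount"])) for r in rows],  # assumed columns
    )
    db.commit()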

There are a growing number of data journalism efforts around the world, from New York Times interactive features to the award-winning investigative work of ProPublica. Here are just a few promising examples:

  • Spending Stories, from the Open Knowledge Foundation, is designed to add context to news stories based upon government data by connecting stories to the data used.
  • Poderopedia is trying to bring more transparency to Chile, using data visualizations that draw upon a database of editorial and crowdsourced data.
  • The State Decoded is working to make the law more user-friendly.
  • Public Laboratory is a tool kit and online community for grassroots data gathering and research that builds upon the success of Grassroots Mapping.
  • Internews and its local partner Nai Mediawatch launched a new website that shows incidents of violence against journalists in Afghanistan.

Open aid and development

The World Bank has been taking unprecedented steps to make its data more open and usable to everyone. The data.worldbank.org website that launched in September 2010 was designed to make the bank's open data easier to use. In the months since, more than 100 applications have been built using the data.

"Up until very recently, there was almost no way to figure out where a development project was," said Aleem Walji, practice manager for innovation and technology at the World Bank Institute, in an interview last year. "That was true for all donors, including us. You could go into a data bank, find a project ID, download a 100-page document, and somewhere it might mention it. To look at it all on a country level was impossible. That's exactly the kind of organization-centric search that's possible now with extracted information on a map, mashed up with indicators. All of sudden, donors and recipients can both look at relationships."

Open data efforts are not limited to development. More data-driven transparency in aid spending is also going online. Last year, the United States Agency for International Development (USAID) launched a public engagement effort to raise awareness about the devastating famine in the Horn of Africa. The FWD campaign includes a combination of open data, mapping and citizen engagement.

"Frankly, it's the first foray the agency is taking into open government, open data, and citizen engagement online," said Haley Van Dyck, director of digital strategy at USAID, in an interview last year.

"We recognize there is a lot more to do on this front, but are happy to start moving the ball forward. This campaign is different than anything USAID has done in the past. It is based on informing, engaging, and connecting with the American people to partner with us on these dire but solvable problems. We want to change not only the way USAID communicates with the American public, but also the way we share information."

USAID built and embedded interactive maps on the FWD site. The agency created the maps with open source mapping tools and published the datasets it used to make these maps on data.gov. All are available to the public and media to download and embed as well.

The combination of publishing maps and the open data that drives them simultaneously online is significantly evolved for any government agency, and it serves as a worthy bar for other efforts in the future to meet. USAID accomplished this by migrating its data to an open, machine-readable format.

"In the past, we released our data in inaccessible formats — mostly PDFs — that are often unable to be used effectively," said Van Dyck. "USAID is one of the premiere data collectors in the international development space. We want to start making that data open, making that data sharable, and using that data to tell stories about the crisis and the work we are doing on the ground in an interactive way."

Crisis data and emergency response

Unprecedented levels of connectivity now exist around the world. According to a 2011 survey from the Pew Internet & American Life Project, more than 50% of American adults use social networks, 35% of American adults have smartphones, and 78% of American adults are connected to the Internet. When combined, those factors mean that we now see earthquake tweets spread faster than the seismic waves themselves. Networked publics can now share the effects of disasters in real time, providing officials with unprecedented insight into what's happening. Citizens act as sensors in the midst of the storm, creating an ad hoc system of networked accountability through data.

The growth of an Internet of Things is an important evolution. What we saw during Hurricane Irene in 2011 was the increasing importance of an Internet of people, where citizens act as sensors during an emergency. Emergency management practitioners and first responders have woken up to the potential of using social data for enhanced situational awareness and resource allocation.

An historic emergency social data summit in Washington in 2010 highlighted how relevant this area has become. And last year's hearing in the United States Senate on the role of social media in emergency management was "a turning point in Gov 2.0," said Brian Humphrey of the Los Angeles Fire Department.

The Red Cross has been at the forefront of using social data in a time of need. That's not entirely by choice, given that news of disasters has consistently broken first on Twitter. The challenge is for the men and women entrusted with coordinating response to identify signals in the noise.

First responders and crisis managers are using a growing suite of tools for gathering information and sharing crucial messages internally and with the public. Structured social data and geospatial mapping suggest one direction where these tools are evolving in the field.

A web application from ESRI deployed during historic floods in Australia demonstrated how crowdsourced social intelligence provided by Ushahidi can enable emergency social data to be integrated into crisis response in a meaningful way.

The Australian flooding web app includes the ability to toggle layers from OpenStreetMap, satellite imagery, and topography, and then filter by time or report type. By adding structured social data, the web app provides geospatial information system (GIS) operators with valuable situational awareness that goes beyond standard reporting, including the locations of property damage, roads affected, hazards, evacuations and power outages.
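Strip away the GIS layer and the app's filtering reduces to querying structured reports by type and time window. The records below are invented stand-ins for Ushahidi-style reports.

    # One "layer" in such an app is just the set of reports of a given
    # type within a time window. Records here are hypothetical.
    from datetime import datetime

    reports = [
        {"type": "road_affected", "when": datetime(2011, 1, 12, 9, 30)},
        {"type": "evacuation",    "when": datetime(2011, 1, 12, 11, 0)},
        {"type": "power_outage",  "when": datetime(2011, 1, 13, 7, 15)},
    ]

    def layer(report_type, since):
        return [r for r in reports
                if r["type"] == report_type and r["when"] >= since]

    print(layer("road_affected", since=datetime(2011, 1, 12)))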

Long before the floods or the Red Cross joined Twitter, however, Brian Humphrey of the Los Angeles Fire Department (LAFD) was already online, listening. "The biggest gap directly involves response agencies and the Red Cross," said Humphrey, who currently serves as the LAFD's public affairs officer. "Through social media, we're trying to narrow that gap between response and recovery to offer real-time relief."

After the devastating 2010 earthquake in Haiti, the evolution of volunteers working collaboratively online also offered a glimpse into the potential of citizen-generated data. Crisis Commons has acted as a sort of "geeks without borders." Around the world, developers, GIS engineers, online media professionals and volunteers collaborated on information technology projects to support disaster relief for post-earthquake Haiti, mapping streets on OpenStreetMap and collecting crisis data on Ushahidi.

Healthcare

What happens when patients find out how good their doctors really are? That was the question that Harvard Medical School professor Dr. Atul Gawande asked in the New Yorker, nearly a decade ago.

The narrative he told in that essay makes the history of quality improvement in medicine compelling, connecting it to the creation of a data registry at the Cystic Fibrosis Foundation in the 1950s. As Gawande detailed, that data was privately held. After it became open, life expectancy for cystic fibrosis patients tripled.

In 2012, the new hope is in big data, where techniques for finding meaning in the huge amounts of unstructured data generated by healthcare diagnostics offer immense promise.

The trouble, say medical experts, is that data availability and quality remain significant pain points that are holding back existing programs.

There are, literally, bright spots that suggest what's possible. Dr. Gawande's 2011 essay, which considered whether "hotspotting" using health data could help lower medical costs by giving the neediest patients better care, offered another perspective on the issue. Early outcomes made the approach look compelling. As Dr. Gawande detailed, when a Medicare demonstration program offered medical institutions payments that financed the coordination of care for its most chronically expensive beneficiaries, hospital stays and trips to the emergency rooms dropped more than 15% over the course of three years. A test program adopting a similar approach in Atlantic City saw a 25% drop in costs.

Through sharing data and knowledge, and then creating a system to convert ideas into practice, clinicians in the ImproveCareNow network were able to improve the remission rate for Crohn's disease from 49% to 67% without the introduction of new drugs.

In Britain, researchers found that the outcomes for adult cardiac patients improved after the publication of information on death rates. With the release of meaningful new open government data about performance and outcomes from the British national healthcare system, similar improvements may be on the way.

"I do believe we are at the beginning of a revolutionary moment in health care, when patients and clinicians collect and share data, working together to create more effective health care systems," said Susannah Fox, associate director for digital strategy at the Pew Internet and Life Project, in an interview in January. Fox's research has documented the social life of health information, the concept of peer-to-peer healthcare, and the role of the Internet among people living with chronic disease.

In the past few years, entrepreneurs, developers and government agencies have been collaboratively exploring the power of open data to improve health. In the United States, the open data story in healthcare is evolving quickly, from new mobile apps that lead to better health decisions to data spurring changes in care at the U.S. Department of Veterans Affairs.

Since he entered public service, Todd Park, the first chief technology officer of the U.S. Department of Health and Human Services (HHS), has focused on unleashing the power of open data to improve health. If you aren't familiar with this story, read the Atlantic's feature article that explores Park's efforts to revolutionize the healthcare industry through better use of data.

Park has focused on releasing data at Health.Data.Gov. In a speech to a Hacks and Hackers meetup in New York City in 2011, Park emphasized that HHS wasn't just releasing new data: "[We're] also making existing data truly accessible or usable," he said, taking "stuff that's in a book or on a website and turning it into machine-readable data or an API."

Park said it's still quite early in the project and that the work isn't just about data — it's about how and where it's used. "Data by itself isn't useful. You don't go and download data and slather data on yourself and get healed," he said. "Data is useful when it's integrated with other stuff that does useful jobs for doctors, patients and consumers."
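As a sketch of what "turning it into machine-readable data or an API" can mean in practice, the few lines below serve rows from a local CSV as JSON using Flask. The file name and columns are placeholders, not any actual HHS dataset.

    # Serve a CSV as a small JSON API with Flask (placeholder data).
    import csv
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    with open("hospital_quality.csv", newline="") as f:  # placeholder file
        RECORDS = list(csv.DictReader(f))

    @app.route("/api/hospitals")
    def hospitals():
        state = request.args.get("state")  # optional ?state=MA filter
        rows = [r for r in RECORDS
                if state is None or r.get("state") == state]
        return jsonify(rows)

    if __name__ == "__main__":
        app.run()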

What lies ahead

There are four trends that warrant special attention as we look to the future of data for public good: civic network effects, hybridized data models, personal data ownership and smart disclosure.

Civic network effects

Community is a key ingredient in successful open government data initiatives. It's not enough to simply release data and hope that venture capitalists and developers magically become aware of the opportunity to put it to work. Marketing open government data is what repeatedly brought federal Chief Technology Officer Aneesh Chopra and Park out to Silicon Valley, New York City and other business and tech hubs.

Despite the addition of topical communities to Data.gov, conferences and new media efforts, government's attempts to act as an "impatient convener" can only go so far. Civic developer and startup communities are creating a new distributed ecosystem that will help create that community, from BuzzData to Socrata to new efforts like Max Ogden's DataCouch.

Smart disclosure

There are enormous economic and civic good opportunities in the "smart disclosure" of personal data, whereby a private company or government institution provides a person with access to his or her own data in open formats. Smart disclosure is defined by Cass Sunstein, Administrator of the White House Office of Information and Regulatory Affairs, as a process that "refers to the timely release of complex information and data in standardized, machine-readable formats in ways that enable consumers to make informed decisions."

For instance, the quarterly financial statements of the top public companies in the world are now available online through the Securities and Exchange Commission.

Why does it matter? The interactions of citizens with companies or government entities generate a huge amount of economically valuable data. If consumers and regulators had access to that data, they could tap it to make better choices about everything from finance to healthcare to real estate, much in the same way that web applications like Hipmunk and Zillow let consumers make more informed decisions.

Personal data assets

When a trend makes it to the World Economic Forum (WEF) in Davos, it's generally evidence that the trend is gathering steam. A report titled "Personal Data: The Emergence of a New Asset Class" suggests that 2012 will be the year when citizens start thinking more about data ownership, whether that data is generated by private companies or the public sector.

"Increasing the control that individuals have over the manner in which their personal data is collected, managed and shared will spur a host of new services and applications," wrote the paper's authors. "As some put it, personal data will be the new 'oil' — a valuable resource of the 21st century. It will emerge as a new asset class touching all aspects of society."

The idea of data as a currency is still in its infancy, as Strata Conference chair Edd Dumbill has emphasized. The Locker Project, which provides people with the ability to move their own data around, is one of many approaches.

The growth of the Quantified Self movement and of online communities like PatientsLikeMe and 23andMe validates the trend. In the U.S. federal government, the Blue Button initiative, which enables veterans to download personal health data, has now spread to all federal employees and earned adoption at Aetna and Kaiser Permanente.

In early 2012, a Green Button was launched to unleash energy data in the same way. Venture capitalist Fred Wilson called the Green Button an "OAuth for energy data."

Wilson wrote:

"It is a simple standard that the utilities can implement on one side and web/mobile developers can implement on the other side. And the result is a ton of information sharing about energy consumption and, in all likelihood, energy savings that result from more informed consumers."

Hybridized public-private data

Free or low-cost online tools are empowering citizens to do more than donate money or blood: now they can donate time and expertise, or even act as sensors. In the United States, we saw a leading edge of this phenomenon in the Gulf of Mexico, where Oil Reporter, an open source oil spill reporting app, provided a prototype for data collection via smartphone. In Japan, an analogous effort called Safecast grew and matured in the wake of the nuclear disaster that resulted from a massive earthquake and subsequent tsunami in 2011.

Open source software and citizens acting as sensors have steadily been integrated into journalism over the past few years, most dramatically in the videos and pictures uploaded after the 2009 Iran election and during 2011's Arab Spring.

Citizen science looks like the next frontier. Safecast is combining open data collected by citizen science with academic, NGO and open government data (where available), and then making it widely available. It's similar to other projects where public data and experimental data are percolating.

Public data is a public good

Despite the myriad challenges presented by legitimate concerns about privacy, security, intellectual property and liability, the promise of more informed citizens is significant. McKinsey's 2011 report dubbed big data the next frontier for innovation, with billions of dollars of economic value yet to be created. When that innovation is applied on behalf of the public good, whether it's in city planning, transit, healthcare, government accountability or situational awareness, those effects will be extended.

We're entering the feedback economy, where dynamic feedback loops between customers and corporations, partners and providers, citizens and governments, or regulators and companies can drive efficiencies and build leaner, smarter governments.

The exabyte age will bring with it the twin challenges of information overload and overconsumption, both of which will require organizations of all sizes to use the emerging toolboxes for filtering, analysis and action. To create public good from public goods — the public sector data that governments collect, the private sector data that is being collected and the social data that we generate ourselves — we will need to collectively forge new compacts that honor existing laws and visionary agreements that enable the new data science to put the data to work.

Photo: NYTimes: 365/360 - 1984 (in color) by blprnt_van, on Flickr


Report from HIMSS: health care tries to leap the chasm from the average to the superb

I couldn't attend the session today on StealthVest--and little surprise. Who wouldn't want to come see an Arduino-based garment that can hold numerous health-monitoring devices in a way that is supposed to feel like a completely normal piece of clothing? As with many events at the HIMSS conference, which has registered over 35,000 people (at least four thousand more than last year), the StealthVest presentation drew an overflow crowd.

StealthVest sounds incredibly cool (and I may have another chance to report on it Thursday), but when I gave up on getting into the talk I walked downstairs to a session that sounds kind of boring but may actually be more significant: Practical Application of Control Theory to Improve Capacity in a Clinical Setting.

The speakers in this session, from Banner Gateway Medical Center in Gilbert, Arizona, laid out a fairly standard use of analytics to predict when hospital units are likely to exceed their capacity, and then to reschedule patients and adjust provider schedules to smooth out the curve. The basic idea comes from chemical engineering and requires them to monitor all the factors that lead patients to come into the hospital and that determine how long they stay. Queuing theory can show when things are likely to get tight. Hospitals care a lot about these workflow issues, as Fred Trotter and David Uhlman discuss in the O'Reilly book Meaningful Use and Beyond, and they have a real effect on patient care too.
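The session's control-theory machinery goes well beyond this, but the heart of the capacity question can be sketched in a few lines: treat admissions as Poisson arrivals, estimate the expected census with Little's law, and ask how often the unit will exceed its bed count. All numbers below are hypothetical.

    # Back-of-the-envelope capacity model: expected census = arrival
    # rate x average length of stay (Little's law), with a Poisson
    # approximation for the census distribution.
    import math

    def p_census_exceeds(beds, arrivals_per_day, avg_los_days):
        load = arrivals_per_day * avg_los_days  # expected census
        p_at_most = sum(math.exp(-load) * load**k / math.factorial(k)
                        for k in range(beds + 1))
        return 1 - p_at_most  # P(census > beds)

    print(p_census_exceeds(beds=30, arrivals_per_day=6.5, avg_los_days=4.2))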

The reason I find this topic interesting is that capacity planning leads fairly quickly to visible cost savings. So hospitals are likely to do it. Furthermore, once they go down the path of collecting long-term data and crunching it, they may extend the practice to clinical decision support, public health reporting, and other things that can make a big difference to patient care.

A few stats about data in U.S. health care

Do we need a big push to do such things? We sure do, and that's why meaningful use was introduced in the HITECH portion of the American Recovery and Reinvestment Act. HHS released mounds of government health data on Health.data.gov hoping to serve a similar purpose. Let's just take a look at how far the United States is from using its health data effectively.

  • Last November, a CompTIA survey (reported by Healthcare IT News) found that only 28% of providers have comprehensive EHRs in use, and another 17% have partial implementations. One has to remember that even a "comprehensive" EHR is unlikely to support the sophisticated data mining, information exchange, and process improvement that will eventually lead to lower costs and better care.

  • According to a recent Beacon Partners survey (PDF), half of the responding institutions have not yet set up an infrastructure for pursuing health information exchange, although 70% consider it a priority. The main problem, according to a HIMSS survey, is budget: HIEs are shockingly expensive. There's more to this story, which I reported on from a recent conference in Massachusetts.

Stats like these had to be kept in mind as HIMSS board chair Charlene S. Underwood extolled the organization's achievements in the morning keynote. HIMSS has promoted good causes, but only recently has it addressed the cost, interoperability, and open source issues that could allow health IT to spread beyond the elite of institutions large or sophisticated enough to adopt the right practices.

As signs of change, I am particularly happy to hear of HIMSS's new collaboration with Open Health Tools and its acquisition of the mHealth summit. These should guide the health care field toward more patient engagement and adaptable computer systems. HIEs are another area crying out for change.

An HIE optimist

With the flaccid figures for HIE adoption in mind, I met Charles Parisot, who chairs interoperability standards and testing for EHRA, HIMSS's Electronic Health Records Association. The biggest EHR vendors and HIEs come together in this association, and Parisot was brimming with positive stories about their advances.

His take on the cost of HIEs is that most of them just do it in a brute-force manner that doesn't work: they copy the data from each institution into a central database, which is hard to manage from many standpoints. The HIEs that have done it right (notably in New York state and parts of Tennessee) are sleek and low-cost. The solution involves the following (a sketch of the core idea appears after the list):

  • Keeping the data at the health care providers, and storing in the HIE only some glue data that associates the patient and the type of data to the provider.

  • Keeping all metadata about formats out of the HIE, so that new formats, new codes, and new types of data can easily be introduced into the system without recoding the HIE.

  • Breaking information exchange down into constituent parts--the data itself, the exchange protocols, identification, standards for encryption and integrity, etc.--and finding standard solutions for each of these.
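Here is a tiny sketch of that "glue data" idea: the HIE stores only pointers from patient and document type to the providers that hold the data, and requesters pull documents from the source. All identifiers and URLs are hypothetical; a real HIE adds patient-identity matching, consent, and auditing.

    # Record locator sketch: the HIE holds pointers, not clinical data.
    locator = {}  # (patient_id, doc_type) -> list of provider endpoints

    def register(patient_id, doc_type, provider_url):
        locator.setdefault((patient_id, doc_type), []).append(provider_url)

    def locate(patient_id, doc_type):
        # The requester then pulls the document from the provider directly.
        return locator.get((patient_id, doc_type), [])

    register("pt-123", "summary_of_care", "https://hospital-a.example/docs")
    register("pt-123", "lab_results", "https://lab-b.example/api")
    print(locate("pt-123", "summary_of_care"))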

So EHRA has developed profiles (also known by their ONC term, implementation specifications) that indicate which standard is used for each part of the data exchange. Metadata can be stored in the core HL7 document, the Clinical Document Architecture, and differences between vendors' implementations of HL7 documents can also be documented.

A view of different architectures in their approach can be found in an EHRA white paper, Supporting a Robust Health Information Exchange Strategy with a Pragmatic Transport Framework. As testament to their success, Parisot claimed that the interoperability lab (a huge part of the exhibit hall floor space, and a popular destination for attendees) could set up the software connecting all the vendors' and HIEs' systems in one hour.

I asked him about the simple email solution promised by the government's Direct project, and whether that may be the path forward for small, cash-strapped providers. He accepted that Direct is part of the solution, but warned that it doesn't make things so simple. Unless two providers have a pre-existing relationship, they need to be part of a directory or even a set of federated directories, and assure their identities through digital signatures.

And what if a large hospital receives hundreds of email messages a day from various doctors who don't even know to whom their patients are being referred? Parisot says metadata must accompany any communications--and he's found that it's more effective for institutions to pull the data they want than for referring physicians to push it.

Intelligence for hospitals

Finally, Parisot told me EHRA has developed standards for submitting data to EHRs from 350 types of devices, and has 50 manufacturers working on devices with these standards. I visited the booth of iSirona as an example. They accept basic monitoring data, such as pulses, from different systems that use different formats, and translate over 50 items of information into a simple text format that they transmit to an EHR. They also add networking to devices that communicate only over cables. Outlying values can be rejected by a person monitoring the data. The vendor pointed out that format translation will be necessary for some time to come, because neither vendors nor hospitals will replace their devices simply to implement a new data transfer protocol.
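A sketch of what such a translation layer does, with made-up vendor field names and plausibility limits; the actual product covers far more signals, formats and transports.

    # Map vendor-specific readings onto one simple record and flag
    # outliers for human review. Names and limits are hypothetical.
    PULSE_RANGE = (20, 250)  # plausible beats per minute

    def normalize(vendor, raw):
        if vendor == "acme":        # each vendor names fields differently
            rec = {"pulse": raw["hr"], "spo2": raw["oxsat"]}
        elif vendor == "globex":
            rec = {"pulse": raw["pulse_bpm"], "spo2": raw["spo2_pct"]}
        else:
            raise ValueError(f"unknown vendor: {vendor}")
        lo, hi = PULSE_RANGE
        rec["needs_review"] = not (lo <= rec["pulse"] <= hi)
        return rec

    print(normalize("acme", {"hr": 310, "oxsat": 97}))  # flagged for review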

For more about devices, I dropped by one of the most entertaining parts of the conference, the Intelligent Hospital Pavilion. Here, after a badge scan, you are somberly led through a series of locked doors into simulated hospital rooms where you get to watch actors in nursing outfits work with lifesize dolls and check innumerable monitors. I think the information overload is barely ameliorated and may be worsened by the arrays of constantly updated screens.

But the background presentation is persuasive: by attaching RFIDs and all sorts of other devices to everything from people to equipment, and basically making the hospital more like a factory, providers can radically speed up responses in emergency situations and reduce errors. Some devices use the ISM "junk" band, whereas more critical ones use dedicated spectrum. Redundancy is built in throughout the background servers.

Waiting for the main event

The US health care field held its breath most of last week, waiting for Stage 2 meaningful use guidelines from HHS. The announcement never came, nor did it come this morning as many people had hoped. Because meaningful use is the major theme of HIMSS, and many sessions were planned on helping providers move to Stage 2, the delay in the announcement put the conference in an awkward position.

HIMSS is also frustrated by a delay in another initiative: the adoption of a new standard for the classification of diseases and procedures. ICD-10 is actually pretty old, having been standardized in the 1980s, and the U.S. lags decades behind other countries in adopting it. Advantages touted for ICD-10 are:

  • It incorporates newer discoveries in medicine than the dominant standard in the U.S., ICD-9, and therefore permits better disease tracking and treatment.

  • Additionally, it's much more detailed than ICD-9 (with an order of magnitude more classifications). This allows the recording of more information, but complicates the job of classifying a patient correctly. (A toy illustration of that fan-out appears after this list.)
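A toy illustration of that fan-out, using one commonly cited pair of examples. The codes are for illustration only; real conversions go through the CMS General Equivalence Mappings (GEMs).

    # One ICD-9 code can map to several more specific ICD-10 codes.
    # Illustrative slice only, not an authoritative crosswalk.
    icd9_to_icd10 = {
        "250.00": ["E11.9"],                 # type 2 diabetes, uncomplicated
        "813.41": ["S52.531A", "S52.532A"],  # Colles' fracture, now split by side
    }

    for icd9, candidates in icd9_to_icd10.items():
        print(f"{icd9} -> {len(candidates)} ICD-10 candidate(s): {candidates}")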

ICD-10 is rather controversial. Some people would prefer to base clinical decisions on SNOMED, a standard described in the Meaningful Use and Beyond book mentioned earlier. Ultimately, doctors lobbied hard against the HHS timeline for adopting ICD-10 because providers are so busy with meaningful use. (But of course, the goals of adopting meaningful use are closely tied to the goals of adopting ICD-10.) It was the pushback from these institutions that led HHS to accede and announce a delay. HIMSS and many of its members were disappointed by the delay.

In addition, there is an upcoming standard, ICD-11, whose sandal some say ICD-10 is not even worthy to lace. A strong suggestion that the industry just move to ICD-11 was aired in Government Health IT, and the possibility was raised in Healthcare IT News as well. Besides reflecting the newest knowledge about disease, ICD-11 is praised for its interaction with SNOMED and its use of Semantic Web technology.

That last point makes me a bit worried. The Semantic Web has not been widely adopted, and if people in the health IT field think ICD-10 is complex, how are they going to deal with drawing up and following relationships through OWL? I plan to learn more about ICD-11 at the conference.

February 21 2012

HIMSS asks: Who is Biz Stone and what is Twitter?


Today, one of the founders of Twitter, Biz Stone, gave the opening keynote at HIMSS.

This is probably going to be the best keynote at HIMSS, followed by a speech from Dr. Farzad Mostashari, which will also be excellent. It goes downhill after that: there will be a talk about politics and another talk from an "explorer." I am sure those will be great talks, but when I go to HIMSS, I want to hear about health information technology. Want to know what @biz actually said? As usual, Twitter itself provides an instant summary.

HIMSS stands for the Healthcare Information and Management Systems Society. The annual HIMSS conference is the largest health IT gathering on the planet. Almost 40,000 people will show up to discuss healthcare information systems. Many of them will be individuals sent by their hospitals to try to find out what solutions they will need to purchase in order to meet meaningful use requirements. But many of the attendees are old-school health IT experts who have spent entire careers trying to bring technology into a healthcare system that has resisted computerization tooth and nail. This year will likely break all kinds of attendance records for HIMSS. Rightly so: The value of connecting thousands of health IT experts with tens of thousands who are seeking health IT experts has never been higher.

It is ironic that Biz Stone is keynoting this year's conference, because Twitter has changed the health IT game so substantially. I say Twitter specifically, and not "social media" generally. I do not think Facebook or Google+ or your social media of choice has had nearly the impact that Twitter has had on healthcare communications.

HIMSS, and in many cases traditional health IT along with it, is experiencing something of a whirlwind. One force adding wind has been the fact that President Obama has funded EHR systems through meaningful use, and has made it clear that the future of healthcare funding will center on Accountable Care Organizations (ACOs), which are paid to keep people healthy rather than to cover procedures when they are sick. It is hard to overstate the importance of this. Meaningful Use and ACOs will do more to computerize medicine in five years than the previous 50 years managed without these incentive changes.

But in the same breath, we must admit that the healthcare system as a whole is strained and unable to meet the needs of millions of its patients. The new force in healthcare is peer-to-peer medicine. There are really only a few things that doctors provide to patients: treatment, facts, and context for those facts. More and more, patients are seeking those facts, and the context for them, from the Internet generally and from other patients specifically. This can be dangerous, but done correctly it can be revolutionary.

It's not rocket science really; our culture has changed. Baby boomers still wonder if it is OK to discuss sexual issues in polite company. Their kids blog about their vasectomies. It's not just that we blog about vasectomies. We read blogs about vasectomies and consider it normal.

Someday, I will decide whether or not I should get a vasectomy. (I would like to have kids first.) When I make that decision, I might just give @johnbiggs a shout and ask him how it's going. He might not have time to answer me. But some vasectomy patient somewhere will have the time to tell me what it is like. Some epatient will be willing to spend an hour talking to me about what it meant to them to have this procedure. I can talk with patients who had a good experience, and I can talk to patients who had a bad one. I will have access to insights that my urologist does not have, and most importantly, does not have time to discuss with me in any case.

For whatever reason, the epatient community centers around Twitter. More than likely this is because of the fundamentally open nature of this network. Although it is possible to "protect" tweets, most account holders tend to tweet to the whole world. If you are interested in a particular health-related issue, you can use Twitter to find the group of people who are discussing that issue. Twitter is a natural way for people who are connected by a common thought or issue to organize. Facebook, on the other hand, is about connecting with people you already know. The famous quote applies: "Facebook is about people you used to know; Twitter is about people you'd like to know better." You could change that quote to read "Twitter is about people you'd like to know who have had vasectomies."

There are people on Twitter right now discussing very personal health issues. All it takes to experience this is a little research into which hashtag a community uses to connect with each other; a sketch of that research step follows below.

The most striking examples involve diseases that are not easy to discuss in person. Discussion of these delicate issues between people dealing with these problems happens all the time on Twitter. Very often Twitter is the place to find and meet people who are dealing with the same healthcare issues that you are, and then discover another place on the web where patients with similar conditions are gathering and helping each other. For better or worse, Twitter has become a kind of peer-to-peer healthcare marketplace. I think this is about a billion times more interesting than surgeons who update families via Twitter, although that is cool, too.
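That research step can be as simple as counting hashtags in a sample of tweets about a condition. The tweets below are invented; a real version would pull a sample from Twitter's search interface.

    # Count hashtag use in a sample of tweets to find the tag a
    # community actually organizes around. Sample tweets are made up.
    from collections import Counter

    sample_tweets = [
        "anyone else newly diagnosed? #rheum support welcome",
        "flare week. #rheum #spoonie",
        "great chat tonight #hcsm #rheum",
    ]

    tags = Counter(word.lstrip("#").lower()
                   for tweet in sample_tweets
                   for word in tweet.split() if word.startswith("#"))

    print(tags.most_common(3))  # the community's home hashtag rises to the top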

At Health 2.0 or the OSCON healthcare track, these kinds of insights are regarded as somewhat obvious. It is obvious that patients are seeking each other out using social media technologies and that this must somehow eventually be reconciled with the process that doctors are just undertaking to computerize medicine. But at HIMSS this is a revolutionary idea. HIMSS is full of old-school EHR vendors who are applying technology that was cutting edge in 1995 to 2012 problems. HIMSS is full of hospital administrators who recognize that their biggest barrier to meaningful use dollars is not an EHR, but the fact that 50% of their nurses do not know how to type.

I can promise you that the following conversation will be happening thousands of times in the main hall at HIMSS before Biz Stone speaks:

Attendee 1: Who is this speaking?

Attendee 2: Biz Stone.

Attendee 1: Who is that?

Attendee 2: One of the founders of Twitter.

Attendee 1: What is Twitter?

For this audience, Biz Stone talking about how Twitter revolutionizes healthcare will be electric. I wish I could be there.

Meaningful Use and Beyond: A Guide for IT Staff in Health Care — Meaningful Use underlies a major federal incentive program for medical offices and hospitals that pays doctors and clinicians to move to electronic health records (EHR). This book is a Rosetta Stone for the IT implementer who wants to help organizations harness EHR systems.


Building the health information infrastructure for the modern epatient

To learn more about what levers the government is pulling to catalyze innovation in the healthcare system, I turned to Dr. Farzad Mostashari (@Farzad_ONC). As the National Coordinator for Health IT, Mostashari is one of the most important public officials entrusted with improving the nation's healthcare system through smarter use of technology.

Mostashari, a public-health informatics specialist, was named ONC chief in April 2011, replacing Dr. David Blumenthal. Mostashari's full biography, available at HHS.gov, notes that he "was one of the lead investigators in the outbreaks of West Nile Virus and anthrax in New York City, and was among the first developers of real-time electronic disease surveillance systems nationwide."

I talked to Mostashari on the same day that he published a look back over 2011, which he hailed as a year of momentous progress in health information technology. Our interview follows.

What excites you about your work? What trends matter here?

Farzad Mostashari: Well, it's a really fun job. It feels like this is the ideal time for this health IT revolution to tie into other massive megatrends that are happening around consumer and patient empowerment, payment and delivery reform, as I talked about in my TEDMED talk with Aneesh Chopra.

These three streams [how patients are cared for, how care is paid for, and how people take care of their own health] coming together feels great. And it really feels like we're making amazing progress.

How does what's happening today grow out of the passage of the Health Information Technology for Economic and Clinical Health (HITECH) Act in 2009?

Farzad Mostashari‏: HITECH was a key part of ARRA, the American Recovery and Reinvestment Act. This is the reinvestment part. People think of roadways and runways and railways. This is the information infrastructure for healthcare.

In the past two years, we made as much progress on adoption as we had made in the past 20 years before that. We doubled the adoption of electronic health records in physician offices between the time the stimulus passed and now. What that says is that a large number of barriers have been addressed, including the financial barriers that are addressed by the health IT incentive payments.

It also, I think, points to the innovation that's happening in the health IT marketplace, with more products that people want to buy and want to use, and an explosion in the number of options people have.

The programs we put in place, like the Regional Health IT Extension Centers modeled after the Agriculture Extension program, give a helping hand. There are local nonprofits throughout the country working with one-third of all primary care providers, particularly smaller practices, health centers, and critical access hospitals, to help them adopt electronic health records.

This is obviously a big lift and a big change for medicine. It moves at what Jay Walker called "med speed," not tech speed. The pace of transformation in medicine that's happening right now may be unparalleled. It's a good thing.

Healthcare providers have a number of options as they adopt electronic health records. How do you think about the choice between open source versus proprietary options?

Farzad Mostashari‏: We're pretty agnostic in terms of the technology and the business model. What matters are the outcomes. We've really left the decisions about what technology to use to the people who have to live with it, like the doctors and hospitals who make the purchases.

There are definitely some very successful models, not only on the EHR side, but also on the health information exchange side.

(Note: For more on this subject, read Brian Ahier's Radar post on the Health Internet.)

What role do open standards play in the future of healthcare?

Farzad Mostashari‏: We are passionate believers in open standards. We think that everybody should be using them. We've gotten really great participation by vendors of open source and proprietary software, in terms of participating in an open standards development process.

I think what we've enabled, through things like modular certification, is a lot more innovation. Different pieces of the entire ecosystem can be filled in because we've reduced the barrier to entry, enabling a variety of innovative startups to come to the field. What we're seeing is that, a lot of the time, this is migrating from installed software to web services.

If we're setting up a reference implementation of the standards, like the Connect software or popHealth, we do it through a process where the result is open source. I think the government as a platform approach at the Veterans Affairs department, DoD, and so forth is tremendously important.

How is the mobile revolution changing healthcare?

Farzad Mostashari: We had Jay Walker talking about big change [at a recent ONC Grantee Meeting]. I just have this indelible image of him waving in his left hand a clay cone with cuneiform on it from 2,000 B.C. — 4,000 years ago — and holding in his right hand his iPhone.

He was saying both of them represented the cutting edge of technology that evolved to meet consumer need. His strong assertion was that this is absolutely going to revolutionize what happens in medicine at tech speed. Again, not "med speed."

I had an experience at my clinic, where I get care, with the pharmacist in the starched white coat behind the counter telling me that I should take this medicine at night.

And I said, "Well, it's easier for me to take it in the morning." And he said, "Well, it works better at night."

And I asked, acting as an empowered patient, "Well, what's the half-life?" And he answered, "Okay. Let me look it up."

He started clacking away at his pharmacy information system; clickity clack, clickity clack. I can't see what he's doing. And then he says, "Ah hell," and he pulls out his smartphone and Googles it.

There's now a democratization of information and information tools, where we're pushing the analytics to the cloud. Being able to put that in the hand of not just every doctor or every healthcare provider but every patient is absolutely going to be that third strand of the DNA, putting us on the right path for getting healthcare that results in health.

We're making sure that people know they have a right to get their own data, making sure that the policies are aligned with that. We're making sure that we make it easy for doctors to give patients their own information through things like the Direct Project, the Blue Button, meaningful use requirements, or the Consumer E-Health Pledge.

We have more than 250 organizations, collectively holding data for 100 million Americans, that have pledged to make it easy for people to get electronic copies of their own data.

Do you think people will take ownership of their personal health data and engage in what Susannah Fox has described as "peer-to-peer healthcare"?

Farzad Mostashari: I think that it will be not just possible, not even just okay, but actually encouraged for patients to be engaged in their care as partners. Let the e-patient help. I think we're going to see that emerge as there's more access and more tools for people to do stuff with their data once they get it, through things like the health data initiative. We're also beginning to work with stakeholder groups, like Consumers Union, the American Nurses Association, and some of the disease groups, to change attitudes about asking for your own records.

This interview was edited and condensed. Photo from The Office of the National Coordinator for Health Information Technology.


February 10 2012

Preview of HIMSS 2012

I am very happy to be attending the Healthcare Information and Management Systems Society (HIMSS) conference this year. We are at a pivotal moment in the history of healthcare in this country, and health IT is playing a very prominent role. This will be one of the most important healthcare conferences of the year. If you can't make it to Las Vegas in person, there are opportunities to attend virtually; just go to himssvirtual.org for more information.

I will be moderating panel presentations at the HIMSS Social Media Center on Tuesday and Wednesday. This year I expect social media to have a much larger presence at the conference, and the new location for the pavilion will put it front and center. Since the keynote this year comes from one of the founders of Twitter, Biz Stone, I'm sure there will be a social media flavor throughout the event.

I will also be participating in the brand-new eCollaboration Forum at HIMSS on Thursday. The Collaborative Health Consortium has partnered with HIMSS to sponsor a new, exclusive event at the conference focused on the shift to collaborative care platforms as foundations for the transformation to accountable care. Attendees will learn what a collaborative healthcare platform is and why the healthcare industry needs one, discover paths to effectively implement collaborative technologies, and get further resources to help evaluate the solutions available in the shift toward an accountable care model.

I am honored to be moderating a panel with David C. Kibbe, MD, MBA, senior advisor at the American Academy of Family Physicians; Jonathan Hare, chairman of Resilient Network Systems; and Scott Rea, vice president of GOV/EDU relations and senior PKI architect at DigiCert.

Our session, "Developing Trust in the Health Internet as a Platform," will focus on the tools, technologies and rules we must decide upon to establish trust in the Internet as the platform for healthcare. Effective health information exchange of any resource requires deep trust, following from the right architecture and the right rules. We will discuss efforts like DirectTrust.org and the EHR/HIE Interoperability Workgroup as conveners that are creating a community to move us forward.

My fellow Radar blogger Andy Oram will also be on hand to provide context and his own unique perspective (as well as keep me focused on what matters).


February 06 2012

Small Massachusetts HIT conference returns to big issues in health care

I've come to look forward to the Massachusetts Health Data Consortium's annual HIT conference because--although speakers tout the very real and impressive progress made by Massachusetts health providers--you can also hear acerbic and ruthlessly candid critiques of policy and the status quo. Two notable take-aways from last year's conference (which I wrote up at the time) were the equivalence of old "managed care" to new "accountable care organizations" and the complaint that electronic health records were "too expensive, too hard to use, and too disruptive to workflow." I'll return to these claims later.

The sticking point: health information exchange

This year, the spears were lobbed by Ashish Jha of Harvard Medical School, who laid out a broad overview of progress since the release of meaningful use criteria and then accused health care providers of undermining one of the program's main goals: the exchange of data between different providers who care for the same patient. Through quantitative research (publication in progress), Jha's team showed a correlation between fear of competition and low adoption of health information exchanges (HIEs). Hospitals with a larger, more secure position in their markets, or in more concentrated markets, were more likely to join an HIE.

The research bolsters Jha's claim that the commonly cited barriers to using HIEs (technical challenges, cost, and privacy concerns) are surmountable, and that the real problem is a refusal to join because a provider fears that patients would migrate to other providers. It seems to me that the government and public can demand better from providers, but simply cracking the whip may be ineffective. Nor should it be necessary. An urgent shortage of medical care exists everywhere in the country, except perhaps a few posh neighborhoods. There's plenty of demand for all providers. Once insurance is provided to all the people in need, no institution should need to fear a lack of business, unless its performance record is dismal.

Jha also put up some research showing a strong trend toward adopting electronic health records, although the small offices that deliver half the treatment in the United States are still left behind. He warned that to see big benefits, we need to bring in health care institutions that currently get little attention from the government--nursing homes, rehab facilities, and so forth--and give them incentives to digitize. He wrapped up by quoting David Blumenthal, former head of the ONC, on the subject of HIEs. Blumenthal predicted that we'd see EHRs in most providers over the next few years, and that the real battle would be getting them to adopt health information exchange.

Meanwhile, meaningful use could trigger a shake-out in the EHR industry, as vendors who have spent years building siloed products fail to meet the Stage 2 requirements that fulfill the highest aspirations of the HITECH Act that defined meaningful use, including health information exchange. At the same time, a small but steadily increasing number of open source projects have achieved meaningful use certification. So we'll see more advances in the adoption of both EHRs and HIEs.

Low-hanging fruit signals a new path for cost savings

The big achievement in Massachusetts, going into the conference today, was a recent agreement between the state's major insurer, Blue Cross Blue Shield, and the 800-pound gorilla of the state's health care market, Partners HealthCare System. The pact significantly slows the skyrocketing cost growth that we've all become accustomed to in the United States, through the adoption of global payments (that is, fixed reimbursements for treating patients in certain categories). That two institutions of such weight can relinquish the old, imprisoning system of fee-for-service is news indeed.

Note that the Blue Cross/Partners agreement doesn't even involve the formation of an Accountable Care Organization. Presumably, Partners believes it can pick some low-hanging fruit through modest advances in efficiency. Cost savings you can really count on will come from ACOs, where total care of the patient is streamlined through better transfers of care and intensive communication. Patient-centered medical homes can do even more. So an ACO is actually much smarter than old managed care. But it depends on collecting good data and using it right.

The current deal is an important affirmation of the path Massachusetts took long before the rest of the country in aiming for universal health coverage. We all knew at the time that the Massachusetts bill did not address costs and that these would have to be tackled eventually. And at first, of course, health premiums went up because a huge number of new people were added to the rolls, and many of them were either sick or part of high-risk populations.

The cost problem is now being addressed through administrative pressure (at one point, Governor Deval Patrick flatly denied a large increase requested by insurers), proposed laws, and sincere efforts at the private level such as the Blue Cross/Partners deal. I asked a member of the Patrick administration whether the problem could be solved without a new law, and he expressed the opinion that there's a good chance it could be. Steven Fox of Blue Cross Blue Shield said that 70% of their HMO members go to physicians in their Alternative Quality Network, which features global payments. And he said these members have better outcomes at lower costs.

ACOs have a paradoxical effect on health information exchange

Jha predicted that ACOs, while greatly streamlining exchanges among their own member organizations (because those exchanges save money), will resist exchanging data with outside providers, because keeping patients is even more important for ACOs than for traditional hospitals and clinics. Only by keeping a patient can the ACO reap the benefits of the investments it makes in long-term patient health.

As Doris Mitchell received an award for her work with the MHDC, executive director Ray Campbell mentioned the rapid growth and new responsibilities of her agency, the Group Insurance Commission, which negotiates all health insurance coverage for state employees and, increasingly, for municipal employees as cities and towns transfer their workers to it. A highly contentious bill last year that allowed municipalities to make that transfer was widely interpreted as a blow against unionized workers, when it was actually just a ploy to save money through the familiar gambit of combining the insured into a larger pool. I covered this controversy at the time.

A low-key conference

Attendance was down at this year's conference, with about half as many attendees and vendors as last year. The lowered interest also seemed reflected in the fact that none of the three CEOs receiving awards turned up to represent their institutions (the two institutions mentioned earlier for their historic cost-cutting deal--Blue Cross Blue Shield and Partners HealthCare--along with Steward Health Care).

The morning started with a thoughtful look at the requirements for ACOs by Frank Ingari of Essence Healthcare, who predicted a big rise in investment by health care institutions in their IT departments. Later speakers echoed this theme, saying that hospitals should invest less in state-of-the-art equipment that leads to immediately billable activities, and more in the underlying IT that will allow them to collect research data and cut down waste. Some of the benefits available through this research were covered in a talk at the Open Source convention a couple years ago.

Another intriguing session covered technologies available today that could be more widely adopted to improve health care. Videos of robots always draw an enthusiastic response, but a more significant innovation ultimately may be a database McKesson is developing that lets doctors evaluate genetic tests and decide when such tests are worth the money and trouble.

The dozen vendors were joined by a non-profit, Sustainable Healthcare for Haiti. Their first project is one of the most basic health interventions one can make: providing wells for drinkable water. They have a local sponsor who can manage their relationship with the government, and an ambitious mission that includes job development, an outpatient clinic, and an acute care children's hospital.

January 25 2012

AI will eventually drive healthcare, but not anytime soon

TechCrunch recently published a guest post from Vinod Khosla with the headline "Do We Need Doctors or Algorithms?" Khosla is an investor and engineer, but he is a little out of his depth in some of his conclusions about health IT.

Let me concede and endorse his main point that doctors will become bionic clinicians by teaming with smart algorithms. He is also right that eventually the best doctors will be artificial intelligence (AI) systems — software minds rather than human minds.

That said, I disagree with Khosla on almost all of the details. Khosla has accidentally embraced a perspective that too many engineers and software guys bring to health IT.

Bear with me — I am the guy trying to write the "House M.D." AI algorithms that Khosla wants. It's harder than he thinks because of two main problems that he's not considering: The search space problem and the good data problem.

The search space problem

Any person even reasonably informed about AI knows about Go, an ancient game with simple rules. Those simple rules hide the fact that Go is a very complex game indeed. For a computer, it is much harder to play than chess.

Almost since the dawn of computing, chess was regarded as something that required intelligence and was therefore a good test of AI. In 1997, the world chess champion was beaten by a computer. The year after, a professional Go player beat the best Go software in the world with a 25-stone handicap. Artificial intelligence experts study Go carefully precisely because it is so hard for computers. The approach that computers take toward being smart — thinking of lots of options really fast — stops working when the number of options skyrockets and the number of potentially right answers also becomes enormous. Most significantly, Go can always be made more computationally difficult by simply expanding the board.
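
To make that concrete, here is a back-of-the-envelope sketch in Python. The branching factors are rough, commonly cited averages (about 35 legal moves per chess position, about 250 per position on a 19x19 Go board), so treat the numbers as illustrative only:

# Rough comparison of full-width search spaces in chess and Go.
# Branching factors are commonly cited approximations, not exact values.
CHESS_BRANCHING = 35   # average legal moves per chess position
GO_BRANCHING = 250     # average legal moves per 19x19 Go position

def leaf_positions(branching: int, plies: int) -> int:
    """Positions a naive full-width search examines to look `plies` moves ahead."""
    return branching ** plies

for plies in (2, 4, 8):
    chess = leaf_positions(CHESS_BRANCHING, plies)
    go = leaf_positions(GO_BRANCHING, plies)
    print(f"{plies} plies: chess ~{chess:.1e}, Go ~{go:.1e}")

Eight plies ahead, the naive chess search already faces trillions of positions, and the Go search faces millions of times more. Expanding the board, or lengthening the list of candidate diseases, raises the base of that exponent, which is why strong Go programs (and any plausible diagnostician) must prune aggressively rather than enumerate.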

Make no mistake, the diagnosis and treatment of human illness is like Go. It's not like chess. Khosla is making a classic AI mistake, presuming that because he can discern the rules easily, it means the game is simple. Chess has far more complex rules than Go, but it ends up being a simpler game for computers to play.

To be great at Go, software must learn to ignore possibilities, rather than searching through them. In short, it must develop "Go instincts." The same is true for any software that could claim to be a diagnostician.

How can you tell when software diagnosticians are having search problems? When they cannot tell the difference between all of the "right" answers to a particular problem. The average doctor does not need to be told "could it be Zebra Fever?" by a computer that cannot tell it should have ignored any zebra-related possibilities because the patient is nowhere near Africa. (No zebras were harmed in the writing of this article, and I do not believe there is a real disease called Zebra Fever.)

The good data problem

The second problem is the good data problem, which is what I spend most of my time working on.

Almost every time I get over-excited about the Direct Project or other health data exchange progress, my co-author David Uhlman brings me back to earth:

What good is it to have your lab results transferred from hospital A to hospital B using secure SMTP and XML? They are going to re-do the labs anyway because they don't trust the other lab.

While I still have hope for health information exchange in the long term, David is right in the short term. Healthcare data is not remotely solid or trustworthy. A good majority of the time, it is total crap. The reason that doctors insist on having labs done locally is not because they don't trust the competitor's lab; it's more of a "devil that you know" effect. They do not trust their own labs either, but they have a better understanding of how and when their own labs screw up. That is not a good environment for medical AI to blossom.

The simple reality is that doctors have good reason to be dubious about the contents of an EHR record--not least because the codes entered there are often not diagnostically helpful or valid.

Non-healthcare geeks presume that the dictionaries and ontologies used to encode healthcare data are automatically valid. In fact, a safer assumption is that ontologies consistently lead to dangerous diagnostic practices, as they shepherd clinicians into choosing a label for a condition rather than a true diagnosis. Once a patient's chart has a given label, either for diagnosis or for treatment, it can be very difficult to reassess that patient effectively. There is even a name for this problem: clinical inertia. Clinical inertia is an issue with or without computer software involved, but it is very easy for an ontology of diseases and treatments to make clinical inertia worse. The fact is, medical ontologies must be constantly policed to ensure that they do not make things worse, rather than better.

It simply does not matter how good the AI algorithm is if your healthcare data is both incorrect and described with a faulty healthcare ontology. My personal experiences with health data on a wide scale? It's like having a conversation with a habitual liar who has a speech impediment.
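
To get a feel for what working on the good data problem means in practice, consider that it starts with dumb plausibility screens long before any clever inference. Here is a minimal sketch; the field names and reference ranges are hypothetical, not taken from any real EHR schema:

# Minimal plausibility screen for lab records before any "AI" sees them.
# Field names and plausible ranges are hypothetical, for illustration only.
from datetime import datetime, timedelta

PLAUSIBLE_RANGES = {
    "serum_potassium_mmol_l": (1.5, 9.0),  # values outside these bounds are
    "heart_rate_bpm": (20.0, 250.0),       # almost certainly entry errors
}

def screen_lab_record(record: dict) -> list:
    """Return reasons to distrust this record; an empty list means plausible."""
    problems = []
    bounds = PLAUSIBLE_RANGES.get(record["test"])
    if bounds and not (bounds[0] <= record["value"] <= bounds[1]):
        problems.append(f"{record['test']}={record['value']} outside plausible range")
    if datetime.now() - record["taken_at"] > timedelta(days=90):
        problems.append("result is stale; a careful clinician would re-test")
    return problems

suspect = {
    "test": "serum_potassium_mmol_l",
    "value": 41.0,  # probably 4.1 with a slipped decimal point
    "taken_at": datetime.now() - timedelta(days=2),
}
print(screen_lab_record(suspect))  # flags the impossible potassium value

Nothing in that sketch is intelligent, which is exactly the point: until screens like this pass quietly on most records, a diagnostic algorithm is training on noise.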

So Khosla is not "wrong" per se; he's just focused on solving the wrong parts of the problem. As a result, his estimates of when certain things will happen are pretty far off.

I believe that we will not have really good diagnostic software until after the singularity and until after we can ensure that healthcare data is reliable. I actually spend most of my time on the second problem, which is really a sociological problem rather than a technology problem.

Imagine if we had a "House AI" before we were able to feed it reliable data. Ironically, it would be very much like the character on TV: constantly annoyed that everyone around him keeps screwing up and getting in his way.

Anyone who has seen the show knows that the House character is constantly trying to convince the other characters that the patients are lying. The reality is that the best diagnosticians typically assume that the chart is lying before they assume that the patient is lying. With notable exceptions, the typical patient is highly motivated to get a good diagnosis and is, therefore, honest. The chart, on the other hand, be it paper or digital, has no motivation whatsoever, and it will happily mix in false lab reports and record inane diagnoses from previous visits.

The average doctor doubts the patient chart but trusts the patient story. For the foreseeable future, that is going to work much better than an algorithmically focused approach.

Eventually, Khosla's version of the future (which is typical of forward-thinking geeks in health IT) will certainly happen, but I think it is still 30 years away. The technology will be ready far earlier. Our screwed up incentive systems and backward corporate politics will be holding us back. I hardly have to make this argument, however, since Hugo Campos recently made it so well.

Eventually, people will get better care from AI. For now, we should keep the algorithms focused on the data that we know is good and keep the doctors focused on the patients. We should be worried about making patient data accurate and reliable.

I promise you we will have the AI problem finished long before we have healthcare data that is reliable enough to train it.

Until that happens, imagine how Watson would have performed on "Jeopardy" if it had been trained on "Lord of the Rings" and "The Cat in the Hat" instead of encyclopedias. Until we have healthcare data that is more reliable than "The Cat in the Hat," I will keep my doctor, and you can keep your algorithms, thank you very much.


January 16 2012

Medical imaging in the cloud: a conversation about eMix

Over the past few weeks I've been talking to staff at DR Systems about medical imaging in the cloud. DR Systems boasts of offering the first cloud solution for sharing medical images, called eMix. According to my contact Michael Trambert, Lead Radiologist for PACS Reengineering for the Cottage Health System and Sansum Clinic in Santa Barbara, California, eMix started off by offering storage for both images and the reports generated by radiologists, cardiologists, and other imaging specialists. It then expanded to include other medical records in HL7, CDA, CCR, PDF, RTF, and plain text formats. It is vendor neutral, thanks to DICOM (a standard that covers both images and reports) and HL7.
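
To give a flavor of what "vendor neutral, thanks to DICOM" means in practice, here is a minimal sketch using the open-source pydicom library to read the standardized metadata that any conformant system can extract. This illustrates the DICOM standard itself, not eMix's internals, and the file path is hypothetical:

# Read vendor-neutral metadata from a DICOM file with the open-source
# pydicom library. Illustrates DICOM itself, not eMix's internals.
import pydicom

ds = pydicom.dcmread("study/image_001.dcm")  # hypothetical file path

# These are standard DICOM data elements, so any conformant viewer or
# archive can read them no matter which vendor's scanner wrote the file.
print("Patient ID: ", ds.PatientID)
print("Modality:   ", ds.Modality)   # e.g., CT, MR, US
print("Study date: ", ds.StudyDate)
print("Institution:", ds.get("InstitutionName", "<not recorded>"))

Because every vendor reads and writes the same standard data elements, a vendor-neutral service can accept a file from one institution's scanner and hand it to a competitor's viewer without translation.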

First, a bit of background (some of which I offered in an earlier posting). In the U.S., an estimated 30 billion dollars is currently wasted each year on re-imaging that could be avoided. Beyond cost, there are other reasons to cut down on images: many systems expose patients to small amounts of radiation that pose a cumulative risk over time, and in an emergency it's better to reuse a recent image than to waste time taking another.

The situation was brought home by a conversation I had with CIO Chuck Christian of Vincennes, Indiana's Good Samaritan Hospital, a customer of eMix. Patients are often tasked with carrying their own images around (originally as print-outs, and more recently as CD-ROMs). These things often get misplaced, or the CDs turn out to be corrupt or incompatible with the receiving IT system. It's a situation crying out for networked transfer, but HIPAA requires careful attention to security and privacy.

eMix is currently used by about 300 sites, most in the US, and a few in Europe. Uses include remote consulting and sending an eMix image and report "package" to an emergency treatment center ahead of the patient. The eMix package has a built-in viewing capability, so the recipient needs nothing beyond a web browser. Data is protected by encryption on the eMix site and through SSL during transmission.

Sharing is so easy that according to eMix General Manager Florent Saint-Clair, the chief privacy risk in eMix is user error. A sender may type in the wrong email address or accede to a request for an image without ensuring that the recipient is properly authorized to receive it.

This will be an issue with the Direct project, too, when that enters common use. The Direct project will allow the exchange of data over email, but because most doctors' email accounts are not currently secure, eMix just uses email to notify a recipient that an image is ready. Everything else takes place over the Web. The company stresses a number of measures they take to ensure security: for instance, data is always deleted after 30 days, physical security is robust, and storage is on redundant servers.
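
The pattern described here, where email carries only a notification and the data stays behind a web link that expires, is easy to sketch. A simplified illustration in Python, not eMix's actual implementation:

# Simplified sketch of the "notify by email, deliver over the web" pattern
# described above -- an illustration only, not eMix's actual implementation.
import secrets
from datetime import datetime, timedelta

packages = {}  # token -> (payload, expiry); stands in for encrypted storage

def share_package(payload: bytes, recipient_email: str) -> str:
    token = secrets.token_urlsafe(32)             # unguessable link component
    expiry = datetime.now() + timedelta(days=30)  # data deleted after 30 days
    packages[token] = (payload, expiry)
    # The email contains only a notification; the data stays on the server.
    print(f"mail to {recipient_email}: your study is ready at /view/{token[:8]}...")
    return token

def fetch_package(token: str) -> bytes:
    payload, expiry = packages[token]
    if datetime.now() > expiry:
        del packages[token]  # enforce the retention window
        raise KeyError("package expired and was deleted")
    return payload  # a real deployment would serve this over SSL, after login

token = share_package(b"<imaging study bytes>", "er-doc@example.org")
print(len(fetch_package(token)), "bytes delivered through the web viewer")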

January 11 2012

The rise of programmable self

Programmable self is a riff on the Quantified Self (QS). It is a simple concept:

Quantify what you want to change about yourself + motivational hacks = personal change success.

There are several potential "motivation hacks" that people regularly employ. The simplest of these is peer pressure. You could tell all of your co-workers every morning whether you kept your diet last night, for instance. Lots of research has shown that sort of thing is an effective motivator for change. Of course, you can make peer pressure digital by doing the same thing on Facebook/Twitter/Google+/whatever. Peer pressure has two components: shame and praise. It's motivating to avoid shame and to get praise. Do it because of a tweet and voilà, you have digital peer-pressure motivation.

Several books have recently popularized using money, in one form or another, as a motivational tool. There is some evidence, for instance, that people feel worse about losing $10 than they feel good about earning $10. This is called loss aversion, and it can easily be turned into a motivational hack. Having trouble finishing that book? Give 10 envelopes with $100 each to your best friend. Instruct them to mail the envelopes to your favorite (or most hated) charity for each month that you do not finish a chapter. Essentially, you've made your friend a "referee" of your motivational hack.

So, is there any potential to automate this process? To use software to hack your own motivation? One of the coolest applications that does just that is Stickk.com, which is designed to electronically manage contracts you make with yourself.

But that, by itself, is not programmable self.

Programmable self is the combination of a digital motivation hack, like Stickk, with a digital system that tracks behavior, like Fitbit (that's the Quantified Self part). You have to have both. Recently, for example, Stickk started supporting the Withings scale for weight entries. Withings is a Wi-Fi-enabled scale that broadcasts your weight automagically to the Withings servers. From there, Withings will send your weight pretty much wherever you want: HealthVault, other personal health record (PHR) systems, or over to Stickk.com. With that feature, Stickk became a programmable-self platform.
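
The shape of such a platform is easy to sketch. In Python (all names, goals, and dollar amounts below are hypothetical, not Stickk's or Withings' actual APIs), the core loop is a tracked measurement flowing into a motivation rule:

# Core programmable-self loop: a quantified measurement arrives from a
# device, and a motivation rule turns it into praise or a penalty.
# All names, goals, and dollar amounts here are hypothetical.

PENALTY_PER_MISS = 10.00  # loss aversion: money forfeited on a missed goal

def on_weight_reading(weight_lbs: float, goal_path_lbs: float) -> str:
    """Called when the Wi-Fi scale pushes a new reading to the platform.

    goal_path_lbs is today's target point on the dieter's planned trajectory.
    """
    if weight_lbs <= goal_path_lbs:
        return "praise: post progress to the accountability feed"
    return f"penalty: charge ${PENALTY_PER_MISS:.2f} against the commitment stake"

# The scale uploads 183.5 lbs; today's point on the goal path is 184.0.
print(on_weight_reading(183.5, goal_path_lbs=184.0))

Swap the scale for a gym check-in and the penalty for a payout from everyone who skipped, and you have roughly the GymPact design described below.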

Stickk is pretty old, and Lose it or Lose It, which is focused specifically on losing weight, is also ancient in Internet time. It launched in 2009. The site requires you to take a picture of a weekly weigh-in (you actually photograph the scale) and send it in. That counts as digital tracking, but I wonder if it supports Withings (or if it will).

In October 2011, Beeminder launched, billing itself as a direct Stickk competitor, but "for data geeks." Indeed, it is a little geeky: Beeminder is focused on weight change and other goals that behave numerically like weight. The notion is that there is a proper path for the improvement of certain numbers, with a little "data jitter" that has to be smoothed out along the way. Beeminder also refers to the classical term for the lack of self-discipline: akrasia — so bonus points for that.
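
As a rough illustration of the kind of math a Beeminder-style tool does (my own sketch, not Beeminder's actual algorithm), you can draw a straight-line path from start to goal and damp the day-to-day jitter with a moving average before judging progress:

# Sketch of a "proper path" check with jitter smoothing.
# My own illustration, not Beeminder's actual algorithm.

def goal_path(start: float, goal: float, total_days: int, day: int) -> float:
    """Today's target on a straight line from the start weight to the goal."""
    return start + (goal - start) * day / total_days

def smoothed(readings: list) -> float:
    """Average the last few readings to damp day-to-day jitter."""
    window = readings[-3:]
    return sum(window) / len(window)

readings = [185.0, 186.2, 184.1, 185.3]  # hypothetical daily weigh-ins
day = len(readings)
target = goal_path(start=186.0, goal=180.0, total_days=60, day=day)
status = "on track" if smoothed(readings) <= target else "off the path"
print(f"day {day}: smoothed {smoothed(readings):.1f} vs target {target:.1f} -> {status}")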

Last November, The Eatery launched from Massive Health. Massive Health is a massively funded dream team, and their first app is a classic programmable-self experiment. You simply take pictures of your food with your camera (digital tracking = photos) and let others rate your food choices (motivation hack = praise/shame). It's a good idea, and you can expect lots more from Massive Health that qualifies as programmable self.

Recently, GymPact made a big splash, even ending up in a New York Times blog post. GymPact is an iOS (soon Android) app that lets you check in at the gym. If you fail to check in, you get charged a fee. If you do keep your commitment to go to the gym, then you also earn some of the money from all of the people who failed to go.

Finally, Buster Benson and Jen S. McCabe are working on Bud.ge, which might be the first of the programmable-self platform plays.

All of these count as programmable self. I seriously doubt that any of these companies were aware of my original interview about programmable self or would even be comfortable with the term, which sounds pretty geeky and devious. (Which is, of course, why I love it.)

Other friends of mine in the serious games/games for health/gamification movement would probably count as programmable self, too. But some of them seem convinced that "fun" can play a deeper role in motivation than some of the more aggressive techniques that I, and other programmable-self people, seem to favor. I should also mention that I am hardly the only one in the QS movement stumbling in this direction.

I will be writing about programmable self on Radar occasionally, but there is a lot more going on than I can track here. That's why I've also made a Tumblr about the subject and filled it with all of the "software for behavior change" goodness that anyone can take. My @fredtrotter Twitter account is mostly focused on programmable self as well.

Most importantly, I want to hear about what you have tried to do with your own personal change hacks, especially those that impact your health in one way or another. For that, I have set up a Programmable Self Google Group. Please join us. Some of the top minds in behavior change are already subscribers.

The Quantified Self movement is not primarily about the "tool creators" who make stuff for people to use, but a movement of users who defy the boundaries of tools and manage to create innovative quantification tools on their own. Many of these efforts also count as programmable-self approaches. No discussion of programmable self can ignore the work of individuals, so here is a decidedly non-exhaustive list of people innovating in this space:


January 10 2012

Software crumple zones

Have you had an auto mechanic look at your wrecked car and sigh, "they just don't make them like they used to"?

Darn right they don't make them like they used to. Old cars were much better about surviving wrecks, but at the expense of the occupants. Modern cars might seem "flimsy" compared to old cars, but that "flimsiness" is due to highly engineered "crumple zones" that ensure that in a wreck, the car takes the energy rather than your body. Thank the gods that cars have changed.

But notice that a superficial examination of highly engineered cars makes them seem worse. They appear to fail by being so "flimsy" relative to earlier, immature designs. In reality, the fact that your car is super-sensitive to outside forces is a critical safety feature.

Along similar lines, we need to recognize highly engineered health information systems and embrace features that really impact patient safety. This is true of all critical software systems, not just health IT. But health IT is a great place to find ugly-but-highly-functional software.

One constant criticism of VA VistA (and its many derivatives), which is either the most highly engineered EHR software in the world or damn close, is that it is not "user-friendly enough." Other mature solutions — Epic, et al. — get the same treatment.

Often, clinicians encounter a software process that takes five steps when they believe it should only take one. Sometimes that's clearly due to a design flaw. Sometimes, however, it's a crumple zone. Sometimes those extra "steps" exist because the software engineers had to put 50 extra workflow "paths" into a process that a clinician sees as "simple" so that they could check, and double check, for all kinds of rare cases.
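
Here is a contrived sketch of that difference, with invented drug names, interactions, and dose limits. The clinician sees one step (record the order); the software walks a corridor of guard rails, any of which can absorb a rare but dangerous mistake:

# Contrived illustration: the "one step" a clinician perceives versus the
# extra workflow paths that catch rare, dangerous cases.
# Drug names, interactions, and dose limits are invented for illustration.

def order_medication_naive(patient, drug, dose_mg):
    patient["orders"].append((drug, dose_mg))  # one step, no crumple zone

def order_medication(patient, drug, dose_mg):
    if drug in patient["allergies"]:
        raise ValueError(f"allergy on file for {drug}")
    for existing, _ in patient["orders"]:
        if {existing, drug} == {"warfarin", "aspirin"}:  # toy interaction table
            raise ValueError(f"{existing} interacts with {drug}")
    if dose_mg > 1000:  # absolute ceiling catches slipped decimal points
        raise ValueError("dose exceeds hard maximum; confirm manually")
    if patient["weight_kg"] < 40 and dose_mg > 500:  # frail-patient path
        raise ValueError("dose too high for a low-weight patient")
    patient["orders"].append((drug, dose_mg))

patient = {"allergies": {"penicillin"}, "orders": [("warfarin", 5)], "weight_kg": 70}
try:
    order_medication(patient, "aspirin", 81)
except ValueError as err:
    print("blocked:", err)  # the crumple zone absorbing the impact

Each guard clause is a "step" the clinician experiences as friction, and each one exists to catch a rare case that the naive version would let through.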

The design of old cars is frozen, but the design of old software is not. Software that is continuously developed over a long period of time is the embodiment of "highly engineered."

You can understand why people like me get a little seasick when we hear about someone releasing EHR software that runs on an iPad. That seems like a really great idea until you realize how many things they might not have thought of. It's very much how you might feel if someone told you they were releasing a three-wheeled car. A three-wheeled car, like an iPad application, might be great, or the new engineering substrate might introduce new types of problems that neither the clinicians nor the software designers could predict. More than likely, it will be both great and a source of new, unforeseen errors.

Don't get me wrong — I think tablet-based interfaces to EHR systems are a great idea. But no one should be taking it for granted that an input-limited device like that will always be safe. Think of the implications of touchscreens plus bacteria/virus transfer alone! That doesn't mean it's not a great idea, but we certainly should not assume that it's a great idea.

My hat is off to the car engineers or health IT developers who suffer criticism for designs that save lives.

