June 08 2012

mHealth apps are just the beginning of the disruption in healthcare from open health data

Two years ago, the potential of government making health information as useful as weather data felt like an abstraction. Healthcare data could give citizens the same kind of "blue dot" for navigating health and illness that GPS data provides on the glowing maps of the geolocated mobile devices in more and more hands.

After all, profound changes in entire industries take years, even generations, to occur. In government, the pace of progress can feel even slower, measured in evolutionary time and epochs.

Sometimes, history works differently, particularly given the effect of rapid technological change. It's only a little more than a decade since President Clinton announced he would unscramble Global Positioning System (GPS) data for civilian use. President Obama's second U.S. chief technology officer, Todd Park, has estimated that GPS data has unlocked some $90 billion in value in the United States.

In that context, the arc of the Health Data Initiative (HDI) in the United States might leave some jaded observers with whiplash. From a small beginning, the initiative to put health data to work has now expanded around the United States and attracted great interest from abroad, including observers from England's National Health Service eager to understand what strategies have unlocked innovation around public data sets.

While the potential of government health data driving innovation may well have felt like an abstraction to many observers, in June 2012, real health apps and services are here -- and their potential to change how society accesses health information, delivers care, lowers costs, connects patients to one another, creates jobs, empowers caregivers and cuts fraud is profound. The venture capital community seems to have noticed the opportunity: according to HHS Secretary Sebelius, investment in healthcare startups is up 60% since 2009.

Headlines about rockstar Bon Jovi 'rocking Datapalooza' and the smorgasbord of health apps on display, however, while both understandable and largely warranted, don't convey the deeper undercurrent of change.

On March 10, 2010, the initiative started with 36 people brainstorming in a room. On June 2, 2010, approximately 325 in-person attendees saw 7 health apps demoed at an historic forum in the theater of the Institute of Medicine in Washington, D.C., with another 10 apps packed into an expo in the rotunda outside. All of the apps or services used open government data from the United States Department of Health and Human Services (HHS).

In 2012, 242 applications or services based upon or using open data were submitted for consideration to the third annual Health Datapalooza. About 70 health app exhibitors made it to the expo. The conference itself had some 1,400 registered attendees, not counting press and staff, and sold out in advance of the event in the cavernous Washington Convention Center in DC. On Wednesday, I asked Dr. Bob Kocher, now of Venrock and the Brookings Institution, about how the Health Data Initiative has grown and evolved. Dr. Kocher was instrumental in its founding when he served in the Obama administration. Our interview is embedded below:

Revolutionizing the healthcare industry --- in HHS Secretary Sebelius's words, reformulating Wired executive editor Thomas Goetz's "latent data" into "lazy data" --- has meant years of work unlocking government data and actively engaging the developer, entrepreneurial and venture capital communities. While the process of making health data open and machine-readable is far from done, there has been incontrovertible progress in standing up new application programming interfaces (APIs) that enable entrepreneurs, academic institutions and government itself to retrieve it on demand. On Monday, in concert with the Health Datapalooza, a new version of HealthData.gov launched, including the release of new data sets that enable not just hospital quality comparisons but insurance fee comparisons as well.

Two years later, the HDI Forum has blossomed into a massive conference that attracts the interest of the media, venture capitalists and entrepreneurs from around the nation. It's a short-term development that few people would have predicted in 2010, but a welcome one for a nation starved for solutions to spiraling healthcare costs and for some action from a federal government that all too frequently looks broken.

"The immense fiscal pressure driving 'innovation' in the health context actually means belated leveraging of data insights other industries take for granted from customer databases," said Chuck Curran, executive director and general counsel or the Network Advertising Initiative, when interviewed at this year's HDI Forum. For example, he suggested, look at "the dashboarding of latent/lazy data on community health, combined with geographic visualizations, to enable “hotspot”-focused interventions, or info about service plan information like the new HHS interface for insurance plan data (including the API).

Curran also highlighted the role that fiscal pressure is playing in making both individual payers and employers a natural source of business funding and adoption for entrepreneurs innovating with health data, with apps like My Drugs Costs holding the potential to help citizens and businesses alike cut down on an estimated $95 billion in annual unnecessary spending on pharmaceuticals.

Curran said that health app providers have fully internalized smart disclosure: "it's not enough to have open data available for specialist analysis -- there must be simplified interfaces for actionable insights and patient ownership of the care plan."

For entrepreneurs eyeing the healthcare industry and established players within it, the 2012 Health Datapalooza offers an excellent opportunity to "take the pulse of mHealth," as Jody Ranck wrote at GigaOM this week:

Roughly 95 percent of the potential entrepreneur pool doesn’t know that these vast stores of data exist, so the HHS is working to increase awareness through the Health Data Initiative. The results have been astounding. Numerous companies, including Google and Microsoft, have held health-data code-a-thons and Health 2.0 developer challenges. These have produced applications in a fraction of the time it has historically taken. Applications for understanding and managing chronic diseases, finding the best healthcare provider, locating clinical trials and helping doctors find the best specialist for a given condition have been built based on the open data available through the initiative.

In addition to the Health Datapalooza, the Health Data Initiative has hosted other events that have spawned more health innovators. RockHealth, a Health 2.0 incubator, launched at the White House Startup America Roundtable at SXSW 2011. In the wake of these successful events, StartUp Health, a network of health startup incubators, entrepreneurs and investors, was created. The organization is focused on building a robust ecosystem that can support entrepreneurs in the health and wellness space.

This health data ecosystem has now spread around the United States, from Silicon Valley to New York to Louisiana. During this year's Health Datapalooza, I spoke with Ramesh Kolluru, a technologist who works at the University of Louisiana, about his work on a hackathon in Louisiana, the "Cajun Codefest," and his impressions of the forum in Washington:

One story that stood out from this year's crop of health data apps was Symcat, an mHealth app that enables people to look up their symptoms and find nearby hospitals and clinics. The application was developed by two medical students at Johns Hopkins University who happened to share a passion for tinkering, engineering and healthcare. They put their passion to work - and somehow found the time (remember, they're in medical school) to build a beautiful, usable health app. The pair landed a $100,000 prize from the Robert Wood Johnson Foundation for their efforts. In the video embedded below, I interview Craig Munsen, one of the medical students, about his application. (Notably, the pair intends to use their prize to invest in the business, not pay off medical school debt.)

There are more notable applications and services to profile from this year's expo - and in the weeks ahead, expect to see some of them here on Radar. For now, it's important to recognize the work of all of the men and women who have worked so hard over the past two years to create public good from public data.

Releasing and making open health data useful, however, is about far more than these mHealth apps: It's about saving lives, improving the quality of care, adding more transparency to a system that needs it, and creating jobs. Park spoke with me this spring about how open data relates to much more than consumer-facing mHealth apps:

As the US CTO seeks to scale open data across federal government by applying the lessons learned in the health data initiative, look for more industries to receive digital fuel for innovation, from energy to education to transit and finance. The White House digital government strategy explicitly embraces releasing open data in APIs to enable more accountability, civic utility and economic value creation.

While major challenges lie ahead, from data quality to security and privacy, the opportunity to extend the data revolution in healthcare to other industries looks more tangible now than it has in years past.

Business publications, including the Wall Street Journal, have woken up to the disruptive potential of open government data. As Michael Hickins wrote this week, "The potential applications for data from agencies as disparate as the Department of Transportation and Department of Labor are endless, and will affect businesses in every industry imaginable. Including yours. But if you can think of how that data could let someone disrupt your business, you can stop that from happening by getting there first."

This growing health data movement is not contained within any single city, state, agency or company. It's beautifully chaotic, decentralized and self-propelled, said Park this past week.

"The Health Data Initiative is no longer a government initiative," he said. "It's an American one. "

May 29 2012

US CTO seeks to scale agile thinking and open data across federal government

In the 21st century, federal government must go mobile, putting government services and information at the fingertips of citizens, said United States Chief Technology Officer Todd Park in a recent wide-ranging interview. "That's the first digital government result, outcome, and objective that's desired."

To achieve that vision, Park and U.S. chief information officer Steven VanRoekel are working together to improve how government shares data, architects new digital services and collaborates across agencies to reduce costs and increase productivity through smarter use of information technology.

Park, who was chosen by President Obama to be the second CTO of the United States in March, has been (relatively) quiet over the course of his first two months on the job.

Last Wednesday, that changed. Park launched a new Presidential Innovation Fellows program, in concert with VanRoekel's new digital government strategy, at TechCrunch's Disrupt conference in New York City. This was followed by another event for a government audience at the Interior Department headquarters in Washington, D.C. Last Friday, he presented his team's agenda to the President's Council of Advisors on Science and Technology.

"The way I think about the strategy is that you're really talking about three elements," said Park, in our interview. "First, it's going mobile, putting government services at the literal fingertips of the people in the same way that basically every other industry and sector has done. Second, it's being smarter about how we procure technology as we move government in this direction. Finally, it's liberating data. In the end, it's the idea of 'government as a platform.'"

"We're looking for a few good men and women"

In the context of the nation's new digital government strategy, Park announced the launch of five projects that this new class of Innovation Fellows will be entrusted with implementing: a broad Open Data Initiative, Blue Button for America, RFP-EZ, The 20% Campaign, and MyGov.

The idea of the Presidential Innovation Fellows Program, said Park, is to bring in people from outside government to work with innovators inside the government. These agile teams will work together within a six-month time frame to deliver results.

The fellowships are basically scaling up the idea of "entrepreneurs in residence," said Park. "It's a portfolio of five projects that, on top of the digital government strategy, will advance the implementation of it in a variety of ways."

The biggest challenge to bringing the five programs that the US CTO has proposed to successful completion is getting 15 talented men and women to join his team and implement them. There's reason for optimism. Park shared via email that:

"... within 24 hours of TechCrunch Disrupt, 600 people had already registered via Whitehouse.gov to apply to be a Presidential Innovation Fellow, and another several hundred people had expressed interest in following and engaging in the five projects in some other capacity."

To put that in context, Code for America received 550 applications for 24 fellowships last year. That makes both of these fellowships more competitive than getting into Harvard in 2012, which received 34,285 applications for its next freshman class. There appears to be considerable appetite for a different kind of public service that applies technology and data for the public good.

Park is enthusiastic about putting open government data to work on behalf of the American people, amplifying the vision that his predecessor, Aneesh Chopra, championed around the country for the past three years.

"The fellows are going to have an extraordinary opportunity to make government work better for their fellow citizens," said Park in our interview. "These projects leverage, substantiate and push forward the whole principle of liberating data. Liberate data."

"To me, one of the aspects of the strategy about which I am most excited, that sends my heart into overdrive, is the idea that going forward, the default state of government data shall be open and machine-readable," said Park. "I think that's just fantastic. You'll want to, of course, evolve the legacy data as fast as you can in that same direction. Setting that as 'this is how we are rolling going forward' — and this is where we expect data to ultimately go — is just terrific."

In the videos and interview that follow, Park talks more about his vision for each of the programs.

A federal government-wide Open Data Initiative

In the video below, Park discusses the Presidential Innovation Fellows program and introduces the first program, which focuses on open data:

Park: The Open Data Initiative is a program to seed and expand the work that we're doing to liberate government data as a platform, to encourage, on a voluntary basis, the liberation of data by corporations as part of the national data platform, and to actively stimulate the development of new tools and services, and enhance existing tools and services, leveraging the data to help improve Americans' lives in very tangible ways and create jobs for the future.

This leverages the Open Government Directive to say "look, the default going forward is open data." Also the directive to "API-ize" two high priority datasets and also, in targeted ways, go beyond that, and really push to get more data out there in, critically, machine-readable form, in APIs, and to educate the entrepreneur and innovators of the world that it's there through meetups, and hackathons, and challenges, and "Datapaloozas."

We're doubling down on the Health Data Initiative, we are also launching a much more high-profile Safety Data Initiative, which we kicked off last week. An Energy Data Initiative, which kicked off this week. An education data initiative, which we're kicking off soon, and an Impact Data Initiative, which is about liberating data with respect to inputs and outputs in the non-profit space.

We're also going to be exploring an initiative in the realm of personal finance, enabling Americans to access copies of their financial data from public sector agencies and private sector institutions. So, the format that we're going to be leveraging to execute these initiatives is cloned from the Health Data Initiative.

This will make new data available. It will also take the existing public data that is unusable to developers, i.e. in the form of PDFs, books or static websites, and turn it into liquid machine-readable, downloadable, accessible data via API. Then — because we're consistently hearing that 95% of the innovators and entrepreneurs who could turn our data into magic don't even know the data exists, let alone that it's available to them — engage the developer community and the entrepreneurial community with the data from the beginning. Let them know it's there, get their feedback, make it better.

Blue Button for America

Park: The idea is to develop an open source patient portal capability that will replace My HealtheVet, which is the Veterans Administration's current patient portal. This will actually allow the Blue Button itself to iterate and evolve more rapidly, so that every time you add more data to it, it won't require heart surgery. It will be a lot easier, and of course will be open source, so that anyone else who wants to use it can use it as well. On top of that, we're going to do a lot of "biz dev" in America to get the word out about Blue Button and encourage more and more holders of data in the private sector to adopt Blue Button. We're also going to work to help stimulate more tool development by entrepreneurs that can upload Blue Button data and make it useful in all kinds of ways for patients. That's Blue Button for America.
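
[As an illustration of why the Blue Button approach is attractive to developers: the download is, at bottom, a plain text file a patient can hand to any tool that understands it. The sketch below splits such a file into named sections in Python; the layout it assumes (all-caps headings separated by rules of dashes) is an approximation for demonstration, not the official format definition, and the file and section names shown are hypothetical.]

    # Minimal, illustrative parser for a Blue Button-style plain-text export.
    # Assumes all-caps section headings and separator lines made of dashes;
    # real exports may differ, so treat this as a sketch rather than a spec.
    def parse_blue_button(text):
        sections = {}
        current = "HEADER"
        for line in text.splitlines():
            stripped = line.strip()
            if stripped and set(stripped) == {"-"}:
                continue  # skip horizontal rules made of dashes
            if stripped.isupper() and ":" not in stripped and len(stripped.split()) <= 6:
                current = stripped            # treat short all-caps lines as headings
                sections.setdefault(current, [])
            else:
                sections.setdefault(current, []).append(line)
        return sections

    with open("my_record.txt") as f:          # hypothetical file name
        record = parse_blue_button(f.read())
    print(sorted(record))                     # e.g. ['ALLERGIES', 'MEDICATIONS', ...]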

What is RFP-EZ?

Park: The objective is "buying smarter." The project that we're working on with the Small Business Administration is called "RFP-EZ."

Basically, it's the idea of setting up a streamlined process for the government to procure solutions from innovative, high-growth tech companies. As you know, most high-growth companies regard the government as way too difficult to sell to.

That A) deprives startups and high-growth companies of the government as a marketplace and, B) perhaps even more problematically, actually deprives the government of their solutions.

The hope here is, through the actions of the RFP-EZ team, to create a process and a prototype through which the government can much more easily procure solutions from innovative private firms.

It A) opens up this emerging market called "the government" to high-tech startups and B) infects the government with more of their solutions, which are, pound for pound, radically more effective and cost-efficient than a lot of the stuff that the government is currently procuring through conventional channels. That's RFP-EZ.

The 20% Campaign

Park: The 20% Campaign is a project that's being championed by USAID. It's an effort at USAID, working with other government agencies, NGOs and companies, to catalyze the movement of foreign assistance payments from cash to electronic. So, just for example, USAID pays its contractors electronically, obviously, but the way a contractor pays, say, highway workers in Afghanistan, or the way that police officers in Afghanistan get paid, has actually been principally via cash. And that creates all kinds of waste, fraud and abuse issues.

The idea is actually to move to electronic payment, including mobile payment — and this has the potential to significantly cut waste, fraud and abuse, to improve financial inclusion, and to actually let people use their phones to access bank accounts set up for them. That leads to all kinds of good things, including safety: it's not ideal to be carrying around large amounts of cash in highly kinetic environments.

The Afghan National Police started paying certain contingents of police officers via mobile phones and mobile payments, as opposed to cash, and what happened is that the police officers started reporting what amounted to up to a 30% raise. Of course, their pay hadn't changed, but when it was paid in cash, a bunch of it got lost along the way. This is obviously a good thing, but it's even more important when you realize that the amount they ultimately, physically received when paid in cash was less than what the Taliban in that province was paying people to join the Taliban — while the mobile payment, at that same salary level, was greater than what the Taliban was paying. That's a critical difference.

It's basically taking foreign assistance payments through the last mile to mobile.

MyGov is the U.S. version of Gov.uk

Park: MyGov is an effort to rapidly prototype a citizen-centric system that gives Americans access to the information and resources of government that are right for them. Think of it as a personalized channel for Americans to access information and resources across government, and a way to get feedback from citizens about that information and those resources.

How do you plan to scale what you learned while you were HHS CTO across all of the federal government?

Park: Specifically, we're doing exactly the same thing we did with the Health Data Initiative, kicking off the initiatives with a "data jam" — an ideation workshop where we invite, just like with health data, 40 amazing tech and energy minds, tech and safety innovators, to a room — at the White House, in the case of the Safety Data Initiative, or at Stanford University, in the case of the Energy Initiative.

We walk into the room for several hours and say, "Here's a big pile of data. What would you do with this data?" And they invent 15 or 20 new classes of products or services of the future that we could build with the data. And then we challenge them to, at the end of the session, build prototypes or actual working products that instantiate their ideas in 90 days, to be highlighted at a White House-hosted Safety Datapalooza, Energy Datapalooza, Education Datapalooza, Impact Datapalooza, etc.

We also take the intellectual capital from the workshops, publish it on the White House website, and publicize the opportunity around the country: Discover the data, come up with your own ideas, build prototypes, and throw your hat in the ring to showcase at a Datapalooza.

What happens at the Datapaloozas — our experience in health guides us — is that, first of all, the prototypes and working products inspire many more innovators to actually build new services, products and features, because the data suddenly becomes really concrete to them, in terms of how it could be used.

Secondly, it helps persuade additional folks in the government to liberate more data, making it available, making it machine-readable, as opposed to saying, "Look, I don't know what the upside is. I can only imagine downsides." What happened in health is, when they went to a Datapalooza, they actually saw that, if data is made available, then at no cost to you and no cost to taxpayers, other people who are very smart will build incredible things that actually enhance your mission. And so you should do the same.

As more data gets liberated, that then leads to more products and services getting built, which then inspires more data liberation, which then leads to more products and services getting built — so you have a virtuous spiral, like what's happened in health.

The objective of each of these initiatives is not just to liberate data. Data by itself isn't helpful. You can't eat data. You can't pour data on a wound and heal it. You can't pour data on your house and make it more energy efficient. Data is only useful if it's applied to deliver benefit. The whole point of this exercise, the whole point of these kickoff efforts, is to catalyze the development of an ecosystem of data supply and data use to improve the lives of Americans in very tangible ways — and create jobs.

We have the developers and the suppliers of data actually talk to each other, create value for the American people, and then rinse, wash, repeat.

We're recruiting, to join the team of Presidential Innovation Fellows, entrepreneurs and developers from the outside to come in and help with this effort to liberate data, make it machine-readable, and get it out there to entrepreneurs and help catalyze development of this ecosystem.

We went to TechCrunch Disrupt for a reason: it's right smack dab in the middle of the people we want to recruit. We invite people to check out the projects on WhiteHouse.gov and, if they're interested in applying to be a fellow, indicate their interest. Even if they can't come to DC for 6-plus months to be a fellow, but they want to follow one of the projects or contribute or help in some way, we are inviting them to express interest in that as well. For example, if you're an entrepreneur, and you're really interested in the education space, and learning about what data is available in education, you can check out the project, look at the data, and perhaps you can build something really good to show at the Education Datapalooza.

Is open data just about government data? What about smart disclosure?

Park: In the context of the Open Data Initiatives projects, it's not just about liberation of government health data: it's also about government catalyzing the release, on a voluntary basis, of private sector data.

Obviously, scaling Blue Button will extend the open data ecosystem. We're also doubling down on Green Button. I was just in California to host discussions around Green Button. Utilities representing 31 million households and businesses have now committed to make Green Button happen. Close to 10 million households and businesses already have access to Green Button data.

There's also a whole bunch of conversation happening about, at some point later this year, having the first utilities add the option of what we're calling "Green Button Connect." Right now, the Green Button is a download, where you go to a website, hit a green button and bam, you download your data. Green Button Connect is the ability for you to say as a consumer, "I authorize this third party to receive a continuous feed of my electricity usage data."

That creates massive additional opportunity for new products and services. That could go live later this year.
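
[For developers, a Green Button download is an Atom/XML feed of interval readings. The sketch below, using only Python's standard library, totals the readings in one file; the ESPI namespace, element names and watt-hour unit are assumptions about a typical export, and real files vary by utility.]

    # Hedged sketch: total the interval readings in a Green Button XML download.
    # Element names and units below are assumptions; check your utility's export.
    import xml.etree.ElementTree as ET

    NS = {"espi": "http://naesb.org/espi"}    # commonly used ESPI namespace (assumed)

    def total_usage(path):
        root = ET.parse(path).getroot()
        readings = root.findall(".//espi:IntervalReading", NS)
        return sum(int(r.findtext("espi:value", default="0", namespaces=NS))
                   for r in readings)

    print(total_usage("greenbutton.xml"), "(in the utility's native unit, often Wh)")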

As part of the education data initiative, we are pursuing the launch and scale up of something called "My Data," which will have a red color button. (It will probably, ultimately, be called "Red Button.") This is the ability for students and their families to download an electronic copy of their student loan data, of their transcript data, of their academic assessment data.

That notion of people getting their own data, whether it's your health data, your education data, your finance data, your energy use data, that's an important part of these open data initiatives as well, with government helping to catalyze the release of that data to then feed the ecosystem.

How does open data specifically relate to the things that Americans care about, access to healthcare, reducing energy bills, giving their kids more educational opportunities, and job creation? Is this just about apps?

Park: In healthcare, for example, you'll see a growing array of examples that leverage data to create tangible benefit in many, many ways for Americans. Everything from helping me find the right doctor or hospital for my family, to being notified of a clinical trial that could fit my profile and save my life, to the ability to get the latest and greatest information about how to manage my asthma or diabetes via government knowledge in the National Library of Medicine.

There is a whole shift in healthcare systems away from pay-for-volume of services to basically paying to get people healthy. It goes by lots of different names — accountable care organizations or episodic payment — but the fundamental common theme is that the doctors and hospitals increasingly will be paid to keep people healthy and to co-ordinate their care, and keep them out of the hospital, and out of the ER.

There's a whole fleet of companies and services that utilize data to help doctors and hospitals do that work, like utilizing Medicare claims data to help identify segments of a patient population that are at real risk of needing the ER or hospital soon. There are tools that help journalists easily identify public health issues, like healthcare outcome disparities by race, gender and ethnicity. There are tools that help county commissioners and mayors understand what's going on in a community, from a health standpoint, and make better policy decisions, like showing them food deserts. There's just a whole fleet of rapidly growing services for consumers, doctors, nurses, journalists, employers and public policy makers that help them make decisions, help them deliver improved health and healthcare, and create jobs, all at the same time.

That's very exciting. If you look at all of those products and services, a subset of them are the ones that self-identify to us to actually be exhibited at the Health Datapaloozas. Look at the 20 healthcare apps that were at the first Datapalooza or the 50 that were at the second. This year, there are 230 companies being narrowed down to a total of about 100 that will be at the Datapalooza. They collectively serve millions of people today, either through brand new products and services or through new features on existing platforms. They help people in ways that we would never have thought of, let alone built.

The taxpayer dollars expended here were zero. We basically just took our data, made it available in machine-readable format, educated entrepreneurs that it was there, and they did the rest. Think about these other sectors, and think about what's possible in those sectors.

In education, with the data that we've made available, you can imagine much better tools to help you shop for the college that will deliver the biggest bang for your buck and is the best fit for your situation.

We've actually made available a bunch of data about college outcomes and are making more data available in machine-readable form so it can feed college search tools much better. We are going to be enabling students to download machine-readable copies of their own financial aid application, student loan data and school records. That will really turbo charge "smart scholarship" and school search capabilities for those students. You can actually mash that up with college outcomes in a really powerful, personalized college and scholarship search engine that is enabled by your personal data plus machine-readable data. Tools that help kids and their parents pick the right college for their education and get the right financial aid, that's something government is going to facilitate.

In the energy space, there are apps and services that help you leverage your Green Button data and other data to really assess your electricity usage compared to that of others and get concrete tips on how you can actually save yourself money. We're already seeing very clever, very cool efforts to integrate gamification and social networking into that kind of app, to make it a lot more fun and engaging — and make yourself money.

One dataset that's particularly spectacular, and that we're making a lot more usable, is the ENERGY STAR database. It's got 40,000 different appliances, everything from washing machines to servers that consumers and businesses use. We are creating a much, much easier to use public, downloadable ENERGY STAR database. It's got really detailed information on the energy use profiles and performance of each of these 40,000 appliances and devices. Imagine that actually integrated into much smarter services.
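
[To make that concrete, here is a small sketch of the kind of filtering an entrepreneur might do once the database is downloadable. The file name and column headers ("model", "annual_kwh") are hypothetical stand-ins; the real export defines its own fields.]

    # Illustrative only: rank an ENERGY STAR-style appliance export by efficiency.
    # File name and column names are hypothetical placeholders.
    import csv

    def most_efficient(path, top_n=10):
        with open(path, newline="") as f:
            rows = [r for r in csv.DictReader(f) if r.get("annual_kwh")]
        rows.sort(key=lambda r: float(r["annual_kwh"]))   # lowest annual kWh first
        return [(r["model"], r["annual_kwh"]) for r in rows[:top_n]]

    for model, kwh in most_efficient("energystar_appliances.csv"):
        print(model, kwh)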

On safety, the kinds of ideas that people are bringing together are awesome. They're everything from using publicly available safety data to plot the optimal route for your kid to walk home or for a first responder to travel through a city and get to a place most expeditiously.

There's this super awesome resource on Data.gov called the "Safer Products API," which is published by the Consumer Product Safety Commission (CPSC). Consumers send in safety reports to CPSC, but until March of last year, you had to FOIA [Freedom of Information Act] CPSC to get them. So what they've now done is make the entire database of these reports public, without you having to FOIA them, and publish it through an API as well.
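
[The point of an API like this is that recall and incident data can be pulled straight into an existing shopping or product page. Below is a hedged sketch of that kind of lookup; the endpoint URL and parameter names are assumptions to verify against CPSC's current documentation, and it relies on the third-party requests package.]

    # Hedged sketch: look up recall data by product name from a CPSC-style API.
    # The endpoint and parameters are assumptions, not the documented interface.
    import requests

    BASE = "https://www.saferproducts.gov/RestWebServices/Recall"   # assumed endpoint

    def recalls_for(product):
        resp = requests.get(BASE, params={"format": "json", "ProductName": product})
        resp.raise_for_status()
        return resp.json()                     # assumed to be a JSON list of recalls

    for recall in recalls_for("crib")[:5]:
        print(recall)                          # field names depend on the real API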

One of the ideas that came up is that people buy products on eBay, Craigslist, etc. all the time, and some huge percentage of Americans never get to know about a recall — a recall of a crib, a recall of a toy. And even when a company recalls new products, old products are still in circulation. What if someone built the ability to integrate the recall data and attach it to all the stuff in the eBays and Craigslists of the world?

Former CIO Vivek Kundra often touted government recall apps based upon government data during his tenure. Is this API the same thing, shared again, or something new?

Park: I think the smartest thing the government can do with data like product recalls data is not build our own shopping sites, or our own product information sites: it's to get the information out there in machine-readable form, so that lots and lots of other platforms that have audiences with millions of people already, and who are really good at creating shopping experiences or product comparison experiences, get the data into their hands, so that they can integrate it seamlessly into what they do. I feel that that's really the core play that the government should be engaged in.

I don't know if the Safer Products API was included in the recall app. What I do know is that before 2011, you had to FOIA to get the data. I think that even if the government included it in some app the government built, that it's important for it to get used by lots and lots of other apps that have a collective audience that's massively greater than any app the government could itself build.

Another example of this is the Hospital Compare website. The Hospital Compare website has been around for a long time. Nobody knows about it. There was a survey done that found 94% of Americans didn't know that hospital quality data was available, let alone that there was a Hospital Compare website. So the approach was A) making the hospital compare data downloadable and B) deploying it in API form, which we actually did a year and a half ago at Medicare.gov.

That then makes the data much easier to incorporate for lots of other platforms, which are far more likely than HospitalCompare.gov to be able to present the information in actionable forms for citizens. Even if we build our own apps, we have to get this data out to lots of other people who can help people with it. To do that, we have to make it machine-readable, we have to put it into RESTful APIs — or at least make it downloadable — and get the word out to entrepreneurs that it's something they can use.

This is a stunning arbitrage opportunity. Even if you take all this data and you "API-ize" it, it's not automatic that entrepreneurs are going to know it's there.

Let's assume that the hospital quality data is good — which it is — and that you build it, and put it into an API. If nobody knows about it, you've delivered no value to the American people. People don't care whether you API a bunch of data. What they care about is that when they need to find a hospital, like I did, for my baby, I can get that information.

The private sector, in the places where we have pushed the pedal to the metal on this, has demonstrated an incredible ability to make this data a lot more relevant and to help a lot more people with it than we could have by ourselves.

April 10 2012

Open source is interoperable with smarter government at the CFPB

When you look at the government IT landscape of 2012, federal CIOs are being asked to address a lot of needs. They have to accomplish their mission. They need to be able to scale initiatives to tens of thousands of agency workers. They're under pressure to address not just network security but web security and mobile device security. They also need to be innovative, because all of this is supported by the same or less funding. These are common requirements in every agency.

As the first federal "start-up agency" in a generation, some of those needs at the Consumer Financial Protection Bureau (CFPB) are even more pressing. On the other hand, the opportunity for the agency to be smarter, leaner and "open from the beginning" is also immense.

Progress establishing the agency's infrastructure and culture over the first 16 months has been promising, save for the larger context of getting a director at the helm. Enabling open government by design isn't just a catchphrase at the CFPB. There has been a bold vision behind the CFPB from the outset, one where a 21st century regulator would leverage new technologies to find problems in the economy before the next great financial crisis escalates.

In the private sector, there's great interest right now in finding actionable insight in large volumes of data. Making sense of big data is increasingly being viewed as a strategic imperative in the public sector as well. Recently, the White House put its stamp on that reality with a $200 million big data research and development initiative, including a focus on improving the available tools. There's now an entire ecosystem of software around Hadoop, which is itself open source code. The problem that now exists in many organizations, across the public and private sectors, is not so much that the technology to manipulate big data isn't available: it's that the expertise to apply big data doesn't exist in-house. The data science talent shortage is real.

People who work and play in the open source community understand the importance of sharing code, especially when that action leads to improving the code base. That's not necessarily an ethic or a perspective that has been pervasive across the federal government. That does seem to be slowly changing, with leadership from the top: the White House used Drupal for its site and has since contributed modules back into the open source community, including one that helps with 508 compliance.

In an in-person interview last week, CFPB CIO Chris Willey (@ChrisWilleyDC) and acting deputy CIO Matthew Burton (@MatthewBurton) sat down to talk about the agency's new open source policy, government IT, security, programming in-house, the myths around code-sharing, and big data.

The fact that this government IT leadership team is strongly supportive of sharing code back to the open source community is probably the most interesting part of this policy, as Scott Merrill picked up in his post on the CFPB and Github.

Our interview follows.

In addition to being the leader of the CFPB's development team over the past year and a half, Burton was just named acting deputy chief information officer. What will that mean?

Willey: He hasn't been leading the software development team the whole time. In fact, we only really had an org chart as of October. In the time that he's been here, Matt has led his team to some amazing things. We're going to talk about one of them today, but we've also got a great intranet. We've got some great internal apps that are being built and that we've built. We've unleashed one version of the supervision system that helps bank examiners do their work in the field. We've got a lot of faith he's going to do great things.

What it actually means is that he's going to be backing me up as CIO. Even though we're a fairly small organization, we have an awful lot going on. We have 76 active IT projects, for example. We're just building a team. We're actually doubling in size this fiscal year, from about 35 staff to 70, as well as adding lots of contractors. We're just growing the whole pie. We've got 800 people on board now. We're going to have 1,100 on board in the whole bureau by the end of the fiscal year. There's a lot happening, and I recognize we need to have some additional hands and brain cells helping me out.

With respect to building an internal IT team, what's the thinking behind having technical talent inside of an agency like this one? What does that change, in terms of your relationship with technology and your capacity to work?

Burton: I think it's all about experimentation. Having technical people on staff allows an organization to do new things. I think the way most agencies work is that when they have a technical need, they don't have the technical people on staff to make it happen so instead, that need becomes larger and larger until it justifies the contract. And by then, the problem is very difficult to solve.

By having developers and designers in-house, we can constantly be addressing things as they come up. In some cases, before the businesses even know it's a problem. By doing that, we're constantly staying ahead of the curve instead of always reacting to problems that we're facing.

How do you use open source technology to accomplish your mission? What are the tools you're using now?

Willey: We're actually trying to use open source in every aspect of what we do. It's not just in software development, although that's been a big focus for us. We're trying to do it on the infrastructure side as well.

As we look at network and system monitoring, we look at the tools that help us manage the infrastructure. As I've mentioned in the past, we are 100% in the cloud today. Open source has been a big help for us in giving us the ability to manipulate those infrastructures that we have out there.

At the end of the day, we want to bring in the tools that make the most sense for the business needs. It's not about only selecting open source or having necessarily a preference for open source.

What we've seen is that over time, the open source marketplace has matured. A lot of tools that might not have been ready for prime time a year ago or two years ago are today. By bringing them into the fold, we potentially save money. We potentially have systems that we can extend. We could more easily integrate with the other things that we have inside the shop that maybe we built or maybe things that we've acquired through other means. Open source gives us a lot of flexibility because there's a lot of opportunities to do things that we might not be able to do with some proprietary software.

Can you share a couple of specific examples of open source tools that you're using and what you actually use them for within mission?

Willey: On network monitoring, for example, we're using ZFS, which is an open source monitoring tool. We've been working with Nagios as well. Nagios, we actually inherited from Treasury — and while Treasury's not necessarily known for its use of open source technologies, it uses that internally for network monitoring. Splunk is another one that we have been using for web analysis. [After the interview, Burton and Willey also shared that they built the CFPB's intranet on MediaWiki, the software that drives Wikipedia.]

Burton: On the development side, we've invested a lot in Django and WordPress. Our site is a hybrid of them. It's WordPress at its core, with Django on top of that.

In November of 2010, it was actually a few weeks before I started here, Merici [Vinton] called me and said, "Matt, what should we use for our website?"

And I said, "Well, what's it going to do?"

And she said, "At first, it's going to be a blog with a few pages."

And this website needed to be up and running by February. And there was no hosting; there was nothing. There were no developers.

So I said, "Use WordPress."

And by early February, we had our website up. I'm not sure that would have been possible if we had to go through a lengthy procurement process for something not open source.

We use a lot of jQuery. We use Linux servers. For development ops, we use Selenium and Jenkins and Git to manage our releases and source code. We actually have GitHub Enterprise, which although not open source, is very sharing-focused. It encourages sharing internally. And we're using GitHub on the public side to share our code. It's great to have the same interface internally as we're using externally.

Developers and citizens alike can go to github.com/cfpb and see code that you've released back to the public and for other federal agencies. What projects are there?

Burton: These are the ones that came up as we built basic building blocks. They range from code that may not strike an outside developer as that interesting, but that's really useful for the government, all the way to things that we created from scratch that are very developer-focused and are going to be very useful for any developer.

On the first side of that spectrum, there's an app that we made for transit subsidy enrollment. Treasury used to manage our transit subsidy balances. That involved going to a webpage that you would print out, write on with a pen and then fax to someone.

Willey: Or scan and email it.

Burton: Right. And then once you'd had your supervisor sign it, faxed it over to someone, eventually, several weeks later, you would get your benefits. We started to take over that process and the human resources office came to us and asked, "How can we do this better?"

Obviously, that should just be a web form that you type into, that will auto fill any detail it knows about you. You press submit and it goes into the database, which goes directly to the DOT [Department of Transportation]. So that's what we made. We demoed that for DOT and they really like it. USAID is also into it. It's encouraging to see that something really simple could prove really useful for other agencies.

On the other side of the spectrum, we use a lot of Django tools. As an example, we have a tool we just released through our website called "Ask CFPB." It's a Django-based question and answer tool, with a series of questions and answers.

Now, the content is managed in Django. All of the content is managed from our staging server behind the firewall. When we need to get that content, we need to get the update from staging over to production.

Before, what we had to do was pick up the entire database, copy it and then move it over to production, which was kind of a nightmare. And there was no Django tool for selectively moving data modifications.

So we sat there and we thought, "Oh, we really need something to do that because we're going to be doing a lot of it. We can't be copying the database over every time we need to correct a copy." So two of our developers built a Django app called "Nudge." Basically, if you've ever seen a Django admin, you just go into it and it shows you, "Hey, here's everything that's changed. What do you want to move over?"

You can pick and choose what you want to move over and, with the click of a button, it goes to production. I think that's something that every Django developer will have a use for if they have a staging server.

In a way, we were sort of surprised it didn't exist. So, we needed it. We built it. Now we're giving it back and anybody in the world can use it.
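
[For readers who haven't seen a Django admin action, the sketch below shows the general idea of a selective "push to production" step. It is not the CFPB's actual Nudge code: the production endpoint is hypothetical, and it simply serializes whatever rows an editor selects on staging and posts them to a receiving view that would deserialize and save them.]

    # Illustrative sketch of a selective staging-to-production promotion step.
    # Not the actual Nudge code; PRODUCTION_URL and its receiving view are hypothetical.
    import urllib.request

    from django.core import serializers

    PRODUCTION_URL = "https://example.gov/api/promote/"   # hypothetical endpoint

    def promote_to_production(modeladmin, request, queryset):
        # Serialize only the rows the editor selected in the staging admin...
        payload = serializers.serialize("json", queryset).encode("utf-8")
        # ...and post them to production, where a matching view would
        # deserialize and save them.
        req = urllib.request.Request(
            PRODUCTION_URL,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)

    promote_to_production.short_description = "Push selected changes to production"
    # Attached to a ModelAdmin's "actions" list, this shows up in the admin's
    # action dropdown next to the changed objects.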

You mentioned the cloud. I know that CFPB is very associated with Treasury. Are you using Treasury's FISMA moderate cloud?

Willey: We have a mix of what I would call private and public clouds. On the public side, we're using our own cloud environments that we have established. On the private side, we are using Treasury for some of our apps. We're slowly migrating off of Treasury systems onto our own cloud infrastructure.

In the case of email, for example, we're looking at email as a service. So we'll be looking at Google, Microsoft and others just to see what's out there and what we might be able to use.

Why is it important for the CFPB to share code back to the public? And who else in the federal government has done something like this, aside from the folks at the White House?

Burton: We see it the same way that we believe the rest of the open source community sees it: The only way this stuff is going to get better and become more viable is if people share. Without that, it'll only be hobbyists. It'll only be people who build their own little personal thing. Maybe it's great. Maybe it's not. Open source gets better by the community actually contributing to it. So it's self-interest in a lot of ways. If the tools get better, then what we have available to us gets better. We can actually do our mission better.

Using the transit subsidy enrollment application example, it's also an opportunity for government to help itself, for one agency to help another. We've created this thing. Every federal agency has a transit subsidy program. They all need to allow people to enroll in it. Therefore, it's immediately useful to any other agency in the federal government. That's just a matter of government improving its own processes.

If one group does it, why should another group have to figure it out or have to pay lots of money to have it figured out? Why not just share it internally and then everybody benefits?

Why do you think it's taken until 2012 to have that insight actually be made into reality in terms of a policy?

Burton: I think to some degree, the tools have changed. The ability to actually do this easily is a lot better now than it was even a year or two ago. Government also traditionally lags behind the private sector in a lot of ways. I think that's changing, too. With this administration in particular, I think what we've seen is that government has started to come a little bit closer to parity with the private sector, including some of the thinking around how to use technology to improve business processes. That's really exciting. And I think as a result, there are a lot of great people coming in as developers and designers who want to work in the federal government because they see that change.

Willey: It's also because we're new. There are two things behind that. First, we're able to sort of craft a technology philosophy with a modern perspective. So we can, from our founding, ask "What is the right way to do this?" Other agencies, if they want to do this, have to turn around decades of culture. We don't have that burden. I think that's a big reason why we're able to do this.

The second thing is a lot of agencies don't have the intense need that we do. We have 76 projects to do. We have to use every means available to us.

We can't say, "We're not going to use a large share of the software that's available to us." That's just not an option. We have to say, "Yes, we will consider this as a commercial good, just like any other piece of proprietary software."

In terms of the broader context for technology and policy, how does open source relate to open government?

Willey: When I was working for the District, Apps for Democracy was a big contest that we did around opening data and then asking developers to write applications using that data that could then be used by anybody. We said that the next logical step was to sort of create more participatory government. And in my mind, open sourcing the projects that we do is a way of asking the citizenry to participate in the active government.

So by putting something in the public space, somebody could pick that up. Maybe not the transit subsidy enrollment project — but maybe some other project that we've put out there that's useful outside of government as well as inside of government. Somebody can pick that code up, contribute to it and then we benefit. In that way, the public is helping us make government better.

When you have conversations around open source in government, what do you say about what it means to put your code online and to have people look at it or work on it? Can you take changes that people make to the code base to improve it and then use it yourself?

Willey: Everything that we put out there will be reviewed by our security team. The goal is, by the time it's out there, not to have any security vulnerabilities. If someone does discover a security vulnerability, however, we'll be sharing that code in a way that makes it much more likely that someone will point it out to us, and maybe even provide a fix, than exploit it. They wouldn't be exploiting our instance of the code; they would be working with the code on Github.com.

I've seen people in government with a misperception of what open source means. They hear that it's code that anyone can contribute to. I think that they don't understand that you're controlling your own instance of it. They think that anyone can come along and just write anything into your code that they like. And, of course, it's not like that.

I think as we talk more and more about this to other agencies, we might run into that, but I think it'll be good to have strong advocates in government, especially on the security side, who can say, "No, that's not the case; it doesn't work that way."

Burton: We have a firewall between our public and private instances at Git as well. So even if somebody contributes code, that's also reviewed on the way in. We wouldn't implement it unless we made sure that, from a security perspective, the code was not malicious. We're taking those precautions as well.

I can't point to one specifically, but I know that there have been articles and studies done on the relative security of open source. I think the consensus in the industry is that the peer review process of open source actually helps from a security perspective. It's not that you have a chaos of people contributing code whenever they want to. It improves the process. It's like the thinking behind academic papers. You do peer review because it enhances the quality of the work. I think that's true for open source as well.

We actually want to create a community of peer reviewers of code within the federal government. As we talk to agencies, we want people to actually use the stuff we build. We want them to contribute to it. We actually want them to be a community. As each agency contributes things, the other agencies can actually review that code and help each other from that perspective as well.

It's actually fairly hard. As we build more projects, it's going to put a little bit of a strain on our IT security team, doing an extra level of scrutiny to make sure that the code going out is safe. But the only way to get there is to grow that pie. And I think by talking with other agencies, we'll be able to do that.

A classic open source koan is that "with many eyes, all bugs become shallow." In IT security, is it that with many eyes, all worms become shallow?

Burton: What the Department of Defense said was if someone has malicious intent and the code isn't available, they'll have some way of getting the code. But if it is available and everyone has access to it, then any vulnerabilities that are there are much more likely to be corrected than before they're exploited.

How do you see open source contributing to your ability to get insights from large amounts of data? If you're recruiting developers, can they actually make a difference in helping their fellow citizens?

Burton: It's all about recruiting. As we go out and we bring on data people and software developers, we're looking for that kind of expertise. We're looking for people who have worked with PostgreSQL. We're looking for people who have worked with Solr. We're looking for people who have worked with Hadoop, because then we can start to build that expertise in-house. Those tools are out there.

R is an interesting example. What we're finding is that as more people come out of academia into the professional world, they're used to using R from school. And then they have to come out and learn a different tool when they're actually working in the marketplace.

It's similar with the Mac versus the PC. You get people using the Mac in college — and suddenly they have to go to a Windows interface. Why impose that on them? If they're going to be extremely productive with a tool like R, why not allow that to be used?

We're starting to see, in some pockets of the bureau, a push from the business side to actually use some of these tools, which is great. That's another change, I think, that's happened in the last couple of years.

Before, there would've been big resistance to that kind of thing. Now that we're getting pushed a little bit, we have to respond to that. We also think it's worth doing.


Carsharing saves U.S. city governments millions in operating costs

One of the most dynamic sectors of the sharing economy is the trend in large cities toward more collaborative consumption — and the entrepreneurs have followed, from Airbnb to Getable to Freecycle. Whether it's co-working, bike sharing, exchanging books and videos, or cohabiting hackerspaces and community garden spaces, there are green shoots throughout the economy that suggest the way we work, play and learn is changing due to the impact of connection technologies and the Great Recession.

This isn't just about the classic dilemma of "buy vs. rent." It's about whether people or organizations can pool limited resources to more efficiently access tools or services as needed and then pass them back into a commons, if appropriate.

Speaking to TechCrunch last year, Lauren Anderson floated the idea that a collaborative consumption revolution might be as "significant as the Industrial Revolution." We'll see about that. The new sharing economy is clearly a powerful force, as a recent report (PDF) by Latitude Research and Shareable Magazine highlighted, but it's not clear yet if it's going to transform society and production in the same way that industrialized mass production did in the 19th and 20th centuries.

Infographic from "The New Sharing Economy" study. Read the report (PDF).

Carsharing is saving

What is clear is that, after years of spreading through the private sector, collaborative consumption is coming to government, and it's making a difference. A specific example: Carsharing via Zipcar in city car fleets is saving money and enabling government to increase its efficiency and decrease its use of natural resources.

After finally making inroads into cities, Zipcar is saving taxpayers real money in the public sector. Technology developed by the car-sharing startup is being used in 10 cities and municipalities in 2012. If data from a pilot with the United States General Services Administration fleet pans out, the technology could also be adopted across the sprawling federal agency's vehicles, saving tens of millions of dollars in operating expenses through smarter use of new technology.

"Now the politics are past, the data are there," said Michael Serafino, general manager for Zipcar's university and FastFleet programs, in a phone interview. "Collaborative consumption isn't so difficult from other technology. We're all used to networked laser printers. The car is just a tool to do business. People are starting to come around to the idea that it can be shared."

As with many other city needs, vehicle fleet management in the public sector shares commonalities across all cities. In every case, municipal governments need to find a way to use the vehicles that the city owns more efficiently to save scarce funds.

The FastFleet product has been around for a little more than three years, said Serafino. Zipcar started it in beta and then took a "methodical approach" to rolling it out.

FastFleet uses the same mechanism that's used throughout thousands of cars in the Zipcar fleet: a magnetized smartcard paired with a card reader in the windshield that can communicate with a central web-based reservation system.

There's a one-time setup charge to get a car wired for the system and then a per-month charge for the FastFleet service. The cost of that installation varies, depending on the make and type of vehicle and the tech that goes into it. Zipcar earns its revenue in a model quite similar to cloud computing and software-as-a-service, where operational costs are billed based upon usage.

Currently, Washington, D.C., Chicago, Santa Cruz, Calif., Boston, New York and Wilmington, Del. are all using FastFleet to add carsharing capabilities to their fleets, with more cities on the way. (Zipcar's representative declined to identify which municipalities are next.)

Boston's pilot cut its fleet in half

"Lots of cities have departments where someone occasionally needs a car," said Matthew Mayrl, chief of staff in the Boston Public Works department, during a phone interview.

"They buy one and then use it semi-frequently, maybe one to two times per week. But they do need it, so they can't give up the car. That means it's not being used for highest utilization."

The utilization issue is the key pain point, in terms of both efficiency and cost. Depending on the make and model, it generally costs a municipality between $3,000 and $7,000 to operate a vehicle, said Serafino. "Utilization is about 30% in most municipal fleets," he said.

That's where collaborative consumption became relevant to Boston. Mayrl said Boston's Public Works Department talked to Zipcar representatives with two goals in mind: get out of a manual reservation system and reduce the number of cars the city uses, which would reduce costs in the process. "Our public works was, for a long time, administered by a city motor pool," Mayrl said. "It was pretty old school: stop by, get keys, borrow a car."

While Boston did decide to join up with Zipcar, public sector workers aren't using actual Zipcars. The city has licensed Zipcar's FastFleet technology and is adding it to the existing fleet.

One benefit to using just the tech is that it can be integrated with cars that are already branded with the "City of Boston," pointed out Mayrl. That's crucial when the assessing office is visiting a household, he said: In that context, it's important to be identified.

Boston started a pilot in February that was rolled out to existing users of public works vehicles, along with two pilots in assessing and the Department of Motor Vehicles. The program started by taking the oldest cars off the road and training the relevant potential drivers. Using carsharing, the city of Boston was able to reduce the number of vehicles in the pilot by over 50%.

"Previously, there were 28 cars between DPW [the Public Works department] and those elsewhere in the department," said Mayrl. "That's been cut in half. Now we have 12 to 14 cars without any missed reservations. This holds a lot of promise, only a month in. We don't have to worry about maintenance or whether someone is parked in the wrong place or cleaning snow off a car. We hope that if this is successful, we can roll it out to other departments."

The District's fleet gets leaner

While a 50% reduction in fleet size looks like significant cost savings, Serafino said that a 2:1 ratio is actually a conservative number.

"We strive for 3:1," Serafino said. "The one thing we have is data. We capture and gather data from every single use of every single vehicle by every single driver, at a very granular level, including whenever a driver gets in and out. That allows a city to measure real utilization and efficiency. Using those numbers, officials can drive policy and other things. You can take effective utilization and real utilization and say, 'we're taking away these four cars from this area.' You can use hard data gathered by the system to make financial and efficiency decisions."

Based upon the results to date, Serafino said he expects Washington, DC, to triple its investment in the system. "The original pilot was started by a mandated reduction by [former DC Mayor Adrian] Fenty, who said 'make this goal,' and 'get it done by this date.' Overall, DC went from 365 to 80 vehicles by consolidating and cooperating."

Serafino estimated the reduction represents about 50% of the opportunity for DC to save money. "The leader of the DC Department of Public Works wants to do more," he said. "The final plans are to get to a couple of hundred vehicles under management, resulting in another reduction by at least 200 cars." Serafino estimated potential net cost savings would be north of $1 million per year.

There is a floor, however, for how lean a city's car fleet can become — and a ceiling for optimal utilization as well.

"The more you reduce, the harder it gets," said Serafino. "DC may have gone too far, by going down to 80 [vehicles]. It has hurt mobility." If you cut into fat deep enough, in other words, eventually you hit muscle and bone.

"DC is passing 70% utilization on a per-day basis," said Serafino. "They have three to four people using each of the cars every day. The trip profile, in the government sense, is different from other customers. We don't expect to go over 80%. There is a point where you can get too lean. DC has kind of gotten there now."

In Boston, Mayrl said they did a financial analysis of how to reduce costs from their car fleet. "It was cheaper to better manage the cars we have than to buy new ones. Technology helps us do that. [Carsharing] had already been done in a couple of other cities. Chicago does it. The city of DC does it. We went to a competitive bid for an online vehicle fleet management software system. [Zipcar] was the only respondent."

Given that FastFleet has been around for more than three years and there's a strong business case for employing the technology, the rate of adoption by American cities might seem to be a little slow to outside observers. What would be missing from that analysis are the barriers to entry for startups that want to compete in the market for government services.

"What hit us was the sales cycle," said Zipcar's Serafino. "The average is about 18 months to two years on city deals. That's why they're all popping now, with more announcements to come soon."

The problem, Serafino mused, was not making the case for potential cost savings. "Cities will only act as sensitive as politics will allow," said Serafino.

"Boston, San Francisco, New York and Chicago are trying. The problem is the automotive and vehicle culture," Serafino said. "That, combined with the financial aspects of decentralized budgeting for fleets, is the bane of fleet managers. Most automotive fleet managers in cities don't control their own destinies. Chicago is one of the very few cities where they can control the entire fleet.

Cities do have other options to use technology to manage their car fleets, from telematics providers to GPS devices to web-based reservation systems, each of which may be comparatively less expensive to buy off the shelf.

One place that Zipcar will continue to face competition at the local level is from companies that provide key vending machines, which are essentially automated keyboxes mounted on garage walls.

"You go get a key and go to a car," said Serafino. "If you have 20 cars in one location, it's not as likely to make sense to choose our system. If you have 50 cars in three locations, that's a different context. You can't just pick up a keybox and move it."

Collaborative consumption goes federal?

Zipcar is continuing along the long on-ramp to working with government. The next step for the company may be to help Uncle Sam with the federal government's car fleet.

As noted previously, the U.S. General Services Administration (GSA) has already done a collaborative consumption pilot using part of its immense vehicle fleet. Serafino says the GSA is now using that data to prepare a broader procurement action for a request for proposals.

The scale for potential cost savings is significant: The GSA manages some 210,000 vehicles, including a small but growing number of electric vehicles.

Given congressional pressure to find cost savings in the federal budget, if the GSA can increase the utilization of its fleet in a way that's even vaguely comparable to the savings that cities are finding, collaborative consumption could become quite popular in Congress.

If carsharing at the federal level succeeds similarly well at scale, members of Congress and staff who became familiar with collaborative consumption through the wildly popular Capital Bikeshare program may well see the sharing economy in a new light.

"There's a broader international trend to work to share resources more efficiently, from energy to physical infrastructure," said Mayrl. "Like every good city, we're copying the successful stuff elsewhere."


April 09 2012

The Consumer Financial Protection Bureau shares code built for the people with the people

Editor's Note: This guest post is written by Matthew Burton, the acting deputy chief information officer of the Consumer Financial Protection Bureau (@CFPB). The quiet evolution in government IT has been a long road, with many forks. In the original version of this piece, published on the CFPB's blog, Burton needed to take the time to explain what open source software is because many people in government and citizens in the country still don't understand it, unlike readers here at Radar. That's why the post below includes a short section outlining the basics of open source. — Alex Howard.


The Consumer Financial Protection Bureau (CFPB) was fortunate to be born in the digital era. We've been able to rethink many of the practices that make financial products confusing to consumers and certain regulations burdensome for businesses. We've also been able to launch the CFPB with a state-of-the-art technical infrastructure that's more stable and more cost-effective than an equivalent system was just 10 years ago.

Many of the things we're doing are new to government, which has made them difficult to achieve. But the hard part lies ahead. While our current technology is great, those of us on the CFPB's Technology & Innovation team will have failed if we're still using the same tools 10 years from now. Our goal is not to tie the Bureau to 2012's technology, but to create something that stays modern and relevant — no matter the year.

Good internal technology policies can help, especially the policy that governs our use of software source code. We are unveiling that policy today.

Source code is the set of instructions that tells software how to work. This is distinct from data, which is the content that a user inputs into the software. Unlike data, most users never see software source code; it works behind the scenes while the users interact with their data through a more intuitive, human-friendly interface.

Some software lets users modify its source code, so that they can tweak the code to achieve their own goals if the software doesn't specifically do what users want. Source code that can be freely modified and redistributed is known as "open-source software," and it has been instrumental to the CFPB's innovation efforts for a few reasons:

  • It is usually very easy to acquire, as there are no ongoing licensing fees. Just pay once, and the product is yours.
  • It keeps our data open. If we decide one day to move our website to another platform, we don't have to worry about whether the current platform is going to keep us from exporting all of our data. (Only some proprietary software keeps its data open, but all open source software does so.)
  • It lets us use tailor-made tools without having to build those tools from scratch. This lets us do things that nobody else has ever done, and do them quickly.

Until recently, the federal government was hesitant to adopt open-source software due to a perceived ambiguity around its legal status as a commercial good. In 2009, however, the Department of Defense made it clear that open source software products are on equal footing with their proprietary counterparts.

We agree, and the first section of our source code policy is unequivocal: We use open-source software, and we do so because it helps us fulfill our mission.

Open-source software works because it enables people from around the world to share their contributions with each other. The CFPB has benefited tremendously from other people's efforts, so it's only right that we give back to the community by sharing our work with others.

This brings us to the second part of our policy: When we build our own software or contract with a third party to build it for us, we will share the code with the public at no charge. Exceptions will be made when source code exposes sensitive details that would put the Bureau at risk for security breaches; but we believe that, in general, hiding source code does not make the software safer.

We're sharing our code for a few reasons:

  • First, it is the right thing to do: the Bureau will use public dollars to create the source code, so the public should have access to that creation.
  • Second, it gives the public a window into how a government agency conducts its business. Our job is to protect consumers and to regulate financial institutions, and every citizen deserves to know exactly how we perform those missions.
  • Third, code sharing makes our products better. By letting the development community propose modifications, our software will become more stable, more secure, and more powerful with less time and expense from our team. Sharing our code positions us to maintain a technological pace that would otherwise be impossible for a government agency.

The CFPB is serious about building great technology. This policy will not necessarily make that an easy job, but it will make the goal achievable.

Our policy is available in three formats: HTML, for easy access; PDF, for good presentation; and as a GitHub Gist, which will make it easy for other organizations to adopt a similar policy and will allow the public to easily track any revisions we make to the policy.

If you're a coder, keep an eye on our GitHub account. We'll be releasing code for a few projects in the coming weeks.


April 01 2012

What is smart disclosure?

Citizens generate an enormous amount of economically valuable data through interactions with companies and government. Earlier this year, a report from the World Economic Forum and McKinsey Consulting described the emergence of personal data as "a new asset class." The value created from such data does not, however, always go to the benefit of consumers, particularly when third parties collect it, separating people from their personal data.

The emergence of new technologies and government policies has provided an opportunity to both empower consumers and create new markets from "smarter disclosure" of this personal data. Smart disclosure is when a private company or government agency provides people with periodic access to their own data in open formats that enable them to easily put the data to use. Specifically, smart disclosure refers to the timely release of data in standardized, machine-readable formats in ways that enable consumers to make better decisions about finance, healthcare, energy or other contexts.

Smart disclosure is "a new tool that helps provide consumers with greater access to the information they need to make informed choices," wrote Cass Sunstein, the U.S. administrator of the White House Office of Information and Regulatory Affairs (OIRA), in a post on smart disclosure on the White House blog. Sunstein delivered a keynote address at the White House Summit on smart disclosure at the U.S. National Archives on Friday. He authored a memorandum providing  guidance on smart disclosure guidance from OIRA in September 2011.

Smart disclosure is part of the final United States National Action Plan for its participation in the Open Government Partnership. Speaking at the launch of the Open Government Partnership in New York City last September, the president specifically referred to the role of smart disclosure in the United States:

"We’ve developed new tools -- called 'smart disclosures' -- so that the data we make public can help people make health care choices, help small businesses innovate, and help scientists achieve new breakthroughs," said President Obama. "We’ve been promoting greater disclosure of government information, empowering citizens with new ways to participate in their democracy," said President Obama. "We are releasing more data in usable forms on health and safety and the environment, because information is power, and helping people make informed decisions and entrepreneurs turn data into new products, they create new jobs."

In the months since the announcement, the U.S. National Science and Technology Council established a smart disclosure task force dedicated to promoting better policies and implementation across government.

"In many contexts, the federal government uses disclosure as a way to ensure that consumers know what they are purchasing and are able to compare alternatives," wrote Sunstein at the White House blog. "Consider nutrition facts labels, the newly designed automobile fuel economy labels, and ChooseMyPlate.gov.  Modern technologies are giving rise to a series of new possibilities for promoting informed decisions."

Smart disclosure is a "case of the Administration asking agencies to focus on making available high value data (as distinct from traditional transparency and accountability data) for purposes other than decreasing corruption in government," wrote New York Law School professor Beth Noveck, the former U.S. deputy chief technology officer for open government, in an email. "It starts from the premise that consumers, when given access to information and useful decision tools built by third parties using that information, can self-regulate and stand on a more level playing field with companies who otherwise seek to obfuscate." The choice of Todd Park as United States CTO also sends a message about the importance of smart disclosure to the administration, she said.

The United Kingdom's “midata” initiative is an important smart disclosure case study outside of the United States. Progress there has come in large part because the UK, unlike the United States, has a privacy law that gives citizens the right to access their personal data held by private companies. In the UK, however, companies have been complying with the law in a way that did not realize the real potential value of that right to data: a citizen could request personal data and it would arrive in the mail weeks later, at a cost of a few dozen pounds. The UK government has launched a voluntary public-private partnership to enable companies to comply with the law by making the data available online in open formats. The recent introduction of the Consumer Privacy Bill of Rights from the White House and the Privacy Report from the FTC suggests that such rights to personal data ownership might be negotiated, in principle, much as the right to credit reports was in the past.

Four categories of smart disclosure

One of the most powerful versions of smart disclosure is when data on products or services (including pricing algorithms, quality, and features) is combined with personal data (like customer usage history, credit score, health, energy and education data) into "choice engines" (like search engines, interactive maps or mobile applications) that enable consumers to make better decisions in context, at the point of a buying or contractual decision; a minimal sketch of this pattern follows the list below. There are four broad categories where smart disclosure applies:

  1. When government releases data about products or services. For instance, when the Department of Health and Human Services releases hospital quality ratings, the Securities and Exchange Commission releases public company financial filings in machine-readable formats at XBRL.SEC.gov, or the Department of Education puts data about more than 7,000 institutions online in a College Navigator for prospective students.
  2. When government releases personal data about a citizen. For instance, when the Department of Veterans Affairs gives veterans access to their health records using the "Blue Button" or the IRS provides citizens with online access to their electronic tax transcript. The work of BrightScope liberating financial advisor data and 401(k) data has been an early signal of how data drives the innovation economy.
  3. When a private company releases information about products or services in machine-readable formats. Entrepreneurs can then use that data to empower consumers. For instance, both Billshrink.com and Hello Wallet can enhance consumer finance decisions.
  4. When a private company releases personal data about usage to a citizen. For instance, when a power utility provides a household access to its energy usage data through the Green Button, or when banks allow customers to download their transaction histories in a machine-readable format to use at Mint.com or similar services. As with the Blue Button for healthcare data and consumer finance, the White House asserts that providing energy consumers with secure access to information about energy usage will increase innovation in the sector and empower citizens with more information.
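To illustrate the "choice engine" pattern referenced above, here is a minimal Python sketch that combines machine-readable product data with a consumer's own usage history to rank options. The plans, prices and usage figures are invented for the example; no real provider's data or API is assumed.

```python
# Minimal "choice engine" sketch: rank hypothetical phone plans by what they
# would actually cost given one consumer's disclosed usage. All data invented.
from dataclasses import dataclass

@dataclass
class Plan:
    name: str
    monthly_fee: float
    included_minutes: int
    overage_per_minute: float

def monthly_cost(plan, minutes_used):
    overage = max(0, minutes_used - plan.included_minutes)
    return plan.monthly_fee + overage * plan.overage_per_minute

# Product data, as a provider or agency might publish it in machine-readable form.
plans = [
    Plan("Basic", 29.99, 300, 0.25),
    Plan("Plus", 49.99, 900, 0.10),
    Plan("Unlimited", 69.99, 10**9, 0.0),
]

# Personal data, as smart disclosure would return it to the consumer.
my_average_minutes = 650

for plan in sorted(plans, key=lambda p: monthly_cost(p, my_average_minutes)):
    print(f"{plan.name}: ${monthly_cost(plan, my_average_minutes):.2f}/month")
```

The interesting part is the combination: neither the product data nor the personal usage history is very useful on its own, but together they turn a confusing comparison into a simple ranked list.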

An expanding colorwheel of buttons

Should smart disclosure initiatives continue to gather steam, citizens could see “Blue Button”-like and "Green Button"-like solutions for every kind of data government or industry collects about citizens. For example, the Department of Defense has military training and experience records. Social Security and the Internal Revenue Service have citizens' financial histories, such as earnings and income. The Department of Veterans Affairs and Centers for Medicare and Medicaid Services have personal health records.

More "Green Button"-like mechanisms could enable secure, private access to private industry collects about citizen services. The latter could includes mobile phone bills, credit card fees, mortgage disclosures, mutual fund fee and more, except where there are legal restrictions, as for national security reasons.

Earlier this year, influential venture capitalist Fred Wilson encouraged entrepreneurs and VCs to get behind open data. Writing on his widely read blog, Wilson urged developers to adopt the Green Button.

"This is the kind of innovation that gets me excited," Wilson wrote. "The Green Button is like OAuth for energy data. It is a simple standard that the utilities can implement on one side and web/mobile developers can implement on the other side. And the result is a ton of information sharing about energy consumption and in all likelihood energy savings that result from more informed consumers.

When citizens gain access to data and put it to work, they can tap it to make better choices about everything from finance to healthcare to real estate, much in the same way that Web applications like Hipmunk and Zillow let consumers make more informed decisions.

"I'm a big fan of simplicity and open standards to unleash a lot of innovation," wrote Wilson. "APIs and open data aren't always simple concepts for end users. Green Buttons and Blue Buttons are pretty simple concepts that most consumers will understand. I'm hoping we soon see Yellow Buttons, Red Buttons, Purple Buttons, and Orange Buttons too. Let's get behind these open data initiatives. Let's build them into our apps. And let's pressure our hospitals, utilities, and other institutions to support them."

The next generation of open data is personal data, wrote open government analyst David Eaves this month:

I would love to see the blue button and green button initiative spread to companies and jurisdictions outside the United States. There is no reason why for example there cannot be Blue Buttons on the Provincial Health Care website in Canada, or the UK. Nor is there any reason why provincial energy corporations like BC Hydro or Bullfrog Energy (there's a progressive company that would get this) couldn't implement the Green Button. Doing so would enable Canadian software developers to create applications that could use this data and help citizens and tap into the US market. Conversely, Canadian citizens could tap into applications created in the US.

The opportunity here is huge. Not only could this revolutionize citizens' access to their own health and energy consumption data, it would reduce the costs of sharing health care records, which in turn could potentially create savings for the industry at large.

Data drives consumer finance innovation

Despite recent headlines about the Green Button and the household energy data market, the biggest US smart disclosure story of this type is currently consumer finance, where there is already significant private sector activity.

For instance, a consumer who visits Billshrink.com can get personalized recommendations for a cheaper cell phone plan based on his or her calling history. Mint.com will make specific recommendations on how to save (and alternative products to use) based on an analysis of the accounts it pulls data from. Hello Wallet is enabled by smart disclosure by banks and government data. The sector's success hints at the innovation that's possible when people get open, portable access to their personal data in a consumer market of sufficient size and value to attract entrepreneurial activity.

Such innovation is enabled in part because entrepreneurs and developers can go directly to data aggregation intermediaries like Yodlee or CashEdge and license the data, meaning that they do not have to strike deals directly with each of the private companies or build their own screen scraping technology, although some do go it alone.

"How do people actually make decisions?  How can data help improve those decisions in complex markets?  Research questions like these in behavioral economics are priorities for both the Russell Sage Foundation and the Alfred P. Sloan Foundation," said Daniel Goroff, a Sloan Program Director, in an interview yesterday.  "That's why we are launching a 'Smart Disclosure Research and Demonstration Design Competition.'  If you have ideas and want to win a prize,  please send Innocentive.com a short essay.  Even if you are not in a position to carry out the work, we are especially interested in finding and funding projects that can help measure the costs and benefits of existing or novel 'choice engines.'" 

What is the future of smart disclosure?

This kind of vibrant innovation could spread to many other sectors, like energy, health, education, telecommunication, food and nutrition, if relevant data were liberated. The Green Button is an early signal in this area, with the potential to spread to 27 million households around the United States. The Blue Button, with over 800,000 current users, is spreading to private health plans like Aetna and Walgreens, with the potential to spread to 21 million users.

Despite an increasing number of powerful tools that enable data journalists and scientists to interrogate data, many of even the most literate consumers do not look at data themselves, particularly if it is in machine-readable, as opposed to human-readable, formats. Instead, they digest it from ratings agencies, consumer reports and guides to the best services or products in a given area. Increasingly, entrepreneurs are combining data with applications, algorithms and improved user interfaces to provide consumers with "choice engines."

As Tim O'Reilly outlined in his keynote speech yesterday, the future of smart disclosure includes more than quarterly data disclosure from the SEC or banks. If you're really lining up with the future, you have to think about real-time data and real-time data systems, he said. Tim outlined 10 key lessons in his presentation, an annotated version of which is embedded below.

The Future of Smart Disclosure (PDF), a presentation by Tim O'Reilly

When released through smart disclosure, data resembles a classic "public good" in a broader economic sense. Disclosures of such open data in a useful format are currently under-produced by the marketplace, suggesting a potential role for government in facilitating its release. Generally, consumers do not have access to it today.

Well over a century ago, President Lincoln said that "the legitimate object of government is to do for the people what needs to be done, but which they cannot by individual effort do at all, or do so well, for themselves." The thesis behind smart disclosure in the 21st century is that when consumers have access to that personal data and the market creates new tools to put it to work, citizens will be empowered to make economic, education and lifestyle choices that enable them to live healthier, wealthier, and -- in the most aspirational sense -- happier lives.

"Moving the government into the 21st century should be applauded," wrote Richard Thaler, an economics professor at the University of Chicago, in the New York Times last year. In a time when so many citizens are struggling with economic woes, unemployment and the high costs of energy, education and healthcare, better tools that help them invest and benefit from personal data are sorely needed..

March 09 2012

OK, I Admit It. I have a mancrush on the new Federal CTO, Todd Park

I couldn't be more delighted by the announcement today that Todd Park has been named the new Chief Technology Officer for the United States, replacing Aneesh Chopra.

I first met Todd in 2008 at the urging of Mitch Kapor, who thought that Todd was the best exemplar in the healthcare world of my ideas about the power of data to transform business and society, and that I would find him to be a kindred spirit. And so it was. My lunch with Todd turned into a multi-hour brainstorm as we walked around the cliffs of Lands End in San Francisco. Todd was on fire with ideas about how to change healthcare, and the opportunity of the new job he'd just accepted, to become the CTO at HHS.

Subsequently, I helped Todd to organize a series of workshops and conferences at HHS to plan and execute their open data strategy. I met with Todd and told him how important it was not just to make data public and hope developers would come, but to actually do developer evangelism. I told him how various tech companies ran their developer programs, including some stories about Amazon's rollout of AWS: they had first held a small, private event to which they invited people and companies who'd been unofficially hacking on their data, told them their plans, and recruited them to build apps against the new APIs that were planned. Then, when they made their public announcement, they had cool apps to show, not just good intentions.

Todd immediately grasped the blueprint, and executed with astonishing speed. Before long, he held a workshop for an invited group of developers, entrepreneurs and health data wonks to map out useful data that could be liberated, and useful applications that could be built with it. Six months later, he held a public conference to showcase the 40-odd applications that had been developed. Now in its third year, the event has grown into what Todd calls the Health Datapalooza. As noted on GigaOm, the event has already led to several venture-backed startups. (Applications are open for startups to be showcased at this year's event, June 5-6 in Washington D.C.)

Since I introduced him to Eric Ries, author of The Lean Startup, Todd has been introducing the methodology to Washington, insisting on programs that can show real results (learning and pivots) in only 90 days. He just knows how to make stuff happen.

Todd is also an incredibly inspiring speaker. At my various Gov 2.0 events, he routinely got a standing ovation. His enthusiasm, insight, and optimism are infectious.

When Todd Park talks, I listen. (Photo by James Duncan Davidson from the 2010 Gov 2.0 Summit. http://www.flickr.com/photos/oreillyconf/4967787323/in/photostream/)

Many will ask about Todd's technical credentials. After all, he is trained as a healthcare economist, not an engineer or scientist. There are three good answers:

1. Economists are playing an incredibly important role at today's technology companies, as extracting meaning and monetization from massive amounts of data becomes one of the key levers of success and competitive advantage. (Think Hal Varian at Google, working to optimize the ad auction.) Healthcare in particular is one of those areas where science, human factors, and economics are on a collision course, but virtually every sector of our nation is undergoing a transformation as a result of intelligence derived from data analysis. That's why I put Todd on my list for Forbes.com of the world's most important data scientists.

2. Todd is an enormously successful technology entrepreneur, with two brilliant companies - Athenahealth and Castlight Health - under his belt. In each case, he was able to succeed by understanding the power of data to transform an industry.

3. He's an amazing learner. In a 1998 interview about the founding of Athenahealth, he described his leadership philosophy: "Put enough of an idea together to inspire a team of really good people to jump with you into a general zone like medical practices. Then, just learn as much as you possibly can and what you really can do to be helpful and then act against that opportunity. No question."

Todd is one of the most remarkable people I've ever met, in a career filled with remarkable people. As Alex Howard notes, he should be an inspiration for more "retired" tech entrepreneurs to go into government. This is a guy who could do literally anything he put his mind to, and he's taking up the challenge of making our government smarter about technology. I want to put out a request to all my friends in the technology world: if Todd calls you and asks you for help, please take the call, and do whatever he asks.

HHS CTO Todd Park to serve as the second chief technology officer of the United States

The White House has announced that Todd Park (@Todd_Park), the chief technology officer for the Department of Health and Human Services, will step into the role left open by Aneesh Chopra, the first person to hold the newly created position.

At the White House blog, John P. Holdren, assistant to the president for science and technology and director of the White House Office of Science and Technology Policy, wrote that:

For nearly three years, Todd has served as CTO of the U.S. Department of Health and Human Services, where he was a hugely energetic force for positive change. He led the successful execution of an array of breakthrough initiatives, including the creation of HealthCare.gov, the first website to provide consumers with a comprehensive inventory of public and private health insurance plans available across the Nation by zip code in a single, easy-to-use tool.

I knew Park's young family could be a factor in whether he would be the next US CTO, given that he'd already served longer than perhaps expected. That said, if the President of the United States asked you to serve as his CTO, would you say no?

This is some of the best personnel news to come out of Washington and the federal government under President Obama. Park has been working to revolutionize the healthcare industry at HHS since 2009, and in the private sector as an entrepreneur since 1997. Now he'll have the opportunity to try to improve how the entire federal government works through technology. It's a daunting challenge, but one that he may have been born to take on. Park is charismatic, understands technology on a systems level, and has been successful in applying open innovation and a lean startup approach to government at HHS.

White House director of digital Macon Phillips was "thrilled" about the choice.

For a close observer of the impact of technology on government, it's extremely exciting to hear that HHS's "entrepreneur in residence" is moving onto a much bigger stage. Park's entrepreneurial energy and experience drive both his outlook and execution. He also seems to grok project management, which former US CIO Vivek Kundra identified as a core skill to encourage in the public sector. If he's able to harness the power of data to the benefit of the entire country, the outcome could be massive public good.

It's a shame more "Todd Parks" don't serve in government — but then there are very few of them in the world.

On a 30,000-foot level, his personal story is deeply compelling. He's the son of a brilliant immigrant who came here from Korea, attained a graduate-level education, spent his career in a company in the United States and raised a family, including a son who then went on to live the American dream, founding two successful healthcare companies and retiring a wealthy man.

From a 2008 interview on Park's background:

"My father emigrated to the United States from rural South Korea in the late 60s on a scholarship to the University of Utah. He got a PhD in chemical engineering and joined Dow Chemical. He worked there for the next 30 years. He actually has about 72 patents, more patents than anybody in Dow Chemical’s history except for Dr. Dow himself.

He raised me in a small town in Ohio. He sacrificed a lot to try to give me the best options he could. I went to Harvard for my undergrad education. I actually wanted to be in the Naval Academy and I really had my heart set on that, but then Dad and Mom sat me down one evening and said, “Son, no pressure, but we’ve wanted you to go to Harvard since before you were born.” The way they said that, I knew there was no hyperbole. I knew they were serious. I said, “Jeez, if you’re that serious about it, fine, I’ll go.” So I went.

In the matter of what I do in my life, nothing will ever compare to what my dad did: growing up in the Korean War, born dirt poor, emigrating to a brand new country, and becoming one of the most decorated chemical engineers in the world. My entire life is a quest to live up to half of what my dad actually did in some ways. That’s my background. We’re an immigrant family."

That's a powerful narrative, and one that I think should be compelling to the nation — and maybe the world — right now. Park was a successful entrepreneur, retired in his thirties to spend time with his family, and then received the call to enter public service.

As Park describes it, he was planning to retire from the 24x7 life of an entrepreneur, spend time with his young family, and become a healthcare investor when he received an email from HHS Deputy Secretary Bill Corr asking him to become the HHS CTO. As a long-time admirer of Corr, Park took the meeting.

"At the end of the meeting I said, 'This is actually a really amazing job. I'd really love to do this job, but I'll be divorced,'" Park recalled. "Bill replied that that would be bad, and if you're going to be divorced you shouldn't do this job. But why don't you go back and talk to Amy about it and see what she says?

"So I talked to Amy about it, and she was incredibly angry. But then after four days she came back to me and said, 'If they're really creating an entrepreneur in residence job at HHS, it's your national duty to take that job. And as much as I can't believe I'm saying this, I'll move back to the East Coast -- which I hate -- with our baby, to be there with you.'"

The country needs more examples of public servants like Park and his family. If Facebook, Twitter and other startups mint thousands of millionaires and a new class of founders who can "retire" early, I hope some of them will be inspired and become "entrepreneurs in residence" at the federal, state and city level as well.

Park could be a transformational figure of some magnitude in our history, if the politics, the resources and other external forces — war, natural or economic disaster — don't thwart his good work. That's all out of his control, of course, but the prospects here are notable.

For more context on the next US CTO, I've embedded my September 2010 interview with Park about his work at HHS below:

And, befitting the timing, here's an interview with Park about health data from the 2011 SXSW Interactive festival:

Congratulations to Park and condolences to HHS, which will have a hard time filling his shoes.

February 13 2012

Open innovation works in the public sector, say federal CTOs

President Barack Obama named Aneesh Chopra as the nation’s first chief technology officer in April 2009. In the nearly three years since, he was a tireless, passionate advocate for applying technology to make government and society work better. If you're not familiar with the work of the nation's first CTO, make sure to read Nancy Scola's extended "exit interview" with Aneesh Chopra at the Atlantic, where he was clear about his role: "As an advisor to the president, I have three main responsibilities," he said: "To make sure he has the best information to make the right policy calls for the country, which is a question of my judgment."

On his last day at the White House, Chopra released an "open innovator's toolkit" that highlights twenty different case studies in how he, his staff and his fellow chief technology officers at federal agencies have been trying to stimulate innovation in government.

Chopra announced the toolkit last week at a forum on open innovation at the Center for American Progress in Washington. The forum was moderated by former Virginia congressman Tom Perriello, who currently serves as counselor for policy to the Center for American Progress, and featured Todd Park, U.S. Department of Health and Human Services CTO; Peter Levin, senior advisor to the Veterans Affairs Secretary and U.S. Department of Veterans Affairs CTO; and Chris Vein, deputy U.S. CTO for government innovation at the White House Office of Science and Technology Policy. Video of the event is embedded below:

An open innovator's toolkit

"Today, we are unveiling 20 specific techniques that are in of themselves interesting and useful -- but they speak to this broader movement of how we are shifting, in many ways, or expanding upon the traditional policy levers of government," said Chopra in his remarks on Wednesday. In the interview with the Atlantic and in last week's forum, Chopra laid out four pillars in the administration's approach to open innovation:

  • Moving beyond providing public sector data by request to publishing machine-readable open data by default
  • Engaging with the public not simply as a regulator but as "impatient convener"
  • Using prizes and competitions to achieve outcomes, not just procurements
  • Focusing on attracting talented people to government by allowing them to serve as “entrepreneurs-in-residence.”

"We are clearly moving to a world where you don't just get data by requesting it but it's the default setting to publish it," said Chopra. "We're moving to a world where we're acting beyond the role of regulator to one of 'impatient convening.' We are clearly moving to a world where we're not just investing through mechanisms like procurement and RFPs to one where where we're tapping into the expertise of the American people through challenges, prizes and competition. And we are changing the face of government, recruiting individuals who have more of an entrepreneur-in-residence feel than a traditional careerist position that has in it the expectation of a lifetime of service. "

"Entrepreneurs and innovators around the country are contributing to our greater good. In some cases, they're coming in for a tour of duty, as you'll hear from Todd and Peter. But in many others, they're coming in where they can and how they can because if we tap into the collective expertise of the American people we can actually overcome some of the most vexing challenges that today, when you read the newspaper and you watch Washington, you say, 'Gosh, do we have it in us' to get beyond the divisions and these challenges, not just at the federal government but across all level of the public sector."

Open innovation, applied

Applying open innovation "is a task we’ve seen deployed effectively across our nation’s most innovative companies," writes Chopra in the memorandum on open innovation that the White House released this week. "Procter & Gamble’s “Connect+Develop” strategy to source 50% of its innovations from the outside; Amazon’s “Just Do It” awards to celebrate innovative ideas from within; and Facebook’s “Development Platform” that generated an estimated 180,000 jobs in 2011 focused on growing the economy while returning benefits to Facebook in the process."

The examples that Chopra cited are "bonafide," said MIT principal research scientist Andrew McAfee, via email. "Open innovation or crowdsourcing or whatever you want to call it is real, and is (slowly) making inroads into mainstream (i.e. non high-tech) corporate America. P&G is real. Innocentive is real. Kickstarter is real. Idea solicitations like the ones from Starbucks are real, and lead-user innovation is really real."

McAfee also shared the insight of Eric Von Hippel on innovation:

“What is changing is that it is getting easier for consumers to innovate, with the Internet and such tools, and it is becoming more visible for the same reason. Historically, though, the only person who had the incentive to publicize innovation was the producer. People build institutions around how a process works, and the mass production era products were built by mass production companies, but they weren’t invented by them. When you create institutions like mass production companies, you create the infrastructure to help and protect them, such as heavy patent protection. Now, though, we see that innovation is distributed, open, collaborative.”

In his remarks, Chopra hailed a crowdsourced approach to the design of DARPA's next-generation combat vehicle, where an idea from a U.S. immigrant led to a better outcome. "The techniques we’ve deployed along the way have empowered innovators, consumers, and policymakers at all levels to better use technology, data, and innovation," wrote Chopra in the memo.

"We’ve demonstrated that “open innovation,” the crowdsourcing of citizen expertise to enhance government innovation, delivers real results. Fundamentally, we believe that the American people, when equipped with the right tools, can solve many problems." To be fair, the "toolkit" in question amounts more to a list of links and case studies than a detailed manual or textbook, but people interested in innovating in government at the local, state and national level should find it useful.

The question now is whether the country and its citizens will be the "winners in the productivity revolutions of the future," posed Chopra, looking to the markets for mobile technology, healthcare and clean energy. In that context, Chopra said that "open data is an active ingredient" in job creation and economic development, citing existing examples. Six million Californians can now download their energy data through the Green Button, said Chopra, with new Web apps like Watt Quiz providing better interfaces for citizens to make more informed consumption decisions.

More than 76,000 Americans found places to get treatment or health services using iTriage, said Chopra, with open data spurring better healthcare decisions by a more informed mobile citizenry. He hailed the role of collaborative innovation in open government, citing the mobile healthcare app ginger.io.

Open government platforms

During his tenure as US CTO, Chopra was a proponent of open data, participatory platforms and one of the Obama administration's most prominent evangelists for the use of technology to make government more open and collaborative. Our September 2010 interview on his work is embedded below:

In his talk last Wednesday, Chopra highlighted two notable examples of open government. First, he described the "startup culture" at the Consumer Financial Protection Bureau, highlighting the process by which the new .gov agency designed a better mortgage disclosure form.

Second, Chopra cited two e-petitions to veto the Stop Online Piracy Act and Protect IP Act on the White House e-petition platform, We The People, as an important example of open government in action. The e-petitions, which gathered more than 103,000 signatures, are proof that when citizens are given the opportunity to participate, they will, said Chopra. The White House response came at a historic moment in the week the Web changed Washington. "SOPA/PIPA is exactly what We the People was meant to do," Chopra told Nancy Scola.

Traditionally, Congress formally requests a Statement of Administration Policy, called a "SAP." Requests for SAPs come in all the time from Congress. We respond based on the dynamics of Washington, priorities and timelines. One would argue that a Washington-centric approach would have been to await the request for a SAP and publish it, oftentimes when a major vote is happening. If you contrast that with where SOPA/PIPA was, still in committee or just getting out of committee, and not yet on the floor, traditionally a White House would not issue a SAP that early. So on the train we were on, the routine Washington line of business, we would have awaited the right time to issue a SAP, and done it at congressional request. It just wasn't time yet. The We the People process flipped upside-down to whom we are responsible for providing input. In gathering over a hundred thousand signatures on SOPA/PIPA, the American people effectively demanded a SAP.

Innovation for healthcare and veterans

"I think people will embrace the open innovation approach because it works," said Todd Park at last week's forum, citing examples at Novartis, Aventis and Walgreens, amongst others. Park cited "Joy's Law," by Sun Microsystems computer science pioneer Bill Joy: "no matter who you are, you have to remember that most of the smart people don't work for you."

Part of making that work is opening up systems in a way that enables citizens, developers and industry to collaborate in creating solutions. "We're moving the culture away from proprietary, closed systems … into something that is modular, standards-based & open," said Peter Levin.

If you went to the Veterans Affairs website in 2009, you couldn't see where you were in the process, said Levin. One of the ways to solve that problem is to create a platform for people to talk to each other, he explained, which the VA was able to do that through its Facebook page.

That may be a "colossal policy change," in his view, but it had an important result: "the whole patronizing fear that if we open up dialogue, open up channels, you'll create a problem you can't undo - that's not true for us," he said.

If you want to rock and roll, emphasized Park, don't just have your own smart people work on a challenge. That's an approach that Aventis executives found success with in a diabetes data challenge. Walgreens will be installing "Health Guides" at its stores to act as a free "health concierge," said Park, as opposed to what they would have done normally. They launched a challenge and, in under three months, got 50 credible prototypes. Now, said Park, mHealthCoach is building Health Guides for Walgreens.

One of the most important observations Park made, however, may have been that there has been too much of a focus on apps created from open data, as opposed to data informing policy makers and care givers. If you want to revolutionize the healthcare industry, open data needs to be at the fingertips of the people who need it most, where they need it most, when they need it most.

For instance, at a recent conference, he said, "Aetna rolled out this innovation called a nurse." If you want to have data help people, build a better IT cockpit for that nurse that helps that person become more omniscient. Have the nurse talk over the telephone with a human who can be helped by the power of the open data sitting in front of the healthcare worker.

Who will pick up the first federal CTO's baton?

Tim O'Reilly made a case for Chopra in April 2009, when the news of his selection leaked. Tim put the role of a federal CTO in the context of someone who provides "visionary leadership, to help a company (or in this case, a government) explore the transformative potential of new technology." In many respects, he delivered upon that goal during his tenure. The person who fills the role will need to provide similar leadership, and to do so in a difficult context, given economic and political headwinds that confront the White House.

As he turns the page towards the next chapter of his career -- one which, according to sources cited by the Washington Post, might lead him into politics in Virginia -- the open question now will be who President Obama will choose to be the next "T" in the White House Office of Science and Technology Policy, a role that remains undefined in terms of Congressional action.

The administration made a strong choice in federal CIO Steven VanRoekel. Inside of government, Park and Levin are both strong candidates for the role, along with Andrew Blumenthal, CTO at the Bureau of Alcohol, Tobacco and Firearms. In the interim, Chris Vein, deputy chief technology officer for public sector innovation, is carrying the open government innovation banner in the White House.

In this election year, who the administration chooses to pick up the baton from Chopra will be an important symbol of its commitment to harnessing technology on behalf of the American people. Given the need for open innovation to address the nation's grand challenges, from healthcare to energy to education, the person tapped to run this next leg will play an important role in the country's future.


December 30 2011

2011 Gov 2.0 year in review

By most accounts, the biggest stories of 2011 were the Arab Spring, the historic earthquake and tsunami in Japan, and the death of Osama Bin Laden. In each case, an increasingly networked world experienced those events together through the growing number of screens. At the beginning of the year, a Pew Internet survey emphasized the Internet's importance in civil society. By year's end, more people were connected than ever before.

Time magazine named 2011 the year of the protester, as apt a choice as "You" was in 2006. "No one could have known that when a Tunisian fruit vendor set himself on fire in a public square, it would incite protests that would topple dictators and start a global wave of dissent," noted Time. "In 2011, protesters didn't just voice their complaints; they changed the world."

The Arab Spring extended well through summer, fall and winter, fueled by decades of unemployment, repression, and autocratic rule in Tunisia, Egypt, Libya, Syria, Yemen and Bahrain. This year's timeline of protest, revolution and uprising was not created by connection technologies, but by year's end, it had been accelerated by millions of brave young people connected to one another and the rest of the world through cell phones, social networks and the Internet.  

"We use Facebook to schedule the protests, Twitter to coordinate, and YouTube to tell the world," said an unnamed activist in Cairo in January.

In the months that followed, the Occupy Wall Street movement used the same tools in the parks and streets of the United States to protest economic inequality and call for accountability in the financial industry, albeit without the same revolutionary results.

This was the year where unemployment remained stubbornly high in the United States and around the world, putting job creation and economic growth atop the nation's priority list.

The theme that defined governments in Europe, particularly England, was austerity, as a growing debt crisis and financial contagion spread and persisted throughout the year. In Washington, the theme might be gridlock, symbolized by a threatened government shutdown in April and then brinkmanship over the debt ceiling during the summer. As the year came to a close, a dispute between the White House, Senate and House over the extension of payroll tax cuts rounded out a long year of divided government.

We also saw a growing conflict between closed and open. It was a year that included social media adoption by government and a year where governments took measures to censor and block it. It was a year when we learned to think different about hacking, even while the "hacktivism" embodied in groups like Anonymous worried officials and executives in boardrooms around the world.

The United States bid farewell to its first CIO, Vivek Kundra, and welcomed his replacement, Steven VanRoekel, who advanced a "future first" vision for government that focuses on cloud, open standards, modularity and shared services. VanRoekel brought a .com mentality to the FCC, including a perspective that "everything should be an API," which caught the attention of some tech observers. While Kundra may have left government, his legacy remains: cloud computing and open data aren't going away in federal government, according to his replacement and General Services Administration (GSA) officials.

This was the year where the death of Steve Jobs caused more than a few people to wonder what Jobs would do as president. His legacy will resonate for many years to come, including the App Store that informed the vision of government as a platform.

If you look back at a January interview with Clay Johnson on key trends for Gov 2.0 and open government in 2011, some of his predictions bore out. The House of Representatives did indeed compete with the White House on open government, though not in story lines that played out in the national media or Sunday morning talk shows. The House Oversight and Government Reform Committee took a tough look at the executive branch's progress in a hearing on open government. Other predictions? Not so much. Rural broadband stalled. Transparency as infrastructure is still in the future; we're still waiting for it to be automated, though when the collective intelligence of Washington examines new versions of bills tied to the social web, there's at least a kludge.

Many of the issues and themes in 2011 were extensions of those in the 2010 Gov 2.0 Year in Review: the idea of government as a platform spread around the world; gated governments faced disruption; open government initiatives were stuck in beta; open data went global; and laws and regulations were chasing technology, online privacy, cloud computing, open source and citizen engagement.

"It's tough to choose which issue dominated the year in transparency, but I'd say that the Open Government Partnership, the E-government funding fight, and the Super Committee all loomed large for Sunlight," said John Wonderlich, policy director for the Sunlight Foundation. "On the state level, I'd include Utah's fight over FOI laws, Tennessee's Governor exempting himself from financial disclosure requirements, and the Wisconsin fight as very notable issues.  And the rise of Super PACs and undisclosed money in politics is probably an issue we're only just starting to see."

Three dominant tech policy issues

Privacy, identity and cybersecurity dominated tech policy headlines coming out of D.C. all year. By year's end, however, no major cybersecurity or consumer privacy bill had made it through the U.S. Congress to the president's desk. In the meantime, the Federal Trade Commission (FTC) made its own moves. As a result, Google, Facebook and Twitter are all now subject to "audits" by the FTC every two years.

On the second issue, online identity, there was progress: the U.S. government's National Strategy for Trusted Identities in Cyberspace addressed key issues around creating an "identity ecosystem" online. Implementation, however, will require continued effort and innovation from the private sector. By year's end, Verizon became the first identity provider to receive Level of Assurance 3 credentialing from the U.S. government. Look for more identity providers to follow in 2012, with citizens gaining increased access to government services online as a result.

A meme goes mainstream

This was the year when the story of local governments using technology with citizens earned more attention from mainstream media, including outlets like the Associated Press and National Public Radio.

In February, the AP published a story about how cities are using tech to cull ideas from citizens. In the private sector, leveraging collective intelligence is often called crowdsourcing. In open government, it's "citizensourcing." In cities around the country, the approach is gaining traction.

At Yahoo Canada, Carmi Levy wrote that the future of government is citizen focused. In his view, open government is about leveraging technology and citizens to do more with less. It's about doing more than leaving or speaking up: it's about making government work better.

In November, NPR listeners learned more about the open government movement around the country when the Kojo Nnamdi Show hosted an hour-long discussion on local Gov 2.0 on WAMU in Washington, D.C. Around the same time, the Associated Press reported that a flood of government data is fueling the rise of city apps:

New York, San Francisco and other cities are now working together to develop data standards that will make it possible for apps to interact with data from any city. The idea, advocates of open data say, is to transform government from a centralized provider of services into a platform on which citizens can build their own tools to make government work better.

Gov 2.0 goes local

All around the country, pockets of innovation and creativity could be found, as "doing more with less" became a familiar mantra in many councils and state houses. New open data platforms or citizen-led initiatives sprouted everywhere.

Here's just a sample of what happened at the local level in 2011:

If you want the full fire hose, including setbacks to open government on the state level, read the archives of the Sunlight Foundation's blog, which aggregated news throughout the year.

Several cities in the United States hopped on the open government and open data bandwagon in 2011. Baltimore empowered its citizens to act as sensors with new mobile apps and Open311. New York City is opening government data and working to create new relationships with citizens and civic developers in the service of smart government. Further afield, Britain earned well-deserved attention for seeking alpha, with its web initiatives and an open architecture that could be relevant to local governments everywhere.
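Open311 is worth a brief technical aside, because it shows why open standards travel between cities so easily. Its GeoReport v2 specification exposes citizen service requests (potholes, broken streetlights, graffiti) as simple HTTP endpoints that return JSON or XML, so the same client code can talk to any city that implements it. The sketch below is a minimal illustration of reading open requests from such an endpoint; the base URL is hypothetical, and the field names follow the GeoReport v2 spec as published, so any particular city's deployment may differ.

```python
import json
import urllib.parse
import urllib.request

# Hypothetical Open311 GeoReport v2 endpoint -- replace with a real city's base URL.
BASE_URL = "https://open311.example.gov/dev/v2"

def fetch_service_requests(service_code=None, status="open"):
    """Fetch recent service requests (e.g., pothole reports) from an Open311 server."""
    params = {"status": status}
    if service_code:
        params["service_code"] = service_code
    url = "{}/requests.json?{}".format(BASE_URL, urllib.parse.urlencode(params))
    with urllib.request.urlopen(url) as response:
        return json.load(response)

if __name__ == "__main__":
    for request in fetch_service_requests():
        # Field names follow the GeoReport v2 spec; individual cities may omit some.
        print(request.get("service_name"), request.get("status"),
              request.get("requested_datetime"), request.get("address"))
```

Because the interface is the standard, not the software, a report filed from a generic mobile app in Baltimore can flow into that city's work-order system just as it would anywhere else that speaks GeoReport v2.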

In 2011, a model open government initiative gained traction in Cook County. In 2012, we'll see if other municipalities follow. The good news is that the Pew Internet & American Life Project found that open government is tied to higher levels of community satisfaction. That carrot for politicians comes up against the reality that in a time of decreased resources, being more open has to make economic sense and lead to better services or more efficiency, not just be "the right thing to do."

One of the best stories in open government came from Chicago, where sustainability and analytics are guiding Chicago's open data and app contest efforts. The city's approach offers important insights to governments at all levels. Can the Internet help disrupt the power of Chicago lobbyists through transparency? We'll learn more in 2012.

Rise of the civic startups

This year, early entrants like SeeClickFix and Citysourced became relatively old hat with the rise of a new class of civic startups that aspire to interface with the existing architectures of democracy. Some hope to augment what exists, others to replicate democratic institutions in digital form.  [Disclosure: O'Reilly AlphaTech Ventures is an investor in SeeClickFix.]

This year, new players like ElectNext, OpenGovernment.org, Civic Commons, Votizen and POPVOX entered the mix alongside many other examples of social media and government innovation. [Disclosure: Tim O'Reilly was an early angel investor in POPVOX.]

In Canada, BuzzData aspires to be the GitHub of datasets. Simpl launched as a platform to bridge the connection between social innovators and government. NationBuilder went live with its new online activism platform.

Existing civic startups made progress as well. BrightScope unlocked government data on financial advisers and made the information publicly available so it could be indexed by search engines. The Sunlight Foundation put open government programming on TV and a health app in your pocket. Code for America's 2011 annual report offered insight into the startup nonprofit's accomplishments.

Emerging civic media

The 2011 Knight News Challenge winners illustrated data's ascendance in media and government. It's clear that data journalism and data tools will play key roles in the future of media and open government.

It was in that context that the evolution of Safecast offered us a glimpse into the future of networked accountability, as citizen science and open data help to inform our understanding of the world. After a tsunami caused a nuclear disaster in Japan, a radiation detection network started aggregating and publishing data. Open sensor networks look like an important part of journalism's future.

Other parts of the future of news are more nebulous, though there was no shortage of discussion about it. The question of where citizens will get their local news wasn't answered in 2011. A Pew survey of local news sources revealed the influence of social and mobile trends, along with a generation gap. As newsprint fades, what will replace it for communities? We don't know yet.

Some working models are likely to be found in civic media, where new change agents aren't just talking about the future of news; they're building it. Whether it's mobile innovation or the "Freedom Box," there's change afoot.

This was also a deadly year for journalists. The annual report from the Committee to Protect Journalists found 44 journalists were killed in the line of duty, with the deaths of dozens more potentially associated with the process of gathering and sharing information. Only one in six people lives in a country with a free press, according to the 2011 report on world press freedom from Freedom House.

Open source in government

At the federal level, open source continued its quiet revolution in government IT. In April, the new version of FCC.gov incorporated the principles of Web 2.0 into the FCC's online operations. From open data to platform thinking, the reboot elevated FCC.gov from one of the worst federal websites to one of the best. In August, the Energy Department estimated that the new Energy.gov would save $10 million annually through a combination of open source technology and cloud computing.

The White House launched IT Dashboard and released parts of it as open source code. (It remains to be seen whether the code from those platforms is re-used in the market.)

NASA's commitment to open source and its game plan for open government were up for discussion at the recent NASA Open Source Summit. One of NASA's open source projects, Nebula, saw its technology used in an eponymous startup. Nebula, the company, combines open source software and hardware in an appliance. If Nebula succeeds, its "cloud controller" could enable every company to implement cloud computing.

In cities, the adoption of "Change By Us" in Philadelphia and of OpenDataPhilly's code in Chattanooga showed the potential of reusable civic software.

At the end of 2011, Civic Commons opened up its marketplace. The Marketplace is designed to be a resource for open source government apps. As Nick Judd observed at techPresident, both Civic Commons and its Marketplace "propose to make fundamental changes to the way local governments procure IT goods and services."

Open government goes global

As White House tech talent came and went, open government continued to grow globally.

In September, a global Open Government Partnership (OGP) launched in New York City. Video of the launch, beginning with examples of open government innovation from around the world, is embedded below:

Making the Open Government Partnership work won't be easy, but it's an important initiative to watch in 2012. As The Economist's review of the Open Government Partnership highlights, one of the most important elements is the United States' commitment to join the Extractive Industries Transparency Initiative. If this initiative bears fruit, citizens will have a chance to see how much of the money oil and gas companies send to governments actually ends up in the public's coffers.

Even before the official launch of the OGP, there was reason to think that something important was afoot globally at the intersection of governments, technology and society. In Africa, the government of Kenya launched Open Kenya and looked to the country's dynamic development community to make useful applications for its citizens. In Canada, British Columbia joined the ranks of governments embracing open government platforms; citizens of the province now have three new websites that focus on open government data, make information related to accountability available and provide easier access to services and officials. In India, the seeds of Gov 2.0 started bearing fruit through a growing raft of civil society initiatives. In Russia, Rospil.info aimed to expose state corruption.

For open government advocates, the biggest advance of the year was "the recognition of the need for transparency of government information world wide as a means for holding government and its officials accountable," said Ellen Miller, executive director of the Sunlight Foundation, via email. "The transparency genie is out of the bottle — world wide — and it's not going back into the darkness of that lantern ever again.  Progress will be slow, but it will be progress."

Federal open government initiatives

"Cuts in e-gov funds, Data.gov evolution, Challenge.gov and the launch of many contests were the big stories of the year," commented Steve Ressler, the founder of Govloop. Ressler saw Gov 2.0 go from a shiny thing to people critically asking how it delivers results.

At the beginning of the year, OMB Watch released a report that found progress on open government but a long road ahead. At the end of 2011, the Sunlight Foundation assessed the Open Government Directive two years on and found "mixed results." John Wonderlich put it this way:

Openness without information is emptiness.  If some agencies won't even share the plans they've made for publishing new information, how far can their commitment to openness possibly go? The Open Government Directive has caused a lot of good.  And it has also often failed to live up to its promise, the administration's rhetoric, and agencies' own self-imposed compliance plans. We should remember that Presidential rhetoric and bureaucratic commitments are not the same thing as results, especially as even more administration work happens through broad, plan-making executive actions and plans.

In 2011, reports of the death of open government were greatly exaggerated. That doesn't mean its health in the United States federal government is robust. In popular culture, of course, its image is even worse. In April, Jon Stewart and the Daily Show mocked the Obama administration and the president for a perceived lack of transparency.

Stewart and many other commentators have understandably wondered why the president's meeting with open government advocates to receive a transparency award wasn't on the official schedule or covered by the media. A first-hand account of the meeting from open government advocate Danielle Brian offered a useful perspective on the issues that arose that go beyond a sound bite or one-liner.

Some projects are always going to be judged as more or less effective in delivering on the mission of government than others. An open government approach to creating a "Health Internet" may be the most disruptive of them. For those who expected to see rapid, dynamic changes in Washington fueled by technology, however, the bloom has long since come off of the proverbial rose. Open government is looking a lot more like an ultra-marathon than a 400-yard dash. As a conference at the National Archives reminded the open government community, media access to government information also has a long way to go.

Reports on citizen participation and rulemaking from America Speaks offered open government guidance beyond technology. Overall, the administration received mixed marks. While America Speaks found that government agencies "display an admirable willingness to experiment with new tools and techniques to involve citizens with their decision-making processes," it also found the "Open Government Initiative and most Federal Agency plans have failed to offer standards for what constitutes high-quality public participation."

On the one hand, agencies are increasing the number of people devoted to public engagement and using a range of online and offline forums. On the other, "deliberative processes, in which citizens learn, express points of view, and have a chance to find common ground, are rarely incorporated." Getting to a more social open government is going to take a lot more work.

There were other notable landmarks. After months of preparation, the federal government's newest .gov startup went live. While ConsumerFinance.gov went online back in February, the Consumer Financial Protection Bureau (CFPB) officially launched on the anniversary of H.R. 4173 (the Dodd-Frank Wall Street Reform and Consumer Protection Act), with Richard Cordray nominated to lead it. By year's end, however, he still had not been confirmed. Questions about the future of the agency remain, but to place credit where credit is due: the new consumer bureau has been open to ideas about how it can do its work better. This approach is what led New York Times personal finance columnist Ron Lieber to muse recently that "its openness thus far suggests the tantalizing possibility that it could be the nation's first open-source regulator."

When a regulator asks for help redesigning a mortgage disclosure form, something interesting is afoot.

It's extremely rare that an agency gets built from scratch, particularly in this economic and political context. It's notable, in that context, that the 21st century regulator embraced many of the principles of open government in leveraging technology to stand up the Consumer Financial Protection Bureau.

This fall, I talked with Danny Weitzner, White House deputy chief technology officer for Internet policy, about the administration's open government progress in 2011. Our interview is embedded below:

In our interview, we talked about what the Internet means to government and society, intellectual property, the risks of a balkanized Internet, digital privacy, the Direct Project, a "right to connect," ICE takedowns and open data initiatives. On the last issue, the Blue Button movement, which enables veterans to download a personal health record, now has a website: BlueButtonData.org. In September, Federal CTO Aneesh Chopra challenged the energy industry to collaborate in the design of a "green button" modeled after that Blue Button. All three of California's public utilities have agreed to standardize energy data for that idea.
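Part of Blue Button's appeal is its simplicity: the download is, at its most basic, a human-readable ASCII text file a veteran can save, print or hand to a caregiver, which also makes it easy for software to pick apart. The sketch below is a minimal illustration of splitting such a file into labeled sections; the section-header convention shown here (a title line sandwiched between rows of dashes) is an assumption for illustration, not a guarantee of how any particular provider formats its export.

```python
import re

def split_blue_button_sections(text):
    """Split a Blue Button-style ASCII export into {section_title: body} pairs.

    Assumes each section starts with a title line between rows of dashes, e.g.:
        ----------------------
        MEDICATION HISTORY
        ----------------------
    Real exports differ by provider, so treat this as a starting point.
    """
    sections = {}
    # Hypothetical header convention: dashes, title line, dashes.
    pattern = re.compile(r"-{5,}\n(?P<title>[^\n]+)\n-{5,}\n")
    matches = list(pattern.finditer(text))
    for i, match in enumerate(matches):
        start = match.end()
        end = matches[i + 1].start() if i + 1 < len(matches) else len(text)
        sections[match.group("title").strip()] = text[start:end].strip()
    return sections

if __name__ == "__main__":
    with open("my_blue_button_record.txt", encoding="utf-8") as f:
        record = split_blue_button_sections(f.read())
    for title, body in record.items():
        print(title, "->", len(body.splitlines()), "lines")
```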

Tim O'Reilly talked with Chopra and White House deputy CTO for public sector innovation Chris Vein about the White House's action plan for open government innovation at the Strata Summit in September. According to Chopra, the administration is expanding Data.gov communities to agencies, focusing on "smart disclosure" and building out "government as a platform," with an eye to embracing more open innovators.

As part of its commitments to the Open Government Partnership, the White House also launched an e-petitions platform this fall called "We The People."

The White House has now asked for feedback on the U.S. Open Government National Action Plan, focusing on best practices and metrics for public participation. Early responses include focusing on outcomes first and drawing attention to success, not compliance. If you're interested in giving your input, Chopra is asking the country questions on Quora.

Opening the People's House

Despite the abysmal public perception of Congress, genuine institutional changes in the House of Representatives, driven by the GOP embracing innovation and transparency, are incrementally happening. As Tim O'Reilly observed earlier in the year, the current leadership of the House is doing a better job on transparency than their predecessors.

In April, Speaker John Boehner and Majority Leader Eric Cantor sent a letter to the House Clerk about releasing legislative data. Then, in September, a live XML feed for the House floor went online. Yes, there's a long way to go on open legislative data quality in Congress — but at year's end,  following the first "Congressional hackathon," the House approved sweeping open data standards.

The House also made progress in opening up its recorded videos to the nation. In January, Carl Malamud helped make the hearings of the House Committee on Oversight and Government Reform available on the Internet in high-quality video at house.resource.org. Later in the year, HouseLive.gov brought live video to mobile devices.

Despite the adoption of Twitter and Facebook by the majority of senators and representatives, Congress as a whole still faces challenges in identifying constituents on social media.

It's also worth noting that, no matter what efforts have been made to open the People's House through technology, at year's end, this was the least popular Congress in history.

Strata 2012 — The 2012 Strata Conference, being held Feb. 28-March 1 in Santa Clara, Calif., will offer three full days of hands-on data training and information-rich sessions. Strata brings together the people, tools, and technologies you need to make data work.

Save 20% on registration with the code RADAR20

Open data

The open data movement received three significant endorsements on the world stage in 2011.

1. Open government data was featured in the launch of the Open Government Partnership.

That launch, however, offered an opportunity to reflect upon the fundamental conditions for open government to exist. Simply opening up data is not a replacement for a Constitution that enforces a rule of law, free and fair elections, an effective judiciary, decent schools, basic regulatory bodies or civil society, particularly if the data does not relate to meaningful aspects of society. That said, open data is a key pillar of how policy makers are now thinking about open government around the world.

2. The World Bank continued to expand what it calls "open development" with its own open data efforts

The World Bank is building upon the 2010 launch of data.worldbank.org. It's now helping countries prepare and launch open government data platforms, including support for Kenya. In December, the World Bank hosted a webinar about how countries can start and run open government data ecosystems, launched an online open data community, and published a series of research papers on the topic.

Realizing the Vision of Open Government Data (Long Version): Opportunities, Challenges and Pitfalls

3. The European Union's support for open data

The BBC reported that Europe's governments are "sitting on assets that could be worth 40bn euros ($52bn, £33.6bn) a year" in public sector data. In addition, the European Commission has launched an open data strategy for the EU. Here's Neelie Kroes, vice president of the European Commission, on public data for all:

Big data means big opportunities. These opportunities can flow from public and private data — or indeed from mixing the two. But a public sector lead can set an example, allowing the same taxpayers who have paid for the data to be gathered to benefit from its wider use. In my opinion, data should be open and available by default and exceptions should be justified — not the other way around, as is too often the case still today.

Access to public data also has an important and growing economic significance. Open data can be fuel for innovation, growth and job creation. The overall economic impact across the whole EU could be tens of billions of Euros per year. That's amazing, of course! But, big data is not just about big money. It promises a host of socially and environmentally beneficial uses too — for example, in healthcare or through the analysis of pollution patterns. It can help make citizens' lives easier, more informed, more connected.


As Glynn Moody wrote at Computer World UK, Europe is starting to get it.

Open data is not a partisan issue, in the view of professor Nigel Shadbolt. In 2012, Shadbolt will lead an "Open Data Institute" in England with Tim Berners-Lee.

Shadbolt is not out on a limb on this issue. In Canada and Britain, conservative governments supported new open data initiatives. In 2011, open government data also gathered bipartisan support in Washington when Rep. Darrell Issa introduced the DATA Act to track government financial spending. We talked about that and other open government issues this fall during an interview at the Strata Conference:

There was no shortage of other open data milestones, from Google adding the Public Data Explorer to its suite of free data tools to an International Open Government Data Camp in Poland.

In New York City, social, mapping and mobile data told the story of Hurricane Irene. In the information ecosystem of 2011, media, government and citizens alike played a critical role in sharing information about what's happening in natural disasters, putting open data to work and providing help to one another.

Here at Radar, MySociety founder Tom Steinberg sounded a cautionary note about creating sustainable open data projects with purpose. The next wave of government app contests needs to incorporate sustainability, community, and civic value. Whether developers are asked to participate in app contests, federal challenges, or civic hackathons, in 2012, the architects behind these efforts need to focus on the needs of citizens and sustainability.

Open mapping

One of the biggest challenges government agencies and municipalities have is converting open data into information from which people can easily draw knowledge. One of the most powerful ways humanity has developed to communicate information over time is through maps. If you can take data in an open form and map it out, then you have an opportunity to tell stories in a way that's relevant to a region or personalized to an individual.
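As a concrete illustration of how low that bar can be, one of the simplest paths from an open dataset to a web map is converting a CSV of records with latitude and longitude columns into GeoJSON, which most modern mapping libraries and hosted platforms can render directly. The sketch below assumes hypothetical column names (lat, lon) and a hypothetical input file; any real dataset would need its own field mapping.

```python
import csv
import json

def csv_to_geojson(csv_path, lat_field="lat", lon_field="lon"):
    """Convert a CSV of point records into a GeoJSON FeatureCollection."""
    features = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            features.append({
                "type": "Feature",
                "geometry": {
                    "type": "Point",
                    # GeoJSON coordinates are ordered [longitude, latitude].
                    "coordinates": [float(row[lon_field]), float(row[lat_field])],
                },
                # Everything else becomes attributes a map can show in a popup.
                "properties": {k: v for k, v in row.items()
                               if k not in (lat_field, lon_field)},
            })
    return {"type": "FeatureCollection", "features": features}

if __name__ == "__main__":
    # Hypothetical input: a city's open dataset of, say, restaurant inspections.
    print(json.dumps(csv_to_geojson("restaurants.csv"), indent=2))
```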

There were enough new mapping projects in 2011 that they deserved their own category. In general, the barrier to entry for mapping got lower thanks to new open source platforms like MapBox, which powered the Global Adaptation Index and a map of the humanitarian emergency in the Horn of Africa. And data.nai.org.af charted attacks on the media on an interactive map of Afghanistan.

IssueMap.org, a new project launched by the FCC and FortiusOne, aimed to convert open data into knowledge and insight. The National Broadband Map, one of the largest implementations of open source and open data in government to date, displayed more than 25 million records and incorporated crowdsourced reporting. A new interactive feature posted at WhiteHouse.gov used open data to visualize excess federal property.

"Maps can be a very valuable part of transparency in government," wrote Jack Dangermond, founder of ESRI. "Maps give people a greater understanding of the world around them. They can help tell stories and, many times, be more valuable than the data itself. They provide a context for taxpayers to better understand how spending or decisions are being made in a circumstance of where they work and live. Maps help us describe conditions and situations, and help tell stories, often related to one's own understanding of content."

Social media use grows in government

When there's a holiday, disaster, sporting event, political debate or any other public happening, we now experience it collectively. In 2011, we were reminded that there were a lot of experiences that used to be exclusively private that are now public because of the impact of social media, from breakups to flirting to police brutality. From remembering MLK online to civil disobedience at the #Occupy protests, we now can share what we're seeing with an increasingly networked global citizenry.

Those same updates, however, can be used by autocratic regimes to track down protestors, dissidents and journalists. If the question is whether the Internet and social media are tools of freedom or tools of oppression, the answer may have to be "yes." If online influence is essential to 21st century governance, however, how should government leaders proceed?

Some answers could be found in the lessons learned by the Federal Emergency Management Agency (FEMA), the Red Cross and Crisis Commons that were entered into the Congressional Record when the U.S. Senate heard testimony on the role of social media in crisis response.

If you're a soldier, you should approach social media carefully. The U.S. Army issued a handy social media manual to help soldiers, and the Department of Veterans Affairs issued a progressive social media policy.

A forum on social media at the National Archives featured a preview of a "citizen archivist dashboard" and a lively discussion of the past, present and future of social media — a future which will certainly include the growth of networks in many countries. For instance, in 2011, Chinese social media found its legs.

For a comprehensive discussion of how governments dealt with social media in 2011, check out this piece I wrote for National Journal.

Intellectual property and Internet freedom

In 2011, the United Nations said that disconnecting Internet users is a breach of human rights. That didn't stop governments around the world from considering it under certain conditions. The UN report came at an important time. As Mathew Ingram wrote at GigaOm, reporting on a UNESCO report on freedom of expression online, governments are still trying to kill, replace or undo the Internet.

In 2011, Russia earned special notice when it blocked proposals for freedoms in cyberspace. The Russian blogosphere came under attack in April. This fall, DDoS attacks were used in Russia after the elections in an attempt to squelch free speech. As Russian activists get connected, they'll be risking much to express their discontent.

In May, the eG8 showed that online innovation and freedom of expression still need strong defenders. While the first eG8 Forum in Paris featured hundreds of business and digital luminaries, the policies discussed were of serious concern to entrepreneurs, activists, media and citizens around the world. If the Internet has become the public arena for our time, as the official G8 statement that followed the Forum emphasized, then defending the openness and freedoms that have supported its development is more important than ever.

That need became clearer at year's end when the United States Congress considered anti-piracy bills that could cripple Internet industries. In 2012, the Stop Online Piracy Act (SOPA) and PROTECT IP Act will be before Congress again. Many citizens are hoping that their representatives decide not to break the Internet.

After all, if an open Internet is the basis for democracy flourishing around the world, billions of people will be counting upon our leaders to keep it open and accessible.

What story defined the year for you?

On Govloop, the government social network, the community held its own debate on the issue of the year. There, the threat of a government shutdown led the list. A related issue — "austerity" — was the story that defined government in 2011 in Chris Dorobek's poll. I asked people on Govloop, Quora, Twitter, Facebook and Google+ what the most important Gov 2.0 or open government story of 2011 was and why. Their answers focused on what happened in the U.S. rather than around the globe, but here's what I heard:

1. The departure of Kundra and White House deputy CTO for open government Beth Noveck mattered

"The biggest story of the year was Vivek Kundra and Beth Noveck leaving the White House," commented Andy Krzmarzick, director of community engagement at Govloop. "Those personnel changes really stalled momentum, generally speaking, on the federal level. I respect their successors immensely, but I think they have an uphill climb as we head into an election year and resisters dig in their heels to wait it out and see if there is a change in administration before they spend a lot of time and energy at this stage of the game. Fortunately, the movement has enough of a ground swell that we'll carry the torch forward regardless of leadership ... but it sure helps to have strong champions."

Terell Jones, director of green IT solutions at EcomNets, agreed. "The departure of Vivek Kundra as CIO of the United States. Under his watch they developed the Cloud Computing Strategy, the 25 Point Plan, and the Federal Data Center Consolidation Initiative (FDCCI). He saved the federal government millions, but they cut his budget so he would be ineffective; so, he escaped to Harvard University," commented Jones. "He may have been frustrated with the speed at which government moves, but he made great strides in the right direction. I hope his replacement will stay the course."

2. Budget cuts to the Office of Management and Budget's E-Government Fund

"I think the biggest story is the Open Government budget cuts," commented Steve Radick, a lead associate with Booz Allen Hamilton, which consults with federal agencies. "After all, these seemed to be the writing on the wall for Vivek's departure, and forced everyone to re-think why open government was so important. It wasn't just for the sake of becoming a more open government — open government needed to be about more than that. It needed to show real mission impact. I think these budget cuts and the subsequent realization of the Gov 2.0 community that Gov 2.0 efforts needed to be deeper than just retweets, friends, and fans was the biggest story of 2011."

3. Insider trading in Congress

"I think the most important story of the year was the 60 Minutes expose on insider trading in Congress," commented Joe Flood, a D.C.-area writer and former web editor at DC.gov and NOAA. "It demonstrated the power of data to illuminate connections that were hidden, showing how members of Congress made stock trades based upon their inside information on pending legislation. It showed what could be done with open data as well as why government transparency is so vital."

4. Hackathons

"I feel like 2011 was kind of the year of the hackathon," commented Karen Suhaka, founder of Legination. "Might just be my perception, but the idea seems to be gaining significant steam."

5. iPads in government

"I think the winner should be iPads on the House Floor and in committee hearings," commented Josh Spayher, a Chicago attorney and creator of GovSM.com. "[It] totally transforms the way members of Congress can access information when they need it."

6. Social media in emergencies, National Archives and Records Administration (NARA), and open government in the European Union

"I think there was significant progress in the use of social media for emergency alerts/warnings and disaster response this year," commented Mollie Walker, editor of FierceGovernmentIT.  "It also shows agencies are letting this evolve beyond a broadcast medium and seeing the value of a feedback loop for mission-critical action. Although it hasn't really come to fruition yet (it's technically in the "operational" phase, though development and migration appear to still be in progress), I think the NARA's electronic record archive has some positive implications for open government going forward. It's something to watch for in 2012, but the fact that NARA tied up a lot of loose ends in 2011 was a big win. The open government efforts in the E.U. are also worth noting. While there have been isolated initiatives in the U.S. and U.K., seeing a governing body such as the E.U. set new standards for openness could have a broader impact on how the rest of the world manages and shares public information."

If you think there's another story that deserves to be listed, please let us know in the comments.

The year ahead

What should we expect in the year ahead? Some predictions are easier than others. The Pew Internet & American Life Project found that more than 50% of U.S. adults used the Internet for political purposes during the 2010 midterm elections. Pew's research also showed that a majority of U.S. citizens now turn to the web for news and information about politics. Expect that to grow in 2012.

This year, there was evidence of the maker movement's potential for education, jobs and innovation. That same DIY spirit will matter even more in the year ahead. We also saw the impact of apps that matter, like a mobile geolocation app that connected first responders to heart attack victims. If developers want to make an impact, we need more applications that help us help one another.

In 2011, there were more ways for citizens to provide feedback to their governments than perhaps ever before. In 2012, the open question will be whether "We the People" will use these new participatory platforms to help government work better.

The evolution of these kinds of platforms is neither U.S.-centric nor limited to tech-savvy college students. Citizen engagement matters more now in every sense: crowdfunding, crowdsourcing, crowdmapping, collective intelligence, group translation, and human sensor networks. There's a growth in "do it ourselves (DIO) government," or as the folks at techPresident like to say, "We government." As institutions shift from eGov to WeGov, leaders will be looking more to all of us to help them in the transition.


December 09 2011

Top Stories: December 5-9, 2011

Here's a look at the top stories published across O'Reilly sites this week.

The end of social
Mike Loukides: "If you want to tell me what you listen to, I care. But if sharing is nothing more than a social application feed that's constantly updated without your volition, then it's just another form of spam."

Why cloud services are a tempting target for attackers
Jeffrey Carr says before organizations embrace the efficiencies and cost savings of cloud services, they should also closely consider the security repercussions and liabilities attached to the cloud.


White House to open source Data.gov as open government data platform
The new "Data.gov in a box" could empower countries to build their own platforms. With this step forward, the prospects are brighter for stimulating economic activity, civic utility and accountability under a global open-government partnership.

Stickers as sensors
Put a GreenGoose sticker on an object, and just like that, you'll have an Internet-connected sensor. In this interview, GreenGoose founder Brian Krejcarek discusses stickers as sensors and the data that can be gathered from everyday activities.

What publishers can learn from Netflix's problems
Wired.com writer Tim Carmody examines the recent missteps of Netflix and takes a broad look at how technology shapes the reading experience.


Tools of Change for Publishing, being held February 13-15 in New York, is where the publishing and tech industries converge. Register to attend TOC 2012.

December 05 2011

White House to open source Data.gov as open government data platform

As 2011 comes to an end, there are 28 international open data platforms in the open government community. By the end of 2012, code from the new "Data.gov-in-a-box" may help many more countries stand up their own platforms. A partnership between the United States and India on open government has borne fruit: progress on making the open data platform Data.gov open source.

In a post this morning at the WhiteHouse.gov blog, federal CIO Steven VanRoekel (@StevenVDC) and federal CTO Aneesh Chopra (@AneeshChopra) explained more about how Data.gov is going global:

As part of a joint effort by the United States and India to build an open government platform, the U.S. team has deposited open source code — an important benchmark in developing the Open Government Platform that will enable governments around the world to stand up their own open government data sites.

The development is evidence that the U.S. and India are indeed still collaborating on open government, despite India's withdrawal from the historic Open Government Partnership (OGP) that launched in September. Chopra and VanRoekel explicitly connected the move to open source Data.gov to the U.S. involvement in the Open Government Partnership today. While we'll need to see more code and adoption to draw substantive conclusions on the outcomes of this part of the plan, this is clearly progress.

The U.S. National Action Plan on Open Government, which represents the U.S. commitment to the OGP, included some details about this initiative two months ago, building upon a State Department fact sheet that was released in July. Back in August, representatives from India's National Informatics Center visited the United States for a week-long session of knowledge sharing with the U.S. Data.gov team, which is housed within the General Services Administration.

"The secretary of state and president have both spent time in India over the past 18 months," said VanRoekel in an interview today. "There was a lot of dialogue about the power of open data to shine light upon what's happening in the world."

The project, which was described then as "Data.gov-in-a-box," will include components of the Data.gov open data platform and the India.gov.in document portal. Now, the product is being called the "Open Government Platform" — not exactly creative, but quite descriptive and evocative of the open government platforms that have been launched to date. The first collection of open source code, which implements a dataset management system, is now up on GitHub.

During the August meetings, "we agreed upon a set of things we would do around creating excellence around an open data platform," said VanRoekel. "We owned the first deliverable: a dataset management tool. That's the foundation of an open source data platform. It handles workflow, security and the check-in of data -- all of the work that goes into getting the data into the state it needs to be in before it goes online. India owns the next phase: the presentation layer."

If the initiative bears fruit in 2012, as planned, the international open government data movement will have a new tool to apply toward open data platforms. That could be particularly relevant to countries in the developing world, given the limited resources available to many governments.

What's next for open government data in the United States has yet to be written. "The evolution of data.gov should be one that does things to connect to web services or an API key manager," said VanRoekel. "We need to track usage. We're going to double down on the things that are proving useful."

Drupal as an open government platform?

This Open Government Data platform looks set to be built upon Drupal 6, a choice that would further solidify the inroads that the open source content management system has made into government IT. As always, code and architecture choices will have consequences down the road.

"While I'm not sure Drupal is a good choice anymore for building data sites, it is key that open source is being used to disseminate open data," said Eric Gunderson, the founder of open source software firm Development Seed. "Using open source means we can all take ownership of the code and tune it to meet our exact needs. Even bad releases give us code to learn from."

Jeff Miccolis, a senior developer at Development Seed, was more guarded about how open the collaboration around the Data.gov code has been, or will be going forward. "Releasing an application like this as open source on an open collaboration platform like GitHub is a great step," he said. "It still remains to be seen what the ongoing commitment to the project will be, and how collaboration will work. There is no history in the git repository they have on GitHub, no issues in the issue tracker, nor even an explicit license in the repository. These factors don't communicate anything about their future commitment to maintaining this newly minted open source project."

The White House is hoping to hear from more developers like Miccolis. "We're looking forward to getting feedback and improvements from the open source community," said VanRoekel. "How do we evolve the U.S. data.gov as it sits today?"

Strata 2012 — The 2012 Strata Conference, being held Feb. 28-March 1 in Santa Clara, Calif., will offer three full days of hands-on data training and information-rich sessions. Strata brings together the people, tools, and technologies you need to make data work.

Save 20% on registration with the code RADAR20

Open data impact

From where VanRoekel sits, investing in open source, open government and open data remain important to the administration. He said to me that the fact that he was hired was a "clear indication of the importance" of these issues in the White House. "It wasn't a coincidence that the launch of the Open Government Partnership coincided with my arrival," he said. "There's a lot of effort to meet the challenge of open government," according to VanRoekel. "The president has me and other people involved meeting every week, reporting on progress."

The open questions now, so to speak, are: Will other countries use it? And to what effect? Here in the U.S., there's already code sharing between cities. OpenChattanooga, an open data catalog in Tennessee, is using source code from OpenDataPhilly, an open government data platform built in Philadelphia by GIS software company Azavea. By the time "Data.gov in a box" is ready to be deployed, some cities, states and countries might already have decided to use that code.

There's good reason to be careful about celebrating the progress here. Open government analysts like Nathaniel Heller have raised concerns about the role of open data in the Open Government Partnership, specifically that:

... open data provides an easy way out for some governments to avoid the much harder, and likely more transformative, open government reforms that should probably be higher up on their lists. Instead of fetishizing open data portals for the sake of having open data portals, I'd rather see governments incorporating open data as a way to address more fundamental structural challenges around extractives (through maps and budget data), the political process (through real-time disclosure of campaign contributions), or budget priorities (through online publication of budget line-items).

Similarly, Greg Michener has made a case for getting the legal and regulatory "plumbing" for open government right in Brazil, not "boutique Gov 2.0" projects that graft technology onto flawed governance systems. Michener warned that emulating the government 2.0 initiatives of advanced countries, including open data initiatives:

... may be a premature strategy for emerging democracies. While advanced democracies are mostly tweaking and improving upon value-systems and infrastructure already in place, most countries within the OGP have only begun the adoption process.

Michener and Heller both raise bedrock issues for open government in Brazil and beyond that no technology solution in and of itself will address. They're both right: Simply opening up data is not a replacement for a Constitution that enforces a rule of law, free and fair elections, an effective judiciary, decent schools, basic regulatory bodies or civil society, particularly if the data does not relate to meaningful aspects of society.

"Right now, the problem we are seeing is not so much the technology around how to open data but more around the culture internally of why people are opening data," agreed Gunderson. "We are just seeing a lot of bad data in-house and thus people wanting to stay closed. At some point a lot of organizations and government agencies need to come clean and say 'we have not been managing our decisions with good data for a long time'. We need more real  projects to help make the OGP more concrete."

Heller and Michener speak for an important part of the open government community and surely articulate concerns that exist for many people, particularly for a "good government" constituency whose long-term, quiet work on government transparency and accountability may not be receiving the same attention as some shinier technology initiatives. The White House consultation on open government that I attended included considerable recognition of the complexities here.

It's worth noting that Heller called the products of open data initiatives "websites," including Kenya's new open government platform. He's not alone in doing so. To rehash an old but important principle, Gov 2.0 is not about "websites" or "portals" — it's about web services and the emerging global ecosystem of big data. In this context, Gov 2.0 isn't simply about setting up social media accounts, moving to grid computing or adopting open standards: it's about systems thinking, where open data is used by, for and with the people. If you look at what the Department of Health and Human Services is trying to do to revolutionize healthcare with open government data in the United States, that approach may become a bit clearer. For that to happen, countries, states and cities have to stand up open government data platforms.

The examples of open government data being put to use that excite VanRoekel are, perhaps unsurprisingly, on the healthcare front. If you look at the healthcare community pages on Data.gov, "you see great examples of companies and providers meeting," he said, referencing two startups from a healthcare challenge that were acquired by larger providers as a result of their involvement in the open data event.

I'm cautiously optimistic about what this news means for the world, particularly for the further validation of open source in open government. With this step forward, the prospects for stimulating more economic activity, civic utility and accountability under a global open government partnership are now brighter.


December 01 2011

Gov 2.0 enters the mainstream on NPR and the AP

Regular Radar readers know that "Gov 2.0 has gone local," as local governments look for innovative ways to use technology cooperatively with citizens to deliver smarter government. This week, NPR listeners learned more about the open-government movement around the country when the Kojo Nnamdi Show hosted an hour-long discussion on local Gov 2.0 on WAMU in Washington, D.C.

You can listen to the audio archive of the program and read the transcript at TheKojoNnamdiShow.org.

I was happy to join Bryan Sivak, chief innovation officer of the state of Maryland; Tom Lee, director of Sunlight Labs; and Abhi Nemani, director of strategy and communications at Code For America, as a guest on the show.

Heather Mizeur, a delegate in the Maryland General Assembly, called in to the show to share what her state has been working on with respect to open government, including streaming video, budget transparency and online access. Mizeur had the one-liner of the day: Commenting on the need to improve Maryland.gov, she observed that "our state website is an eight-track tape player in an iPhone universe."

An open government linkology

During the program, the @KojoShow producer shared links to sites and services that were mentioned by the guests. These included:

  • Maryland's solicitation for feedback on helpful or hurtful business regulations.
  • An NPR News feature on the American Legislative Exchange Council and channels of influence in state legislatures.
  • Churnalism, an app to discover PR masquerading as original journalism. Could a churnalism model be used to detect similar subtle influences in state legislatures? Sunlight Labs has an ongoing project at OpenStates. Stay tuned.
  • Civic engagement platform Change By Us launched in Philadelphia and was open sourced into the Civic Commons.
  • Sivak cited Arkansas.gov as a model for well designed government websites. The key is that it's adapted for mobile visitors.
  • Nemani cited Open Data Philly as a local open government platform that uses open standards. It's open sourced, so that other cities, like Chattanooga, Tenn., can use it to stand up their own open-data efforts.
  • The Sportaneous location-aware mobile app uses open-government data to help people find pick-up sports games.
  • The StreetBump app uses a smartphone's accelerometer to automatically report potholes in Boston.
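The idea behind an app like StreetBump can be sketched in a few lines: watch the phone's vertical acceleration, and when it spikes well beyond the steady pull of gravity, record the GPS fix as a candidate pothole. The sketch below is a deliberately simplified illustration of that threshold approach, not StreetBump's actual algorithm; the sample data structure and the threshold value are assumptions.

```python
from dataclasses import dataclass
from typing import Iterable, List, Tuple

GRAVITY = 9.81  # m/s^2

@dataclass
class Sample:
    timestamp: float       # seconds
    vertical_accel: float  # m/s^2, including gravity
    lat: float
    lon: float

def detect_bumps(samples: Iterable[Sample],
                 threshold: float = 4.0) -> List[Tuple[float, float, float]]:
    """Return (timestamp, lat, lon) for samples whose vertical acceleration
    deviates from gravity by more than `threshold` m/s^2.

    A real detector would also smooth the signal, debounce nearby hits and
    filter out speed bumps; this is only the core idea.
    """
    bumps = []
    for s in samples:
        if abs(s.vertical_accel - GRAVITY) > threshold:
            bumps.append((s.timestamp, s.lat, s.lon))
    return bumps

if __name__ == "__main__":
    ride = [
        Sample(0.0, 9.8, 42.3601, -71.0589),
        Sample(0.1, 15.2, 42.3602, -71.0590),  # a jolt: candidate pothole
        Sample(0.2, 9.7, 42.3603, -71.0591),
    ]
    print(detect_bumps(ride))
```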
Moving to Big Data: Free Strata Online Conference — In this free online event, being held Dec. 7, 2011, at 9AM Pacific, we'll look at how big data stacks and analytical approaches are gradually finding their way into organizations as well as the roadblocks that can thwart efforts to become more data driven. (This Strata Online Conference is sponsored by Microsoft.)

Register to attend this free Strata Online Conference

Civic applications enter the mainstream

Recently, civic applications and open data pushed further into the national consciousness with a widely syndicated Associated Press story by Marcus Wohlsen. Here's how Wohlsen described what's happening:

Across the country, geeks are using mountains of data that city officials are dumping on the web to create everything from smartphone tree identifiers and street sweeper alarms to neighborhood crime notifiers and apps that sound the alarm when customers enter a restaurant that got low marks on a recent inspection. The emergence of city apps comes as a result of the rise of the open-data movement in U.S. cities, or what advocates like to call "Government 2.0."

The AP covered Gov 2.0 and the open-government data movement in February, when it looked at how cities were crowdsourcing ideas from citizens, or "citizensourcing."

It's great to see what's happening around the country receive more mainstream attention. Over on Google+, Tim O'Reilly commented on the AP story:

Of all the things that made up the "gov 2.0" meme, open data may be one of the most important. It's a key part of government thinking like a platform player rather than an application provider. At Code for America, the work ended up being about liberating data as much as about writing apps. We're just at the beginning of a really interesting new approach to government services.

Wohlsen captured the paradigm behind Gov 2.0 at the end of his article:

New York, San Francisco and other cities are now working together to develop data standards that will make it possible for apps to interact with data from any city. The idea, advocates of open data say, is to transform government from a centralized provider of services into a platform on which citizens can build their own tools to make government work better.

Open311 is a data standard of this sort. So is GTFS. "So much can flow from so little," noted O'Reilly. "Consider how Google Transit began with outreach from the city of Portland to create GTFS, a standard format for transit data, which was subsequently adopted by other cities. Now, you can get transit arrival times from Google as well as from hundreds of smartphone apps, none of which needed to be written by city government."
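
To make that concrete, here is a minimal sketch of what consuming a GTFS feed looks like in practice. It assumes a standard GTFS bundle has been downloaded and unzipped into a local gtfs/ directory; the stop ID is a placeholder rather than a real transit agency identifier, while the file and column names (stop_times.txt, stop_id, departure_time) come from the GTFS specification itself.

    # Minimal sketch: list scheduled departures at one stop from a GTFS feed.
    # Assumes the feed has been unzipped into ./gtfs/; the stop ID below is a
    # placeholder, not a real identifier.
    import csv

    STOP_ID = "8334"  # hypothetical stop identifier from stops.txt

    with open("gtfs/stop_times.txt", newline="", encoding="utf-8") as f:
        departures = sorted(
            row["departure_time"]
            for row in csv.DictReader(f)
            if row["stop_id"] == STOP_ID
        )

    # print the next ten scheduled departures
    for t in departures[:10]:
        print(t)

Because the format is the same in every city that publishes it, the same few lines work against Portland's feed or New York's, which is exactly the leverage O'Reilly describes.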

What lies ahead for Gov 2.0 in 2012 has the potential to improve civic life in any number of interesting ways. If the Gov 2.0 movement is to have a lasting, transformative effect, however, what's described above needs to be the beginning of the story, not the end. That arc will include the results of HHS CTO Todd Park's efforts to revolutionize the healthcare industry and the work of the Alfred brothers at BrightScope to bring more transparency to financial advisors.

Making Gov 2.0 matter will also mean applying different ways of thinking and new technology to other areas, as FutureGov founder Dominic Campbell commented on Google+:

There aren't enough of us working to transform, challenge and change the inside of government. Not enough taking on the really sticky issues beyond relatively quick and easy wins, such as transit data or street-scene related apps. This needs to change before anything can be said to have gone mainstream. Disclaimer: this is exactly what we're looking to do with apps like PatchWorkHQ and CasseroleHQ, starting to hone in on priority, challenging, socially important and costly areas of government, such as child protection and supporting older people to live better independent lives. The journey is far longer and harder, but (we're hoping) even more rewarding.

More awareness of what's possible and available will lead to more use of civic applications, and thereby more interest in and demand for open-government data. For instance, on the AP's Twitter feed, an editor asked more than 634,000 followers this question: "Hundreds of new apps use public data from cities to improve services. Have you tried any?" I'll ask the same of Radar readers: have you used a civic app? If so, what and where? Did it work? Did you keep it? Please let us know in the comments.

October 15 2011

International Open Government Data Camp looks to build community

There's a growing movement afoot worldwide to open up government data and make something useful with it. Civic apps based upon open data are emerging that genuinely serve citizens in beneficial ways, delivering services that officials may not have been able to provide, particularly without significant time or increased expense.

For every civic app, however, there's a backstory that often involves a broad number of stakeholders. Governments have to commit to opening themselves up, but they will in many cases need external expertise or even funding to do so. Citizens, industry and developers have to use the data, demonstrating that there's not only demand but skill outside of government to put open data to work in the service of accountability, citizen utility and economic opportunity. Galvanizing the co-creation of civic services, policies or apps isn't easy, but the potential of the civic surplus has attracted the attention of governments around the world.

The approach will not be a silver bullet for all of society's ills, given high unemployment, economic uncertainty and rising healthcare and energy costs -- but an increasing number of governments are standing up platforms and stimulating app economies. Given the promise of leaner, smarter government that focuses upon providing open data to fuel economic activity, tough, results-oriented mayors like Rahm Emanuel and Mike Bloomberg are committing to open government data in Chicago and New York City.

A key ingredient in successful open government data initiatives is community. It's not enough to simply release data and hope that venture capitalists and developers magically become aware of the opportunity to put it to work. Marketing open government data is what has brought federal CTO Aneesh Chopra and HHS CTO Todd Park repeatedly out to Silicon Valley, New York City and other business and tech hubs. The civic developer and startup community is participating in creating a new distributed ecosystem, from BuzzData to Socrata to new efforts like Max Ogden's DataCouch.

As with other open source movements, people interested in open data are self-organizing and, in many cases, are using the unconference model to do so. Over the past decade, camps have sprung up all around the U.S. and, increasingly, internationally, from Asia and India to Europe, Africa and South America. Whether they're called techcamps, barcamps, citycamps or govcamps, these forums give advocates, activists, civic media, citizens and public officials a place to meet and exchange ideas, code and expertise.

Next week, the second International Open Government Data Camp will pull together all of those constituencies in Warsaw, Poland to talk about the future of open data. Attendees will be able to learn from plenary keynotes from open data leaders and tracks full of sessions with advocates, activists and technologists. Satellite events around OGD Camp will also offer unstructured time for people to meet, mix, connect and create. You can watch a short film about open government data from the Open Knowledge Foundation below:

To learn more about what attendees should expect, I conducted an email interview with Jonathan Gray, the community coordinator for the Open Knowledge Foundation. For more on specific details about the camp, consult the FAQ at OGDCamp.org. Gray offered more context on open government data at the Guardian this past week:

It's been over five years since the Guardian launched its influential Free Our Data campaign. Nearly four years ago Rufus Pollock coined the phrase "Raw Data Now" which web inventor Sir Tim Berners-Lee later transformed into the slogan for a global movement. And that same year a group of 30 open government advocates met in Sebastopol, California and drafted a succinct text on open government data which has subsequently been echoed and encoded in official policy and legislative documents around the world.

In under half a decade, open data has found its way into digital policy packages and transparency initiatives all over the place - from city administrations in Berlin, Paris and New York, to the corridors of supranational institutions like the European Commission or the World Bank. In the past few years we've seen a veritable abundance of portals and principles, handbooks and hackdays, promises and prizes.

But despite this enthusiastic and energetic reception, open data has not been without its setbacks and there are still huge challenges ahead. Earlier this year there were reports that Data.gov will have its funding slashed. In the UK there are concerns that the ominously titled "Public Data Corporation" may mean that an increasing amount of data is locked down and sold to those who can afford to pay for it. And in most countries around the world most documents and datasets are still published with ambiguous or restrictive legal conditions, which inhibit reuse. Public sector spending cuts and austerity measures in many countries will make it harder for open data to rise up priority lists.

Participants at this year's camp will swap notes on how to overcome some of these obstacles, as well as learning about how to set up and run an open data initiative (from the people behind data.gov and other national catalogues), how to get the legal and technical details right, how to engage with data users, how to run events, hackdays, competitions, and lots more.

What will this camp change?

We want to build a stronger international community of people interested in open data - so people can swap expertise, anecdotes and bits of code. In particular we want to get public servants talking to each other about how to set up an open data initiative, and to make sure that developers, journalists, NGOs and others are included in the process.

What did the last camp change?

Many of the participants from the 2010 camp came away enthused with ideas, contacts and energy that has catalysed and informed the development of open data around the world. For example, groups of citizens booted up grassroots open data meetups in several places, public servants set up official initiatives on the back of advice and discussions from the camp, developers started local versions of projects they liked, and so on.

Why does this matter to the tech community?

Public data is a fertile soil out of which the next generation of digital services and applications will grow. It may take a while for technologies and processes to get there, but eventually we hope open data will be ubiquitous and routine.

Why does it matter to the art, design, music, business or nonprofit community?

Journalists need to be able to navigate public information sources, from official documents and transcripts to information on the environment or the economy. Rather than relying on press releases and policy reports, they should be able to have some grasp of the raw information sources upon which these things depend - so they can make up their own mind, and do their own analysis and evaluation. There's a dedicated satellite event on data journalism at the camp, focusing on looking at where EU spending goes.

Similarly, NGOs, think tanks, and community groups should be able to utilise public data to improve their research, advocacy or outreach. Being more literate about data sources, and knowing how to use them in combination with existing free tools and services, can be a very powerful way to put arguments into context, or to communicate issues they care about more effectively. This will be a big theme in this year's camp.

Why does it matter to people who have never heard of open data?

Our lives are increasingly governed by data. Having basic literacy about how to use the information around us is important for all sorts of things, from dealing with major global problems to making everyday decisions. In response to things like climate change, the financial crisis, or disease outbreaks, governments must share information with each other and with the public, to respond effectively and to keep citizens informed. We depend on having up-to-date information to plan our journeys, locate public facilities close by, and see how our taxes are spent.

What are the outcomes that matter from such an event?

We are hoping to build consensus around a set of legal principles for open data so key stakeholders around the world come to a more explicit and formal agreement about the terms under which open data should be published (as liberal as possible!). And we'll be working on datacatalogs.org, which aims to be a comprehensive directory of open data catalogues from around the world, curated for and by the open data community.

We also hope that some key open data projects will be ported and transplanted to different countries. Perhaps most importantly, we hope that (like last year) the discussions and workshops that take place will give a big boost to open data around the world, and people will continue to collaborate online after the camp.

How is OGD Camp going to be different from other events?

It looks like it will be the biggest open data event to date. We have representation from dozens and dozens of countries around the world. There will be a strong focus on getting things done. We're really excited!

October 07 2011

How data and open government are transforming NYC

"In God We Trust," tweeted New York City Mayor Mike Bloomberg this month. "Everyone else, bring data."

Bloomberg, the billionaire founder of Bloomberg L.P., is now in his third term as mayor of the Big Apple. During his tenure, New York City has embraced a more data-driven approach to governing, even when the results of that data-driven transparency show a slump in city services.

This should be no surprise to anyone familiar with the mission statement of his financial data company:

Bloomberg started out with one core belief: that bringing transparency to capital markets through access to information could increase capital flows, produce economic growth and jobs, and significantly reduce the cost of doing business.

To reshape that mission statement for New York City, one might reasonably suggest that Bloomberg's data-driven approach to government is founded upon the belief that bringing transparency to government through access to information could increase capital flows, produce economic growth and jobs, and significantly reduce the cost of the business of government.

As Gov 2.0 goes local, New York City has become the epicenter for many experiments in governance, from citizensourcing smarter government to participatory budgeting to embracing a broader future as a data platform.

One of the most prominent New Yorkers advocating for architecting the city as a platform is the city's first chief digital officer, Rachel Sterne.

Sterne gave a keynote speech at this year's Strata NY conference that explained how data-driven innovation informs New York's aim to be the nation's premier digital city.

"I'm especially excited to be speaking with you because as a city, we need your help," said Sterne to the assembled Strata attendees. "As the data practitioners and data scientists who are at the forefront of this revolution, all of our efforts are for naught if you are not part of them and not helping us to expand them and helping to really take advantage of all of the resources that the city of New York is trying put at your disposal."

Video of Sterne's talk is embedded below.

New York City's digital strategy is focused on access to technology, open government, engagement and industry. "Industry is important because we need to make sure the private sector has all the supports it needs to grow and thrive and help to create these solutions that will help the government to ultimately better serve the public," said Sterne. "Open government is important because if our data and our internal structure and priorities aren't completely open, we're not going to be able to enable increased [open] services, that kind of [open] exchange of information, etc. Engagement is crucial because we need to be constantly gathering feedback from the public, informing and serving. And access is the foundation because everyone needs access to these technologies."

Big data in the Big Apple

What does data-driven innovation look like in New York City? Sterne focused on how data "evolves government," asserting that it leads to a more efficient allocation of resources, a more effective execution, and a better response to the real-time needs of citizens. Although she allowed that, "as everyone knows, data can be manipulated."

Sterne highlighted several data-driven initiatives across the city, including the Metropolitan Transportation Authority's Bus Time initiative. "Initially, it was scoped out to hundreds of millions of dollars. The MTA ended up working with a local open-source development shop, [which] did it for a fraction of that, below a million dollars, and now you can get real-time updates on your phone based on where the buses are located using very low-cost technologies."

New York City is also using data internally, explained Sterne — like applying predictive analytics to building code violations and housing data to try to understand where potential fire risks might exist. If that sounds familiar to Radar readers, it should: Chicago is also looking to use data, developers and citizens to become a smarter city. "This is as much about citizens talking to the infrastructure of the city as infrastructure talking to itself," said Chicago CTO John Tolva in an interview last March. "It's where urban informatics and smarter cities cross over to Gov 2.0."
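
Neither city has published the details of these models in the material above, so the following is only an illustrative sketch of the general approach: score buildings by combining a few hypothetical risk features (open code violations, building age, prior complaints) against past incident records, then send inspectors to the highest-scoring buildings first.

    # Illustrative sketch only, not the city's actual model. The features and
    # toy numbers below are hypothetical stand-ins for real inspection data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # columns: open violations, building age in years, prior complaints
    X = np.array([[0, 12, 0], [5, 60, 3], [1, 35, 0],
                  [8, 80, 6], [2, 20, 1], [7, 95, 4]])
    y = np.array([0, 1, 0, 1, 0, 1])  # 1 = fire incident on record

    model = LogisticRegression().fit(X, y)

    # score an unseen building; a higher probability means inspect it sooner
    candidate = np.array([[6, 70, 2]])
    print("estimated risk:", model.predict_proba(candidate)[0, 1])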

Web 2.0 Summit, being held October 17-19 in San Francisco, will examine "The Data Frame" — focusing on the impact of data in today's networked economy.

Save $300 on registration with the code RADAR

New York City, however, has a vastly greater "digital reach" than Chicago. It's bigger than many corporations and states, in fact, connecting to more than four million people through NYC.gov and social media channels that have expanded to include Twitter, Facebook, Tumblr, YouTube and Foursquare. Sterne envisions the city's 200-plus social media platforms as a kind of "digital switchboard," where citizens ask questions and government workers direct them to the appropriate resources, much in the same way that California connects citizens to e-services with social media.

The web as the 21st century public square

"What we're really seeing that's interesting about all these things is that they're happening in public, so people are informing one another," said Sterne. "They're engaging one another, and it's not so much the city telling you what to do but creating a forum for that conversation to take place." If you visit NYC's custom bitly URL shortener, on.nyc.gov, you can see what content is popular within that community.

Back in May, when NYC's digital roadmap was released, Anil Dash highlighted something important: the roadmap captured New York City government thinking about the web as a public space. This has profound implications for how it should be regulated, treated or described. "The single biggest lesson I got from the 65-page, 11.8mb PDF is a simple one," Dash, a native New Yorker, blogger and entrepreneur, wrote. "The greatest city in the world can take shared public spaces online as seriously as it takes its public spaces in the physical world."

City as a platform

Sterne's description of a "city as a platform" is one of the purest articulations of Tim O'Reilly's "government as a platform" vision that I've heard any public servant articulate this year.

"The thing that's really exciting to me, better than internal data, of course, is open data," Sterne said during her Strata Conference talk. "This, I think, is where we really start to reach the potential of New York City becoming a platform like some of the bigger commercial platforms and open data platforms. How can New York City, with the enormous amount of data and resources we have, think of itself the same way Facebook has an API ecosystem or Twitter does? This can enable us to produce a more user-centric experience of government. It democratizes the exchange of information and services. If someone wants to do a better job than we are in communicating something, it's all out there. It empowers citizens to collaboratively create solutions. It's not just the consumption but the co-production of government services and democracy."

Sterne highlighted the most important open data initiative that the city has pursued to date, the NYC DataMine. Soon, she said, they will be introducing "NYC Platform," which she described as "the city's API." All of their work opening the data, however, "doesn't matter if we're not evangelizing it and making sure people are using it."

NYC has used an app competition to draw more attention to its open data. As I've written elsewhere, by tying specific citizen needs to development, NYC BigApps 3.0 is part of the next generation of government apps competitions that incorporate sustainability, community, and civic value.

"We've had about 150 apps developed," said Sterne. "There are apps that would be a significant cost to the city. Instead, they're at basically no cost because the prize money is all donated. We provide 350 datasets. Until now, they were not API-enabled. They were not dynamic, but we're going to be doing that because that's the overwhelming response that we're receiving from everyone."

That feedback is widespread in the open government data community, where studies show that developers prefer to explore and interact with data online, as opposed to downloading datasets. When it comes to developers working with public data, dynamic access can open up entire new horizons for potential applications, as the release of real-time transit data has demonstrated.
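
One common pattern for this kind of dynamic access is querying a hosted JSON endpoint with filters, in the style of Socrata's SODA API, instead of pulling down an entire file. A rough sketch follows; the host, dataset ID and filter column are placeholders, not a real NYC resource.

    # Sketch of dynamic access to a hosted dataset rather than a bulk download.
    # The endpoint, dataset ID ("abcd-1234") and the column used in the filter
    # are placeholders, not a real published dataset.
    import requests

    url = "https://data.example.gov/resource/abcd-1234.json"
    params = {"$limit": 25, "$where": "borough = 'Brooklyn'"}

    rows = requests.get(url, params=params, timeout=30).json()
    for row in rows:
        print(row)

Because the filtering happens on the server, an app only transfers the handful of rows it needs, which is what makes real-time uses like transit arrival boards practical.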

Sterne shared some useful examples of apps that have been created using NYC open government data, including Roadify, which allows you to find parking spots or transit information, and Don't Eat At, a Foursquare app that sends users a text message when they check into a NYC restaurant that is at risk of being closed for health code violations.

Sterne's message to data scientists was generally quite well received at Strata. "Pleased to see @RachelSterne's keynote today," tweeted Alistair Coote, a NYC Web developer at RecordSetter. "If done right, open govt will be far more important than anything announced at #f8 today," he observed, referring to Facebook's new look.

Why open government data matters to New Yorkers

The experience in NYC during Hurricane Irene "once again proved the utility and importance of open data and the NYC DataMine, as several organizations used OEM's Hurricane Evacuation Zone geographic data to build maps that served and informed the public," Sterne told me via email. "This data has been public for over a year. Parties developing tools built on city platforms included WNYC, NYTimes, Google, Mobile Commons and Crisis Commons. NYC Digital was also in regular contact with these parties to alert them of information changes."

The key insight coming out of that August weekend, with respect to the city acting as a platform during unprecedented demands for information, was that the open data that NYC provided on evacuation zones was used by other organizations to build maps. When NYC.gov buckled under heavy traffic, the city government turned to the Internet to share important resources. "As long as the right information was getting to citizens, that's all that matters," said Sterne at Strata. "It's OK if it's decentralized, as long as the reach is being expanded."

As I reported here on Radar, the growth of an Internet of things is an important evolution. What we saw during Hurricane Irene is the increasing importance of an Internet of people, where citizens act as sensors during an emergency.

"Social media played a critical role in informing New Yorkers," wrote Sterne. "Prior to that weekend, we established clear guidelines and a streamlined approvals process for social media content, which were disseminated to all social media managers. This ensured that even as we communicated in real time, we had accuracy and consistency in city messaging. @NYCMayorsOffice and youtube.com/mayorbloomberg were both major communication channels. @NYCMayorsOffice doubled its followers, increasing by nearly 30,000 during the weekend, and was cited by the mayor in press conferences as a resource. The YouTube channel was updated shortly after each press conference and saw nearly 60,000 views over the weekend. Over 32,000 tweets were published (not counting retweets) containing the text 'nycmayorsoffice' from August 25-29. Response was overwhelmingly positive."

Data pitfalls and potential

Legitimate questions have been raised about New York's data-driven policy, where journalists have questioned crime data behind the city's CompStat program. The city has also faced challenges, including nearly $300 million in added costs for its personnel data computer system, a sobering reminder of how difficult it is even for immensely successful private sector leaders to upgrade public sector IT systems. That's a reality that former U.S. CIO Vivek Kundra can certainly speak to at the federal level.

That said, New York City and its mayor clearly deserve credit for opening data, being transparent about the administration's performance, and continuing to work toward the incremental improvements that tend to be the way that government moves ahead.

For more insight into the IT behind New York City government, Radar's managing editor, Mac Slocum, talked with Carole Post, commissioner of NYC's Department of Information Technology and Telecommunications, about what being a data-driven city means and some of the most valuable ways that data is being applied. Video from that interview is below:

The challenges you see in opening up data in New York City are two-fold, and ones you see across government, said Post. "We first and foremost are a steward of the data that we hold, and so the concerns around privacy, confidentiality and public safety are definitely ones that need to be balanced against accessibility to the information," she said.

That challenge is one that every big city CIO will face in the years ahead, as technology affords new potential to open government and new risks of exposing sensitive personal data. "While we are enormous proponents of having open data," said Post, preserving the integrity of the data and the protections around it is important.

Post acknowledged that city government has "typically not been a very open bastion of sharing all of its information," but pointed to a necessary step in open government's evolution: moving to a standard of open by default, where civic data is considered open unless there is a reason for it not to be.


September 20 2011

Historic global Open Government Partnership launches in New York City

Open government is about to assume a higher profile in foreign affairs. On July 12, 2011, the State Department hosted an historic gathering in Washington to announce the Open Government Partnership (OGP) with Brazil and six other nations. Today in New York City, this unprecedented global partnership will launch. Heads of state, representatives of civil society, members of the free press and technologists will convene at the New York offices of Google to hail the "Power of Open" around the world. In the afternoon, President Obama and the leaders of seven other countries will announce their national action plans and commitments to open government. I'll be liveblogging the event here on the Radar Gov 2.0 channel and tweeting out pictures to Tumblr and other social platforms. Virtual participants will be able to watch the launch at Google's YouTube channel at 9 AM EST.

Some 43 countries have now indicated their intent to join this international open government partnership, with the vast majority joining the founding eight members, led by Brazil and the United States. The formation of the OGP built upon the bilateral U.S.-Indian partnership on open government announced during President Obama's trip to India last November, although India subsequently withdrew from the OGP in July.

In her remarks on July 12 at the State Department, Secretary of State Clinton explicitly connected open government to economic activity. "We've also seen the correlation between openness in government and success in the economic sphere," said Clinton. "Countries committed to defending transparency and fighting corruption are often more attractive to entrepreneurs. And if you can create small- and medium-size businesses, you have a broader base for economic activity. At a time when global competition for trade and investment is fierce, openness is not just good for governance, it is also good for a sustainable growth in GDP."

In the week following Clinton's speech, I spoke with Maria Otero, Under Secretary of State for Democracy and Global Affairs at the United States State Department, about the Open Government Partnership and what it will mean. Our interview follows. You can also listen to an audio recording of our discussion, embedded above.

Can you explain how open government and a greater degree of transparency or accountability are related to investment, economic output or activity?

"I think what the secretary said really summarizes well one aspect of what's economic growth and even economic development in a country, which is really how the rest of the world perceive it and how the rest of the world measures risk when you invest in a country," said Otero. "Clearly, if anyone looks at the components of country risk as you invest, issues that have to do with transparency and accountability are present within the factors that comprise that equation.

Otero explored other aspects of open government that arose in discussions at the forums at the State Department in July. "One was clearly that transparency will ensure that resources are used for what they are meant to be used for in their totality, in part because you are seeing the transfer of funds and the use of funds, to make sure that some of that is not being set aside for other things or in some way taken out for corrupt reasons," she said. "That concept of being able to use a country's revenues in order to carry out a government's mandate and plan is again one way in which the economic concept becomes important. Even if you're talking about health, if in fact you're providing improved health services to your population, you are improving their capacity to be productive citizens and to contribute to the economy. I mean, you can just go across the board."

"Another thing that came up that was very interesting, and it was actually brought up on Kenya, was the degree to which they themselves were not asking to collect information completely, but now that they are, how it is that they look at some of the items that they import into the country they they themselves could produced or could have. Just looking more carefully both at their balance of trade issues, recording all the information, giving emphasis to using data to make decisions, led, certainly the Kenyan participants, to give a couple of examples of how their imports had decreased in a couple of areas."

"These are different ways that open government can address directly the question that you are asking. I think we're going to come up with a lot more applications for open government that relate to reducing costs, said Otero. "As countries do this work more and more, we will see, especially when they are looking at the budget and the way the resources are allocated, that this will also, and the Secretary talked about this, conceivably have an impact on the tax revenue base of a country, because there are many citizens, either for excuse or otherwise, say 'well, why am I going to pay taxes if it's going to go into the pockets of some bureaucrat and it's really not going to bring about changes.' The minute you have more transparency and people begin to see how their taxes are being used, you then again increase the tax revenue that the country has available."

I brought up how the new city government in Chicago is thinking about data and the global movement towards open data, which Otero said is part of OGP. For city government under Mayor Emanuel, open data is viewed as a means for government to understand its own operations, become more productive and deploy its resources more efficiently and effectively. The example of Chicago led Otero to highlight an aspect of the Open Government Partnership that she found very interesting. "It is open to developed countries that have cities like Chicago, and developing countries, like a Kenya," said Otero.

"The point is that some of these tools for transparency can be used even by countries that one might think may not have the resources to be able to do that, or even the know how," she said. "In fact, it is available across the board and that is one of the characteristics of the Open Government Partnership, both recognizing that and ensuring that the leadership in this partnership from the outset is comprised of countries from the north or from the south. Again, showing examples of how you can do this in the south that are attainable to the countries that want to do that. It's very interesting that we can talk about Chicago and, say, Kisumu, Kenya in the same breath."

What concrete outcomes for open government around the world should citizens, advocates, entrepreneurs and technologists be looking for from this partnership?

"The partnership is really the first time that there is a multilateral platform to address these issues," said Otero. "The partnership could have focused on countries come in and present best practices and exchange ideas and then just go home," said Otero. "The partnership is really focused on first having countries participate that have already demonstrated interest in this area and have already put in place a number of specific things and the material laid out, if you will, the minimum standards that are being requested. What the partnership really looks for is to provide a mechanism by which the countries can each develop their own national plans on ways to expand what they're doing on transparency, accountability, and civic engagement, or to start new initiatives for them. That is really what is very different and important about this partnership, is that it is very action- and results-oriented."

When countries join the Open Government Partnership, they commit themselves to address one of several "grand challenges." "They can be anything from public services, to addressing public integrity issues, to managing public resources," said Otero. "Using these challenges, they need to be able to create a plan. Now countries can, of course, choose what they will address. The partnership is not saying 'now all of you have to do the same thing.' It's very much based upon the way in which each country is assessing the specific ways it is interested in addressing. The Partnership is challenging countries to identify those areas of most interest to them, and then to be able to develop a plan that will allow them to make changes and have some real results come out of this. The broad vision for this effort is to really mobilize countries to do something very concrete and in the process develop their own capacity for doing it. Of course here, one can note that there will be some resources available to help countries do this work. That's really at the core of the work."

One clear difference that we see today from past decades is the reality of an increasingly wired citizenry. "The role of technology in doing all of this is very apparent to anyone that's been alive in the last decade," she said. "How countries are using technology, everything from using social media to creating their own websites to a variety of different things is really impressive and very innovative. So, of course, the private sector, if they've got any brains in their head, are seeing this as an important business opportunity."

"Whether you're creating new apps or working with directly with different governments, keeping your eyes open in this space, you also create different mechanisms, different technologies that can be of use to government. The bottom line is that the real effort here and the real outcome that would make the Open Government Partnership successful is signing up a significant number of countries that participate, and having those countries launch their own national plans and carry them out."

What were some of the platforms and technologies that have inspired you?

"In Estonia they talk about creating a 'paperless government,'" Otero observed. "They really are creating 'e-governance,' as they call it, throughout, which is really quite amazing," she said. "In Iceland, it's very interesting that they're using social media to be able to have citizens participate in the redrafting of their constitution. They're using Facebook, and Twitter, and other things to just be able to communicate with the population."

Otero also pointed to the dynamic technology sector in Kenya, which launched an open government data platform this summer. Kenyans have advanced in technology more than any other country in Africa, said Otero, citing the M-PESA system and the way that Kenyans can access information and record data using mobile phones. "I think the Kenyans understand the importance of being able to use this data, and some of the ideas that they put forth were more related to this area of saving resources and making some of the money available for other work." Otero also referenced open government work in Mexico, England, Honduras, Tanzania and Uganda.

India withdrew from the partnership, reportedly over concerns about a third party "audit" of its progress. Can you offer any more detail?

"It makes all the sense in the world to have independent experts who don't do an audit, which is a word that you used, but really assess, and look, and monitor the progress that's being made," said Otero. "They do this in a way to maintain that accountability, but also to make sure that you're not rating these countries or grading them or putting them in a category from 1 to 100 or whatever. That process is in place that was decided upon and all the countries believe that it adds vigor and rigor to this effort. I think, as you said, India has provided great value in this area of open government, of transparency, of accountability. They have done very important work, and they are strongly committed to the principles that are espoused by the Open Government Partnership. In fact, in the time that they worked directly, they really contributed a great deal. I think, right now, the government has indicated that they can't participate, and I think that the reason is precisely the one that you've laid out."

"I think that they will continue to follow the progress of the partnership. Many countries have bilateral relationships with India and continue to address these kinds of issue in a more bilateral way, because they have a great deal to contribute, both to this initiative and the overall work in transparency. I think, certainly, we completely respect their decision right now to watch this closely but not be part of it right now, and to continue doing their work internally. That's really the way that I understand their position."

Progress and setbacks toward open government

Over the summer and fall, analysis and information have steadily emerged about what this open government partnership will mean to open government in the United States and around the world. David Sasaki wondered if the OGP was "democracy building 2.0." Greg Michener echoed his analysis, wondering if Brazil was fit to lead the OGP. Global Integrity explained its role in the OGP. Emma Smith questioned whether the Philippines is serious about open government.

In the U.S., OMB Watch posited that the OGP could drive U.S. commitments, particularly if, as John Wonderlich at the Sunlight Foundation suggested, a U.S. national plan for open government was matched by subsequent follow through. The White House open government "status update" capped a historic week for open government in Washington, as the administration prepares to launch e-petitions. Quiet successes, however, have been matched with setbacks to open government in Washington over the past three years. The Obama administration now faces an uncertain future for funding for its Office of Management and Budget's open government initiatives after the U.S. Senate appropriations committee shortchanged the Electronic Government Fund by some $10 million last week. With these proposed funding cuts the U.S. Congress is, as OMB Watch put it, "about to underfund the very tools that will tell them how federal money is being spent." When President Obama announces the U.S. National Plan for Open Government (PDF, embedded below), the implementation will have to be undertaken in that context.

The future of funding for open government platforms coming from the White House, however, now must be taken in the context of a much broader narrative that includes dozens of other countries and hundreds of millions of other citizens. Aleem Walji, writing at the World Bank, put the effort in the context of a broad move from "eGov" to "WeGov." His analysis captures something important: whatever action the United States does or does not take in its own movement toward more transparent, accountable or participatory government, there is a global movement towards transparency that is now changing the relationship of the governed to their governments. Unprecedented levels of connectivity and mobile devices have created new connections between citizens and information that lie outside of traditional methods of government command and control. The future of open government may well literally be in all of our hands.

This interview was condensed and edited. A full audio recording is embedded above.

August 17 2011

Opening government, the Chicago way

Cities are experimenting with releasing more public data, engaging with citizens on social networks, adopting open source software, and finding ways to use new technologies to work with their citizens. They've been doing it through the depth of the Great Recession, amidst aging infrastructure, spiraling costs and flat or falling budgets. In that context, using technology and the Internet to make government work better and cities smarter is no longer a "nice to have" ... it's become a must-have.

In 2011, with the election of former White House chief of staff and congressman Rahm Emanuel, Chicago has joined the ranks of cities embracing the open government movement. Before his inauguration, Emanuel released a strategic plan that explicitly endorsed open data as a part of Chicago's future. The new administration hired its first chief technology officer, John Tolva, and a chief data officer, Brett Goldstein. In the months since, the new Chicago government is doing something notable, as far as governments go: it's following through on some of its open government promises.

Interviews with Chicago journalists and open government advocates, along with Tolva and Goldstein themselves, led me to a clear conclusion: there's something new going on in the Windy City that's worth sharing with the rest of the country and world.

"Appointing Tolva and Goldstein was one of the biggest ways in which Rahm has followed through," said Virginia Carlson, president of the Metro Chicago Information Center (MCIC), in an interview this summer. "The two of them make for a powerhouse, with Brett helping with releasing the data, in terms of the APIs and the time he's spent with the community."

The city has been releasing about two datasets a week since the new administration came into office, said Brian Boyer, news application developer for the Chicago Tribune. (That data trend is a big part of what motivated Boyer to work on the Panda Project.)

From where Tolva sits, what's happening in Chicago is not limited to open data or involving the tech community in improving the city. The culture of the mayor's office "changed radically with Mayor Emanuel," said Tolva (@ChicagoCTO), speaking in a phone interview this summer. "I'm seeing the passion of the startup world here."

There's a long road ahead for open government in Chicago — the legacy of corruption, fraud and graft in City Hall there is legendary, after all — but it's safe to say that a new generation of civic coders and city officials are doing what they can to upgrade both the tone and technology that underpins city life.

"There was a lot of catching up to do," allowed Tolva. "A lot of it has been the open data publication. We've been getting very high-value datasets out almost every day. We launched an app competition. We got a performance dashboard up."

All of that is only the first step, he said. "It's part of a larger vision for stoking the entrepreneurial fires, where open data is used for much more than transparency. Data is a kind of raw material that the city encourages people to use. We're working on a digital roadmap and thinking more broadly. What can we do that will help businesses make the city more livable in a systemic way? One way we're going about that is rethinking what public space means. What are the kinds of data and interoperability standards that will allow that invisible architecture to be as accessible as a park is, and as malleable in purpose?"

Strata Conference New York 2011, being held Sept. 22-23, covers the latest and best tools and technologies for data science — from gathering, cleaning, analyzing, and storing data to communicating data intelligence effectively.

Save 30% on registration with the code STN11RAD

Tolva also offered some constructive criticism for the technologists in the open government movement to consider: "The community of civic nerds has not done a great job at engaging the big civic innovators who have no knowledge of technical skills or that area," he said. "We're trying to bring them together. One of my roles — the reason we're in the mayor's office — is to try to be that translator between the architects and the urban planners of the world and technologists."

Tolva said he is working on both economic development and applying technology to empower others to help the city work better. "I'm working with the commissioner to evangelize and convene innovators in Chicago's technology community, including Threadless, Groupon and EveryBlock. We want to promote that sensibility from the mayor's office, in terms of business developments. One third of my job is the analytics part of that, bringing data-driven decision making to the city departments, down into the individual commissions."

The movement toward opening up Chicago's government data predates the Emanuel administration, as Carlson reminded me when I asked about new releases since the inauguration.

"The conversations started in November of 2009," she said. "The city has been building its data catalog for over a year and a half. We've been waiting for someone to come in and pull the switch. Maybe one quarter of what's now available was available before the new mayor took office. Three quarters of the data was sitting on internal servers waiting for someone to say, 'yes, we can publish it!' The salaries of city workers, for instance, was absolutely something that Rahm has released, along with lots of 311 data."

311 data has been the target of much of the initial open government activity in cities around the country, given the insight it can provide into the problems that citizens are reporting and the customer service they receive from their governments. When city officials can look at what 311 data can reveal about their urban environments, for instance, new opportunities emerge for improving the way government can target its efforts in cooperation with developers and citizens. That's the kind of "citizensourcing" smarter government that Tolva is looking to tap into in Chicago.

"This is as much about citizens talking to the infrastructure of the city as infrastructure talking to itself," he said. "It's where urban informatics and smarter cities cross over to Gov 2.0. There are efficiencies to be gained by having both approaches. You get the best of both worlds by getting an Internet of things to grow."

The most important thing that Tolva said that he has been able to change in the first months of the young administration is integrating technology into more of Chicago's governing culture. "If a policy point is being debated, and decisions are being made, people are saying 'let's go look at the data.' The people in office are new enough that they can't run on anecdotes. There's the beginning of a culture merging political sensibility with what the city is telling us."

That culture sounds more than a little like the new data journalism, applied to an emerging civic stack.

"I'm proud — and a bit harried by — the number of people asking for a regression analysis," said Tolva. "We have policy analysts who are dabbling with ArcGIS and trying Python."

The business case for open data

Like every other metropolis, Chicago has budget constraints. In the current economic climate, spending public dollars has to provide a return for taxpayers. Accountability and transparency are important civic goods — but making a business case for open data requires more grounded arguments for a city CFO to support these initiatives.

"The mayor is firmly committed to innovation that really matters and that can be built upon," said Tolva. When it comes to the business case for open data, Tolva identified four areas that support the investment, including an economic rationale:

  1. Trust — "Open data can build or rebuild trust in the people we serve," Tolva said. "That pays dividends over time."
  2. Accountability of the work force — "We've built a performance dashboard with KPIs [key performance indicators] that track where the city directly touches a resident."
  3. Business building — "Weather apps, transit apps, that's the easy stuff," he said. "Companies built on reading vital signs of the human body could be reading the vital signs of the city."
  4. Urban analytics — "Brett [Goldstein] established probability curves for violent crime. Now we're trying to do that elsewhere, uncovering cost savings, intervention points, and efficiencies."

Opening Chicago's data

Opening up Chicago's government data further will take time, expertise, and political support, along with a lot of hard work. Applying it is no different. For now, Tolva and Goldstein have the first three components firmly in hand. The latter is what lies ahead.

"In the realm of public safety, I had a good sense of the relevant data structures," said Goldstein in an interview this summer. "The city is an enterprise that's so large, with so many different functions and so many different data structures, that making sense of the landscape and developing a plan is a challenge."

From enterprise resource planning systems to public health to transportation, there's great diversity in how city data is structured and stored.

"One of the things Chicago has done very well is collect data," Goldstein said. "Now, one of the things we need to do is develop a holistic vision for an enterprise data architecture and data warehouse. How to do you take the things that are meaningful from architecture and then make them meaningful to the public?"

Given the challenges involved here, it wasn't surprising to hear Goldstein say that "we're not where I want to be yet" — but he's approaching the process methodically. "I want to know the entire lay of the land, have everything mapped out and understand the next steps."

As he looks ahead, Goldstein is less worried about access or load concerns, given the city's use of the Socrata online platform for open data. He's more focused on sustainable design.

"I want to make sure that the path we take the city on is sustainable and has a more open architecture," he said. "I find that when we choose proprietary solutions, it's hard to get the data out. If I'm going to sit down and code, I'm going to do it in Python, use Linux, and I'm going to be happier about it.

Goldstein is well aware of persistent issues around data quality that have dogged the use — and reputation — of open government data releases. "I'm very traditional in how I deal with data," he said. "It's the same as working with analytics. You need to make sure data is clean and high quality."

The process to get to clean data is, as Goldstein described it, quite methodical: "We have multiple phases for how we roll out data internally, starting with working with the business owner. We figure out how we'll get it out of the transactional database. After that, we determine if it's clean, if it's verified, and if we can sign off on it technically."

The last step is analyzing whether the process is sustainable. "Some people send a spreadsheet, upload it and maintain it manually," said Goldstein. "That's not sustainable. We have hundreds of datasets. We're not going to do that. You need to write code that updates data on its own, and then you can focus on new datasets."
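
The article doesn't describe Chicago's actual pipeline, but the shape of the job Goldstein alludes to is straightforward: pull current rows from the transactional store, apply a basic cleanliness check, and push the result to the portal on a schedule. Here is a hedged sketch, with a hypothetical database, table and publishing endpoint; a real job would use the portal's own publishing API.

    # Sketch of an unattended refresh job in the spirit Goldstein describes.
    # The database, table and portal endpoint are hypothetical; a real job
    # would use the portal's publishing API and run under a scheduler
    # such as cron.
    import json
    import sqlite3
    import urllib.request

    conn = sqlite3.connect("city_operations.db")
    rows = conn.execute(
        "SELECT permit_id, issued_date, ward, status FROM building_permits"
    ).fetchall()

    # minimal cleaning: drop rows missing an identifier or an issue date
    records = [
        {"permit_id": r[0], "issued_date": r[1], "ward": r[2], "status": r[3]}
        for r in rows
        if r[0] and r[1]
    ]

    req = urllib.request.Request(
        "https://data.example.gov/api/datasets/building-permits",  # placeholder
        data=json.dumps(records).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)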

At a high level, Chicago's chief data officer emphasizes the value of open data in providing the city with insight into its business processes. "Opening data alone isn't enough," Goldstein said. "We're giving people the data to make meaningful apps and do meaningful research — but are we putting out a tabular dataset? Is it spatially enabled? Are we offering KML files directly versus a downloadable file? If we keep the KML file updated, then [application developers] can access the data directly from the app."

In this respect, Goldstein's focus on making data clean, sustainable and directly available suggests that he's attuned to what citizens want when they build applications. An open data study from late last year found that a majority of citizens prefer to explore and interact with data online, as opposed to downloading data to examine in a spreadsheet.

To fully embrace this vision, however, Chicago is going to have to build out its data capabilities to become a smarter city. "The first step is moving over to a more open platform," said Goldstein. "You don't have to make a multi-million-dollar investment to get a fancy GUI and something meaningful. If you bring something over to Linux, between Python and R you can produce some remarkable outcomes. These are some really low-cost solutions."
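
To illustrate how little tooling that can require, here is a standard-library-only sketch that tallies a hypothetical 311 extract by ward and request type; the file name and column names are assumptions for the example, not a real Chicago dataset.

    # Low-cost analysis in the spirit Goldstein describes: nothing beyond the
    # Python standard library. The CSV file and its "ward" and
    # "service_request_type" columns are hypothetical.
    import csv
    from collections import Counter

    counts = Counter()
    with open("311_requests.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[(row["ward"], row["service_request_type"])] += 1

    # the ten busiest ward / request-type combinations
    for (ward, request_type), n in counts.most_common(10):
        print(f"ward {ward}: {request_type} -> {n}")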

They're looking to use city data to make the city more productive and the processes better, said MCIC president Virginia Carlson. "For example, what if the city wants to understand zoning and the retail food landscape? Using its own food licensing and food inspection data, they can see where food is being sold. If Walmart is coming in, can the city mine its own data to understand where food deserts are and have a much richer understanding of its landscape?"

The city won't be working on this alone either, emphasized Goldstein. "We have great academic partners and lots of people coming to the table. We don't need to be afraid of using these tools. It's high time."

Refining apps competitions

The design of the Apps for Metro Chicago competition offers some insight into how Chicago has learned from what other cities have done in their own open government and open data efforts. The competition is taking a next-generation approach, trying to provide technical assistance and connect communities with software developers.

"When I think about where we are, versus a San Francisco or Boston, it's because of examples of what worked and what didn't," said Tolva. "The judging criteria for the competition takes into account the sustainability of an idea, along with its cross-platform nature." In the video below, city officials talk about open data and building applications that are useful to the community.

Given the points that have been raised about the sustainability of apps contests, tying development to the demonstrated needs of citizens looks like an idea whose time has come. Look to the submitted ideas for NYC Big Apps in version 3.0 of its competition, for instance.

"We've elevated business viability in the judging rubric and are working with a great partner, MCIC," said Tolva. With regard to NYC BigApps 3, "there are all kinds of apps that we'd love to have," he allowed, but the applications in Apps for Metro Chicago have to solve business problems.

"The judging rubric has it that you have to demonstrate community participation and then release open source code," said Carlson. "The app has to be free to users for a year. We're very conscious that we don't want this to be a big competition ... and then it's over."

Tolva also focused on building community around apps contests and bringing more voices into the process. "We're using the Apps for Chicago to get a new kind of civic engagement and participation, which you can get involved in whether you write code or not," he said. "We've invited community leaders and groups to the table. The idea for a 'Yelp for social services' didn't come from a technologist, for example. We're curating ideas from non-technologists."

The hypothesis in Chicago is that this hybrid strategy will result in better outcomes for taxpayers, developers and, crucially, citizens. "The apps competition needed to have a data expert, with someone outside of the city running it," said Carlson. "Justin Massa helped write the rules. Chicago was the first place to bring in unbiased external experts. Can we understand what we need to by doing open data right? This story is just beginning. The question will be whether, in six to eight months, this model works. We need to promote data sharing and cleanliness between data departments, to have data tickets, an internal account and a liaison who can share that information, getting that productivity feedback and communication with developers."

The better part of an apps competition is the feedback on the data, said Carlson: not just how the city can use data on the public-facing side, but how it can apply data on the enterprise architecture side. "We're trying to capitalize on the cool factor to enhance internal processes, working with staff, and trying to get data to understand the city."

Writing the rough code of history

"We have been trying to get data out of state and local government for more than 20 years," said Carlson. "For me, to see this tide coming along from loosely affiliated millennials willing to stay up all night is inspiring. That's what's creating the energy to free up the data — this distributed network that's been living and breathing opening up the data."

There's more than the energy of millennials to celebrate here, however, as she emphasized. "They're pushing the data out to citizens as a way of running the city," she said. "It's in a business enterprise kind of way — that's the way Rahm is thinking about it. Using it internally hasn't been emphasized a lot, but it's a big part of what they're trying to do."

To get anywhere close to achieving that goal, Chicago will have to close the IT gap between the public and private sector, particularly in the emerging field of data science.

From the outside, it looks like the city's technology officials are hungry to improve how Chicago uses technology. "In the private sector and research community, we do cutting-edge work," said Goldstein. "Why shouldn't the government do this? Why should the bar be any lower?"

For now, as the new administration finds its way, there's hope that Chicago will take a leading role among other cities adopting open government.

"The combination of committed political leadership, engaged civic leaders and a vibrant start-up scene has made Chicago the place to watch for people who care about technology and society," said John S. Bracken, director of media innovation at the John S. and James L. Knight Foundation, when asked for comment. "We're living in what is potentially one of the most important times in the city's history."

Photo: Chicago Skyline @ Night by Rhys Asplundh, on Flickr





August 04 2011

Energy.gov relaunches using open source and the cloud

In April, Radar reported that Energy.gov was moving to Drupal. This morning, the Energy Department launched a redesigned Energy.gov as an interactive open platform that enables information exchange, open data and localized information for citizens. The new Energy.gov uses a combination of open source technology and cloud computing that will save an estimated $10 million annually, according to Energy Department officials.

"Our goal is to make Energy.gov easier to use, more transparent and more participatory," said Secretary of Energy Steven Chu in a prepared statement. "This next phase is part of our ongoing commitment to empower consumers and businesses with the information, tools and services they need to save money, create jobs and find opportunities in the new energy economy."

The new Energy.gov is built using Drupal 7, the same open source content management system used at WhiteHouse.gov, Commerce.gov, House.gov, and it's the system that supported the reboot of FCC.gov as an open government platform. Drupal distributions are now supporting a growing number of open government platforms in local, state and federal government.



Saving money through open source


The new site was implemented by several different firms. The Treehouse Agency built the backend, HUGE Inc designed the front end, Acquia helped with Drupal support and Energy Enterprise Solutions served as the integrator. Energy.gov is hosted in the cloud by BlackMesh.

[Disclosure: O'Reilly AlphaTech Ventures is an investor in Acquia.]

"The initial investment for the project was nearly paid for by consolidating other sites into this platform and not building new stand-alone sites in other places," wrote Cammie Croft, senior adviser and director of new media and citizen engagement at the Department of Energy, in an email this morning.

"We strategically invested our resources into open source and cloud solutions where possible," she wrote. "We anticipate more cost savings as we consolidate more sites into this platform and eliminate duplicative and out-dated website infrastructure elsewhere. As we consolidate more sites into the platform, we anticipate cost savings or avoidance of $10 million in a year. As of this launch, we're over $1 million in cost savings or avoidance."

The redesigned site has several notable bells and whistles around localization, data visualizations and open source mapping tools that use Node.js. Energy.gov features interactive maps built from open government data using MapBox, a map design suite from Washington, D.C.-based development firm Development Seed. For instance, an "alternative fuel locator" dataset is mapped and embedded below:

"This is about telling complex stories with data, and beautiful maps matter," said Development Seed founder Eric Gunderson in an interview this morning. "It just makes data a lot more consumable for citizens. The best part is that agencies are now able to do this for free on their own using open source tools."

For more ways that the Energy Department is tapping technology to deliver on its mission, including fuel economy, a solar decathlon, ARPA-E and more, make sure to read Aliza Sherman's excellent Mashable article.

Croft emphasized that the relaunch shouldn't be seen simply through the prism of cost savings alone. "This isn't about reducing the bottom line," she wrote. "It's about being more strategic with our investments in digital communications and technology."

The Energy.gov site allows Energy staff to create new sites without needing to go to developers. They'll "own" their own platform and will be able to add more functionality from the open source community in the future and contribute code back as well.

In that context, open source is playing an important role in open government, although it's hardly a precondition for it. Whether it's Energy.gov, coding the middleware for open government data or codesharing with CivicCommons, open source matters more than ever. As we move together into the 21st century, open source technology and collaborative models will matter in media, mapping, education, smarter cities, national security, disaster response and much more in 2011 and beyond. The success of open source in building systems that work at scale offers an important lesson to government leaders as well: to meet grand national challenges and create standards for the future, often it's best to work on them together.

July 26 2011

A refresh for open government in British Columbia

The citizens of British Columbia have much more to be proud of than extraordinary natural beauty and abundant resources. Last week, Canadians in the province saw three new websites go online that focus on releasing open government data, making information related to accountability available, and providing easier access to services and officials. Among them is the province's new open data catalog at data.gov.bc.ca.

With the launch, the province has joined a growing community of governments that have adopted open government principles. As the Vancouver Sun reported, the open government initiative and open data portal fulfill promises made by British Columbia Premier Christy Clark during her campaign. Clark has committed to making information on surgical wait times, school test scores and public-sector salaries available to the public. Last week, the premier directed her ministers and ministries to make more data available and then report back to the B.C. cabinet every quarter. Clark recorded a video, embedded below, in which she talks about what open government means to her and the accompanying change in culture that she's asking of public servants.

This was the first time a premier's video message was embedded in British Columbia's internal intranet for public servants, said David Hume, executive director of citizen engagement, business and workforce transformation, in a phone interview. "We, like many other public services, try to maintain this difference between the political and the non-partisan," he said. "The reason that we did this was because the challenge that she gave to public servants is so significant. The culture shift isn't trivial."

In other words, the legal, cultural and technical aspects of open government have to be in sync, along with a strong rationale for people to become more civically engaged. "In order to have meaningful conversations with citizens, they need to have the same information government has in raw form to do their own work," said Kevin Jardine, assistant deputy minister, in a phone interview. "You have to see a cultural change that sees opening up data as the default condition, versus one having to have an excuse, legal or otherwise, for having it otherwise."

Risks and rewards for open government

The value proposition for the average citizen comes when you bundle the whole of this open government initiative together, explained Stephanie Cadieaux, minister for citizen services and open government, in a phone interview. "The open data initiative, and what we're going to be doing with public engagement, that's when you get the real value," she said. "When the information is out there, the community has access. It enhances democracy and citizenship."

"I think it's natural for people to be apprehensive at first," said Cadieaux. "This is a really different way of doing government than in the past. We need to find the best ways to use all the technologies to enhance what we do for citizens. There's potential value in opening up the data. Quite frankly, we don't have all the answers. This opens up a whole new world of looking at the data from different perspective, some of which may be "validating ways of doing things that we've done in the past."

The outgoing chief information officer of the United States, Vivek Kundra, recently highlighted concerns about unanticipated sensitive information being revealed through the combination of multiple datasets. Critics of open data initiatives also have focused on the potential for open government data to be misrepresented in the public sphere.

Cadieaux acknowledged the need to protect privacy and security, along with the reality of democratized data. "Whenever data is released, in a report or otherwise, it will be misused," she said. "You can say the same about statistics. I think it's the responsibility of government, who have collected the data, to make sure it's correct."

Realistically, said Cadieaux, anything new comes with risk. "Government tends to be quite risk averse, and for good reasons. We must do the best job with dollars and for citizens." Simply having the information is valuable, emphasized Cadieaux. While the same opportunity exists for open data to be used negatively, "there's opportunities for incredible innovation," she said. "We have to be open to it."

Once government releases open data, it's up to the larger community of civic coders, media, government, citizens, activists and nonprofits to further vet its accuracy and representations based upon it. "I think there will be a lot of self-regulation," she said. "That's what we see on Twitter. You have to have faith in people."

While officials in some state or national governments may be interested in adding direct revenues through selling public data, Cadieaux doesn't support that direction. "It's data that was collected on the public dollar," she said. "To charge the public to use it doesn't seem like the best use. I think we're going to be received well. Releasing government data provides a great opportunity for governments around the world to speak differently and gain value," said Cadieaux. "I would hate to see it as a revenue stream when what I think it should do is engage in a meaningful way with our citizens."



Separating good government from open innovation

One of the semantic challenges that has dogged open government advocates lies in the difference between good government, traditionally associated with transparency and accountability, and open innovation, where public servants collaborate with the public in co-creating services or policy. The B.C. government chose to separate these concerns into two components, calling them "open data" and "open information."

You have to make a distinction between the types of information early on, said Kevin Jardine, assistant deputy minister for business and workforce transformation. The most recent version of B.C.'s Freedom Of Information & Privacy Act (FOIPA) has been on the books since the early 1990s, he explained, and it provides a pathway for citizens to request information from government.

"The bulk of those requests are for person information, like for adoption, for example," he said. "Some are for other information, like meeting notes, calendars or expenses from government officials. We've made distinction that open information is information about government, versus open data, which is data that government uses that is about the business of government."

Jardine said the B.C. government will be posting expense information for ministers and deputy ministers, along with proactively publishing information that people request and often pay for to the public broadly. "We anticipate — and this is part of the premier's thinking — that there will be more of this," he said. "As we look at what we think open government is — transparency, engagement, participation, citizen-centered design — we've tried to reflect that in the sites we've developed. Doing government differently is about engaging the public directly, becoming greater partners in governance, solving problems we collectively face."

This compartmentalization may help to separate the specific requirements of good government advocates from those of the innovation community in B.C. "We can never be transparent enough," said Jardine. "It's a game you can never win. We might hope that it reduces FOIA requests, but we anticipate that it may increase them."

Standing up the open information site was — and is — complicated by back-end systems, said Jardine. "The corporate system or travel system were never designed for this kind of transparency," he explained. "We've had to hack and splice to make even this small innovation work. We have to really work and try to standardize processes." Building openness into a system from the beginning, in other words, has considerable merits in open government as well as open source software.

Opening the data

At the time of launch, there were on the order of 2,400 data sets online, said Jardine. "Almost all of the data was previously available somewhere within the 400,000-page website that makes up the B.C. government's web presence, but it was extremely difficult to find, with no standards or consistent licensing."

The first step, said Jardine, was to consolidate access to all of that data through a single place: data.bc. "This is a consistent catalog, with a license that makes it possible to use the data, with one exception — where prohibited by law. We've made an effort to convert many of those datasets to machine-readable format." While there are no APIs at the moment, Jardine says that the B.C. government is collecting suggestions, "since they are definitely on the agenda."

For the moment, Jardine said that the provincial government needs to understand more about what data people want. "One of the discussion groups is where the interests may lie," he said, pointing to the online open data communities that already exist in the province.

"What levels of effort are necessary here? What we've done with our holdings to date is gather what's out there into static datasets for download. As we progress, the APIs are definitely going to be on the table." Inside of government, said Jardine, they expect to see many more datasets available over the coming year. "Outside, we hope to see many more people aware of it and using it. You will see links to iTunes and other apps stores."

While there's no automated, technical means to request data at the moment, Jardine noted that interested parties can fill out an online contact form or contact the B.C. government open data team via social media channels. "We've already gotten a couple questions from our Twitter channel," he said. "The challenge is to match requests about what our holdings are. As for high value, what people are requesting, demanding it will be important, in terms of data tied to time series or geography."

In that respect, the question is whether the data released is valuable for understanding or improving the business of government, or to the private sector. Hume, who is the lead on the open data site, said in an interview that more performance data, like surgery wait times for specific physicians, is necessary. "We're working on getting that surgery data on the site," he said. "You can get social services data now, including key performance indicators for things like child protection."


The right open data license matters


Jardine also highlighted the importance of thinking through how the data is released. "You have to do more than just make the data available. You also have to get the license right," he said. In that respect, open government advocate David Eaves wrote that the license that the B.C. open government data catalog is released under is the "single biggest good news story for Canadians interested in the opportunities around open data." While Eaves is less positive about open data licenses in Canadian government at the federal level, he's optimistic about the prospects for cities and towns. "The fact that most new open data portals at the municipal level have adopted the PDDL suggests that many in these governments 'get it'," he writes. "I also think the launch of data.gov.bc.ca will spur other provinces to be intelligent about their license choice."

The British Columbia license, which was adapted from the United Kingdom's license, might also serve as a model for an open government data license for the world, writes Glynn Moody. "If most governments adopted the same open data licence, their projects would be compatible and therefore able to be combined easily," he said. "That would mean open government data could scale in a useful way."

Why does this license matter? "BC's open data license allows entrepreneurial use," tweeted Bowen Moran (@bxmx) in response to a question about the open government initiative. Bowen works on the B.C. public engagement team. "Part of the story there is how it's different from the UK license," Moran continued in a series of tweets. "BC's privacy protections are more robust than the UK's, so the links to the Act are more clearly defined ... the direct emphasis on combining it with other information or by including it in your own product or application is an explicit (and thus very open) invitation to entrepreneurs to build on what we have here. Open government is more than just data being available — it's about an entirely open approach to working with citizens. That's the license's spirit."

Building upon government as a platform

The architects of British Columbia's open government initiative specifically couched their efforts in terms of Tim O'Reilly's Gov 2.0 paradigm. Jardine said the B.C. government wants to see future iterations of the open data site driven by users, "because we really do view this as a citizen platform, as government as a platform."

"This open government initiative is about releasing information proactively, publishing data online or culture change. It's about business people using data and about informed decision makers making better public policy," said Hume. "It's tapping the ingenuity and excellence of British Columbians to use the data as a platform, to solve their own problems, and to self organize to get things done."

What's especially interesting here is that they've clearly internalized some of the lessons other countries have learned in their open government efforts. "The experience has been to build it and they won't come," said Jardine. "You will get a blip from developers, from those that are proficient, and then it tails off. By adding components that enable people to use the data, to embrace this government as a platform idea, to create community, we hope this leads to greater success." Jardine said that provincial government is working on a citizen engagement site that he expects to see linked to the open data website.

"The real success is about building community, certainly as a data publisher," emphasized Hume. "The higher-value function is being able to connect people to one another." (Historically, as Clay Shirky has observed, connecting citizens to one another has been undervalued, in the context of the Internet acting as a platform for collective action.) There are a number of places online for citizens, civic coders, government officials, journalists, nonprofits, user experience designers, librarians and other interested parties to connect, explained Hume. "One is the open data blog, where we'll be able to talk, at general level, and also at technical level. We'll also connect existing people working to government. We're lucky to have Open Data BC and will be working closely with them. We have integrated their Google Group into the site already. We're going to where people already are and connecting users where people who are already experienced."

The B.C. open government initiative isn't about data, culture, accountability or efficiency, though they all matter. "It's not really about the data — it's about building a community to work together to solve problems," tweeted Moran. "I love the idea that now I don't just serve the people of BC. With open info and open data, I serve with them."

(Note: Bowen Moran's quoted tweets were edited for clarity.)

July 14 2011

There are bigger issues surrounding the .gov review

The United States federal government is reforming its Internet strategy. In the context of looming concerns about the debt ceiling, high unemployment, wars abroad, rising healthcare costs, and the host of other issues that the White House and Congress should be addressing, that might seem like a side issue.

It's not. The federal government spends some $80 billion every year on information technology. If you're paying any attention at all to government IT, you know that Uncle Sam is not getting his money's worth from that investment. Furthermore, the federal government has not been getting the kinds of returns in productivity or efficiency that the private sector has enjoyed over the past decade or so. Former White House OMB director Peter Orszag called that disparity the "IT gap" last year. So when the Obama administration launched a new initiative on Web reform this summer, it might have seemed a little overdue.

Better late than never, and better now than later.

Citizens are turning to the Internet for government data and services in unprecedented numbers, and they're expecting to find answers, applications and, increasingly, people. While public servants can join the conversations on social networks to address the latter demand, delivering improved information and e-services confronts the federal government with some tough choices, given budget constraints. That challenge is one reason that they're looking to the general public and the private sector for some ideas on how they can improve their strategy.

This week, in service of that goal, the White House hosted a livechat on improving federal websites with Macon Phillips, White House director of digital strategy, Vivek Kundra, the federal chief information officer, and Sheila Campbell, director of the GSA’s Center for Excellence in Digital Government. The chat, which has become a standard tool in the White House's online communications toolkit over the last year, included a livestream from WhiteHouse.gov/live, a Facebook chat and an active Twitter backchannel at the #dotgov hashtag. The White House also took questions through a form on WhiteHouse.gov and its Facebook wall.

These issues aren't new, of course, even if the tools for discussion have improved. And if you've been following the Gov 2.0 movement over the years, this issue of how the government can use the Internet and associated technologies to work better has been at the core of the discussion throughout. Success online used to be measured by having a website, said federal chief information officer Vivek Kundra. As he observed immediately afterwards, "those days are long gone."

If the federal government is going to reform how it uses the Internet, it will need to learn and apply the lessons that Web 2.0 offers to Gov 2.0, whether it's standing up open government platforms, leveraging the cloud, crowdsourcing, or making data-driven policy.

Government is also going to need to stop creating a new .gov website for every new initiative, particularly if they're not optimized for search engines. There's some good news here: "Every month, historically, federal agencies would register 50 new domain names," said Kundra on Tuesday. "That's been halted."

This proliferation of federal .gov websites has been an issue for some time — call it ".gov sprawl" — and that's what's driven the .gov reform effort in the context of the Obama administration's campaign to cut government waste. This week, for the first time, a dataset of federal executive branch Internet domains has been published as open government data online. The dataset of federal .gov domains is hosted on Data.gov and has been embedded below:

Federal Executive Branch Internet Domains

"This dataset lists all of the executive branch second-level domains within the top-level .gov domain, and which agencies own them," commented General Services Agency new media specialist Dan Munz in the Community page for the dataset. "As White House Director of Digital Strategy Macon Philips has pointed out (see "TooManyWebsites.gov"), while many of these domain names point to sites that are valuable, some are duplicative or unnecessary. That makes it harder to manage the .gov domain, impairs agencies' ability to distribute information, and creates a user experience for citizens that just isn't as good as it could or should be. How can we fix that? Over the coming months, we'll have a plan for streamlining the federal executive branch webspace, and we want to invite you into the conversation. We're releasing this dataset as a first step, so that you can explore, comment, remix, and maybe even use the data to map the .gov domain in ways we haven't seen before."



Why reforming .gov matters

This effort is not impressing all observers. Micah Sifry, the co-founder of the Personal Democracy Forum, has called the move to delete redundant websites "cheap, dumb and cynical" at techPresident. "Redundant government websites probably cost the taxpayer a fraction of what we spend on military bands, let alone what we spend on duplicative and unnecessary government websites promoting the Army's, Navy's, Air Force's, Merchant Marine's, Naval Academy's, and Coast Guard's bands! (According to NPR, the Marines spend $50 million a year on their bands, and the Army $198 million.)" In a larger sense, Sifry argued, "if you are really serious about eliminating stupid and pointless spending, then you'd be pushing for laws to strengthen protections for government whistleblowers (instead of going on a stupid and pointless rampage to prosecute them!), since insiders know where the real waste is hidden."

Sifry is absolutely right on one count: the amount of money to be saved by reducing federal .gov websites is dwarfed by what could be saved by, say, reducing Medicare fraud using new data analytics tools, or by finding cost savings in defense spending. Reducing the number of federal .gov websites by 90% would not significantly address the federal deficit. The biggest federal cost savings cited in this week's .gov livechat likely came from Kundra, when he said that 137 federal data centers would be closed by the end of this calendar year, each of which consumes immense amounts of energy.

Where Sifry may have been overly harsh in his critique is in not acknowledging how progressive a perspective the White House appears to have embraced here. (Progressive meaning "forward-thinking," not political ideology, in this case.) Democratizing healthcare data so that it shows up in search engine results or is integrated into applications makes it more useful, argues Kundra, citing the improvements to hospitalcompare.gov. Moving from a static website to a universe of applications and services provisioned by open government data is shifting from a Web 1.0 vision to a 2.0 reality. In a country where 35% of citizens have a smartphone, delivering services and providing information to a mobile audience has to be factored into any online strategy, whether in the public or private sector. And, in most cases, it's the private sector that will be able to create the best applications that use that data, if government acts as a platform to empower civic coders. Phillips acknowledged that explicitly. "The best mobile apps," he said, "are going to be driven by the private sector making use of public data."

If e-government is going to move toward "We-government" — as Sifry has described the growing ecosystem of civic media, technology-fueled transparency advocates and empowered citizens — government data and services will need to be discoverable where and when people are looking for them. That is ultimately, in part, what getting .gov reform right needs to be about, versus straightforward cost-savings.

Kundra asked the broader community to "help us think through how we're going to provide services over the mobile Internet." If, as he said, search is the default way that people look for information now, then releasing high-quality open data about government spending, the financial industry, healthcare, energy, education, transportation, legislation and campaign finance would be a reasonable next step. Tim O'Reilly has been sharing a simple piece of advice with the architects of platforms for years: "Don't make people find data. Make data find the people."

The .gov reform, in that context, isn't just about reducing the number of websites and saving associated design or maintenance costs. It's about reducing the need to ever visit a website to retrieve the information or access a citizen requires. In the years ahead, it will be up to Congress and Kundra's successor as federal CIO — along with whomever he or she reports to in the Oval Office — to get that part of "web reform" done.
