
April 05 2011

FCC.gov reboots as an open government platform

A decade ago, the Federal Communications Commission's (FCC) website received an award as the best website in the federal government, but the largely static repository has steadily declined over the years to become one of the worst. Today, the bar for federal government website reboots has been raised with the relaunch of the new FCC.gov, now available in beta.

The new site is organized around three primary activities: file a public comment, file a complaint, and search for information. The insight for that redesign came through a combination of online traffic analysis, requests for information through the call center, and conversations with FCC employees.

Some changes that go along with the new FCC.gov are literally tiny, like a newly launched URL shortener. Others look small but are a big deal, like secure HTTPS browsing across the site. Still other upgrades target small devices, enabling interested parties to watch proceedings wherever they are: the livestream can now sense the device that someone is using and serve HTML5 or Flash on the fly. That livestream can also be embedded on other websites.
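Device-aware format selection of this kind usually comes down to inspecting the browser's user-agent string. The sketch below is a minimal illustration of the idea; the patterns and return values are assumptions, not the FCC's actual detection logic.

```python
import re

def pick_stream_format(user_agent: str) -> str:
    """Return 'html5' for devices that lack Flash support, else 'flash'.

    The device patterns here are illustrative placeholders, not a
    production-grade detection list.
    """
    html5_only = re.compile(r"iPhone|iPad|iPod|Android", re.IGNORECASE)
    if html5_only.search(user_agent):
        return "html5"
    return "flash"

# An iPad gets HTML5; a desktop browser of that era falls back to Flash.
print(pick_stream_format("Mozilla/5.0 (iPad; CPU OS 4_3 like Mac OS X)"))  # html5
print(pick_stream_format("Mozilla/5.0 (Windows NT 6.1) Firefox/3.6"))      # flash
```

In practice a server would read the `User-Agent` request header and choose which player markup to emit; the embeddable player would carry the same logic with it.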

All of those upgrades add up to a greater whole. Broadly speaking, FCC managing director Steve Van Roeckel and his team of software developers, designers, new media and IT security staff have worked hard to bring Web 2.0 principles into the FCC's online operations. Those principles include elements of open data, platform thinking, collective intelligence, and lightweight social software. What remains to be seen in the years ahead is how much incorporating Web 2.0 into operations will change how the FCC operates as a regulator.

Nearly two years ago, Tim O'Reilly and John Battelle asked how Web 2.0 technologies could transform the actual practice of governing. The FCC has taken a big step toward that vision, at a total development cost of approximately $1.35 million. "Everything should be an API," said Van Roeckel, speaking in a briefing on Monday. "The experiences that live outside of FCC.gov should interact back into it. In a perfect world, no one should have to visit the FCC website." Instead, he said, you'd go to your favorite search engine or favorite app, and open data from the FCC's platform would be baked into it.

The overhaul of FCC.gov has been underway since last September. "We're approaching .gov like .com," Van Roeckel said at the time. Seven months later, FCC.gov is the next iteration of what an open government platform can be — at least with respect to the digital architecture of a regulatory agency.

"It is our intention that every proceeding before the agency will be available for public comment," Van Roeckel said at the briefing. "If we think of citizens as shareholders, we can do a lot better. Under the Administrative Procedure Act, agencies will get public comments that enlighten decisions. When citizens care, they should be able to give government feedback, and government should be able to take action. We want to enable better feedback loops to enable that to happen."

Following are five ways the new FCC.gov improves on the previous version, followed by an analysis of how some of these changes relate to open government.

1. FCC.gov runs on open source

Specifically, the redesigned FCC.gov runs on the open source Drupal content management system, as other federal sites do. The FCC also considered SharePoint, Documentum, WordPress and Ruby on Rails before ultimately going with Drupal. The use of Drupal at the White House was a "strong validator" for that choice, said Van Roeckel. As the White House has done, the FCC will contribute code back to the Drupal community, he said.

2. FCC.gov is hosted in the cloud

Federal cloud computing is no longer on the horizon; it's now a reality. Last May, the White House moved Recovery.gov to Amazon's cloud. Today, the new FCC.gov is hosted in Terremark's cloud, according to Van Roeckel, with content accelerated by the Akamai content delivery network.

This Terremark implementation has been certified at the higher level of security required for compliance with the Federal Information Security Management Act (FISMA). FCC.gov, in fact, is one of the first federal websites hosted in the cloud at the FISMA moderate level. Van Roeckel, however, cautioned that only information the agency deems "read-only" will be hosted externally. Transactional implementations in the cloud will follow later. "Everything that the government allows us to shift to the cloud, we will shift to the cloud," he said.

The move to the cloud is being driven, as with so many aspects of government, by costs. As with much of industry, the FCC's servers have been underutilized. Moving to the cloud will enable the agency to more closely track actual usage and adjust capacity. "My goal is to move everything from a capital expense to an operating expense," said Van Roeckel.

3. FCC.gov incorporates collective intelligence

Over time, the new FCC.gov will begin to show the most popular pages, official documents, and comments on the site.

Last year, the Office of Management and Budget (OMB) updated rules for cookies on federal websites. Among other changes, the new guidance allowed federal agencies to use data analytics to monitor website traffic, interact and engage with citizens online, deliver e-services, and provide information.

Van Roeckel also pointed to the use of FCC broadband speed testing apps to collect more than 2 million tests across the United States as a precedent for looking to citizens as sensors. The testing data was integrated into the FCC's national broadband map. Van Roeckel said that he has hopes for other uses of collective intelligence in the future, like crowdsourcing cellular tower locations.

4. FCC.gov has significantly improved search

The less said about visitors' ability to find information on the old site, the better. The new FCC.gov uses Solr, an open source enterprise search platform from the Apache Lucene project. The new functionality is built upon a topic-based "FCC encyclopedia" that provides dynamically generated results for each search. "It's not breakthrough stuff, but it's breakthrough for government," said Van Roeckel, who indicated that further improvements and fine-tuning are coming within a week.
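One reason Solr is an easy fit for a site like this is that it answers plain HTTP requests through its standard select handler. The sketch below only builds such a request URL; the host and core name are placeholder assumptions, while `q`, `rows` and `wt` are stock Solr request parameters.

```python
from urllib.parse import urlencode

def solr_query_url(base: str, q: str, rows: int = 10) -> str:
    """Build a query URL for Solr's standard /select handler.

    `base` points at a Solr core; "http://localhost:8983/solr/fcc"
    below is a hypothetical example, not the FCC's actual deployment.
    """
    params = {"q": q, "rows": rows, "wt": "json"}  # wt=json asks for JSON output
    return f"{base}/select?{urlencode(params)}"

url = solr_query_url("http://localhost:8983/solr/fcc", "net neutrality", rows=5)
print(url)  # http://localhost:8983/solr/fcc/select?q=net+neutrality&rows=5&wt=json
```

Fetching that URL with any HTTP client returns a JSON document whose `response.docs` array holds the matching records.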

5. FCC.gov is a platform for open data

"The whole website is built on RESTful services by itself," said Van Roeckel. "What I saw when I came in is that we had amazing data locked up in silos that were inaccessible to companies and citizens. There were all these roadblocks to getting and using data."

The new site makes the data for public comments accessible through an associated API. There are also now chief data officers for wireline, wireless, consumer, media, enforcement, international, engineering and legislative affairs. Each data officer is personally accountable for getting data out to the public.
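A consumer of such an API would typically receive JSON and filter or group it client-side. The mocked-up payload and field names below are assumptions for illustration, not the FCC API's actual schema or endpoint.

```python
import json

# A hypothetical response body; real field names may differ.
sample = """
{
  "comments": [
    {"proceeding": "09-191", "author": "J. Public",  "text": "Keep the internet open."},
    {"proceeding": "09-191", "author": "A. Carrier", "text": "Rules are unnecessary."}
  ]
}
"""

data = json.loads(sample)

# Group commenters by the proceeding they filed on.
by_proceeding: dict[str, list[str]] = {}
for comment in data["comments"]:
    by_proceeding.setdefault(comment["proceeding"], []).append(comment["author"])

print(by_proceeding)  # {'09-191': ['J. Public', 'A. Carrier']}
```

This is the "experiences that live outside of the site" pattern Van Roeckel describes: any third-party app that can parse JSON can build its own view of the public record.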

The roadblocks are far from gone, but due to the efforts of the FCC's first chief data officer, Greg Elin, some are being removed. The agency has launched new data tools, worked to share more of its APIs and hosted its first open developer day. Van Roeckel says the agency is working to standardize its data, with XML as the general direction. If the future he described is near, the FCC will increasingly ask companies to file regulatory information in electronic, machine-readable formats.

Open government, inside and out

There are three lenses through which to view what open government means for a federal agency, at least as defined by the Open Government Directive: transparency, participation and collaboration.

Open data, online video and collective intelligence applied to governance will help with transparency. Collective intelligence may help to surface key documents or comments. Participation and collaboration in open government have proven to be a harder nut to crack.

The role that a regulator plays matters here. Comments submitted through the agency's earlier blogs and sites were entered into the public record. "Today, you can take us to court with one of the blog comments," said Van Roeckel. "More than 300,000 citizens gave comment on the Open Internet proceeding." Whether those comments lead to positive or negative public policy changes is both an open and contentious question, as this analysis of who wins and loses under FCC net neutrality rules suggests.

That doesn't mean improving the capacity of the FCC to conduct more open rulemaking online wasn't worth the effort. It means that to make those processes truly open, the regulators themselves must shift to being more open.

Embracing the spirit of open government will require all agencies to move beyond what information technology by itself can accomplish or empower. That's a tall order. It requires a cultural change. To put it another way, open government is a mindset.

That's particularly true when applying an open government mandate to an institution with around 1,900 workers, where the dynamics that MIT research professor Andrew McAfee aptly described in "Gov 2.0 vs the beast of bureaucracy" are in play.

Van Roeckel said that the FCC launched its new tools internally first, including an anonymous comment box and blog. The agency is working on "bringing Web 2.0 culture into the building" where possible. For instance, it has been using ThinkUpApp for internal collaborative innovation.

For other agencies to succeed in a similar refresh, Van Roeckel shared a key point of advice: "Get someone on the executive team who can get resources and own the mandate. That the chairman cares about this and that I care about this is why it's happening."

Whether those internal and external efforts will lead to a 21st century regulatory agency isn't clear yet. That's a judgment that historians are better suited to render than those chronicling the rough draft of history. What is indisputable is that today, there's a new FCC.gov for the world to explore.


March 31 2011

White House releases IT Dashboard as open source code

The White House has released the software code for its IT Dashboard and TechStat toolkit. The initiative was coordinated through Civic Commons, a code-sharing project incubated within Code for America that helps governments share technology for the public good, with support from OpenPlans. Civic Commons staff worked with REI Systems, the contractor that originally built the IT Dashboard for the United States Office of Management and Budget (OMB), to prepare the code for release under an open source license. That work included a security audit, documentation, and a licensing review of the software's components.

IT Dashboard

"The creation of the IT Dashboard was a significant step forward in making government programs more transparent and accountable," said Tim O'Reilly. "Its open source release is a huge step forward — and a model for other government programs — in showing how to reduce the costs and increase the effectiveness of government by sharing tools developed by one agency with others who also need them."

A live demonstration of the open-sourced IT Dashboard code is now online. The code is released under the GNU General Public License (GPL), and a bug tracker and other resources are also online.

Karl Fogel, who has been working on open sourcing the federal dashboard for months at Civic Commons, shares more context and technical details about how the IT Dashboard was open sourced in this post.

"We launched the IT Dashboard and the TechStat Accountability Sessions to improve IT transparency and accountability across the Federal Government," wrote federal CIO Vivek Kundra at the White House blog. "The Dashboard has helped us shine a light on IT projects, providing performance data to fuel TechStat reviews, which have led to over $3 billion in cost reductions."

For those unfamiliar with the nation's chief information officer, Kundra is the man who proposed and is now entrusted with implementing sweeping federal IT reforms. He's been applying the IT Dashboard to track IT spending from within the White House Office of Management and Budget, where he serves. During Sunshine Week, Kundra went to the whiteboard to describe good government at the White House. Video of his presentation is embedded below:

With the release, an application that was developed on behalf of government agencies can now be implemented and further customized by other potential government users and developers at the city, state or international level. CIOs from across the United States and around the world have expressed interest in implementing the IT Dashboard in their organizations, including Maarten Hillenaar of the Netherlands, Kyle Schafer in West Virginia and Jason DeHaan of the City of Chicago.

"We don't have to build it, we don't have to buy it, we don't have to procure it," said Greg Elin, chief data officer for the Federal Communications Commission. "What's not to like?" The Office of the United States CIO has also launched CIO tools, which aggregates all information about the IT Dashboard and TechStat Toolkit.

"What makes it attractive to them is that the Dashboard sets a baseline level of accountability that many senior managers feel will help them detect problems early, yet does so without imposing too great a burden on the people closest to the projects, that is, those responsible for the inputs from which the overviews are generated," wrote Fogel. "Establishing such a baseline is a cultural act as much as it is a technological or management one. Once regular measurement is being done, it becomes much more difficult to slip backwards into complacency. There may be genuine disagreement about how to measure costs and return on investment, but that is a productive discussion to have. The Dashboard thus acts as a quality ratchet, not merely in IT accountability, but in dialogue about how IT investments are measured at all."

There is an important cautionary note for government entities that adopt the IT Dashboard code: performance improvements gained through increased transparency must be grounded in accurate data. According to a report released this March by the Government Accountability Office (GAO), while OMB has made improvements to its Dashboard, "further work is needed by agencies and OMB to ensure data accuracy."

... inaccuracies can be attributed to weaknesses in how agencies report data to the Dashboard, such as providing erroneous data submissions, as well as limitations in how OMB calculates the ratings. Until the selected agencies and OMB resolve these issues, ratings will continue to often be inaccurate and may not reflect current program performance. GAO is recommending that selected agencies take steps to improve the accuracy and reliability of Dashboard information and OMB improve how it rates investments relative to current performance and schedule variance.

A new transparency ecosystem

While this analysis from the GAO does not detract from the significance of the release of the IT Dashboard as open source code, it does serve as a reminder that data-driven decisions made with it will rely upon accurately reported data. That necessity, however, will not come as news to the many chief information officers working on opening government data repositories around the country and globe.

The growth of an international open government data movement is one of the notable developments toward greater transparency in the 21st century. Now there's reason to believe this code release could similarly catalyze adoption of the IT Dashboard as a platform. "Look at how many states and countries have launched data portals modeled after Data.gov," said Elin. "Authorities — and enterprises — everywhere will similarly adopt the IT Dashboard, too."

Elin anticipates that more will come of this release than adoption of the platform alone. "Come back a year from now and you'll see a nascent ecosystem growing around the IT Dashboard, with vendors offering support, add-ins and extensions," he said. "Data.gov, the Community Health Data Initiative, the National Broadband Map, the IT Dashboard: these are the kind of assets that will just keep giving and giving."

Whether an entire new ecosystem of code based upon the IT Dashboard platform blossoms or not, it has set an important precedent. "The software is less interesting to me than how they released the software in the first place," said Gunnar Hellekson, chief technology strategist for Red Hat, US Public Sector. "The government has been releasing source code for years, but there's no common policy or understanding of how it should be done. Today's announcement is important because it creates a prominent, very public footpath for other agencies. This wasn't a set of patches, it was a whole application. Other agencies can now use their process, this footpath, to release their own projects."

The most important element of making the IT Dashboard open source may be the model for the code release itself. As open source plays a part in open government at the State Department and other federal agencies, that kind of leadership from the federal CIO is important. Whether it's an agency moving to Drupal or releasing a dashboard, open source matters more than ever. In a time when every level of government is facing painful budget decisions, new tools that provide more transparency are more than timely. They're necessary.

"US taxpayers want accountability, transparency, and cost savings in IT spending at all levels of government," said Jennifer Pahlka, founder of Code for America, the organization spearheading the sharing of the IT Dashboard. "Since they've already paid for an IT Dashboard with their federal taxes, their cities and states shouldn't have to buy it again. We saw a real opportunity here to help all governments work better."


January 18 2011

OpenGovernment.org connects state government to citizens

OpenGovernment.org, a free, open source online portal designed to make state government more open to citizens, launched this morning. The site makes it easier for citizens to learn about pending legislation and their legislators by combining open government data, information about state legislators, multiple databases of voting information, social mentions and news coverage into a lightweight online user interface. If that sounds a lot like what OpenCongress does for the federal government, it should: it's the same model, adapted to the state level. Radar readers will remember that the project was one of our civic innovation organizations to watch in 2011.

OpenGovernment.org is a joint project of the Participatory Politics Foundation and the Sunlight Foundation. The beta version has launched with information for legislatures in California, Louisiana, Maryland, Texas, and Wisconsin. The nonprofit is actively seeking funding to expand to all 50 U.S. states and major cities.

"We're providing a concentrated activity stream that offers a more calibrated way of staying in touch with state government," said David Moore, executive director of the Participatory Politics Foundation. "We believe in the power of peer-to-peer communications, which means connecting with people online and empowering them to share information with one another."

The idea, said Moore, is simple in conception but difficult in execution: create a free, open source platform where "it's as easy to follow your state senator as it is to follow your friends on Facebook."

To get to today's launch, the team rewrote the OpenCongress code base, including an improved Ruby wrapper for open government APIs. The code for the wrapper is available through GitHub. Official legislative information is integrated with Follow the Money data, ratings, news and blog coverage.

"We see this as a tool to fight systemic corruption in government," said Moore. "We think this is a good interface for finding representatives, from federal all the way down. It's an article of faith that even people who use OpenCongress don't know what's hot in state legislatures. There hasn't been a lot of scrutiny or information design devoted to this issue, and we think that's one of our core competencies."

OpenGovernment.org updates the model of aggregated official government data, profiles of officials, and bill pages with improved integration of the "social wisdom of the crowds," the Disqus comment system, and a "what's hot" tool. Moore thinks that last detail will help reporters, bloggers and citizens get a better handle on what's going on.

OpenGovernment.org has ambitious plans for later this year, including mobile versions and apps, and giving users the ability to track issues and receive updates on topics of interest.


December 31 2010

What lies ahead: Gov 2.0

Tim O'Reilly recently offered his thoughts and predictions for a variety of topics we cover regularly on Radar. I'll be posting highlights from our conversation throughout the week. -- Mac

Is open government moving from theory to practice?

Tim O'Reilly: The initial rush of interest in open government and transparency is wearing off and people are getting to work. Gov 2.0 startup founders are figuring out business models -- such as advertising and providing back-end services for cities -- and the first crop of startups are being funded. Early entrants, like SeeClickFix and CrimeReports, are growing. I think we'll see a number of new startups in this space. [Disclosure: O'Reilly AlphaTech Ventures is an investor in SeeClickFix.]

Open government's transition is also leading to practical applications. The Blue Button initiative from Veterans Affairs, which allows veterans to easily download their medical records, is an idea that's bound to spread. Blue Button is a direct outcome of the Open Government Initiative, but that connection probably won't be recognized. As is so often the case, the things that really make a difference get put into a different category than those that fail.

Along those lines, people might say that open government failed because many of the items on the punch list didn't happen the way they were originally envisioned. When we look back, we'll realize that open government is not just about transparency and participation in government decision making, but the many ways that open data can be put to practical use.

There are profound open government projects taking shape. For example, the Department of Health and Human Services (HHS) could transform our healthcare system through open data and medical records. HHS Connect and The Direct Project are both about creating standards for interoperability between medical records. We'll eventually see and benefit from larger efforts like these.

Another open data project that I'm fond of that started very early in the open government process is GTFS, the General Transit Feed Specification. That's the data standard that lets transit districts feed their bus and train arrival times to applications like Google Transit, or any of the many smartphone apps that help you plan your trip on public transit. This standard started as a collaboration between Google and the city of Portland, but is now available from many cities. It's a great example of how governments can think like platform providers. They have to equip their buses and trains with GPS, and report out the data. They could report it just to their own bus stops and train stations, or they could make it available to third parties to deliver in a hundred ways. Which is better for citizens? It's pretty obvious.
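Part of why GTFS spread so easily is its simplicity: a feed is just a zip of CSV files with well-known names such as stops.txt, trips.txt and stop_times.txt. The sketch below reads arrivals for one stop from a made-up two-row excerpt; the trip and stop IDs are invented, but the column layout matches the real stop_times.txt specification.

```python
import csv
import io

# A hypothetical two-row excerpt in GTFS's stop_times.txt column layout.
stop_times = io.StringIO(
    "trip_id,arrival_time,departure_time,stop_id,stop_sequence\n"
    "4-1,08:05:00,08:05:00,PIONEER_SQ,3\n"
    "4-2,08:20:00,08:20:00,PIONEER_SQ,3\n"
)

# Collect scheduled arrivals at one stop, in feed order.
arrivals = [row["arrival_time"]
            for row in csv.DictReader(stop_times)
            if row["stop_id"] == "PIONEER_SQ"]

print(arrivals)  # ['08:05:00', '08:20:00']
```

Because any transit agency can publish these same flat files, a single trip-planning app can consume feeds from a hundred cities with no per-city integration work, which is exactly the platform effect described above.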

And of course, this is government data following in the footsteps of great open data projects of the past, such as the satellite weather data released by NOAA to power the world's weather forecasters, or even the GPS signals that were originally designed only for military use but then released for civilian use.

Much of what you're describing sounds like the Web 1.0-to-2.0 trajectory. Do you see similarities?

Tim O'Reilly: At the end of the Web 1.0 era, some people claimed the web had failed because banner advertising, pop-overs, pop-unders and all the increasingly intrusive forms of advertising didn't work. Then Google came along with a better idea. I think something similar will happen with Gov 2.0. Many of the things people label as "Gov 2.0" are really the early signals and efforts of a "Gov 1.0" period. Important shifts will eventually occur, and they won't have anything to do with government agencies putting up wikis or using Twitter. We will see many unexpected outcomes over time.

A collection of posts that look ahead to 2011 can be found here.


December 29 2010

2010 Gov 2.0 Year in Review

I recently talked with Federal News Radio anchor Chris Dorobek about Gov 2.0 in 2010 and beyond. While our conversation ranged over a wide variety of topics, it was clear afterwards that I'd missed many of the year's important stories in Gov 2.0 during the relatively short segment. I went back over hundreds of posts on Gov 2.0 at Radar and GovFresh, thousands of tweets and other year-end lists, including Govloop's year in review, Gartner's Top 10 for Government 2.0 in 2010, Bill Allison's end of year review, Andrew P. Wilson's memorables from 2010, Ellen Miller's year in Sunlight 2010, John Wonderlich's 2010 in policy and GovTwit's top Gov 2.0 stories. Following are the themes, moments and achievements that made an impact.

Gov 2.0 spreads worldwide

The year was marked by the international spread of Gov 2.0 initiatives. Wherever connections are available in the United States, citizens are turning to the Internet for government data, policy and services. Applying that trend internationally isn't unreasonable, as more of humanity comes online. It won't be easy. It's Gov 2.0 vs the beast of bureaucracy everywhere, as professor Andrew McAfee memorably put it.

In Australia, for instance, government 2.0 Down Under still has a ways to go if it isn't going to be a "one shot" contest or success story. What's next for Government 2.0 in Australia, as Stephen Collins reflected, will rely on more public figures driving change, as well as citizens demanding better results.

In the United Kingdom, the new-ish government will continue to be a test bed, given dire budget projections. A refreshed Number 10 Downing Street online presence won't address cost issues, either. A spending challenge focused on crowdsourcing cuts didn't get very far. Such initiatives are likely the tip of the iceberg, as tough budget decisions loom in 2011. While the influence of Tim Berners-Lee on data.gov.uk is unmistakable, Gov 2.0 in the UK involves a host of small companies, agencies, elected officials and, of course, the citizens themselves.

Gated governments face disruption

Everywhere, governments remain concerned about the risks and rewards of Web 2.0, but with citizens increasingly going online, those same governments must respond to digital cries for help. In countries with autocratic governments, the disruption and challenge to power represented by free information flows mean that transparency and accountability are a long way off. In that context, an e-government conference in Russia has to be balanced with the government transparency issues revealed by the deployment of Ushahidi for crowdsourcing wildfires.

Citizens empowered with new tools for transparency became a more powerful force in 2010, as the growing list of examples of open government platforms in India (a democratic country) suggests. As citizens gain more means of reporting issues with services, corruption or elections, repressive governments will face more challenges in filtering, censoring, blocking or shutting down services that host contradictory or false reports.

In that context, technology companies also have meaningful choices to make: how they cooperate (or don't) with law enforcement and government agencies that want access to their data, how they "firewall" private information from multiple services within companies, how they monitor internal controls on employee access, and whether they provide technologies that may be used to monitor, track or censor citizens.

Open government remains in beta

While the progress of the White House Open Government Directive at federal agencies is important, as is action in Congress, there's a long road yet ahead in the United States and abroad. As John Wonderlich pointed out in his own look at 2010,

Obama’s Open Government Directive is at a crossroads (like other similar policies), and the changing majority in the House brings new opportunities for change (a 72 Hour Rule!), just as the outgoing majority brought their own new opportunities for transparency.

We're still very much in open government's beta period. Some efforts, like the State Department's Text Haiti program for the Red Cross or the "do-it-ourselves" platforms from groups like CrisisCommons, made a difference. Other efforts, partially represented by many open government plans in the throes of implementation, won't mature for months to come.

What is clear is that open government is a mindset, not simply a fresh set of technological tools. Gov 2.0 is a means, not an end. It can and will mean different things to different constituencies. For instance, the State Department released a social media policy, engaged the world through social media, launched a "Civil Society 2.0" initiative and released a quadrennial review in December. Its efforts to apply social software to digital diplomacy were laudable. By the end of the year, however, Secretary Clinton's landmark speech and policy on Internet freedom came under sharp global criticism in the wake of "Cablegate." The questions of who, where and why the U.S. supports Internet freedom became even more complex.

WikiLeaks is a reminder that the disruption new technology platforms pose will often emerge in unexpected ways.

Open data went global

The first International Open Government Data Conference highlighted how far this trend has gone in a short time. "Since the United Kingdom and United States movement started, lots of other countries have followed," said Tim Berners-Lee, the inventor of the World Wide Web. Canada, New Zealand, Australia, France, Greece, and Finland are all working on open data initiatives. Within the United States, 16 states and 9 cities have created open data platforms. More data platforms at all levels will come online in 2011.

"The more transparency there is, the more likely there is to be external investment," said Berners-Lee, highlighting the potential for open government data to make countries more attractive to the global electronic herd. Berners-Lee anticipates a world where open government data standards will come to cities, states and countries like HTML did in the 1990s. "The web spread quickly because it was distributed," said Berners-Lee. "The fact that people could put up web servers themselves without asking meant it spread more quickly without a centralized mandate." Over in England, the new data.gov.uk uses the linked open data standards Berners-Lee recommends.

After nearly a year in the open data trenches, Nat Torkington offered advice here at Radar for those starting or involved in open data projects:

First, figure out what you want the world to look like and why. Second, build your project around users.

The Sunlight Foundation, one of the foremost practitioners of data journalism for open government, created a new ‘data commons’ in 2010 and launched two new sites that combine to make campaign finance, lobbying, earmark and government contract data more accessible. Sunlight Labs also made progress in opening up state legislatures.

In December, a report on the attitudes, quality and use of open government data showed strong support for the release of open data among citizens and government employees. While the open data study showed progress, there's still a long road ahead for open government data. The promise of data journalism is notable, as journalists now have huge volumes of accessible government data, but cultural roadblocks and "dirty" data still need to be addressed.

There are (more) apps for that

Around the world, apps contests are unlocking innovation. One of the biggest contests, Apps for Development, is using new World Bank open data.

As governments create their own applications, however, they'll need to avoid "shiny app syndrome" to avoid empowering the empowered.

Gov 2.0 grew locally

Gov 2.0 is going local, as techies take on City Hall. CityCamp, Code for America, Civic Commons and protocols like Open311 all grew this year. Real-time transit data is opening up exciting prospects for entrepreneurs. Local government as a data supplier looks like it may have legs as well.
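Protocols like Open311 matter because they give every city's civic data the same shape, so one app can work against many cities. The sketch below parses a service-request payload modeled on Open311's GeoReport v2 JSON; the field names follow the published spec, but the request itself is invented for illustration:

```python
import json

# A sample payload shaped like an Open311 GeoReport v2 service request.
# Field names follow the public Open311 spec; the values are invented.
SAMPLE = """
[
  {
    "service_request_id": "638344",
    "status": "open",
    "service_name": "Pothole in Street",
    "requested_datetime": "2010-11-20T14:37:38-08:00",
    "address": "100 Main St",
    "lat": 37.7621,
    "long": -122.4348
  }
]
"""

def open_requests(payload: str):
    """Return (id, service_name) pairs for requests still marked open."""
    return [(r["service_request_id"], r["service_name"])
            for r in json.loads(payload)
            if r["status"] == "open"]

print(open_requests(SAMPLE))  # → [('638344', 'Pothole in Street')]
```

A real client would fetch this payload from a city's `/requests.json` endpoint, but the parsing logic is the same.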

Even mainstream media woke up to the local trend. Time Magazine reported on mobile civic applications that let citizens give feedback to cities. At year's end, the use of Twitter by Newark mayor Cory Booker to hack Snowmageddon after a major snowstorm buried the East Coast brought new attention to the opportunities inherent in a new digital nexus between citizens and public servants online.

Look for more citizens as sensors in neighborhoods soon.

Laws, rules and regulations

This was also the year that mainstream media couldn't stop reporting on social media in politics. Sarah Palin's tweets were read on cable news and gaffes from a Congressman or updates from the campaign trail went straight to the headlines. There have been thousands of posts and cable news reports on the topic at this point. A study on Twitter use in Congress asserted that Democrats use Twitter for transparency, while Republicans use it for outreach. For a useful perspective outside of the United States, First Monday published a terrific Gov 2.0 case study in government and e-participation at Brazil's House and presidential websites.

What such reports generally missed is that Gov 2.0 progress within agencies is bolstered by orders, laws or regulations that support these activities. This spring, the Sunlight Foundation and other transparency advocates worked with Rep. Steve Israel and Sen. Jon Tester to introduce the Public Online Information Act in both chambers. As John Wonderlich explained, the act redefines “public information” by requiring that any government information currently required to be available to the public be posted online, and it sets forth better technology coordination between the branches of government to achieve that overarching goal.

In June, OMB updated its rules for cookies and privacy on U.S. government websites, enabling government agencies to use social media, video sharing and discussion platforms. In July, the House of Representatives had its first hearing on Government 2.0, examining the risks and rewards of Web 2.0 in government. The White House also released a draft of "National Strategy for Trusted Identities in Cyberspace," including a means for people to comment upon it online. Yes, the government has an online identity plan for you.

The passage and subsequent signing of the Telework Enhancement Act by President Obama was a win for government workers, providing new flexibility in getting the job done. The need for that ability was driven home by the historic snowfall in Washington, D.C. last winter, when "Snowmageddon" made working from home more than a "nice to have" for many parts of the federal government.

Election 2010 was a refresh for Gov 2.0, offering up numerous lessons for social media and politics from the campaign. What emerged were new prospects for the GOP to embrace innovation and transparency. That subsequently manifested with a victory for transparency in House rules.

The enactment of a plain writing law is also a boon for open government, although getting bureaucracies to move away from acronyms won't happen overnight.

In December, the passage of the COMPETES Act in Congress means that every federal agency can create prizes and competitions. Watch to see if citizens and other private entities take them up on those opportunities.

Online privacy went mainstream

While some media outlets declared that privacy is dead, government officials and institutions weren't convinced. That's why online privacy debates heated up in Washington, with Facebook privacy and Google privacy frequently in the news.

The shift to cloud computing puts Electronic Communications Privacy Act reform in the spotlight. Simply put, digital due process matters. As the year came to an end, the FTC released its online privacy report, which included a recommendation for a Do Not Track mechanism, along with increased transparency and baked-in controls.
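The "Do Not Track mechanism" the FTC recommended was, at the time, only a proposal; one form the idea has since taken in browsers is a simple `DNT: 1` request header. As a hedged sketch (assuming that header-based signal), here is how a server might check for it:

```python
def honors_do_not_track(headers: dict) -> bool:
    """Return True when a request carries the Do Not Track signal.

    Browsers whose users opt out of tracking send the header "DNT: 1".
    HTTP header names are case-insensitive, so normalize before checking.
    """
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("dnt") == "1"

print(honors_do_not_track({"DNT": "1"}))  # → True
print(honors_do_not_track({"DNT": "0"}))  # → False
print(honors_do_not_track({}))            # → False
```

The hard part, as the policy debate made clear, was never reading the header; it was getting trackers to change their behavior when they see it.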

Government moves into the cloud

When NASA Nebula's open source technology was integrated into Rackspace and others to form OpenStack, the administration's open government initiative had a bona fide success on its hands. Outside of NASA, the White House IT reforms include a "cloud first" strategy for new investments. That move is part of a broad strategy to close the technology gap, which has been a top priority of the administration's IT executives. FedRAMP, a federal government-wide approach to securing cloud computing, may help answer some of the security and privacy questions that CIOs must ask.

While some elements of government business will never be in the public cloud, look for the cloud transition to be an even bigger story in 2011 as Microsoft, Google, IBM, Amazon and others look for government dollars in their clouds. The White House moved to Amazon's cloud in May, and more federal sites followed this fall; other vendors already count many agencies among their cloud customers. Google and Microsoft have been signing up new city and state customers all year, along with chasing federal dollars. Look for more of the same in 2011, along with more tough questions about auditability, security, uptime and privacy.

Open source moves deeper into government

Another major federal website is moving to Drupal next spring, joining hundreds of other government websites on the open source content management platform. A second site, due for a long-overdue overhaul next year, will also be open source.

Healthcare communication got an upgrade as the Direct Project created the basis for a "Health Internet." The NHIN Direct project's approach to creating open health records was an important example of government as a platform. For more context, Apache co-founder Brian Behlendorf talked with Radar about the CONNECT project in a podcast, "From Apache to Health and Human Services."

A "milestone in making government more open" went live this summer when the new Federal Register beta launched. As deputy White House CTO Beth Noveck observed, "Federal Register 2.0" is "collaborative government at its best." It's also all open source, so the site's code is shared in Civic Commons, a project launched at the Gov 2.0 Summit that will help city governments reduce costs and inefficiencies.

Archiving went social

When the Library of Congress teamed up with Twitter to archive tweets, it made headlines everywhere. Less noticed were the social upgrades made by the Law Library of the United States, or the work that the National Archives is doing to guide other government agencies. When NARA issued guidance on social media, it was a watershed for many people looking for advice.

Law.Gov moved forward

As Carl Malamud has explained:

Law.Gov is an idea, an idea that the primary legal materials of the United States should be readily available to all, and that governmental institutions should make these materials available in bulk as distributed, authenticated, well-formatted data.

This year, Law.Gov moved much closer to reality, as the signing and release of the Law.Gov core principles was followed by major funding from Google.

At year's end, Malamud announced that Public.Resource.Org would begin providing legal decisions freely online in 2011 in a weekly release of the Report of Current Opinions (RECOP). According to Malamud, this report "will initially consist of HTML of all slip and final opinions of the appellate and supreme courts of the 50 states and the federal government."

Citizen engagement platforms grew

With a wave of new citizen engagement platforms and apps, citizens could contribute much more than a vote or a donation in 2010: they could donate their time and skills.

The growth of citizen engagement platforms, however, extends far beyond Washington. Civic developers are helping government by standardizing application programming interfaces and empowering others by coding the middleware for open government. Working with developers can be a crucial complement to publishing open data online: the data matters, but only if citizens are engaged with it.

As the new year beckons, there are more ways for the citizens of the United States to provide feedback to their federal government than perhaps ever before in its history. In 2011, the open question is whether "We the people" will use these new participatory platforms to help government work better. These kinds of platforms aren't U.S.-centric, either. Ushahidi, for example, started in Africa and has been deployed worldwide. The crowd matters more now in every sense: crowdfunding, crowdsourcing, crowdmapping, collective intelligence, group translation, and human sensor networks.

What's next?

Have bets for 2011? Let us know in the comments.

December 09 2010

White House proposes sweeping federal IT reforms

For years, the differences between the use of information technology in the public and private sector have been glaring. Closing the technology gap has been one of the most important priorities of the administration's IT executives. Today, the White House released a report (below) that acknowledges those issues and proposes specific reforms to begin to address the IT gap that Peter Orszag, the former head of the Office of Management and Budget (OMB) in the White House, highlighted this summer.

This morning in Washington, the White House will host a forum on information technology management reform led by federal chief performance officer and OMB deputy director for management Jeffrey Zients and U.S. chief information officer Vivek Kundra. The two will lay out the Obama administration's strategy to reboot how the federal government purchases and uses information technology. The event will be streamed live online; it's also embedded below:

Key reforms proposed in the plan include:

  • Create career tracks for IT program managers
  • Move to pilot projects with more budget flexibility and greater transparency in the implementation process
  • Develop IT acquisition specialists to closely align acquisition with technology cycles in the broader economy
  • Following the model of Social Security, enact the requirement for complete, integrated teams to be in place before projects are approved
  • Launch "myth-busting" campaigns about acquisition of federal IT
  • Use "TechStat" sessions and other accountability measures to end or accelerate troubled projects
  • Reduce the number of federal data centers by at least 40 percent by 2015

November 23 2010

Coding the middleware for government data

Cities, states and agencies are publishing more government data online, but that's just the tip of the iceberg. Much government data is still in paper form, locked away in file cabinets, or in closed formats on obscure servers. For instance, the data-driven story of BrightScope, which uses government data to clarify 401(k) plans, started with boxes upon boxes of printouts. The Department of Labor is just now starting to put that data online. That's why reporting on the progress of open government data initiatives is a key pillar of Gov 2.0. For those who have been working toward more transparent government, that issue is central to their work.

"Embracing Tim O'Reilly's concept of 'Government as a Platform' is easier said than done," wrote Max Ogden in his pitch for the first Ignite Gov at the Government Open Source Conference (GOSCON) in Portland. During a five-minute presentation, Ogden offered up a refreshing personal perspective on what it takes for civic hackers to put open data to good use. Here's his talk:

Under the Open Government Directive, a PDF qualifies as an open format. BrightScope uses government data, but it's not "open" in the sense that technologists use the term, nor did BrightScope's business result from the open government initiative. Put in the context of Tim Berners-Lee's definition of linked open data or the widely circulated open government data principles, PDFs on CD might not merit even one star, although BrightScope has been able to move forward with its business in the meantime.
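The "star" language refers to Berners-Lee's five-star deployment scheme for open data, which is easy to express as a rating table. A sketch, with a simplified (not exhaustive) mapping of formats to stars:

```python
# Tim Berners-Lee's five-star deployment scheme for open data, as a lookup.
# The star levels follow his published scheme; the format-to-star mapping
# here is a simplified illustration, not an exhaustive rule.
FIVE_STAR = {
    "pdf": 1,  # openly licensed, but unstructured
    "xls": 2,  # structured, proprietary format
    "csv": 3,  # structured, non-proprietary
    "rdf": 4,  # uses URIs so items can be referenced
    "lod": 5,  # linked to other data for context
}

def stars(fmt: str) -> int:
    """Rate a publication format; unrecognized formats get zero stars."""
    return FIVE_STAR.get(fmt.lower(), 0)

print(stars("PDF"), stars("csv"), stars("shapefile-on-cd"))  # → 1 3 0
```

By this yardstick, a scanned PDF on a mailed CD sits at the bottom of the scale, which is exactly the point the BrightScope story illustrates.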

IT officials from the Office of Management and Budget and the White House Office of Science and Technology Policy, along with open government technologists like Clay Johnson and Noel Hidalgo, have expressed a preference for data published in structured formats. Federal CIO Vivek Kundra said at the International Open Government Data Conference that releasing open data can be seen through three lenses: accountability, citizen utility and creating economic value.

"Putting out data is not enough," said Beth Noveck, White House deputy CTO for open government, at the Open Cities Conference earlier this month. "It's what we do with that data to make it useful." It's civic hackers like Ogden who have done just that, with a little help from the government. Ogden wants local government to act as a data supplier, providing the means for civic hackers to make things that help citizens to make better decisions. Ogden was instrumental in connecting data and developers through Portland's Civic Apps contest.

More examples of open government and civic innovation will depend on similar public-private partnerships of open data, developers and entrepreneurs. Portland, San Francisco and Boston have shown how open data can be used to spur economic activity. Real-time transit data in Boston has created a whole new ecosystem of apps. The Apps for California contest that featured mashups of government data is behind a new startup, Zonability. And the healthcare apps coming out of a community health data initiative at the Department of Health and Human Services are continuing to evolve.

If governments wish to give citizens a better understanding of how government works and what it's doing with taxpayer funds, that means using the most efficient, cost-effective means to provide that transparency, and releasing the right data sets for real accountability. Consider how the British government released its spending data: in spreadsheets. That choice enabled the Guardian to download the data and help citizens analyze it.

The story of open data and Max Ogden is evolving too: he's one of the inaugural fellows of Code for America. As Code for America founder Jen Pahlka notes:

They will be the first participants in our experiment to help city governments better leverage the power of the web. Starting in January, it will be their challenge to not only build innovative apps for each of our cities, but also become the leaders of the ongoing movement to make government more open and efficient.

Based on Ogden's track record, there will be more open government middleware to come.


November 18 2010

The story of BrightScope: Data drives the innovation economy

What's the best story in Gov 2.0, when it comes to entrepreneurs using government data? Given the spotlight that deputy White House CTO Beth Noveck and US CTO Aneesh Chopra shone this week on San Diego-based startup BrightScope at the first International Open Government Data Conference and Politico's "Next in Tech" events, it looks like the Obama administration thinks it just might be the Alfred brothers.

"Form 5500 has sat in Washington," said Chopra at Politico's Next in Tech event. Mike and Ryan Alfred, former financial advisors in California, looked for a way to liberate that data. To what end? "The American people spend at least $4 billion in excess fees on 401(k) plans," said Chopra.

Given what BrightScope has been working on since 2008, there's a strong case to make that their government data story is compelling. As Vivek Wadhwa noted in his TechCrunch piece on the goldmine of opportunities in Gov 2.0, BrightScope has made a profitable business of using government data about 401(k) plans. In the process, they've helped the American people understand the fees associated with these plans, an area of cost that's frankly unknown to the vast majority of people saving for retirement.

Below, Mike Alfred tells BrightScope's story at this year's Gov 2.0 Summit:

Here's the key point about the founders' fascinating story of extracting public data from the Department of Labor (DOL): it came from more than 50 Freedom of Information Act requests, made at significant cost over many months. They extracted boxes and boxes of paper records. Finally, after a lobbying campaign that took months, they were able to get the data in electronic form. While the BrightScope story is a terrific case study in putting government data to work, it's misleading to count them as an open government success quite yet.

The data that they use is not, according to BrightScope, hosted on "We are aware of @USDataGov but have not sourced our data there. DOL data is starting to go online but it's a process," tweeted the company.

That may change. This morning, evangelist Jeanne Holm asked BrightScope a simple question: "Can I help DOL getting data on"

That's another story that's worth watching. "The single best thing we could do in open government is to get the American people engaged in the question of what high value data is," said Chopra today. To put it another way, who will be the next Alfred brothers? Will it be someone who uses health data from the Department of Health and Human Services? One of the many location-based startups that provision their mobile apps with the new trove of geospatial data coming online? Or will it be an entrepreneur who knows that the government data they need to be the "Morningstar of X industry" sits in a government spreadsheet and works to liberate it? The open question is whether United States open government data policy supports that effort. In service of economic growth and job creation, it's probably the right question to be asking.

"We have to talk about varied uses" for open government data, said Noveck at the data conference. "Not everyone will be interested in fraud or abuse. Others will be interested in citizen empowerment, or job creation."


September 23 2010

How do we get government to share data?

On Tuesday we wrapped up the Manor Makeover in Manor, Texas, population 6,500. In some ways, this is ground zero for Gov 2.0 at the local level. The City of Manor has done some very innovative things on a shoestring, gaining attention ranging from the blogosphere to the national press and all the way up to the White House. In fact, keynote speaker Beth Noveck, Deputy CTO in the Office of Science and Technology Policy at the White House, wrote up a blog post just last night. The makeover is pretty impressive — they even followed my blog post from earlier this year about how to embrace Gov 2.0. (Not that they’ve seen it — it’s probably just obvious if you think hard enough about prioritizing limited resources.)

The champions of the Manor event (which “made over” a different city, De Leon, Texas) are Dustin Haisler (Manor CIO) and Luke Fretwell (GovFresh). They are both well ahead of the open government curve, and I think they’re absolutely on the mark: with a little elbow grease, the tools exist to make government more collaborative, transparent and participatory on a limited budget. With the right political backing, connections, knowledge and personal drive, a lot can be accomplished toward making government act more like a platform, as Tim O’Reilly and others have envisioned.

But how do we get this “perfect storm” of talent and resources to see it take off? There are more than 18,000 cities and counties in the United States. Are any others doing it this well? Actually, yes. But I can pretty much count them on one hand. Watching (and giving) some of the Manor presentations, I participated with the group in the promise of government as a platform. Bonded by our collective vision, we all bring something to the table: code, software, political/administrative expertise, or perhaps just a dose of unbridled enthusiasm. However, something is "rotten in the state of Denmark." Just yesterday, Luke posted a blog entry announcing the "end of a GovFresh era" — and this, one day after its most successful event.

As I listened, my mind kept wandering back to one common showstopper: without laws to compel the sharing of local data, it’s never going to happen at scale the way people are hoping. Even if we can get past the biggest issues — inertia and resistance to change — municipalities still don’t have the budgets and technical know-how to make it happen. Oh I know, I know — there’s always the argument that “the ROI on giving out data is a no brainer”…sorry, but the evidence isn’t there yet. If it were so compelling, more governments would be doing it.

How, then, does the public get access to data, and ideally, to raw data streams? The typical hurdles to sharing information — technology, budgets, politics, and the absence of laws requiring its disclosure — are particularly acute in law enforcement, so it makes a good case study. Tight budgets are forcing many agencies to cut personnel — and budgets were already tight before the recession. Not to mention that crime data is at best controversial and at worst vulnerable to statistical manipulation by whomever can get their hands on it. So what can we do? As far as I can tell, there are three basic tactics:

  1. Using force, by changing the laws or creating new regulations requiring agencies to disclose data and provide it as a machine-readable feed. (Note: some sunshine laws give access to limited data, but not as an ongoing feed or through an API.) This will take forever, so it’s not a great option.
  2. Using intimidation, by enlisting the news media to pressure the agency, or by hiring lawyers to threaten them with lawsuits, which have no basis in law or fact (caveat: I am not a lawyer, but I have hired some to help me be more informed). Although some agencies might capitulate, most won’t. And then they’ll be put on the defensive.
  3. By creating value for agencies to entice them to share the data. Sometimes it’s as simple as asking; other times, they need to see how they will benefit or they won’t share anything. Still other times, it’s a non-starter.

Let’s talk about crime data

So what, exactly, is public data? Is crime data actually public data, and do agencies have to provide it? If so, why don’t they provide it as a raw data feed? The answer is this: while crime data concerns the general public, it is not exactly public data per se, and individual agencies have their own rules. Most won’t — or at least don’t — release it. And I don’t know of any agency, anywhere in the country of any size, that feels it is legally obligated to disclose any crime data to anyone as a machine-readable feed. Period, end of story. Please, prove me wrong.

So, given that #1 and #2 above are not going to be effective, and that there are no requirements to disclose data in an open format, let’s assume we have to use option #3 above, and create value to entice agencies to participate. We still have to deal with the barriers: budgets, technology and politics (fear, apprehension and skepticism about what will happen if they share it). While the barriers may not be completely justified, they are real.

The CrimeReports approach

At CrimeReports, there’s no question about it: we work for law enforcement. We create solutions for them. Because the laws don’t compel them to share information at all, they hold all the cards, so we draw them into voluntarily sharing crime data by showing them how it creates value for their agency. We solve their technology hurdles, and we make it affordable. Let’s face it, they have other things to worry about, like fighting crime.

One challenge on “making it affordable” is that the sweet spot price is just $100/month on average, and some still can’t afford it. No surprise that at that price, we lose money, and a lot of it, since it’s not easy to extract the data, standardize it and perform all of the work to make it available and easy to use. If it were easy, believe me, it would already be done.

The bottom line is that I think it's good to have incident-level data available to the public in near real time. It's good for communities. It's good for the agency that provides it. And it's good for individuals. But we need a business model that works. If we want that data out in any form, we need to be able to recover the cost of performing that service.

To keep us in check, the invisible hand will exercise its magic. If we price it too high, someone else can move in and replace us, which has already happened (we actually have other viable competitors in the space). If you don’t think that will happen, think again. Our contracts are cancelable with 30 days notice, and we don’t have the exclusive ability to source the data. If you, your company, the Department of Justice, or the local newspaper wants the raw data, nothing is preventing that from happening. (In fact, some of our customers publish crime data to more than one entity.) You just have to make it worthwhile to the agency, since the other two options listed up above are unrealistic.

In a perfect world, every government agency in the country would provide an open and free data catalog. The early adopters have done it. More will continue to do so. We’ve come up with a process that works: it gets data out into the hands of the public, and that’s a step in the right direction. So let's not let the perfect be the enemy of the good.

The discussion is far from over, and I think there's more we can do. I'd love to hear what you think and keep the conversation running.

September 06 2010

Bringing open government to courts

As court records increasingly become digitized, unexpected consequences will result from that evolution. It's critical to think through the authentication, cost and privacy issues before we get there.

Harlan Yu, a Princeton computer scientist, worked with a team to create online tools that enable free and open access to court records and highlight the need for more awareness. My interview with Yu this summer was a reminder that the state of open government is both further advanced and more muddled than the public realizes. As with so many issues, it's not necessarily about technology itself. Effective policy will be founded upon understanding the ways that people can and will interact with platforms. Although applying open government principles to public access for court documents is a little dry for the general public, the ramifications of digital records being published online mean the issue deserves more sunlight. A condensed version of our interview follows.

Your open government work has focused on improving public access to court records in the PACER system. PACER stands for "Public Access to Court Electronic Records," but the reality of public access is more complicated. What's the history of your involvement with this aspect of open government?

Back in February of last year, Steve Schultze, who was at the time at the Berkman Center, was giving a round of talks about access to court materials on PACER. He came to CITP in February to give a talk with one of his colleagues. I had never heard of PACER before, but I went to Steve's talk and learned about how the federal government provides these documents that form the basis of our common law. I was appalled that these public domain documents were essentially being sold to the public, to the detriment of our democracy.

What did you propose to Schultze to fix this situation?

We thought there was a way that you could automatically allow PACER users to share documents that were legitimately purchased from the PACER system. Because these are public domain documents -- and there were no copyrights assigned to these documents -- if one legitimate user pays for a document, they should be able to share it on their blog, send it to their friend, post it online, or do whatever they want with it. We decided to venture out and build a [Firefox] plug-in called RECAP that essentially automatically crowdsources the purchase of PACER documents.

Who else was involved in building RECAP?

We worked with the Internet Archive and with Carl Malamud at Public.Resource.Org. We built a system where users could download the RECAP plug-in and install it. While they used PACER, any time they purchased a docket or a PDF, whether it was a brief, an opinion or any motion, it was automatically uploaded into our central repository in the background.

The quid pro quo is that, as you're using the RECAP plug-in, if we already have a document that has been uploaded by another user, a notice gets shown to you in PACER to say, "Hey, we already have a copy. Instead of purchasing another copy for $0.08 or whatever it'll cost you, just get it from us for free."
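The exchange Yu describes (check the shared repository first, buy from PACER only on a miss, then contribute the purchase back) is essentially a crowdsourced cache. A toy model in Python; the class name, document IDs and flat per-document price are invented for illustration, and real PACER charges per page:

```python
class RecapStyleCache:
    """Toy model of RECAP's crowdsourced document repository.

    The real RECAP is a browser extension talking to a central server;
    this sketch keeps everything in memory to show the economics.
    """

    def __init__(self, price_per_doc=0.08):
        self.repo = {}              # doc_id -> document text
        self.price = price_per_doc  # simplified flat price
        self.spent = 0.0            # total paid across all users

    def fetch(self, doc_id, purchase):
        """Serve from the shared repository if possible; otherwise buy
        the document via `purchase` and contribute it back."""
        if doc_id in self.repo:
            return self.repo[doc_id], 0.0   # free: someone already paid
        doc = purchase(doc_id)              # simulated PACER purchase
        self.spent += self.price
        self.repo[doc_id] = doc             # upload for the next user
        return doc, self.price

cache = RecapStyleCache()
buy = lambda doc_id: f"contents of {doc_id}"
cache.fetch("opinion-123", buy)             # first user pays
doc, cost = cache.fetch("opinion-123", buy)
print(doc, cost)  # → contents of opinion-123 0.0
```

Each document is paid for at most once across the whole user base, which is why even a modest number of users can cover the "commonly accessed" head of the distribution.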

We now have about 2.2 million PACER documents in our system, which is actually a small fraction of the total number of documents in the PACER system. The PACER administrative office claims that there are about 500 million documents in PACER, with 5 million being added every month. So 2.2 million is actually a pretty small number of documents, by percentage.

We think that we have a lot of the most commonly accessed documents. For the court cases that have high visibility, those are the ones that people access over and over. So we don't have a lot of "long tail," but we have a lot of the ones that are most commonly used.

Are there privacy and security considerations here? Why does the concept of "practical security" matter to open government?

We'd like to make all of these documents freely available to the public. We've found a couple of different barriers to offering free and open public access. The biggest one is definitely privacy. When an attorney files a brief [in federal courts], they need to ensure that sensitive information is redacted. Whether it's a Social Security number, the name of a minor, bank account numbers, all of these things need to be redacted before the public filing, so when they put it on PACER, it can't be mined for this private information. In the past, the courts themselves haven't been very vigilant in making sure their own rules were properly applied. That's mainly because of "practical obscurity." These documents were behind this paywall, or you had to go to the courts to actually get a copy. The documents weren't just freely available on Google. The worry about privacy was not as significant, because even if there were a Social Security number, it wouldn't be widely distributed. People didn't care so much about the privacy implications.

So a condition of "privacy by obscurity" persisted?

Exactly. The information's out there publicly in public record, but it's practically obscure from public view. So now we have a lot of these PDF documents, but there's actually a number of these documents that have private information, like Social Security numbers, the names of minors or names of informants. Just going out and publishing these documents on Google isn't necessarily the best and most moral thing to do.

I think one of the consequences of RECAP, Carl's work and our work in trying to get these documents online is the realization that eventually all of these documents will be made public. The courts need to be a lot more serious about applying their own rules in their own courts to protect the privacy of citizens. The main problem is that in the past, even though these records weren't available publicly and made freely available, there were already entities in the courtrooms essentially mining this information. For example, in bankruptcy cases, there were already data aggregators looking through court records everyday, finding Social Security numbers, and adding this information into people's dossier but out of the view of the public. Bringing this privacy issue to the forefront, even if these documents aren't yet publicly available, will make a big impact on protecting privacy of citizens who are involved in court cases.

As court records become more public, what will that mean for citizens?

If somebody sues you -- and it's a claim that eventually proves unfounded -- that might end up in some dossier, and the information may be incorrect. With these 2.2 million documents, we try to make them as publicly accessible as possible without harming the privacy of citizens. Last month, we came out with the RECAP Archive, which is essentially a search interface for our database of documents. We now allow users to run full-text searches across just the metadata associated with a case. You can search across all the documents we have for case title, case number or judge. If there's a summary of a document, you can search over all of the metadata on the docket. We haven't enabled full-text search of the actual PDFs or briefs yet, because that's where a lot of the PII is going to be found.
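The distinction drawn here -- indexing docket metadata while leaving the document text unsearched -- can be sketched in a few lines. This is an illustrative toy, not RECAP's actual implementation; the case data and field names are invented:

```python
# Metadata-only search, in the spirit of the RECAP Archive: queries match
# docket metadata (title, number, judge) but never the document text,
# which may contain unredacted PII. All data below is invented.

dockets = [
    {"case_title": "Doe v. Acme Corp", "case_number": "1:09-cv-01234",
     "judge": "Smith", "document_text": "<not indexed>"},
    {"case_title": "United States v. Roe", "case_number": "2:10-cr-00567",
     "judge": "Jones", "document_text": "<not indexed>"},
]

def search_metadata(query, records):
    """Return records whose metadata fields contain the query string."""
    query = query.lower()
    metadata_fields = ("case_title", "case_number", "judge")  # text excluded
    return [r for r in records
            if any(query in r[f].lower() for f in metadata_fields)]

matches = search_metadata("acme", dockets)
print([m["case_number"] for m in matches])  # ['1:09-cv-01234']
```

A real archive would back this with a search index rather than a linear scan, but the privacy boundary -- which fields get indexed -- is the same design decision.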

What about the cost of making court records available? Is there a rationale for charging for access?

The other issue with PACER -- and it's hard to ignore -- is cost. The reason the courts charge money for these public domain documents is that Congress authorized them to. In the 2002 E-Government Act, Congress essentially said that they're allowed to charge fees to recoup the cost of running this public access system, but only to the extent necessary to recoup those costs. The courts determined at the time that the rate should be $0.07 per page, and they eventually raised it to $0.08. But if you look at their budgeting documents, we've found that they actually charge a lot more than the expense necessary to provide these documents. My colleague, Steve Schultze, has done a ton of work digging into the federal judiciary budget. We found that about $21 million every year looks like it's being spent directly on running the PACER systems: networking, running servers, and other costs that go directly to providing public access through PACER. Their revenue in 2010 is projected to be -- I believe -- $94 million. So there's a $73 million difference this year between the amount of money that they're collecting and the amount of money that they're spending on public access. That $73 million difference is thrown into this thing called the Judiciary Information Technology Fund, or the JIT Fund.
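The budget figures quoted here are easy to sanity-check. A quick sketch using the interview's numbers (the page-count figure at the end is my own back-of-envelope extrapolation and ignores PACER's per-document fee caps):

```python
# Back-of-the-envelope check of the figures quoted in the interview.
pacer_revenue = 94_000_000       # projected FY2010 PACER fee revenue
public_access_cost = 21_000_000  # estimated annual cost of running PACER
surplus = pacer_revenue - public_access_cost
print(f"Surplus routed to the JIT Fund: ${surplus:,}")

# At $0.08 per page, that revenue corresponds to roughly 1.2 billion
# billed pages a year. Integer math avoids float rounding error.
pages_billed = pacer_revenue * 100 // 8
print(f"{pages_billed:,} pages billed")
```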

The JIT Fund is being used on other court technology projects, like flat-screen monitors, telecommunications, and embeddable microphones in court benches. I'm not opposed to these projects being funded, or to more technology in courtrooms, but these projects are being funded at the expense of public access to the law, including the ability of researchers and others interested in our judicial process to access and study how it works, which I think is highly detrimental to society.

You've offered a thorough walkthrough of many of the issues that were raised at the workshop earlier this year. What is the next step in opening up the court system in a way that the American people can find utility from those efforts?

I think the ball is essentially in Congress' court, so to speak. The courts need to work together with Congress to find the right appropriation structure such that PACER is funded not by user fees but by general appropriations. Only in that case could the courts take down the paywall and allow all of these documents to be freely available and accessible. It's important to look at exactly how much money Congress needs to appropriate to the courts to actually run the system. I think $21 million isn't necessarily the right number, even though that's how much they spend today, for a couple of reasons.

Carl has done a bunch of FOIA requests to all of the individual executive agencies and found, for example, that DOJ pays the judiciary $4 million every year to access cases. That's probably true for a lot of the other agencies, and for Congress: they pay the courts to access PACER. So a lot of that money is already coming from general appropriations -- taxpayer money goes to DOJ, and $4 million of it is then paid out to the courts.

If Congress were able to redirect that money, the courts would get it directly, and that would go a long way toward making up this $21 million. In addition, the payment infrastructure -- keeping track of user accounts, processing bills, sending out letters, collecting the fees -- I'm sure probably costs a couple million dollars, too. If you take down the paywall, that whole system doesn't even need to be run.

From a policy perspective, I think it's important for Congress and the courts to look into how much taxpayer money is already being spent on running PACER, and then directly appropriate that money, along with however much more is necessary on top of that if there's a shortfall, to fund the system. Once enough funding is available, you can take down the paywall and keep the system running.

There are privacy issues that we need to deal with. Certainly in bankruptcy cases, there's a lot more private information that's left un-redacted; in the regular district and appellate courts, probably a bit less. But there are definitely issues that we need to talk about.

What are you focusing on in your doctoral work at Princeton?

On the open government front, I've been looking into a variety of topics in privacy and the authentication of court records. I think that's extremely important, especially as the focus is on publishing raw data and third-party reuse of data -- re-displaying government data through third parties and intermediaries. It's also important that governments start to focus on the authentication of government records.

By authentication, I mean actual cryptographic digital signatures that third parties can use to verify that whatever dataset they downloaded, whether from the government directly or from another third party, is actually authentic, and that the numbers within the data haven't been perturbed or modified, either maliciously or accidentally. I think those are two issues that will definitely be increasingly important in the open government world.
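A minimal sketch of the integrity half of this idea, using a plain SHA-256 digest from Python's standard library. Full authentication would go further than this: the publisher would sign the digest with a private key (RSA, Ed25519, or similar) so third parties could verify origin against a published public key, not just detect modification. The dataset bytes below are invented:

```python
import hashlib

# A published dataset and its detached digest, distributed alongside it.
dataset = b"case_number,disposition\n1:09-cv-01234,dismissed\n"

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

published_digest = digest(dataset)

# A third party re-downloads the data, perhaps from a mirror, and checks it.
mirror_copy = b"case_number,disposition\n1:09-cv-01234,dismissed\n"
assert digest(mirror_copy) == published_digest

# A single perturbed field -- malicious or accidental -- is detected.
tampered = b"case_number,disposition\n1:09-cv-01234,settled\n"
print(digest(tampered) == published_digest)  # False
```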

What will your talk on "Government Data and the Invisible Hand" at the Gov 2.0 Summit examine?

When we try to do open government, governments look at the data that they have and try to publish it. Then they hit a certain technological limit, where an important dataset that they want to publish is on paper, or is in a digital record but not in any machine-parsable form. Or records are available in some machine-parsable way, but there are privacy problems. When we talk about open government and innovation, I think a lot of people have been focusing on user-facing innovation, where the data has been published and the public goes out, takes it and builds user-facing interfaces.

There's also back-end innovation: tools that enable government to better build and sharpen the platform that makes front-end innovation possible. These include better redaction tools for privacy that make it more efficient for government to find private information in its public records, or tools that help government capture data at its creation in machine-readable formats, rather than doing it the same old way and then relying on some very complex and leaky process for converting Word documents or other non-parsable documents into machine-parsable formats. I think there's a lot of innovation that needs to happen in the tools government can use to better provide the open platform itself.
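One concrete instance of the back-end redaction tooling described above is scanning filings for likely Social Security numbers before publication. A production tool would have to handle far more (EINs, phone numbers, OCR noise, surrounding context); this regex-based sketch is illustrative only:

```python
import re

# Naive SSN detector/redactor: matches the common NNN-NN-NNNN form.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def redact_ssns(text: str) -> str:
    """Replace anything shaped like an SSN with a fixed placeholder."""
    return SSN_PATTERN.sub("XXX-XX-XXXX", text)

filing = "Debtor John Doe (SSN 123-45-6789) filed under Chapter 7."
print(redact_ssns(filing))
# Debtor John Doe (SSN XXX-XX-XXXX) filed under Chapter 7.
```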

August 27 2010

Applying the lessons of Enterprise 2.0 to Gov 2.0

Last year, MIT professor Andrew McAfee published a landmark book on the business use and impact of social software platforms titled Enterprise 2.0: New Collaborative Tools for Your Organization’s Toughest Challenges. The book is a collection of McAfee's research since the spring of 2006 when he coined the phrase Enterprise 2.0. Shorthand for enterprise social software, Enterprise 2.0 is the strategic integration of Web 2.0 technologies into an organization's intranet, extranet, and business processes. Those technologies, including wikis, blogs, prediction markets, social networks, microblogging, and RSS, have in turn been adopted by government agencies, a phenomenon that falls under the mantle of Gov 2.0. As the use of such technology has grown, Congress is now considering the risks and rewards of Web 2.0 for federal agencies.

The insights McAfee has gained from years of research into the use of social software by large organizations have broad application to understanding how and where technology will change government, and they're the basis for his talk, New Collaborative Tools for Government's Toughest Challenges, at the Gov 2.0 Summit in Washington D.C. I spoke in detail with Andrew, and anyone interested in understanding how social software is being used in large organizations will find the full half-hour audio interview of great interest.

Below are the questions I asked, and timestamps for the audio of where they start if readers want to jump ahead.


How is Enterprise 2.0 different from Web 2.0? And how does it apply to so-called Government 2.0? What do rules and regulations mean for the growth of social software? What does this mean for open government?
(Answer begins at 4:55)

Does automated filtering hold promise for government or the enterprise to prevent sensitive information from leaking? (Answer begins at 7:13)

Do reports of exfiltration of data from intelligence agencies mean collaborative software is a risk? (Answer begins at 8:35)

One of the examples in Enterprise 2.0 is Intellipedia. What lessons does its creation and evolution hold for the intelligence agencies? What about other government entities? (Answer begins at 9:52)

My interview with Sean Dennehy and Don Burke, the two CIA officers who have spearheaded the Intellipedia effort since its inception, is embedded below:

One of the most interesting parts of the book, for me, was the discussion of ideation platforms and collective intelligence. Government agencies are really running with the concept, and Innocentive shows another model. But does crowdsourcing really work? When, and under what conditions? What are the lessons from the private sector and academia in that regard? (Answer begins at 15:00)

You can read more about how game mechanics and crowdsourcing were combined to solve a complex challenge at Professor McAfee's blog.

What are the most common mistakes in implementations of social software, or ESSPs as you call them? Specifically, how do you set up effective crowdsourcing platforms? (Answer begins at 19:10)

What did the MIT "balloon team" that won the DARPA Network Challenge do right? (Answer begins at 21:09)

What challenges -- and opportunities -- does the incoming millennial workforce hold for government and business with respect to IT? What does research show about how boomers, Gen Xers, and millennials interact, collaborate and work? Are there some myths to bust with respect to entrepreneurship and innovation? (Answer begins at 23:29)

What are the cultural issues around adoption of Enterprise 2.0 and Gov 2.0? (Answer begins at 27:07)

What does your new research on the strategic implementation of IT in large enterprises show to date? Why does government lag the private sector in this area, in the so-called "IT gap?" What could be done about it? (Answer begins at 30:03)

August 26 2010

Tracking the signal of emerging technologies

Last week the words of science fiction writer William Gibson ran rampant over the Twitter back channel at the inaugural NASA IT Summit when a speaker quoted his observation that "The future is here. It's just not evenly distributed yet." That's a familiar idea to readers of the O'Reilly Radar, given its focus on picking up the weak signals that provide insight into what's coming next. So what does the future of technology hold for humanity and space flight? I've been reading the fiction of Jules Verne, Isaac Asimov, David Brin, Neal Stephenson, Bruce Sterling and many other great authors since I was a boy, and thinking and dreaming of what's to come. I'm not alone in that; Tim O'Reilly is also dreaming of augmented reality fiction these days.

Last week I interviewed NASA's CIO and CTO at the NASA IT Summit about some of that fiction made real. We discussed open source, cloud computing, virtualization, and Climate@Home, a distributed supercomputer for climate modeling. Those all represent substantive, current implementations of enterprise IT that enable the agency to support mission-critical systems. (If you haven't read about the state of space IT, it's worth circling back.)

Three speakers at the Summit offered perspectives on emerging technologies that were compelling enough to report on:

  • Former DISA CTO Lewis Shepherd
  • Gartner VP David Cearley
  • Father of the Internet Vint Cerf

You can watch Cerf speak in the embedded video below. (As a bonus, Jack Blitch's presentation on Disney's "Imagineers" follows.) For more on the technologies they discuss, and Shepherd's insight into a "revolution in scientific computing," read on.

Building an Internet in space

Even a cursory look at the NASA IT Summit Agenda reveals the breadth of topics discussed. You could find workshops on everything from infrastructure to interactivity, security in the cloud to open government, space medicine to ITIL, as well as social media and virtual worlds. The moment that was clearly a highlight for many attendees, however, came when Vint Cerf talked about the evolution of the Internet. His perspective on building resilient IT systems that last clearly resonated with this crowd, especially his description of the mission as "a term of art." Cerf said that "designing communications and architectures must be from a multi-mission point of view." This has particular relevance for an agency that builds IT systems for space, where maintenance isn't a matter of a stroll to the server room.

Cerf's talk was similar to the one he delivered at "Palantir Night Live" earlier this summer, which you can watch on YouTube or read about from Rob Pegoraro at the Washington Post.

Cerf highlighted the more than 1.8 billion people on the IP network worldwide at the end of 2009, as well as the 4.5 billion mobile devices that are increasingly stressing it. "The growth in the global Internet has almost exhausted IPv4 address space," he said. "And that's my fault." Time for everyone to learn IPv6.
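The scale gap Cerf is describing, and the application-layer side of "learning IPv6," can be shown with Python's standard `ipaddress` module, which handles both address families transparently:

```python
import ipaddress

# IPv4's 32-bit space versus IPv6's 128-bit space.
ipv4_space = 2 ** 32    # ~4.3 billion addresses -- nearly exhausted
ipv6_space = 2 ** 128   # ~3.4e38 addresses

# At the application layer, "preparing for IPv6" largely means handling
# both families; the stdlib parses either form with one call.
addr4 = ipaddress.ip_address("192.0.2.1")
addr6 = ipaddress.ip_address("2001:db8::1")
print(addr4.version, addr6.version)  # 4 6
```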

Looking ahead to the future growth of the Internet, Cerf noted both the coming influx of Asian users and the addition of non-Latin characters, including Cyrillic, Chinese, and Arabic. "If your systems are unprepared to deal with non-Latin character sets, you need to correct that deficiency," he said.

Cerf also considered the growth of the "Real World Web" as computers are increasingly embedded in "human space." In the past, humans have adapted to computer interfaces, he said, but computers are increasingly adapting to human interfaces, operating by speech, vision, touch and gestures.

Cerf pointed to the continued development of Google Goggles, an app that allows Android users to take a picture of an object and send it to Google to find out what it is. As CNET reported yesterday, Goggles is headed to iPhones this year. Cerf elicited chuckles from the audience when describing the potential for his wife's cochlear implant to be reprogrammed with TCP/IP, thereby allowing her to ask questions over a VoIP network -- essentially putting his wife on the Internet. To date, as far as we know, she is not online.

Cerf also described the growing "Internet of Things." That network will include an InterPlaNetary Internet, said Cerf, or IPN. Work has been going forward on the IPN since 1998, including the development of more fault-tolerant networking that stores and forwards packets as connections become available in a "variably delayed and disrupted environment."

"TCP/IP is not going to work," he said, "as the distance between planets is literally astronomical. TCP doesn't do well with that. The other problem is celestial motion, with planets rotating. We haven't figured out how to stop that."

The "Bundle Protocol" is the key to an interplanetary Internet, said Cerf. The open source, publicly available Bundle protocol was first tested in space on the UK-DMC satellite in 2008. This method allows three to five times more data throughput than standard TCP/IP, addressing the challenge of packetized communications by hopping and storing the data. Cerf said we'll need more sensors in space, including self-documenting instruments for meta-data and calibration, in order to improve remote networking capabilities. "I'm deeply concerned that we don't know how to do many of these things," he observed.
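The store-and-forward behavior at the heart of the Bundle Protocol can be sketched as a toy simulation. Real delay-tolerant networking (the Bundle Protocol specified in RFC 5050) adds custody transfer, bundle expiry, and routing; the node and bundle names here are invented for illustration:

```python
from collections import deque

class DTNNode:
    """Toy delay-tolerant node: stores bundles until a contact opens."""

    def __init__(self, name):
        self.name = name
        self.stored = deque()  # bundles held while no link is available

    def receive(self, bundle):
        self.stored.append(bundle)  # store...

    def contact(self, next_hop):
        """A contact window opens: forward everything we are holding."""
        while self.stored:
            next_hop.receive(self.stored.popleft())  # ...and forward

earth, relay, mars = DTNNode("earth"), DTNNode("relay"), DTNNode("mars")
relay.receive("telemetry-001")   # arrives while the Mars link is down
relay.receive("telemetry-002")
relay.contact(mars)              # link comes up; stored bundles drain
print(list(mars.stored))         # ['telemetry-001', 'telemetry-002']
```

TCP would simply time out across an interplanetary gap; here the relay holds the data for however long the disruption lasts.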

Another issue raised by Cerf is the lack of standards for cloud interoperability. "We need a virtual cloud to allow more interoperability," he said.

Government 2.0 and the Revolution in Scientific Computing

Lewis Shepherd, former CTO at the Defense Information Systems Agency and current Director of Microsoft’s Institute for Advanced Technology in Governments, focused his talk on whether humanity is on the cusp of a fourth research paradigm as the "scale and expansion of storage and computational power continues unabated."

Shepherd put that prediction in the context of the evolution of science from experimental to theoretical to computational. Over time, scientists have moved beyond describing natural phenomena or formalizing laws like Newton's to simulating complex phenomena, an ability symbolized by the move from lens-based microscopes to electron microscopes. Computational science has allowed scientists to create nuclear simulations.

Shepherd now sees the emergence of a fourth paradigm, or "eScience," where a set of tools and technologies support data federation and collaboration to address the explosion of exabytes of data. As an example he referenced imagery of the Pleiades star cluster from the Digitized Sky Survey synthesized within the WorldWide Telescope.

"When data becomes ubiquitous, when we become immersed in a sea of data, what are the implications?" asked Shepherd. "We need to be able to derive meaning and information that wasn't predicted when the data sets were constructed. No longer will we have to be constrained by databases that are purpose-built for a system that we design with a certain set of requirements. We can do free-form science against unconstrained sets of data, or modeling on the fly because of the power of the cloud."

His presentation from the event is embedded below.

In particular, Shepherd looked at the growth of cloud computing and data ubiquity as an enabler for collaboration and distributed research worldwide. In the past, the difficulty of replicating scientific experiments was a hindrance. He doesn't see that as a fundamental truth anymore. Another liberating factor, in his view, is the evolution of programming into modeling.

"Many of the new programming tools are not just visual but hyper-visual, with drag and drop modeling. Consider that in the context of continuous networking," he said. "Always-on systems offer you the ability to program against data sets in the cloud, where you can see the emergence of real-time interactive simulations."

What could this allow? "NASA can design systems that appear to be far simpler than the computation going on behind the scenes," he suggested. "This could enable pervasive, accurate, and timely modeling of reality."

Much of this revolution is enabled by open data protocols and open data sets, posited Shepherd, including a growing set of interactions -- government-to-government, government-to-citizen, citizen-to-citizen -- that are leading to the evolution of so-called "citizen science." Shepherd referenced the Be A Martian Project, where the NASA Jet Propulsion Laboratory crowdsourced images from Mars.

He was less optimistic about the position of the United States in research and development, including basic science. Even with President Obama's promise in his inaugural address to put science back in its rightful place, and some $24 billion in new spending in the Recovery Act, Shepherd placed total research and development as a percentage of GDP at only 0.8%.

"If we don't perform fundamental research and development here, it can be performed elsewhere," said Shepherd. "If we don't productize here, technology will be productized elsewhere. Some areas are more important than others; there are some areas we would not like to see an overseas label on. The creation of NASA was based on that. Remember Sputnik?" His observations paralleled those made by Intel CEO Paul Otellini at the Aspen Forum this Monday, who sees the U.S. facing a looming tech decline.

"Government has the ability to recognize long time lines," said Shepherd, "and then make long-term investment decisions on funding of basic science." The inclusion of Web 2.0 in government, a trend evidenced in the upcoming Gov 2.0 Summit, is crucial for realizing that potential. "We should be thinking of tech tools that would underlie Gov 3.0 or Gov 4.0," he said, "like the simulation of data science and investment in STEM education."

Gartner's Top Strategic Technologies

Every year, Gartner releases its list of the top 10 strategic technologies and trends. Their picks for 2010 included cloud computing, mobile applications (Cearley used the term apptrepreneurship to describe the mobile application economy powered by the iTunes and Android marketplaces, a useful coinage I wanted to pass along), flash memory, activity monitoring for security, social computing, pod-based data centers, green IT, client computing, advanced analytics, and virtualization for availability. Important trends all, and choices that have been borne out since the analysis was issued last October.

What caught my eye at the NASA IT Summit were other emerging technologies, several of which showed up on Gartner's list of emerging technologies in 2008. Several of these are more likely familiar to fellow fans of science fiction than to data center operators, though to be fair I've found that there tends to be considerable crossover between the two.

Context-aware Computing
There's been a lot of hype around the "real-time Web" over the past two years. What's coming next is the so-called "right-time Web," where users can find information or access services when and where they need them. This trend is enabled by the emergence of pervasive connectivity, smartphones, and the cloud.

"It will be collaborative, predictive, real-time, and embedded," said Cearley, "adding to everyday human beings' daily processes." He also pointed to projects using Hadoop, the open source implementation of MapReduce that Mike Loukides wrote about in What is Data Science? Context-aware computing that features a thin client, perhaps a tablet, powered by massive stores of data and predictive analytics could change the way we work, live, and play. By 2015-2020 there will be a "much more robust context-delivery architecture," Cearley said. "We'll need a structured way of bringing together information, including APIs."
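Hadoop implements the MapReduce pattern Cearley references. A single-process sketch of the same two phases -- this is the pattern, not Hadoop itself, and the documents are invented:

```python
from collections import Counter
from itertools import chain

documents = ["context aware computing", "real time web", "context rich data"]

# Map phase: each document independently emits (word, 1) pairs.
mapped = chain.from_iterable(
    ((word, 1) for word in doc.split()) for doc in documents)

# Reduce phase: group by key and sum the counts. In Hadoop, a shuffle
# step routes each key to one reducer; here a Counter plays both roles.
counts = Counter()
for word, n in mapped:
    counts[word] += n

print(counts["context"])  # 2
```

The value of the real thing is that both phases parallelize across a cluster, which is what makes the "massive stores of data" above tractable.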

Real World Web
Our experiences in the physical world are increasingly integrated with virtual layers and glyphs, a phenomenon that blogger Chris Brogan described in 2008 in his Secrets of the Annotated World. Cyberspace is disappearing into everyday experience. That unification is enabled by geotagging, QR codes, RFID chips, and sensor networks. There's a good chance many more of us will be shopping with QR codes or making our own maps in real-time soon.

Augmented Reality
Context-aware computing and the Real World Web both relate to the emergence of augmented reality, which has the potential to put names to faces and much more. Augmented reality can "put information in context at the point of interaction," said Cearley, "including emerging wearable and 'glanceable' interfaces. There's a large, long-term opportunity. In the long term, there's a 'human augmentation' trend."

Features currently available in most mobile devices, such as GPS, cellphone cameras, and accelerometers, have started to make augmented reality available to cutting-edge users. For instance, the ARMAR project shows the potential of augmented reality for learning, and augmented reality without the phone is on its way. For a practical guide to augmented reality, look back to 2008 on Radar. Nokia served up a video last year that shows what AR glasses might offer:

Future User Interfaces
While the success of the iPad has many people thinking about touchscreens, Cearley went far beyond touch, pointing to emerging gestural interfaces like the SixthSense wearable computer at MIT. "Consider the Z-factor," he suggested, "or computing in three dimensions." Cearley pointed out that there's also a lot happening in the development of 3D design tools, and he wouldn't count virtual worlds out, though they're mired "deep in the trough of disillusionment." According to Cearley, the problem with current virtual worlds is that they're "mired in a proprietary model, versus an open, standards-driven approach." For a vision of a "spatial operating system" that's familiar to people who have seen "Minority Report," watch the video of g-speak from oblong below:

Fluid User Interface
This idea focuses on taking the user beyond interacting with information through a touchscreen or gesture-based system and into contextual user interfaces, where an ensemble of technologies allow a human to experience emotionally-aware interactions. "Some are implemented in toys and games now," said Cearley, "with sensors and controls." The model would include interactions across multiple devices, including building out a mind-computer interface. "The environment is the computer." For a glimpse into that future, consider the following video from the H+ Summit at Harvard's Science Center with Heather Knight, social roboticist and founder of


User Experience Platforms
Cearley contended that user experience design is more important than a user experience platform. While a UXP isn't a market yet, Cearley said that he anticipated news of its emergence later in 2010. For more on the importance and context of user experience, check out UX Week, which is happening as I write this in San Francisco. A conceptual video of "Mag+" is embedded below:

Mag+ from Bonnier on Vimeo.

3D Printing
If you're not following the path of make-offs and DIY indie innovations, 3D printing may be novel. In 2010, the 3D printing revolution is well underway at places like MakerBot industries. In the future, DARPA's programmable matter program could go even further, said Cearley, though there will need to be breakthroughs in materials science. You can watch a MakerBot in action below:

Mobile robotics driving mobile infrastructure
I experienced a vision of this future myself at the NASA IT Summit when I interviewed NASA's CTO using a telerobot. Cearley observed many applications coming for this technology, from mobile video conferencing to applications in healthcare and telemedicine. A video from the University of Louisville shows how that future is developing:

Fabric Computing
Cearley's final emerging technology, fabric computing, is truly straight out of science fiction. Storage and networking could be distributed through a garment or shelter, along with displays or interfaces. A Stanford lecture on "computational textiles" is embedded below:

August 20 2010

Space IT, the final frontier

When people think of NASA and information technology in 2010, issues like the future of manned space flight, the aging space shuttle fleet or progress on the International Space Station may come to mind. What casual observers miss is how NASA is steadily modernizing those systems, including developing open source cloud computing, virtualization, advanced robotics, deep space communications and collaborative social software, both behind the firewall and in the public eye.

NASA has also earned top marks for its open government initiatives from both the White House and an independent auditor. That focus is in-line with the agency's mission statement, adopted in February 2006, to "pioneer the future in space exploration, scientific discovery and aeronautics research," and it was on display this week at the first NASA IT Summit in Washington, D.C.

The first NASA IT Summit featured speeches, streaming video, discussions about government and innovation, and a lively Twitter back channel. Plenty of my colleagues in the technology journalism world were on hand to capture insights from NASA's initial sally into the technology conference fray, and the headlines from their coverage offer insight into the flavor of the event and the focus of its keynoters.

Below you'll find my interviews with NASA CTO for IT Chris Kemp (my first interview conducted via telerobot) and NASA CIO Linda Cureton.

NASA CIO and CTO on cloud computing, virtualization and Climate@Home

During the second day of the summit, I interviewed Linda Cureton on some of the key IT initiatives that NASA is pursuing. In particular, I wondered whether NASA's open source cloud computing technology, Nebula, could be used as a platform for other agencies. "The original problem was that NASA was not in the business of providing IT services," said Cureton. "We are in the business of being innovative. To create that capability for elsewhere in government is difficult, from that perspective, yet it's something that the government needs."

Cureton described Nebula as similar to other spinoffs, where NASA develops a technology and provides it elsewhere in government. "We released the foundation of Nebula into the open source domain so that people in other agencies can take it and use it," she said. "The other major benefit is that once something is in the public domain, the contributions from others -- crowdsourcing, so to speak -- will improve it."

Current cost savings at NASA aren't rooted in the cloud, however. They're coming from data center consolidation and virtualization. "NASA is decentralized," said Cureton, "so we're seeing people finding ways to consolidate and save money in many ways. The major drivers of the virtualization that has been done are space and the desire to modernize, and to ensure a user experience that could replicate having their own resources to do things without having their own server."

Cureton observed that because of the decentralization of the agency, energy savings may not always be a driver. "Since the low-hanging fruit from virtualization may have been plucked, that's where facilities managers now want to measure," she said. "From what I've learned over the past year and a half, there's been a lot of virtualization." For instance, the NASA Enterprise Application Competency Center (NEACC) has achieved floor space reductions from data center consolidation approaching a 12-to-1 ratio, with 36 physical servers and 337 virtual machines.
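The NEACC consolidation figures quoted above can be checked with simple arithmetic; this sketch just restates the reported numbers:

```python
# Reported NEACC consolidation figures.
physical_servers = 36
virtual_machines = 337

vms_per_host = virtual_machines / physical_servers  # ~9.4 VMs per host
floor_space_ratio = 12  # reported floor-space reduction, 12 to 1
power_ratio = 6         # reported power reduction, 6 to 1

print(f"{vms_per_host:.1f} VMs per physical server")
```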

That's also meant a power reduction ratio of 6 to 1, which feeds into the focus on green technology in many IT organizations. For instance, as I reported last year, a green data center is enabling virtualization growth for Congress. Cureton emphasized the importance of metering and monitoring in this area. "If you can't measure it, you can't improve it. You need more knowledge about what you can do, like with virtualization technologies. In looking at our refresh strategy, we're looking at green requirements, just as you might with a car. There are also cultural challenges. If you don't pay the electrical bill, you care about different issues."

Does she put any stock in EnergyStar ratings for servers? "Yes," said Cureton, whose biography includes a stint at the Department of Energy. "It means something. It's data that can be taken into account, along with other things. If you buy a single sports car, you might not care about MPG. If you're buying a fleet of cars, you will care. People who buy at scale will care about EnergyStar."

More perspective on Nebula and OpenStack

Cureton hopes agencies take Nebula code and deploy it, especially given continued concerns in government about so-called public clouds. "The things that slow people down with the public cloud include IT security and things of that nature," she said. "Once an agency understands Nebula, the model can address a lot of the risks and concerns the agency might have. If you're not ready for the Amazon model, it might be a good choice to get your feet wet. The best choice is to start with lower security-class data. When you look at large, transactional databases, I'm not sure that's ready for cloud yet."

As my telerobotic interview with Chris Kemp revealed (see below), there have now been "hundreds of contributions" to the Nebula code that "taxpayers didn't have to pay for." If you missed the news, Rackspace, NASA and several other major tech players announced OpenStack at OSCON this summer. OpenStack "enables any organization to create and offer cloud computing capabilities using open source technology running on standard hardware." You can watch video of Rackspace's Lew Moorman talking about an open cloud on YouTube.

There will, however, be integration challenges in adding Nebula code to enterprise systems until the collaboration matures. "You have to realize Nebula code is in production," said Kemp in an additional interview. "The OpenStack guys basically took Nebula code as the seed for the computing part. For storage, users are able to rapidly contribute Rackspace file code. Together, there eventually will be a whole environment. People are able to check out that code right now in the Nebula environment, but there's a difference between that and a mature implementation."

Kemp pointed out that both of these code bases have been taken out of large production systems. "It would be irresponsible to call it mature," he said. "The community needs to test it on all types of hardware and configurations, building infrastructures with specific security scenarios and hardware scenarios. We expect it to be '1.0 caliber' by the fall."

The bottom line, however, is that IT organizations that want to participate can use these components to turn commodity hardware into scalable, extensible cloud environments, running the same code currently in production serving tens of thousands of customers and large government projects. All of the code for OpenStack is freely available under the Apache 2.0 license. NASA itself has committed to use OpenStack to power its cloud platforms, though Kemp cautioned that NASA is "not endorsing OpenStack, but is endorsing large groups of developers working on the code."
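To make "turn commodity hardware into scalable cloud environments" concrete, here is a toy first-fit placement scheduler in the spirit of the decision a compute controller makes across a pool of identical hosts. This is an illustrative sketch only; it is not OpenStack or Nebula code, and the host names and sizes are invented.

```python
# A toy first-fit scheduler illustrating the kind of VM-placement
# decision a cloud compute controller makes across commodity hosts.
# Illustrative sketch only -- not OpenStack's actual scheduler.

from dataclasses import dataclass, field

@dataclass
class Host:
    name: str
    ram_mb: int                       # total RAM on the host
    used_mb: int = 0
    vms: list = field(default_factory=list)

    def can_fit(self, ram_mb: int) -> bool:
        return self.used_mb + ram_mb <= self.ram_mb

def schedule(hosts, vm_name, ram_mb):
    """Place a VM on the first host with enough free RAM; None if full."""
    for host in hosts:
        if host.can_fit(ram_mb):
            host.used_mb += ram_mb
            host.vms.append(vm_name)
            return host.name
    return None

cluster = [Host("node1", 8192), Host("node2", 8192)]
print(schedule(cluster, "vm-a", 6144))  # lands on node1
print(schedule(cluster, "vm-b", 4096))  # node1 is full, so node2
```

The real schedulers weigh CPU, disk, and security zones as well, but the core idea -- software deciding where workloads land on interchangeable hardware -- is what makes a pool of commodity servers behave like a cloud.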

What Kemp anticipated evolving late this year is a "hybrid EC2," referring to Amazon's cloud environment. "Amazon is not selling an EC2 appliance or S3 appliance," he said. "If you're building a large government-class or science-class, NASA-class cloud environment, this is intended to make all of the necessary computing infrastructure available to you. If we could have built that kind of infrastructure with off-the-shelf components, we would have."

The manner of the interview with Kemp at the IT Summit was itself a powerful demonstration of how NASA is experimenting with telepresence and robotics. A proud new father, Kemp was unable to join in person. Using an Anybot, he was able to talk to dozens of co-workers and collaborators at the summit from his home in California. Watching them talk recalled William Gibson's famous quote: "The future is here. It's just not evenly distributed yet."


Crowdsourcing the search for aliens through the SETI@Home initiative is a familiar project for many computer users. Now, NASA plans to extend that distributed processing model worldwide to help determine the accuracy of the models that scientists will use to predict climate change. NASA describes the project as "unprecedented in scope." Climate@Home is a strategic partnership between NASA's Earth Science Division and the Office of the CIO, which Cureton heads. As with SETI@Home, participants won't need special training. They'll just need a laptop or desktop and to download a client that runs in the background.

Effectively, NASA will be creating a virtual supercomputing network instead of building or re-purposing a supercomputer, which consumes immense amounts of energy. That means the project will have a much lower carbon footprint than it would otherwise, which is desirable on a number of levels. The Climate@Home initiative is modeled after a similar project coordinated by the Oxford e-Research Centre. Cureton talks about the project in the video below. She also comments (briefly) on the "Be A Martian" project at the Jet Propulsion Laboratory, which enlists citizen scientists in exploring Mars and having fun by sorting through images of the red planet.
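The volunteer-computing pattern behind projects like SETI@Home and Climate@Home can be sketched in a few lines: a server splits a parameter sweep into work units, volunteers' machines compute them, and redundant results are cross-checked before being accepted. The "climate model" below is a stand-in function invented for illustration, not NASA's actual code.

```python
# Minimal sketch of the volunteer-computing pattern: split work into
# units, farm each unit out to two volunteers, accept only matching
# results. The model function is a toy stand-in, not a climate model.

def make_work_units(param_values):
    """Each work unit is one parameter setting of the model to run."""
    return [{"id": i, "param": p} for i, p in enumerate(param_values)]

def run_model(param):
    """Stand-in for an expensive model run on a volunteer's PC."""
    return round(param * 0.8 + 1.2, 6)   # toy 'simulated response'

def validate(results_for_unit):
    """Redundancy check: accept a result only if two volunteers agree."""
    a, b = results_for_unit
    return a if a == b else None

units = make_work_units([1.0, 2.0, 3.0])
# Send every unit to two volunteers and compare their answers.
accepted = {u["id"]: validate((run_model(u["param"]), run_model(u["param"])))
            for u in units}
print(accepted)   # {0: 2.0, 1: 2.8, 2: 3.6}
```

The redundancy step is what lets a project trust computations performed on thousands of untrusted home machines, at the cost of running each unit more than once.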

Federal CIO on smarter spending

The final day of the summit featured a short, clear speech from federal CIO Vivek Kundra, where he challenged the federal government to spend less on IT. Video is embedded below:

Note: Presentations at the Summit from Vint Cerf, the grandfather of the Internet; David W. Cearley, VP and Gartner Fellow; and Microsoft's Lewis Shepherd all provided provocative views of what's to come in technology. Look for a post on their insights next week.


The efficiencies and issues surrounding government's use of technology will be explored at the Gov 2.0 Summit, being held Sept. 7-8 in Washington, D.C. Request an invitation.

August 10 2010

"Knowledge is a mashup"

These days, we hear a lot about open data, open government, and Gov 2.0. President Obama's Open Government Directive has given us access to huge data sets through new government data portals. But we have a lot more assets as a country than just digital 0s and 1s in CSV files. We also have artifacts and science and history and experts. Can open government apply to those assets as well?


If the Smithsonian Commons project is any indication, the answer is yes. I talked to Michael Edson, director of Web and New Media Strategy for the Smithsonian, about the project.

The Commons, which is currently being prototyped, is one of the best examples I've seen of "Gov 2.0 in action." Highlights include:

  • Open access to data and knowledge, applied in a way that matters to people's lives.
  • Transparency, participation, and collaboration that harness what Clay Shirky has called "the value of combinability" -- an end result greater than the sum of its parts.
  • A resource made significantly more useful by our contributions.

If these things are important to you -- if you see the power of freeing real objects and history and culture -- then go check out the prototype and let the Smithsonian staff know what you think. They need to hear from you before they can go on to the next phase in the project.

What is the Smithsonian Commons project?

The Smithsonian Commons project is an effort to provide online access to Smithsonian research, collections, and communities. After all, not everyone can pop into one of the Smithsonian museums anytime they want. Even if they could, the buildings hold less than 2 percent of the collection. And anyway, if you're a teacher and want to borrow some American history for a class lesson, I hear the people that work at the museums don't like it much when you collect a bunch of that history in a shopping bag and fly back to Seattle with it.

But that description makes it sound like the project is about making a web site and putting pictures of stuff online, and that's not really it at all. The project goes well beyond just access. This is key, as making information available is merely the first step. The information also has to be applicable and useful to people's lives, and it has to be the foundation for collaboration that makes the whole greater than the sum of the parts.

How can people use the information? In the case of the Smithsonian Commons, Edson says the idea is for it to be a catalyst and a platform to empower "innovators, explorers, and sense-makers."

In addition, Edson says the Smithsonian Commons "isn't just a commons of stuff the Smithsonian owns and controls ... Rather, the commons is about seeking powerful network effects from the sum total of the stuff, and the data, and the expertise, and the activities and programs, and the communities that form where stuff needs doing and figuring out in the real world."

He notes that, "in the 19th and 20th centuries the best way to do this was to build big bricks-and-mortar institutions and throw all the stuff and experts in there together so organized work could happen." But now, "we have important and audacious new goals that won't yield easily to the old ways of getting stuff done."

The prototype home page shows how this could work. A teacher can search through the vast collection for material for her class because the public has collaborated on tagging, recommending, discussing physical objects (from across all Smithsonian museums), and assembling videos, articles, and community-uploaded material. The teacher can filter by grade level and can download or share what she gathers, as well as store it in her personal collection. The sample video associated with the prototype shows the teacher downloading the information to a PowerPoint slide, but she could just as easily share the information to a Facebook page. (OK, maybe students don't want teachers to know about their Facebook accounts. But you get the idea.)


Why is the Smithsonian Commons project important?

We tend to think of information as data on a screen, but as Edson points out, the physical items that museums house represent ideas, science, culture, and history:

"I think museums, libraries, archives, and research organizations have a critical role to play in building the preconditions for sustained rational thought and discourse in society," he said. "And we can and should be an engine for creativity and innovation. I think we provide the building blocks for this by publishing our collections and research data in as open and free a way as possible. We provide scaffolding through scholarly research, exhibitions, publications, and public programs. But the mortar — the connective tissue that holds it all together — comes from the curiosity and activity and participation of millions of users, makers, and participants."

How does the Smithsonian Commons project impact developers, makers, and innovators?

When I see the work of the open government movement, I am impressed by how much has been accomplished, but also see there is so much more to be done. Developers need to take that raw data and make it applicable to the every day lives of citizens. Makers, hobbyists, and experts can take what the Smithsonian Commons project hopes to provide as a foundation for collaboration, innovation, and relevance to our every day lives.

And that ability is one of the core attributes of the Smithsonian Commons project. Edson explains:

"I often describe the Smithsonian Commons in Maker terms — that a commons is a kind of organized workshop where the raw materials of knowledge creation can be found and freely assembled into new things, by us, by you, by anybody. Cory Doctorow or Mister Jalopy might say that the Smithsonian Commons is a museum and research complex as it would exist if reconstructed around the Owner's Manifesto. Knowledge is a mashup!"

Edson says this collaboration is important in achieving the Smithsonian's five-year plan of "unlocking the mysteries of the universe, understanding and sustaining a biodiverse planet, valuing world cultures, and understanding the American experience," which he notes is doing what Tim O'Reilly would call "stuff that matters."

Why the Smithsonian project is "crazy good"

"We say that the Smithsonian Commons will be vast, findable shareable, and free," Edson says. "These four things together give us something powerful and unique. Take away one and you get something good, but not crazy good."

What does this mean in practice?

  • Vast: Anyone can have access to the entire Smithsonian collection, staff, visitors, and partners.
  • Findable: Search, navigation, user experience design, recommendations, comments, and social networks come together to help users find exactly what they need.
  • Shareable: The project encourages use and reuse for work, pleasure, and education, online and offline.
  • Free: "The Smithsonian Commons will be built on the premise that free, high-quality resources will spread farther and create more opportunities for discovery and creation than those that are restricted by unnecessary fees and licenses," Edson says.

Who is the Smithsonian Commons project for?

Smithsonian Commons is for everyone, of course. But in the beginning, the makers and innovators are key. The Smithsonian wants to operate the Commons project a bit like a Web 2.0 startup: launch early and often. Iterate based on how people use it and what they really need. Who can make the best use of what the project has to offer and what is most useful to them?

The Smithsonian Commons prototype is the first step in that process. Publish some ideas and get feedback. Iterate. Repeat. Then ramp things up once the best direction becomes clear. Edson notes that getting permission, or at least forgiveness, for working this way is perhaps one of the greatest challenges in the Gov 2.0 movement: "In Gov 1.0 and in most large organizations, we like to design things in toto, pour the concrete, and be done with it. Varying that process requires a lot of stamina."

I think this is an awesome approach. Now that we can put things online easily and let people use things the way they want to rather than force our audiences into a particular model, why not take the best advantage of that? Edson says that it's easy to make generalizations about the Smithsonian audience, but in reality the Smithsonian is "the consummate long-tail business".

This project will be a great experiment to see how a large government organization can operate like a Web 2.0 startup and learn the needs of the audience as the project evolves.

The power of what the web can be

This project is an amazing example of the true capabilities of the web. It merges offline and online information, makes experts available in any topic we want, provides global collaboration, and gives all of us access to valuable knowledge as building blocks for something even greater. In "Cognitive Surplus" -- and noted above -- Clay Shirky talks about "the value of combinability." This project is a perfect example of what he describes. As I wrote about this concept on my blog:

"Shirky writes “if you have a stick, and someone gives you another one, you have two sticks. If you have a piece of knowledge — that rubbing two sticks together in a certain way can make fire — you can do something of value you couldn’t do before.” And here too is another new surplus the culture of the web gives us. By sharing knowledge, tools, failures, successes, ideas, we can better combine them for sums much greater than the parts. He notes that the community size has to be big enough, sharing has to be easy, there should be a common format or way of understanding the information, and then, there’s the last component, the one that technology can’t solve — people. Can we work well together? Do we understand each other, trust each other, want others to make what we do better?"

Edson says:

"The thing that makes the Smithsonian Commons different than a commons developed by a commercial entity is that the Smithsonian is in the forever business. By putting something in the Smithsonian Commons we're asking people to trust us. We're not going to scam you. We're not going to violate your privacy. We're not going to get bought by a competitor or just decide to go out of business one day. We're going to be honest about what we do and don't know, we're going to be open to new ideas and points of view, we're going to help each other figure out the world, and these promises are good, forever. Museums and libraries and archives are some of the few organizations in our culture that enter into those kinds of promises, and we take that responsibility very seriously."

So what's next?

Edson says that the Smithsonian has never done a project like this before, so they've got no real process for it. Right now, they are soliciting feedback and comments. You can head over to the prototype right now and tell them what you think, what you would like the project to be, and how you'd best be able to use it. The reaction so far has been overwhelmingly positive. But the Smithsonian wants to hear from as many people as possible before going forward so, ultimately, they build what people really want rather than what they think people might want. That's a true Web 2.0 approach to Gov 2.0.


August 06 2010

Gov 2.0 Week in Review

Last week, I watched Tim O'Reilly talk about Gov 2.0 and Code for America with Shira Lazar on CBS News. Their interview focused upon how Gov 2.0 uses the technology and innovation of Web 2.0 to address the needs of government. As Lazar put it, the future of government is in your hands.

The call to civic action that is implicit in that vision for Gov 2.0 may resonate with Generation Y in unprecedented ways. New research on millennials and "the generation gap in government" by the Center for American Progress (CAP) suggests a majority of millennials "would be more likely to support political candidates who embrace improving government performance, effectiveness, and efficiency." The poll from CAP suggested, in a larger sense, that Americans want better government, not smaller government.

The potential for open government, open data and innovative technology to empower citizens, save costs and inform better policy is compelling in the summer of 2010. Consider the video case for open transit data below, or the rest of the news in this week's Gov 2.0 Review after the jump. As William Gibson has observed, "the future is already here - it's just not evenly distributed."

Open Government

Open government is a mindset. Will it be sustainable over the long term?

A "historic milestone in making government more open" went live this summer when the new Federal Register beta launched at As deputy White House CTO Beth Noveck observed, "Federal Register 2.0" is "collaborative government at its best.

The Secretary of Education announced the Learning Registry, a government-wide initiative to create a platform for educational content. Steve Midgley blogged about the Learning Registry. "It's an important and challenging opportunity that raises hard technical questions about federated search and hard process questions about cross-agency collaboration," said Noveck. Social Security announced an open government video contest on "how Social Security has made a difference" in citizens' lives.

And speaking of video contests, the winner of the EPA's "Rulemaking Matters - Let Your Voice Be Heard" video competition is worth a watch if you're interested in the ideas behind rulemaking.

The video below, also from the contest, is a fine example of "Schoolhouse Rock for Rulemaking," as deputy White House CTO Beth Noveck observed on Twitter.

The state of open government and transparency in Ireland, as that government opens the processes of democracy to scrutiny, is also worth considering.

The signing and release of the Law.Gov core principles show what a proposed distributed repository of all primary legal materials of the United States could be.

The schedules for the President and Vice President of the United States are now available to the public online.

Open government also means understandable and usable government communications. NPR's Liane Hansen interviewed Dave McClure from the GSA about "America's Website Newly User-Friendly."

Open Data

"A year ago, two representatives from the Massachusetts Department of Transportation (MassDOT) met with a group of developers and interested citizens to talk about opening up public transportation data, including subway, bus, commuter train, boat, highway information, and RMV," said Laurel Ruma, O'Reilly's Gov 2.0 Evangelist via email. "MassDOT wanted to know what data developers would find interesting, figure out how best to serve it up, and get a feel for what the developer community would do with it. Many meetings, two contests, and a holiday party later, the results have been outstanding: visualizations, applications, signs, and even an IVR system built from scratch (see MassDOT Developers Page)."

In just one year, said Ruma, "Boston has gone from having stacks of paper schedules to real-time feeds for 135 out of 185 bus lines in the MBTA system (the rest will be available by the end of summer). Not only did the state government give citizens what they wanted, but they encouraged an innovation economy, built community, made the papers, and, in general, built goodwill."

Way to go, Bay State.
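What developers actually do with a feed like MassDOT's is often simple: the static half of open transit data is conventionally published as plain CSV files in the GTFS format, which the standard library can parse directly. The sketch below answers "when does the next bus reach my stop?" against a GTFS-style stop_times table; the trip and stop identifiers are invented for illustration, not real MBTA data.

```python
# Answering "when does the next bus arrive at my stop?" from a
# GTFS-style stop_times table, using only the standard library.
# The trip and stop IDs below are invented, not real MBTA data.

import csv
import io

STOP_TIMES = """trip_id,arrival_time,stop_id,stop_sequence
39-outbound-1,08:05:00,place-davis,1
39-outbound-1,08:12:00,place-portr,2
39-outbound-2,08:25:00,place-davis,1
"""

def next_arrival(stop_times_csv, stop_id, after):
    """Earliest arrival_time at stop_id later than the given HH:MM:SS."""
    rows = csv.DictReader(io.StringIO(stop_times_csv))
    # HH:MM:SS strings sort chronologically, so plain string comparison works.
    times = sorted(r["arrival_time"] for r in rows
                   if r["stop_id"] == stop_id and r["arrival_time"] > after)
    return times[0] if times else None

print(next_arrival(STOP_TIMES, "place-davis", "08:10:00"))  # 08:25:00
```

The real-time feeds layer live vehicle positions on top of schedules like this, which is what made the MassDOT visualizations, apps, and countdown signs possible.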

"Data Are Not Information," wrote Jeff Stanger, exploring the relationship between open data and open government

. Open data needs open source tools, argued Clay Johnson. Johnson also points to a thoughtful post by Dan McQuillan that contends that open data does not empower communities.

Finally, consider a compelling piece by Mark Headd at Govfresh on Delaware's progress towards Gov 2.0, "the opposite of open government." As Headd writes, "When it comes to environmental data, and data on contaminated groundwater, open government is not about citizen convenience or improved government efficiency. It is about giving people the information they need so that they can make informed decisions about their own lives and the lives of their families and children."

Government: Is There An App for That?

The nation's largest library celebrated the launch of its first official iPhone app when the Library of Congress got a mobile app.

General Sorenson, the CIO of the Army, announced the winners of the Apps for the Army contest in Florida. The contest more than doubled expected participation, writes Peter Corbett.

Mobile applications from government or that work with government data are changing the way citizens navigate the world. Forbes recently shared a great list of ten socially responsible mobile apps.

Federal Computer Week also published a solid selection of government Web apps that get results, including the Twitter earthquake detector, the State Department's Haiti tech resource page, USAID's Global pulse and more.

Gov 2.0 and Accessibility

The power of technology and equality came into sharp focus this year on the 20th anniversary of the Americans with Disabilities Act.

Gov 2.0 and Web 2.0 are at odds over accessibility in Australia, where a low-bandwidth, text-only Web is key to open government goals and addressing the digital divide. Australia's CIO urged civil servants to become "Gov 2.0 activists" and shared some tough talk on accessibility.

In the United States, the FCC's new site faced a tough critique of its accessibility.

"The new F.C.C. team aims for accessibility," reported Politico in a profile of the people behind the redesign.

Wikileaks, Wookieleaks and Secrecy

This summer, the Robin Sage experiment obtained a photo of a soldier in Afghanistan with embedded location data, reminding everyone of the national security risks of Gov 2.0 and the social Web.

The unprecedented release of more than 92,000 documents in the Afghan War Diaries by Wikileaks is a powerful reminder of the power of technology to disrupt traditional information flows. For more on Wikileaks and context, make sure to read the Nieman Lab's excellent week in review. The photographs of Afghanistan on display at the Big Picture photoblog add visual context for what's at stake.

"The release of these documents has not affected the strategy. Many of them were very, very old," said Admiral Mullen on "Meet The Press." Both he and Defense Secretary Gates were extremely critical of the release of the names or locations online and the moral culpability of the sight. "They have put this out without any regards whatsoever for the consequences," said Secretary Gates this week.

Wikileaks could change the way reporters deal with secrets.

The relationship between Wikileaks, Gov 2.0 and media hurricanes is likely to be hotly debated for months to come. The story continued to evolve this week when the Pentagon threatened to compel Wikileaks to hand over the Afghan war diaries. Given Wikileaks' distributed architecture, that may be unlikely.

What is unquestionable, however, is that the "Wookieleaks" meme that exploded onto Twitter quickly produced more tweets than there are War Log documents. Marc Ambinder called Wookieleaks the "best hashtag ever," while NPR reported that Wookieleaks was popular because geeks like to go deep on things. The best of Wookieleaks certainly shows that geeks have a sense of humor.

Facebook, Privacy and Government

Yes, Mr. Zuckerberg went to Washington, where Facebook faces online privacy concerns. For more on the online privacy debates in Washington, including hearings where Facebook's CTO and CSO testified before Congress, read my most recent post.

A White House proposal, reported by the Washington Post, that would "ease FBI access to records of Internet activity" is a reminder that governments themselves have complicated relationships with electronic privacy. So is the news that the United Arab Emirates and Saudi Arabia would move to block BlackBerry messaging. Putting RIM's 'security' challenges in perspective is important, as is government's own record on online privacy.

Does Government Get Social Media?

Government is leveraging the Internet as a platform for communications in unprecedented ways. I was reminded that President Obama's YouTube addresses are the fireside chats of the 21st century when I watched his latest message.

Despite the new media prowess of the White House, however, the Knight Commission blogged that Ambassador Rice and the U.S. mission to the UN are now on Facebook but aren't replying to comments there.

One agency that does get social media is NASA. They hosted another NASA Tweetup in Washington, which was streamed live online at NASA TV. The event featured @astro_tj, who was the first man to tweet from space.

TweetCongress tweeted that over 200 members of Congress are now tweeting. A new study on Twitter in Congress asserted that Democrats use Twitter for transparency, while Republicans use it for outreach.

For a useful perspective outside of the United States, First Monday published a terrific Gov 2.0 case study in government and e-participation at Brazil's House & Presidential websites.

And in a novel use of crowdsourcing, Delhi police are using Facebook to track scofflaw drivers, in the latest example of Clay Shirky's "Cognitive Surplus" at work. For a reminder of that concept, check out the TED Talk below.

Open Source and Government

Lockheed-Martin went open source, posted Red Hat's Gunnar Hellekson, but "tinfoil hats abound." Check out the open source project for more.

This past week, a military open source unconference in Washington explored innovation in this space, considering both when code disappears in government and how a CIA software developer went open source.


If you missed OSCON in Portland, Oregon, several videos that discuss open source and government are worth watching.

Jennifer Pahlka of Code for America on "Coding the Next Generation of American History"

Bryan Sivak, DC CTO, on the District of Columbia and open source

Mayor Sam Adams of the city of Portland on "America's open source city"

Gov 2.0 Summit Draws Near

As Tim O'Reilly wrote this morning, the upcoming Gov 2.0 Summit in Washington will be about opening the doors of government to innovation. His argument: open government spurs innovation. This year, education and health care will be key themes.

Gov 2.0 Bits and Bytes

If you missed it, President Obama himself demonstrated a new government website in an online video.

As Macon Phillips observed, this was not your ordinary website demo. When the President uses online video to go straight to the American people to explain a new online resource, it's noteworthy.

In late July, there was a virtual summit on Apps for Local Government.

An oil pipeline leak in Michigan prompted Crisis Commons to ask for volunteers to upload URLs of Kalamazoo River resources.

The State Department hosted a Tech@State event focused on "mobile money."

The Department of Transportation launched IdeaHub, an "online community where employees can post ideas for their colleagues to comment and build upon."

And the Sunlight Foundation launched a new Congress app for Android. An iPhone version is also available.

Opening the doors of government to innovation

When I organize a conference, I don't just reach out to interesting speakers. I try to find people who can help to tell a story about what's important and where the future is going. We've been posting speakers for the second annual Gov 2.0 Summit in Washington DC Sept 7-8, but I realized that I haven't told the story in one place. I thought I'd try to do that here.

First off, our goal at the Gov 2.0 Summit is to bring together innovators from government and the private sector to highlight technology and ideas that can be applied to the nation’s great challenges. In areas as diverse as education, health care, energy, jobs, and financial reform, there are unique opportunities to rethink how government agencies perform their mission and serve citizens. Social media, cloud computing, web, and mobile technologies -- all provide unique new capabilities that government agencies are beginning to harness to achieve demonstrably better results at lower cost.

Our focus this year is on opening the doors to innovation - learning about the latest technology and its application, and breaking down the barriers to its adoption.

Here are some of the themes we're exploring:

1. The Power of Platforms

If there’s one thing we learn from Apple’s iPhone, it’s the power of a platform to spark innovation. Apple revolutionized the smartphone market not just by producing an innovative phone, but by opening up that phone to independent developers. As if by magic, the 15 to 20 applications they designed and released themselves soon became hundreds of thousands, in a textbook demonstration of just what can happen when you harness the power of the marketplace.

So too, government programs can be designed as platforms rather than as fully-specified applications. In this section of the program, we look at some key areas where government is demonstrating strategic mastery of platform thinking, as well as at some innovative private sector programs that can be adapted for government use.
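The minimum viable version of "government as platform" is publishing records in a machine-readable form that third parties can build on, rather than as an HTML page. The sketch below shows that idea in miniature; the grant records, field names, and filter parameter are all invented for illustration, not drawn from any real agency dataset.

```python
# The minimum viable "government as platform" move: expose records as
# machine-readable JSON that developers can filter and build on.
# The records and field names below are invented for illustration.

import json

GRANTS = [
    {"program": "Broadband Pilot", "state": "MA", "amount_usd": 120000},
    {"program": "Broadband Pilot", "state": "WA", "amount_usd": 95000},
]

def to_api_response(records, state=None):
    """Serialize records (optionally filtered by state) as a JSON API body."""
    if state is not None:
        records = [r for r in records if r["state"] == state]
    return json.dumps({"count": len(records), "results": records})

print(to_api_response(GRANTS, state="MA"))
```

Once data is exposed this way, the agency no longer has to anticipate every use: developers can filter, join, and visualize it themselves, which is exactly the dynamic the iPhone app store example illustrates.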

We'll hear from speakers including:

  • Harlan Yu of Princeton, one of the authors of the paper Government Data and the Invisible Hand, which outlines the rationale for opening up government data in machine-readable form.

  • Jim Traficant, who is not only the vice president in charge of the Healthcare Solutions group at Harris Corp, but has intensely personal reasons to believe in the importance of electronic medical records: they saved his life. Twice. He’ll tell us why electronic medical records can and must transform our health care system.

  • XBRL US CEO Mark Bolgiano and the Department of Homeland Security's Executive Director for Information Sharing (and NIEM Executive Director) Donna Roy, who will share early success stories in using XBRL (Extensible Business Reporting Language) and NIEM (National Information Exchange Model), and suggest how they can be used to increase transparency and visibility into "big data" in the private and public sectors, and where they intersect. I'm particularly excited by Mark's thoughts on how to track programs that are funded by the Federal government but actually administered by states or even local jurisdictions. As in healthcare, electronic reporting creates the possibility of feedback loops analogous to those that we've long enjoyed in creating web applications that get smarter the more people use them.

  • Todd Park, CTO of the Department of Health and Human Services, who has a vision of how health care data can be used to create a “holy cow machine” that will let us reduce health care costs and improve health outcomes in the same way that Walmart improves its inventory efficiency or Google improves ad targeting. He’ll talk about how aggregate data about health outcomes is unleashing a torrent of innovation, as we move from paying for volume of care to paying for the value of care in improving actual health outcomes.

  • Clay Johnson, former head of Sunlight Labs, and Indu Subaiya of the Health 2.0 Developer Challenge, who will address the question of how government open data initiatives can best reach out to developers. Developers are the heart and soul of every platform. You can't just "build it and they will come." You have to take practical steps towards developer evangelism.
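The core argument of papers like Government Data and the Invisible Hand — that agencies should publish raw data in machine-readable form and let outside developers build the presentation layer — is easy to demonstrate. The sketch below uses an invented JSON snippet (not any real federal dataset; the states, program names and dollar figures are illustrative only) to show how trivially third parties can aggregate structured records once they exist:

```python
import json

# Hypothetical machine-readable records of the kind an agency might
# publish instead of HTML pages. All values here are invented.
records = json.loads("""[
  {"state": "MD", "program": "broadband", "grant_usd": 1200000},
  {"state": "VA", "program": "broadband", "grant_usd": 800000},
  {"state": "MD", "program": "health_it", "grant_usd": 450000}
]""")

# Because the data is structured, aggregation by any third party
# is a few lines of code rather than a screen-scraping project.
totals = {}
for r in records:
    totals[r["state"]] = totals.get(r["state"], 0) + r["grant_usd"]

print(totals)  # {'MD': 1650000, 'VA': 800000}
```

The same records could just as easily feed a map, a mobile app, or a watchdog site — which is precisely the "invisible hand" the paper describes.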

I'll talk about some of the speakers in the other parts of the program next week, but as a teaser, let me highlight some of the other themes we're exploring.

2. Innovation

Real innovation doesn’t just mean tinkering around the edges. It means remembering your goals, and finding a new way to get there. In this series of sessions, we’ll explore some of the most exciting new sources of innovation, and how they can be harnessed by government. We'll also take a close look at education, one of the foundations of our innovation economy, bringing some fresh voices to the innovation debate.

3. Improving Government Effectiveness

It isn’t enough to be innovative. Government agencies also need to be effective. In this series of sessions, we’ll explore topics such as cost savings, efficiency, and customer service.

4. Empowering Citizens

“We the people...”, the opening of the US Constitution, is a reminder that our government is nothing other than an expression of the collective will of the citizens. No divine right of kings, no entitled nobles, just we, the people. And government is a mechanism by which we express our will. A mechanism that is being turbocharged by the participatory technologies of the web, social media, and mobile phones. We'll explore how to rethink the role of government in the age of electronic participation.

5. Identity, Privacy, and Informed Consent in the Age of the Internet

Many of today’s most powerful technologies depend on trust - trust that when a consumer or citizen provides information, either explicitly or implicitly, to a web or mobile application, that information won’t be misused. Trust is essential, because in order to receive the benefits of social, mobile, and real-time applications, consumers must provide information that has the potential to be misused - their location, their friends, what they are doing, what they are buying, what they are saying, what medications they are taking, how much energy their homes and businesses are using, and much more. The answer is not to treat this information as a kind of toxic asset, and build Maginot lines to protect it, but to build policy frameworks around acceptable use, and penalties for misuse. We'll explore where the technology is leading us and what those policy frameworks might be.

I'm really excited to have such an amazing blend of industry AND Federal heavyweights on the program and in the audience because it gives us an opportunity to explore what the latest technology means for the crafting of future policy and strategy. We've got CTOs and other key executives from major technology companies, including Cisco, VMWare, PayPal, IBM, and Facebook, and their opposite numbers at the Department of Health and Human Services, the Department of Education, the Department of Energy, and the White House Office of Science and Technology Policy. We've also got innovative small companies, educators, and deep thinkers about the future, all with a shared goal of making things work better.

I'll share more detail on some of the other program themes and speakers over the next few weeks.


The Gov 2.0 Summit will be held Sept. 7-8 in Washington, D.C. Learn more and request an invitation.

August 05 2010

Gov 2.0 as means not end

It's often tempting to think that Gov 2.0 is common ground between those who always want smaller government and those who want government to help its citizens. To an extent, this is true: opening up services lets citizens and businesses do more for themselves, and means government doesn't have to grow for more things to happen. In some cases, government can even get smaller.

But government-as-platform doesn't absolve us from asking what fundamental services should be provided by a government, as opposed to private industry. This is a big question. We didn't come up with a single universally-agreed answer before Gov 2.0, and Gov 2.0 will neither answer it for us nor let us evade the question.

You can see this in two recent statements. Clay Johnson's essay Don't Let the Municipal Crisis Go to Waste sparked a challenging line of thought. Clay's thesis is simply this:

Perhaps it's the idealist in me, but I want this crisis to mean more than privatization or bankruptcy. I want it to drive a need for people to connect locally, and I want it to further blur the line between people and the government they elect. I want it to usher in a new era of civic responsibility.

Civic responsibility and participation is, of course, a theme of President Obama's, who has championed volunteerism.

At the same time, David Cameron is articulating his Big Society concept:

The Big Society is about a huge culture change…
…where people, in their everyday lives, in their homes, in their neighbourhoods, in their workplace…
…don’t always turn to officials, local authorities or central government for answers to the problems they face …
…but instead feel both free and powerful enough to help themselves and their own communities.

You can see how government-as-cost and government-as-investment thinking comes out in the difference in rhetoric between the Obama and Cameron administrations. Obama and his staff, coming from the investment mindset, are building a Gov 2.0 infrastructure that creates a space for economic opportunity, informed citizens, and wider involvement in decision making so the government better reflects the community's will. Cameron and his staff, coming from a cost mindset, are building a Gov 2.0 infrastructure that suggests it will be more about turning government-provided services over to the private sector:

And in its place we’ve got to give professionals much more freedom, and open up public services to new providers like charities, social enterprises and private companies so we get more innovation, diversity and responsiveness to public need.

There will be overlap in what both administrations accomplish—rarely do political opponents completely disagree on what steps to take. However, Government as a Platform is a story being told by both political sides: in a frame about investing for benefits and in a frame about lower taxes and moving services into the private sector.

I can't emphasize enough, though, that APIs and open data are tools, not results. As someone who doesn't feel well represented by either political party (I'm both empathic and numerate), I hope we see a bit of both sides' goals: some divested functionality and some increased opportunity. I'm not so optimistic, however, as to assume that Gov 2.0 won't be seized upon by ideological absolutists of both ilks. After all, it's worth remembering that Kashmir and the West Bank are common ground too.


July 27 2010

Open government is a mindset

I recently talked about the role social media can play in open government at Social Security's Open Government Employee Awareness Day. My presentation is embedded below:

As I said in my talk at the agency, what I've seen in my reporting over this year suggests a nascent connection between the evolution of social media, open government and e-government. The economic meltdown of the past few years has pushed state governments to do more with less. The federal government has explicitly -- and sometimes implicitly -- endorsed the use of several types of online social software as tools for open government. The top-down open government directive has come at a time when there is an active network of civic hackers finding innovative ways to use free services, open data and partnerships with social entrepreneurs.

There's an emerging cycle of reciprocity between those governed, the e-services infrastructure provided by government entities and the open government approach adopted by municipalities and agencies. That relationship is worth considering as citizens turn to the Internet for data, policy and services in increasing numbers.

Example: How an agency (Social Security) can become more social

I talked with Social Security CIO Frank Baitman about open government and social media earlier this month. As my interview with Baitman revealed, access to social media is currently blocked for most Social Security employees, with exceptions made on a case-by-case basis. The risks and rewards of Web 2.0 are substantial for Social Security, given the fundamental role the institution plays in American society.

"We're understanding that social media is becoming another means to communicate with a wide range of citizens," said Baitman. "Since Social Security touches virtually every American at some point of their lives, social tools are critical to communication."

The medium could be particularly relevant to senior citizens, who after all receive the lion's share of their income from Social Security, with elderly citizens in the bottom quintile receiving 88.4 percent of their income from that source.

The issue of data leaks through new communication channels is not a negligible concern within the Office of the CIO, particularly as open government efforts move forward. Asked about that issue, Baitman said: "Open government is about communicating with the public, not sharing sensitive data. To the extent that we do share data, we extensively scrub it. Open government has nothing to do with personally identifiable information (PII). That has to do with what government is doing for and on behalf of its citizens."

To address those concerns, Social Security may well look to the example set for the secure use of social media by the Department of Defense, or the guidelines for secure use of social media by federal departments and agencies from the Federal CIO Council.

If open government is to continue its progression at Social Security, more online interaction between citizens and staffers is inevitable. Right now, the agency is taking careful steps, as evidenced by its relatively quiet @SocialSecurity feed on Twitter or Social Security Facebook page. They've marketed their news about the top baby names to outlets like @CNNBrk, @ParentsMagazine and @ESPN, personalities like @RichSanchezCNN, and administration officials or entities like @whitehouse, @PressSec, and @BillBurton44. But they haven't replied to anyone. The agency currently is broadcasting on both forums, not engaging, responding or moderating comments or replies. With care, Social Security might take some notes from NASA on negotiating the frontiers of social media.

While online tools and digital platforms that enable greater transparency, collaboration and citizen participation will continue to improve beyond those used in 2010, the culture of openness within agencies will also need to evolve in order for open government to achieve any measure of success.

There are reasons to be hopeful. After I talked about the increase in location-based social networks, a young Social Security employee asked if I thought a live data feed of waiting times for branch locations would be useful if it was mashed up with an online map. I thought that sounded useful, and said as much.
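That wait-times mashup is a nice concrete case of government-as-platform thinking. Here is a minimal sketch of how it might work, assuming the agency published a feed of branch locations and live wait estimates; the branch names, coordinates, field names and the travel-time weighting are all invented for illustration:

```python
import math

# Hypothetical feed entries: branch location plus a live wait-time
# estimate. None of these names, coordinates or fields are real.
branches = [
    {"name": "Downtown",  "lat": 38.895, "lon": -77.036, "wait_min": 45},
    {"name": "Anacostia", "lat": 38.862, "lon": -76.995, "wait_min": 10},
    {"name": "Bethesda",  "lat": 38.984, "lon": -77.094, "wait_min": 25},
]

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(a))

def best_branch(lat, lon, minutes_per_km=2.0):
    """Trade off the live wait time against a rough travel-time estimate."""
    return min(
        branches,
        key=lambda b: b["wait_min"]
        + minutes_per_km * distance_km(lat, lon, b["lat"], b["lon"]),
    )

# A short wait across town can beat a long wait next door.
print(best_branch(38.90, -77.03)["name"])  # Anacostia
```

The point is less the ranking logic than the division of labor: the agency publishes the feed, and any developer can layer it onto a map.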

Social Security Commissioner Michael J. Astrue listed three open government initiatives during his speech at Awareness Day. All are important, meaningful and likely achievable. But if the Social Security Administration wants to evolve into a true 21st-century institution through better use of information technology, those initiatives will need to be the tip of the iceberg.

As Greg Pace, deputy CIO at the Social Security Administration, said at the end of Awareness Day: "We must examine the principles of open government with an open mind. It's not about technology but about the people who use it. Be open."


June 13 2010

Gov 2.0 Week in Review

As usual, there's no shortage of news in the government 2.0 world. There has been one watershed event since our last Gov 2.0 Week in Review, however: the early results of the decision to open up community health data. Here come the healthcare apps. Will the Department of Health and Human Services make community health information as useful as weather data? Will the innovation and associated business value match that unlocked by GPS and NOAA weather data?


An even more pressing question is whether information technology can help close the yawning gap in federal and state government budgets. "Budget director Peter Orszag's speech, Closing the IT Gap, explains what we're about with Gov 2.0 Events," tweeted Tim O'Reilly earlier this week. Peter R. Orszag, Director of the Office of Management and Budget, spoke at length at the Center for American Progress on a "significant IT gap" that has developed between the public and private sector. Orszag cited this IT gap as a big part of the productivity divide between the two.

"Closing this IT gap is key to boost efficiency and make government more open and responsive to the wants and needs of the public," wrote Orszag, who linked to budget guidance for agencies and a memo that instructs them to identify "their bottom 5 percent performing programs."

One of the ways that the federal government plans to save some taxpayer dollars will be through data center consolidation. Another will be through bread and butter IT, like the green data center in the House of Representatives that I reported on last year. A third will likely be cloud computing, given the millions that Los Angeles saved in IT costs or an estimated $750,000 saved through moving to Amazon's cloud, though serious questions will persist about what government sites or services can be moved to public clouds. A new European Union project on economic effects of open government data may shed light upon whether that approach offers cost savings as well.

More on the past week, including cloud computing, cybersecurity, the 2010 Personal Democracy Forum and Twitter in government, after the jump.

Cloud computing costs, claims and future

If you missed it, Federal CIO Vivek Kundra delivered a keynote at the Cloud Computing Forum and Workshop last month, embedded below. In the speech, Kundra called for the use of cloud computing to narrow a gap between consumers and government while maintaining security, data portability and interoperability.

A new Pew Internet report on the future of cloud computing offered many more perspectives on the topic. A solid majority of respondents agreed with the contention that by 2020, "most people will access software applications online and share and access information through the use of remote server networks, rather than depending primarily on tools and information housed on their individual, personal computers."

O'Reilly Radar's own Andy Oram contributed to Pew Internet report on cloud computing. He's quoted in the findings, recommending that "cloud application providers recognize the value of grassroots innovation - following Eric von Hippel's findings - and solicit changes in their services from their visitors. Make their code open source - but even more than that, set up test environments where visitors can hack on the code without having to download much software. Then anyone with a comfortable keyboard can become part of the development team. We'll know that software services are on a firm foundation for future success when each one offers a 'Develop and share your plug-in here."'

Reflecting the internationalization of the trend, NASA and Japan announced a cloud computing collaboration that will explore interoperability opportunities between NASA's Nebula Cloud Computing Platform and Japan's NII Cloud Computing Platform. "By demonstrating how cloud interoperability can facilitate international collaboration and seamless global access to public data, NASA hopes to accelerate the development of cloud standards and the adoption of cloud infrastructure services by the scientific community," said Chris C. Kemp, NASA's Chief Technology Officer for Information Technology.

Kemp spoke with me about his role at NASA and Nebula at the Gov 2.0 Expo last month:

As Carl Brooks reported, the first comprehensive, vendor-neutral cloud computing benchmarks are now in the wild.

Looking back at Personal Democracy Forum 2010

Can the Internet fix politics? The answer to that question may not be clear for years. After the Personal Democracy Forum's annual conference, it's clear that the Internet has significantly disrupted the ways that candidates campaign, officials govern and agencies form policy. Highlights of Personal Democracy Forum included some fascinating applications, including SeeClickFix and Meetup Everywhere.

As Nick Judd reported, mainstream media is part of the solution for fixing government. Change agents inside of government and engaged citizens are also crucial. All three parties could benefit from publishing public data online, as the FTC highlighted in its discussion draft on the future of journalism.

Federal CTO Aneesh Chopra spoke at length about rethinking government, which he later blogged about at the Huffington Post in a piece on empowering Americans through open government. Chopra highlighted the Community Health Data Forum, "Apps for Healthy Kids" and IT dashboards for spending, among other initiatives.

As is his wont, Clay Shirky delivered a thoughtful talk on the Internet, citizenship and lessons for government agencies that are looking for feedback online. Hint: use taxonomies to aggregate ideas instead of a single list.

Can technology forge a new relationship between government and the public? Arianna Huffington considers the possibility after PDF 2010, where she participated in the closing panel. That discussion, which also included Tim O'Reilly, Saul Anuzis, Nick Bilton, Andrew Rasiej, and Newark Mayor Cory Booker, is embedded below:

And in a huge win for Jen Pahlka's big idea, the Omidyar Network announced a $250,000 grant to Code For America, which is now recruiting fellows. "Ask not what your country can code for you - Ask what you can code for your country."

Twitter looks for a government liaison

Why is Twitter hiring a government liaison? Twitter VP Sean Garrett offered up some insight on a new opening for a government liaison, which he said will serve as "a point person that can help verify government IDs, someone that can be down the street to meet with officials in their office, or serve as an overall point person for government outside the Beltway." The Department of Health and Human Services’ new media guru, Andrew P. Wilson, offered up a thoughtful Top 10 Requests for the New Government Liaison at Twitter.

Internet Freedom and U.S. Foreign Policy

As clashes and protests are reported in Iran on the one year anniversary of the historic protests there, the Wall Street Journal reported that the U.S. stepped up tech support for Iranian dissidents. Should the U.S. support Internet freedom through technology? As I reported in my interview with Secretary of State Clinton's senior innovation advisor, Alec J. Ross, technology for Internet freedom and innovation is supported by the State Department.

Using the Internet to communicate about the oil spill

USCG commander Thad Allen and White House press secretary Robert Gibbs held a live briefing on the Obama administration's response to the Deepwater Horizon oil spill in the Gulf of Mexico that was streamed online. Affected parties are urged to submit claims to BP. Carol Browner, Assistant to the President for Energy and Climate Change, also took questions on the oil spill in a live Web chat on Facebook. The archived video is embedded below:

Digital Capital Week


Here in the District of Columbia, Digital Capital Week is now underway. While many of the workshops, clinics, festivals and parties are well worth the time of the thousands of registered attendees, look for the Gov 2.0 and Org 2.0 Day to be particularly notable for this space, along with the DC 140 Conference, where I'll be speaking with NPR's Andy Carvin about "Emergency Response 2.0." For more, iStrategy Labs has helpfully published "the one post you'll need to read" about Digital Capital Week.

Government 2.0 Bits and Bytes

Elsewhere on the Web, David Eaves offered some thoughtful advice to governments on how to engage with social media and suggested that cities should fork the Kuali Foundation to save millions of dollars.

I posted video of how intelligence agencies are connecting the dots with Intellipedia.

The clever developers at the Guardian created an interactive tool for easy browsing of government spending.

One newly relaunched state site features an open data section and the first state use of Get Satisfaction. If you missed it last month, there's also a newly redesigned state portal, including a refreshed data repository and an Apps for California contest.

For more on such endeavors, make sure to read Mark Headd's "A 'Glass Half Full' View of Government App Contests" and "Government 'Apps' Move from Cool to Useful" in Governing.

Germany's president, Horst Köhler, resigned last week; social media played a role in both his departure and the selection of his replacement.

Military intelligence is tapping social networking skills, enabling a distributed force to conduct swarm warfare via chatrooms. As a guest post on Boing Boing revealed, the military has improved its language education through innovative use of brochures and virtual education.

The State Department launched a mobile website.


Mike Bloomberg has earned some plaudits as an "iPad Mayor." As Javier Hernandez reported for the New York Times, while Bloomberg is still mastering the device, his deputy mayor for operations, Stephen Goldsmith, is apparently interested in using his iPad to monitor city data and take notes at meetings. “This is the future of public service — digital data pushed to workers who use better information to make smart decisions,” he wrote to Hernandez.

Finally, Mike Kujawski posted a series of great links and takeaways from the Gov 2.0 Expo, proving that it's never too late to post your impressions.

What else is happening in Gov 2.0?

Inevitably, we're going to miss some links, so make sure to read Nancy Scola at techPresident and follow my Gov 2.0 list on Twitter, embedded below. And as always, if you have tips or suggestions, please send them in or leave links in the comments.

June 11 2010

Here come the healthcare apps

"People in communities can improve their healthcare if they just have the information to do it," said Kathleen Sebelius, Secretary of Health and Human Services (HHS), at the Community Health Data Forum in D.C. last week.

The forum took place almost exactly a decade after President Clinton announced he would unscramble Global Positioning System (GPS) data for civilian use. Now, the potential for private enterprise to provision services using open data from the Community Health Data Initiative could match the billions of dollars made when the government unlocked GPS and NOAA weather data. Last week, in fact, I wrote about how HHS is making community health information as useful as weather data.

Sebelius delivered her remarks to both an online audience and the collection of government officials, technologists and researchers gathered at the Institute of Medicine at the National Academy of Sciences. Her speech is embedded below.

After the jump, learn more about the healthcare apps that were featured at the forum's showcase.

Apps that use open health data

I covered the healthcare apps developed by the National Association of Counties (NACo), GE, Bing, Healthways and Google last week.

Glimpses of a nascent ecosystem of innovation around community health data were on further display in the apps at the Community Health Data Forum. The selections included games, visualizations, web services, crowdsourcing platforms, and smartphone software.

Walking around the expo, I learned about the following apps:

  • An interactive database that provides state- and county-level health data.
  • An in-development iPhone app that engages healthcare practitioners about drug safety alerts.
  • An interactive website for describing the overall health of a community.
  • A web-based mapping platform that mashes up community and health resource data. It includes a HIPAA-compliant means for uploading patient information.

The following are six different apps / web services that also caught my eye at the forum.

Finding connections with Palantir

Palantir wowed the crowd in the main hall with its tech demonstration, which can be viewed in the video embedded above. Alex Fishman, an engineer, also announced at the forum that Palantir had integrated community health data into Analyze The US, a web application that allows citizens, researchers and government officials to explore community health data. A video comparing Medicare quality to Medicare spending -- an example of this tool in use -- is available at Palantir's Government data analysis blog.

Game mechanics and health data

Community Clash isn't the only game that's using community health data: SCVNGR combines the location-based technology that has become familiar to many through Foursquare and Gowalla with specific challenges to earn points. SCVNGR provides a platform for organizations to build games upon. To date, more than 550 institutions in 44 states and 20 countries have taken them up on the opportunity as clients, including museums, conferences, universities and cities.

John Valentine, SCVNGR's conference and events manager, says that SCVNGR now has more than 20 million locations in its system and is being downloaded thousands of times daily from the iTunes and Android app stores. In D.C., SCVNGR will be a part of the upcoming Digital Capital Week.

Medicare data gets mapped

The Community Health Map is being used by HHS internally to visualize and organize data, says Sohit Karol, a PhD student in the kinesiology department of the University of Maryland. The video below provides an overview of the core features of Community Health Map, a web application for visualizing Medicare datasets.

The tool was developed as a part of a course on Information Visualization at the University of Maryland. More information on the project is available through the class wiki.

iTriage puts hospitals in patients' hands

The iTriage app combines open health data with a large database of symptoms and a directory of healthcare service providers. A pair of emergency physicians, Peter Hudson and Wayne Guerra, developed iTriage to empower consumers to make better decisions.

"People are making bad decisions with third-party information," said Hudson at the expo. "The people making those decisions are costing the system money, mostly because they don't have the tools they need to understand."

Now, users can get quality reports on doctors, research symptoms, click to see nearby healthcare facilities and, where available, view emergency room wait times. "We're seeing a high level of engagement," said Hudson, "with people using it to find doctors, hospitals and pharmacists. We've seen 2 million page views on mobile already." Hudson said an iTriage API is in development.

Pillbox turns FDA drug label data into a platform

Pillbox, an open data initiative developed within the National Library of Medicine and the Food and Drug Administration (FDA), makes pill identification easier. Pillbox lets developers build applications for the web and smartphones through an open API.

David Hale, Pillbox's project manager, says a call to a poison control center for a pharmaceutical identification costs $45. With Pillbox and a web browser, that cost can be substantially reduced.
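To give a sense of what pill identification over structured data looks like, here is a minimal sketch. The records, field names and `identify` function below are invented for illustration; they are not Pillbox's actual API schema or data.

```python
# Illustrative records of the kind a pill-identification service
# exposes; these entries and field names are invented, not the real
# Pillbox schema.
pills = [
    {"name": "Drug A 10 mg", "color": "white", "shape": "round",  "imprint": "A10"},
    {"name": "Drug B 25 mg", "color": "blue",  "shape": "oblong", "imprint": "B25"},
    {"name": "Drug C 5 mg",  "color": "white", "shape": "round",  "imprint": "C5"},
]

def identify(color=None, shape=None, imprint=None):
    """Filter by whichever physical attributes the caller knows."""
    matches = pills
    if color:
        matches = [p for p in matches if p["color"] == color]
    if shape:
        matches = [p for p in matches if p["shape"] == shape]
    if imprint:
        matches = [p for p in matches if p["imprint"] == imprint]
    return [p["name"] for p in matches]

print(identify(color="white", shape="round"))  # two candidates remain
print(identify(color="white", shape="round", imprint="A10"))  # narrowed to one
```

Each physical attribute a user can supply narrows the candidate set, which is why structured pill data plus a web browser can stand in for that $45 phone call in many cases.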

Hale explains more in this video from the USP Annual Scientific Meeting in September 2009:

Asthmapolis crowdsources better health for asthma patients

Asthmapolis has developed a specialized device called a "Spiroscout" that, when attached to an asthma inhaler, uses GPS to track use and share the time and location of symptoms.

Asthmapolis aggregates the data voluntarily provided by users and gives it to physicians, scientists and health agencies. The goal is to identify environmental exposures that trigger attacks. Asthmapolis has released a web app and is building a mobile phone diary and website for later release to the public.
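The aggregation step is conceptually simple: bucket each reported inhaler-use event into a coarse geographic cell and look for cells with unusually many events. The sketch below illustrates the idea with invented events and field names; it is not Asthmapolis's actual pipeline.

```python
from collections import Counter

# Hypothetical inhaler-use events of the sort a Spiroscout-style
# device might report: a timestamp plus GPS coordinates. All values
# and field names here are invented.
events = [
    {"when": "2010-06-01T08:10", "lat": 38.8951, "lon": -77.0364},
    {"when": "2010-06-01T17:42", "lat": 38.8954, "lon": -77.0412},
    {"when": "2010-06-02T09:05", "lat": 38.9847, "lon": -77.0947},
    {"when": "2010-06-03T12:30", "lat": 38.8952, "lon": -77.0366},
]

def grid_cell(lat, lon, precision=2):
    """Bucket coordinates into coarse cells (roughly 1 km at precision=2)."""
    return (round(lat, precision), round(lon, precision))

hotspots = Counter(grid_cell(e["lat"], e["lon"]) for e in events)

# The most frequent cell hints at a shared environmental trigger
# worth investigating.
print(hotspots.most_common(1))  # [((38.9, -77.04), 3)]
```

Scaled up across thousands of users, the same counting exercise is what lets researchers spot candidate exposure sites without any one patient keeping a manual diary.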


The opportunities in healthcare IT will be explored at the upcoming OSCON conference. Learn more about OSCON's health track here.
