
December 09 2010

White House proposes sweeping federal IT reforms

For years, the differences between the use of information technology in the public and private sectors have been glaring. Closing that technology gap has been one of the most important priorities of the administration's IT executives. Today, the White House released a report (below) that acknowledges those issues and proposes specific reforms to begin to address the IT gap that Peter Orszag, the former head of the Office of Management and Budget (OMB) in the White House, highlighted this summer.

This morning in Washington, the White House will host a forum on information technology management reform led by federal chief performance officer and OMB deputy director for management Jeffrey Zients and U.S. chief information officer Vivek Kundra. The two will lay out the Obama administration's strategy to reboot how the federal government purchases and uses information technology. The event will be streamed live; it's also embedded below:

Key reforms proposed in the plan include:

  • Create career tracks for IT program managers
  • Move to pilot projects with more budget flexibility and greater transparency in the implementation process
  • Develop IT acquisition specialists to closely align acquisition with technology cycles in the broader economy
  • Require, following the model of Social Security, that complete, integrated teams be in place before projects are approved
  • Launch "myth-busting" campaigns about acquisition of federal IT
  • Use "TechStat" sessions and other accountability measures to end or accelerate troubled projects
  • Reduce the number of federal data centers by at least 40 percent by 2015

September 24 2010

The convergence of Google, government and privacy

Google recently added a new Privacy Tools page. If you follow tech policy in Washington, you couldn't miss hearing about it, given that advertising for Google's privacy tools appeared on relevant blogs, email newsletters and periodicals. And if you work, play, shop or communicate online, the issue of online privacy is more relevant to you than perhaps it ever has been before. Companies and governments are gathering unprecedented amounts of data about every click, link and status update you make. The choices they are making now around the use of personal information, identity, authentication and relationships will be fundamental to how we see one another and ourselves as the information age evolves.

This historic moment is why the Gov 2.0 Summit featured a deep dive into online privacy this year. The evolution of online privacy continues to spur debate on and offline. Below, Tim O'Reilly shared his thinking on the complex subject at the Summit:

Why the big push in awareness around the new privacy tools? For once, it's simple: privacy is a hot issue in Washington, and Google has a major interest in contributing to and informing that conversation.

As one of the most powerful tech companies in the world, Google will inevitably be affected by any new electronic privacy laws from Congress or regulations from the Federal Trade Commission. From potential updates to the law, to the complexities introduced by cloud computing, to the unprecedented collection of user data from search, webmail and advertising (Google's bread and butter), getting online privacy right will matter.

That's doubly true because of recent missteps, like the widespread backlash over Google Buzz. With the anticipated launch of Google Me later this year, which is reported to add a social layer to Google's products, online privacy will become even more relevant. Google is up against the hypergrowth of Facebook, which, with some 500 million users, has grown into one of the world's top online destinations.

Every time users "like" a page on Facebook instead of linking to a website on the open web, that action provides Facebook with information about relevance and connections that isn't available to Google, at least for now. When Google returns a result, its search algorithms are in part parsing linking behavior. Should Facebook continue to expand its own search, it's safe to assume that those results will be in part informed by liking behavior. The contrast between hyperlinks and "hyperlikes" is relevant because the future of the Internet is likely to be built on understanding those interactions. In both cases, users (and regulators) have a vested interest in understanding how, where and why their data is being used.
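As a toy illustration of that contrast, consider ranking the same pages by two different signals. Everything here is invented for illustration (the page names, counts and scoring functions are hypothetical, not either company's algorithm), but it shows how a link-based ranker and a like-based ranker can disagree:

```python
# Two invented signals for the same pages. All names and numbers are
# hypothetical; this sketches the hyperlink-vs-"hyperlike" contrast only.
links = {
    "recipes.example": ["blog-a", "blog-b", "news-c"],  # inbound hyperlinks
    "memes.example": ["blog-a"],
}
likes = {
    "recipes.example": 12,    # Facebook-style "like" counts
    "memes.example": 4500,
}

def link_score(page):
    # Crude stand-in for link analysis: count inbound links.
    return len(links.get(page, []))

def like_score(page):
    # Popularity by explicit user endorsement instead of linking.
    return likes.get(page, 0)

pages = sorted(links)
print(max(pages, key=link_score))  # recipes.example wins on links
print(max(pages, key=like_score))  # memes.example wins on likes
```

The same corpus, ranked by two signals, surfaces different winners, which is why who holds each signal matters.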

Is it possible to share and protect sensitive information? That will continue to be an open question -- and a contentious question -- for years to come. For an informed discussion on that topic, watch the conversation embedded below with John Clippinger, of Harvard's Berkman Center, Loretta Garrison from the FTC and Ellen Blacker from AT&T.

New Google privacy tools

Last week, I attended a press briefing where Jonathan McPhee, Google's product manager for web history, SSL search and personalized search offerings, walked through Google's privacy settings.

"It's on us to educate members of Congress," said Google spokesman Adam Kovacevich. "Google is an engineering company. We would like to address challenging issues -- copyright or free expression for instance -- with great tools. The Content ID tool on YouTube is a great example of that. Our first instinct is to try to address them through tools like this."

One of Google's most public responses to online privacy concerns came last year, during the FTC privacy roundtables, when the company launched a dashboard to show users information associated with Google accounts. Yahoo also launched a dashboard to provide similar insight into data collection. Both dashboards were designed to provide users with more insight into what data was being collected around which interests. The introduction of these tools was particularly relevant to behavioral advertising, which many publishers are looking to as an important way of offering targeted, high-value marketing.

According to McPhee, there are now more than 100,000 unique visitors every day to Google's dashboard. Users can view, modify or even export data there, through the company's "Data Liberation Front."

McPhee framed the privacy discussion in the context of principles. Reflecting Google's famous mantra, "Don't Be Evil," these online privacy principles should give users "meaningful choices" to protect their personal information.

Google itself has meaningful choices to make: from how they cooperate (or don't) with law enforcement and government agencies that want access to its data, to "firewalling" private information from multiple services within companies, to monitoring internal controls on employee access.

"We want to help users know what's associated with their account, so each user is aware from the beginning what's correlated across the account," said Kovacevich, who noted that Google publishes the number of requests it gets for user data online. Privacy researcher Chris Soghoian weighed the pros and cons of this tool in a post praising Google. Google is also working with a coalition of technology companies and think tanks on Digital Due Process, an effort advocating for an update of the Electronic Communications Privacy Act to incorporate privacy principles relevant to cloud computing.

Google has also made an effort to make privacy tools more visible, said McPhee, notably on the sparse home page and on every search page. By the time users reach the bottom of the new privacy tools page, said McPhee, they will be more "empowered." He touted two areas where Google introduced privacy features earlier than competitors: encrypted webmail in January 2010, and encrypted search this spring. "We launched [encrypted search] in May," said McPhee. "It encrypts communication between Google and the searcher. The concept is simple but implementation is complex, in maintaining performance. The challenge is around latency."

"Another feature I don't think people are aware of is the ability to pause," said McPhee, referring to the ability of users to stop recording their Web History, then resume. Users can also remove items from that Web History if they wish.

Web browsing and privacy

Google's browser, Chrome, includes an "Incognito Mode" that reduces the information collected during a user's browsing session. While Incognito Mode won't help anyone protect information from a colleague or friend if they leave a browser window open, it does mean that browsing history is not logged, URLs are not stored and any cookies on the computer are session-level only. Any downloads made, however, will stick around.
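The session-level cookie behavior comes down to a missing expiry attribute: a cookie set without Expires or Max-Age lives only in browser memory and is discarded when the session ends. A minimal sketch with Python's standard http.cookies module (the cookie name and value are invented):

```python
from http.cookies import SimpleCookie

# A persistent cookie carries an expiry, so the browser writes it to disk.
persistent = SimpleCookie()
persistent["prefs"] = "dark-mode"
persistent["prefs"]["max-age"] = 60 * 60 * 24 * 365  # kept for a year

# A session cookie omits Expires/Max-Age entirely; the browser holds it
# only in memory and discards it when the session ends -- the behavior
# described above for Incognito Mode.
session = SimpleCookie()
session["prefs"] = "dark-mode"

print(persistent.output())  # header includes Max-Age
print(session.output())     # no expiry attribute at all
```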

McPhee also noted that an opt-out option for Google Analytics has been available since April. The tradeoff between integrating analytics and disabling the function pits making a website more useful against individual user privacy. That frames the debate going on within the government webmaster community, where the recent revamp of federal cookie policy by the Office of Management and Budget officially allowed the use of cookies and analytics.

Opting out of analytics doesn't prevent a website from knowing certain things, like an HTTPS referrer, but "we chose privacy over data," said McPhee. "That was significant for us."

Future Google services and online privacy

Geolocation services have attracted notice of late, due to concerns over granular location information. As with other areas, a user chooses how much to share with friends, said McPhee. People have to sign up to see the locations of users in Google's Latitude program.

"You can detect location or set location manually," said McPhee, "and even hide location." He highlighted monthly emails that remind users that the Latitude service is on. McPhee also noted that feedback from a shelter for battered women, focused on a scenario where Latitude could be installed without a user's knowledge, resulted in a feature addition. "Where possible, within 24 hours if you haven't used it, a Latitude dialog pops up," he said.

Online video will present other issues. While users can mark an uploaded video "private," anyone with the right link can view it. If YouTube does move forward with allowing livestreaming, expect these privacy controls to receive additional scrutiny.

Google's moves into mobile, television, e-commerce, virtual worlds and even augmented reality will also create new privacy concerns. With each service, the question of where and how privacy is baked in will be important to analyze. On one count, at least, Google seems to have learned an important lesson: more transparency into how data is being used and what tools are available to control it can satisfy the privacy concerns of the majority of users. According to McPhee, only 1 in 7 people who used the online advertising preferences chose to opt out entirely. Providing users with control over their own private data can't be discounted, either.

That was just one element that Jules Polonetsky, the former privacy officer at DoubleClick and AOL, focused on in his talk at the Gov 2.0 Summit and in our subsequent interview. Both are embedded below.

Questions for Google from the online audience

Before I headed into Google's Washington offices, I solicited questions to ask from readers. Answers, which I've edited for readability, follow.

How long before Google Street View covers my country, especially the cities of Makkah & Madina? -- Aamer Trambu (@TVtrambu)

While Google's staff wouldn't comment on any plan to roll out Street View in the Middle East, they did emphasize the ability of users to opt out using privacy tools. "Facial recognition, if we were to introduce it," said McPhee, "would also have controls."

[I have] concerns over the control of "predicted search" data on Google Instant. How is it stored, associated, protected? -- Andrew N (@tsbandito)

"Google Instant works just like normal web searches," said McPhee. "If you click on a result, press enter or take some other action like clicking on an ad, just like before, it's recorded in your Web History." He did highlight a way that Instant is a bit different: when you get a result and you don't have to click on anything, Google records it as a search if you pause on a term for three seconds.
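The pause heuristic McPhee describes can be sketched as a simple debounce: a pending query is committed to history either on an explicit action or once typing has been idle past a threshold. This toy model is only an illustration of the described behavior, not Google's implementation; the class, method names and timings are all invented:

```python
PAUSE_THRESHOLD = 3.0  # seconds; the three-second figure comes from McPhee

class InstantSearchRecorder:
    """Toy model: a query is logged either when the user acts on a
    result, or when typing pauses past the threshold."""

    def __init__(self):
        self.history = []          # stands in for Web History
        self._last_keystroke = None
        self._pending_query = None

    def keystroke(self, query, now):
        # Each keystroke updates the candidate query and resets the clock.
        self._pending_query = query
        self._last_keystroke = now

    def tick(self, now):
        # Record the pending query once the user has paused long enough.
        if (self._pending_query is not None
                and now - self._last_keystroke >= PAUSE_THRESHOLD):
            self.history.append(self._pending_query)
            self._pending_query = None

    def click_result(self, now):
        # Any explicit action (click, enter, ad click) records immediately.
        if self._pending_query is not None:
            self.history.append(self._pending_query)
            self._pending_query = None

recorder = InstantSearchRecorder()
recorder.keystroke("privac", 0.0)
recorder.keystroke("privacy", 0.5)
recorder.tick(1.0)   # only 0.5s of pause: nothing recorded yet
recorder.tick(4.0)   # 3.5s of pause: "privacy" is recorded
print(recorder.history)  # ['privacy']
```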

What is the ETA on Google turning on encryption for search by default? Do the filtering concerns of schools take priority? -- Chris Soghoian (@csoghoian)

For now, you can make encrypted Google search your home page, suggested McPhee. "For those unfamiliar with the issue, schools have an obligation, as a condition of funding, to filter pornographic images. The difficulty is that because schools couldn't know what people were searching for, they blocked it."

McPhee focused on search speed as a key consideration in default encryption. "The difficulty of offering encryption by default is that the central challenge is performance," said McPhee. "There are some features where this is more difficult than text search. Encrypted search doesn't support images or maps. Before we made this the default, we would need that to be supported as well. As soon as we have that feature parity, we will look into that as well."

What extent will they be using social data in conglomeration with Web History? -- Eric Andersen (@eric_andersen)

"We have a social search function, and that exists as a separate function from Web History," said McPhee. "There's a page called social circle and you can go through and see what information is there and edit it. You can say 'I don't want this guy promoted in social search.' I can't comment on rumors [regarding Google Me]."

How far will Google go to protect user privacy? -- Ilesh (@ileshs)

"We abide by the laws in the countries in which we operate," said Kovacevich. "That doesn't mean at the very first request for user data that we give it away. From a broad perspective in promoting freedom of speech globally, we are interested in the issue. We're doing a big conference in Budapest with Central European University."

I recently heard about mail graphing. What about the data privacy concerns with that? -- Meg Biallas (@MegBiallas)

This third-party add-on "is a great example of where we think data belongs to users, and they can use it in creative ways," said McPhee. You can learn more about mail graphing in a recent edition of Strata Week.

How many of the U.S. government requests for information were made for information on people from outside of the United States? [This was in regard to data requests, not removal requests.] -- Annie Crombie (@annieatthelake)

"Honestly, I don't know," said Kovacevich. "We track them by the origin of the request."

How are they going to use the information from what we watch on Google TV? -- Tim Carmody (@tcarmody)

"We definitely have a goal to have all Google products and services included in the dashboard if it's in your account," said McPhee. "It's safe to assume if there's unique information collected via Google TV, it will be included there."

What about Google's own access to stored data? Any comment on that case? [This question referred to Google firing an engineer for violating privacy policies.] -- Carl Brooks (@eekygeeky)

Google's spokesman referred me to the company's public statement on this question, which was published in TechCrunch:

"We dismissed David Barksdale for breaking Google’s strict internal privacy policies. We carefully control the number of employees who have access to our systems, and we regularly upgrade our security controls -- for example, we are significantly increasing the amount of time we spend auditing our logs to ensure those controls are effective. That said, a limited number of people will always need to access these systems if we are to operate them properly -- which is why we take any breach so seriously." -- Bill Coughran, Senior Vice President, Engineering, Google


September 14 2010

We're in open government's beta period

As Congress returns to Washington after this weekend's somber anniversary, the conversation on Capitol Hill will inevitably shift to the midterm elections. The White House is faced with high unemployment, the long war abroad and domestic priorities that range from education to energy policy to immigration to cybersecurity.

In that context, it might be easy for citizens and officials alike to let deep discussion of open government be subsumed under the tsunami of partisan rancor, entertainment news and horse race coverage of the elections. That would be a mistake. There are legitimate arguments to be had about the tech policy choices of Congress and the White House, and they will no doubt be on display in the pages of the country's newspapers and hotly debated in comment threads.

What's not in contention, however, is the exploration of technology-enabled platforms for a government of, by, for and with the people. This deserves close scrutiny.

Patching open government

As Nancy Scola noted at techPresident, "Skepticism about the transformative effect of open government isn't surprising," but backing that critique up with rigorous analysis is crucial. That's precisely what the White House's open government directive received last week, when the Sunlight Foundation's executive director, Ellen Miller, delivered a bracing analysis of its progress:

As Miller said in our subsequent interview, the launch of open government was inspirational, but the "follow through has left something to be desired." The bottom line "is that we're not seeing the kind of data made available that we were promised," said Miller. We talked more about whether a change in Congress will give a boost to the transparency movement, and the commonalities between open government in the United Kingdom and the United States.

Miller sees congruency across the ideological spectrum in holding government accountable and in finding efficiencies and effectiveness. She said the Sunlight Foundation is working on two sides: helping and prodding entities to publish data, and helping government engage with citizens to co-create government. That's enabled through tools created at Sunlight Labs, like "Sunlight Live."

Miller's strong speech prodded open source advocate Gunnar Hellekson to respond that government doesn't look good naked, which is to say that transparency through open government is a messy, iterative process that inevitably reveals some ugly truth in the process. If people look back at how far open government has come, as Derek Willis wrote, the perspective shifts. In responding to Tom Lee's post on open government carrots and sticks, Hellekson borrowed from the open source world to describe Sunlight's role as a welcome patch.

That considered approach was echoed in MIT professor Andrew McAfee's post on Gov 2.0 vs. the Beast of Bureaucracy, where he weighed the reasons to be optimistic against the reasons to be concerned. McAfee wrote:

There now exists a fantastic set of digital tools to make government data and services available, and to make the work of the state more open, transparent, and participative. The idea of "government as platform" that Tim [O'Reilly] has been so eloquent about is not a pipe dream; it's feasible right now, and is only going to get easier to realize thanks to relentless technology improvement and innovation.

The challenge for innovators is the inertia and immensity of a self-perpetuating bureaucracy, which waits out reformers, whether they're working to implement open government or other initiatives. Like Miller, however, McAfee pulled no punches in his final assessment of whether improvements are necessary, particularly in providing e-services to veterans:

The VA has rolled out an ebenefits resource where veterans can instantly see the status of their claims, and the agency is to be applauded for this Gov 2.0 innovation. But the overall lack of state-of-the art digital tools at the VA, and the persistence of a bureaucracy that takes more than 160 days to let someone know if they'll receive disability payments for the limbs they lost in Iraq or Afghanistan is not a problem that needs to be fixed. It's a moral stain on the country. Sometimes it's important to speak plainly.

No open government tool has addressed that backlog. However, Peter Levin, the CTO of the Department of Veterans Affairs, appears to be focused on working toward improving that situation.

New open government initiatives

Four initiatives launched at the Gov 2.0 Summit are relatively modest in their immediate impact, but they could fundamentally improve different aspects of government.

First, the FCC launched new APIs and a developer engagement platform, extending the notion of government as a platform to the country's top communications regulator. The launch is the precursor to a larger plan to reboot the agency's web presence around open government, as I've previously reported. We should see the launch of a new FCC site by January of next year. The FCC presentation is below:

Second, the General Services Administration launched Does deserve a fist bump? Is it an excuse killer? As CityCamp founder Kevin Curry pointed out, "what people may not understand about is that it isn't about the website. It's about changing the procurement model."

The question of whether crowdsourcing national challenges leads to better solutions will remain outstanding for months to come.

Third, in an unscheduled moment, deputy White House CTO Andrew McLaughlin was joined by Carl Malamud on-stage to talk about a "platform that will connect all of the disparate video archives of the federal government departments and agencies, as well as easy access to feeds and an inspiring presentation of live video feeds from across the government."

McLaughlin's conversation with Malamud followed Beth Noveck's talk on "Ten Ways to Change The World," which offered insight into the administration's perspective on open government progress over the past year.

Finally, Civic Commons launched at Gov 2.0 Summit. Civic Commons is a code-sharing initiative between cities aimed at helping city governments cut IT costs. One of the first chunks of code in Civic Commons is an open-sourced federal IT dashboard. DC CTO Bryan Sivak, Code For America's Jen Pahlka and OpenPlans' Nick Grossman announced Civic Commons below.

Open government? Yes we scan

The most rousing invocation of the conference was delivered by the man who has done as much as anyone outside of government to open it up. Carl Malamud's talk, "Currents of our Time," provided historical context and challenged the feds to go much further with open government.

Malamud defined three steps government needs to take:

  1. Finish the open government revolution: Define bulk data standards and enforce them. Release more data online. Update the Freedom of Information Act (FOIA) for the Internet Age, and publish those materials gained online.
  2. Get serious about digitization: Embrace a national scanning initiative. "If we can put a man on the moon, surely we can launch the Library of Congress into cyberspace."
  3. Start an open systems revolution: Create a "Computer Commission" with the kind of authority the Civil Service Commission created in the 19th Century. This commission should "conduct agency-by-agency reviews and help us reboot .gov, flipping the bit from a reliance on over-designed custom systems to one based on open-source building blocks, judicious use of commercial off-the-shelf-components, and much tighter control of the beltway bandits."

Recapturing the open government genie

What to take away from Malamud's "technohomiletics," Miller's open government scorecard, Noveck's citations or the rich online discussions about accountability stimulated by Gov 2.0 Summit?

First, open government is complicated. It's risky. It's incremental. It's a largely unfunded mandate. Cost savings from reducing FOIA requests by publishing public information online often remain anticipated, not realized. The cultural shifts required for full adoption are not in the DNA of many federal or state agencies. There are new security and privacy risks that the intelligence directorates, citizens, and developers are just beginning to appreciate. Consider what it will mean to bring open government to courts, for example. As Jim Stogdill argues in his take on Gov 2.0 and open government: "... fixing government IT may also mean fixing incentives and making a cognitive leap to intentional emergence."

Second, open government patches like Sunlight's can make a difference in holding government to account. But none of it will be easy, or fast, or certain. Any "Government 2.0 beta" will have crashes, bugs and failures. Mistakes can't be tolerated in the code for a nuclear launch vehicle, but in the messy intersection of citizens, open data, civic hackers, government agencies, new media and private industry, they are inevitable.

Finally, the Internet's disruption to communications and secrecy -- as recently embodied by Wikileaks -- is not a genie that can be stuffed back into a dusty archive. Would the electorate tolerate or being shut down? Will Australia or the United Kingdom roll back their Gov 2.0 efforts? As citizens turn to the Internet for government data, policy and services, the importance of relevant and accessible information only grows.

Extensive news coverage of Gov 2.0 Summit is online, along with thousands of #g2s tweets and dozens of Gov 2.0 videos on O'Reilly Media's YouTube channel. Over the next week, look for more analysis of the interviews, speeches and panels from Gov 2.0 Summit here on Radar.


September 10 2010

Four short links: 10 September 2010

  1. Instrumentation and Observability (Theo Schlossnagle) -- thoughtful talk (text and video available at that link) from a devops master. Many systems have critical metrics, which are diverse and specific to the business in question. For the purposes of this discussion, consider a system where advertisements are shown. We, of course, track every advertisement displayed in the system and that information is available for query. Herein the problem lies. Most systems put that information in a data store that is designed to answer marketing-oriented questions: who clicked on what, what was shown where, etc. Answering the question, "How many were shown?" is possible but is not particularly efficient.
  2. Peak MHz (Mike Kuniavsky) -- we hit the era of what I'm calling Peak MHz in about 2004. That's the point when processor speed effectively peaked as chip manufacturers began competing along other dimensions. Which is why all the effort is going into horizontally-scalable systems like the NoSQL gadgets. (via Matt Jones)
  3. Transparency -- the great British satires Yes, Minister and Yes, Prime Minister continue, as one of the writers blogs in the persona of the elder civil servant Sir Humphrey Appleby. His take on transparency is funny because it's true: I understand your anxiety about the new government’s fixation on what they are pleased to call ‘transparency’, but you are distressing yourself unnecessarily. It afflicts all incoming administrations. It used to be called ‘open government’, and reflects the frustrations they felt when they were in opposition and could not find out what was going on, combined with an eagerness to discover and publicise the deception, distortions and disasters of their predecessors.
  4. The Government Doesn't Look Good Naked -- a fine counter to the squawks of "the government's open efforts suck!" that are building. This is exactly how to prevent innovation in government. If you want change, you have to tolerate imperfection and risk. If every program manager thinks they’ll end up on the front page of the Washington Post or get dressed down onstage at Gov 2.0, nothing will change. (via Tim McNamara)
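The instrumentation problem in the first link -- an operational question buried in a marketing-oriented store -- can be sketched by keeping a purpose-built counter alongside the detailed event log, so "how many were shown?" never requires a scan. The class and field names here are invented for illustration:

```python
from collections import Counter

class AdServer:
    """Toy contrast between a marketing-oriented event log and a
    purpose-built operational metric for the same events."""

    def __init__(self):
        self.event_log = []           # rich per-impression records
        self.shown_total = Counter()  # running operational counter

    def show_ad(self, ad_id, user, page):
        # The detailed record answers marketing questions later
        # (who clicked on what, what was shown where)...
        self.event_log.append({"ad": ad_id, "user": user, "page": page})
        # ...while the counter answers "how many were shown?" in O(1).
        self.shown_total[ad_id] += 1

server = AdServer()
for user in ["alice", "bob", "carol"]:
    server.show_ad("ad-42", user, "/home")

# Operational question, answered without scanning the log:
print(server.shown_total["ad-42"])  # 3
```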

September 09 2010

As California goes, so goes the nation?

How will the most populous state in the nation move forward with digital elections? When California secretary of state Debra Bowen endorsed open source voting systems at yesterday's Gov 2.0 Summit, the validation naturally struck many people in O'Reilly's community as significant, given the size and prominence of the Golden State in the world.

"Open source election software saves us a ton of money, and can be re-used in other states," she said. Bowen's interview with Tim O'Reilly at Gov 2.0 Summit, embedded below, featured a wide-ranging discussion of trust, digital literacy and the future of electronic voting in thousands of California precincts. While open source voting poses implementation challenges, Bowen's perspective on the subject is worth reviewing.

As one of the country's pioneers in open government reform, election integrity and personal privacy rights, Debra Bowen is well positioned to comment. "How do we create an education plan in CA so that everyone has 'access'?" she asked, focusing on the challenge a continued digital divide would pose for widespread adoption of electronic voting online or through smartphone apps. Bowen pointed to the potential for new tools and apps to engage young people and save money for government, particularly as millennials make different choices about media consumption.

Integrating more efficiencies into the system isn't a theoretical or aspirational goal, either. As Brian Kalish reported in Nextgov, officials in Los Angeles County will be transcribing by hand 30,000 to 40,000 voter registration forms every day in advance of Election Day on November 2. "Paying people to type data from a form is one of the silliest things we can do in 2010," Bowen said, and the manual process naturally creates instances where mistakes enter the system as election officials try to discern names.

When I interviewed California's secretary of state after her conversation with Tim, she elaborated on the utility of open source software in elections and electronic voting. As California resident Alan Silberberg pointed out before our interview, Secretary Bowen de-certified electronic voting machines due to concerns over their security and validity, a decision for which she endured considerable blowback. In our conversation, she talked about whether it will be possible to deploy digital voting in California any time soon. (Spoiler: there will continue to be barriers in terms of record keeping and security in the near future.)

As California resident Ryan Alfred observed during Bowen's conversation with Tim O'Reilly, open source voting platforms sound great in theory -- but can technology increase the percentage of citizens who vote? Bowen said that it comes down to trust in the systems. In response to New York State Senate IT staffer Noel Hidalgo's question on the role open source can play in closing state budget deficits, Bowen pointed to replicability and the comparable cost of proprietary systems.

On the bigger questions of how the future of civics, the digital divide and information literacy relate, Bowen reflected on how California is addressing the digital divide. Both interviews with the secretary provided fascinating insight into how digital democracy will evolve.

September 08 2010

Civic Commons code-sharing initiative bids to reduce government IT costs

Around the United States, city governments have created a multitude of software. Unfortunately, most of the time the code from those projects is not shared between municipalities, which results in duplication of effort and redundant, static software.

Civic Commons, launched today at the Gov 2.0 Summit in Washington, is aimed squarely at helping city governments share the software they've developed. Civic Commons is the product of the District of Columbia's Office of the Chief Technology Officer (OCTO), Code for America and OpenPlans.

    "In the District of Columbia alone, we have a large set of applications that other governments may find very useful," said District CTO Bryan Sivak. The OCTO will be adding multiple applications into the "commons," including a data warehouse application, a new agency performance management application, TrackDC, and numerous GIS applications. "Not only will other jurisdictions benefit from the public release of these applications, we will benefit from external individuals and organizations contributing to the code base and sharing exciting and innovative applications they have created," Sivak said.

    At launch, Civic Commons will catalogue existing projects like the DC App Store and the Federal IT Dashboard. As the blog post introducing Civic Commons explains:

    The long-term goal is to develop the app catalog into an open 'Civic Stack' -- a streamlined collection of software that cities can use to run core services. The stack will have open data and APIs like Open311 to encourage development and innovation, and so allow any developer to create a solution that'll make life better for citizens anywhere.
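    Open311, cited above as one of the APIs a Civic Stack would expose, is a real specification (GeoReport v2) built on simple HTTP endpoints with JSON or XML responses. As a rough illustration of why that makes cross-city development easy, here is how a client might build a query for open service requests; the base URL and service code below are hypothetical, though the parameter names follow the GeoReport v2 convention:

```python
from urllib.parse import urlencode

# Hypothetical Open311 endpoint; each city publishes its own base URL.
BASE = "https://example-city.gov/open311/v2"

def requests_url(service_code, status="open"):
    """Build a GeoReport v2 query for service requests of a given type."""
    params = urlencode({"service_code": service_code, "status": status})
    return f"{BASE}/requests.json?{params}"

# The same client code works against any city that implements the spec.
url = requests_url("POTHOLE")
```

    Because the endpoint shape is standardized, an app written against one city's deployment can, in principle, point at another city's base URL unchanged.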

    You can read more about Civic Commons on the project's site. And, if you happen to be online around 1:50 PM EST, you can tune in to the Gov 2.0 Summit livestream to watch the launch announcement. For more perspective on open source and open government, you can also watch the video of DC CTO Bryan Sivak from the 2010 OSCON Conference below:


    Better, faster, cheaper ... emergent

    Carl Malamud gave the opening keynote at the Government 2.0 Summit yesterday and greatly magnified my disappointment at having missed the event. If you've seen Carl speak, you know he is one person on the agenda that won't give a "presentation" or a "talk." Carl is an orator from a different era. He gives speeches. Rousing, moving, elevating speeches that turn our shared history into a kind of sermon; that inform us and inspire our better angels in equal measure. This is the kind of speech whose stirring coda lifts an audience to its feet and leaves them hitting replay in their heads -- not just to pilfer the richest sound bites for their tweet streams -- but to gather it all in.

    I'm sorely disappointed that I missed it. I had to make do with reading it and watching it later on YouTube.

    I absolutely agree with Carl that the failure in government IT is a failure to govern. We simply can't do without these things that we are trying so hard to build. Information technology is the infrastructure of governance, yet we fail to deliver it over and over again, and at great cost.

    Carl's prescriptions for open data, open systems, and government copyright reform are also right on. However, after years of observing the system from the inside, I have come to believe they aren't enough.

    While much can be said about the business practices of the "beltway bandits," they are not exactly the modern analogs to those turn-of-the-century pharma companies Carl describes. In the world before the FDA those exploiters waged an asymmetric and undefended war on American consumers. The beltway bandits are waging a war too, but one with more conventional symmetry -- where the government on the other side is heavily armed with a bureaucracy of its own.

    We are the witnesses to an arms race of growing bureaucratic complexity, where absent market forces are inadequately replaced with the Federal Acquisition Regulations and binders full of subordinate regulation -- where each additional rule, intended to de-risk the process of building software, instead adds risk by delaying the delivery of anything actually useful.

    The end result is hardly a lion preying on sheep. A better analogy might be Mothra and Godzilla locked in a death embrace. If your view is from the side of the government it is easy to place the blame with the beltway bandits. However, get inside their world a little bit and you may see them as a bit less rapacious and a lot more hogtied. When the customer is more interested in earned value reports and paperwork than working software, well, that's what they get. The beltway bandits are without a doubt a Frankenstein's monster, but it took a Dr. Frankenstein to build them.

    Really though, I don't care one whit who is to blame. The real issue is that all of this complexity of product and process keeps out participants, innovation, and success. Open access systems can experiment, innovate, and deliver things while closed or limited access systems evolve to deliver rents -- in the economic sense.

    In attempting to regulate a market that has inadequate competition, the government has inadvertently erected a bureaucracy that burdens market entry and facilitates the taking of uneconomic rents by those same beltway bandits they are trying to regulate. We should not be surprised that our creation, essentially a regulated quadopoly, is neither efficient nor innovative.

    Carl's focus on open data, open systems, and misuse of intellectual property is important and relevant, as they will all contribute to moving government IT back into open access territory. But we will have to deal with the market and incentive factors as well -- and they are probably bigger. In other words, demonizing the bandits without addressing the root cause -- the lock-in incentives inherent in a single-customer market -- will just lead to new ways to lock them in.

    Shifting gears a bit, there is another area I would like to parse a bit further. The other major problem with government IT is the problem of enterprise IT in general, but at an even larger scale. Some of this stuff is just really friggin' complicated. And I'm just not convinced we know how to build some of these systems, at this scale, inside this rate of technological change. Eliminate the added complexity of working with the government for a moment and ask yourself, do repeatable best practices even exist for specifying, planning, delivering and operating systems at the scale of the Navy Marine Corps Intranet (400,000 nodes and thousands of applications)? Do they even exist for the example Carl used, the National Archives system?

    Before you jump all over me, notice exactly how I worded the question: specify, plan, deliver, and operate. This is the classic systems engineering approach to developing IT. It is reductionist, assumes a reasonably stable technology ecosystem and problem space, and relies on complex planning and execution to deliver.

    I am beginning to be of the opinion that this approach fundamentally won't work for systems at the scale that government often finds itself building, especially as they interconnect into ever larger and more complex wholes -- even if bureaucracy wasn't an impediment.

    Consider this analogy: Our economy consists of countless businesses, each one to a large degree planned, hierarchical, and reductionist in outlook. And while they often acquire each other and grow very large, so far no single company has grown to swallow our entire economy. And if there is a lesson in the Soviets' five-year plans, none will. These planned entities have natural limits (based on current management science and systems) to how large they can become before they become unwieldy, and beyond that we rely on market mechanisms to coordinate them in an emergent way.

    So, if large scale software systems are like that, what do you do if you want one that is bigger or more complex than our plans-oriented methodologies can deliver? Well, our government has been busily demonstrating that you don't do it by planning harder and de-risking more aggressively.

    What Carl intrinsically grasps when he suggests open data and open systems, even if he doesn't say it outright, is that government IT must recognize that it is entering a different realm, one where we need to abandon planning beyond a certain scale and adopt an approach that intentionally facilitates emergence. Intentional emergence isn't planning faster or harder; it's about structuring incentives, policies, and ecosystems to encourage the complex to emerge from the simple. This may not mean a fully atomized simplicity, but may come to look like our economy, where pockets of planning coexist in an emergent ecosystem.

    Like on the web, it means that software/systems engineering will still exist in the nodes, but the coordinating signals among the whole must be economic. Systems engineering simply isn't equipped to operate at that scale and complexity. To cope we have to make a cognitive shift from planning, reductionism, and hierarchy to flattened networks and emergence, and put specifics such as open source, open systems and intellectual property policy into this broader framework.

    I'm not exactly certain how to replace systems engineering as the basis for large system emergence, but I have some ideas. They draw inspiration from the transition from a planned to a market economy. Reforms to government IT should look less like a more comprehensive CMMI and more like China's market reforms of the 1980s -- less about systems engineering and more about ecosystem engineering focused on incentives and policy.

    As a starting point, we might ask: how might a NARA emerge in bits and pieces if a decentralized meta-organization of government entities and citizens had budgetary and cultural incentives to contribute along a path of least resistance that encouraged interoperability? And what should the FAR and other policy say to encourage, rather than prohibit, such an outcome?


    September 06 2010

    "Spontaneous collaboration" and other lessons from the private sector

    Cisco CTO Padmasree Warrior's vision for government's future includes smarter cities, real-time communication over national boundaries and more efficient collaboration with citizens.

    Warrior touched on these ideas during our recent interview, and she'll expand on many of them during her conversation with Tim O'Reilly at this week's Gov 2.0 Summit. Video highlights from our wide-ranging interview are embedded below.

    (Note: This interview was conducted via Cisco's TelePresence. Exporting video from that format presents technical challenges, so portions were recorded with a Flip camera and an iPhone.)

    Video and spontaneous collaboration

    Given that our interview was conducted via video conferencing technology, I asked Warrior which private-sector tech lessons can be applied to the public world.

    "A big lesson that can be transferred from the private sector is kind of already happening. It's how can we use technology, like this [referring to TelePresence], to spontaneously bring ideas together," she said. That goes with the notion of open government, suggested Warrior. "How do you enable citizens to participate in brainstorming sessions, idea collection, in a more spontaneous way? The power of video is that it really allows us to extend the abstract notions of text-based technology and replace them with a much more human way of communicating. It's more natural."

    The power of platforms

    Platforms such as Amazon's cloud, Apple's App Store, Twitter and Facebook are key parts of the Web 2.0 world. I asked Warrior what government can learn or adopt from these examples.

    "The broader access you have to ideas, the stronger the end result will be," said Warrior. "Whatever the platform, the idea is how do you get more innovation onto the platform." She sees a clear opportunity for government, but challenges lie in separating signal from noise and applying useful filters so decision makers can enact informed policies.

    The evolution of smarter cities

    Last month, Warrior shared a link on Twitter about how sensor networks in buildings could use air conditioning ducts as building-wide antennas. Dovetailing with that, I asked her about the evolution of smarter cities.

    "If you step back a little bit and think about what's happening, this is going to be a problem that we're all going to face in the next 10-20 years," she said. "There's rapid urbanization going on around the world. We're expecting maybe about 100 new cities, with over 100 million people. New cities, that would be created over the next 10-15 years or so. So the challenge that we all face is how do we enable this urbanization to happen in a different way than we have done in the past. What role can technology play in building smarter cities, cities that are more sustainable, that are greener, that are more efficient?"

    That perspective was further expressed by a recent tweet from Warrior, in which she shared a piece from Science Daily: "Networks -- not size -- give cities competitive advantage."

    On cloud computing, innovation and enterprise collaboration

    The last part of our conversation focused on operating in a time of resource scarcity and the use of social software within Cisco itself.

    "You don't want to compromise innovation through piping and cost cutting," said Warrior. "I think there will be technology-enabled ways to innovate that the government has to think about as well."

    If open government is done properly, according to Warrior, it will increase participation and share the load of the work. "It will drive that speed and the better quality of decisions. If not, it will end up being more bureaucratic, because the noise level is higher than the signal level. I think the key thing in open gov or any kind of open platform is optimizing the signal-to-noise ratio."

    Privacy and the transfer of information across communities

    One part of the interview that did not make it onto YouTube focused on the challenges for both governments and enterprises that adopt cloud computing. Warrior pointed to the importance of addressing the dual issues of authentication and identity, which from her point of view are essential. Those are precisely the topics, in fact, that will be the focus of the Internet Identity Workshop in Washington, D.C. this week.

    Warrior was thoughtful about the privacy issues that result from digital citizenship and business in the cloud. "There's a difference between identity and community," she said. "I have one identity that's visible to many, being CTO of Cisco. That identity needs to be authentic. I tweet personal things because people want to know who the person is behind the title. At the same time, I belong to a community of Cornell alumnae, to women in tech, to haiku writers, and to southeastern Asian-Americans. You have to know what community is appropriate to share information with and how."

    The issue, explained Warrior, is the appropriate transferability of information from one community to another. That's at the heart of privacy concerns about Facebook or Google's initial missteps with Buzz. As government considers cloud computing models, getting privacy right there will be even more important.

    Bringing open government to courts

    As court records increasingly become digitized, unexpected consequences will result from that evolution. It's critical to be thinking through the authentication, cost and privacy issues before we get there.

    Harlan Yu, a Princeton computer scientist, worked with a team to create online tools that enable free and open access to court records and highlight the need for more awareness. My interview with Yu this summer was a reminder that the state of open government is both further advanced and more muddled than the public realizes. As with so many issues, it's not necessarily about technology itself. Effective policy will be founded upon understanding the ways that people can and will interact with platforms. Although applying open government principles to public access for court documents is a little dry for the general public, the ramifications of digital records being published online mean the issue deserves more sunlight. A condensed version of our interview follows.

    Your open government work has focused on improving public access to court records in the PACER system. PACER stands for "Public Access to Court Electronic Records," but the reality of public access is more complicated. What's the history of your involvement with this aspect of open government?

    Back in February of last year, Steve Schultze, who was at the time at the Berkman Center, was giving a round of talks about access to court materials on PACER. He came to CITP in February to give a talk with one of his colleagues. I had never heard of PACER before, but I went to Steve's talk and learned about how the federal government provides these documents that form the basis of our common law. I was appalled that these public domain documents were essentially being sold to the public, to the detriment of our democracy.

    What did you propose to Schultze to fix this situation?

    We thought there was a way that you could automatically allow PACER users to share documents that were legitimately purchased from the PACER system. Because these are public domain documents -- and there were no copyrights assigned to these documents -- if one legitimate user pays for a document, they should be able to share it on their blog, send it to their friend, post it online, or do whatever they want with it. We decided to venture out and build a [Firefox] plug-in called RECAP that essentially automatically crowdsources the purchase of PACER documents.
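    The sharing model Yu describes -- purchase once, contribute to a shared repository, and let later users check the free archive before paying -- can be sketched in a few lines. This is a toy illustration of the idea, not the actual RECAP extension code (which is a Firefox plug-in):

```python
# Toy sketch of RECAP's crowdsourced document cache.
class DocumentCache:
    def __init__(self):
        self._store = {}  # (court, docket_id, doc_no) -> PDF bytes

    def lookup(self, key):
        """Return a previously contributed copy, or None if absent."""
        return self._store.get(key)

    def contribute(self, key, pdf_bytes):
        """A user who legitimately purchased the public-domain document
        uploads it so the next user can get it for free."""
        self._store[key] = pdf_bytes

cache = DocumentCache()
key = ("ca9", "10-1234", 42)
assert cache.lookup(key) is None          # first user must purchase from PACER
cache.contribute(key, b"%PDF-1.4 ...")    # ...then shares the copy
assert cache.lookup(key) is not None      # later users fetch it free
```

    Because the documents are public domain, the first purchaser's upload is lawful, and every subsequent lookup avoids a duplicate fee.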

    Who else was involved in building RECAP?

    We worked with the Internet Archive and with Carl Malamud. We built a system where users could download the RECAP plug-in and install it. While they used PACER, any time they purchased a docket or a PDF, whether it was a brief, an opinion or any motion, it automatically got uploaded into our central repository in the background.

    The quid pro quo is that, as you're using the RECAP plug-in, if we already have a document that has been uploaded by another user, a notice gets shown to you in PACER to say, "Hey, we already have a copy. Instead of purchasing another copy for $.08 or whatever it'll cost you, just get it from us for free."

    We now have about 2.2 million PACER documents in our system, which is actually a small fraction of the total number of documents in the PACER system. The PACER administrative office claims that there are about 500 million documents in PACER, with 5 million being added every month. So 2.2 million is actually a pretty small number of documents, by percentage.

    We think that we have a lot of the most commonly accessed documents. For the court cases that have high visibility, those are the ones that people access over and over. So we don't have a lot of "long tail," but we have a lot of the ones that are most commonly used.

    Are there privacy and security considerations here? Why does the concept of "practical security" matter to open government?

    We'd like to make all of these documents freely available to the public. We've found a couple of different barriers to offering free and open public access. The biggest one is definitely privacy. When an attorney files a brief [in federal courts], they need to ensure that sensitive information is redacted. Whether it's a Social Security number, the name of a minor, bank account numbers, all of these things need to be redacted before the public filing, so when they put it on PACER, it can't be mined for this private information. In the past, the courts themselves haven't been very vigilant in making sure their own rules were properly applied. That's mainly because of "practical obscurity." These documents were behind this paywall, or you had to go to the courts to actually get a copy. The documents weren't just freely available on Google. The worry about privacy was not as significant, because even if there were a Social Security number, it wouldn't be widely distributed. People didn't care so much about the privacy implications.

    So a condition of "privacy by obscurity" persisted?

    Exactly. The information's out there publicly in public record, but it's practically obscure from public view. So now we have a lot of these PDF documents, but there's actually a number of these documents that have private information, like Social Security numbers, the names of minors or names of informants. Just going out and publishing these documents on Google isn't necessarily the best and most moral thing to do.

    I think one of the consequences of RECAP, Carl's work and our work in trying to get these documents online is the realization that eventually all of these documents will be made public. The courts need to be a lot more serious about applying their own rules in their own courts to protect the privacy of citizens. The main problem is that in the past, even though these records weren't available publicly and made freely available, there were already entities in the courtrooms essentially mining this information. For example, in bankruptcy cases, there were already data aggregators looking through court records everyday, finding Social Security numbers, and adding this information into people's dossier but out of the view of the public. Bringing this privacy issue to the forefront, even if these documents aren't yet publicly available, will make a big impact on protecting privacy of citizens who are involved in court cases.

    As court records become more public, what will that mean for citizens?

    If somebody sues you -- and it's a claim that eventually proves unfounded -- that might end up in some dossier and the information may be incorrect. With these 2.2 million documents, we try to make them as publicly accessible as possible without harming the privacy of citizens. Last month, we came out with the RECAP Archive, which is essentially a search interface for our database of documents. We now allow users to search across the metadata associated with each case. You can search across all the documents we have for case title, case number or judge. If there's a summary of the documents, you can search over all of the metadata on the docket. We haven't enabled full-text search of the actual PDFs or briefs yet, because that's where a lot of the PII is going to be found.

    What about the cost of making court records available? Is there a rationale for charging for access?

    The other issue with PACER -- and it's hard to ignore -- is cost. The reason why the courts charge money for these public domain documents is that Congress authorized them to. In the 2002 E-Government Act, Congress essentially said that they're allowed to charge fees to recoup the cost of running this public access system, only to the extent necessary to recoup those costs. The courts determined at the time that that should be $0.07 a page, and eventually upped the per-page access rate to $0.08. But if you look at their budgeting documents, we've found that they actually charge a lot more than the expense necessary to provide these documents. My colleague, Steve Schultze, has done a ton of work digging into the federal judiciary budget. We found that about $21 million every year looks like it's being spent directly on running the PACER systems. That includes networking, running servers, or directly providing public access through PACER. Their revenue in 2010 is projected to be -- I believe -- $94 million. So there's a $73 million difference this year between the amount of money that they're collecting and the amount that they're spending on public access. That $73 million difference is thrown into this thing called the Judiciary Information Technology Fund, or the JIT Fund.

    The JIT Fund is being used on other court technology projects, like flat-screen monitors, telecommunications, and embeddable microphones in court benches. I'm not opposed to these projects being funded, or to more technology in courtrooms, but these projects are being funded at the expense of public access to the law, including the ability for researchers and others interested in our judicial process to access and study how it works. I think that is highly detrimental to society.

    You've offered a thorough walkthrough of many of the issues that were raised at the workshop earlier this year. What is the next step in opening up the court system in a way that the American people can find utility from those efforts?

    I think the ball is essentially in Congress' court, so to speak. The courts need to work together with Congress to find the right appropriations structure, such that PACER is funded not by user fees but by general appropriations. Only in that case could the courts take down the paywall and allow all of these documents to be freely available and accessible. It's important to look at exactly how much money Congress needs to appropriate to the courts to actually run the system. I think $21 million isn't necessarily the right number, even though that's how much they spend today, for a couple of reasons.

    Carl has done a bunch of FOIA requests to all of the individual executive agencies and found, for example, that DOJ pays the judiciary $4 million every year to access cases. That's probably true for a lot of the other agencies, and for Congress. They pay the courts to access PACER. So a lot of that money is already coming from general appropriations: taxpayer money goes to DOJ, and $4 million of it is then paid out to the courts.

    If Congress were able to redirect that money, the courts would get it directly, and that would go a long way toward making up this $21 million. In addition, the payment infrastructure itself -- keeping track of user accounts, processing bills, sending out letters, collecting the fees -- probably costs a couple million dollars, too. If you take down the paywall, that whole system doesn't even need to be run.

    From a policy perspective, I think it's important for Congress and the courts to look into how much taxpayer money is already being spent on running PACER and then directly appropriate that money, along with however much more is necessary on top of that if there's a shortfall, to fund the system. Once enough funding is available, then you can take down the paywall and keep the system running.

    There are privacy issues that we need to deal with. Certainly in bankruptcy cases there's a lot more private information that's left un-redacted; in the regular district and appellate courts, probably a bit less. But there are definitely issues that we need to talk about.

    What are you focusing on in your doctoral work at Princeton?

    On the open government front, I've been looking into a variety of topics in privacy and authentication of court records. I think that's extremely important, especially as the focus is on publishing raw data and third-party reuse of data, in terms of re-displaying government data through third parties and intermediaries. It's also important that governments start to focus on the authentication of government records.

    By authentication, I mean actual cryptographic digital signatures that third parties can use to verify that whatever dataset they downloaded, whether it's from the government directly or from another third party, is actually authentic, and that the numbers within the data haven't been perturbed or modified, either maliciously or accidentally. I think those are two issues that will definitely be increasingly important in the open government world.
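    A minimal sketch of the verification step Yu is advocating, using a published SHA-256 digest as a stand-in for a full digital signature (a production system would sign the digest with an agency-held private key, e.g. RSA or Ed25519, so that only the agency could have produced it):

```python
import hashlib

def dataset_digest(data: bytes) -> str:
    """Hash the raw bytes of a dataset so any modification is detectable."""
    return hashlib.sha256(data).hexdigest()

# The agency computes and publishes this digest alongside the dataset.
published = dataset_digest(b"year,cases\n2009,1042\n2010,1187\n")

# A third party re-hosting the data can be checked against the digest.
mirror_copy = b"year,cases\n2009,1042\n2010,1187\n"
tampered    = b"year,cases\n2009,1042\n2010,9999\n"

assert dataset_digest(mirror_copy) == published   # authentic copy verifies
assert dataset_digest(tampered) != published      # any perturbation is detected
```

    The point is exactly the one Yu makes: verification works no matter which intermediary delivered the bytes, so re-display and reuse by third parties doesn't have to mean trusting them.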

    What will your talk on "Government Data and the Invisible Hand" at the Gov 2.0 Summit examine?

    When governments try to do open government, they look at the data they have and try to publish it. Then they hit a technological limit, where an important dataset that they want to publish is in a paper file, or is in a digital record but not in any machine-parsable format. Or records are available in some machine-parsable way, but there are privacy problems. When we talk about open government and innovation, I think a lot of people have been focusing on user-facing innovation, where the data has been published and the public goes out, takes it, and builds user-facing interfaces.

    There's also back-end innovation: tools that enable government to better build and sharpen the platform that makes front-end innovation possible. These include better redaction tools that make it more efficient for government to find private information in its public records, or tools that help government capture data in machine-readable formats at its creation, rather than doing it the same old way and then relying on a complex and leaky process for converting Word documents or other non-parsable documents into machine-parsable formats. I think there's a lot of innovation that needs to happen in the tools that government can use to better provide the open platform itself.

    September 02 2010

    FCC.gov poised for an overdue overhaul

    For an agency that is charged with regulating communications, the Federal Communications Commission (FCC) has been a bit behind the curve in evolving its online presence. FCC.gov was launched in June 1995, redesigned in 1999, and relaunched again in September 2001. Since then, it has remained a largely static repository for public notices and information about the agency's actions.

    According to FCC officials, that's going to change, and soon. There was already some insight offered into redesigning the FCC website back in January on the agency blog, informed at least in part by discussions with Sunlight Labs on redesigning the government.

    Yesterday, I interviewed FCC managing director Steven VanRoekel at FCC headquarters about what rebooting FCC.gov will mean for the agency, businesses and the American people. "The new site will embrace open government principles around communication and participation," said VanRoekel. "Consider the sites where over 30,000 ideas were generated. Comments there go into the official record and are uploaded to the Library of Congress. You will see that in a much more pervasive way in the new FCC.gov."

    Our short video interview is below. An extended interview follows.

    Redesigning FCC websites for public comment

    In January, the FCC launched a site asking for public input on improving citizen interaction. The site, which was touted as the first website to solicit citizen interaction with the FCC, followed the launch of two other agency microsites last year. All three websites are notable for their clean design and integration of new media components (blogs, Twitter, etc.). Chairman Julius Genachowski's introduction to the site is embedded below:

    Improving public access to the FCC's operation is part of a new mentality, according to VanRoekel: "Last year, the chairman talked about entrepreneurs taking a rotation through government. We think a lot about bringing in great leadership and managing people around leadership. We were third from the bottom last year in rankings for the best places to work in federal government. We hired a new team to bring in a new culture, which means looking at citizens as shareholders."

    One of the stated aims of the input site was to gather feedback on how FCC.gov itself could be redesigned, a project that, as noted above, is long overdue. The announcement of the new site, for instance, showed up in email but was not posted in plain text on FCC.gov. Like other releases, it showed up as a Word doc and PDF on the site. That said, the FCC has picked up the pace of its communications over the past year, as anyone who has followed @FCC on Twitter knows.

    Aside from the cleaner design of the new microsites and an embrace of social media, open government geeks and advocates took note of the new data section, which is meant to be "an online clearinghouse for the Commission's public data." The FCC has posted XML feeds and search tools for its documents that allow users to sort data by type and bureau.

    Under the Media Bureau, for instance, visitors can explore DTV Station Coverage Maps, a key issue for many given the transition to digital TV. But the maps are on the old FCC site. Those who don't enjoy good public DTV reception would have to find a tiny icon below the fold and click through to get more information.

    FCC's clunky clickstream

    Navigation is still a work in progress. In this example, a user who clicks a link for "DTV Station Coverage Maps" is taken to the old FCC site. From there, they need to find and click on a DTV icon to reach deeper information.

    That kind of reciprocal citizen-to-government interaction is precisely where the potential for these sites can best be realized, and where good design matters. So-called Web 1.0 tools like static websites, email and SMS are used to share information about the quality of services. Web 2.0 services like blog comments and social media have, in turn, been deployed to gather feedback from citizens about the delivery of those services. The FCC began to pursue that potential in earnest in March, when it went mobile and launched iPhone and Android apps for crowdsourced broadband speed testing.

    The potential for empowering citizens and developers with open data is where VanRoekel focused first when we talked.

    "We'll be announcing a couple of things next week at the Gov 2.0 Summit," he said. "Since we launched the speed test, we've gathered over a million data points. That continues to grow each day. We're going to launch a web services API where people can write apps against the speed test data. You'll be able to send us a GPS coordinate and we'll give you data against it."
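    VanRoekel's description implies a simple request/response pattern: send a coordinate, get measurements back. The sketch below shows what a client for such an API might look like; the endpoint URL, parameter names, and response fields are assumptions for illustration, not the FCC's published interface.

    ```python
    import json
    from urllib.parse import urlencode

    # Hypothetical endpoint; the FCC's actual API details were announced later.
    BASE_URL = "https://example.gov/speedtest/api"

    def build_query(lat, lon):
        """Build a request URL for speed-test data near a GPS coordinate."""
        return BASE_URL + "?" + urlencode({"lat": lat, "lon": lon})

    def average_download(response_text):
        """Average the download speeds in a (hypothetical) JSON response."""
        data = json.loads(response_text)
        speeds = [point["download_kbps"] for point in data["results"]]
        return sum(speeds) / len(speeds)

    # A canned response standing in for what the server might return:
    sample = json.dumps({"results": [
        {"download_kbps": 5100}, {"download_kbps": 4700}, {"download_kbps": 6200},
    ]})

    print(build_query(38.8951, -77.0364))
    print(average_download(sample))  # mean download speed in kbps
    ```

    A real client would fetch `build_query(...)` over HTTP; the point is that a one-coordinate query plus a simple aggregation is all an app needs to surface local broadband quality.
    
    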

    FCC Chairman Julius Genachowski and Managing Director Steven VanRoekel will discuss their experiences turning the FCC's web presence into a 21st-century consumer resource at the Gov 2.0 Summit in Washington, D.C. (Sept. 7-8). Request an invitation.

    If incorporated into the thousands of online real estate brokerages, that kind of interaction has the potential to give people what they need to make more informed rental or buying decisions. "When I click on a house on a real estate site, why don't I see what broadband capabilities are there?" asked VanRoekel. "We're approaching .gov like .com. We're not only setting up data services and wrapping the API, but we're building apps as well, and utilizing the same APIs we expect developers to use."

    A consistent challenge across government for releasing open data has been validation and accuracy. The FCC may employ crowdsourcing to address the quality issue. "Think about a map of broadband speeds," VanRoekel explained. "I would love the ability for users to show us what's valid."

    Balancing transparency and open government

    As a regulatory body, the FCC has both great power and great responsibility, to put it in terms that Stan Lee might appreciate. Despite the arcane nature of telecommunications law, the agency's decisions have the potential to affect every citizen in the nation. As VanRoekel pointed out, the FCC must follow administrative procedures and publish drafts of rulemaking for public comment, followed by a vote by the commissioners. In the age of the Internet and the open government directive, that process is due for the same reboot the website will receive.

    "Once approved, language in the APA [Administrative Procedure Act] says government will open up the notice of draft rules to enlighten public decision-making," said VanRoekel. "In the past, what that's meant is us putting it up on a website, in PDFs. Law firms would send clerks, who would photocopy folders and come back with comments on the draft rule. There was no way for an educator or an affected family to get involved. It's our vision that every rule that's up for decision in this agency will be opened for public input."

    The first draft of that effort has been on display at the reboot site. "We made it so that an idea entered into our engines was entered into the public record," VanRoekel said. "An interesting fact there is that you, as a citizen or industry body, can see the comments and hold us legally liable."

    The FCC is faced with difficulties that derive from handling the explosion of online feedback that contentious issues like net neutrality generate. "The volume of comments becomes our problem," he said. "When you have 30,000 ideas coming in and comments on top of them on the record, and we have a limited number of people that oversee the effort, that's our biggest challenge."

    While the FCC has touted new tools for openness and transparency, it has also taken a beating over a lack of transparency in closed-door meetings on Internet rules.

    "There's a role to play on certain meetings where ex parte comes into play," said VanRoekel. "We tend to use ex parte as a mechanism for understanding. We ask vendors specific questions. Many times there are questions that involve their intellectual property."

    The agency has since ended closed-door meetings, but the episode highlights the complexity of enacting new regulations in the current media climate.

    Yesterday, in fact, as the New York Times reported, the FCC released a public notice seeking more input on open Internet rules, which the agency duly tweeted out as a PDF. The document is embedded below.

    Ars Technica and others criticized the agency for asking more questions instead of taking action.

    Will the FCC get net neutrality right? Hard to say. The Center for Democracy and Technology, by way of contrast, endorsed the FCC's focus on key issues in the net neutrality debate "as a good sign that the FCC is rolling up its sleeves to grapple with the most contentious issues." As my colleague Andy Oram pointed out this week, the net neutrality debate depends upon what you fear. The only safe bet here is that the agency is likely to get a fresh batch of public comments for the record.

    Addressing the digital divide

    Online debates over net neutrality or the proposed broadband plan leave out a key constituency: the citizens who do not have access to the Internet. The information needs of communities in a democracy were the focus of the recent report by the Knight Commission.

    To that point, VanRoekel spoke with the Sunlight Foundation's executive director, Ellen Miller, earlier this year about how everyone is changing everything. Their conversation is embedded below:

    In our interview, VanRoekel focused on how mechanisms of community activation can be used to include disconnected people.

    VanRoekel pointed to the growth of mobile access and social media uptake in communities that have traditionally been less connected. That focus is substantiated by Pew Research showing citizens turning to the Internet for government data, policy and services, particularly in minority communities.

    Traditional outreach is still viable as well. Community organizers can reach people on the ground and involve key constituencies. "We also can go back to 800 numbers," he said, "using voice to offer access and adding the ability to enter into the public record."


    August 27 2010

    Applying the lessons of Enterprise 2.0 to Gov 2.0

    Last year, MIT professor Andrew McAfee published a landmark book on the business use and impact of social software platforms titled Enterprise 2.0: New Collaborative Tools for Your Organization’s Toughest Challenges. The book is a collection of McAfee's research since the spring of 2006 when he coined the phrase Enterprise 2.0. Shorthand for enterprise social software, Enterprise 2.0 is the strategic integration of Web 2.0 technologies into an organization's intranet, extranet, and business processes. Those technologies, including wikis, blogs, prediction markets, social networks, microblogging, and RSS, have in turn been adopted by government agencies, a phenomenon that falls under the mantle of Gov 2.0. As the use of such technology has grown, Congress is now considering the risks and rewards of Web 2.0 for federal agencies.

    The insights McAfee has gained from years of research into the use of social software by large organizations have broad application to understanding how and where technology will change government, and they are the basis for his talk, New Collaborative Tools for Government's Toughest Challenges, at the Gov 2.0 Summit in Washington, D.C. I spoke in detail with Andrew, and anyone interested in understanding how social software is being used in large organizations will find the full half-hour audio interview of great interest.

    Below are the questions I asked, and timestamps for the audio of where they start if readers want to jump ahead.


    How is Enterprise 2.0 different from Web 2.0? And how does it apply to so-called Government 2.0? What do rules and regulations mean for the growth of social software? What does this mean for open government?
    (Answer begins at 4:55)

    Does automated filtering hold promise for government or the enterprise to prevent sensitive information from leaking? (Answer begins at 7:13)

    Do reports of exfiltration of data from intelligence agencies mean collaborative software is a risk? (Answer begins at 8:35)

    One of the examples in Enterprise 2.0 is Intellipedia. What lessons does its creation and evolution hold for the intelligence agencies? What about other government entities? (Answer begins at 9:52)

    My interview with Sean Dennehy and Don Burke, the two CIA officers who have spearheaded the Intellipedia effort since its inception, is embedded below:

    One of the most interesting parts of the book, for me, was the discussion of ideation platforms and collective intelligence. Government agencies are really running with the concept, including an upcoming government-wide challenge platform; Innocentive shows another model. But does crowdsourcing really work? When, and under what conditions? What are the lessons from the private sector and academia in that regard? (Answer begins at 15:00)

    You can read more about how game mechanics and crowdsourcing were combined to solve a complex challenge at Professor McAfee's blog.

    What are the most common mistakes in implementations of social software, or ESSPs as you call them? Specifically, how do you set up effective crowdsourcing platforms? (Answer begins at 19:10)

    What did the MIT "balloon team" that won the DARPA Network Challenge do right? (Answer begins at 21:09)

    What challenges and opportunities does the incoming millennial workforce hold for government and business with respect to IT? What does research show about how boomers, Gen Xers, and millennials interact, collaborate and work? Are there some myths to bust with respect to entrepreneurship and innovation? (Answer begins at 23:29)

    What are the cultural issues around adoption of Enterprise 2.0 and Gov 2.0? (Answer begins at 27:07)

    What does your new research on the strategic implementation of IT in large enterprises show to date? Why does government lag the private sector in this area, in the so-called "IT gap?" What could be done about it? (Answer begins at 30:03)

    August 23 2010

    Cost is only part of the Gov 2.0 open source story

    Bryan Sivak, chief technology officer for the District of Columbia and a speaker at the upcoming Gov 2.0 Summit, has smartly mixed healthy realism with enthusiastic support for open source in government. The result is a message that resonates beyond open source evangelists.

    For example, here's what he recently had to say about the allure of open source cost savings:

    "I don't think cost savings of open source is the panacea that everyone thinks it is. It's true that there's no upfront licensing cost, but there's cost in figuring out the appropriate implementation strategy, making sure you have the people with the right skills on staff, and making sure you're able to maintain and manage the system. You need to put a lot into how you implement it."

    Acknowledging the limits of open source savings is key to ongoing use. It's all about managing expectations: If I expect 100 percent savings and your open source solution only offers 50 percent, I won't be pleased. But lead with the real story and show me the other benefits and maybe I'll commit for the short- and long-term.

    Speaking of those other benefits: Sivak noted during our interview that open source's real upside lies in its ability to expand the talent pool and take government transparency to a new level.

    "You can get people to help you build things who would not normally be involved in that process," Sivak said. "In a weird sort of way, we're actually taking this concept of government openness and transparency and making it even more open and transparent. We're saying: 'Here are our business processes. Here are the things we need to accomplish with this tool or this solution. Help us accomplish this.'"

    Bryan Sivak will discuss code sharing between cities at the Gov 2.0 Summit, being held Sept. 7-8 in Washington D.C. Learn more and request an invitation.

    Sivak explored a host of related ideas in our full discussion, including:

    • Why sharing software and technology projects between local, state and federal governments could solve a "multi-billion dollar problem."
    • How Code for America's Civic Commons project is applying the lessons of Linux to open government. "If we can create a foundation that makes it easy for governments to adopt this open source stack, and everything that goes along with it ... then we've got a winner," he said.
    • And finally: Why he thinks "Gov 2.0" might need a new name.

    The following video contains the full interview:


    Bryan Sivak will discuss the Civic Commons project at the Gov 2.0 Summit, being held Sept. 7-8 in Washington, D.C. Request an invitation to attend.

    August 17 2010

    Tracking the tech that will make government better

    Will crowdsourcing and next-generation data mining tools enable the federal government to find innovative solutions to grand challenges and reduce fraud?

    Last week, the Senate Committee on Homeland Security and Governmental Affairs' Subcommittee on Federal Financial Management, Government Information, Federal Services, and International Security held a hearing on "Transforming Government Through Innovative Tools and Technology" that looked at the potential for technology to improve government transparency and accountability. The first part of the hearing featured testimony from Daniel Werfel, controller of the Office of Federal Financial Management within the Office of Management and Budget, and Earl Devaney, chairman of the Recovery Accountability and Transparency Board (RATB). Their written testimony and an archived webcast are available online.

    Riley Crane, a post-doc at the Media Laboratory Human Dynamics Group at the Massachusetts Institute of Technology, shared one of the most successful examples of crowdsourcing in history: the strategy that led to the MIT balloon team's victory in the DARPA Network Challenge.

    "For the first time, we can bridge the gap between online and the real world," testified Crane. A challenge "thought impossible by the intelligence community using traditional techniques" was solved in 8 hours and 52 minutes, said Crane. "We leveraged the problem-solving capabilities of the participants," said Crane, and "built the infrastructure that allowed others to solve the problem for us." As Brian Ahier pointed out on his blog on healthcare IT, Crane praised Tim O'Reilly's "government as a platform" concept and Gov 2.0 principles in his Senate testimony:
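    As the MIT team later documented, the winning strategy hinged on a recursive incentive: each balloon's $4,000 prize was split down the referral chain, with the finder getting $2,000, the person who recruited the finder $1,000, that person's recruiter $500, and so on. A quick sketch shows why the total payout stays bounded no matter how long the chain grows:

    ```python
    def payouts(chain_length, finder_share=2000.0):
        """Halving payouts up the referral chain: finder, recruiter, recruiter's recruiter..."""
        return [finder_share / (2 ** i) for i in range(chain_length)]

    # Even an arbitrarily long chain sums to less than 2 * finder_share ($4,000),
    # since 2000 + 1000 + 500 + ... is a geometric series.
    print(payouts(4))        # [2000.0, 1000.0, 500.0, 250.0]
    print(sum(payouts(50)))  # approaches, but never exceeds, 4000
    ```

    That bound is the design's cleverness: rewarding recruiters made the network spread explosively while keeping the per-balloon budget fixed.
    
    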

    More on the potential of crowdsourcing and how open data analysis is improving fraud detection after the jump.

    Tools for government transparency

    Devaney asserted that stimulus tracking technology is the model for government transparency. As Gautham Nagesh reported for The Hill, Devaney said that "the amount of fraud he has seen in the Recovery Act is significantly below what he would expect on a program of its size. He argued that the transparency inherent in the program has acted as a deterrent to scam artists, keeping fraud down to a minimum. He said the RATB uses software to identify risk factors associated with particular awards, contractors or grant recipients, then refers the information to specific inspector generals for further investigation."

    And as Aliya Sternstein reported for Nextgov, "compu-forensics" saved stimulus funds, and Devaney, from some pointed questions by John McCain on how the RAT Board's data analytics has saved money or resulted in convictions. Devaney said that there are more than 350 ongoing criminal investigations into fraud and abuse of Recovery Act funds, and, while none have resulted in convictions yet, he expects to see resolution on some of them within six months.

    Devaney also noted that the Recovery Board put up two separate websites in under six months, and "Government usually takes years to do that," he said. "We didn't do it the way the government usually does it but that's probably what made it work."

    Challenges with data accuracy and clarity in project updates continue, however, despite Devaney's assertions. That said, there's more to the story than data or website infrastructure. As Jason Miller reported for Federal News Radio, the Recovery Board's success inspires others within government. It's not just the "RAT Board's" success in moving to the cloud or redesigning the site: it's the use of data visualization software for fraud detection.

    This tool is a key component of the enforcement of the Obama administration's "do not pay" list, which was announced in June of this year.

    Crowdsourcing and the power of open data

    For those who weren't familiar with the tool that Werfel and Devaney described, the second panel that testified before the Senate provided the answer: Palantir Technologies. Until TechCrunch's recent coverage, in which Evelyn Rusli called Palantir the next billion-dollar company, the data analysis software developer was operating under the radar, at least outside the Beltway.

    Analysts within the intelligence agencies are using Palantir to fight cybercrime. Transparency wonks are exploring public datasets with AnalyzeThe.Us. And the Department of Health and Human Services is using Palantir internally to detect fraud within the Medicare system. That innovation, incidentally, is precisely why Palantir is in the technology spotlight at the Gov 2.0 Summit next month.

    "We're specialized in the least glamorized part of finding fraud," testified Alexander Karp, founder and CEO of Palantir. "Palantir is based on a methodology that reduced fraud at PayPal from something that takes thousands of hours to something that could be done in real time."

    When applied within government agencies and enterprises, Palantir helps non-technical analysts see latent patterns in open data. Effectively, it is a platform that allows subject-matter experts to perform highly sophisticated analyses. "The inspector general community has never had these tools before," said Devaney in his testimony.

    Karp's written testimony is embedded below:

    Following Karp, Rob McEwen, founder and former chairman and CEO of Goldcorp, told a story about the power of crowdsourcing that will be familiar to readers of "Wikinomics." As described in this excerpt, Goldcorp published online every element of geographic data the company held, investing nearly $1 million in prize money and website development.

    Virtual prospectors flocked to the site and, in time, identified more than 100 sites on a 55,000-acre property that yielded 8 million ounces of gold. "Incentives can be much more than cash," said McEwen. "Nobody is as smart as everybody. The biggest goldmine in the world exists between everyone's ears."

    After McEwen, Crane offered insights into MIT's win in the DARPA Network Challenge, as described above. I talked to Crane later about crowdsourcing and government. Here's our short interview:

    Crane's written testimony is embedded below:


    The link between technical innovation and government improvement will be explored at the Gov 2.0 Summit, being held Sept. 7-8 in Washington, D.C. Request an invitation.

    August 11 2010

    Hearing those digital cries for help

    New research from the Red Cross shows that people increasingly rely on social media to seek help in a disaster. As ReadWriteWeb reported, the Red Cross survey found that 74 percent of social media users expect help within an hour.

    Tomorrow in Washington, D.C., the Red Cross will convene an Emergency Social Data Summit, bringing together representatives from the White House, technologists, first responders, non-governmental communities, and citizens to "address how to reply to these digital cries for help more effectively."

    What's at the heart of this phenomenon? Simply put, the Internet is helping disaster response evolve -- and quickly. In the video below, NPR's senior social strategist, Andy Carvin, talks about how people all over the world are collaborating to help in a crisis.

    After the jump, learn more about the summit, the power of platforms for collective action, and the rising adhocracy that empowers citizens to help one another online.

    Convening the Emergency Social Data Summit

    The agenda includes Gail McGovern, president and CEO of the American Red Cross; @WhiteHouse's Macon Phillips; FEMA administrator Craig Fugate; uberblogger Robert Scoble; podcamper Christopher Penn; CrisisCommons' Heather Blanchard and Noel Dickover; Ushahidi's Patrick Meier; and dozens of others who have been involved in disaster response using social media, including myself. Beth Kanter, a noted authority on "networked nonprofits" and social media, wrote about the emergency social data event on her blog.

    In January, after the Haiti earthquake struck, anyone participating on social networks couldn't help but notice the many Tweets and Facebook status messages about the disaster. The messages included pleas for support and retweets of the news, but the stream also carried pleas from people on the ground in Haiti asking for emergency assistance, or letting loved ones and friends know they were okay.

    Social media has radically changed how people communicate, including their calls for help. As we have seen in natural disasters from Hurricane Katrina to the Chile earthquake, people are using social media to reach out for help. And they expect a response from emergency and disaster response organizations. To meet this growing challenge, the American Red Cross is launching an initiative to address how to reply to these digital cries for help more effectively.

    Kanter's company Zoetica, and co-founders Geoff Livingston and Kami Huyse, have been working with the Red Cross on the summit for months. As Kanter pointed out, the initiative includes more than the Emergency Social Data Summit itself, which will have an accompanying backchannel on Twitter at the #crisisdata hashtag.

    As has been the case for disaster communications generally, "the Summit will use both established and more experimental social media tools and platforms to involve people who are not in the room in the discussion," wrote Kanter. Along with Twitter, those tools include:

    One such tool Kanter described as "a geo-location crowdsourced storytelling application." Conference attendees will have the opportunity to join an "Emergency Data Society" on the service, which Kanter said "will facilitate a self-organized, community scrapbook of the event from attendees."

    The power of crisis response platforms

    The Red Cross has posted the first three chapters of a white paper based on the Summit's themes at the emergency social data blog, including the case for integrating crisis response with social media, how social media has changed news gathering during disaster response, and the crisis collaboration movement, which documents the growth of Crisis Commons from camps in 2009 to a globally distributed platform. All three of these posts are thorough investigations of a shift from a citizenry limited by a broadcast model and disaster fatigue to an empowered, participatory public.

    I'm humbled to have contributed to documenting the first Crisis Camp Haiti and subsequent efforts this spring and summer, and to have attended the first international Crisis Congress. As aspirations translated to code, a movement of "geeks without borders" spread around the globe. More recently, responding to the Gulf oil spill, Crisis Commons delivered Oil Reporter, an open data initiative that provided a "suite of tools and resources that encourages response organizations to capture and share data with the public."

    The energy, passion and innovation that collectively drive Crisis Commons are possible because we're in a unique historical moment. Hundreds of millions of people online can see disasters like the Haiti earthquake unfold on the real-time Web. Unlike the natural disasters or man-made crises of the past, however, citizens, government, media and developers can now do more to help those affected, whether by mobile donations, crisis mapping, timely tweets or random hacks of kindness.

    Given the scope of the crises that humanity faces, the power of social software to empower citizens is of critical interest to many constituencies. After tomorrow's summit concludes, I'll be looking forward to hearing about making states work better from Clare Lockhart, Steve Killelea and Ory Okolloh at the Gov 2.0 Summit in Washington.

    The challenges, successes, and opportunities presented by new platforms for civic engagement and empowerment call into question recent reports of crowdsourcing losing steam. The increasing use of online platforms for civic engagement as platforms for civic empowerment hints at what might be possible in the future, as more sophisticated tools are developed for an increasingly connected humanity. After Haiti, collaborative action between government, developers, citizens, and NGOs is no longer an academic theory: it's a proud part of our history. The "adhocracy" that Alvin Toffler presciently described in 1970 has come into being through the power of networks. To put the power of that possibility in perspective, here's Tim O'Reilly speaking at OSCON:

    And here's Andy Carvin's talk and slides on The New Volunteers: Social Media, Disaster Response And You:

    I hope you'll tune in to the Emergency Social Data Summit tomorrow.


    August 06 2010

    Gov 2.0 Week in Review

    Last week, I watched Tim O'Reilly talk about Gov 2.0 and Code for America with Shira Lazar on CBS News. Their interview focused on how Gov 2.0 uses the technology and innovation of Web 2.0 to address the needs of government. As Lazar put it, the future of government is in your hands.

    The call to civic action implicit in that vision for Gov 2.0 may resonate with Generation Y in unprecedented ways. New research on millennials and "the generation gap in government" by the Center for American Progress (CAP) suggests a majority of millennials "would be more likely to support political candidates who embrace improving government performance, effectiveness, and efficiency." The poll from CAP suggested, in a larger sense, that Americans want better government, not smaller government.

    The potential for open government, open data and innovative technology to empower citizens, save costs and inform better policy is compelling in the summer of 2010. Consider the video case for open transit data below, or the rest of the news in this week's Gov 2.0 Review after the jump. As William Gibson has observed, "the future is already here - it's just not evenly distributed."

    Open Government

    Open government is a mindset. Will it be sustainable over the long term?

    A "historic milestone in making government more open" went live this summer when the new Federal Register beta launched. As deputy White House CTO Beth Noveck observed, "Federal Register 2.0" is "collaborative government at its best."

    The Secretary of Education announced the Learning Registry, a government-wide initiative that would create a platform for educational content across agencies. Steve Midgley blogged about the Learning Registry. "It's an important and challenging opportunity that raises hard technical questions about federated search and hard process questions about cross-agency collaboration," said Noveck. Social Security announced an open government video contest on "how Social Security has made a difference" in citizens' lives.

    And speaking of video contests, the winner of the EPA's "Rulemaking Matters - Let Your Voice Be Heard" video competition is worth a watch if you're interested in the ideas behind it.

    The video below, also from the contest, is a fine example of "Schoolhouse Rock for Rulemaking," as deputy White House CTO Beth Noveck observed on Twitter.

    The state of open government and transparency in Ireland, as that government opens the processes of democracy to scrutiny, is also worth considering.

    The signing and release of the Law.Gov core principles show what a proposed distributed repository of all primary legal materials of the United States could be.

    The schedules of the President and Vice President of the United States are now available to the public online.

    Open government also means understandable and usable government communications. NPR's Liane Hansen interviewed Dave McClure from the GSA about "America's Website Newly User-Friendly."

    Open Data

    "A year ago, two representatives from the Massachusetts Department of Transportation (MassDOT) met with a group of developers and interested citizens to talk about opening up public transportation data, including subway, bus, commuter train, boat, highway information, and RMV," said Laurel Ruma, O'Reilly's Gov 2.0 Evangelist via email. "MassDOT wanted to know what data developers would find interesting, figure out how best to serve it up, and get a feel for what the developer community would do with it. Many meetings, two contests, and a holiday party later, the results have been outstanding: visualizations, applications, signs, and even an IVR system built from scratch (see MassDOT Developers Page)."

    In just one year, said Ruma, "Boston has gone from having stacks of paper schedules to real-time feeds for 135 out of 185 bus lines in the MBTA system (the rest will be available by the end of summer). Not only did the state government give citizens what they wanted, but they encouraged an innovation economy, built community, made the papers, and, in general, built goodwill."

    Way to go, Bay State.
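    The appeal of real-time transit feeds like the MBTA's is easy to make concrete. Below is a minimal sketch of turning vehicle data into a "next bus" estimate; the record layout and field names are invented for illustration and differ from the actual MassDOT/MBTA feed format.

    ```python
    # Hypothetical real-time feed records; the real MBTA feed format differs.
    feed = [
        {"route": "39", "stop": "Forest Hills", "seconds_away": 540},
        {"route": "39", "stop": "Forest Hills", "seconds_away": 120},
        {"route": "1", "stop": "Harvard", "seconds_away": 300},
    ]

    def next_arrival(feed, route, stop):
        """Return minutes until the soonest vehicle on `route` reaches `stop`,
        or None if no vehicle is reporting for that route and stop."""
        waits = [r["seconds_away"] for r in feed
                 if r["route"] == route and r["stop"] == stop]
        return min(waits) / 60 if waits else None

    print(next_arrival(feed, "39", "Forest Hills"))  # 2.0 (minutes)
    ```

    This handful of lines is essentially what the countdown signs, IVR systems, and smartphone apps built on the MassDOT data do at their core: filter a live feed and report the minimum wait.
    
    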

    "Data Are Not Information," wrote Jeff Stanger, exploring the relationship between open data and open government.

    Open data needs open source tools, argued Clay Johnson. Johnson also points to a thoughtful post by Dan McQuillan that contends that open data does not empower communities.

    Finally, consider a compelling piece by Mark Headd at Govfresh on Delaware's progress towards Gov 2.0, "the opposite of open government." As Headd writes, "When it comes to environmental data, and data on contaminated groundwater, open government is not about citizen convenience or improved government efficiency. It is about giving people the information they need so that they can make informed decisions about their own lives and the lives of their families and children."

    Government: Is There An App for That?

    The nation's largest library celebrated the launch of its first official mobile app when the Library of Congress released an iPhone app.

    General Sorenson, the Army's CIO, announced the winners of the Apps for the Army contest in Florida. The contest more than doubled expected participation, writes Peter Corbett.

    Mobile applications from government or that work with government data are changing the way citizens navigate the world. Forbes recently shared a great list of ten socially responsible mobile apps.

    Federal Computer Week also published a solid selection of government Web apps that get results, including the Twitter earthquake detector, the State Department's Haiti tech resource page, USAID's Global pulse and more.

    Gov 2.0 and Accessibility

    The power of technology and equality came into sharp focus this year on the 20th anniversary of the Americans with Disabilities Act.

    Gov 2.0 and Web 2.0 are at odds over accessibility in Australia, where a low-bandwidth, text-only Web is key to open government goals and addressing the digital divide. Australia’s CIO urged civil servants to become “Gov 2.0 activists” and shared some tough talk on accessibility.

    In the United States, the FCC's redesigned website faced a tough critique of its accessibility.

    "The new F.C.C. team aims for accessibility," reported Politico in a profile of the folks behind the redesign.

    Wikileaks, Wookieleaks and Secrecy

    This summer, the Robin Sage experiment obtained a photo of a soldier in Afghanistan with embedded location data, reminding everyone of the national security risks of Gov 2.0 and the social Web.
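    The "embedded location data" in a photo like that is EXIF metadata, where GPS coordinates are stored as degrees, minutes and seconds plus a hemisphere reference. The sketch below shows how trivially such a geotag resolves to a map point; the coordinates are made up for illustration.

```python
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style GPS coordinates (degrees/minutes/seconds plus a
    hemisphere reference such as 'N' or 'W') to signed decimal degrees."""
    decimal = degrees + minutes / 60.0 + seconds / 3600.0
    return -decimal if ref in ("S", "W") else decimal

# A hypothetical geotag of the kind a smartphone camera embeds automatically:
lat = dms_to_decimal(34, 31, 12.0, "N")
lon = dms_to_decimal(69, 10, 48.0, "E")
print(round(lat, 4), round(lon, 4))  # -> 34.52 69.18
```

    Stripping these tags before a photo is shared is a one-line operation in most imaging tools, which is what makes the oversight so common.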

    The unprecedented release of more than 92,000 documents in the Afghan War Diaries by Wikileaks is a powerful reminder of the power of technology to disrupt traditional information flows. For more on Wikileaks and context, make sure to read the Nieman Lab's excellent week in review. The photography of Afghanistan on display at Boston.com's Big Picture photoblog adds visual context for what's at stake.

    "The release of these documents has not affected the strategy. Many of them were very, very old," said Admiral Mullen on "Meet The Press." Both he and Defense Secretary Gates were extremely critical of the release of names and locations online and the moral culpability of the site. "They have put this out without any regards whatsoever for the consequences," said Secretary Gates this week.

    Wikileaks could change the way reporters deal with secrets.

    The relationship between Wikileaks, government 2.0 and media hurricanes is likely to be hotly debated for months to come. The story continued to evolve this week when the Pentagon threatened to compel Wikileaks to hand over the Afghan war diaries. Given Wikileaks' distributed architecture, that may be unlikely.

    What is unquestionable, however, is that the "Wookieleaks" meme that exploded onto Twitter quickly produced more tweets than War Log documents. Marc Ambinder called Wookieleaks the "best hashtag ever," while NPR reported that Wookieleaks was popular because geeks like to go deep on things. The best of Wookieleaks certainly shows that geeks have a sense of humor.

    Facebook, Privacy and Government

    Yes, Mr. Zuckerberg went to Washington, where Facebook faces online privacy concerns. For more on the online privacy debates in Washington, including hearings where Facebook's CTO and CSO testified before Congress, read my most recent post.

    A White House proposal, reported by the Washington Post, that would "ease FBI access to records of Internet activity" is a reminder that governments themselves have complicated relationships with electronic privacy. So is the news that the United Arab Emirates and Saudi Arabia would move to block BlackBerry messaging. Putting RIM's 'security' challenges in perspective is important, as is government's own record on online privacy.

    Does Government Get Social Media?

    Government is leveraging the Internet as a platform for communications in unprecedented ways. President Obama's YouTube addresses are the fireside chats of the 21st century, as I was reminded when I watched his most recent message.

    Despite the new media prowess of the White House, however, the Knight Commission blogged that Ambassador Rice and the U.S. mission to the UN are now on Facebook but aren't replying to comments there.

    One agency that does get social media is NASA. The agency hosted another NASA Tweetup in Washington, which was streamed live online on NASA TV. The event featured @astro_tj, who was the first man to tweet from space.

    TweetCongress, which tracks which elected officials are tweeting, reported that over 200 members of Congress are now on Twitter. A new study on Twitter in Congress asserted that Democrats use Twitter for transparency, while Republicans use it for outreach.

    For a useful perspective outside of the United States, First Monday published a terrific Gov 2.0 case study in government and e-participation at Brazil's House & Presidential websites.

    And in a novel use of crowdsourcing, Delhi police are using Facebook to track scofflaw drivers, in the latest example of Clay Shirky's "Cognitive Surplus" at work. For a reminder of that concept, check out the TED Talk below.

    Open Source and Government

    Lockheed-Martin went open source, posted Red Hat's Gunnar Hellekson, but "tinfoil hats abound." Check out the open source project for more.

    This past week, a military open source unconference in Washington explored innovation in this space, from code that disappears in the government to a CIA software developer who went open source.


    If you missed OSCON in Portland, Oregon, several videos that discuss open source and government are worth watching.

    Jennifer Pahlka of Code for America and "Coding the Next Generation of American History"

    Bryan Sivak, DC CTO, on the District of Columbia on open source

    Mayor Sam Adams of the city of Portland on "America's open source city"

    Gov 2.0 Summit Draws Near

    As Tim O'Reilly wrote this morning, the upcoming Gov 2.0 Summit in Washington will be about opening the doors of government to innovation. He argues that open government spurs innovation. This year, education and health care will be key themes.

    Gov 2.0 Bits and Bytes

    If you missed it, President Obama demonstrated a new government website in an online video.

    As Macon Phillips observed, this was not your ordinary website demo. When the President uses online video to go straight to the American people to explain a new online resource, it's noteworthy.

    In late July, there was a Virtual summit on Apps for Local Government.

    An oil pipeline leak in Michigan prompted Crisis Commons to ask for volunteers to upload URLs of Kalamazoo River resources.

    The State Department hosted a "Tech@State" event focused on "mobile money."

    The Department of Transportation launched IdeaHub, an "online community where employees can post ideas for their colleagues to comment and build upon."

    And the Sunlight Foundation launched a new Congress Android app; an iPhone app is also available.

    Opening the doors of government to innovation

    When I organize a conference, I don't just reach out to interesting speakers. I try to find people who can help to tell a story about what's important and where the future is going. We've been posting speakers for the second annual Gov 2.0 Summit in Washington DC Sept 7-8, but I realized that I haven't told the story in one place. I thought I'd try to do that here.

    First off, our goal at the Gov 2.0 Summit is to bring together innovators from government and the private sector to highlight technology and ideas that can be applied to the nation’s great challenges. In areas as diverse as education, health care, energy, jobs, and financial reform, there are unique opportunities to rethink how government agencies perform their mission and serve citizens. Social media, cloud computing, web, and mobile technologies -- all provide unique new capabilities that government agencies are beginning to harness to achieve demonstrably better results at lower cost.

    Our focus this year is on opening the doors to innovation - learning about the latest technology and its application, and breaking down the barriers to its adoption.

    Here are some of the themes we're exploring:

    1. The Power of Platforms

    If there’s one thing we learn from Apple’s iPhone, it’s the power of a platform to spark innovation. Apple revolutionized the smartphone market not just by producing an innovative phone, but by opening up that phone to independent developers. As if by magic, the 15 to 20 applications they designed and released themselves soon became hundreds of thousands, in a textbook demonstration of just what can happen when you harness the power of the marketplace.

    So too, government programs can be designed as platforms rather than as fully-specified applications. In this section of the program, we look at some key areas where government is demonstrating strategic mastery of platform thinking, as well as at some innovative private sector programs that can be adapted for government use.

    We'll hear from speakers including:

    • Harlan Yu of Princeton, one of the authors of the paper Government Data and the Invisible Hand, which outlines the rationale for opening up government data in machine-readable form.

    • Jim Traficant, who is not only the vice president in charge of the Healthcare Solutions group at Harris Corp, but has intensely personal reasons to believe in the importance of electronic medical records: they saved his life. Twice. He’ll tell us why electronic medical records can and must transform our health care system.

    • XBRL US CEO Mark Bolgiano and the Department of Homeland Security’s Executive Director for Information Sharing (and NIEM Executive Director) Donna Roy, who will share early success stories in using XBRL (Extensible Business Reporting Language) and NIEM (National Information Exchange Model), and suggest how they can be used to increase transparency and visibility into “big data” in the private and public sectors, and where they intersect. I'm particularly excited by Mark's thoughts on how to track programs that are funded by the Federal government but actually administered by states or even local jurisdictions. As in healthcare, electronic reporting creates the possibility of feedback loops analogous to those that we've long enjoyed in creating web applications that get smarter the more people use them.

    • Todd Park, CTO of the Department of Health and Human Services, who has a vision of how health care data can be used to create a “holy cow machine” that will let us reduce health care costs and improve health outcomes in the same way that Walmart improves its inventory efficiency or Google improves ad targeting. He’ll talk about how aggregate data about health outcomes is unleashing a torrent of innovation, as we move from paying for volume of care to paying for the value of care in improving actual health outcomes.

    • Clay Johnson, former head of Sunlight Labs, and Indu Subaiya of the Health 2.0 Developer Challenge, who will address the question of how government open data initiatives can best reach out to developers. Developers are the heart and soul of every platform. You can't just "build it and they will come." You have to take practical steps towards developer evangelism.

    I'll talk about some of the speakers in the other parts of the program next week, but as a teaser, let me highlight some of the other themes we're exploring.

    2. Innovation

    Real innovation doesn’t just mean tinkering around the edges. It means remembering your goals, and finding a new way to get there. In this series of sessions, we’ll explore some of the most exciting new sources of innovation, and how they can be harnessed by government. We'll also take a close look at education, one of the foundations of our innovation economy, bringing some fresh voices to the innovation debate.

    3. Improving Government Effectiveness

    It isn’t enough to be innovative. Government agencies also need to be effective. In this series of sessions, we’ll explore topics such as cost savings, efficiency, and customer service.

    4. Empowering Citizens

    “We the people...”, the opening of the US Constitution, is a reminder that our government is nothing other than an expression of the collective will of the citizens. No divine right of kings, no entitled nobles, just we, the people. And government is a mechanism by which we express our will. A mechanism that is being turbocharged by the participatory technologies of the web, social media, and mobile phones. We'll explore how to rethink the role of government in the age of electronic participation.

    5. Identity, Privacy, and Informed Consent in the Age of the Internet

    Many of today’s most powerful technologies depend on trust - trust that when a consumer or citizen provides information, either explicitly or implicitly, to a web or mobile application, that information won’t be misused. Trust is essential, because in order to receive the benefits of social, mobile, and real-time applications, consumers must provide information that has the potential to be misused - their location, their friends, what they are doing, what they are buying, what they are saying, what medications they are taking, how much energy their homes and businesses are using, and much more. The answer is not to treat this information as a kind of toxic asset, and build Maginot lines to protect it, but to build policy frameworks around acceptable use, and penalties for misuse. We'll explore where the technology is leading us and what those policy frameworks might be.

    I'm really excited to have such an amazing blend of industry AND Federal heavyweights on the program and in the audience because it gives us an opportunity to explore what the latest technology means for the crafting of future policy and strategy. We've got CTOs and other key executives from major technology companies, including Cisco, VMWare, PayPal, IBM, and Facebook, and their opposite numbers at the Department of Health and Human Services, the Department of Education, the Department of Energy, and the White House Office of Science and Technology Policy. We've also got innovative small companies, educators, and deep thinkers about the future, all with a shared goal of making things work better.

    I'll share more detail on some of the other program themes and speakers over the next few weeks.


    The Gov 2.0 Summit will be held Sept. 7-8 in Washington, D.C. Learn more and request an invitation.

    Online privacy debates heat up in Washington

    The issue of electronic privacy is as hot as the weather in Washington this summer. Last week, both the House and Senate held hearings on online privacy, featuring testimony from top executives from Facebook, Apple and Google. The Washington Post published a massive investigative report on the growth of "Top Secret America" in July. And the Wall Street Journal's new series on online privacy, "What They Know," has laid bare the scope of tracking technology on the Internet. In the Information Age, our family, friends, neighbors, employers and government all have unprecedented abilities to watch one another, changing the nature of our relationships, workplaces and schools. Solving the privacy dilemma will be difficult, controversial and crucial for the nation's citizens, businesses, regulators and lawmakers.

    Can privacy, social media and business get along? The balance between privacy and societal benefit will be difficult to strike, given the potential to harm civil liberties and the need to preserve the expectation of privacy described by the Fourth Amendment of the Constitution.

    The risks and rewards for federal agencies that adopt Web 2.0 technology weigh government innovation and cost savings against potential security and privacy issues. The risks that social networking poses to consumers, businesses and government aren't theoretical, either, given the expansion of cybercrime and non-state actors targeting online users.

    Given the context of the moment, putting online privacy in perspective will be a key component of the upcoming Gov 2.0 Summit in Washington, where Jules Polonetsky, Tim O'Reilly, John Clippinger and others will discuss privacy in the context of innovation and government. Despite the ongoing debate about Facebook privacy controls, electronic privacy reaches much further than social networking. From cellphone tracking to data brokers to cloud computing to credit bureaus to satellite imaging, our daily lives have become more scrutinized than at any point in human history.

    Digital privacy matters more than ever to citizens in every country, particularly where autocratic regimes are not constrained by the laws enacted in the United States. As privacy watchdogs like the Electronic Frontier Foundation and Electronic Privacy Information Center have highlighted, law enforcement officials have pushed privacy boundaries in the name of national security.

    Privacy is not dead, though it's fair to say that some of the norms around how, where and when people are sharing information about location, purchases, reading habits or relationships have shifted for some populations. For marginalized members of society, privacy breaches can have tragic outcomes, as danah boyd has discussed in her thoughts on Facebook and radical transparency.

    How might the FTC or FCC regulate online privacy?

    At some point, the Federal Trade Commission (FTC) will also publish the results of the privacy roundtables it held over the past year. The FTC currently protects consumer privacy under the FTC Act, which directs the commission to "guard against unfairness and deception by enforcing companies' privacy promises about how they collect, use and secure consumers' personal information." The FTC also has purview over regulating financial privacy under the Gramm-Leach-Bliley Act, consumer privacy under the Fair Credit Reporting Act and the Children's Online Privacy Protection Act. The Department of Health and Human Services enforces healthcare privacy breaches under HIPAA and now the HITECH Act.

    Specific new guidance for the FTC from Congress on electronic privacy, however, has yet to emerge. One direction for legislation comes from the Digital Due Process Coalition, which advocates for an update to the Electronic Communications Privacy Act, given the considerable changes to mobile, location, cloud computing, webmail and social networking technology since it was enacted in 1986.

    In his testimony on online privacy before the Senate Commerce Committee last week, FTC Chairman Jon Leibowitz acknowledged the online advertising industry's principles for self-regulation, including "results that were encouraging." That said, he observed that "if we don't see more progress, we probably will see more focus on prescriptive rules in the next Congress."

    Leibowitz noted three principles for online privacy that had emerged from the roundtables:

    1. Privacy baked in from the beginning, or "privacy by design"
    2. Simplified controls, including the potential for a "do not track" mechanism in a Web browser
    3. More transparency about how private information would be used
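    The simplified controls in the second principle come down to the browser sending a signal that sites agree to honor. A minimal server-side sketch, assuming the then-proposed `DNT: 1` header convention, might look like this:

```python
def should_track(headers):
    """Return False when the request carries the 'do not track' signal
    (a DNT header set to '1'), so tracking code can be skipped."""
    return headers.get("DNT") != "1"

print(should_track({"DNT": "1"}))   # -> False
print(should_track({}))             # -> True
```

    The hard part, of course, is not reading the header but getting the advertising ecosystem to respect it.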

    In general, the use of private data should be opt-in, not opt-out, said Leibowitz, with clear notice. "Most of the cases that we've brought involve instances where disclosures or use of information was in fine print or designed to be where consumers can't find it," he said. "Good companies want to make these things clear."

    FCC Chairman Genachowski, who testified with the FTC Chairman, noted that "the privacy issues discussed here are not only a fundamental moral issue. To get the economic effects of broadband, people need to be confident that the Internet is a trustworthy place." With adequate information and real choice, Genachowski said that there was an increasing chance that the market will work.

    "With technology changing as rapidly as it does, the lesson should be to pull out core principles and strategies that will work, regardless of how technology evolves," he said.

    Whatever the FTC or FCC do with regards to regulating online privacy, however, is likely to ultimately rely upon Congressional action. The financial regulatory reform bill signed into law by President Obama did not ultimately grant the FTC increased rulemaking authority to regulate online privacy autonomously.

    Online privacy hearings in Congress

    As security technologist Bruce Schneier has argued, privacy and control are closely intermingled. "If we believe privacy is a social good, something necessary for democracy, liberty and human dignity, then we can't rely on market forces to maintain it." wrote Schneier. "Broad legislation protecting personal privacy by giving people control over their personal data is the only solution."

    The nature of that legislation will be on the legislative agenda of Congress this fall after the August recess. Online privacy bills were introduced in the House this year, with varying reactions from industry and privacy advocates.

    Rep. Rush's privacy bill (H.R. 5777) has received strong pushback from tech firms, due to its potential restrictions on firms that store personal data. Rep. Boucher's online privacy bill garnered criticism from both industry and advocacy groups.

    Privacy and security concerns were highlighted at a House Judiciary subcommittee hearing on social networking last week. At the hearing, the FBI's Gordon Snow described a rising tide of online fraud, identity theft and phishing schemes over Facebook and other networks, where cybercriminals compromised accounts by exploiting the higher levels of trust users place in familiar senders.

    "Symantec created more malware signatures in the last 15 months than in the past 18 years combined," said Joe Pasqua, the security software company's head of research. A representative of the Secret Service said that the use of social engineering had increased as more sophisticated cybercriminals used multi-faceted attacks to trick users into divulging sensitive personal information or account details. The 2010 Verizon Data Breach Investigations Report, released on the day of the hearing, was a joint effort with the Secret Service.

    "Facebook and other social networks have the power to enrich people's lives in ways that were unimaginable before," said Facebook Chief Security Officer Joe Sullivan, testifying in the hearing. Sullivan said that Facebook was the first major service that required people to build profiles with their real names, "an important choice that allowed people to become more connected and made the service safer." Sullivan, a former federal prosecutor who specialized in high tech crime in Silicon Valley, asked lawmakers for help with improving cyberliteracy with teachers and students. He also requested broader access to hashes for known images of exploited children that Facebook could run against its database of more than 50 billion pictures, the largest photography collection online.
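    Hash matching of the kind Sullivan requested works by comparing a fingerprint of each uploaded photo against a list of fingerprints of known images. The sketch below uses a cryptographic hash for simplicity; production systems rely on perceptual hashes (such as Microsoft's PhotoDNA) that survive resizing and re-encoding, and the sample bytes here are placeholders.

```python
import hashlib

# Hypothetical list of hashes of known images, as a clearinghouse might share.
KNOWN_HASHES = {
    hashlib.sha256(b"placeholder-known-image-bytes").hexdigest(),
}

def matches_known_image(image_bytes):
    """Fingerprint an uploaded photo and check it against the known list."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

print(matches_known_image(b"placeholder-known-image-bytes"))  # -> True
print(matches_known_image(b"an-unrelated-photo"))             # -> False
```

    At Facebook's scale, a lookup like this would run against every one of those 50 billion photos as they are uploaded, which is why the hashes themselves, not the images, are what gets shared.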

    Sullivan was challenged by EPIC founder Marc Rotenberg, who urged Congress to strengthen privacy laws for Facebook users. Rotenberg said EPIC filed an FTC complaint because of changes in Facebook's privacy settings last December that created exposure that users couldn't control. It was an "unprecedented event in the history of the Internet," said Sullivan, where Facebook "wouldn't let you use the service again until you go through those pages."

    The shift presented users with a mandatory wizard that, as Nick Bilton reported in his story on Facebook privacy, required users to click through more than 50 buttons to fully opt out. "What Facebook needs is not a Geek Squad but a Granny Squad," said Rep. Zoe Lofgren (D-CA) on the need for simpler privacy controls.

    Sullivan emphasized the contextual privacy controls that allow users to specify who an update is shared with on Facebook. "We want people to make decisions for themselves," he said. "Personally, I choose to share quite a bit, and put different levels of visibility on different types of information."

    That personal choice is at the crux of any statutory solution: if people choose to share information freely online or make an error, neither the government nor tech companies can help. Where the code created by technology providers matters, as Rep. Lofgren observed, is in giving people "the opportunity to have their rights respected."

    Perhaps the most meaningful right is whether user information, behaviors or connections are shared publicly or privately by default. As danah boyd said in her keynote address to the 2010 SXSW Festival, "defaults matter." That's even more so for teenagers and children, whose safety was a recurrent concern at every hearing on online privacy I've attended in Washington. Sullivan noted that Facebook's default settings are different for people under 18, "in terms of what we allow them to do or the type of information made visible to them."

    Questions about child predators were pervasive at the Senate hearing on online privacy, where Senator Klobuchar (D-MN) focused much of her questioning on the issue and Senator Rockefeller (D-WV) repeatedly voiced his concerns on that count.

    "Good parenting is right at the center of protecting children online," said the Cato Institute's Jim Harper in reply. "You're not going to come up with a magical tech solution."

    Klobuchar also questioned Google's Alma Whitten about where Google's users learn about its privacy dashboard. Whitten, whose testimony is embedded below, said that she was "adamant when we created the dashboard that it not be strictly a privacy tool. It needed to be useful."

    Senator Kerry (D-MA) focused his questioning upon Facebook privacy controls and deep packet inspection. "It's fair to say there's a lot of confusion," he said, "with a lot of anxiety in the public at large over what power they have over information. It's not just a commercial component. Information collected might be incorrect or out of context, or might be correct and last longer in the market than they might like to."

    AT&T's representative did publicly attest to using deep packet inspection in their network "for trying to find malware, spyware or purposes of network security." In fact, the federal government itself is using deep packet inspection as part of its cybersecurity initiative as part of its EINSTEIN surveillance program. As Harper observed in his written testimony, "if the federal government is going to work on privacy protection, it should start by getting its own privacy house in order."

    Senator McCaskill (D-MO), by contrast, was concerned about consumers' understanding of their electronic privacy when they printed online coupons to be scanned for offline deals. While McCaskill's questions to Google's representative on that count were misplaced, the issue of whether citizens truly understand how much information is being gathered about their online activity is legitimate and goes to the crux of whether the traditional "notice and consent" model of privacy protections in the terms of service for platforms and software will be viable in the future.

    "National surveys at Annenberg show that large numbers of Americans don't understand how database marketing works," testified Professor Joseph Turow of the University of Pennsylvania.

    That prompted Senator Rockefeller to voice a philosophical question: whether "we are dividing ourselves into two classes of people," where some who understand the complexities of online privacy are protected and others who do not understand them pay a price they cannot fully grasp.

    "I used to believe that could be solved by education," said Turow. "I no longer believe that. It's much too complex. Privacy policy is a scavenger hunt, where links send you into other links to affiliates." Turow expressed concern that many people are unaware of how the information they consume has been customized to their stored data. "We're going to have a situation where people receive views of the world based upon what others know about them," he said.

    The digital divide in connectivity extends, in other words, to a divide in cyberliteracy. Transparency is not enough. There is deep validity in the concerns of privacy advocates that civil liberties be preserved in the digital age, including a right to privacy that many constitutional scholars perceive in the Constitution. And yet, there are reasons to praise oversharing, as Steven B. Johnson did in Time earlier this year. There are a growing number of ways that social media helps promote better health. Anonymized community health data can provision healthcare applications that enable citizens and cities to make better decisions. If smart grid privacy concerns can be addressed, more efficient usage could revolutionize the nation's energy infrastructure.
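    "Anonymized" carries a lot of weight in that sentence: before community health data can safely power applications, records are typically generalized until no individual stands out. One simple formalization is k-anonymity, sketched below with made-up records and field names.

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """True if every combination of quasi-identifier values (e.g. ZIP
    prefix plus age band) appears at least k times in the data set."""
    combos = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in combos.values())

# Hypothetical generalized health records: ZIPs truncated, ages bucketed.
records = [
    {"zip3": "021", "age_band": "30-39", "diagnosis": "flu"},
    {"zip3": "021", "age_band": "30-39", "diagnosis": "asthma"},
    {"zip3": "021", "age_band": "30-39", "diagnosis": "flu"},
    {"zip3": "606", "age_band": "60-69", "diagnosis": "diabetes"},
]

print(is_k_anonymous(records, ("zip3", "age_band"), 3))  # -> False
print(is_k_anonymous(records, ("zip3", "age_band"), 1))  # -> True
```

    The last record stands alone in its group, so a publisher would generalize further or suppress it before release.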

    There may well be reasons to give up some privacy, which lawmakers will have to weigh in enacting new laws and regulations to protect consumers. What happens next will be in the hands of technologists, lawmakers and the online community.


    Identity, privacy and informed consent will be discussed at the Gov 2.0 Summit, being held Sept. 7-8 in Washington, D.C. Learn more and request an invitation.
