
April 27 2012

Passage of CISPA in the U.S. House highlights need for viable cybersecurity legislation

To paraphrase Ben Franklin, he who sacrifices online freedom for the sake of cybersecurity deserves neither. Last night, the Cyber Intelligence Sharing and Protection Act (CISPA) (H.R. 3523) was sent to a vote in the United States House of Representatives a day earlier than scheduled. CISPA passed the House by a vote of 250-180, defying a threatened veto from the White House. The passage of CISPA now sets up a fierce debate in the Senate, where Senate Majority Leader Harry Reid (D-NV) has indicated that he wishes to bring cybersecurity legislation forward for a vote in May.

The votes on H.R. 3523 broke down along largely partisan lines, although dozens of both Democrats and Republicans voted for or against CISPA in the final tally. CISPA was introduced last November and approved by the House Intelligence Committee by a 17-1 vote before the end of 2011, which meant that the public has had months to view and comment upon the bill. The bill has 112 cosponsors and received no significant opposition from major U.S. corporations, including the social networking giants and telecommunications companies that would be subject to its contents.

In fact, as an analysis of campaign donations by Maplight showed, over the past two years interest groups that support CISPA, including defense contractors, cable and satellite TV providers, software makers, cellular companies and online computer services, have outspent those that oppose it by 12 to 1.

While the version of CISPA that passed shifted before the final vote, ProPublica's explainer on CISPA remains a useful resource for people who wish to understand its contents. Declan McCullagh, CNET's tech policy reporter, has also been following the bill closely since it was introduced and he has published an excellent FAQ explaining how CISPA would affect you.

As TechDirt observed last night, the final version of CISPA (available as a PDF from docs.house.gov) expanded the scope of the information types that can be collected in the name of security. Specifically, CISPA now would allow the federal government to use information for the purpose of investigation and prosecution of cybersecurity crimes, protection of individuals, and the protection of children. In this context, a "cybersecurity crime" would be defined as any crime that involves network disruption or "hacking."

Civil libertarians, from the Electronic Frontier Foundation (EFF) to the American Civil Liberties Union, have been fiercely resisting CISPA for months. "CISPA goes too far for little reason," said Michelle Richardson, the ACLU legislative counsel, in a statement on Thursday. "Cybersecurity does not have to mean abdication of Americans' online privacy. As we've seen repeatedly, once the government gets expansive national security authorities, there's no going back. We encourage the Senate to let this horrible bill fade into obscurity."

Today, there is widespread alarm online over the passage of CISPA, from David Gewirtz calling it heinous at ZDNet to Alexander Furnas exploring its troubling aspects to it being called a direct threat to Internet privacy over at WebProNews.

The Center for Democracy and Technology issued a statement that it was:

"... disappointed that House leadership chose to block amendments on two core issues we had long identified — the flow of information from the private sector directly to NSA and the use of that information for national security purposes unrelated to cybersecurity. Reps. Thompson, Schakowsky, and Lofgren wrote amendments to address those issues, but the leadership did not allow votes on those amendments. Such momentous issues deserved a vote of the full House. We intend to press these issues when the Senate takes up its cybersecurity legislation."

Alexander Furnas included a warning in his nuanced exploration of the bill at The Atlantic:

"CISPA supporters — a list that surprisingly includes SOPA opponent Congressman Darrell Issa — are quick to point out that the bill does not obligate disclosure of any kind. Participation is 'totally voluntary.' They are right, of course, there is no obligation for a private company to participate in CISPA information sharing. However, this misses the point. The cost of this information sharing — in terms of privacy lost and civil liberties violated — is borne by individual customers and Internet users. For them, nothing about CISPA is voluntary and for them there is no recourse. CISPA leaves the protection of peoples' privacy in the hands of companies who don't have a strong incentive to care. Sure, transparency might lead to market pressure on these companies to act in good conscience; but CISPA ensures that no such transparency exists. Without correctly aligned incentives, where control over the data being gathered and shared (or at least knowledge of that sharing) is subject to public accountability and respectful of individual right to privacy, CISPA will inevitably lead to an eco-system that tends towards disclosure and abuse."

The context that already exists around digital technology, civil rights and national security must also be acknowledged for the purposes of public debate. As the EFF's Trevor Timm emphasized earlier this week, once national security is invoked, both civilian agencies and law enforcement wield enormous powers to track and log information about citizens' lives, without citizens' knowledge or any practical ability to gain access to the records involved.

On that count, CISPA provoked significant concerns from the open government community, with the Sunlight Foundation's John Wonderlich calling the bill terrible for transparency because it proposes to limit public oversight of the work of information collection and sharing within the federal government.

"The FOIA is, in many ways, the fundamental safeguard for public oversight of government's activities," wrote Wonderlich. "CISPA dismisses it entirely, for the core activities of the newly proposed powers under the bill. If this level of disregard for public accountability exists throughout the other provisions, then CISPA is a mess. Even if it isn't, creating a whole new FOIA exemption for information that is poorly defined and doesn't even exist yet is irresponsible, and should be opposed."

What's the way forward?

The good news, for those concerned about what passage of the bill will mean for the Internet and online privacy, is that now the legislative process turns to the Senate. The open government community's triumphalism around the passage of the DATA Act and the gathering gloom and doom around CISPA all meet the same reality in this respect: checks and balances in the other chamber of Congress and a threatened veto from the White House.

Well done, founding fathers.

On the latter count, the White House has made it clear that the administration views CISPA as a huge overreach on privacy, driving a truck through existing privacy protections. The Obama administration has stated (PDF) that CISPA:

"... effectively treats domestic cybersecurity as an intelligence activity and thus, significantly departs from longstanding efforts to treat the Internet and cyberspace as civilian spheres. The Administration believes that a civilian agency — the Department of Homeland Security — must have a central role in domestic cybersecurity, including for conducting and overseeing the exchange of cybersecurity information with the private sector and with sector-specific Federal agencies."

At a news conference yesterday in Washington, the Republican leadership of the House characterized the administration's position differently. "The White House believes the government ought to control the Internet, government ought to set standards, and government ought to take care of everything that's needed for cybersecurity," said Speaker of the House John Boehner (R-Ohio), who voted for CISPA. "They're in a camp all by themselves."

Representative Mike Rogers (R-Michigan) -- the primary sponsor of the bill, along with Representative Dutch Ruppersberger (D-Maryland) -- accused opponents of "obfuscation" on the House floor yesterday.

While there are people who are not comfortable with the Department of Homeland Security (DHS) holding the keys to the nation's "cyberdefense" — particularly given the expertise and capabilities that rest in the military and intelligence communities — the prospect of military surveillance of citizens within the United States is not likely to be one that the founding fathers would support, particularly without significant oversight from Congress.

CISPA does not, however, formally grant either the National Security Agency or DHS any more powers than they already hold under existing legislation, such as the Patriot Act. It would, however, enable more information sharing between private companies and government agencies, including threat information pertinent to legitimate national security concerns.

It's crucial to recognize that cybersecurity legislation has been percolating in the Senate for years now without passage. Civilian oversight is a key issue in that wrangling, with major bills circulating from Senator Lieberman's office on cybersecurity, from Senator Carper (the ICE Act) and from Senator McCain.

If the fight over CISPA is "just beginning", as Andy Greenberg wrote in Forbes today, it's important that everyone getting involved because of concerns over civil liberties or privacy recognize that CISPA is not like SOPA, as Brian Fung wrote in the American Prospect, particularly after provisions regarding intellectual property were dropped:

"At some point, privacy groups will have to come to an agreement with Congress over Internet legislation or risk being tarred as obstructionists. That, combined with the fact that most ordinary Americans lack the means to distinguish among the vagaries of different bills, suggests that Congress is likely to win out over the objections of EFF and the ACLU sooner rather than later. Thinking of CISPA as just another SOPA not only prolongs the inevitable — it's a poor analogy that obscures more than it reveals."

That doesn't mean that those objections aren't important or necessary. It does mean, however, that anyone who wishes to join the debate must recognize that genuine security threats do exist, even though massive hype about a potential "Cyber 9/11" perpetuated by contractors that stand to benefit from spending continues to pervade the media. There are legitimate concerns regarding the theft of industrial secrets, "crimesourcing" by organized crime and the reality of digital agents from the Chinese, Iranian and Russian governments — along with non-state actors — exploring the IT infrastructure of the United States.

The simple reality is that in Washington, national security trumps everything. It's not like intellectual property or energy or education or healthcare. What anyone who wishes to get involved in this debate will need to do is to support an affirmative vision for what roles the federal government and the private sector should play in securing the nation's critical infrastructure against electronic attacks. And the relationship of business and government complicates cybersecurity quite a bit, as "Inside Cyber Warfare" author Jeffrey Carr explained here at Radar in February:

"Due to the dependence of the U.S. government upon private contractors, the insecurity of one impacts the security of the other. The fact is that there are an unlimited number of ways that an attacker can compromise a person, organization or government agency due to the interdependencies and connectedness that exist between both."

The good news today is that increased awareness of the issue will drive more public debate about what's to be done. During the week the Web changed Washington in January, the world saw how the Internet can act as a platform for collective action against a bill.

Civil liberties groups have vowed to continue advocating against the passage of any vaguely drafted bill in the Senate.

On Monday, more than 60 distinguished IT security professionals, academics and engineers published an open letter to Congress urging opposition to any "'cybersecurity' initiative that does not explicitly include appropriate methods to ensure the protection of users’ civil liberties."

The open question now, as with intellectual property, is whether major broadcast and print media outlets in the United States will take their role of educating citizens seriously enough for the nation to meaningfully participate in legislative action.

This is a debate that will balance the freedoms that the nation has fought hard to achieve and defend throughout its history against the dangers we collectively face in a century when digital technologies have become interwoven into the everyday lives of citizens. We live in a networked age, with new attendant risks and rewards.

Citizens should hold their legislators accountable for supporting bills that balance civil liberties, public oversight and privacy protections with improvements to how the public and private sector monitors, mitigates and shares information about network security threats in the 21st century.

March 27 2012

FTC calls on Congress to enact baseline privacy legislation and more transparency of data brokers

Over a century ago, Supreme Court Justice Louis Brandeis "could not have imagined phones that keep track of where we are going, search engines that predict what we're thinking, advertisers that monitor what we're reading, and data brokers who maintain dossiers of every who, what, where, when and how of our lives," said Federal Trade Commission Chairman Jon Leibowitz yesterday morning in Washington, announcing the release of the final version of the FTC's framework on consumer privacy.

"But he knew that, when technology changes dramatically, consumers need privacy protections that update just as quickly. So we issue our report today to ensure that, online and off, the right to privacy, that 'right most valued by civilized men,' remains relevant and robust to Americans in the 21st century as it was nearly 100 years ago."

What, exactly, privacy means in this digital age is still being defined all around us, reflected in the increasing number of small screens, cameras and explosion of data. The FTC's final report, "Protecting Consumer Privacy in an Era of Rapid Change: Recommendations For Businesses and Policymakers," makes a strong recommendation to Congress to draft and pass a strong consumer privacy law that provides rules of the road for the various entities that have the responsibility for protecting sensitive data.

The final report clearly enumerates the same three basic principles that the draft of the FTC's privacy framework outlined for companies:

  1. Privacy by design, where privacy is "built in" at every stage that an application, service or product is developed
  2. Simplified choice, wherein consumers are empowered to make informed decisions by clear information about how their data will be used at a relevant "time and context," including a "Do Not Track" mechanism, and businesses are freed of the burden of providing unnecessary choices
  3. Greater transparency, where the collection and use of consumer data is made clearer to those who own it

"We are demanding more and better protections for consumer privacy not because industry is ignoring the issue," said Leibowitz today. "In fact, the best companies already follow the privacy principles we lay out in the report. In the last year, online advertisers, major browser companies, and the W3C -- an Internet standard setting group -- have all made strides towards putting into place the foundation of a Do Not Track system, and we commit to continue working with them until all consumers can easily and effectively choose not to be tracked. I'm optimistic that we'll get the job done by the end of the year."

According to the FTC, the nation's top consumer watchdog received over 450 comments on the draft online privacy report that it released in December 2010. In response to "technological advances" and comments, the FTC revised the privacy framework in several areas. (For a broad overview of the final FTC privacy framework, read Dan Rowinski's overview at ReadWriteWeb and the Information Law Group's summary of the commission report on consumer privacy).

First, it will not apply to companies that collect only non-sensitive data from fewer than 5,000 consumers a year and do not transfer it, since applying the framework to them would have been a burden on small businesses. Second, the FTC has brought action against Google and Facebook since the draft report was issued. Those actions -- and the agreements reached -- provide a model and guidance for other companies.

Third, the FTC made specific recommendations to companies that offer mobile services that include improved privacy protections and disclosures that are short, clear and effective on small screens. Fourth, the report also outlined "heightened privacy concerns" about large platform providers, such as ISPs, "operating systems, browsers and social media companies," seeking to "comprehensively track consumers' online activities." When asked about "social plug-ins" from such a platform, chairman Leibowitz provided Facebook's "Like" button as an example. (Google's +1 button is presumably another such mechanism.)

Finally, the final report also included a specific recommendation with respect to "data brokers," which chairman Leibowitz described as "cyberazzi" on Monday, echoing remarks at the National Press Club in November 2011. Over at Forbes, Kashmir Hill reports that the FTC officially defined a data broker as those who “collect and traffic in the data we leave behind when we travel through virtual and brick-and-mortar spaces."

During the press conference, chairman Leibowitz said that American citizens should be able to see what information data brokers hold about them and "have the right to correct inaccurate data," much as they do with credit reports. Specifically, the FTC has called on data brokers to "make their operations more transparent by creating a centralized website to identify themselves, and to disclose how they collect and use consumer data. In addition, the website should detail the choices that data brokers provide consumers about their own information."

While the majority of the tech media's stories about the FTC today focused on "Do Not Track" prospects and mechanisms, or the privacy framework's impact on mobile, apps and social media, the reality of this historic moment is that it's the world's data brokers that currently hold immense amounts of information regarding just about everyone "on the grid," even if they never "Like" something on Facebook, turn on a smartphone or buy and use an app.

In other words, even though the FTC's recommendations for privacy by design led TechMeme yesterday, that wasn't new news. CNET's Declan McCullagh, one of the closest observers of Washington tech policy in the media, picked up on the focus, writing that the FTC stops short of calling for a new DNT law but asks Congress to enact a new law that "would provide consumers with access to information about them held by a data broker" such as Lexis Nexis, US Search, or Reed Elsevier subsidiary Choicepoint -- many of which have been the subject of FTC enforcement actions in the last few years. As McCullagh reported, the American Civil Liberties Union "applauded" the FTC's focus on data brokers.

They should. As Ryan Singel pointed out at Wired, the FTC's report does "call for federal legislation that would force transparency on giant data collection companies like Choicepoint and Lexis Nexis. Few Americans know about those companies’ databases but they are used by law enforcement, employers and landlords."

Would we, as Hill wondered, be less freaked out if we could see what data brokers have on us? A good question, and one that may be answered if the industry coalesces around providing consumers access to their personal data, just as utilities are beginning to do with energy data.

Another year without privacy legislation?

Whether it's "baseline privacy protections" or more transparency for data brokers, the FTC is looking to Congress to act. Whether it will or not is another matter. While the online privacy debate was just about as hot in Washington nearly two years ago as it is today, no significant laws were passed. The probability of significant consumer privacy legislation advancing in this session of Congress currently appears quite low. While at least four major privacy bills have been introduced in the U.S. House and Senate, "none of that legislation is likely to make it into law in this Congressional session, however, given the heavy schedule of pending matters and re-election campaigns," wrote Tanzina Vega and Edward Wyatt in the New York Times.

The push the FTC gave yesterday was welcomed in some quarters. "We look forward to working with the FTC toward legislation and further developing the issues presented in the report," said Leslie Harris, president of the Center for Democracy and Technology (CDT), in a prepared release. CDT also endorsed the FTC's guidance on "Do Not Track" and its focus on large platform providers. Earlier this winter, a coalition of Internet giants, including Google, Yahoo, Microsoft, and AOL, committed to adopt "Do Not Track technology" in most Web browsers by the end of 2012. These companies, which deliver almost 90 percent of online behavioral advertisements, have agreed not to track consumers if they choose to opt out of online tracking using the Do Not Track mechanism, which will likely manifest as a button or browser plug-in. All companies that have made this commitment will be subject to FTC enforcement.
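
For readers wondering what the mechanism itself amounts to, the browser side of Do Not Track is simply an HTTP request header, DNT: 1, that sites and ad networks are being asked to honor. Below is a minimal sketch in Python of a server respecting that signal; the function names and the in-memory analytics log are illustrative assumptions, not any company's actual implementation.

```python
# Minimal sketch of honoring a "DNT: 1" request header server-side.
# The function names and the in-memory analytics log are hypothetical;
# real ad servers and analytics pipelines are far more involved.

def dnt_enabled(headers: dict) -> bool:
    """Return True if the client sent the Do Not Track signal (DNT: 1)."""
    # Header lookups should be case-insensitive, per HTTP convention.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("dnt") == "1"

def handle_request(headers: dict, visitor_id: str, analytics_log: list) -> str:
    """Serve content, recording behavioral data only when DNT is not set."""
    if dnt_enabled(headers):
        # Respecting the signal: no behavioral profile is recorded.
        return "content without behavioral tracking"
    analytics_log.append({"visitor": visitor_id, "event": "page_view"})
    return "content with interest-based ads"

if __name__ == "__main__":
    log = []
    print(handle_request({"DNT": "1"}, "user-123", log))  # tracking skipped
    print(handle_request({}, "user-456", log))            # tracking recorded
    print(log)
```

The hard part, as the FTC's five criteria suggest, is not reading the header but getting the whole ad ecosystem to agree on what "not tracking" means once it is set.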

By way of contrast, Jim Harper, the Cato Institute's director of information policy studies, called the framework a "groundhog report on privacy," describing it as "regulatory cheerleading of the same kind our government’s all-purpose trade regulator put out a dozen years ago." In May of 2000, wrote Harper, "the FTC issued a report finding “that legislation is necessary to ensure further implementation of fair information practices online” and recommending a framework for such legislation. Congress did not act on that, and things are humming along today without top-down regulation of information practices on the Internet."

Overall, the "industry here has a self-interest beyond avoiding legislation," said Leibowitz during today's press conference. Consumers have very serious concerns about privacy, he went on, alluding to polling data, surveys and conversations, and "better, clearer privacy policies" will lead to people having "more trust in doing business online."


This FTC privacy framework and the White House's consumer privacy bill of rights will, at minimum, inform the debates going forward. What happens next will depend upon Congress finding a way to protect privacy and industry innovation. It will be a difficult balance to strike, particularly given concerns about protecting children online and the continued march of data breaches around the country.

Making technology more accessible

I interviewed Princeton professor Ed Felten, the FTC's chief technologist and co-author of "Government Data and the Invisible Hand" (2009) after yesterday's FTC press conference at FTC headquarters in D.C. In December 2010, we spoke about the FTC's 'Do Not Track' proposal, after the release of the draft report.

Felten launched "Tech at the FTC" last Friday morning, a new blog that he hopes will play a number of different roles in the discussion of technology, government and society.

"It will combine Freedom to Tinker posts," he said, "some of which were op-ed, some more like teaching. The latter is what I'm looking for: explanations of sophisticated technical information that cross over to a non-technical audience."

Felten wants to start a conversation that's "interesting to general public" and "draws them into the discussion" about the intersection of regulation and technology. One aspect of that will be a connected Twitter account, @TechFTC, along with his established social identity, @EdFelten.

Possible future topics will include security issues around passwords and authentication of people in digital environments, both of which Felten finds interesting as they relate to policy. He said that he expects to write about technology stories that are in the news, with the intent of helping citizens understand, at an accessible level, what the takeaway is for them.

Social media and the Internet are "useful to give people a window into the way people in government are thinking about these issues," said Felten. "They let people see that people in government are thinking about technology in a sophisticated way. It's easy to fall into the trap where people in government don't know about technology. That's part of the goal: speak to the technical community in their language.

"Part of my job is to be an ambassador to the technology community, through speaking to and with the public," said Felten. "The blog will help people know how to talk to the FTC and who to talk to, if they want to. People think that we don't want to talk to them. Just emailing us, just calling us, is usually the best way to get a conversation started. You usually don't need a formal process to do this -- and those conversations are really valuable."

In that context, he plans to write more posts like the one that went live Monday morning, on tech highlights of the FTC privacy report, in which he highlighted four sections of the framework that the computer science professor thought would be of interest to techies:

  1. De-identified data (pp. 18-22): Data that is truly de-identified (or anonymous) can't be used to infer anything about an individual person or device, so it doesn't raise privacy concerns. Of course, it's not enough just to say that data is anonymous, or that it falls outside some narrow notion of PII. Beyond that, figuring out whether your dataset is really de-identified can be challenging. If you're going to claim that data is de-identified, you need to have a good reason (the report calls it a "reasonable level of justified confidence") for claiming that the data does not allow inferences about individuals. What "reasonable" means, and how confident you have to be, depends on how much data there is and what the consequences of a breach would be. But here's a good rule of thumb: if you plan to use a dataset to personalize or target content to individual consumers, it's probably not de-identified. (A toy illustration of one de-identification check follows this list.)

  2. Sensitive data (pp. 47-48): Certain types of information, such as health and financial information, information about children, and individual geolocation, are sensitive and ought to be treated with special care, for example by getting explicit consent from users before collecting it. If your service is targeted toward sensitive data, perhaps because of its subject matter or target audience, then you should take extra care to provide transparency and choice and to limit collection and use of information. If you run a general-purpose site that incidentally collects a little bit of sensitive information, your responsibilities will be more limited.

  3. Mobile disclosures (pp. 33-34): The FTC is concerned that too few mobile apps disclose their privacy practices. Companies often say that users accept their data practices in exchange for getting a service. But how can users accept your practices if you don't say what they are? A better disclosure would tell users not only what data you're collecting, but also how you are going to use it and with whom you'll share it. The challenging part is how to make all of this clear to users without subjecting them to a long privacy policy that they probably won't have time to read. FTC staff will be holding a workshop to discuss these issues.

  4. Do Not Track (pp. 52-55): DNT gives users a choice about whether to be tracked by third parties as they move across the web. In this section of the report, the FTC reiterates its five criteria for a successful DNT system, reviews the status of major efforts including the ad industry's self-regulatory program and the W3C's work toward a standard for DNT, and talks about what steps remain to get to a system that is practical for consumers and companies alike.
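
On the de-identification point in item 1, one common technical sanity check (necessary but nowhere near sufficient) is k-anonymity: every combination of quasi-identifiers in a released dataset should describe at least k people. The toy Python sketch below illustrates the idea; the column names and records are assumptions for illustration, and this is not the FTC's own test, which rests on the broader "reasonable level of justified confidence" standard.

```python
# Toy k-anonymity check: does every combination of quasi-identifiers
# (columns that could be linked to outside data) appear at least k times?
# Column names, records and the threshold are illustrative assumptions,
# not a regulatory standard.
from collections import Counter

def k_anonymity(rows, quasi_identifiers):
    """Return the smallest group size over all quasi-identifier combinations."""
    groups = Counter(tuple(row[col] for col in quasi_identifiers) for row in rows)
    return min(groups.values()) if groups else 0

records = [
    {"zip": "20036", "birth_year": 1970, "gender": "F", "purchase": "book"},
    {"zip": "20036", "birth_year": 1970, "gender": "F", "purchase": "film"},
    {"zip": "20036", "birth_year": 1982, "gender": "M", "purchase": "app"},
]

k = k_anonymity(records, ["zip", "birth_year", "gender"])
print(f"smallest group size: {k}")
# The 1982/M record is unique (k == 1), so this dataset is not meaningfully
# de-identified: it could be re-linked to an individual with outside data.
```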


When asked what the developers and founders of startups should be thinking about with respect to the FTC's privacy framework, Felten emphasized those three basic principles -- privacy by design, simplified choice, greater transparency -- and then offered some common sense:

    "Start with the basic question of 'what Section 5 means for you,' he suggested. "If you make a promise to consumers in your privacy policy, consumers are entitled to rely on that. The FTC has brought cases against companies that made them and didn't hold up their responsibility around privacy. You have a responsibility to protect consumer data. If not, you may find yourself on the wrong side of the FTC act if there's a breach and it harms consumers."


    February 18 2011

    Apple iTunes gifts users with a privacy hole

    When Apple added a "Gift button" to the iTunes Store in 2006, it provided users with a new way to easily buy music for friends, family or colleagues. In the years since, the Gift button has been extended to TV shows, movies and applications. As MIT research professor Andrew McAfee discovered recently, however, this gift function also comes with a privacy issue: whoever is making the gift can see whether or not the other person already has a song, video or application.

    In his post, McAfee explains how a user could systematically determine whether someone already has a given video or application in his or her library:

    I've been doing some poking around, and have found that it's pretty straightforward for one person (let's call him George Smiley, after John Le Carré's master spy) to find out what music, video, and apps someone else (like me) has purchased or had gifted to them on iTunes.


    The key to this privacy hole is having the email address associated with the iTunes account for the person in question. Acquiring email addresses is not the barrier it once was, however, particularly in the age of spear phishing. As McAfee points out, there's no need to establish an account with Apple or spend any money to work through the process. The user targeted also has no knowledge that this is going on, nor any way to stop it from happening, other than disassociating an exposed email address from iTunes.

    The Video Privacy Protection Act and privacy

    McAfee is right: the harm from this privacy hole in iTunes doesn't extend to a data breach of credit card information or other personally identifiable information. That does not, however, mean that there isn't some potential for a headache for Apple, given an accident of history that brings a federal statute into play.

    McAfee, who is a student of history when it comes to the use of collaborative technology in business, looked back at the Supreme Court nomination of Robert Bork. During the hearings, the question of whether Bork believed that the United States Constitution included a general right to personal privacy was raised. After the Washington City Paper published a list of Bork's rentals from a Washington, D.C.-area video store, Congress passed the Video Privacy Protection Act (VPPA), which specifically forbade the wrongful disclosure of personally identifiable rental records of "prerecorded video cassette tapes or similar audio visual material." As McAfee points out, the VPPA has been used in recent years in class-action lawsuits against Facebook and Netflix.

    "If it's a movie purchase, it's a violation of the statute, under the Video Privacy Act," said Danielle Citron, a law professor and privacy researcher at the University of Maryland School of Law. "Certainly if we were in Europe, there's a whole other set of privacy implications. There are even more robust privacy protections there."

    In theory, a highly motivated searcher could take advantage of the security hole through automated scripts or by posting small tasks on a crowdsourcing platform, like Amazon's Mechanical Turk. In practice, those concerns are unlikely to come to fruition. But as McAfee observes, this capability is problematic with respect to personal privacy:

    A person's taste in media can be highly personal, yet all of Apple's more than 10 billion songs and 200 million TV and movie downloads are potentially traceable by the George Smileys of the world — the world's spies, stalkers, yellow journalists, and opposition researchers. Of course, this is nowhere near as big a deal as privacy holes in online health or financial information would be, so we should keep this issue in perspective.

    Citron offered a scenario that extended beyond one consumer looking at another's media consumption. "Imagine if government has a suspect in mind," she posited. "Typically, to get reading habits, you'd need a warrant. If you had an email address, you could pretend to gift them and see whether they'd read something. You have to consider reputational harm — if someone who doesn't like you discovers that you're reading or watching something salacious, there's a problem."

    Privacy by design

    Whether Apple will move quickly to address the issue with an update is an open question (Apple did not respond when asked for comment today). A new series of privacy lawsuits over the transmission of unique identifiers to application makers would suggest that the lawyers in Cupertino already have their hands full. The larger issue here lies in how technology companies should build platforms with privacy by design, as the electronic privacy report released by the Federal Trade Commission last year recommended. It's worth going back to consider what FTC officials said about privacy by design then.

    "When you're designing systems, and put it in right at the outset, you're in much better shape than adding it later," said Jessica Rich, deputy director of the Bureau of Consumer Protection. "Behavioral advertising, when we came in and started calling on companies to add privacy to their business models, they were saying 'privacy is very costly, and privacy is not in our business models, and you're changing our business models.' The idea of baking it in from the start is actually very good for small businesses," she said.

    "Companies that handle large amounts of sensitive consumer data, whether or not they are startups, have basic responsibilities to protect that data and to handle it responsibly," said Ed Felten, the chief technologist at the Federal Trade Commission (FTC). "Startups are in a good position to 'bake in' privacy, compared to bigger, more established companies, because they are not constrained as much by past design decisions.  As with security, it is easier to design-in privacy in advance than to retrofit it later."

    As online privacy debates heat up in Washington, the benefits of personalization and new business models for publishing or distribution will need to be balanced with mechanisms to protect consumer privacy. "You want to give gifts that people want," said Citron. "It's part of the behavioral advertising message, but there are privacy risks that we shouldn't overlook. This could be a way of outing people depending on the material."

    Privacy by design in an electronic gifting mechanism for media isn't an unreachable holy grail here, either. McAfee determined that the same issue does not exist with Amazon. "As a test, I tried to send my Mom an Amazon Kindle book I knew she already had," he wrote. "Amazon let the purchase go through and told me nothing about her Kindle inventory. She received a message from the company that I'd sent her an e-book she already owned, and giving her a credit for its price. To put it mildly, this seems like a better approach to me."
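
The design lesson in that contrast is easy to make concrete. The sketch below, in Python, compares a hypothetical gifting flow that checks the recipient's library before purchase (leaking ownership to the sender) with an Amazon-style flow that completes the purchase and quietly credits the recipient if the item turns out to be a duplicate. The function names and data structures are illustrative assumptions, not Apple's or Amazon's actual systems.

```python
# Hypothetical gifting flows; the data structures and function names are
# illustrative, not Apple's or Amazon's real implementations.

libraries = {"alice@example.com": {"album-123"}}   # recipient -> owned items
credits = {}                                       # recipient -> store credit

def gift_leaky(sender: str, recipient: str, item: str) -> str:
    """Checks ownership up front, so the *sender* learns what the recipient owns."""
    if item in libraries.get(recipient, set()):
        return f"{recipient} already has {item}"   # privacy leak to the sender
    return "gift sent"

def gift_private(sender: str, recipient: str, item: str, price: float) -> str:
    """Completes the purchase, then resolves duplicates with the recipient only."""
    if item in libraries.get(recipient, set()):
        credits[recipient] = credits.get(recipient, 0.0) + price
        # Only the recipient learns about the duplicate and the credit;
        # the sender just sees a successful gift.
        return "gift sent"
    libraries.setdefault(recipient, set()).add(item)
    return "gift sent"

print(gift_leaky("smiley@example.com", "alice@example.com", "album-123"))
print(gift_private("smiley@example.com", "alice@example.com", "album-123", 9.99))
print(credits)
```

The second flow costs the platform a little revenue on duplicates, but it never discloses the recipient's library to anyone else, which is the essence of baking privacy into the design.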

    February 17 2011

    The tricky mix of payment, identity and trust

    A new O'Reilly/PayPal report on web-native payment platforms, "ePayments: Emerging Platforms, Embracing Mobile and Confronting Identity," is now available for download. Among the topics covered in the report are the rise of payment platforms, the mobilization of money, and the significance of online identity in mobile commerce.

    The following excerpt considers some of the dimensions of online identity in mobile payment applications and what it means to users and payment providers. Additional excerpts from the report were featured here on Radar last week.


    To process a payment, the payment service needs to determine who someone is, not only to make sure they get paid, but also to understand their broader interests and preferences so they can deliver an online experience that suggests content, merchandise, and other opportunities.

    But the significance of online identity obviously goes far beyond this. A platform that holds someone's identity is the easiest place for that person to do business. Consider the rise of Google's Android platform: Many people who were comfortable on iPhones are now shifting to Androids, in some cases because they work better with Gmail and Google calendar where they have been doing business for years. With an Android phone in their pocket, it's also likely that those users may choose mobile commerce solutions from Google rather than from a third party like Apple or Amazon — presuming that it meets their needs.

    Payment platforms today confirm identity primarily through credit card or banking information. Privacy concerns dictate that sites generally get this information from you before your first transaction and — barring any security breaches — they keep it to themselves. For example, because you've already given Amazon your billing information at some point in the past, you can buy a Kindle edition of a new bestseller today with one click. But wander off Amazon to a site that specializes in, for example, ironic T-shirts, and you'll find yourself having to re-enter all of your shipping and billing information — unless that site offers Amazon Payments.

    Compare this to the way ad networks track your identity as you move from one site to another. Search DIY sites for information on fixing a printer problem and as you later browse unrelated sites you'll see ads for ink cartridges. How is it that ad networks have grown so sophisticated they can make offers across various sites — indeed, they can even predict future romantic interests based on historical browsing patterns — but we still have to re-enter our financial and identity information at every e-commerce site we buy from?

    Perhaps the main reason is that users are less chary about sharing their browsing history than they are about sharing their credit card numbers. But they do appear to be increasingly comfortable giving billing, shipping, and identity information to one or two trusted sources and then referring purchases to them.

    Something like this has already begun to happen with PayPal and Google Checkout. Users place their financial information with these trusted sources and then reference other sites to that account when they make a purchase. Merchants who use platforms like PayPal or Amazon Payments can identify you without asking the same series of questions. This secure, centralized financial identity is the current realization of the long-sought-after digital wallet. Like a physical wallet, your identity with a payment platform carries data that fulfills at least three functions: your identity, your ability to pay (debit and credit cards, cash), and the history of your payments (the receipts you've stuffed in after purchases or ATM withdrawals).
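
Treating those "three functions" as a data structure makes the digital-wallet idea concrete. Here is a minimal sketch in Python; the field names are assumptions for illustration, not any payment platform's real schema.

```python
# A digital wallet, sketched as plain data: who you are, how you can pay,
# and what you have paid for. Field and class names are illustrative.
from dataclasses import dataclass, field

@dataclass
class PaymentInstrument:
    kind: str          # e.g. "credit_card", "debit_card", "bank_account"
    last_four: str     # never store or pass around full card numbers

@dataclass
class Receipt:
    merchant: str
    amount: float
    item: str

@dataclass
class Wallet:
    account_id: str                                                      # identity
    instruments: list = field(default_factory=list)                      # ability to pay
    receipts: list = field(default_factory=list)                         # payment history

    def charge(self, merchant: str, amount: float, item: str) -> Receipt:
        """Record a purchase against the wallet and return the receipt."""
        receipt = Receipt(merchant, amount, item)
        self.receipts.append(receipt)
        return receipt

wallet = Wallet("user-789", [PaymentInstrument("credit_card", "4242")])
wallet.charge("ironic-tshirts.example", 24.00, "t-shirt")
print(len(wallet.receipts), "purchase(s) on file")
```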

    Currently, each payment platform (indeed, most e-commerce sites you do business with) maintains a separate version of your identity data. While this constrains their ability to simplify payment by collaborating across sites, it does achieve an important goal of many users, which is segmenting identity. A person may be comfortable with Facebook knowing who her friends are, Foursquare knowing her favorite coffee spot, iTunes knowing her favorite performers, and Amazon knowing her credit card number. But she may be less comfortable with each of those sites knowing all those things about her.

    Thus, one of the goals of emerging online identity standards should be to ensure that users have control over which aspects of their identity get shared with whom. Facebook's recent embarrassments around third-party apps (such as Zynga's Farmville) leaking personal, identifiable information about users highlight the risks that platforms face. Users who were comfortable sharing that information with Facebook balked at Zynga redistributing it.
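
One way to express that control in code is with explicit, per-site grants, in the spirit of the scoped authorization that later standards efforts formalized. The Python sketch below uses hypothetical attribute and site names to show each relying site receiving only the attributes the user granted it.

```python
# Sketch of segmented identity: each relying site gets only the attributes
# the user granted it. Attribute and site names are hypothetical.

identity = {
    "friends": ["bob", "carol"],
    "favorite_coffee_spot": "Blue Bottle, 9th St.",
    "favorite_performers": ["Beirut"],
    "credit_card_last_four": "4242",
}

grants = {
    "facebook.example":   {"friends"},
    "foursquare.example": {"favorite_coffee_spot"},
    "amazon.example":     {"credit_card_last_four"},
}

def attributes_for(site: str) -> dict:
    """Release only the identity attributes the user granted to this site."""
    allowed = grants.get(site, set())
    return {k: v for k, v in identity.items() if k in allowed}

print(attributes_for("foursquare.example"))  # -> only the coffee spot
print(attributes_for("zynga.example"))       # -> {} (no grant, nothing shared)
```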

    Efforts to standardize the rules of online identity — based on levels of assurance that range from low to high confidence — seek to clarify the ways that individuals manage elements of their identity online. While the rules of identity will likely be defined and enforced by private organizations with dominant platforms, those rules will also draw on developing industry identity standards.

    Additional excerpts from "ePayments: Emerging Platforms, Embracing Mobile and Confronting Identity" are posted here. The full report is also available as a free download.





    December 29 2010

    2010 Gov 2.0 Year in Review

    I recently talked with Federal News Radio anchor Chris Dorobek about Gov 2.0 in 2010 and beyond. While our conversation ranged over a wide variety of topics, it was clear afterwards that I'd missed many of the year's important stories in Gov 2.0 during the relatively short segment. I went back over hundreds of posts on Gov 2.0 at Radar and GovFresh, thousands of tweets and other year-end lists, including Govloop's year in review, Gartner's Top 10 for Government 2.0 in 2010, Bill Allison's end of year review, Andrew P. Wilson's memorables from 2010, Ellen Miller's year in Sunlight 2010, John Wonderlich's 2010 in policy and GovTwit's top Gov 2.0 stories. Following are the themes, moments and achievements that made an impact.

    Gov 2.0 spreads worldwide

    The year was marked by the international spread of Gov 2.0 initiatives. Wherever connections are available in the United States, citizens are turning to the Internet for government data, policy and services. Applying that trend internationally isn't unreasonable, as more of humanity comes online. It won't be easy. It's Gov 2.0 vs the beast of bureaucracy everywhere, as professor Andrew McAfee memorably put it.

    In Australia, for instance, government 2.0 Down Under still has a ways to go if it isn't going to be a "one shot" contest or success story. What's next for Government 2.0 in Australia, as Stephen Collins reflected, will rely on more public figures driving change, as well as citizens demanding better results.

    In the United Kingdom, the new-ish government will continue to be a test bed, given dire budget projections. A refreshed Number 10 Downing Street online presence and accounts won't address cost issues, either. A spending challenge focused on crowdsourcing cuts didn't get very far. Such initiatives are likely the tip of the iceberg, as tough budget decisions loom in 2011. While the influence of Tim Berners-Lee on Data.gov.uk is unmistakable, Gov 2.0 in the UK involves a host of small companies, agencies, elected officials and of course the citizens themselves.

    Gated governments face disruption

    Everywhere, governments remain concerned about the risks and rewards of Web 2.0, but with citizens increasingly going online, those same governments must respond to digital cries for help. In countries with autocratic governments, the disruption and challenge to power represented by free information flows mean that transparency and accountability are a long way off. In that context, an e-government conference in Russia has to be balanced with the government transparency issues revealed by the deployment of Ushahidi for crowdsourcing wildfires.

    Citizens empowered with new tools for transparency became a more powerful force in 2010, as the growing list of examples of open government platforms in India (a democratic country) suggests. As citizens gain more means for reporting issues with services, corruption or elections, repressive governments will be faced with more challenges in filtering, censoring, blocking or shutting down services that host contradictory or false reports.

    In that context, technology companies also have meaningful choices to make, from how they cooperate (or don't) with law enforcement and government agencies that want access to their data, to "firewalling" private information from multiple services within companies, to monitoring internal controls on employee access, to providing technologies that may be used to monitor, track or censor citizens.

    Open government remains in beta

    While the progress of the White House Open Government Directive at federal agencies is important, as is action in Congress, there's a long road yet ahead in the United States and abroad. As John Wonderlich pointed out in his own look at 2010,

    Obama’s Open Government Directive is at a crossroads (like other similar policies), and the changing majority in the House brings new opportunities for change (a 72 Hour Rule!), just as the outgoing majority brought their own new opportunities for transparency.

    We're still very much in open government's beta period. Some efforts, like the State Department's Text Haiti program for the Red Cross or the "do-it-ourselves" platforms from groups like CrisisCommons, made a difference. Other efforts, partially represented by many open government plans in the throes of implementation, won't mature for months to come.

    What is clear is that open government is a mindset, not simply a fresh set of technological tools. Gov 2.0 is a means, not an end. It can and will mean different things to different constituencies. For instance, the State Department released a social media policy, engaged the world through social media, launched a "Civil Society 2.0" initiative and released a quadrennial review in December. Its efforts to apply social software to digital diplomacy were laudable. By the end of the year, however, Secretary Clinton's landmark speech and policy on Internet freedom came under sharp global criticism in the wake of "Cablegate." The questions of who, where and why the U.S. supports Internet freedom became even more complex.

    WikiLeaks is a reminder that the disruption new technology platforms pose will often emerge in unexpected ways.

    Open data went global

    The first International Open Government Data Conference highlighted how far this trend has gone in a short time. "Since the United Kingdom and United States movement started, lots of other countries have followed," said Tim Berners-Lee, the inventor of the World Wide Web. Canada, New Zealand, Australia, France, Greece, and Finland are all working on open data initiatives. Within the United States, 16 states and 9 cities have created open data platforms. More data platforms at all levels will come online in 2011.

    "The more transparency there is, the more likely there is to be external investment," said Berners-Lee, highlighting the potential for open government data to make countries more attractive to the global electronic herd. Berners-Lee anticipates a world where open government data standards will come to cities, states and countries like HTML did in the 1990s. "The web spread quickly because it was distributed," said Berners-Lee. "The fact that people could put up web servers themselves without asking meant it spread more quickly without a centralized mandate." Over in England, the new legislation.gov.uk uses the linked open data standards Berners-Lee recommends.

    After nearly a year in the open data trenches, Nat Torkington offered advice here at Radar for those starting or involved in open data projects:

    First, figure out what you want the world to look like and why. Second, build your project around users.

    The Sunlight Foundation, one of the foremost users of data journalism for open government, created a new ‘data commons’ in 2010 and launched poligraft.com and InfluenceExplorer.com, both of which combine to make campaign finance, lobbying, earmark and government contract data more accessible. Sunlight Labs also made progress in opening up state legislatures.

    In December, a report on the attitudes, quality and use of open government data showed strong support for the release of open data among citizens and government employees. While the open data study showed progress, there's still a long road ahead for open government data. The promise of data journalism is notable, as journalists now have huge volumes of accessible government data, but cultural roadblocks and "dirty" data still need to be addressed.

    There are (more) apps for that

    Around the world, apps contests are unlocking innovation. One of the biggest contests, Apps for Development, is using new World Bank open data.

    As governments create their own applications, however, they'll need to steer clear of "shiny app syndrome" to avoid empowering the empowered.

    Gov 2.0 grew locally

    Gov 2.0 is going local, as techies take on City Hall. CityCamp, Code for America, Civic Commons and protocols like Open311 all grew this year. Real-time transit data is opening up exciting prospects for entrepreneurs. Local government as a data supplier looks like it may have legs as well.

    Even mainstream media woke up to the local trend. Time Magazine reported on mobile civic applications that let citizens give feedback to cities. At year's end, the use of Twitter by Newark mayor Cory Booker to hack Snowmageddon after a major snowstorm buried the East Coast brought new attention to the opportunities inherent in a new digital nexus between citizens and public servants online.

    Look for more citizens as sensors in neighborhoods soon.

    Laws, rules and regulations

    This was also the year that mainstream media couldn't stop reporting on social media in politics. Sarah Palin's tweets were read on cable news and gaffes from a Congressman or updates from the campaign trail went straight to the headlines. There have been thousands of posts and cable news reports on the topic at this point. A study on Twitter use in Congress asserted that Democrats use Twitter for transparency, while Republicans use it for outreach. For a useful perspective outside of the United States, First Monday published a terrific Gov 2.0 case study in government and e-participation at Brazil's House and presidential websites.

    What such reports generally missed is that Gov 2.0 progress within agencies is bolstered by orders, laws or regulations that support these activities. This spring, the Sunlight Foundation and other transparency advocates worked with Rep. Steve Israel and Sen. Jon Tester to introduce the Public Online Information Act in both chambers. As John Wonderlich explained, the act redefines "public information" by requiring that any government information currently required to be available to the public be posted online, and it sets forth better technology coordination between the branches of government to achieve that overarching goal.

    In June, OMB updated its rules for cookies and privacy on U.S. government websites, enabling government agencies to use social media, video sharing and discussion platforms. In July, the House of Representatives had its first hearing on Government 2.0, examining the risks and rewards of Web 2.0 in government. The White House also released a draft of "National Strategy for Trusted Identities in Cyberspace," including a means for people to comment upon it online. Yes, the government has an online identity plan for you.

    The passage and subsequent signing of the Telework Enhancement Act by President Obama was a win for government workers, providing new flexibility in getting the job done. The need for that ability was driven home by the historic snowfall in Washington, D.C. last winter, when "Snowmageddon" made working from home more than a "nice to have" for many parts of the federal government.

    Election 2010 was a refresh for Gov 2.0, offering up numerous lessons for social media and politics from the campaign. What emerged were new prospects for the GOP to embrace innovation and transparency. That subsequently manifested with a victory for transparency in House rules.

    The enactment of a plain writing law is also a boon for open government, although getting bureaucracies to move away from acronyms won't happen overnight.

    In December, the passage of the COMPETES Act in Congress means that every federal agency can create prizes and competitions. Watch Challenge.gov to see if citizens and other private entities take them up on those opportunities.

    Online privacy went mainstream

    While some media outlets declared that privacy is dead, government officials and institutions weren't convinced. That's why online privacy debates heated up in Washington, with Facebook privacy and Google privacy frequently in the news.

    The shift to cloud computing puts Electronic Communications Privacy Act reform in the spotlight. Simply put, digital due process matters. As the year came to an end, the FTC released its online privacy report, which included a recommendation for a Do Not Track mechanism, along with increased transparency and baked-in controls.

    Government moves into the cloud

    When NASA Nebula's open source technology was integrated into Rackspace and others to form OpenStack, the administration's open government initiative had a bonafide success on its hands. Outside of NASA, the White House IT reforms include a "cloud first" strategy for new investments. That move is part of a broad strategy to close the technology gap, which has been a top priority of the administration's IT executives. FedRAMP, a federal government-wide approach to securing cloud computing, may help to answer some of the security and privacy questions that CIOs must ask.

    While some elements of government business will never be in the public cloud, look for the cloud transition to be an even bigger story in 2011 as Microsoft, Google, Salesforce.com, IBM, Amazon and others look for government dollars in their clouds. The White House moved Recovery.gov to Amazon's cloud in May. This fall, Treasury.gov moved into their cloud, too. Salesforce.com has many agencies in its cloud. Google and Microsoft have been signing up new city and state customers all year, along with chasing federal dollars. Look for more of the same in 2011, along with more tough questions about auditability, security, uptime and privacy questions.

    Open source moves deeper into government

    Energy.gov is moving to Drupal next spring, joining hundreds of other government websites on the open source content management platform. Next year, when FCC.gov gets an overdue overhaul, it will also be open source.

    Healthcare communication got an upgrade as the Direct Project creates the basis for a "Health Internet." The NHIN Direct project's approach to creating open health records was an important example of government as a platform. For more context, Apache co-founder Brian Behlendorf talked with Radar about the CONNECT project in a podcast, "From Apache to Health and Human Services."

    A "milestone in making government more open" went live this summer when the new Federal Register beta launched at FederalRegister.gov. As deputy White House CTO Beth Noveck observed, "Federal Register 2.0" is "collaborative government at its best." It's also all open source, so the site's code is shared in Civic Commons, a project launched at the Gov 2.0 Summit that will help city governments reduce costs and inefficiencies.

    Archiving went social

    When the Library of Congress teamed up with Twitter to archive tweets, it made headlines everywhere. Less noticed were the social upgrades to Thomas.gov by the Law Library of Congress, or the work that the National Archives is doing to guide other governmental agencies. When NARA issued guidance on social media, it was a watershed for many people looking for advice.

    Law.gov moved forward

    As Carl Malamud has explained:

    Law.Gov is an idea, an idea that the primary legal materials of the United States should be readily available to all, and that governmental institutions should make these materials available in bulk as distributed, authenticated, well-formatted data.

    This year, Law.gov moved much closer to reality, as the signing and release of Law.Gov core principles was followed by Google granting Law.gov major funding.

    At year's end, Malamud announced that Public.Resource.Org would begin providing legal decisions freely online in 2011 in a weekly release of the Report of Current Opinions (RECOP). According to Malamud, this report "will initially consist of HTML of all slip and final opinions of the appellate and supreme courts of the 50 states and the federal government."

    Citizen engagement platforms grew

    With a wave of new citizen engagement platforms and apps, citizens could contribute much more than a vote or a donation in 2010: they could donate their time and skills.

    The growth of citizen engagement platforms, however, extends far beyond Washington. Civic developers are helping government by standardizing application programming interfaces and empowering others by coding the middleware for open government. Working with developers can be a crucial complement to publishing open data online: that data matters a great deal, but only if citizens are engaged with it.
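
    For a sense of what "standardizing application programming interfaces" looks like in practice, here is a small, illustrative Python sketch in the spirit of Open311-style civic APIs; the base URL is a placeholder, not a real city endpoint.

        import json
        from urllib.request import urlopen

        BASE = "https://open311.example.gov/v2"   # hypothetical Open311-style endpoint

        def list_service_requests(status="open"):
            """Fetch recent civic service requests (potholes, broken streetlights, ...)."""
            with urlopen(f"{BASE}/requests.json?status={status}") as resp:
                return json.load(resp)

        # Example usage (commented out because the endpoint above is a placeholder):
        # for req in list_service_requests():
        #     print(req.get("service_name"), req.get("address"))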

    As the new year beckons, there are more ways for the citizens of the United States to provide feedback to their federal government than perhaps there ever have been in its history. In 2011, the open question is whether "We the people" will use these new participatory platforms to help government work better. The evolution of these kinds of platforms isn't U.S.-centric, either. Ushahidi, for example, started in Africa and has been deployed worldwide. The crowd matters more now in every sense: crowdfunding, crowdsourcing, crowdmapping, collective intelligence, group translation, and human sensor networks.

    What's next?

    Have bets for 2011? Let us know in the comments.


    September 24 2010

    The convergence of Google, government and privacy

    Google recently added a new Privacy Tools page. If you follow tech policy in Washington, you couldn't miss hearing about it, given that advertising for Google privacy tools was on relevant blogs, email newsletters and periodicals. And if you work, play, shop or communicate online, the issue of online privacy is more relevant to you than perhaps it ever has been before. Companies and governments are gathering unprecedented amounts of data about every click, link, and status update you make. The choices that they are making now around the use of personal information, identity, authentication and relationships will be fundamental to how we see one another and ourselves as the information age evolves.

    This historic moment is why the Gov 2.0 Summit featured a deep dive into online privacy this year. The evolution of online privacy continues to spur debate on and offline. Below, Tim O'Reilly shared his thinking on the complex subject at the Summit:

    Why the big push in awareness around the new privacy tools? For once, it's simple: privacy is a hot issue in Washington, and Google has a major interest in contributing to and informing that conversation.

    As one of the most powerful tech companies in the world, Google will inevitably be affected by any new electronic privacy laws from Congress or regulations from the Federal Trade Commission. From potential updates to the law, to the complexities introduced by cloud computing, to the unprecedented collection of user data from search, webmail or advertising (Google's bread and butter), getting online privacy right will matter.

    That's doubly true because of recent missteps, like the widespread backlash over Google Buzz. With the anticipated launch of Google Me later this year, which is reported to add a social layer to Google's products, online privacy will become even more relevant. Google is up against the hypergrowth of Facebook, which, with some 500 million users, has grown into one of the world's top online destinations.

    Every time users "like" a page on Facebook instead of linking to a website on the web, that action provides Facebook with information about relevance and connections that isn't available to Google, at least for now. When Google returns a result, its search algorithms are in part parsing linking behavior. Should Facebook continue to expand its own search, it's safe to assume that those results will be in part informed by liking behavior. The contrast between hyperlinks and "hyperlikes" is relevant because the future of the Internet is likely to be built on understanding those interactions. In both cases, users (and regulators) have a vested interest in understanding how, where and why their data is being used.
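
    To make the contrast concrete, here is a toy Python sketch of ranking pages from link structure alone; it is a generic PageRank-style power iteration over a made-up graph, not Google's actual algorithm, and a "hyperlike" ranker would simply substitute like counts or a like graph for the links.

        # Toy link graph: page -> pages it links to (entirely made up).
        links = {
            "a": ["b", "c"],
            "b": ["c"],
            "c": ["a"],
        }
        pages = list(links)
        damping = 0.85
        rank = {p: 1.0 / len(pages) for p in pages}

        # A few power-iteration steps: each page shares its score with the
        # pages it links to, so well-linked pages accumulate higher scores.
        for _ in range(50):
            new_rank = {p: (1 - damping) / len(pages) for p in pages}
            for page, outlinks in links.items():
                share = rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += damping * share
            rank = new_rank

        print(sorted(rank.items(), key=lambda kv: -kv[1]))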

    Is it possible to share and protect sensitive information? That will continue to be an open question -- and a contentious question -- for years to come. For an informed discussion on that topic, watch the conversation embedded below with John Clippinger, of Harvard's Berkman Center, Loretta Garrison from the FTC and Ellen Blacker from AT&T.

    New Google privacy tools

    Last week, I attended a press briefing where Jonathan McPhee, Google's product manager for web history, SSL search and personalized search offerings, walked through Google's privacy settings.

    "It's on us to educate members of Congress," said Google spokesman Adam Kovacevich. "Google is an engineering company. We would like to address challenging issues -- copyright or free expression for instance -- with great tools. The Content ID tool on YouTube is a great example of that. Our first instinct is to try to address them through tools like this."

    One of Google's most public responses to online privacy concerns came last year, during the FTC privacy roundtables, when the company launched a dashboard to show users information associated with Google accounts. Yahoo also launched a dashboard to provide similar insight into data collection. Both dashboards were designed to provide users with more insight into what data was being collected around which interests. The introduction of these tools was particularly relevant to behavioral advertising, which many publishers are looking to as an important way of offering targeted, high-value marketing.

    According to McPhee, there are now more than 100,000 unique visitors every day to Google's dashboard. Users can view, modify or even export data there, through the company's "Data Liberation Front."

    McPhee framed the privacy discussion in the context of principles. Reflecting Google's famous mantra, "Don't Be Evil," these online privacy principles should give users "meaningful choices" to protect their personal information.

    Google itself has meaningful choices to make: from how they cooperate (or don't) with law enforcement and government agencies that want access to its data, to "firewalling" private information from multiple services within companies, to monitoring internal controls on employee access.

    "We want to help users know what's associated with their account, so each user is aware from the beginning what's correlated across the account," said Kovacevich, who noted that Google publishes the number of requests it gets for user data online. Privacy researcher Chris Soghoian considered the pros and cons of this tool in praise of Google. Google is also working with a coalition of technology companies and think tanks on Digital Due Process, an effort advocating for an update of the Electronic Communications Privacy Act to incorporate privacy principles relevant to cloud computing.

    Google has also made an effort to make privacy tools more visible, said McPhee, notably on the sparse home page and on every search page. By the time users reach the bottom of the new privacy tools page, said McPhee, they will be more "empowered." He touted two areas where Google introduced privacy features earlier than competitors: encrypted webmail in January 2010 and encrypted search this spring. "We launched [encrypted.google.com] in May," said McPhee. "It encrypts communication between Google and the searcher. The concept is simple, but implementation is complex in maintaining performance. The challenge is around latency."
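
    In transport terms, the feature McPhee describes is TLS applied to the search request: the hostname is still visible on the wire, but the path and query string carrying the search terms are encrypted. A minimal Python illustration follows; the host below is a placeholder, not a Google endpoint.

        from urllib.parse import urlencode

        params = urlencode({"q": "digital due process"})
        plain_url = "http://search.example.com/search?" + params       # terms readable in transit
        encrypted_url = "https://search.example.com/search?" + params  # terms protected by TLS

        # Fetching encrypted_url (e.g. with urllib.request.urlopen) negotiates TLS
        # before the request line -- including the query string -- is ever sent.
        print(plain_url)
        print(encrypted_url)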

    "Another feature I don't think people are aware of is the ability to pause," said McPhee, referring to the ability of users to stop recording their Web History, then resume. Users can also remove items from that Web History if they wish.

    Web browsing and privacy

    Google's browser, Chrome, includes an "Incognito Mode" that reduces the information collected during a user's browsing session. While Incognito Mode won't help anyone protect information from a colleague or friend if they leave a browser window open, it does mean that browsing history is not recorded, URLs are not stored, and any cookies on the computer are session-level only. Any downloads made, however, will stick around.
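
    For readers who want to script this, Chrome exposes the mode through its documented --incognito command-line switch; a small Python sketch follows, with the binary name being a platform-dependent assumption.

        import subprocess

        # "google-chrome" is the common Linux binary name; adjust for your platform.
        # The --incognito switch opens a window whose history and cookies are not persisted.
        subprocess.run(["google-chrome", "--incognito", "https://www.example.com"])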

    McPhee also noted that an opt-out option for Google Analytics has been available since April. The tradeoff between integrating analytics and disabling the function pits making a website more useful for users against individual user privacy. That frames the debate going on within the government webmaster community, where the recent revamp of federal cookie policy by the Office of Management and Budget officially allowed the use of cookies and analytics.

    Opting out of analytics doesn't prevent a website from knowing certain things, like an HTTPS referrer, but "we chose privacy over data," said McPhee. "That was significant for us."

    Future Google services and online privacy

    Geolocation services have attracted notice of late, due to concerns over granular location information. As with other areas, a user chooses how much to share with friends, said McPhee. People have to sign up to see the locations of users in Google's Latitude program.

    "You can detect location or set location manually," said McPhee, "and even hide location." He highlighted the monthly email reminders that tell users the Latitude service is on. McPhee also noted that feedback from a shelter for battered women, focused on a scenario where Latitude was installed without a user's knowledge, resulted in a feature addition. "Where possible, within 24 hours if you haven't used it, a Latitude dialog pops up," he said.

    Online video will present other issues. While users can make an uploaded video "private," if people have the right link, the video can be viewed. If YouTube does move forward with allowing livestreaming, expect these privacy controls to receive additional scrutiny.

    Google's moves into mobile, television, e-commerce, virtual worlds and even augmented reality will also create new privacy concerns. With each service, the question of where and how privacy is baked in will be important to analyze. On one count, at least, Google seems to have learned an important lesson: more transparency into how data is being used and what tools are available to control it can satisfy the privacy concerns of the majority of users. According to McPhee, only 1 in 7 people who used the online advertising preferences chose to opt out entirely. Providing users with control over their own private data can't be discounted, either.

    That was just one element that Jules Polonetsky, the former privacy officer at DoubleClick and AOL, focused on in his talk at the Gov 2.0 Summit and in our subsequent interview. Both are embedded below.

    Questions for Google from the online audience

    Before I headed into Google's Washington offices, I solicited questions to ask from readers. Answers, which I've edited for readability, follow.

    How long before Google Street View covers my country, especially the cities of Makkah & Madina? -- Aamer Trambu (@TVtrambu)

    While Google's staff wouldn't comment on any plan to roll out Street View in the Middle East, they did emphasize the ability of users to opt out using privacy tools. "Facial recognition, if we were to introduce it," said McPhee, "would also have controls."

    [I have] concerns over the control of "predicted search" data on Google Instant. How is it stored, associated, protected? -- Andrew N (@tsbandito)

    "Google Instant works just like normal web searches," said McPhee. "If you click on a result, press enter or take some other action like clicking on an ad, just like before, it's recorded in your Web History." He did highlight a way that Instant is a bit different: when you get a result and you don't have to click on anything, Google records it as a search if you pause on a term for three seconds.

    What is the ETA on Google turning on encryption for search by default? Do the filtering concerns of schools take priority? -- Chris Soghoian (@csoghoian)

    For now, you can make encrypted.google.com your home page, suggested McPhee. "For those unfamiliar with the issue, schools have an obligation for funding to provide filtering of pornographic images. The difficulty is that because schools didn't know what people are searching for, they blocked google.com."

    McPhee focused on search speed as a key consideration in default encryption. "The difficulty of offering encryption by default is that the central challenge is performance," said McPhee. "There are some features where this is more difficult than text search. Encrypted search doesn't support images or maps. Before we made this the default, we would need that to be supported as well. As soon as we have that feature parity, we will look into that as well."

    To what extent will they be using social data in combination with Web History? -- Eric Andersen (@eric_andersen)

    "We have a social search function, and that exists as a separate function from Web History," said McPhee. "There's a page called social circle and you can go through and see what information is there and edit it. You can say 'I don't want this guy promoted in social search.' I can't comment on rumors [regarding Google Me]."

    How far will Google go to protect user privacy? -- Ilesh (@ileshs)

    "We abide by the laws in the countries in which we operate," said Kovacevich. "That doesn't mean at the very first request for user data that we give it away. From a broad perspective in promoting freedom of speech globally, we are interested in the issue. We're doing a big conference in Budapest with Central European University."

    I recently heard about mail graphing. What about the data privacy concerns with that? -- Meg Biallas (@MegBiallas)

    This third-party add-on "is a great example of where we think data belongs to users, and they can use it in creative ways," said McPhee. You can learn more about mail graphing in a recent edition of Strata Week.

    How many of the U.S. government requests for information were made for information on people from outside of the United States? [This was in regard to data requests, not removal requests.] -- Annie Crombie (@annieatthelake)

    "Honestly, I don't know," said Kovacevich. "We track them by the origin of the request."

    How are they going to use the information from what we watch on Google TV? -- Tim Carmody (@tcarmody)

    "We definitely have a goal to have all Google products and services included in the dashboard if it's in your account," said McPhee. "It's safe to assume if there's unique information collected via Google TV, it will be included there."

    What about Google's own access to stored data? Any comment on that case? [This question referred to Google firing an engineer for violating privacy policies.] -- Carl Brooks (@eekygeeky)

    Google's spokesman referred me to the company's public statement on this question, which was published in TechCrunch:

    "We dismissed David Barksdale for breaking Google’s strict internal privacy policies. We carefully control the number of employees who have access to our systems, and we regularly upgrade our security controls -- for example, we are significantly increasing the amount of time we spend auditing our logs to ensure those controls are effective. That said, a limited number of people will always need to access these systems if we are to operate them properly -- which is why we take any breach so seriously." -- Bill Coughran, Senior Vice President, Engineering, Google





    September 23 2010

    ECPA reform: Why digital due process matters

    Yesterday, the Senate held a hearing on proposed updates to the Electronic Communications Privacy Act, the landmark 1986 legislation that governs the protections citizens have when they communicate using the Internet or cellphones. Today, the House held a hearing on ECPA reform and the revolution in cloud computing.

    While the vagaries of online privacy and tech policy may seem far out in the geeky stratosphere, the matter before Congress deserves more attention from citizens, media and technologists alike.

    "Just as the electric grid paved the way for industrial economy, cloud computing paves the way for a digital economy," testified David Shellhuse of Rackspace.

    So to take it one step further: updates to the ECPA have the potential to improve the privacy protections for every connected citizen, cloud computing provider or government employee. "Advances in technology depend not just on smart engineers but on smart laws," testified Richard Salgado of Google. Salgado highlighted Digital Due Process, in concert with a new post on ECPA reform at the Google Public Policy blog.

    After the hearing yesterday, I interviewed digital privacy and security researcher Chris Soghoian about what's at stake. Soghoian, until recently the resident geek at the Federal Trade Commission, explained why the Digital Due Process coalition is pushing for an ECPA update for online privacy in the cloud computing age.

    “From the perspective of industry and definitely the public interest groups, people shouldn’t have to consider government access as one of the issues when they embrace cloud computing,” said Soghoian. “It should be about cost, about efficiency, about green energy, about reliability, about backups, but government access shouldn’t be an issue.”

    Members of the coalition include Google, Microsoft, AT&T, AOL, Intel, the ACLU and the Electronic Frontier Foundation. "Users of cloud services must have confidence that their data will have privacy protections from government and from providers," testified Mike Hintze of Microsoft, who said that his company "regularly hears from enterprises that moving data to the cloud affects privacy."

    Below, ACLU legislative counsel Chris Calabrese talks about email, cloud computing and what's at stake with proposed updates to the Electronic Communications Privacy Act.

    In the next video, Indiana University professor Fred Cate talks about electronic privacy protections for email under current laws and what updates to the Electronic Communications Privacy Act could mean. [Testimony]

    Below, Princeton computer science professor Ed Felten talks about proposed updates to the Electronic Communications Privacy Act in the context of the shift to cloud computing. "In an ideal world, people would be deciding to use the cloud based on efficiency and cost," testified Felten. Privacy concerns alter the choices of businesses and consumers. When ECPA was first written, he said, "the founder of Facebook was 2 years old." To say much has changed in technology since 1986 would be a considerable understatement. [Testimony]

    Finally, Wharton professor Kevin Werbach talks about why the Electronic Communications Privacy Act is important to reducing friction and uncertainty for cloud providers and their customers. "A drop in trust in online intermediaries will add more friction to the Internet economy," he said. [Testimony]
