
January 16 2014

Court prods FCC in unexpected direction in this week’s Verizon ruling

A court ruling this past Tuesday on FCC “network neutrality” regulation closes and opens a few paths in a three-way chess game that has been going on for years between the US Court of Appeals for the D.C. Circuit, the FCC, and the major Internet service providers. (Four-way if you include Congress, and five-way if you include big Internet users such as Google — so, our chess game is coming closer to Chinese Checkers at this point.)

A lot of bloggers, and even news headlines, careened into histrionics (“Net neutrality is dead. Bow to Comcast and Verizon, your overlords”). Free Press, although oversimplifying the impact, did correctly link the ruling to what they and many other network neutrality supporters consider the original sin of FCC rulings: eviscerating the common carrier regulation of broadband providers.

Even better, many commenters noted the ambiguities and double messages in the ruling. Unlike a famous earlier ruling on Comcast regulation, this week’s court ruling spends a good deal of time affirming the FCC’s right to regulate Internet providers. Notably, pp. 35-36 essentially confirm the value and validity of network neutrality (in the form of promoting innovation at the edges by placing no restraints on transmissions).

Let’s go over the efforts of Congress and the FCC to promote competition in Internet service, leading up to Tuesday’s ruling.

Two mandates handed down from the 20th century: Computer II and the Telecom Act of 1996

The major goal of the landmark 1996 Act was to promote competition. The market for Internet service was stunningly different in 1996 from what it is in the US today. There was a lot of competition — but not much reach and not much bandwidth. Many people still lacked Internet access, and those who got it from home dialed into an ISP, often a mom-and-pop operation. Few people could get Internet access over their cable TV network. More competition would presumably lead to more and faster Internet connections.

Although idealists (like me) looked forward to a teeming ecosystem of multiple ISPs, Congress was perhaps more realistic in expecting an oligopoly with a few big players in each geographic market (three companies was considered a good number for robust competition). Many expectations were placed on incumbents: there were seven “baby Bells” that came out of the breakup of the old AT&T, and although each occupied a separate geographic market, observers hoped they would enter each other’s markets. Cable companies were in most markets as well. Somehow, from all this raw material, new services were supposed to arise.

The law established interconnection points that the baby Bells had to provide to competitors. Theoretically, smaller companies could have exploited those points to find a market niche, but that hardly went anywhere (and many observers blamed the Bells for stymying competition). The seven baby Bells quickly recombined to make three (Verizon, CenturyLink, and a new AT&T), who competed for mobile phones but refrained from competing with landlines in each other’s regions. In many areas, a baby Bell and a cable company formed a duopoly.

Much of the country has enjoyed continuous Internet access (no dial-up) with increasing bandwidth, but many observers think the age of fiber expansion is over at both Verizon and AT&T, even as the US remains far behind other developed countries in bandwidth.

Internet regulation (or lack thereof) goes back to 1966 with a series of “Computer inquiries” from the FCC. These have been widely praised for allowing the Internet to arise and spread, simply by announcing that the FCC would not regulate it. Computer II, in particular, distinguished the service offered by telephone companies over the line from the data service running through modems on either side. The Telecom Act enshrined this difference by defining “information services” that were separate from the “telecommunications services” that the FCC had long regulated as common carriers.

Telecommunications services (regulated under Title II of the law) have to provide equal, non-discriminatory access to all users. Information services do not. Clearly, companies will go to extreme lengths to avoid being labeled a telecommunications service.

The big divide: cable versus baby Bell

Although we hear a lot about “digital divides” between urban and rural areas, rich and poor, white and minority (luckily decreasing), the divide I’m going to talk about here is a regulatory one. Cable companies are not common carriers; they have always been regulated differently. Local communities can require certain services (such as community and educational channels), but the cable companies are definitively free of the burdens of Title II.

Thanks to the Telecom Act, however, cable companies and telecom companies have come to look more and more alike. They all provide voice calls; they all provide TV channels; they all provide Internet access; and, increasingly, they all provide movies on demand and other services. The key problem the FCC faces — not blamable on Congress, the US District Court, or anybody in particular — is that for historical reasons it imposes much heavier requirements on telecom companies than on functionally identical cable companies. Cable companies offer both Internet transport and content of their own, all over the same physical channel — and now, telecom companies do the same. Something’s gotta give: either Title II regulation has to be imposed on cable companies, or it has to be removed from the baby Bells.

We should note, for historical context, that a Republican administration replaced a Democratic one in 2000, and in 2001 Michael K. Powell was appointed FCC chair. He brought with him a profound faith in the free market as a spur to competition and innovation. When the FCC announced in 2002 that cable modem service was an information service, Powell wrote a justification that reads almost like an apology:

The Commission does not have unconstrained discretion to pick its preferred definition or classification, as some imply. The Commission must attempt to faithfully apply the statutory definition to a service, based on the nature of the service, including the technology used and its capabilities, and the nature of the interactive experience for the consumer…The Commission is not permitted to look at the consequences of different definitions and then choose the label that comports with its preferred regulatory treatment.

But that, of course, is exactly what they did in their inquiry. “Even if Computer II were to apply, however, we waive on our own motion the requirements of Computer II in situations where the cable operator additionally offers local exchange service. The Commission, on its own motion or on petition, may exercise its discretion to waive such requirements on the basis of good cause shown and where the particular facts would make strict compliance inconsistent with the public interest.” (paragraph 45)

I’d like to argue that it was inevitable for them to jump off on this side of the fence. They could hardly evade the reasoning in paragraph 43: “The Commission has never before applied Computer II to information services provided over cable facilities. Indeed, for more than 20 years, Computer II obligations have been applied exclusively to traditional wireline services and facilities.” Regarding the alternative they saw, “to find a telecommunications service inside every information service,” they say, “Such radical surgery is not required.” In short, the technical reality behind Internet connections was irrelevant to the policy dilemma. This FCC decision is often called Brand X, after a court ruling that upheld the decision after a challenge led by an ISP of that name.

By the way, it’s not fair to consider Powell a tool of large corporations, as some critics do. He was deeply committed to the principle of free markets, and articulated four “Internet freedoms” reminiscent of Richard M. Stallman’s four software freedoms.

The sin ascribed to the FCC by Free Press and other network neutrality supporters is actually an inescapable corollary to the cable decision. In 2005 — after Powell left — they decided that new lines and equipment rolled out by telecom companies would not be subject to the common carrier requirements that had been in place for some 70 years. The decision explicitly and repeatedly refers to their Brand X cable modem ruling. They claimed the change would enhance competition rather than hurt it.

I think the FCC was hamstrung by the evolution of the Internet industry. The hoped-for ecosystem of small Internet competitors was stunted and scattered. Real competition existed only among the big incumbents, both in telecom and in cable. As we’ll see, this had a major impact on campaigns among Internet activists. As for the FCC, the decisions to free those companies from common carrier status stemmed from a hope that they’d put on their boxing gloves. And they did — but the punches were aimed at the FCC rather than each other.

Giving up on the substrate

Over the past few years, advocates for more competition and growth on the Internet have tacitly moved “up the stack,” complaining about ISP practices such as interfering with certain content and their plans to charge certain Internet sites for favorable treatment. For instance, Comcast was found to be secretly throttling traffic when users were downloading large files. When unmasked, Comcast claimed it was placing restrictions on downloads to be fair to all users; critics suggested it regarded the streaming downloads as competition for its own offerings since movies played a large part in the downloads.

One can imagine that, back in the 1990s, ISP practices like this would have led to an exodus of disgusted customers. Nowadays, there’s much less choice. Network neutrality advocates seem to be taking the battle to the software layer because achieving large-scale competition at the lower layers seems unattainable; real competition would require companies to compete on the physical layer itself. Meanwhile, advocates for tiered service suggest it will lower costs and encourage competition.

The FCC is caught between an aroused community of network neutrality advocates and a powerful set of industries looking for ways to increase revenue. Occasionally, it tries to intervene. But the same argument the FCC makes for removing regulation, enthusiastically accepted by the industry, is bitterly opposed when used for exerting regulation. In each case, this argument is:

  • The action we’re taking will promote investment and new services by Internet companies, such as the social networks and content providers.
  • That innovation will stimulate demand by users for more bandwidth, along with a willingness to pay.
  • That in turn leads to more investment and innovation (such as more efficient codecs for multimedia content) in Internet infrastructure.

Comcast’s secret traffic stifling led to the first court battle. In its 2010 ruling, the DC Circuit court of appeals basically told the FCC that it had tied its own hands by refusing to regulate the cable companies as common carriers. Cable modems fall through the cracks between the various categories regulated by the Telecom Act. The FCC can’t use Title II (common carrier status). Title III (broadcasting) doesn’t permit the kinds of regulation the FCC was trying to impose. And when the FCC tries to cite its mandate to regulate pricing, the court tells it that this mandate extends only to the basic service tier.

The court essentially looked through the Telecom Act for a clause that explicitly let the FCC regulate a practice that didn’t emerge until a decade after the Act was passed, and — unsurprisingly — didn’t find one. The core of the ruling might be found on page 16: “…the Commission must defend its exercise of ancillary authority on a case-by-case basis.”

It would seem like all the players and props were on stage for the final act of the network neutrality drama. But Tuesday’s court ruling showed that the endgame is not at hand. The bottom line is the same — the FCC cannot apply its anti-discrimination and anti-blocking rules; but, as I mentioned at the beginning of the article, the court offered its own sort of encouragement.

The court essentially used a duck test. They found that the FCC regulation looked like a common carrier obligation, so they rapped its knuckles for trying to force common carrier status on companies. Because the FCC had previously removed common carrier status from these companies, the court said it couldn’t impose such regulations now.

Verizon’s lawyers started by cutting and pasting Comcast’s objections to the FCC ruling, changing section 230(b) of the Telecom Act to section 706 and adding some other distracting objections of their own. The court didn’t buy the comparison, which leaves hope for those who want the FCC to rein in ISP business practices. The court even corrects a common reading of its earlier ruling, saying a bit snarkily, “In Comcast, we held that the Commission had failed to cite any statutory authority that justified its order, not that Comcast had never impaired Internet traffic.”

Some network neutrality advocates have reacted to the decisions and rulings I’ve discussed (as Free Press does) by asking the FCC to reverse its 2005 decision that allowed telecom companies essentially to expand as much as they want without opening up to competition. This would encounter insurmountable hurdles because government agencies have to cite compelling reasons to change any decision they’ve made, and eagle-eyed courts hold them to that high standard.

Other people trace the problem to the 1996 Telecom Act, apparently already outdated by rapid changes in the industry. I hardly need to assess the likelihood of getting Congress to take on a major revision at this time in its history, or the likelihood of Internet activists getting the result they want.

Or maybe communities will pool their resources to create their own infrastructure, a particularly bold suggestion when you consider how much it costs to string fiber between cities.

Tuesday’s ruling did not close off the FCC’s right to regulate Internet services — in fact, I think it expanded possibilities beyond the place they seemed to stand following the Comcast decision. I am not sure the current debate over things such as blocking is productive. I think much bigger forces are in play, as I discussed in my article last week about Internet centralization. However, I’ll lay odds that few, if any, lawyers will lose business as a result of Tuesday’s decision.

December 09 2013

Who will upgrade the telecom foundation of the Internet?

Although readers of this blog know quite well the role that the Internet can play in our lives, we may forget that its most promising contributions — telemedicine, the smart electrical grid, distance education, etc. — depend on a rock-solid and speedy telecommunications network, and therefore that relatively few people can actually take advantage of the shining future the Internet offers.

Worries over sputtering advances in bandwidth in the US, as well as an actual drop in reliability, spurred the FCC to create the Technology Transitions Policy Task Force, and to drive discussion of what they like to call the “IP transition”.

Last week, I attended a conference on the IP transition in Boston, one of a series being held around the country. While we tussled with the problems of reliability and competition, one urgent question loomed over the conference: who will actually make advances happen?

What’s at stake and why bids are coming in so low

It’s not hard to tally up the promise of fast, reliable Internet connections. Popular futures include:

  • Delivering TV and movie content on demand
  • Checking on your lights, refrigerator, thermostat, etc., and adjusting them remotely
  • Hooking up rural patients with health care experts in major health centers for diagnosis and consultation
  • Urgent information updates during a disaster, to aid both victims and responders

I could go on and on, but already one can see the outline of the problem: how do we get there? Who is going to actually create a telecom structure that enables everyone (not just a few privileged affluent residents of big cities) to do these things?

Costs are high, but the payoff is worthwhile. Ultimately, the applications I listed will lower the costs of the services they replace or improve life enough to justify an investment many times over. Rural areas — where investment is currently hardest to get — could probably benefit the most from the services because the Internet would give them access to resources that more centrally located people can walk or drive to.

The problem is that none of the likely players can seize the initiative. Let’s look at each one:

Telecom and cable companies
The upgrading of facilities is mostly in their hands right now, but they can’t see beyond the first item in the previous list. Distributing TV and movies is a familiar business, but they don’t know how to extract value from any of the other applications. In fact, most of the benefits of the other services go to people at the endpoints, not to the owners of the network. This has been a sore point with the telecom companies ever since the Internet took off, and it spurs their constant attempts to hold Internet users hostage and shake them down for more cash.

Given the limitations of the telecom and cable business models, it’s no surprise they’ve rolled out fiber in the areas they want and are actually de-investing in many other geographic areas. Hurricane Sandy brought this to public consciousness, but the problem has actually been mounting in rural areas for some time.

Angela Kronenberg of COMPTEL, an industry association of competitive communications companies, pointed out that it’s hard to make a business case for broadband in many parts of the United States. We have a funny demographic: we’re not as densely populated as the Netherlands or South Korea (both famous for blazingly fast Internet service), nor as concentrated as Canada and Australia, where it’s feasible to spend a lot of money getting service to the few remote users outside major population centers. There’s no easy way to reach everybody in the US.

Governments
Although governments subsidize network construction in many ways — half a dozen subsidies were reeled off by keynote speaker Cameron Kerry, former Acting Secretary of the Department of Commerce — such stimuli can only nudge the upgrade process along, not control it completely. Government funding has certainly enabled plenty of big projects (Internet access is often compared to the highway system, for instance), but it tends to go toward familiar technologies that the government finds safe, and therefore misses opportunities for radical disruption. It’s no coincidence that these safe, familiar technologies are provided by established companies with lobbyists all over DC.

As an example of how help can come from unusual sources, Sharon Gillett mentioned on her panel the use of unlicensed spectrum by small, rural ISPs to deliver Internet to areas that otherwise had only dial-up access. The FCC ruling that opened up “white space” spectrum in the TV band to such use has greatly empowered these mavericks.

Individual consumers
Although we are the ultimate beneficiaries of new technology (and will ultimately pay for it somehow, through fees or taxes), hardly anyone can plunk down the cash for it in advance: the vision is too murky and the reward too far down the road. John Burke, Commissioner of the Vermont Public Service Board, flatly said that consumers choose their phone service almost entirely on the basis of price and don’t really find out about its reliability and features until later.

Basically, consumers can’t bet that all the pieces of the IP transition will fall in place during their lifetimes, and rolling out services one consumer at a time is incredibly inefficient.

Internet companies
Google Fiber came up once or twice at the conference, but their initiatives are just a proof of concept. Even if Google became the lynchpin it wants to be in our lives, it would not have enough funds to wire the world.

What’s the way forward, then? I find it in community efforts, which I’ll explore at the end of this article.

Practiced dance steps

Few of the insights in this article came up directly in the Boston conference. The panelists were old hands who had crossed each other’s paths repeatedly, gliding between companies, regulatory agencies, and academia for decades. At the conference, they pulled their punches and hid their agendas under platitudes. The few controversies I saw on stage seemed to be launched for entertainment purposes, distracting from the real issues.

From what I could see, the audience of about 75 people came almost entirely from the telecom industry. I saw just one representative of what you might call the new Internet industries (Microsoft strategist Sharon Gillett, who went to that company after an august regulatory career) and two people who represent the public interest outside of regulatory agencies (speaker Harold Feld of Public Knowledge and Fred Goldstein of Interisle Consulting Group).

Can I get through to you?

Everyone knows that Internet technologies, such as voice over IP, are less reliable than plain old telephone service, but few realize how soon reliability of any sort will be a thing of the past. When a telecom company signs you up for a fancy new fiber connection, you are no longer connected to a power source at the telephone company’s central office. Instead, you get a battery that can last eight hours in case of a power failure. A local power failure may let you stay in contact with outsiders if the nearby mobile phone towers stay up, but a larger failure will take out everything.

These issues have a big impact on public safety, a concern raised at the beginning of the conference by Gregory Bialecki in his role as a Massachusetts official, and repeated by many others during the day.

There are ways around the new unreliability through redundant networks, as Feld pointed out during his panel. But the public and regulators must take a stand for reliability, as the post-Sandy victims have done. The issue in that case was whether a community could be served by wireless connections. At this point, they just don’t deliver either the reliability or the bandwidth that modern consumers need.

Mark Reilly of Comcast claimed at the conference that 94% of American consumers now have access to at least one broadband provider. I’m suspicious of this statistic because the telecom and cable companies have a very weak definition of “broadband” and may be including mobile phones in the count. Meanwhile, we face the possibility of a whole new digital divide consisting of people relegated to wireless service, on top of the old digital divide involving dial-up access.

We’ll take that market if you’re not interested

In a healthy market, at least three companies would be racing to roll out new services at affordable prices, but every new product or service must provide a migration path from the old ones it hopes to replace. Nowhere is this more true than in networks because their whole purpose is to let you reach other people. Competition in telecom has been a battle cry since the first work on the law that became the 1996 Telecom Act (and which many speakers at the conference say needs an upgrade).

Most of the 20th century accustomed people to thinking of telecom as a boring, predictable utility business, the kind that “little old ladies” bought stock in. The Telecom Act was supposed to knock the Bell companies out of that model and turn them into fierce innovators with a bunch of other competitors. Some people actually want to reverse the process and essentially nationalize the telecom infrastructure, but that would put innovation at risk.

The Telecom Act, especially as interpreted later by the FCC, fumbled the chance to enforce competition. According to Goldstein, the FCC decided that a duopoly (baby Bells and cable companies) was enough competition.

The nail in the coffin may have been the FCC ruling that any new fiber providing IP service was exempt from the requirements for interconnection. The sleight of hand that the FCC used to make this switch was a redefinition of the Internet: they conflated the use of IP on the carrier layer with the bits traveling around above, which most people think of as “the Internet.” But the industry and the FCC had a bevy of arguments (including the looser regulation of cable companies, now full-fledged competitors of the incumbent telecom companies), so the ruling stands. The issue then got mixed in with a number of other controversies involving competition and control on the Internet, often muddled together under the term “network neutrality.”

Ironically, one of the selling points that helps maintain a competitive company, such as Granite Telecom, is reselling existing copper. Many small businesses find that the advantages of fiber are outweighed by the costs, which may include expensive quality-of-service upgrades (such as MPLS), new handsets to handle VoIP, and rewiring the whole office. Thus, Senior Vice President Sam Kline announced at the conference that Granite Telecom is adding a thousand new copper POTS lines every day.

This reinforces the point I made earlier about depending on consumers to drive change. The calculus that leads small businesses to stick with copper may be dangerous in the long run. Besides lost opportunities, it means sticking with a technology that is aging and decaying by the year. Most of the staff (known familiarly as Bellheads) who designed, built, and maintain the old POTS network are retiring, and the phone companies don’t want to bear the increasing costs of maintenance, so reliability is likely to decline. Kline said he would like to find a way to make fiber more attractive, but the benefits are still vaporware.

At this point, the major companies and the smaller competing ones are both cherry picking in different ways. The big guys are upgrading very selectively and even giving up on some areas, whereas the small companies look for niches, as Granite Telecom has. If universal service is to become a reality, a whole different actor must step up to the podium.

A beautiful day in the neighborhood

One hope for change is through municipal and regional government bodies, linked to local citizen groups who know where the need for service is. Freenets, which go back to 1984, drew on local volunteers to provide free Internet access to everyone with a dial-up line, and mesh networks have powered similar efforts in Catalonia and elsewhere. In the 1990s, a number of towns in the US started creating their own networks, usually because they had been left off the list of areas that telecom companies wanted to upgrade.

Despite legal initiatives by the telecom companies to squelch municipal networks, they are gradually catching on. The logistics involve quite a bit of compromise (often, a commercial vendor builds and runs the network, contracting with the city to do so), but many town managers swear that advantages in public safety and staff communications make the investment worthwhile.

The limited regulatory power that cities have over cable companies (a control that is sometimes taken away) is a crude instrument, like a potter trying to manipulate clay with tongs. To craft a beautiful work, you need to get your hands right on the material. Ideally, citizens would design their own future. The creation of networks should involve companies and local governments, but also the direct input of citizens.

National governments and international bodies still have roles to play. Burke pointed out that public safety issues, such as 911 service, can’t be fixed by the market, and developing nations have very little fiber infrastructure. So, we need large-scale projects to achieve universal access.

Several speakers also lauded state regulators as the most effective centers to handle customer complaints, but I think the IP transition will be increasingly a group effort at the local level.

Back to school

Education emerged at the conference as one of the key responsibilities that companies and governments share. The transition to digital TV was accompanied by a massive education budget, but in my home town, there are still people confused by it. And it’s a minuscule issue compared to the task of going to fiber, wireless, and IP services.

I had my own chance to join the educational effort on the evening following the conference. Friends from Western Massachusetts phoned me because they were holding a service for an elderly man who had died. They lacked the traditional 10 Jews (the minyan) required by Jewish law to say the prayer for the deceased, and asked me to Skype in. I told them that remote participation would not satisfy the law, but they seemed to feel better if I did it. So I said, “If Skype will satisfy you, why can’t I just participate by phone? It’s the same network.” See, FCC? I’m doing my part.

October 31 2012

NYC’s PLAN to alert citizens to danger during Hurricane Sandy

Starting at around 8:36 PM ET last night, as Hurricane Sandy began to flood the streets of lower Manhattan, many New Yorkers began to receive an unexpected message: a text alert on their mobile phones that strongly urged them to seek shelter. It showed up on iPhones and Android devices alike.

While the message was clear enough, the way that these messages ended up on the screens may not have been clear to recipients or observers. And still other New Yorkers were left wondering why emergency alerts weren’t on their phones.

Here’s the explanation: the emergency alerts that went out last night came from New York’s Personal Localized Alerting Network, the “PLAN” the Big Apple launched in late 2011.

NYC chief digital officer Rachel Haot confirmed that the messages New Yorkers received last night were the result of a public-private partnership between the Federal Communications Commission, the Federal Emergency Management Agency, the New York City Office of Emergency Management (OEM), the CTIA and wireless carriers.

While the alerts may look quite similar to text messages, they travel over a separate, parallel channel, which lets them get through even when text messaging traffic is congested. NYC’s PLAN is the local version of the Commercial Mobile Alert System (CMAS) that has been rolling out nationwide over the last year.

“This new technology could make a tremendous difference during disasters like the recent tornadoes in Alabama where minutes – or even seconds – of extra warning could make the difference between life and death,” said FCC chairman Julius Genachowski, speaking last May in New York City. “And we saw the difference alerting systems can make in Japan, where they have an earthquake early warning system that issued alerts that saved lives.”

NYC was the first city to have it up and running, last December, and less than a year later, the alerts showed up where and when they mattered.

The first such message I saw shared by a New Yorker actually came on October 28th, when the chief digital officer of the Columbia Journalism School, Sree Sreenivasan, tweeted about receiving the alert. He tweeted out the second alert he received, on the night of the 29th, as well.

These PLAN alerts go out to everyone with an enabled mobile device in a targeted geographic area, enabling emergency management officials at the state and local level to get an alert to the right people at the right time. And in an emergency like a hurricane, earthquake or fire, connecting affected residents to critical information at the right time and place is essential.

While the government texting him gave national security writer Marc Ambinder some qualms about privacy, the way the data is handled looks much less disconcerting than, say, needing to opt-out of sharing location data or wireless wiretapping.

PLAN alerts are free and automatic, unlike opt-in messages from Notify NYC or signing up for email alerts from OEM.

Not all New Yorkers received an emergency alert during Sandy because not all mobile devices have the necessary hardware installed or have updated the relevant software. As of May 2011, when the system was announced, new iPhones and Android devices already had the necessary chip. (Most older phones, not so much.)

These alerts don’t go out for minor issues, either: the system is only used by authorized state, local or national officials during public safety emergencies. They send the alert to CMAS, it’s authenticated, and then the system pushes it out to all enabled devices in a geographic area.
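
For readers who want that flow in concrete terms, here is a toy Python sketch of the sequence just described. It is purely illustrative: the class names, the registry of authorized senders, and the rectangular target area are my own simplifications, not the actual CMAS gateway protocol.

    # Toy simulation of the described flow: an authorized official submits an
    # alert, the gateway authenticates the sender, and every enabled device
    # inside the target area receives the broadcast.
    from dataclasses import dataclass

    @dataclass
    class Device:
        owner: str
        lat: float
        lon: float
        cmas_capable: bool = True

    @dataclass
    class Alert:
        sender: str
        category: str   # "presidential", "amber", or "imminent threat"
        text: str
        area: tuple     # crude bounding box: (min_lat, min_lon, max_lat, max_lon)

    AUTHORIZED_SENDERS = {"NYC OEM", "FEMA"}   # hypothetical registry

    def in_area(device, area):
        min_lat, min_lon, max_lat, max_lon = area
        return min_lat <= device.lat <= max_lat and min_lon <= device.lon <= max_lon

    def broadcast(alert, devices):
        if alert.sender not in AUTHORIZED_SENDERS:
            raise PermissionError("alert rejected: unauthenticated sender")
        return [d.owner for d in devices if d.cmas_capable and in_area(d, alert.area)]

    if __name__ == "__main__":
        devices = [Device("lower Manhattan resident", 40.707, -74.011),
                   Device("Albany resident", 42.652, -73.756)]
        alert = Alert("NYC OEM", "imminent threat", "Flood warning: seek shelter",
                      (40.68, -74.05, 40.75, -73.96))
        print(broadcast(alert, devices))   # only the Manhattan device gets it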

Consumers receive only three types of messages: alerts issued by the President, Amber Alerts, and alerts involving “imminent threats to safety or life.” The last category covers the ones that went out about Hurricane Sandy in NYC last night.

According to the FCC, participating mobile carriers can allow their subscribers to block all but Presidential alerts, although it may be a little complicated to navigate a website or call center to do so. By 2014, every mobile phone sold in the United States must be CMAS-capable. (You can learn more about CMAS in this PDF). Whether such mobile phones should be subsidized for the poor is a larger question that will be left to the next administration.

As more consumers replace their devices in the years ahead, more people around the United States will also be able to receive these messages, benefiting from a public-private partnership that actually worked to deliver on improved public safety.

At least one New Yorker got the message and listened to it:

“If ‘act’ means stay put, then why yes I did,” tweeted Noreen Whysel, operations manager of the Information Architecture Institute. “It was enough to convince my husband from going out….”

Here’s hoping New York City doesn’t have to use this PLAN to tell her and others about impending disaster again soon.

January 30 2012

A discussion with David Farber: bandwidth, cyber security, and the obsolescence of the Internet

David Farber, a veteran of Internet technology and politics, dropped by Cambridge, Mass. today and was gracious enough to grant me some time in between his numerous meetings. On leave from Carnegie Mellon, Dave still intervenes in numerous policy discussions related to the Internet and "plays in Washington," as well as hosting the popular Interesting People mailing list. This list delves into dizzying levels of detail about technological issues, but I wanted to pump him for big ideas about where the Internet is headed, topics that don't make it to the list.

How long can the Internet last?

I'll start with the most far-reaching prediction: that Internet protocols simply aren't adequate for the changes in hardware and network use that will come up in a decade or so. Dave predicts that computers will be equipped with optical connections instead of pins for networking, and the volume of data transmitted will overwhelm routers, which at best have mixed optical/electrical switching. Sensor networks, smart electrical grids, and medical applications with genetic information could all increase network loads to terabits per second.

When routers evolve to handle terabit-per-second rates, packet-switching protocols will become obsolete. The speed of light is constant, so we'll have to rethink the fundamentals of digital networking.

I tossed in the common nostrum that packet-switching was the fundamental idea behind the Internet and its key advance over earlier networks, but Dave disagreed. He said lots of activities on the Internet reproduce circuit-like behavior, such as sessions at the TCP or Web application level. So theoretically we could re-architect the underlying protocols to fit what the hardware and the applications have to offer.
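
As a concrete illustration of that circuit-like behavior (my own sketch, not an example Farber gave), consider how an ordinary TCP exchange is set up, used in order, and torn down much like a phone call, even though the packets underneath are routed independently:

    # A TCP session behaves like a circuit: call setup (the handshake), ordered
    # delivery while it lasts, then teardown. This makes one live HTTP request.
    import socket

    def fetch_status_line(host="example.com", port=80):
        # "Call setup": create_connection performs the three-way handshake.
        with socket.create_connection((host, port), timeout=5) as conn:
            conn.sendall(b"HEAD / HTTP/1.1\r\nHost: " + host.encode() +
                         b"\r\nConnection: close\r\n\r\n")
            chunks = []
            while True:
                data = conn.recv(4096)   # bytes arrive in order, as on a circuit
                if not data:             # empty read means the far end hung up
                    break
                chunks.append(data)
        # "Call teardown" happens as the with-block closes the socket.
        return b"".join(chunks).decode(errors="replace").splitlines()[0]

    if __name__ == "__main__":
        print(fetch_status_line())   # e.g. "HTTP/1.1 200 OK"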

But he says his generation of programmers who developed the Internet are too tired ("It's been a tough fifteen or twenty years") and will have to pass the baton to a new group of young software engineers who can think as boldly and originally as the inventors of the Internet. He did not endorse any of the current attempts to design a new network, though.

Slaying the bandwidth bottleneck

Like most Internet activists, Dave bewailed the poor state of networking in the U.S. In advanced nations elsewhere, 100-megabit per second networking is available at reasonable cost, whereas here it's hard to go beyond 30 megabits per second (on paper!) even at enormous prices and in major metropolitan areas. Furthermore, the current administration hasn't done much to improve the situation, even though candidate Obama made high bandwidth networking a part of his platform and FCC Chairman Julius Genachowski talks about it all the time.

Dave has been going to Washington on tech policy consultations for decades, and his impressions of the different administrations have a slant all their own. The Clinton administration really listened to staff who understood technology--Gore in particular was quite a technology junkie--and the administration's combination of judicious policy initiatives and benign neglect led to the explosion of the commercial Internet. The following Bush administration was famously indifferent to technology at best. The Obama administration lies somewhere in between in cluefulness, but despite its frequent plaudits for STEM and technological development, Dave senses that neither Obama nor Biden really has the drive to deal with and examine complex technical issues and insist on action where necessary.

I pointed out the U.S.'s particular geographic challenges--with a large, spread-out population making fiber expensive--and Dave countered that fiber to the home is not the best solution. In fact, he claims no company could make fiber pay unless it gained 75% of the local market. Instead, phone companies should string fiber to access points 100 meters or so from homes, and depend on old copper for the rest. This could deliver quite adequate bandwidth at a reasonable cost. Cable companies, he said, could also greatly increase Internet speeds. Wireless companies are pretty crippled by loads that they encouraged (through the sale of app-heavy phones) and then had problems handling, and are busy trying to restrict users' bandwidth. But a combination of 4G, changes in protocols, and other innovations could improve their performance.

Waiting for the big breach

I mentioned that in the previous night's State of the Union address, Obama had made a vague reference to a cybersecurity initiative (http://www.whitehouse.gov/blog/2012/01/26/legislation-address-growing-danger-cyber-threats) with a totally unpersuasive claim that it would protect us from attack. Dave retorted that nobody has a good definition of cybersecurity, but that this detail hasn't stopped every agency with a stab at getting funds for it from putting forward a cybersecurity strategy. The Army, the Navy, Homeland Security, and others are all looking for new missions now that old ones are winding down, and cybersecurity fills the bill.

The key problem with cybersecurity is that it can't be imposed top-down, at least not on the Internet, which, in a common observation reiterated by Dave, was not designed with security in mind. If people use weak passwords (and given current password cracking speeds, just about any password is weak) and fall victim to phishing attacks, there's little we can do with diktats from the center. I made this point in an article twelve years ago. Dave also pointed out that viruses stay ahead of pattern-matching virus detection software.
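
To put rough numbers behind the password remark (my own back-of-the-envelope arithmetic, not Farber's figures, and the guess rate is an assumption about offline cracking rigs), the keyspace of a short password simply isn't big enough:

    # Estimate brute-force time as keyspace divided by an assumed guess rate.
    def crack_time_seconds(alphabet_size, length, guesses_per_second=1e9):
        return alphabet_size ** length / guesses_per_second

    # 8 lowercase letters: 26**8 is about 2.1e11 guesses, a few minutes of work.
    print(crack_time_seconds(26, 8))            # roughly 209 seconds
    # 8 mixed-case letters and digits: 62**8, on the order of a couple of days.
    print(crack_time_seconds(62, 8) / 86400)    # roughly 2.5 days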

Security will therefore need to be rethought drastically, as part of the new network that will replace the Internet. In the meantime, catastrophe could strike--and whoever is in the Administration at the time will have to face public wrath.

Odds without ends

We briefly discussed FCC regulation, where Farber tends to lean toward asking the government to forbear. He acknowledged the merits of arguments made by many Internet supporters that the FCC tremendously weakened the chances for competition in 2002 when it classified cable Internet as a Title I service. This shielded the cable companies from regulations under a classification designed back in early Internet days to protect the mom-and-pop ISPs. And I pointed out that the cable companies have brazenly sued the FCC to win court rulings saying the companies can control traffic any way they choose. But Farber says there are still ways to bring in the FCC and other agencies, notably the Federal Trade Commission, to enforce antitrust laws, and that these agencies have been willing to act to shut down noxious behavior.

Dave and I shared other concerns about the general deterioration of modern infrastructure, affecting water, electricity, traffic, public transportation, and more. An amateur pilot, Dave knows some things about the air traffic systems that make one reluctant to fly. But there are few simple fixes. Commercial air flights are safe partly because pilots possess great sense and can land a plane even in the presence of confusing and conflicting information. On the other hand, Dave pointed out that mathematicians lack models to describe the complexity of such systems as our electrical grid. There are lots of areas for progress in data science.

April 05 2011

FCC.gov reboots as an open government platform

A decade ago, the Federal Communications Commission's (FCC) website received an award for the best website in federal government, but the largely static repository has steadily fallen over the years to become one of the worst. Today, the bar has been raised for federal government website reboots with the relaunch of the new FCC.gov, now available in beta at beta.FCC.gov.

FCC.gov home page

The new site is organized around the three primary activities: file a public comment, file a complaint, and search for information. The insight for that redesign came through a combination of online traffic analysis, requests for information through the call center, and conversations with FCC employees.

Some changes that go along with the new FCC.gov are literally tiny, like the newly launched FCC.us URL shortener. Others look small but are a big deal, like secure HTTPS web browsing across FCC.gov. Other upgrades work on small devices, enabling interested parties to watch proceedings wherever they are: the fcc.gov/live livestream now includes the ability to sense the device that someone is using and convert on the fly to HTML5 or Flash. That livestream can also be embedded on other websites.
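
As a rough illustration of that kind of device sensing (a minimal sketch of the general technique, not the FCC's actual code; the token list is an assumption for illustration), a server can inspect the browser's User-Agent header and choose a stream format accordingly:

    # Serve HTML5 video to devices known to lack Flash, the Flash player otherwise.
    def pick_stream_format(user_agent: str) -> str:
        flashless_tokens = ("iphone", "ipad", "ipod")   # illustrative list only
        ua = user_agent.lower()
        return "html5" if any(t in ua for t in flashless_tokens) else "flash"

    print(pick_stream_format("Mozilla/5.0 (iPad; CPU OS 5_0 like Mac OS X)"))   # html5
    print(pick_stream_format("Mozilla/5.0 (Windows NT 6.1; rv:10.0) Firefox"))  # flash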

All of those upgrades add up to a greater whole. Broadly speaking, FCC managing director Steve Van Roeckel and his team of software developers, designers, new media and IT security staff have worked hard to bring Web 2.0 principles into the FCC's online operations. Those principles include elements of open data, platform thinking, collective intelligence, and lightweight social software. What remains to be seen in the years ahead is how much incorporating Web 2.0 into operations will change how the FCC operates as a regulator.

Nearly two years ago, Tim O'Reilly and John Battelle asked how Web 2.0 technologies could transform the actual practice of governing. The FCC has taken a big step toward that vision, at approximately $1.35 million in total development costs. "Everything should be an API," said Van Roeckel, speaking in a briefing on Monday. "The experiences that live outside of FCC.gov should interact back into it. In a perfect world, no one should have to visit the FCC website." Instead, he said, you'd go to your favorite search engine or favorite app and open data from the FCC's platform would be baked into it.

The overhaul of FCC.gov has been underway since last September. "We're approaching .gov like .com," Van Roeckel said at the time. Seven months later, FCC.gov is the next iteration of what an open government platform can be — at least with respect to the digital architecture for a regulatory agency.

"It is our intention that every proceeding before the agency will be available for public comment," Van Roeckel said at the briefing. "If we think of citizens as shareholders, we can do a lot better. Under the Administrative Procedure Act, agencies will get public comments that enlighten decisions. When citizens care, they should be able to give government feedback, and government should be able to take action. We want to enable better feedback loops to enable that to happen."

Following are five ways the new FCC.gov improves on the previous version, followed by an analysis of how some of these changes relate to open government.

1. FCC.gov runs on open source

Specifically, the FCC.gov open source redesign runs on Drupal, like Energy.gov, House.gov and WhiteHouse.gov. The FCC also considered Sharepoint, Documentum, WordPress and Ruby on Rails before ultimately going with Drupal. The use of Drupal at the White House was a "strong validator" for that choice, said Van Roeckel. As the White House has done, Van Roeckel said that the FCC will contribute code back to the Drupal community.

2. FCC.gov is hosted in the cloud

Federal cloud computing is no longer on the horizon. It's now a reality. Last May, the White House moved Recovery.gov to Amazon's cloud. The new Treasury.gov is hosted on Amazon's cloud. Today, the new FCC.gov is hosted in Terremark's cloud, according to Van Roeckel. As with Treasury.gov, FCC.gov content is accelerated by the Akamai Content Delivery Network.

This Terremark implementation has been certified at a higher level of security required for compliance with the Federal Information Security Management Act (FISMA). FCC.gov, in fact, is one of the first federal websites in the cloud at the FISMA moderate level. Van Roeckel, however, cautioned that only information that the agency deems "read-only" will be hosted externally. Transactional implementations in the cloud will follow later. "Everything that the government allows us to shift to the cloud, we will shift to the cloud," he said.

The move to the cloud is being driven, as with so many aspects of government, by costs. As with much of industry, the FCC's servers have been underutilized. Moving to the cloud will enable the agency to more closely track actual usage and adjust capacity. "My goal is to move everything from a capital expense to an operating expense," said Van Roeckel.

3. FCC.gov incorporates collective intelligence

Over time, the new FCC.gov will begin to show the most popular pages, official documents, and comments on the site.

Last year, the Office of Management and Budget (OMB) updated rules for cookies on federal websites. Among other changes, the new guidance allowed federal agencies to use data analytics to monitor website traffic, interact and engage with citizens online, deliver e-services, and provide information.

Van Roeckel also pointed to the use of FCC broadband speed testing apps to collect more than 2 million tests across the United States as a precedent for looking to citizens as sensors. The testing data was integrated into the FCC's national broadband map. Van Roeckel said that he has hopes for other uses of collective intelligence in the future, like crowdsourcing cellular tower locations.
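
To make the "citizens as sensors" idea concrete, here is a minimal Python sketch of the aggregation step: collect crowd-sourced speed tests and reduce them to one number per area so they can be mapped. The file layout and column names are hypothetical, not the FCC's actual schema.

    # Read rows like "census_block,download_mbps" and compute a median per block.
    import csv
    from collections import defaultdict
    from statistics import median

    def median_speed_by_area(path):
        samples = defaultdict(list)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                samples[row["census_block"]].append(float(row["download_mbps"]))
        return {block: median(speeds) for block, speeds in samples.items()}

    # Hypothetical usage, given a speed_tests.csv collected from volunteers:
    # print(median_speed_by_area("speed_tests.csv"))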

4. FCC.gov has significantly improved search

The less that's said about the ability of visitors to find information on the old FCC.gov, the better. The new FCC.gov uses Solr, an open source enterprise search platform from the Apache Lucene project. The new functionality is built upon a topic-based "FCC encyclopedia" that provides dynamically generated results for each search. "It's not breakthrough stuff, but it's breakthrough for government," said Van Roeckel, who indicated that further improvements and fine tuning are coming within a week.
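
For the curious, this is roughly what querying a Solr-backed search looks like from the outside. The q, rows and wt parameters are standard Solr query syntax; the host and core name in the example are hypothetical placeholders, not the FCC's actual endpoints.

    # Issue a standard Solr select query and return the matching documents.
    import json
    from urllib.parse import urlencode
    from urllib.request import urlopen

    def solr_search(base_url, query, rows=10):
        params = urlencode({"q": query, "rows": rows, "wt": "json"})
        with urlopen(f"{base_url}/select?{params}") as resp:
            return json.load(resp)["response"]["docs"]

    # Hypothetical endpoint:
    # docs = solr_search("http://search.example.gov/solr/fcc_encyclopedia", "net neutrality")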

5. FCC.gov is a platform for open data

"The whole website is built on RESTful services by itself," said Van Roeckel. "What I saw when I came in is that we had amazing data locked up in silos that were inaccessible to companies and citizens. There were all these roadblocks to getting and using data."

The new site makes the data for public comments accessible with an associated API. There are also now chief data officers for wireline, wireless, consumer, media, enforcement, international, engineering and legislative affairs. Each data officer is personally accountable for getting data out to the public.

The roadblocks are far from gone, but due to the efforts of the FCC's first chief data officer, Greg Elin, some are being removed. The agency launched FCC.gov/data, worked to share more APIs at FCC.gov/developer and hosted its first open developer day. Van Roeckel says the agency is working to standardize its data, with XML as the general direction. If the future he described is near, the FCC is increasingly going to ask companies to file regulatory information in electronic, machine-readable formats.
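
The pattern Van Roeckel describes is simple enough to sketch: a plain HTTP GET against a documented endpoint returns machine-readable records, with no scraping required. The URL, parameters and field names below are hypothetical placeholders, not the FCC's actual API.

    # Fetch records from a hypothetical REST endpoint that returns JSON.
    import json
    from urllib.request import urlopen

    def fetch_public_comments(proceeding, base="https://data.example.gov/api/comments"):
        with urlopen(f"{base}?proceeding={proceeding}&format=json") as resp:
            return json.load(resp)

    # Hypothetical usage:
    # for comment in fetch_public_comments("example-docket"):
    #     print(comment["author"], comment["date_received"])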

Open government, inside and out

There are three different lenses through which to look at what open government means for a federal agency, at least as defined by the Open Government Directive: transparency, participation and collaboration.

Open data, online video and collective intelligence applied to governance will help with transparency. Collective intelligence may help to surface key documents or comments. Participation and collaboration in open government have proven to be a harder nut to crack.

The role that a regulator plays matters here. For example, comments from Broadband.gov or OpenInternet.gov were entered into the public record. "Today, you can take us to court with one of the blog comments from Broadband.gov," said Van Roeckel. "More than 300,000 citizens gave comment on the Open Internet proceeding." Whether those comments lead to positive or negative public policy changes is both an open and contentious question, as this analysis of those who win and lose under FCC net neutrality rules suggests.

That doesn't mean improving the capacity of the FCC to conduct more open rulemaking online wasn't worth the effort. It means that to make those processes truly open, the regulators themselves must shift to being more open.

Embracing the spirit of open government will require all agencies to move beyond what information technology by itself can accomplish or empower. That's a tall order. It requires a cultural change. To put it another way, open government is a mindset.

That's particularly true when applying an open government mandate to an institution with around 1,900 workers, where the dynamics that MIT research professor Andrew McAfee aptly described in "Gov 2.0 vs the beast of bureaucracy" are in play.

Van Roeckel said that the FCC launched Reboot.gov internally first, including an anonymous comment box and blog. They're working on "bringing Web 2.0 culture into the building," where possible. For instance, the agency has been using the ThinkUp app for internal collaborative innovation.

For other agencies to succeed in a similar refresh, Van Roeckel shared a key point of advice: "Get someone on the executive team who can get resources and own the mandate. That the chairman cares about this and that I care about this is why it's happening."

Whether those internal and external efforts will lead to a 21st century regulatory agency isn't clear yet. That's a judgment that historians are better suited to render, rather than those chronicling the rough draft of history. What is indisputable is that today, there's a new FCC.gov for the world to explore.





February 08 2011

"Copy, paste, map"

Data, data everywhere, and all too many spreadsheets to think.

Citizens have a new tool to visualize data and map it onto their own communities. Geospatial startup FortiusOne and the Federal Communications Commission (FCC) have teamed up to launch IssueMap.org. IssueMap is squarely aimed at addressing one of the biggest challenges that government agencies, municipalities and other public entities have in 2011: converting open data into information that people can distill into knowledge and insight.

IssueMap must, like the data it visualizes, be put in context. The world is experiencing an unprecedented data deluge, a reality that my colleague Edd Dumbill described as another "industrial revolution" at last week's Strata Conference. The Open Government Directive issued by the Obama Administration has made even more data available. The challenge is that for most citizens, the hundreds of thousands of data sets available at Data.gov, or at state or city data catalogs, don't lead to added insight or utility in their everyday lives. This partnership between FortiusOne and the FCC is an attempt to give citizens a mapping tool to make FCC data meaningful.

IssueMap

There are powerful data visualization tools available to developers who wish to mash up data with maps, but IssueMap may have a key differentiator: simplicity. As Michael Byrne, the first FCC geospatial information officer, put it this morning, the idea is to make it as easy as "copy, paste, map."

Byrne blogged about the history of IssueMap at Reboot.gov:

Maps are a data visualization tool that can fix a rotten spreadsheet by making the data real and rich with context. By showing how data — and the decisions that produce data — affect people where they live, a map can make the difference between a blank stare and a knowing nod. Maps are also a crucial part of a decision-maker's toolkit, clearly plotting the relationship between policies and geographies in easy-to-understand ways.

Working with FCC deputy GIO Eric Spry, Byrne created a video as well.

IssueMap was created using FortiusOne's GeoIQ data visualization and analysis platform. "We built GeoIQ to enable non-technical users to easily make sense of data," said Sean Gorman, president and founder of FortiusOne. "IssueMap capitalizes on those core capabilities, enabling citizens to bring greater awareness of important issues and prompt action."

Gorman explained how to use IssueMap at the FortiusOne blog:

Once you've found some data you can either upload the spreadsheet (.csv, .xls, .xlsx, .odf) or just cut and paste into the IssueMap text box. Many tables you find online can also be cut and pasted to create a map. The data just needs to be clean with the first row containing your attributes and the data beneath having the values and geographies you would like to map. Even if you muck it up a bit IssueMap will give you helpful errors to let you know where you went wrong.

Once you've loaded your data just select the boundary you would like to join to and the value you would like to map. Click “Create Map” and magic presto you have a thematic map. Share your map via Twitter, Facebook or email. Now anyone can grab your map as an embed, download an image, grab the map as KML or get the raw data as a .csv. Your map is now viral and it can be repurposed in a variety of useful ways.
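
In other words, the input is just a flat table: a header row naming the attributes, then one row per geography with its value. Here is a minimal Python sketch that produces such a file; the column names and figures are made-up examples, not real FCC data.

    # Write a small CSV in the shape IssueMap expects: attributes in the first
    # row, one geography plus its value per row after that.
    import csv

    rows = [
        {"state": "Massachusetts", "broadband_adoption_pct": 85},
        {"state": "Mississippi",   "broadband_adoption_pct": 52},
    ]

    with open("issue_data.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["state", "broadband_adoption_pct"])
        writer.writeheader()
        writer.writerows(rows)

    # Paste the resulting rows into IssueMap's text box (or upload the file),
    # then pick the state boundary to join on and the value column to map.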

One of the most powerful ways humanity has developed to communicate information over time is through maps. If you can take data in an open form (and CSV files are one of the most standard formats available) then there's an opportunity to tell stories in a way that's relevant to a region and personalized to an individual. That's a meaningful opportunity.






December 21 2010

Steve Wozniak on the FCC and Internet freedom

Earlier today, Apple co-founder Steve Wozniak published a passionate open letter to the FCC that described his personal history with the telecommunications industry.

Wozniak followed that up with a surprise appearance at the Federal Communications Commission's public hearing on new open Internet rules and net neutrality. Steven Levy of Wired Magazine tweeted about the unexpected arrival: "Woz is at FCC hearing to speak against the plan--sez that with these rules, he couldn't have done Apple."

Interviewed by various media outlets after the hearing, Wozniak explained his presence:

I wanted to be here because this day was so significant to my life. I had a ham radio license when I was 10 years old. I had the FCC spectrums on my wall in my room. I grew up admiring the FCC ... The FCC has always sort of had a white hat. This is a case where that hat could go black.

Wozniak was not happy with much of what he heard from the commission at today's hearing:

I don't think the rules went far enough in protecting individuals, but I tend to be very much on the side of the small guy being taken advantage of by the big guy. I feel sorry about that. I feel emotional about that.

Specifically, Wozniak expressed concerns over blocking issues:

... no blockages doesn't mean there will be no inhibitions. It wasn't clear what was presented here today to me if that means you can't really favor one source over another. You know, an innovator comes along, they shouldn't have any blocks on the Internet. To me, the Internet, the ISPs, should just be providing things like the copper to your house and the gear that puts it onto the Internet. Step back, get out of the way, don't try to make it go your way.

Wozniak also noted he had been personally affected by nearly every issue FCC commissioner Michael Copps raised in his statement at the hearing.

November 09 2010

Geeks and government converge at the FCC

Yesterday, the first FCC developer day focused on open government innovation. For a day, the commission room that has hosted hearings on spectrum policy, licensing, mergers and net neutrality was full of geeks focused on making something useful from the FCC's new APIs and open data stores.

One of those geeks is well-known to many developers: the founding editor of Lifehacker, Gina Trapani. I spoke with her in the FCC's new TEC Lab about her work on the ThinkUp app, the prototype apps that came out of the hackathon, and the potential for geeks to create better outcomes for citizens -- and maybe make a few dollars along the way -- with open government data.

Will a rebooted FCC.gov become a platform for applications driven by open government data? If that vision is going to come to fruition, the nation's pre-eminent communication regulator will have to do more than just publish open data sets in machine readable formats. It must also develop a community of software developers that benefits from creating such applications.

Monday's FCC developer day was a first step toward that future. Whether it’s a successful one will depend on how the applications help citizens, businesses or other organizations do something new. In the process, expect a few savvy entrepreneurs to tap into the goldmine of Gov 2.0, empowering citizens along the way.

September 09 2010

Four short links: 9 September 2010

  1. CloudUSB -- a USB key containing your operating environment and your data + a protected folder so nobody can access your data, even if you lose the key + a backup program which keeps a copy of your data on an online disk, with double password protection. (via ferrouswheel on Twitter)
  2. FCC APIs -- for spectrum licenses, consumer broadband tests, census block search, and more. (via rjweeks70 on Twitter)
  3. Sibyl: A system for large scale machine learning (PDF) -- paper from Google researchers on how to build machine learning on top of a system designed for batch processing. (via Greg Linden)
  4. The Surprisingness of What We Say About Ourselves (BERG London) -- I made a chart of word-by-word surprisingness: given the statement so far, could Scribe predict what would come next?

September 02 2010

FCC.gov poised for an overdue overhaul

For an agency that is charged with regulating communications, the Federal Communications Commission (FCC) has been a bit behind the curve in evolving its online presence. FCC.gov was launched in June 1995, redesigned in 1999, and relaunched again in September 2001. Since then, it has remained a largely static repository for public notices and information about the agency's actions.

According to FCC officials, that's going to change, and soon. There was already some insight offered into redesigning the FCC website back in January on the agency blog, informed at least in part by discussions with Sunlight Labs on redesigning the government.

Yesterday, I interviewed FCC managing director Steven VanRoekel at FCC headquarters about what rebooting FCC.gov will mean for the agency, businesses and the American people. "The new site will embrace open government principles around communication and participation," said VanRoekel. "Consider OpenInternet.gov, where over 30,000 ideas were generated, or Broadband.gov. Comments there go into the official record and are uploaded to the Library of Congress. You will see that in a much more pervasive way in the new FCC.gov."

Our short video interview is below. An extended interview follows.

Redesigning FCC websites for public comment

In January, the FCC launched Reboot.gov and asked for public input on improving citizen interaction. The site, which was touted as the first website to solicit citizen interaction with the FCC, followed the launch of Broadband.gov and OpenInternet.gov last year. All three websites are notable for their clean design and integration of new media components (blogs, Twitter, etc). Chairman Julius Genachowski's introduction to the site is embedded below:

Improving public access to the FCC's operation is part of a new mentality, according to VanRoekel: "Last year, the chairman talked about entrepreneurs taking a rotation through government. We think a lot about bringing in great leadership and managing people around leadership. We were third from the bottom last year in rankings for the best places to work in federal government. We hired a new team to bring in a new culture, which means looking at citizens as shareholders."

One of the stated aims of Reboot.gov was to gather feedback on how FCC.gov itself can be redesigned, a project that, as noted above, is long overdue. The announcement of the new site, for instance, showed up in email but was not posted in plain text on FCC.gov. Like other releases, it showed up as a Word doc and PDF on the site. That said, the FCC has picked up the pace of its communications over the past year, as anyone who has followed the @FCC on Twitter knows.

Aside from the cleaner design of the new microsites and an embrace of social media, open government geeks and advocates took note of FCC.gov/data, which is meant to be "an online clearinghouse for the Commission's public data." The FCC has posted XML feeds and search tools for its documents that allow users to sort data by type and bureau.

Under the Media Bureau, for instance, visitors can explore DTV Station Coverage Maps, a key issue for many given the transition to digital TV last year. But the maps are on the old FCC.gov. Those who don't enjoy good public DTV reception have to find the tiny icon for DTV.gov below the fold and click through to get more information.




FCC's clunky clickstream

Navigation on FCC.gov is still a work in progress. In this example, a user who clicks a link for "DTV Station Coverage Maps" is taken to the old FCC site. From there, they need to find and click on a DTV icon to receive deeper information.



That kind of reciprocal citizen-to-government interaction is precisely where the potential for these sites can best be realized, and where good design matters. So-called Web 1.0 tools like static websites, email and SMS have been used to share information about the quality of services. Web 2.0 services like blog comments and social media have, in turn, been deployed to gather feedback from citizens about the delivery of said services. The FCC began to pursue that potential in earnest in March, when the FCC went mobile and launched iPhone and Android apps for crowdsourced broadband speed testing.

The potential for empowering citizens and developers with open data is where VanRoekel focused first when we talked.

"We'll be announcing a couple of things next week at the Gov 2.0 Summit," he said. "Since we launched the speed test, we've gathered over a million data points. That continues to grow each day. We're going to launch a web services API where people can write apps against the speed test data. You'll be able to send us a GPS coordinate and we'll give you data against it."





FCC Chairman Julius Genachowski and Managing Director Steven VanRoekel will discuss their experiences turning FCC.gov into a 21st-century consumer resource at the Gov 2.0 Summit in Washington, D.C. (Sept. 7-8). Request an invitation.






If incorporated into Zillow.com or the thousands of online real estate brokerages, that kind of interaction has the potential to give people what they need to make more informed rental or buying decisions. "When I click on a house on a real estate site, why don't I see what broadband capabilities are there?" asked VanRoekel. "We're approaching .gov like .com. We're not only setting up data services and wrapping the API, but we're building apps as well, and utilizing the same APIs we expect developers to use."

A consistent challenge across government for releasing open data has been validation and accuracy. The FCC may employ crowdsourcing to address the quality issue. "Think about a map of broadband speeds," VanRoekel explained. "I would love the ability for users to show us what's valid."

Balancing transparency and open government

As a regulatory body, the FCC has both great power and great responsibility, to put it in terms that Stan Lee might appreciate. Despite the arcane nature of telecommunications law, the agency's decisions have the potential to affect every citizen in the nation. As VanRoekel pointed out, the FCC must follow administrative procedures and publish drafts of rulemaking for public comment, followed by a vote by the commissioners. In the age of the Internet and the open government directive, that process is due for the same reboot that FCC.gov will receive.

"Once approved, language in the APA [Administrative Procedure Act] says government will open up the notice of draft rules to enlighten public decision-making," said VanRoekel. "In the past, what that's meant is us putting it up on a website, in PDFs. Law firms would send clerks, who would photocopy folders and come back with comments at the draft rule. There was no way for an educator or an affected family to get involved. It's our vision that every rule that's up for decision in this agency will be opened for public input."

The first draft of that effort has been on display at OpenInternet.gov. "We made it so that an idea entered into our engines was entered into the public record," VanRoekel said. "An interesting fact there is that you, as a citizen or industry body, can see the comments and hold us legally liable."

The FCC is faced with difficulties that derive from handling the explosion of online feedback that contentious issues like net neutrality generate. "The volume of comments becomes our problem," he said. "When you have 30,000 ideas coming in and comments on top of them on the record, and we have a limited number of people that oversee the effort, that's our biggest challenge."

While the FCC has touted new tools for openness and transparency, it has also taken a beating over a lack of transparency in closed-door meetings on Internet rules.

"There's a role to play on certain meetings where ex parte comes into play," said VanRoekel. "We tend to use ex parte as a mechanism for understanding. We ask vendors specific questions. Many times there are questions that involve their intellectual property."

The agency has since ended closed-door meetings, but the episode highlights the complexity of enacting new regulations in the current media climate.

Yesterday, in fact, as the New York Times reported, the FCC released a public notice seeking more input on open Internet rules, which the agency duly tweeted out as a PDF. The document is embedded below.



Ars Technica and others criticized the agency for asking more questions instead of taking action.

Will the FCC get net neutrality right? Hard to say. The Center for Democracy and Technology, by way of contrast, endorsed the FCC's focus on key issues in the net neutrality debate "as a good sign that the FCC is rolling up its sleeves to grapple with the most contentious issues." As my colleague Andy Oram pointed out this week, the net neutrality debate depends upon what you fear. The only safe bet here is that OpenInternet.gov is likely to get a fresh batch of public comment to take into the record.

Addressing the digital divide

Online debates over net neutrality or the proposed broadband plan leave out a key constituency: the citizens who do not have access to the Internet. The information needs of communities in a democracy were the focus of the recent report by the Knight Commission.

To that point, VanRoekel spoke with the Sunlight Foundation's executive director, Ellen Miller, earlier this year about how everyone is changing everything. Their conversation is embedded below:

In our interview, VanRoekel focused on how mechanisms of community activation can be used to include disconnected people.

VanRoekel pointed to the growth of mobile access and social media uptake in communities that have traditionally been less connected. That focus is substantiated by Pew Research showing that citizens, particularly in minority communities, are turning to the Internet for government data, policy and services.

Traditional outreach is still viable as well. Community organizers can reach people on the ground and involve key constituencies. "We also can go back to 800 numbers," he said. "Using voice to offer access and adding the ability to enter into the public record."




August 11 2010

What I get and don't get about the Google/Verizon proposal

Nobody knew for a long time what Google and Verizon were cooking up on
the network neutrality front, and after the release of their brief,
two-page roadmap (posted on Scribd as a PDF at
http://www.scribd.com/doc/35599242/Verizon-Google-Legislative-Framework-Proposal,
among other places) still nobody knows. All the usual Internet
observers have had their say, and in general the assessment is
negative.

My first reaction was to ignore the whole thing, mainly because the
language of the agreement didn't match any Internet activity I could
recognize. Some of the false notes struck:

  • The Consumer Protections section keeps using the term "lawful" as if
    there was a regulatory regime on the Internet. Not even the people
    regularly accused of trying to extend government control over the
    Internet (ICANN, WSIS, and the ITU) believe they can define what's
    lawful and make people stick to it.

    If I can send and receive only lawful content, who do Google and
    Verizon think can stop me from exchanging child pornography or
    instructions to blow up buildings? What, in turn, distinguishes lawful
    applications and services from unlawful ones (outside of Saudi Arabia
    and the United Arab Emirates)?

    Deduction: This passage represents no meaningful or enforceable rules,
    but is thrown in to make regulators feel there's a policy where in
    fact there is none.

  • The Transparency section strews around nice, general statements no one
    could complain about--don't we all want our services to tell us what
    they're doing?--but the admonitions are too general to interpret or
    apply.

    For instance, Apple is adamant about its right to determine what apps
    are available to iPhone and iPad buyers. Is that transparency?
    Apparently not, because every Apple developer gnaws his fingernails
    waiting to hear whether and when his app will be accepted into the App
    Store. But I don't see language in the Google/Verizon transparency
    section that covers the App Store at all. They might well say it's not
    a networking issue.

    Fine, let's turn to networking. The carriers maintain that they need
    flexibility and a certain degree of secrecy to combat abuses such as
    spam; see for instance my blog post Consider the economics in network
    neutrality (http://www.oreillynet.com/onlamp/blog/2008/04/consider_the_economics_in_netw.html).
    Squaring this complex issue--which is covered by the Google/Verizon
    proposal in the next item, on Network Management--with transparency
    is a dilemma.

    Deduction: we can all say we're transparent and feel good, but life is
    too complex for authorities to be totally transparent about what
    they're transparent about.

  • The worst passage in my view is the one in the Regulatory Authority
    section assigning authority to the FCC for "broadband." That
    ill-defined term, used far too much in Washington, tends to appear in
    the context of universal service. One can regulate broadband by such
    things as providing incentives to build more networks, but the
    Regulatory Authority section sidesteps the more basic questions of who
    gets to regulate the building, interconnecting, and routing through
    networks.

    Deduction: Google and Verizon put this in to encourage the government
    to continue pouring money into the current telcos and cable companies
    so they can build more high-speed networks, but its effect on
    regulation is nil.

Not too inspiring on first impression, but because so many other
people raised such a brouhaha over the Google/Verizon announcement, I
decided to think about it a bit more. And I actually ended up feeling
good about one aspect. The proposal is really a big concession to the
network neutrality advocates. I had been feeling sour about proposals
for network neutrality because, as nice as they sound in the abstract,
the devil is in the details. Network management for spam and other
attacks provides one example.

But the Google/Verizon announcement explicitly denounces
discrimination and mandates adherence to Internet standards. (Of
course, some Internet standards govern discrimination.) It seems to me
that, after this announcement, no network provider can weep and wring
its hands and claim that it would be unable to do business on a
non-discriminatory basis. And network neutrality advocates can cite
this document for support.

But as others have pointed out, the concession granted in the
"Non-Discrimination Requirement" section is ripped away by the
"Additional Online Services" section, whose nod to "traffic
prioritization" makes it clear that the "services" offered in that
section reach deep into the network infrastructure, where they can
conflict directly with public Internet service. Unless someone
acknowledges the contradiction between the two sections and resolves
it in a logical manner, this document becomes effectively unusable.

What about the other pesky little exemption in the proposal--wireless
networks? Certainly, a lot of computing is moving to mobile devices.
But wireless networks really are special. Not only are they hampered
by real limits on traffic--the networks being shared and having
limited spectrum--but users have limited tolerance for unwanted
content and for fidgeting around with their devices. They don't want
to exercise sophisticated control over transmission and content; they
need someone to do it for them.

Anyway, fiber is always going to provide higher bandwidth than
wireless spectrum. So I don't believe wireless will become
dominant. It will be an extremely valuable companion to us as we walk
through the day, saving data about ourselves and getting information
about our environment, but plenty of serious work will go on over the
open Internet.

So in short, I disdain the Google/Verizon agreement from an editor's
point of view but don't mind it as a user. In general, I have nothing
against parties in a dispute (here, the telephone companies who want
to shape traffic and the Internet sites who don't want them to)
conducting talks to break down rigid policy positions and arrive at
compromises. The Google/Verizon talks are fraught with implications,
of course, because Google is a wireless provider and Verizon
distributes lots of phones with Google's software and services. So I
take the announcement as just one stake in the ground along a large
frontier. I don't see the proposal being adopted in any regulatory
context--it's too vague and limited--but it's interesting for what it
says about Google and Verizon.


May 06 2010

Four short links: 6 May 2010

  1. Ethics and Economics -- This paper looks at the evidence that suggests that ethical behaviour is good for the economy.
  2. FCC to Regulate Broadband -- Two FCC officials, who spoke on the condition of anonymity, said FCC Chairman Julius Genachowski will announce Thursday that the commission considers broadband service a hybrid between an information service and a utility and that it has sufficient power to regulate Internet traffic under existing law.
  3. TCP/IP and IMS Sequence Diagrams -- watch SYN, ACK, payload, etc. packets to and fro to understand what really happens each time you fetch mail or surf the web. This is what Velocity-type devops performance folks care about.
  4. How to Build a Time Machine (Daily Mail) -- extremely readable article by Stephen Hawking about the possibilities of time travel.

April 06 2010

DC Circuit court rules in Comcast case, leaves the FCC a job to do

Today's ruling in Comcast v. FCC will certainly change the
terms of debate over network neutrality, but the win for Comcast is
not as far-reaching as headlines make it appear. The DC Circuit court
didn't say, "You folks at the Federal Communications Commission have
no right to tell any Internet provider what to do without
Congressional approval." It said, rather, "You folks at the FCC didn't
make good arguments to prove that your rights extend to stopping
Comcast's particular behavior."

I am not a lawyer, but to say what happens next will take less of a
lawyer than a fortune-teller. I wouldn't presume to say whether the
FCC can fight Comcast again over the BitTorrent issue. But the court
left it open for the FCC to try other actions to enforce rules on
Internet operators. Ultimately, I think the FCC should take a hint
from the court and stop trying to regulate the actions of telephone
and cable companies at the IP layer. The hint is to regulate them at
the level where the FCC has more authority--on the physical level,
where telephone companies are regulated as common carriers and cable
companies have requirements to the public as well.




The court noted (on pages 30 through 34 of its order, at
http://pacer.cadc.uscourts.gov/common/opinions/201004/08-1291-1238302.pdf)
that the FCC missed out on the chance to make certain arguments that
the court might have looked on more favorably.
Personally and amateurly, I think those arguments would be weak
anyway. For instance, the FCC has the right to regulate activities
that affect rates. VoIP can affect phone rates and video downloads
over the Internet can affect cable charges for movies. So the FCC
could try to find an excuse to regulate the Internet. But I wouldn't
be the one to make that excuse.

The really significant message to the FCC comes on pages 30 and 32.
The court claims that any previous court rulings that give power to
the FCC to regulate the Internet (notably the famous Brand X decision)
are based on its historical right to regulate common carriers (e.g.,
telephone companies) and broadcasters. Practically speaking, this
gives the FCC a mandate to keep regulating the things that
matter--with an eye to creating a space for a better Internet and
high-speed digital networking (broadband).

Finding the right layer

Comcast v. FCC combines all the elements of a regulatory
thriller. First, the stakes are high: we're talking about who
controls the information that comes into our homes. Second, Comcast
wasn't being subtle in handling BitTorrent; its manipulations were
done with a conscious bias, carried out apparently arbitrarily (rather
than being based on corporate policy, it seems that a network
administrator made and implemented a personal decision), and were kept
secret until customers uncovered the behavior. If you had asked for a
case where an Internet provider said, "We can do anything the hell we
want regardless of any political, social, technical, moral, or
financial consequences," you'd choose something like Comcast's
impedance of BitTorrent.

And the court did not endorse that point of view. Contrary to many
headlines, the court affirmed that the FCC has the right to
regulate the Internet. Furthermore, the court acknowledged that
Congress gave the FCC the right to promote networking. But the FCC
must also observe limits.

The court went (cursorily in some cases) over the FCC's options for
regulating Comcast's behavior, and determined either that there was no
precedent for it or (I'm glossing over lots of technicalities here)
that the FCC had not properly entered those options into the case.

The FCC should still take steps to promote the spread of high-speed
networking, and to ensure that it is affordable by growing numbers of
people. But it must do so by regulating the lines, not what travels
over those lines.

As advocates for greater competition have been pointing out for
several years, the FCC fell down on that public obligation. Many trace
the lapse to the chairmanship of Bush appointee Michael Powell. And
it's true that he chose to try to get the big telephone and cable
companies to compete with each other (a duopoly situation) instead of
opening more of a space for small Internet providers. I cover this
choice in a 2004 article. But it's not fair to say Powell had no interest in
competition, nor is it historically accurate to say this was a major
change in direction for the FCC.

From the beginning, when the 1996 telecom act told the FCC to promote
competition, implementation was flawed. The FCC chose 14 points in the
telephone network where companies had to allow interconnection (so
competitors could come on the network). But it missed at least one
crucial point. The independent Internet providers were already losing
the battle before Powell took over the reins at the FCC.

And the notion of letting two or three big companies duke it out
(mistrusting start-ups to make a difference) is embedded in the 1996
act itself.

Is it too late to make a change? We must hope not. Today's court
ruling should be a wake-up call; it's time to get back to regulating
things that the FCC actually can influence.

Comcast's traffic shaping did not change the networking industry. Nor
did it affect the availability of high-speed networks. It was a clumsy
reaction by a beleaguered company to a phenomenon it didn't really
understand. Historically, it will prove an oddity, and so will the
spat that network advocates started, catching the FCC in its snares.

The difficulty software layers add

The term "Internet" is used far too loosely. If you apply it to all
seven layers of the ISO networking model, it covers common carrier
lines regulated by the FCC (as well as cable lines, which are subject
to less regulation--but still some). But the FCC has historically
called the Internet a "service" that is separate from those lines.

Software blurs and perhaps even erases such neat distinctions. Comcast
does not have to rewire its network or shut down switches to control
it; all it has to do is configure a firewall. That's why stunts like
holding back BitTorrent traffic become networking issues and draw
interest from the FCC. But this fuzziness also cautions against trying
to regulate what Comcast does, because it's hard to know when to stop.
That's what opponents of network neutrality say, and you can hear it
in the court ruling.

The fuzzy boundaries between software regulation and real-world
activities bedevil other areas of policy as well. As sophisticated
real-world processing moves from mechanical devices into software,
inventors are encouraged to patent software innovations, a dilemma I
explore in another article
(http://radar.oreilly.com/archives/2007/09/three_vantage_p.html). And
in the 1990s, courts argued over whether encryption was a process or a
form of expression--and decided it was a form of expression.

Should the FCC wait for Congress to tell it what to do? I don't think
so. The DC Circuit court blocked one path, but it didn't tell the FCC
to turn back. It has a job to do, and it just has to find the right
tool for the job.


March 16 2010

Google Fiber and the FCC National Broadband Plan

I've puzzled over Google's Fiber project ever since they announced it. It seemed too big, too hubristic (even for a company that's already big and has earned the right to hubris)--and also not a business Google would want to be in. Providing the "last mile" of Internet service is a high cost/low payoff business that I'm glad I escaped (a friend and I seriously considered starting an ISP back in '92, until we said "How would we deal with customers?").


But the FCC's announcement of their plans to widen broadband Internet access in the US (the National Broadband Plan) puts Google Fiber in a new context. The FCC's plans are cast in terms of upgrading and expanding the network infrastructure. That's a familiar debate, and Google is a familiar participant. This is really just an extension of the "network neutrality" debate that has been going on in fits and starts over the past few years.


Google has been outspoken in their support for the idea that network carriers shouldn't discriminate between different kinds of traffic. The established Internet carriers have largely opposed network neutrality, arguing that they can't afford to build the kind of high-bandwidth networks that are required for delivering video and other media. While the debate over network neutrality has quieted down recently, the issues are still floating out there, and no less important. Will the networks of the next few decades be able to handle whatever kinds of traffic we want to throw at them?


In the context of network neutrality, and in the context of the FCC's still unannounced (and certain to be controversial) plans, Google Fiber is the trump card. It's often been said that the Internet routes around damage. Censorship is one form of damage; non-neutral networks are another. Which network would you choose? One that can't carry the traffic you want, or one that will? Let's get concrete: if you want video, would you choose a network that only delivers real-time video from providers who have paid additional bandwidth charges to your carrier? Google's core business is predicated upon the availability of richer and richer content on the net. If they can ensure that all the traffic that people want can be carried, they win; if they can't, if the carriers mediate what can and can't be carried, they lose. But Google Fiber ensures that our future networks will indeed be able to "route around damage", and makes what the other carriers do irrelevant. Google Fiber essentially tells the carriers "If you don't build the network we need, we will; you will either move with the times, or you won't survive."


Looked at this way, non-network-neutrality requires a weird kind of collusion. Deregulating the carriers by allowing them to charge premium prices for high-bandwidth services only works as long as all the carriers play the same game, and all raise similar barriers against high-bandwidth traffic. As soon as one carrier says "Hey, we have a bigger vision; we're not going to put limits on what you want to do," the game is over. You'd be a fool not to use that carrier. You want live high-definition video conferencing? You got it. You want 3D video, requiring astronomical data rates? You want services we haven't imagined yet? You can get those too. AT&T and Verizon don't like it? Tough; it's a free market, and if you offer a non-competitive product, you lose. The problem with the entrenched carriers' vision is that, if you discriminate against high-bandwidth services, you'll kill those services off before they can even be invented.


The U.S. is facing huge problems with decaying infrastructure. At one time, we had the best highway system, the best phone system, the most reliable power grid; no longer. Public funding hasn't solved the problem; in these tea-party days, nobody's willing to pay the bills, and few people understand why the bills have to be as large as they are. (If you want some insight into the problems of decaying infrastructure, here's an op-ed piece on Pennsylvania's problems repairing its bridges.) Neither has the private sector, where short-term gain almost always wins over the long-term picture.


But decaying network infrastructure is a threat to Google's core business, and they aren't going to stand by idly. Even if they don't intend to become a carrier themselves, as Eric Schmidt has stated, they could easily change their minds if the other carriers don't keep up. There's nothing like competition (or even the threat of competition) to make the markets work.


We're looking at a rare conjunction. It's refreshing to see a large corporation talk about creating the infrastructure they need to prosper--even if that means getting into a new kind of business. To rewrite the FCC Chairman's metaphor, it's as if GM and Ford were making plans to upgrade the highway system so they could sell better cars. It's an approach that's uniquely Googley; it's the infrastructure analog to releasing plugins that "fix" Internet Explorer for HTML5. "If it's broken and you won't fix it, we will." That's a good message for the carriers to hear. Likewise, it's refreshing to see the FCC, which has usually been a dull and lackluster agency, taking the lead in such a critical area. An analyst quoted by the Times says "Once again, the FCC is putting the service providers on the spot." As well they should. A first-class communications network for all citizens is essential if the U.S. is going to be competitive in the coming decades. It's no surprise that Google and the FCC understand this, but I'm excited by their commitment to building it.



March 05 2010

Report from HIMSS Health IT conference: building or bypassing infrastructure

Today the Healthcare Information and Management Systems Society
(HIMSS) conference wrapped up. In previous blogs, I laid out the
benefits of risk-taking in health care IT
(http://radar.oreilly.com/2010/03/report-from-himms-health-it-co.html),
followed by my main theme, interoperability and openness
(http://radar.oreilly.com/2010/03/report-from-himms-health-it-co-1.html).
This blog will cover a few topics about a third important issue:
infrastructure.

Why did I decide this topic was worth a blog? When physicians install
electronic systems, they find that they need all kinds of underlying
support. Backups and high availability, which might have been
optional or haphazard before, now have to be professional. Your
patient doesn't want to hear, "You need an antibiotic right away, but
we'll order it tomorrow when our IT guy comes in to reboot the
system." Your accounts manager would be almost as upset if you told
her that billing will be delayed for the same reason.

Network bandwidth

An old sales pitch in the computer field (which I first heard at
Apollo Computer in the 1980s) goes, "The network is the computer." In
the coming age of EHRs, the network is the clinic. My family
practitioner (in an office of five practitioners) had to install a T1
line when they installed an EHR. In eastern Massachusetts, whose soil
probably holds more T1 lines than maple tree roots, that was no big
deal. It's considerably more problematic in an isolated rural area
where the bandwidth is more comparable to what I got in my hotel room
during the conference (particularly after 10:30 at night, when I'm
guessing a kid in a nearby room joined an MMPG). One provider from the
Midwest told me that the incumbent charges $800 per month for a T1.
Luckily, he found a cheaper alternative.

So the FCC is involved in health care now
(http://www.fcc.gov/cgb/rural/rhcp.html). Bandwidth is perhaps their
main focus at the moment, and they're explicitly tasked with making
sure rural providers are able to get high-speed connections. This is
not a totally new concern; the landmark 1996 Telecom Act included
rural health care providers in its universal service provisions. I
heard one economist deride the
provision, asking what was special about rural health care providers
that they should get government funding. Fifteen years later, I think
rising health care costs and deteriorating lifestyles have answered
that question.

Wireless hubs

The last meter is just as important as the rest of your network, and
hospitals with modern, technology-soaked staff are depending
increasingly on mobile devices. I chatted with the staff of a small
wireless company called Aerohive that aims its products at hospitals.
Its key features are:

Totally cable-free hubs

Not only do Aerohive's hubs communicate with your wireless endpoints,
they communicate with other hubs and switches wirelessly. They just
make the hub-to-endpoint traffic and hub-to-hub traffic share the
bandwidth in the available 2.4 and 5 GHz ranges. This allows you to
put them just about anywhere you want and move them easily.

Dynamic airtime scheduling

The normal 802.11 protocols share the bandwidth on a packet-by-packet
basis, so a slow device can cause all the faster devices to go slower
even when there is empty airtime. I was told that an 802.11n device
can go slower than an 802.11b device if it's remote and its signal has
to go around barriers. Aerohive just checks how fast packets are
coming in and allocates bandwidth on that ratio, like time-division
multiplexing. If your device is ten times faster than someone else's
and the bandwidth is available, you can use ten times as much
bandwidth.
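
As a toy model of that ratio-based idea (my own illustration, not Aerohive's actual scheduler), the Python sketch below splits a scheduling window among clients in proportion to their observed rates, so a fast client isn't dragged down to the slowest device's pace:

    def allocate_airtime(observed_rates_mbps, window_ms=100):
        """Split a scheduling window among clients in proportion to their rates."""
        total = sum(observed_rates_mbps.values())
        return {client: window_ms * rate / total
                for client, rate in observed_rates_mbps.items()}

    # A fast 802.11n laptop and a slow 802.11b scanner: the laptop gets roughly
    # ten times the airtime rather than being slowed to the scanner's pace.
    print(allocate_airtime({"laptop_80211n": 100.0, "scanner_80211b": 10.0}))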

Dynamic rerouting

Aerohive hubs use mesh networking and an algorithm somewhat like
Spanning Tree Protocol to reconfigure the network when a hub is added
or removed. Furthermore, when you authenticate with one hub, its
neighbors store your access information so they can pick up your
traffic without taking time to re-authenticate. This makes roaming
easy and allows you to continue a conversation without a hitch if a
hub goes down.
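
To illustrate the spanning-tree comparison generically (this is not Aerohive's proprietary algorithm, and the hub names and link costs are invented), the Python sketch below uses the networkx library to recompute a loop-free tree over the mesh after a hub drops out:

    import networkx as nx

    mesh = nx.Graph()
    # Hypothetical hub-to-hub links, weighted by link cost.
    mesh.add_weighted_edges_from([
        ("hub_a", "hub_b", 1), ("hub_b", "hub_c", 1),
        ("hub_a", "hub_c", 2), ("hub_c", "hub_d", 1),
    ])
    print(sorted(nx.minimum_spanning_tree(mesh).edges()))

    mesh.remove_node("hub_b")  # a hub goes down
    print(sorted(nx.minimum_spanning_tree(mesh).edges()))  # the tree reforms without it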

Security checking at the endpoint

Each hub has a built-in firewall so that no unauthorized device can
attach to the network. This should be of interest in an open, public
environment like a hospital where you have no idea who's coming in.

High bandwidth

The top-of-the-line hub has two MIMO radios, each with three
directional antennae.

Go virtual, part 1

VMware has customers in health care
(http://www.vmware.com/solutions/industry/healthcare/case-studies.html),
as in other industries. In addition, they've incorporated
virtualization into several products from medical equipment and
service vendors:

Radiology

Hospitals consider these critical devices. Virtualization here
supports high availability.

Services

A transcription service could require ten servers. Virtualization can
consolidate them onto one or two pieces of hardware.

Roaming desktops

Nurses often move from station to station. Desktop virtualization
allows them to pull up the windows just as they were left on the
previous workstation.

Go virtual, squared

If all this talk of bandwidth and servers brings pain to your head as
well as to the bottom line, consider heading into the cloud. At one
talk I attended today on cost analysis, a hospital administrator
reported that about 20% of their costs went to server hosting. They
saved a lot of money by rigorously eliminating unneeded backups, and a
lot on air conditioning by arranging their servers more efficiently.
Although she didn't discuss Software as a Service, those are a couple
examples of costs that could go down if functions were outsourced.

Lots of traditional vendors are providing their services over the Web
so you don't have to install anything, and several companies at the
conference are entirely Software as a Service. I mentioned Practice Fusion (http://www.practicefusion.com/) in my
previous blog. At the conference, I asked them three key questions
pertinent to Software as a Service.

Security

This is the biggest question clients ask when using all kinds of cloud
services (although I think it's easier to solve than many other
architectural issues). Practice Fusion runs on HIPAA-compliant
Salesforce.com servers.

Data portability

If you don't like your service, can you get your data out? Practice
Fusion hasn't had any customers ask for their data yet, but upon
request they will produce a DVD containing your data in CSV files, or
in other common formats, overnight.

Extendibility

As I explained in my previous blog, clients increasingly expect a
service to be open to enhancements and third-party programs. Practice
Fusion has an API in beta, and plans to offer a sandbox on their site
for people to develop and play with extensions--which I consider
really cool. One of the API's features is to enforce a notice to the
clinician before transferring sensitive data.

The big selling point that first attracts providers to Practice Fusion
is that it's cost-free. They support the service through ads, which
users tell them are unobtrusive and useful. But you can also pay to
turn off ads. The service now has 30,000 users and is adding about 100
each day.

Another SaaS company I mentioned in my previous blog is Covisint (http://www.covisint.com/). Their service is
broader than Practice Fusion, covering not only patient records but
billing, prescription ordering, etc. Operating also as an HIE, they
speed up access to data on patients by indexing all the data on each
patient in the extended network. The actual data, for security and
storage reasons, stays with the provider. But once you ask about a
patient, the system can instantly tell you what sorts of data are
available and hook you up with the providers for each data set.

Finally, I talked to the managers of a nimble new company called
CareCloud (http://carecloud.com/), which will start serving customers
in early April. CareCloud, too, offers a range of services in patient
health records, practice management, and revenue cycle management. It
was built entirely on open source software--Ruby on Rails and a
PostgreSQL database--while using Flex to build their snazzy interface,
which can run in any browser (including the iPhone, thanks to Adobe's
upcoming translation to native code). Their strategy is based on improving
physicians' productivity and the overall patient experience through a
social networking platform. The interface has endearing Web 2.0 style
touches such as a news feed, SMS and email confirmations, and
integration with Google Maps.

And with that reference to Google Maps (which, in my first blog, I
complained about mislocating the address 285 International Blvd NW for
the Georgia World Congress Center--thanks to the Google Local staff
for getting in touch with me right after a tweet) I'll end my coverage
of this year's HIMSS.

February 10 2010

Google Enters the Home Broadband Market

In a week already full of Google announcements, another bomb was casually dropped today via Google's blog. The Borg from California announced that it was experimentally entering the Fiber to the Home (FTTH) market, and that it planned to offer much higher speeds than current offerings (1 Gb/sec) and competitive pricing. The announcement also talks about what, when you remove the marketspeak, is a commitment to net neutrality in their service. This, of course, is not surprising, given Google's strong lobbying for neutrality to the FCC and Congress.

What is becoming very clear is that Google wants to have a finger in, if not own, most of the pie when it comes to how consumers and businesses access their information. Android was a first foray into the mobile market, and we know that Google was in the chase for cellular spectrum during the big auction. Google Voice is another attempt to make an end run around the traditional telecom infrastructure. But if Google becomes a major player in fiber to the home, they take a huge step forward.

Once Google has a pipe into the house, they can easily become a player in VoIP and landline telephone service, as well as cable TV and on-demand. Of course, these areas are fraught with regulatory issues. Many towns require cable providers to enter into individual franchise agreements in order to provide service, which can be a nightmare when you multiply it times N towns. But it's much easier to offer when you have a bit pipe already in place. And a 1Gb service will allow for HD or even Blu-Ray 3D service on-demand to the house.

In a way, you can say that it's about time that someone offered Gb fiber in the US. In Europe and Asia, this level of service is already in place, and it's a bit of a crime that we lag so far behind. Google could jumpstart the market in the US, and without all the baggage that the traditional telcos are carrying around.

Mind you, this is just an experiment. According to Google, the pilot will involve somewhere between 50,000 and 500,000 households. But unlike many companies, Google 'experiments' have a habit of turning into game-changing products.
