
December 09 2013

Who will upgrade the telecom foundation of the Internet?

Although readers of this blog know quite well the role that the Internet can play in our lives, we may forget that its most promising contributions — telemedicine, the smart electrical grid, distance education, etc. — depend on a rock-solid and speedy telecommunications network, and therefore that relatively few people can actually take advantage of the shining future the Internet offers.

Worries over sputtering advances in bandwidth in the US, as well as an actual drop in reliability, spurred the FCC to create the Technology Transitions Policy Task Force, and to drive discussion of what they like to call the “IP transition”.

Last week, I attended a conference on the IP transition in Boston, one of a series being held around the country. While we tussled with the problems of reliability and competition, one urgent question loomed over the conference: who will actually make advances happen?

What’s at stake and why bids are coming in so low

It’s not hard to tally up the promise of fast, reliable Internet connections. Popular futures include:

  • Delivering TV and movie content on demand
  • Checking on your lights, refrigerator, thermostat, etc., and adjusting them remotely
  • Hooking up rural patients with health care experts in major health centers for diagnosis and consultation
  • Urgent information updates during a disaster, to aid both victims and responders

I could go on and on, but already one can see the outline of the problem: how do we get there? Who is going to actually create a telecom structure that enables everyone (not just a few privileged affluent residents of big cities) to do these things?

Costs are high, but the payoff is worthwhile. Ultimately, the applications I listed will lower the costs of the services they replace or improve life enough to justify an investment many times over. Rural areas — where investment is currently hardest to get — could probably benefit the most from the services because the Internet would give them access to resources that more centrally located people can walk or drive to.

The problem is that none of the likely players can seize the initiative. Let’s look at each one:

Telecom and cable companies
The upgrading of facilities is mostly in their hands right now, but they can't see beyond the first item in the previous list. Distributing TV and movies is a familiar business, but they don't know how to extract value from any of the other applications. In fact, most of the benefits of the other services go to people at the endpoints, not to the owners of the network. This has been a sore point with the telecom companies ever since the Internet took off, and it spurs their constant attempts to hold Internet users hostage and shake them down for more cash.

Given the limitations of the telecom and cable business models, it’s no surprise they’ve rolled out fiber in the areas they want and are actually de-investing in many other geographic areas. Hurricane Sandy brought this to public consciousness, but the problem has actually been mounting in rural areas for some time.

Angela Kronenberg of COMPTEL, an industry association of competitive communications companies, pointed out that it’s hard to make a business case for broadband in many parts of the United States. We have a funny demographic: we’re not as densely populated as the Netherlands or South Korea (both famous for blazingly fast Internet service), nor as concentrated as Canada and Australia, where it’s feasible to spend a lot of money getting service to the few remote users outside major population centers. There’s no easy way to reach everybody in the US.

Governments
Although governments subsidize network construction in many ways — half a dozen subsidies were reeled off by keynote speaker Cameron Kerry, former Acting Secretary of the Department of Commerce — such stimuli can only nudge the upgrade process along, not control it completely. Government funding has certainly enabled plenty of big projects (Internet access is often compared to the highway system, for instance), but it tends to go toward familiar technologies that the government finds safe, and therefore misses opportunities for radical disruption. It’s no coincidence that these safe, familiar technologies are provided by established companies with lobbyists all over DC.

As an example of how help can come from unusual sources, Sharon Gillett mentioned on her panel the use of unlicensed spectrum by small, rural ISPs to deliver Internet to areas that otherwise had only dial-up access. The FCC ruling that opened up “white space” spectrum in the TV band to such use has greatly empowered these mavericks.

Individual consumers
Although we are the ultimate beneficiaries of new technology (and will ultimately pay for it somehow, through fees or taxes), hardly anyone can plunk down the cash for it in advance: the vision is too murky and the reward too far down the road. John Burke, Commissioner of the Vermont Public Service Board, flatly said that consumers choose their phone service almost entirely on the basis of price and don't really find out about its reliability and features until later.

Basically, consumers can’t bet that all the pieces of the IP transition will fall in place during their lifetimes, and rolling out services one consumer at a time is incredibly inefficient.

Internet companies
Google Fiber came up once or twice at the conference, but their initiatives are just a proof of concept. Even if Google became the lynchpin it wants to be in our lives, it would not have enough funds to wire the world.

What’s the way forward, then? I find it in community efforts, which I’ll explore at the end of this article.

Practiced dance steps

Few of the insights in this article came up directly in the Boston conference. The panelists were old hands who had crossed each other’s paths repeatedly, gliding between companies, regulatory agencies, and academia for decades. At the conference, they pulled their punches and hid their agendas under platitudes. The few controversies I saw on stage seemed to be launched for entertainment purposes, distracting from the real issues.

From what I could see, the audience of about 75 people came almost entirely from the telecom industry. I saw just one representative of what you might call the new Internet industries (Microsoft strategist Sharon Gillett, who went to that company after an august regulatory career) and two people who represent the public interest outside of regulatory agencies (speaker Harold Feld of Public Knowledge and Fred Goldstein of Interisle Consulting Group).

Can I get through to you?

Everyone knows that Internet technologies, such as voice over IP, are less reliable than plain old telephone service, but few realize how soon reliability of any sort will be a thing of the past. When a telecom company signs you up for a fancy new fiber connection, you are no longer connected to a power source at the telephone company’s central office. Instead, you get a battery that can last eight hours in case of a power failure. A local power failure may let you stay in contact with outsiders if the nearby mobile phone towers stay up, but a larger failure will take out everything.

These issues have a big impact on public safety, a concern raised at the beginning of the conference by Gregory Bialecki in his role as a Massachusetts official, and repeated by many others during the day.

There are ways around the new unreliability through redundant networks, as Feld pointed out during his panel. But the public and regulators must take a stand for reliability, as the post-Sandy victims have done. The issue in that case was whether a community could be served by wireless connections. At this point, they just don’t deliver either the reliability or the bandwidth that modern consumers need.

Mark Reilly of Comcast claimed at the conference that 94% of American consumers now have access to at least one broadband provider. I’m suspicious of this statistic because the telecom and cable companies have a very weak definition of “broadband” and may be including mobile phones in the count. Meanwhile, we face the possibility of a whole new digital divide consisting of people relegated to wireless service, on top of the old digital divide involving dial-up access.

We’ll take that market if you’re not interested

In a healthy market, at least three companies would be racing to roll out new services at affordable prices, but every new product or service must provide a migration path from the old ones it hopes to replace. Nowhere is this more true than in networks because their whole purpose is to let you reach other people. Competition in telecom has been a battle cry since the first work on the law that became the 1996 Telecom Act (and which many speakers at the conference say needs an upgrade).

Most of the 20th century accustomed people to thinking of telecom as a boring, predictable utility business, the kind that “little old ladies” bought stock in. The Telecom Act was supposed to knock the Bell companies out of that model and turn them into fierce innovators with a bunch of other competitors. Some people actually want to reverse the process and essentially nationalize the telecom infrastructure, but that would put innovation at risk.

The Telecom Act, especially as interpreted later by the FCC, fumbled the chance to enforce competition. According to Goldstein, the FCC decided that a duopoly (baby Bells and cable companies) was enough competition.

The nail in the coffin may have been the FCC ruling that any new fiber providing IP service was exempt from the requirements for interconnection. The sleight of hand that the FCC used to make this switch was a redefinition of the Internet: they conflated the use of IP on the carrier layer with the bits traveling around above, which most people think of as “the Internet.” But the industry and the FCC had a bevy of arguments (including the looser regulation of cable companies, now full-fledged competitors of the incumbent telecom companies), so the ruling stands. The issue then got mixed in with a number of other controversies involving competition and control on the Internet, often muddled together under the term “network neutrality.”

Ironically, one of the selling points that helps maintain a competitive company, such as Granite Telecom, is reselling existing copper. Many small businesses find that the advantages of fiber are outweighed by the costs, which may include expensive quality-of-service upgrades (such as MPLS), new handsets to handle VoIP, and rewiring the whole office. Thus, Senior Vice President Sam Kline announced at the conference that Granite Telecom is adding a thousand new copper POTS lines every day.

This reinforces the point I made earlier about depending on consumers to drive change. The calculus that leads small businesses to stick with copper may be dangerous in the long run. Besides lost opportunities, it means sticking with a technology that is aging and decaying by the year. Most of the staff (known familiarly as Bellheads) who designed, built, and maintain the old POTS network are retiring, and the phone companies don’t want to bear the increasing costs of maintenance, so reliability is likely to decline. Kline said he would like to find a way to make fiber more attractive, but the benefits are still vaporware.

At this point, the major companies and the smaller competing ones are both cherry picking in different ways. The big guys are upgrading very selectively and even giving up on some areas, whereas the small companies look for niches, as Granite Telecom has. If universal service is to become a reality, a whole different actor must step up to the podium.

A beautiful day in the neighborhood

One hope for change is through municipal and regional government bodies, linked to local citizen groups who know where the need for service is. Freenets, which go back to 1984, drew on local volunteers to provide free Internet access to everyone with a dial-up line, and mesh networks have powered similar efforts in Catalonia and elsewhere. In the 1990s, a number of towns in the US started creating their own networks, usually because they had been left off the list of areas that telecom companies wanted to upgrade.

Despite legal initiatives by the telecom companies to squelch municipal networks, they are gradually catching on. The logistics involve quite a bit of compromise (often, a commercial vendor builds and runs the network, contracting with the city to do so), but many town managers swear that advantages in public safety and staff communications make the investment worthwhile.

The limited regulatory power that cities hold over cable companies (a control that is sometimes taken away) is a crude instrument, like a potter trying to manipulate clay with tongs. To craft a beautiful work, you need to get your hands right on the material. Ideally, citizens would design their own future. The creation of networks should involve companies and local governments, but also the direct input of citizens.

National governments and international bodies still have roles to play. Burke pointed out that public safety issues, such as 911 service, can’t be fixed by the market, and developing nations have very little fiber infrastructure. So, we need large-scale projects to achieve universal access.

Several speakers also lauded state regulators as the most effective centers to handle customer complaints, but I think the IP transition will be increasingly a group effort at the local level.

Back to school

Education emerged at the conference as one of the key responsibilities that companies and governments share. The transition to digital TV was accompanied by a massive education budget, but in my home town, there are still people confused by it. And it’s a minuscule issue compared to the task of going to fiber, wireless, and IP services.

I had my own chance to join the educational effort on the evening following the conference. Friends from Western Massachusetts phoned me because they were holding a service for an elderly man who had died. They lacked the traditional 10 Jews (the minyan) required by Jewish law to say the prayer for the deceased, and asked me to Skype in. I told them that remote participation would not satisfy the law, but they seemed to feel better if I did it. So I said, “If Skype will satisfy you, why can’t I just participate by phone? It’s the same network.” See, FCC? I’m doing my part.

March 19 2013

Four short links: 19 March 2013

  1. VizCities Dev Diary — step-by-step recount of how they brought London’s data to life, SimCity-style.
  2. Google Fibre Isn’t That ImpressiveFor [gigabit broadband] to become truly useful and necessary, we’ll need to see a long-term feedback loop of utility and acceptance. First, super-fast lines must allow us to do things that we can’t do with the pedestrian internet. This will prompt more people to demand gigabit lines, which will in turn invite developers to create more apps that require high speed, and so on. What I discovered in Kansas City is that this cycle has not yet begun. Or, as Ars Technica put it recently, “The rest of the internet is too slow for Google Fibre.”
  3. gov.uk Recommendations on Open Source — Use open source software in preference to proprietary or closed source alternatives, in particular for operating systems, networking software, Web servers, databases and programming languages.
  4. Internet Bad Neighbourhoods (PDF) — bilingual PhD thesis. The idea behind the Internet Bad Neighborhood concept is that the probability of a host in behaving badly increases if its neighboring hosts (i.e., hosts within the same subnetwork) also behave badly. This idea, in turn, can be exploited to improve current Internet security solutions, since it provides an indirect approach to predict new sources of attacks (neighboring hosts of malicious ones). (A minimal subnet-scoring sketch of the idea follows this list.)
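
The neighborhood idea is straightforward to prototype: aggregate known-bad hosts by an enclosing subnetwork and treat the per-subnet count as a suspicion score for hosts you haven't seen yet. The Python sketch below uses a /24 aggregation and a toy blacklist purely as assumptions of my own; the thesis itself evaluates the idea far more carefully.

    from collections import Counter
    from ipaddress import ip_network

    def subnet_of(host, prefix=24):
        """Return the /prefix network containing the host address."""
        return ip_network(f"{host}/{prefix}", strict=False)

    def neighborhood_scores(bad_hosts, prefix=24):
        """Count known-bad hosts per subnet; a higher count means a worse neighborhood."""
        return Counter(subnet_of(h, prefix) for h in bad_hosts)

    def suspicion(host, scores, prefix=24):
        """Score an unseen host by how bad its neighborhood already looks."""
        return scores.get(subnet_of(host, prefix), 0)

    # Toy blacklist: three offenders in one /24, one elsewhere.
    blacklist = ["203.0.113.7", "203.0.113.9", "203.0.113.200", "198.51.100.4"]
    scores = neighborhood_scores(blacklist)
    print(suspicion("203.0.113.55", scores))   # 3 -> suspicious neighborhood
    print(suspicion("192.0.2.1", scores))      # 0 -> no evidence either way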

August 27 2012

Four short links: 27 August 2012

  1. International Broadband Pricing Study Dataset for Reuse — 3,655 fixed and mobile broadband retail price observations, with fixed broadband pricing data for 93 countries and mobile broadband pricing data for 106 countries.
  2. The Dictator’s Practical Internet Guide to Power Retention — tongue-in-cheek “The goal of this guide is to provide leaders of authoritarian, autocratic, theocratic, totalitarian and other single-leader or single-party regimes with a basic set of guidelines on how to use the internet to ensure you retain the most power for the longest time. The best way to achieve this is to never have your authority contested. This guide will accompany you in the obliteration of political dissidence. By having everyone agree with you, or believe that everyone agrees with you, your stay at the head of state will be long and prosperous.” (via BoingBoing)
  3. Ultra Cinnamon (GitHub) — arduino-based monitor & access system for restricted locations.
  4. CKEditor Beta 4 Out — moving to Github, added inline editing. (via Javascript Weekly)

June 25 2012

Four short links: 25 June 2012

  1. Stop Treating People Like Idiots (Tom Steinberg) -- governments miss the easy opportunities to link the tradeoffs they make to the point where the impacts are felt. My argument is this: key compromises or decisions should be linked to from the points where people obtain a service, or at the points where they learn about one. If my bins are only collected once a fortnight, the reason why should be one click away from the page that describes the collection times.
  2. UK Study Finds Mixed Telemedicine Benefits -- The results, in a paper to the British Medical Journal published today, found telehealth can help patients with long-term conditions avoid emergency hospital care, and also reduce deaths. However, the estimated scale of hospital cost savings is modest and may not be sufficient to offset the cost of the technology, the report finds. Overall the evidence does not warrant full scale roll-out but more careful exploration, it says. (via Mike Pearson)
  3. Pay Attention to What Nick Denton is Doing With Comments (Nieman Lab) -- Most news sites have come to treat comments as little more than a necessary evil, a kind of padded room where the third estate can vent, largely at will, and tolerated mainly as a way of generating pageviews. This exhausted consensus makes what Gawker is doing so important. Nick Denton, Gawker’s founder and publisher, Thomas Plunkett, head of technology, and the technical staff have re-designed Gawker to serve the people reading the comments, rather than the people writing them.
  4. Informed Consent Source of Confusion (Nature) -- fascinating look at the downstream uses of collected bio data and the difficulty in gaining informed consent: what you might learn about yourself (do I want to know I have an 8.3% greater chance of developing Alzheimers? What would I do with that knowledge besides worry?), what others might learn about you (will my records be subpoenable?), and what others might make from the knowledge (will my data be used for someone else's financial benefit?). (via Ed Yong)

January 30 2012

A discussion with David Farber: bandwidth, cyber security, and the obsolescence of the Internet

David Farber, a veteran of Internet technology and politics, dropped by Cambridge, Mass. today and was gracious enough to grant me some time in between his numerous meetings. On leave from Carnegie Mellon, Dave still intervenes in numerous policy discussions related to the Internet and "plays in Washington," as well as hosting the popular Interesting People mailing list. This list delves into dizzying levels of detail about technological issues, but I wanted to pump him for big ideas about where the Internet is headed, topics that don't make it to the list.

How long can the Internet last?

I'll start with the most far-reaching prediction: that Internet protocols simply aren't adequate for the changes in hardware and network use that will come up in a decade or so. Dave predicts that computers will be equipped with optical connections instead of pins for networking, and the volume of data transmitted will overwhelm routers, which at best have mixed optical/electrical switching. Sensor networks, smart electrical grids, and medical applications with genetic information could all increase network loads to terabits per second.

When routers evolve to handle terabit-per-second rates, packet-switching protocols will become obsolete. The speed of light is constant, so we'll have to rethink the fundamentals of digital networking.

I tossed in the common nostrum that packet-switching was the fundamental idea behind the Internet and its key advance over earlier networks, but Dave disagreed. He said lots of activities on the Internet reproduce circuit-like behavior, such as sessions at the TCP or Web application level. So theoretically we could re-architect the underlying protocols to fit what the hardware and the applications have to offer.
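
As a minimal illustration of that circuit-like behavior (my own sketch, not anything from the interview): a TCP connection gives the application an ordered, reliable byte stream that feels like a dedicated circuit, even though the bytes cross the network as independently routed packets.

    import socket, threading, time

    def echo_server(port):
        # One TCP connection = one ordered, reliable byte stream:
        # circuit-like behavior for the application, packets underneath.
        with socket.create_server(("127.0.0.1", port)) as srv:
            conn, _ = srv.accept()
            with conn:
                while data := conn.recv(1024):
                    conn.sendall(data)

    PORT = 50007
    threading.Thread(target=echo_server, args=(PORT,), daemon=True).start()
    time.sleep(0.2)  # crude wait for the listener to come up

    with socket.create_connection(("127.0.0.1", PORT)) as c:
        c.sendall(b"a session's bytes arrive in order, as if on a circuit")
        print(c.recv(1024).decode())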

But he says his generation of programmers who developed the Internet are too tired ("It's been a tough fifteen or twenty years") and will have to pass the baton to a new group of young software engineers who can think as boldly and originally as the inventors of the Internet. He did not endorse any of the current attempts to design a new network, though.

Slaying the bandwidth bottleneck

Like most Internet activists, Dave bewailed the poor state of networking in the U.S. In advanced nations elsewhere, 100-megabit per second networking is available for reasonable costs, whereas here it's hard to go beyond 30 megabits per second (on paper!) even at enormous prices and in major metropolitan areas. Furthermore, the current administration hasn't done much to improve the situation, even though candidate Obama made high bandwidth networking a part of his platform and FCC Chairman Julius Genachowski talks about it all the time.

Dave has been going to Washington on tech policy consultations for decades, and his impressions of the different administrations have a unique slant all their own. The Clinton administration really listened to staff who understood technology--Gore in particular was quite a technology junkie--and the administration's combination of judicious policy initiatives and benign neglect led to the explosion of the commercial Internet. The following Bush administration was famously indifferent to technology at best. The Obama administration lies somewhere in between in cluefulness, but despite their frequent plaudits for STEM and technological development, Dave senses that neither Obama nor Biden really has the drive to deal with and examine complex technical issues and insist on action where necessary.

I pointed out the U.S.'s particular geographic challenges--with a large, spread-out population making fiber expensive--and Dave countered that fiber to the home is not the best solution. In fact, he claims no company could make fiber pay unless it gained 75% of the local market. Instead, phone companies should string fiber to access points 100 meters or so from homes, and depend on old copper for the rest. This could deliver quite adequate bandwidth at a reasonable cost. Cable companies, he said, could also greatly increase Internet speeds. Wireless companies are pretty crippled by loads that they encouraged (through the sale of app-heavy phones) and then had problems handling, and are busy trying to restrict users' bandwidth. But a combination of 4G, changes in protocols, and other innovations could improve their performance.

Waiting for the big breach

I mentioned that in the previous night's State of the Union address, Obama had made a vague reference to a cybersecurity initiative (http://www.whitehouse.gov/blog/2012/01/26/legislation-address-growing-danger-cyber-threats) with a totally unpersuasive claim that it would protect us from attack. Dave retorted that nobody has a good definition of cybersecurity, but that this detail hasn't stopped every agency with a shot at getting funds for it from putting forward a cybersecurity strategy. The Army, the Navy, Homeland Security, and others are all looking for new missions now that old ones are winding down, and cybersecurity fills the bill.

The key problem with cybersecurity is that it can't be imposed top-down, at least not on the Internet, which, in a common observation reiterated by Dave, was not designed with security in mind. If people use weak passwords (and given current password cracking speeds, just about any password is weak) and fall victim to phishing attacks, there's little we can do with diktats from the center. I made this point in an article twelve years ago. Dave also pointed out that viruses stay ahead of pattern-matching virus detection software.
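
A back-of-the-envelope calculation shows why "just about any password is weak" against offline cracking. The guess rate below is an assumption for illustration only, roughly what commodity GPU rigs manage against fast hashes; real rates vary enormously with the hash in use.

    # Rough time to exhaust a password keyspace at an assumed guess rate.
    GUESSES_PER_SEC = 10_000_000_000  # assumption: one GPU rig, fast hash

    def seconds_to_exhaust(alphabet_size, length):
        """Worst-case brute-force time for a random password of this shape."""
        return alphabet_size ** length / GUESSES_PER_SEC

    for label, alphabet, length in [
        ("8 lowercase letters", 26, 8),
        ("8 mixed-case letters + digits", 62, 8),
        ("12 mixed-case + digits + symbols", 94, 12),
    ]:
        days = seconds_to_exhaust(alphabet, length) / 86400
        print(f"{label}: about {days:,.2f} days to exhaust")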

Security will therefore need to be rethought drastically, as part of the new network that will replace the Internet. In the meantime, catastrophe could strike--and whoever is in the Administration at the time will have to face public wrath.

Odds without ends

We briefly discussed FCC regulation, where Farber tends to lean toward asking the government to forbear. He acknowledged the merits of arguments made by many Internet supporters: that the FCC tremendously weakened the chances for competition in 2002 when it classified cable Internet as a Title I service. This shielded the cable companies from regulations under a classification designed back in early Internet days to protect the mom-and-pop ISPs. And I pointed out that the cable companies have brazenly sued the FCC to win court rulings saying the companies can control traffic any way they choose. But Farber says there are still ways to bring in the FCC and other agencies, notably the Federal Trade Commission, to enforce anti-trust laws, and that these agencies have been willing to act to shut down noxious behavior.

Dave and I shared other concerns about the general deterioration of modern infrastructure, affecting water, electricity, traffic, public transportation, and more. An amateur pilot, Dave knows some things about the air traffic systems that make one reluctant to fly. But there are few simple fixes. Commercial air flights are safe partly because pilots possess great sense and can land a plane even in the presence of confusing and conflicting information. On the other hand, Dave pointed out that mathematicians lack models to describe the complexity of such systems as our electrical grid. There are lots of areas for progress in data science.

September 29 2011

ePayments Week: Will NFC add value?

Here's what caught my eye in the payments space this week.

Square's COO questions NFC

Square's chief operating officer Keith Rabois went against the grain this week and questioned whether there was any value to be had by implementing near-field communications (NFC) for mobile payments. To be fair, he was at the GigaOM Mobile Conference responding to Om Malik's question of whether the short-range wireless function on mobiles would make Square's card reader redundant. Rabois called NFC "a technology in search of a value proposition," saying it's not clear who it helps. The process of swiping a credit card, he continued, is "very etched in the American consciousness ... and the Square card reader allows us to take advantage of that, to allow people to sell things more successfully without changing people's behavior."

He may have a point that the particular technology matters less than the mobile wallet itself. We could do pretty much the same thing by using through-the-cloud technologies (as Bump does) or direct billing (like Boku or Zong). But I think he's overlooked the clear value that seems likely to come to merchants as consumers ditch plastic for mobile wallets.

To name just three:

  • Merchants can administer reward and loyalty programs more efficiently if they're managed through phones rather than on rubber-stamped cards.
  • Merchants can deliver location- and time-specific coupons if they are acquainted with a customer's phone. Placecast is showing how you can deliver offers within a geofenced area (a rough sketch of such a check follows this list). Merchants will also have the opportunity to move discounts quickly if they need to clear inventory. All of that is theoretically possible today with Twitter, but first you have to get them to follow you. Once someone has paid with their phone, presumably it's a lower barrier to get them to agree to receive offers via that phone.
  • Merchants can dynamically steer customers to their best payment option. If PayPal offers a lower percentage for a period than the merchant's credit card service, the merchant can offer products or services at a discount and let the customers choose on their devices.
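
To make the geofencing bullet concrete, here is a rough sketch of the kind of proximity check a Placecast-style service might run. The store coordinates, radius, and offer text are invented values, and a real service would work from the phone's reported location rather than hard-coded points.

    from math import radians, sin, cos, asin, sqrt

    def distance_m(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in meters (haversine)."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6371000 * asin(sqrt(a))

    STORE = (42.3601, -71.0589)      # made-up store location
    GEOFENCE_RADIUS_M = 250          # made-up fence radius

    def offer_for(phone_lat, phone_lon):
        """Return a coupon only when the phone reports a location inside the fence."""
        if distance_m(phone_lat, phone_lon, *STORE) <= GEOFENCE_RADIUS_M:
            return "10% off your next coffee today"   # hypothetical offer
        return None

    print(offer_for(42.3605, -71.0585))  # inside the fence -> coupon
    print(offer_for(42.4000, -71.1000))  # outside -> None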

The benefits for consumers may be a bit less clear and are likely to be a tradeoff: it's our data that we'll be giving up in exchange for being on the receiving end of those benefits listed above. In other words, your digital trail in exchange for daily coupons and every 10th cup of coffee free.


Amazon's Kindle Fire doesn't have to be as good as iPad to steal market share

Should Apple worry about competition from Amazon's Kindle Fire? The quick consensus seems to be "no" since these are different devices for different functions. Still, I couldn't help myself from making the comparison between this contest and the dramatic rise of Android handsets against the near leveling of the iPhone market. Most reports on the Android versus iOS competition seem to pit the two evenly, as if it were in bad taste to mention that many Android phones cost hundreds of dollars less. Geeks might choose their smartphones based on their affection for Google or Apple. But you only need to visit the AT&T kiosk in your local mall and watch the purchasing decisions to get a truer picture of what's driving this race: cost. Apple's iPhone may be an object of beauty, inside and out, but when you're on a tight budget, you'll put up with the carrier's user interface.

The same thing could happen with Fire and iPad. Fire may not offer anywhere near the same capabilities as the iPad — though with its ability to access web services via its Silk browser, it may not lag far behind. But there are many millions of customers who won't have to think long and hard to save $300 if they can still have movies, TV, books, games, and the web, all on a color touchscreen.

Steven Levy in Wired noted that even if Fire isn't a threat to Apple's iPad, it will certainly be one to Barnes & Noble's Nook and to Netflix. At a time when half of Netflix's membership seems to be furious with the company, many are sure to notice they can get a whole new world of streaming for $79 a year from Amazon Prime.

Mobile broadband is less popular as an add-on

Customers use more mobile broadband services, and they use mobile broadband more frequently, when the capability is built into their devices and not used as an add-on (for example, a USB dongle or stick). This not-too-surprising finding comes from YouGov UK's recent survey of 2,552 British mobile broadband users. It reinforces the suspicion that the easier you make it to get to online services, the more likely they are to get used. Certainly, there's some allowance built into those results for the dongle or stick getting lost or just stuck at the bottom of the backpack. But it also seems likely that those who buy a device that's capable of reaching the web are more likely to use it than those for whom it was an afterthought.

Got news?

News tips and suggestions are always welcome, so please send them along.




August 17 2011

Big broadband names must back £530m government drive to reach rural areas, experts warn

Plans to extend fast broadband to all of UK must be supported by the major internet service providers, industry figures say

The government's £530m drive to get super-fast broadband to the UK's rural communities will fail unless more is done to encourage the biggest providers to reach the most remote customers, industry insiders have warned.

Jeremy Hunt announced funding for Northern Ireland, Scotland and the English county councils this week to bring broadband to every home and business in the country. But a pioneering council-run project which has used £90m of public subsidy to build a network in South Yorkshire is already running into difficulties.

Five councils will have built a fibre-optic super-fast broadband network reaching 80% of premises across the Sheffield, Barnsley, Doncaster and Rotherham areas by the end of the year, but none of the best-known internet service providers – BT, TalkTalk and BSkyB – have so far signed up.

Without them, demand from consumers looks likely to be so low that the consortium, Digital Region, may be forced to abandon the aim of selling to householders and focus its attention on businesses and the public sector.

"The local projects are failing. Take-up is very low because of poor marketing and the limited choice of internet service providers," claimed Piers Daniell, an entrepreneur whose company, Fluidata, connects some 40 of the smaller internet service providers (ISPs) into BT's network.

"The bigger ISPs are key to driving uptake in the regions because they have the brands people are familiar with, they have big marketing budgets and additional services like television and phone calls."

Linking into a small regional network means renting optical fibre to connect that region to an ISP's own servers, and many of them are based in London. It also requires money spent matching up computer systems so that the process of adding new customers can be automated.

Most internet service providers are not prepared to make the extra investment, even in an area like South Yorkshire, which has 40,000 businesses and 500,000 home users.

ISPs prefer to buy their broadband capacity from BT and at most one or two other networks, said Daniell, which meant many of the dozens of rural networks being funded by government subsidy could be ignored, unless they are built by BT.

If the government's money is well spent, remote villages in regions such as Cumbria and the Scottish Highlands, where it is currently hard to load a simple web page, should be able to download or stream high-quality movies by 2015.

But Digital Region has also run into difficulties with BT. The consortium runs its fibre to street cabinets, but relies on BT's copper wire to get the signal from the cabinets to individual premises. Telecoms watchdog Ofcom has been asked to intervene over the price BT charges to connect new customers – Digital Region is charged £127 per customer, while connecting a BT customer costs £75.

David Carr, chief executive of Digital Region, said: "We support the government's apparent desire to introduce competition into the super-fast broadband market, but we question how committed it is to creating the regulatory conditions in which that competition can flourish. In this respect, we're hoping that the price reduction recommended by Ofcom will bring pricing more in line with what BT charges itself internally."

Carr said council-run networks had considered clubbing together to form a single access point for ISPs which want to reach rural customers. Fluidata is hoping to offer this service on a commercial basis, and to advise councils as they build.

"Local authorities need more support in building networks that can be sold," said Daniell. "It's easier if we have got communication with the people building these networks while they are building them, rather than trying to fix them afterwards."




March 11 2011

Four short links: 11 March 2011

  1. The Coming Mobile Data Apocalypse (Redmonk) -- it is clear that the appetite for mobile bandwidth will grow exponentially over the next twelve to eighteen months. With high volumes of smartphones shipping, more and larger form factors entering the market, and the accelerating build out of streaming services, bandwidth consumption is set to spike. Equally apparent is that the carriers are ill provisioned to address this demand, both from a network capacity perspective as well as with their pricing structures.
  2. Hamster Burial Kit and 998 Other Ideas -- For Seth Godin's Alternative MBA program, this week the nine of us came up with 111 business ideas each. But ideas are only valuable when someone (like you) makes something happen. What follows are our 999 business ideas, free for the taking.
  3. Sci Foo Short Videos -- questions posed to Sci Foo attendees with interesting answers. I liked "What Worries You?"
  4. Instapaper 3 Released -- all the features are ones I've wanted, which tells me Marco is listening very closely to his customers. Again I say: Instapaper changes the way I use the web as much as RSS did.

February 17 2011

Broadband availability and speed visualized in new government map

Today, the United States Department of Commerce's National Telecommunications and Information Administration (NTIA) unveiled a new National Broadband Map, which can be viewed at BroadbandMap.gov.

The map includes more than 25 million searchable records and it incorporates crowdsourced reporting. Built entirely upon Wordpress, the map is also one of the largest implementations of open source and open data in government to date.

Importantly, the data behind the map shows that despite an increase in broadband adoption to 68%, a digital divide persists between citizens who have full access to the rich media of the 2011 Internet and those who are limited by geography or means.

[Figure: National Broadband Map showing wired broadband availability]

The launch of a national map of broadband Internet access fulfills a Congressional mandate created by the 2009 federal stimulus, which directed regulators to collect better data to show which communities have broadband access — and which do not. The National Broadband Map is searchable, right down to individual census block.

"Broadband is as vital and transformative today as electricity was in the 20th century," said FCC chairman Julius Genachowski in a press briefing today. "Millions live in areas where they can't get access even if they want it." Genachowski asserted that extending broadband access to the nearly one third of Americans still without high-speed Internet is essential for the United States to remain globally competitive. "

The FCC chairman also noted that the release of the map was not only important in terms of what it could tell legislators and regulators but that it was "also part of making government more open and participatory," with respect to how it used technology to help citizens drive solutions.

As Anne Neville, director of the State Broadband Initiative at the NTIA, explains in the first post on the Broadband Map blog, crowdsourcing will be an important part of gathering more data. Wherever broadband speed data isn't available, the Commerce Department wants people to submit reports using the speed test apps. By reporting dead zones, citizens can add further results to the database of more than 2 million reports that have already been filed.

The creators of the map showed some social media savvy by providing a short URL for maps, like nbm.gov/8Z0j, and creating the @BroadbandMap Twitter account (though the account hadn't sent any tweets at the time this post went live).

The designers of the map said during the press briefing that it embodied "the spirit of the Internet through open formats and protocols." Specifically, the National Broadband Map was built on the LAMP stack (Linux, Apache, MySQL and PHP) familiar to open source developers everywhere. Currently, the site has around 35 RESTful APIs that will enable developers to write applications to look at specific providers. The open government data behind the National Broadband Map can also be downloaded for anyone to use. According to Commerce Department officials, this data will be updated twice a year over the next five years.
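
As a sketch of what "RESTful APIs plus downloadable open data" makes possible, the snippet below fetches a JSON endpoint and lists the providers it reports for an area. The URL and response fields are placeholders of my own, not the documented broadbandmap.gov API, which developers would need to look up; the point is only how little code a civic developer needs once such endpoints exist.

    import json
    from urllib.request import urlopen

    # Placeholder endpoint and fields -- consult the real API docs for actual paths.
    API_URL = "https://example.gov/broadband/providers?censusblock=250173531011000&format=json"

    def providers_for_block(url):
        """Fetch a JSON document and pull out (provider, advertised speed) pairs."""
        with urlopen(url) as resp:
            data = json.load(resp)
        # Assumed response shape: {"results": [{"provider": ..., "maxAdDown": ...}, ...]}
        return [(r["provider"], r["maxAdDown"]) for r in data.get("results", [])]

    for name, speed in providers_for_block(API_URL):
        print(f"{name}: advertised up to {speed} Mbps down")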

Responding to reporters' questions on how the new map might be used by regulators, NTIA administrator Lawrence E. Strickling said that the National Broadband Map will be of great use to all manner of people, particularly for those interested in economic development. There is "nothing about our map that dictates that it will be regulatory," he noted.

That said, at least one visualization from the online gallery could certainly be used to encourage more truth in advertising: a comparison of advertised broadband speed vs actual speed shown in testing.

The FCC chairman and other staff have also indicated that a national map of broadband access will enable policy makers to better target resources toward bringing more people online, particularly if Universal Service Fund reform allows for increased funding of rural broadband. While data from more than 600 broadband providers is factored into the map, there's still more that civic developers might do in working with user-submitted data and government data to show how much choice consumers have in broadband access in a specific area.

Given that access to the Internet has become increasingly important to economic, educational, professional and commercial opportunities, understanding who has it and who doesn't is an important component of forming better public policy. Whether the United States government is able to successfully provide broadband access to all Americans through a combination of public and private partnerships and open spectrum reallocation is one of the central challenges of the moment.

December 13 2010

Four short links: 13 December 2010

  1. European mobile operators say big sites need to pay for users' data demands (Guardian) -- it's like the postal service demanding that envelope makers pay them because they're not making enough money just selling stamps. What idiocy.
  2. Grace Programming Language -- language designers working on a new teaching language.
  3. Gawker Media's Entire Database Hacked -- 1.5M usernames and passwords, plus content from their databases, in a torrent. What's your plan to minimize the harm of an event like this, and to recover? (via Andy Baio)
  4. Macmillan Do Interesting Stuff (Cameron Neylon) -- have acquired some companies that provide software tools to support scientists, and are starting a new line of business around it. I like it because it's a much closer alignment of scientists' interests with profit motive than, say, journals. Timo Hannay, who heads it, runs Science Foo Camp with Google and O'Reilly.

August 11 2010

What I get and don't get about the Google/Verizon proposal

Nobody knew for a long time what Google and Verizon were cooking up on
the network neutrality front, and after the release of their brief,
two-page roadmap (posted on Scribd as a PDF, among other places:
http://www.scribd.com/doc/35599242/Verizon-Google-Legislative-Framework-Proposal),
still nobody knows. All the usual Internet observers have had their
say, and in general the assessment is negative.

My first reaction was to ignore the whole thing, mainly because the
language of the agreement didn't match any Internet activity I could
recognize. Some of the false notes struck:

  • The Consumer Protections section keeps using the term "lawful" as if
    there was a regulatory regime on the Internet. Not even the people
    regularly accused of trying to extend government control over the
    Internet (ICANN, WSIS, and the ITU) believe they can define what's
    lawful and make people stick to it.

    If I can send and receive only lawful content, who do Google and
    Verizon think can stop me from exchanging child pornography or
    instructions to blow up buildings? What, in turn, distinguishes lawful
    applications and services from unlawful ones (outside of Saudi Arabia
    and the United Arab Emirates)?

    Deduction: This passage represents no meaningful or enforceable rules,
    but is thrown in to make regulators feel there's a policy where in
    fact there is none.

  • The Transparency section strews around nice, general statements no one
    could complain about--don't we all want our services to tell us what
    they're doing?--but the admonitions are too general to interpret or
    apply.

    For instance, Apple is adamant about its right to determine what apps
    are available to iPhone and iPad buyers. Is that transparency?
    Apparently not, because every Apple developer gnaws his fingernails
    waiting to hear whether and when his app will be accepted into the App
    Store. But I don't see language in the Google/Verizon transparency
    section that covers the App Store at all. They might well say it's not
    a networking issue.

    Fine, let's turn to networking. The carriers maintain that they need
    flexibility and a certain degree of secrecy to combat abuses such as
    spam; see for instance my blog post "Consider the economics in
    network neutrality"
    (http://www.oreillynet.com/onlamp/blog/2008/04/consider_the_economics_in_netw.html).
    Squaring this complex issue--which the Google/Verizon proposal
    covers in the next item, on Network Management--with transparency
    is a dilemma.

    Deduction: we can all say we're transparent and feel good, but life is
    too complex for authorities to be totally transparent about what
    they're transparent about.

  • The worst passage in my view is the one in the Regulatory Authority
    section assigning authority to the FCC for "broadband." That
    ill-defined term, used far too much in Washington, tends to appear in
    the context of universal service. One can regulate broadband by such
    things as providing incentives to build more networks, but the
    Regulatory Authority section sidesteps the more basic questions of who
    gets to regulate the building, interconnecting, and routing through
    networks.

    Deduction: Google and Verizon put this in to encourage the government
    to continue pouring money into the current telcos and cable companies
    so they can build more high-speed networks, but its effect on
    regulation is nil.

Not too inspiring on first impression, but because so many other
people raised such a brouhaha over the Google/Verizon announcement, I
decided to think about it a bit more. And I actually ended up feeling
good about one aspect. The proposal is really a big concession to the
network neutrality advocates. I had been feeling sour about proposals
for network neutrality because, as nice as they sound in the abstract,
the devil is in the details. Network management for spam and other
attacks provides one example.

But the Google/Verizon announcement explicitly denounces
discrimination and mandates adherence to Internet standards. (Of
course, some Internet standards govern discrimination.) It seems to me
that, after this announcement, no network provider can weep and wring
its hands and claim that it would be unable to do business on a
non-discriminatory basis. And network neutrality advocates can cite
this document for support.

But as others have pointed out, the concession granted in the
"Non-Discrimination Requirement" section is ripped away by the
"Additional Online Services" section to "traffic prioritization." This
makes it clear that the "services" offered in that section reach deep
into the network infrastructure where they can conflict directly with
public Internet service. Unless someone acknowledges the contradiction
between the two sections and resolves it in a logical manner, this
document becomes effectively unusable.

What about the other pesky little exemption in the proposal--wireless
networks? Certainly, a lot of computing is moving to mobile devices.
But wireless networks really are special. Not only are they hampered
by real limits on traffic--the networks being shared and having
limited spectrum--but users have limited tolerance for unwanted
content and for fidgeting around with their devices. They don't want
to perform sophisticated control over transmission over content; they
need someone to do it for them.

Anyway, fiber is always going to provide higher bandwidth than
wireless spectrum. So I don't believe wireless will become
dominant. It will be an extremely valuable companion to us as we walk
through the day, saving data about ourselves and getting information
about our environment, but plenty of serious work will go on over the
open Internet.

So in short, I disdain the Google/Verizon agreement from an editor's
point of view but don't mind it as a user. In general, I have nothing
against parties in a dispute (here, the telephone companies who want
to shape traffic and the Internet sites who don't want them to)
conducting talks to break down rigid policy positions and arrive at
compromises. The Google/Verizon talks are fraught with implications,
of course, because Google is a wireless provider and Verizon
distributes lots of phones with Google's software and services. So I
take the announcement as just one stake in the ground along a large
frontier. I don't see the proposal being adopted in any regulatory
context--it's too vague and limited--but it's interesting for what it
says about Google and Verizon.

March 16 2010

Google Fiber and the FCC National Broadband Plan

I've puzzled over Google's Fiber project ever since they announced it. It seemed too big, too hubristic (even for a company that's already big and has earned the right to hubris)--and also not a business Google would want to be in. Providing the "last mile" of Internet service is a high cost/low payoff business that I'm glad I escaped (a friend and I seriously considered starting an ISP back in '92, until we said "How would we deal with customers?").


But the FCC's announcement of their plans to widen broadband Internet access in the US (the "National Broadband Strategy") puts Google Fiber in a new context. The FCC's plans are cast in terms of upgrading and expanding the network infrastructure. That's a familiar debate, and Google is a familiar participant. This is really just an extension of the "network neutrality" debate that has been going on with fits and starts over the past few years.


Google has been outspoken in their support for the idea that network carriers shouldn't discriminate between different kinds of traffic. The established Internet carriers largely have opposed network neutrality, arguing that they can't afford to build the kind of high-bandwidth networks that are required for delivering video and other media. While the debate over network neutrality has quieted down recently, the issues are still floating out there, and no less important. Will the networks of the next few decades be able to handle whatever kinds of traffic we want to throw at them?


In the context of network neutrality, and in the context of the FCC's still unannounced (and certain to be controversial) plans, Google Fiber is the trump card. It's often been said that the Internet routes around damage. Censorship is one form of damage; non-neutral networks are another. Which network would you choose? One that can't carry the traffic you want, or one that will? Let's get concrete: if you want video, would you choose a network that only delivers real-time video from providers who have paid additional bandwidth charges to your carrier? Google's core business is predicated upon the availability of richer and richer content on the net. If they can ensure that all the traffic that people want can be carried, they win; if they can't, if the carriers mediate what can and can't be carried, they lose. But Google Fiber ensures that our future networks will indeed be able to "route around damage", and makes what the other carriers do irrelevant. Google Fiber essentially tells the carriers "If you don't build the network we need, we will; you will either move with the times, or you won't survive."


Looked at this way, non-network-neutrality requires a weird kind of collusion. Deregulating the carriers by allowing them to charge premium prices for high-bandwidth services only works as long as all the carriers play the same game, and all raise similar barriers against high-bandwidth traffic. As soon as one carrier says "Hey, we have a bigger vision; we're not going to put limits on what you want to do," the game is over. You'd be a fool not to use that carrier. You want live high-definition video conferencing? You got it. You want 3D video, requiring astronomical data rates? You want services we haven't imagined yet? You can get those too. AT&T and Verizon don't like it? Tough; it's a free market, and if you offer a non-competitive product, you lose. The problem with the entrenched carriers' vision is that, if you discriminate against high-bandwidth services, you'll kill those services off before they can even be invented.


The U.S. is facing huge problems with decaying infrastructure. At one time, we had the best highway system, the best phone system, the most reliable power grid; no longer. Public funding hasn't solved the problem; in these tea-party days, nobody's willing to pay the bills, and few people understand why the bills have to be as large as they are. (If you want some insight into the problems of decaying infrastructure, here's an op-ed piece on Pennsylvania's problems repairing its bridges.) Neither has the private sector, where short-term gain almost always wins over the long-term picture.


But decaying network infrastructure is a threat to Google's core business, and they aren't going to stand by idly. Even if they don't intend to become a carrier themselves, as Eric Schmidt has stated, they could easily change their minds if the other carriers don't keep up. There's nothing like competition (or even the threat of competition) to make the markets work.


We're looking at a rare conjunction. It's refreshing to see a large corporation talk about creating the infrastructure they need to prosper--even if that means getting into a new kind of business. To rewrite the FCC Chairman's metaphor, it's as if GM and Ford were making plans to upgrade the highway system so they could sell better cars. It's an approach that's uniquely Googley; it's the infrastructure analog to releasing plugins that "fix" Internet Explorer for HTML5. "If it's broken and you won't fix it, we will." That's a good message for the carriers to hear. Likewise, it's refreshing to see the FCC, which has usually been a dull and lackluster agency, taking the lead in such a critical area. An analyst quoted by the Times says "Once again, the FCC is putting the service providers on the spot." As well they should. A first-class communications network for all citizens is essential if the U.S. is going to be competitive in the coming decades. It's no surprise that Google and the FCC understand this, but I'm excited by their commitment to building it.


March 05 2010

Report from HIMMS Health IT conference: building or bypassing infrastructure

Today the Healthcare Information and
Management Systems Society (HIMSS)
conference wrapped up. In
previous blogs, I laid out the benefits of risk-taking in health care
IT (http://radar.oreilly.com/2010/03/report-from-himms-health-it-co.html),
followed by my main theme, interoperability and openness
(http://radar.oreilly.com/2010/03/report-from-himms-health-it-co-1.html).
This blog will cover a few topics about a third important issue,
infrastructure.

Why did I decide this topic was worth a blog? When physicians install
electronic systems, they find that they need all kinds of underlying
support. Backups and high availability, which might have been
optional or haphazard before, now have to be professional. Your
patient doesn't want to hear, "You need an antibiotic right away, but
we'll order it tomorrow when our IT guy comes in to reboot the
system." Your accounts manager would be almost as upset if you told
her that billing will be delayed for the same reason.

Network bandwidth

An old sales pitch in the computer field (which I first heard at
Apollo Computer in the 1980s) goes, "The network is the computer." In
the coming age of EHRs, the network is the clinic. My family
practitioner (in an office of five practitioners) had to install a T1
line when they installed an EHR. In eastern Massachusetts, whose soil
probably holds more T1 lines than maple tree roots, that was no big
deal. It's considerably more problematic in an isolated rural area where the bandwidth is more comparable to what I got in my hotel room during the conference (particularly after 10:30 at night, when I'm guessing a kid in a nearby room joined a massively multiplayer game). One provider from the Midwest told me that the incumbent charges $800 per month for a T1. Luckily, he found a cheaper alternative.

So the FCC is involved in health care now (http://www.fcc.gov/cgb/rural/rhcp.html). Bandwidth is perhaps their main focus at the moment, and they're explicitly tasked with making sure rural providers are able to get high-speed connections. This is not a totally new concern; the landmark 1996 Telecom Act included rural health care providers in its universal service provisions. I heard one economist deride the provision, asking what was special about rural health care providers that they should get government funding. Fifteen years later, I think rising health care costs and deteriorating lifestyles have answered that question.

Wireless hubs

The last meter is just as important as the rest of your network, and
hospitals with modern, technology-soaked staff are depending
increasingly on mobile devices. I chatted with the staff of a small
wireless company called Aerohive that aims its products at hospitals.
Its key features are:

Totally cable-free hubs

Not only do Aerohive's hubs communicate with your wireless endpoints,
they communicate with other hubs and switches wirelessly. They just
make the hub-to-endpoint traffic and hub-to-hub traffic share the
bandwidth in the available 2.4 and 5 GHz ranges. This allows you to
put them just about anywhere you want and move them easily.

Dynamic airtime scheduling

The normal 802.11 protocols share the bandwidth on a packet-by-packet basis, so a slow device can cause all the faster devices to go slower even when there is empty airtime. I was told that an 802.11n device can end up slower than an 802.11b device if it's remote and its signal has to go around barriers. Aerohive simply measures how fast packets are coming in from each device and allocates bandwidth in that ratio, like time-division multiplexing. If your device is ten times faster than someone else's and the bandwidth is available, you can use ten times as much bandwidth.
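
To make the contrast concrete, here's a toy Python model of the two sharing policies. The client names and link rates are invented, and real airtime fairness lives in the access point's firmware, not in anything resembling this sketch.

    # Toy comparison of per-packet fairness vs. airtime fairness. Client
    # names and link rates (Mbit/s) are invented; this shows the idea
    # behind the feature, not Aerohive's implementation.
    link_rates = {"laptop_11n": 130.0, "handheld_11b": 11.0, "tablet_11g": 54.0}

    def per_packet_fair(rates, packet_bits=12000):
        """Round-robin by packet: every client sends one equal-sized packet
        per cycle, so the slowest link stretches the cycle for everyone."""
        cycle_seconds = sum(packet_bits / (r * 1e6) for r in rates.values())
        return {c: packet_bits / cycle_seconds / 1e6 for c in rates}    # Mbit/s each

    def airtime_fair(rates):
        """Equal share of airtime: each client's throughput is proportional
        to its own link rate, so fast devices are no longer dragged down."""
        share = 1.0 / len(rates)
        return {c: r * share for c, r in rates.items()}                 # Mbit/s each

    print("per-packet fairness:", per_packet_fair(link_rates))
    print("airtime fairness:   ", airtime_fair(link_rates))

With these made-up numbers, per-packet sharing gives every client roughly the same 8-9 Mbit/s, while airtime sharing lets the 802.11n laptop keep most of its speed--which is exactly the behavior described above.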

Dynamic rerouting

Aerohive hubs use mesh networking and an algorithm somewhat like
Spanning Tree Protocol to reconfigure the network when a hub is added
or removed. Furthermore, when you authenticate with one hub, its
neighbors store your access information so they can pick up your
traffic without taking time to re-authenticate. This makes roaming
easy and allows you to continue a conversation without a hitch if a
hub goes down.
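
As a rough illustration of that neighbor caching (my own toy sketch: the Hub class, the hub names, and the shared session data are invented, not Aerohive's actual protocol):

    # Toy sketch of neighbor credential caching for fast roaming.
    class Hub:
        def __init__(self, name):
            self.name = name
            self.neighbors = []        # other hubs within radio reach
            self.session_cache = {}    # client -> session/access info

        def authenticate(self, client, credentials):
            # Full (slow) authentication happens once, at the first hub...
            session = {"client": client, "keys": credentials}
            self.session_cache[client] = session
            # ...then the session state is pushed to neighboring hubs so a
            # roaming client can be picked up without re-authenticating.
            for hub in self.neighbors:
                hub.session_cache[client] = session

        def handle_roam(self, client):
            if client in self.session_cache:
                return self.name + ": resumed from cache, no re-authentication"
            return self.name + ": cache miss, full re-authentication required"

    a, b, c = Hub("ward-3-east"), Hub("ward-3-west"), Hub("radiology")
    a.neighbors = [b]                          # b is adjacent to a; c is not
    a.authenticate("nurse-tablet-17", credentials="802.1X session keys")
    print(b.handle_roam("nurse-tablet-17"))    # fast roam
    print(c.handle_roam("nurse-tablet-17"))    # would need a full handshake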

Security checking at the endpoint

Each hub has a built-in firewall so that no unauthorized device can
attach to the network. This should be of interest in an open, public
environment like a hospital where you have no idea who's coming in.

High bandwidth

The top-of-the-line hub has two MIMO radios, each with three
directional antennae.

Go virtual, part 1

VMware has customers in health care (http://www.vmware.com/solutions/industry/healthcare/case-studies.html), as in other industries. In addition, they've incorporated virtualization into several products from medical equipment and service vendors:

Radiology

Hospitals consider these critical devices. Virtualization here
supports high availability.

Services

A transcription service could require ten servers. Virtualization can
consolidate them onto one or two pieces of hardware.

Roaming desktops

Nurses often move from station to station. Desktop virtualization
allows them to pull up the windows just as they were left on the
previous workstation.

Go virtual, squared

If all this talk of bandwidth and servers brings pain to your head as
well as to the bottom line, consider heading into the cloud. At one
talk I attended today on cost analysis, a hospital administrator
reported that about 20% of their costs went to server hosting. They
saved a lot of money by rigorously eliminating unneeded backups, and a
lot on air conditioning by arranging their servers more efficiently.
Although she didn't discuss Software as a Service, those are a couple
examples of costs that could go down if functions were outsourced.

Lots of traditional vendors are providing their services over the Web
so you don't have to install anything, and several companies at the
conference are entirely Software as a Service. I mentioned Practice Fusion (http://www.practicefusion.com/) in my previous blog. At the conference, I asked them three key questions
pertinent to Software as a Service.

Security

This is the biggest question clients ask when using all kinds of cloud
services (although I think it's easier to solve than many other
architectural issues). Practice Fusion runs on HIPAA-compliant
Salesforce.com servers.

Data portability

If you don't like your service, can you get your data out? Practice
Fusion hasn't had any customers ask for their data yet, but upon
request they will produce a DVD containing your data in CSV files, or
in other common formats, overnight.
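
To make the portability point concrete, an export like the one they describe is the sort of thing Python's standard csv module handles in a few lines; the field names and rows below are invented, not Practice Fusion's actual schema.

    # Illustrative only: invented field names and rows, not the vendor's
    # real export format. The point is that CSV is trivial to re-import.
    import csv

    records = [
        {"patient_id": "0001", "encounter_date": "2010-02-12", "diagnosis": "acute sinusitis"},
        {"patient_id": "0002", "encounter_date": "2010-02-15", "diagnosis": "type 2 diabetes"},
    ]

    with open("chart_export.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["patient_id", "encounter_date", "diagnosis"])
        writer.writeheader()
        writer.writerows(records)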

Extendibility

As I explained in my previous blog, clients increasingly expect a
service to be open to enhancements and third-party programs. Practice
Fusion has an API in beta, and plans to offer a sandbox on their site
for people to develop and play with extensions--which I consider
really cool. One of the API's features is to enforce a notice to the
clinician before transferring sensitive data.
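
I haven't seen the beta API, so purely as a sketch of what "enforce a notice before transferring sensitive data" might look like to a third-party developer--every class, category, and record below is my own invention, not Practice Fusion's design:

    # Hypothetical sketch only: nothing here is the real API.
    SENSITIVE_CATEGORIES = {"mental_health", "substance_abuse", "hiv_status"}

    class ConsentRequired(Exception):
        """Raised when a record needs an explicit clinician acknowledgment."""

    class ChartClient:
        def __init__(self, records):
            self._records = records        # stand-in for the remote service
            self._acknowledged = set()     # record ids the clinician has cleared

        def acknowledge_notice(self, record_id):
            """Clinician confirms they have seen the sensitive-data notice."""
            self._acknowledged.add(record_id)

        def fetch(self, record_id):
            record = self._records[record_id]
            if record["category"] in SENSITIVE_CATEGORIES and record_id not in self._acknowledged:
                raise ConsentRequired("notice must be acknowledged before releasing " + record_id)
            return record

    client = ChartClient({"r42": {"category": "mental_health", "note": "..."}})
    try:
        client.fetch("r42")
    except ConsentRequired as err:
        print(err)                         # this is where the UI would show the notice
        client.acknowledge_notice("r42")
        print(client.fetch("r42"))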

The big selling point that first attracts providers to Practice Fusion
is that it's cost-free. They support the service through ads, which
users tell them are unobtrusive and useful. But you can also pay to
turn off ads. The service now has 30,000 users and is adding about 100
each day.

Another SaaS company I mentioned in my previous blog is Covisint (http://www.covisint.com/). Their service is broader than Practice Fusion's, covering not only patient records but
billing, prescription ordering, etc. Operating also as an HIE, they
speed up access to data on patients by indexing all the data on each
patient in the extended network. The actual data, for security and
storage reasons, stays with the provider. But once you ask about a
patient, the system can instantly tell you what sorts of data are
available and hook you up with the providers for each data set.

Finally, I talked to the managers of a nimble new company called CareCloud (http://carecloud.com/), which will start serving customers in early April. CareCloud, too, offers a range of services in patient health records, practice management, and revenue cycle management. It was built entirely on open source software--Ruby on Rails and a PostgreSQL database--while using Flex to build their snazzy interface, which can run in any browser (including the iPhone, thanks to Adobe's upcoming translation to native code). Their strategy is based on improving
physicians' productivity and the overall patient experience through a
social networking platform. The interface has endearing Web 2.0 style
touches such as a news feed, SMS and email confirmations, and
integration with Google Maps.

And with that reference to Google Maps (which I complained in my first blog had mislocated the address 285 International Blvd NW for the Georgia World Congress Center--thanks to the Google Local staff for getting in touch with me right after a tweet), I'll end my coverage of this year's HIMSS.

February 10 2010

Google Enters the Home Broadband Market

In a week already full of Google announcements, another bomb was casually dropped today via Google's blog. The Borg from California announced that it was experimentally entering the fiber-to-the-home (FTTH) market, and that it planned to offer much higher speeds than current offerings (1 Gb/s) at competitive pricing. The announcement also talks about what, when you remove the marketspeak, is a commitment to net neutrality in their service. This, of course, is not surprising, given Google's strong lobbying for neutrality before the FCC and Congress.

What is becoming very clear is that Google wants to have a finger in, if not own, most of the pie when it comes to how consumers and businesses access their information. Android was a first foray into the mobile market, and we know that Google was in the chase for cellular spectrum during the big auction. Google Voice is another attempt to make an end run around the traditional telecom infrastructure. But if Google becomes a major player in fiber to the home, they take a huge step forward.

Once Google has a pipe into the house, they can easily become a player in VoIP and landline telephone service, as well as cable TV and video on demand. Of course, these areas are fraught with regulatory issues. Many towns require cable providers to enter into individual franchise agreements in order to provide service, which can be a nightmare when you multiply it by N towns. But it's much easier to offer when you have a bit pipe already in place. And a 1 Gb/s service will allow for HD or even Blu-ray 3D service on demand to the house.

In a way, you can say that it's about time someone offered gigabit fiber in the US. In Europe and Asia, this level of service is already in place, and it's a bit of a crime that we lag so far behind. Google could jumpstart the market in the US, and without all the baggage that the traditional telcos are carrying around.

Mind you, this is just an experiment. According to Google, the pilot will offer service to somewhere between 50,000 and 500,000 people. But unlike many companies, Google "experiments" have a habit of turning into game-changing products.

January 14 2010

Innovation Battles Investment as FCC Road Show Returns to Cambridge

Opponents can shed their rhetoric and reveal new depths to their
thought when you bring them together for rapid-fire exchanges,
sometimes with their faces literally inches away from each other. That
made it worth my while to truck down to the MIT Media Lab for
yesterday's Workshop on Innovation, Investment and the Open Internet (http://hraunfoss.fcc.gov/edocs_public/attachmatch/DOC-295521A1.pdf), sponsored by the Federal Communications Commission. In this article I'll cover:

Context and background

The FCC kicked off its country-wide hearing campaign almost two years
ago with a meeting at Harvard Law School, which quickly went wild. I
covered the experience in one article (http://radar.oreilly.com/archives/2008/02/network-neutrality-how-the-fcc.html) and the unstated agendas in another (http://radar.oreilly.com/archives/2008/02/network-neutrality-code-words.html). With a star cast and an introduction
by the head of the House's Subcommittee on Telecommunications and the
Internet, Ed Markey, the meeting took on such a cachet that the
public flocked to the lecture hall, only to find it filled because
Comcast recruited people off the street to pack the seats and keep
network neutrality proponents from attending. (They had an overflow
room instead.)

I therefore took pains to arrive at the Media Lab's Bartos Theater
early yesterday, but found it unnecessary. Even though Tim Berners-Lee
spoke, along with well-known experts across the industry, only 175
people turned up, in my estimation (I'm not an expert at counting
crowds). I also noticed that the meeting didn't rate a mention in today's Boston Globe.

Perhaps it was the calamitous earthquake yesterday in Haiti, or the
bad economy, or the failure of the Copenhagen summit to solve the
worst crisis ever facing humanity, or concern over three wars the US
is involved in (if you count Yemen), or just fatigue, but it seems
that not as many people are concerned with network neutrality as two
years ago. I recognized several people in the audience yesterday and
surmised that the FCC could have picked out a dozen people at random
from their seats, instead of the parade of national experts on the
panel, and still have led a pretty darned good discussion.

And network neutrality is definitely the greased pig everyone is
sliding around. There are hundreds of things one could discuss
in the context of innovation and investment, but various political
forces ranging from large companies (AT&T versus Google) to highly
visible political campaigners (Huffington Post) have made network
neutrality the agenda. The FCC gave several of the movement's leaders
rein to speak, but perhaps signaled its direction by sending Meredith
Attwell Baker as the commissioner in attendance.

In contrast to FCC chair Julius Genachowski, who publicly calls for
network neutrality (a position also taken by Barack Obama during his presidential campaign: http://www.barackobama.com/issues/technology/index_campaign.php#open-internet), Baker has traditionally
espoused a free-market stance. She opened the talks yesterday by
announcing that she is "unconvinced there is a problem" and posing the
question: "Is it broken?" I'll provide my own opinion later in this
article.

Two kinds of investment

Investment is the handmaiden, if not the inseminator, of innovation.
Despite a few spectacular successes, like the invention of Linux and
Apache, most new ideas require funding. Even Linux and Apache are now represented by foundations backed by huge companies.

So why did I title this article "Innovation Battles Investment"?
Because investment happens at every level of the Internet, from the
cables and cell towers up to the applications you load on your cell
phone.

Here I'll pause to highlight an incredible paradigm shift that was
visible at this meeting--a shift so conclusive that no one mentioned
it. Are you old enough to remember the tussle between "voice" and
"data" on telephone lines? Remember the predictions that data would
grow in importance at the expense of voice (meaning Plain Old
Telephone Service) and the milestones celebrated in the trade press when
data pulled ahead of voice?

Well, at the hearing yesterday, the term "Internet" was used to cover
the whole communications infrastructure, including wires and cell
phone service. This is a mental breakthrough all its own, and one
I'll call the Triumph of the Singularity.

But different levels of infrastructure benefit from different
incentives. I found that all the participants danced around this.
Innovation and investment at the infrastructure level got short shrift
from the network neutrality advocates, whether in the bedtime story
version delivered by Barbara van Schewick or the deliberately
intimidating, breakneck overview by economist Shane Greenstein, who
defined openness as "transparency and consistency to facilitate
communication between different partners in an independent value
chain."

You can explore his papers on your own (http://www.kellogg.northwestern.edu/faculty/greenstein/images/articles.html), but I took this to mean, more or less, that everybody sharing a platform should broadcast their intentions and apprise everybody else of their plans, so that others can make the most rational decisions and invest wisely. Greenstein realized, of course, that firms have little incentive to share their strategies. He said that communication was "costly," which I take as a reference not to an expenditure of money but to a surrender of control and a relinquishing of opportunities.

This is just what the cable and phone companies are not going to do.
Dot-com innovator Jeffrey Glueck, founder of Skyfire (http://www.skyfire.com/), would like the FCC to
require ISPs to give application providers and users at least 60 to 90
days notice before making any changes to how they treat traffic. This
is absurd in an environment where bad actors require responses within
a few seconds and the victory goes to the router administrators with
the most creative coping strategy. Sometimes network users just have
to trust their administrators to do the best thing for them. Network
neutrality becomes a political and ethical issue when administrators
don't. But I'll return to this point later.

The pocket protector crowd versus the bean counters

If the network neutrality advocates could be accused of trying to
emasculate the providers, advocates for network provider prerogative
were guilty of taking the "Trust us" doctrine too far. For me, the
best part of yesterday's panel was how it revealed the deep gap that
still exists between those with an engineering point of view and those
with a traditional business point of view.

The engineers, led by Internet designer David Clark, repeated the
mantra of user control of quality of service, the vehicle for this
being the QoS field added to the IP packet header. Van Schewick
postulated a situation where a user increases the QoS on one session
because they're interviewing for a job over the Internet, then reduces
the QoS to chat with a friend.

In the rosy world envisioned by the engineers, we would deal not with
the physical reality of a shared network with our neighbors, all
converging into a backhaul running from our ISP to its peers, but with
the logical mechanism of a limited, dedicated bandwidth pipe (former
senator Ted Stevens can enjoy his revenge) that we would spend our
time tweaking. One moment we're increasing the allocation for file
transfer so we can upload a spreadsheet to our work site; the next
moment we're privileging the port we use for a massively multiplayer game.
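
For what it's worth, the per-session knob the engineers have in mind already exists at the socket level. The sketch below marks one connection's packets with a high-priority DSCP value and another's as best effort (Linux-specific, and nothing obliges the carriers along the path to honor the marking--which is precisely the policy fight described here):

    # Sketch: mark one socket's traffic as high priority via the DSCP bits
    # in the IP header's ToS byte (Linux). Whether any network en route
    # honors the marking is the policy question, not a technical one.
    import socket

    DSCP_EF = 46   # "expedited forwarding," commonly used for voice/video
    DSCP_BE = 0    # best effort

    def marked_socket(dscp):
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        # The DSCP code point occupies the top six bits of the old ToS byte.
        s.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, dscp << 2)
        return s

    interview_call = marked_socket(DSCP_EF)   # the job-interview session
    casual_chat = marked_socket(DSCP_BE)      # the chat with a friend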

The practicality of such a network service is open to question. Glueck
pointed out that users are unlikely ever to ask for lower quality of
service (although this is precisely the model that Internet experts
have converged on, as I report in my 2002 article "A Nice Way to Get Network Quality of Service?" at http://www.oreillynet.com/pub/a/network/2002/06/11/platform.html). He recommends
simple tiers of service--already in effect at many providers--so that
someone who wants to carry out a lot of P2P file transfers or
high-definition video conferencing can just pay for it.

In contrast, network providers want all the control. Much was made
during the panel of a remark by Marcus Weldon of Alcatel-Lucent in
support of letting the providers shape traffic. He pointed out that
video teleconferencing over the fantastically popular Skype delivered
unappealing results over today's best-effort Internet delivery, and
suggested a scenario where the provider gives the user a dialog box
where the user could increase the QoS for Skype in order to enjoy the
video experience.

Others on the panel legitimately flagged this comment as a classic
illustration of the problem with providers' traffic shaping: the
provider would negotiate with a few popular services such as Skype
(which boasts tens of millions of users online whenever you log in)
and leave innovative young services to fend for themselves in a
best-effort environment.

But the providers can't see doing quality of service any other way.
Their business model has always been predicated on designing services
around known costs, risks, and opportunities. Before they roll out a
service, they need to justify its long-term prospects and reserve
control over it for further tweaking. If the pocket protector crowd in
Internet standards could present their vision to the providers in a
way that showed them the benefits they'd accrue from openness
(presumably by creating a bigger pie), we might have progress. But the
providers fear, above all else, being reduced to a commodity. I'll pick up
this theme in the next section.

Is network competition over?

Law professor Christopher S. Yoo is probably the most frequently heard of the academics who favor network provider prerogatives (not at this panel, unfortunately, where he was given only a few minutes). He suggested that competition has changed since the Internet we knew in the 1990s, and therefore requires a different approach to providers' funding models. Emerging markets (where growth comes mostly from signing up new customers) differ from saturated markets (where growth comes mainly from wooing away your competitors' customers). With 70% of households using cable or fiber broadband offerings, he suggested the U.S. market was getting saturated, or mature.

Well, only if you accept that current providers' policies will stifle
growth. What looks like saturation to an academic in the U.S. telecom
field looks like a state of primitive underinvestment to people who
enjoy lightning-speed service in other developed nations.

But Yoo's assertion makes us pause for a moment to consider the
implications of a mature network. When change becomes predictable and
slow, and an infrastructure is a public good--as I think everyone
would agree the Internet is--it becomes a candidate for government
takeover. Indeed, there have been calls for various forms of
government control of our network infrastructure. In some places this
is actually happening, as cities and towns create their own networks.
A related proposal is to rigidly separate the physical infrastructure
from the services, barring companies that provide the physical
infrastructure from offering services (and therefore presumably
relegating them to a maintenance role--a company in that position
wouldn't have much incentive to take on literally ground-breaking new
projects).

Such government interventions are politically inconceivable in the
United States. Furthermore, experience in other developed nations with
more successful networks shows that it is unnecessary.

No one can doubt that we need a massive investment in new
infrastructure if we want to use the Internet as flexibly and
powerfully as our trading partners. But there was disagreement yesterday about
how much of an effort the investment will take, and where it will come
from.

Yoo argued that a mature market requires investment to come from
operating expenditures (i.e., charging users more money, which
presumably is justified by discriminating against some traffic in
order to offer enhanced services at a premium) instead of capital
expenditures. But Clark believes that current operating expenditures
would permit adequate growth. He anticipated a rise in Internet access
charges of $20 a month, which could fund the added bandwidth we need
to reach the Internet speeds of advanced countries. In exchange for
paying that extra $20 per month, we would enjoy all the content we
want without paying cable TV fees.

The current understanding by providers is that usage is rising
"exponentially" (whatever that means--they don't say what the exponent
is) whereas charges are rising slowly. Following some charts from
Alcatel-Lucent's Weldon that showed profits disappearing entirely in a
couple years--a victim of the squeeze between rising usage and slow
income growth--Van Schewick challenged him, arguing that providers can
enjoy lower bandwidth costs to the tune of 30% per year. But Weldon
pointed out that the only costs going down are equipment, and claimed
that after a large initial drop caused by any disruptive new
technology, costs of equipment decrease only 10% per year.
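
A quick back-of-the-envelope calculation shows why the two decline rates lead to such different conclusions. The 30% and 10% figures come from the panel; the 40% annual traffic growth is my own placeholder, since the providers never quantify their "exponential" growth:

    # Rough cost trajectory: traffic grows while per-bit cost falls.
    # The 30% and 10% declines are the panelists' figures; the 40% annual
    # traffic growth is an assumed placeholder, not a number from the panel.
    traffic_growth = 0.40
    years = 5

    def relative_cost(annual_cost_decline):
        index = 1.0
        for _ in range(years):
            index *= (1 + traffic_growth) * (1 - annual_cost_decline)
        return index

    print(f"30%/yr decline: {relative_cost(0.30):.2f}x today's cost")
    print(f"10%/yr decline: {relative_cost(0.10):.2f}x today's cost")

Under van Schewick's figure, the total cost of carrying the (assumed) traffic stays roughly flat over five years; under Weldon's, it roughly triples, which is the squeeze his charts depicted.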

Everyone agreed that mobile, the most exciting and
innovation-supporting market, is expensive to provide and suffering an
investment crisis. It is also the least open part of the Internet and
the part most dependent on legacy pricing (high voice and SMS
charges), deviating from the Triumph of the Singularity.

So the Internet is like health care in the U.S.: in worse shape than
it appears. We have to do something to address rising
usage--investment in new infrastructure as well as new
applications--just as we have to lower health care costs that have
surpassed 17% of the gross domestic product.

Weldon's vision--a rosy one in its own way, complementing the
user-friendly pipe I presented earlier from the engineers--is that
providers remain free to control the speeds of different Internet
streams and strike deals with anyone they want. He presented provider
prerogatives as simple extensions of what already happens now: large companies create private networks on which they can impose QoS for their users, and major web sites contract with content delivery
networks such as Akamai (represented at yesterday's panel by lawyer
Aaron Abola) to host their content for faster response time. Susie Kim
Riley of Camiant testified that European providers are offering
differentiated services already, and making money by doing so.

What Weldon and Riley left out is what I documented in "A Nice Way to Get Network Quality of Service?" (http://www.oreillynet.com/pub/a/network/2002/06/11/platform.html). Managed networks
providing QoS are not the Internet. Attempts to provide QoS over the
Internet--by getting different providers to cooperate in privileging
certain traffic--have floundered. The technical problems may be
surmountable, but no one has figured out how to build trust and to design
adequate payment models that would motivate providers to cooperate.

It's possible, as Weldon asserts, that providers allowed to manage
their networks would invest in infrastructure that would ultimately
improve the experience for all sites--those delivered over the
Internet by best-effort methods as well as those striking deals. But
the change would still represent increased privatization of the public
Internet. It would create what application developers such as Glueck
and Nabeel Hyatt of Conduit Labs fear most: a thousand different
networks with different rules that have to be negotiated with
individually. And new risks and costs would be placed in the way of
the disruptive innovators we've enjoyed on the Internet.

Competition, not network neutrality, is actually the key issue facing
the FCC, and it was central to their Internet discussions in the years
following the 1996 Telecom Act. For the first five years or so, the
FCC took seriously a commitment to support new entrants by such
strategies as requiring incumbent companies to allow interconnection.
Then, especially under Michael Powell, the FCC did an about-face.

The question posed during this period was: what leads to greater
investment and growth--letting a few big incumbents enter each other's
markets, or promoting a horde of new, small entrants? It's pretty
clear that in the short term, the former is more effective because the
incumbents have resources to throw at the problem, but that in the
long term, the latter is required in order to find new solutions and
fix problems by working around them in creative ways.

Yet the FCC took the former route, starting in the early 2000s. They
explicitly made a deal with incumbents: build more infrastructure, and
we'll relax competition rules so you don't have to share it with other
companies.

Starting a telecom firm is hard, so it's not clear that pursuing the
other route would have saved us from the impasse we're in today. But a
lack of competition is integral to our problems--including the one
being fought out in the field of "network neutrality."

All the network neutrality advocates I've talked to wish that we had
more competition at the infrastructure level, because then we could
rely on competition to discipline providers instead of trying to
regulate such discipline. I covered this dilemma in a 2006 article, "Network Neutrality and an Internet with Vision" (http://lxer.com/module/newswire/view/53907/). But somehow, this kind of competition is now off the FCC agenda. Even in the mobile space, they offer spectrum through auctions that permit the huge incumbents to gather up
the best bands. These incumbents then sit on spectrum without doing
anything, a strategy known as "foreclosure" (because it forecloses
competitors from doing something useful with it).

Because everybody goes off in their own direction, the situation pits two groups that should be cooperating against each other: small ISPs and proponents of an open Internet.

What to regulate

Amy Tykeson, CEO of a small Oregon Internet provider named
BendBroadband, forcefully presented the view of an independent
provider, similar to the more familiar imprecations by Brett Glass of Lariat (http://www.brettglass.com/). In their
world--characterized by paper-thin margins, precarious deals with
back-end providers, and the constant pressure to provide superb
customer service--flexible traffic management is critical and network
neutrality is viewed as a straitjacket.

I agree that many advocates of network neutrality have oversimplified
the workings of the Internet and downplayed the day-to-day
requirements of administrators. In contrast, as I have shown, large
network providers have overstepped their boundaries. But to end this
article on a positive note (you see, I'm trying) I'll report that the
lively exchange did produce some common ground and a glimmer of hope
for resolving the differing positions.

First, in an exchange between Berners-Lee and van Schewick on the
pro-regulatory side and Riley on the anti-regulatory side, a more
nuanced view of non-discrimination and quality of service emerged.
Everybody on the panel vociferously supported the position that it is unfair discrimination for a network provider to prevent a user from getting legal content or to promote one web site over a competing web site. And this is a major achievement, because those are precisely the practices that providers like AT&T and Verizon claim the right to engage in--the practices that spawned the current network neutrality
controversy.

To complement this consensus, the network neutrality folks approved
the concept of quality of service, so long as it was used to improve
the user experience instead of to let network providers pick winners.
In a context where some network neutrality advocates have made QoS a
dirty word, I see progress.

This raises the question of what counts as regulation. The traffic shaping
policies and business deals proposed by AT&T and Verizon are a
form of regulation. They claim the same privilege that large
corporations--we could look at health care again--have repeatedly
tried to claim when they invoke the "free market": the right of
corporations to impose their own regulations.

Berners-Lee and others would like the government to step in and issue
regulations that suppress the corporate regulations. A wide range of
wording has been proposed for the FCC's consideration. Commissioner
Baker asked whether, given the international reach of the Internet,
the FCC should regulate at all. Van Schewick quite properly responded
that the abuses carried out by providers are at the local level and
therefore can be controlled by the government.

Two traits of a market are key to innovation, and came up over and
over yesterday among dot-com founders and funders (represented by Ajay
Agarwal of Bain Capital) alike: a level playing field, and
light-handed regulation.

Sometimes, as Berners-Lee pointed out, government regulation is
required to level the playing field. The transparency and consistency
cited by Greenstein and others are key features of the level playing
field. And as I pointed out, a vacuum in government regulation is
often filled by even more onerous regulation by large corporations.

One of the most intriguing suggestions of the day came from Clark, who
elliptically suggested that the FCC provide "facilitation, not
regulation." I take this to mean the kind of process that Comcast and
BitTorrent went through, which Sally Shipman Wentworth of ISOC boasted about in her opening remarks. Working with the IETF (which she
said created two new working groups to deal with the problem), Comcast
and BitTorrent worked out a protocol that should reduce the load of
P2P file sharing on networks and end up being a win-win for everybody.

But there are several ways to interpret this history. To free market
ideologues, the Comcast/BitTorrent collaboration shows that private
actors on the Internet can exploit its infinite extendibility to find
their own solutions without government meddling. Free market
proponents also call on anti-competition laws to hold back abuses. But
those calling for parental controls would claim that Comcast wanted
nothing to do with BitTorrent and started to work on technical
solutions only after getting tired of the feces being thrown its way
by outsiders, including the FCC.

And in any case--as panelists pointed out--the IETF has no enforcement
power. The presence of a superior protocol doesn't guarantee that
developers and users will adopt it, or that network providers will
allow traffic that could be a threat to their business models.

The FCC at Harvard, which I mentioned at the beginning of this
article, promised intervention in the market to preserve Internet
freedom. What we got after that (as I predicted) was a slap on
Comcast's wrist and no clear sense of direction. The continued
involvement of the FCC--including these public forums, which I find
educational--shows, along with the appointment of the more
interventionist Genachowski and the mandate to promote broadband in
the American Recovery and Reinvestment Act, that it can't step away
from the questions of competition and investment.

December 16 2009

Four short links: 16 December 2009

  1. OECD Broadband Portal -- global data on broadband penetration and pricing available from June 2009.
  2. Easy Statistics for A/B Testing -- it really is easy. And it mentions hamsters. This is worth reading; a minimal worked example follows this list. (via Hacker News)
  3. last.fm's SSD Streaming Infrastructure -- Each single SSD can support around 7000 concurrent listeners, and the serving capacity of the machine topped out at around 30,000 concurrent connections in its tested configuration. Lots of hardware and OS configuration geeking here, it's great. (via Hacker News)
  4. Videos Sell More Product -- Zappos sells 6-30% more merchandise when accompanied by video demos. By the end of next year, Zappos will have ten full working video studios, with the goal of producing around 50,000 product videos by 2010, up from the 8,000 videos they have on the site today. (via johnclegg on Twitter)
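
Here is the promised worked example for item 2: a two-proportion z-test, the kind of easy statistics the linked post is about. The conversion counts are invented, and the post's own method may differ.

    # Two-proportion z-test on invented conversion counts.
    import math

    def ab_test(conv_a, n_a, conv_b, n_b):
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        # Two-sided p-value from the normal distribution.
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p_value

    z, p = ab_test(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
    print(f"z = {z:.2f}, p = {p:.3f}")   # p < 0.05 suggests variant B really converts better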
