
January 16 2014

Court prods FCC in unexpected direction in this week’s Verizon ruling

A court ruling this past Tuesday on FCC “network neutrality” regulation closes and opens a few paths in a three-way chess game that has been going on for years between the US Court of Appeals for the DC Circuit, the FCC, and the major Internet service providers. (Four-way if you include Congress, and five-way if you include big Internet users such as Google — so, our chess game is coming closer to Chinese Checkers at this point.)

A lot of bloggers, and even news headlines, careened into histrionics (“Net neutrality is dead. Bow to Comcast and Verizon, your overlords”). Free Press, although oversimplifying the impact, did correctly link the ruling to what it and many other network neutrality supporters consider the original sin of FCC rulings: eviscerating the common carrier regulation of broadband providers.

Even better, many commenters noted the ambiguities and double messages in the ruling. Unlike a famous earlier ruling on Comcast regulation, this week’s court ruling spends a good deal of time affirming the FCC’s right to regulate Internet providers. Notably, pp. 35-36 essentially confirm the value and validity of network neutrality (in the form of promoting innovation at the edges by placing no restraints on transmissions).

Let’s go over the efforts of Congress and the FCC to promote competition in Internet service, leading up to Tuesday’s ruling.

Two mandates handed down from the 20th century: Computer II and the Telecom Act of 1996

The major goal of the landmark 1996 Act was to promote competition. The market for Internet service was stunningly different in 1996 from what it is in the US today. There was a lot of competition — but not much reach and not much bandwidth. Many people still lacked Internet access, and those who got it from home dialed into an ISP, often a mom-and-pop operation. Few people could get Internet access over their cable TV network. More competition would presumably lead to more and faster Internet connections.

Although idealists (like me) looked forward to a teeming ecosystem of multiple ISPs, Congress was perhaps more realistic in expecting an oligopoly with a few big players in each geographic market (three companies was considered a good number for robust competition). Many expectations were placed on incumbents: there were seven “baby Bells” that came out of the breakup of the old AT&T, and although each occupied a separate geographic market, observers hoped they would enter each other’s markets. Cable companies were in most markets as well. Somehow, from all this raw material, new services were supposed to arise.

The law established interconnection points that the baby Bells had to provide to competitors. Theoretically, smaller companies could have exploited those points to find a market niche, but that hardly went anywhere (and many observers blamed the Bells for stymying competition). The seven baby Bells quickly recombined to make three (Verizon, CenturyLink, and a new AT&T), who competed for mobile phones but refrained from competing with landlines in each other’s regions. In many areas, a baby Bell and a cable company formed a duopoly.

Much of the country has enjoyed continuous Internet access (no dial-up) with increasing bandwidth, but many observers think the age of fiber expansion is over at both Verizon and AT&T, while the US remains far behind other developed countries in bandwidth.

Internet regulation (or lack thereof) goes back to 1966 with a series of “Computer inquiries” from the FCC. These have been universally praised for allowing the Internet to arise and spread, simply by announcing that the FCC would not regulate it. Computer II, in particular, distinguished the service offered by telephone companies over the line from the data service running through modems on either side. The Telecom Act enshrined this difference by defining “information services” that were separate from the “telecommunication services” that the FCC had long regulated as common carriers.

Telecommunication services (regulated under Title II of the law) have to provide equal, non-discriminatory access to all users. Information services do not. Clearly, companies will go to extreme lengths to evade being labeled a telecommunication service.

The big divide: cable versus baby Bell

Although we hear a lot about “digital divides” between urban and rural areas, rich and poor, white and minority (luckily decreasing), the divide I’m going to talk about here is a regulatory one. Cable companies are not common carriers; they have always been regulated differently. Local communities can require certain services (such as community and educational channels), but the cable companies are definitively free of the burdens of Title II.

Thanks to the Telecom Act, however, cable companies and telecom companies have come to look more and more alike. They all provide voice calls; they all provide TV channels; they all provide Internet access; and, increasingly, they all provide movies on demand and other services. The key problem the FCC faces — not blamable on Congress, the US District Court, or anybody in particular — is that for historical reasons it imposes much heavier requirements on telecom companies than on functionally identical cable companies. Cable companies offer both Internet transport and content of their own, all over the same physical channel — and now, telecom companies do the same. Something’s gotta give: either Title II regulation has to be imposed on cable companies, or it has to be removed from the baby Bells.

We should note, for historical context, that a Republican administration replaced a Democratic one in 2000, and in 2001 Michael K. Powell was appointed FCC chair. He brought with him a profound faith in the free market as a spur to competition and innovation. When the FCC announced in 2002 that cable modem service was an information service, Powell wrote a justification that reads almost like an apology:

The Commission does not have unconstrained discretion to pick its preferred definition or classification, as some imply. The Commission must attempt to faithfully apply the statutory definition to a service, based on the nature of the service, including the technology used and its capabilities, and the nature of the interactive experience for the consumer…The Commission is not permitted to look at the consequences of different definitions and then choose the label that comports with its preferred regulatory treatment.

But that, of course, is exactly what they did in their inquiry. “Even if Computer II were to apply, however, we waive on our own motion the requirements of Computer II in situations where the cable operator additionally offers local exchange service. The Commission, on its own motion or on petition, may exercise its discretion to waive such requirements on the basis of good cause shown and where the particular facts would make strict compliance inconsistent with the public interest.” (paragraph 45)

I’d like to argue that it was inevitable for them to jump off on this side of the fence. They could hardly evade the reasoning in paragraph 43: “The Commission has never before applied Computer II to information services provided over cable facilities. Indeed, for more than 20 years, Computer II obligations have been applied exclusively to traditional wireline services and facilities.” Regarding the alternative they saw, “to find a telecommunications service inside every information service,” they say, “Such radical surgery is not required.” In short, the technical reality behind Internet connections was irrelevant to the policy dilemma. This FCC decision is often called Brand X, after a court ruling that upheld the decision after a challenge led by an ISP of that name.

By the way, it’s not fair to consider Powell a tool of large corporations, as some critics do. He was deeply committed to the principle of free markets, and articulated four “Internet freedoms” reminiscent of Richard M. Stallman’s four software freedoms.

The sin ascribed to the FCC by Free Press and other network neutrality supporters is actually an inescapable corollary to the cable decision. In 2005 — after Powell left — they decided that new lines and equipment rolled out by telecom companies would not be subject to the common carrier requirements that had been in place for some 70 years. The decision explicitly and repeatedly refers to their Brand X cable modem ruling. They claim the change will enhance competition rather than hurting it.

I think the FCC was hamstrung by the evolution of the Internet industry. The hoped-for ecosystem of small Internet competitors was stunted and scattered. Real competition existed only among the big incumbents, both in telecom and in cable. As we’ll see, this had a major impact on campaigns among Internet activists. As for the FCC, the decisions to free those companies from common carrier status stemmed from a hope that they’d put on their boxing gloves. And they did — but the punches were aimed at the FCC rather than each other.

Giving up on the substrate

Over the past few years, advocates for more competition and growth on the Internet have tacitly moved “up the stack,” complaining about ISP practices such as interfering with certain content and their plans to charge certain Internet sites for favorable treatment. For instance, Comcast was found to be secretly throttling traffic when users were downloading large files. When unmasked, Comcast claimed it was placing restrictions on downloads to be fair to all users; critics suggested it regarded the streaming downloads as competition for its own offerings since movies played a large part in the downloads.

One can imagine that, back in the 1990s, ISP practices like this would lead to an exodus by disgusted customers. Nowadays, there’s much less choice. Network neutrality advocates seem to be taking the battle to the software layer because achieving large-scale competition at lower layers seems unattainable; real competition would require multiple companies to build out and compete at the physical layer itself. Meanwhile, advocates for tiered service suggest it will lower costs and encourage competition.

The FCC is caught between an aroused community of network neutrality advocates and a powerful set of industries looking for ways to increase revenue. Occasionally, it tries to intervene. But the same argument the FCC makes for removing regulation, enthusiastically accepted by the industry, is bitterly opposed when used for exerting regulation. In each case, this argument is:

  • The action we’re taking will promote investment and new services by Internet companies, such as the social networks and content providers.
  • That innovation will stimulate demand by users for more bandwidth, along with a willingness to pay.
  • That in turn leads to more investment and innovation (such as more efficient codecs for multimedia content) in Internet infrastructure.

Comcast’s secret traffic stifling led to the first court battle. In its 2010 ruling, the DC Circuit court basically told the FCC that it had tied its own hands by refusing to regulate the cable companies as common carriers. Cable modems fall in the cracks between the various categories regulated by the Telecom Act. The FCC can’t use Title II (common carrier status). Title III (broadcasting) doesn’t permit the kinds of regulation the FCC was trying to impose. When the FCC tries to cite its mandate to regulate pricing, the court tells it that its rate-setting authority covers only the basic tier.

The court essentially looked through the Telecom Act for a clause that explicitly let the FCC regulate a practice that didn’t emerge until a decade after the Act was passed, and — unsurprisingly — didn’t find one. The core of the ruling might be found on page 16: “…the Commission must defend its exercise of ancillary authority on a case-by-case basis.”

It would seem like all the players and props were on stage for the final act of the network neutrality drama. But Tuesday’s court ruling showed that the endgame is not at hand. The bottom line is the same — the FCC cannot apply its anti-discrimination and anti-blocking rules; but, as I mentioned at the beginning of the article, the court offered its own sort of encouragement.

The court essentially used a duck test. They found that the FCC regulation looked like a common carrier obligation, so they rapped its knuckles for trying to force common carrier status on companies. Because the FCC had previously removed common carrier status from these companies, the court said it couldn’t impose such regulations now.

Verizon’s lawyers started by cutting and pasting Comcast’s objections to the FCC ruling, changing section 230(b) of the Telecom Act to section 706 and adding some other distracting objections of their own. The court didn’t buy the comparison, which leaves hope for those who want the FCC to rein in ISP business practices. The court even revises the appearance of its earlier ruling, saying a bit snarkily, “In Comcast, we held that the Commission had failed to cite any statutory authority that justified its order, not that Comcast had never impaired Internet traffic.”

Some network neutrality advocates have reacted to the decisions and rulings I’ve discussed (as Free Press does) by asking the FCC to reverse its 2005 decision that allowed telecom companies essentially to expand as much as they want without opening up to competition. This would encounter insurmountable hurdles because government agencies have to cite compelling reasons to change any decision they’ve made, and eagle-eyed courts hold them to that high standard.

Other people trace the problem to the 1996 Telecom Act, apparently already outdated by rapid changes in the industry. I don’t have to assess the likelihood of getting Congress to take on a major revision at this time in its history, or the likelihood of Internet activists getting the result they want.

Or maybe communities will pool their resources to create their own infrastructure, a particularly bold suggestion when you consider how much it costs to string fiber between cities.

Tuesday’s ruling did not close off the FCC’s right to regulate Internet services — in fact, I think it expanded possibilities beyond the place they seemed to stand following the Comcast decision. I am not sure the current debate over things such as blocking is productive. I think much bigger forces are in play, as I discussed in my article last week about Internet centralization. However, I’ll lay odds that few, if any, lawyers will lose business as a result of Tuesday’s decision.

December 09 2013

Who will upgrade the telecom foundation of the Internet?

Although readers of this blog know quite well the role that the Internet can play in our lives, we may forget that its most promising contributions — telemedicine, the smart electrical grid, distance education, etc. — depend on a rock-solid and speedy telecommunications network, a network most people still lack, and therefore that relatively few people can actually take advantage of the shining future the Internet offers.

Worries over sputtering advances in bandwidth in the US, as well as an actual drop in reliability, spurred the FCC to create the Technology Transitions Policy Task Force, and to drive discussion of what they like to call the “IP transition”.

Last week, I attended a conference on the IP transition in Boston, one of a series being held around the country. While we tussled with the problems of reliability and competition, one urgent question loomed over the conference: who will actually make advances happen?

What’s at stake and why bids are coming in so low

It’s not hard to tally up the promise of fast, reliable Internet connections. Popular futures include:

  • Delivering TV and movie content on demand
  • Checking on your lights, refrigerator, thermostat, etc., and adjusting them remotely
  • Hooking up rural patients with health care experts in major health centers for diagnosis and consultation
  • Urgent information updates during a disaster, to aid both victims and responders

I could go on and on, but already one can see the outline of the problem: how do we get there? Who is going to actually create a telecom structure that enables everyone (not just a few privileged affluent residents of big cities) to do these things?

Costs are high, but the payoff is worthwhile. Ultimately, the applications I listed will lower the costs of the services they replace or improve life enough to justify an investment many times over. Rural areas — where investment is currently hardest to get — could probably benefit the most from the services because the Internet would give them access to resources that more centrally located people can walk or drive to.

The problem is that none of the likely players can seize the initiative. Let’s look at each one:

Telecom and cable companies
The upgrading of facilities is mostly in their hands right now, but they can’t see beyond the first item in the previous list. Distributing TV and movies is a familiar business, but they don’t know how to extract value from any of the other applications. In fact, most of the benefits of the other services go to people at the endpoints, not to the owners of the network. This has been a sore point with the telecom companies ever since the Internet took off, and spurs them on to constant attempts to hold Internet users hostage and shake them down for more cash.

Given the limitations of the telecom and cable business models, it’s no surprise they’ve rolled out fiber in the areas they want and are actually de-investing in many other geographic areas. Hurricane Sandy brought this to public consciousness, but the problem has actually been mounting in rural areas for some time.

Angela Kronenberg of COMPTEL, an industry association of competitive communications companies, pointed out that it’s hard to make a business case for broadband in many parts of the United States. We have a funny demographic: we’re not as densely populated as the Netherlands or South Korea (both famous for blazingly fast Internet service), nor as concentrated as Canada and Australia, where it’s feasible to spend a lot of money getting service to the few remote users outside major population centers. There’s no easy way to reach everybody in the US.
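To put rough numbers on that demographic point, here is a small illustrative sketch in Python; the density figures are approximate 2013-era values supplied here for illustration, not numbers cited at the conference.

    # Approximate population densities (people per square km); rough figures
    # for illustration only, not data from the conference.
    DENSITY_PER_SQ_KM = {
        "Netherlands": 410,
        "South Korea": 515,
        "United States": 34,
        "Canada": 4,
        "Australia": 3,
    }

    # Print from densest to sparsest.
    for country, density in sorted(DENSITY_PER_SQ_KM.items(), key=lambda kv: -kv[1]):
        print(f"{country:>14}: about {density:4d} people per square km")

The Netherlands and South Korea are more than ten times as dense as the US, while Canada and Australia, though even sparser on average, concentrate most of their people in a handful of metropolitan areas; the US combines low average density with a population scattered across thousands of smaller cities and towns, which is what makes a universal build-out so expensive.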

Governments
Although governments subsidize network construction in many ways — half a dozen subsidies were reeled off by keynote speaker Cameron Kerry, former Acting Secretary of the Department of Commerce — such stimuli can only nudge the upgrade process along, not control it completely. Government funding has certainly enabled plenty of big projects (Internet access is often compared to the highway system, for instance), but it tends to go toward familiar technologies that the government finds safe, and therefore misses opportunities for radical disruption. It’s no coincidence that these safe, familiar technologies are provided by established companies with lobbyists all over DC.

As an example of how help can come from unusual sources, Sharon Gillett mentioned on her panel the use of unlicensed spectrum by small, rural ISPs to deliver Internet to areas that otherwise had only dial-up access. The FCC ruling that opened up “white space” spectrum in the TV band to such use has greatly empowered these mavericks.

Individual consumers
Although we are the ultimate beneficiaries of new technology (and will ultimately pay for it somehow, through fees or taxes), hardly anyone can plunk down the cash for it in advance: the vision is too murky and the reward too far down the road. John Burke, Commissioner of the Vermont Public Service Board, flatly said that consumers choose their phone service almost entirely on the basis of price and don’t really find out its reliability and features until later.

Basically, consumers can’t bet that all the pieces of the IP transition will fall in place during their lifetimes, and rolling out services one consumer at a time is incredibly inefficient.

Internet companies
Google Fiber came up once or twice at the conference, but their initiatives are just a proof of concept. Even if Google became the lynchpin it wants to be in our lives, it would not have enough funds to wire the world.

What’s the way forward, then? I find it in community efforts, which I’ll explore at the end of this article.

Practiced dance steps

Few of the insights in this article came up directly in the Boston conference. The panelists were old hands who had crossed each other’s paths repeatedly, gliding between companies, regulatory agencies, and academia for decades. At the conference, they pulled their punches and hid their agendas under platitudes. The few controversies I saw on stage seemed to be launched for entertainment purposes, distracting from the real issues.

From what I could see, the audience of about 75 people came almost entirely from the telecom industry. I saw just one representative of what you might call the new Internet industries (Microsoft strategist Sharon Gillett, who went to that company after an august regulatory career) and two people who represent the public interest outside of regulatory agencies (speaker Harold Feld of Public Knowledge and Fred Goldstein of Interisle Consulting Group).

Can I get through to you?

Everyone knows that Internet technologies, such as voice over IP, are less reliable than plain old telephone service, but few realize how soon reliability of any sort will be a thing of the past. When a telecom company signs you up for a fancy new fiber connection, you are no longer connected to a power source at the telephone company’s central office. Instead, you get a battery that can last eight hours in case of a power failure. A local power failure may let you stay in contact with outsiders if the nearby mobile phone towers stay up, but a larger failure will take out everything.

These issues have a big impact on public safety, a concern raised at the beginning of the conference by Gregory Bialecki in his role as a Massachusetts official, and repeated by many others during the day.

There are ways around the new unreliability through redundant networks, as Feld pointed out during his panel. But the public and regulators must take a stand for reliability, as the post-Sandy victims have done. The issue in that case was whether a community could be served by wireless connections. At this point, they just don’t deliver either the reliability or the bandwidth that modern consumers need.

Mark Reilly of Comcast claimed at the conference that 94% of American consumers now have access to at least one broadband provider. I’m suspicious of this statistic because the telecom and cable companies have a very weak definition of “broadband” and may be including mobile phones in the count. Meanwhile, we face the possibility of a whole new digital divide consisting of people relegated to wireless service, on top of the old digital divide involving dial-up access.

We’ll take that market if you’re not interested

In a healthy market, at least three companies would be racing to roll out new services at affordable prices, but every new product or service must provide a migration path from the old ones it hopes to replace. Nowhere is this more true than in networks because their whole purpose is to let you reach other people. Competition in telecom has been a battle cry since the first work on the law that became the 1996 Telecom Act (and which many speakers at the conference say needs an upgrade).

Most of the 20th century accustomed people to thinking of telecom as a boring, predictable utility business, the kind that “little old ladies” bought stock in. The Telecom Act was supposed to knock the Bell companies out of that model and turn them into fierce innovators with a bunch of other competitors. Some people actually want to reverse the process and essentially nationalize the telecom infrastructure, but that would put innovation at risk.

The Telecom Act, especially as interpreted later by the FCC, fumbled the chance to enforce competition. According to Goldstein, the FCC decided that a duopoly (baby Bells and cable companies) was enough competition.

The nail in the coffin may have been the FCC ruling that any new fiber providing IP service was exempt from the requirements for interconnection. The sleight of hand that the FCC used to make this switch was a redefinition of the Internet: they conflated the use of IP on the carrier layer with the bits traveling around above, which most people think of as “the Internet.” But the industry and the FCC had a bevy of arguments (including the looser regulation of cable companies, now full-fledged competitors of the incumbent telecom companies), so the ruling stands. The issue then got mixed in with a number of other controversies involving competition and control on the Internet, often muddled together under the term “network neutrality.”

Ironically, one of the selling points that helps maintain a competitive company, such as Granite Telecom, is reselling existing copper. Many small businesses find that the advantages of fiber are outweighed by the costs, which may include expensive quality-of-service upgrades (such as MPLS), new handsets to handle VoIP, and rewiring the whole office. Thus, Senior Vice President Sam Kline announced at the conference that Granite Telecom is adding a thousand new copper POTS lines every day.

This reinforces the point I made earlier about depending on consumers to drive change. The calculus that leads small businesses to stick with copper may be dangerous in the long run. Besides lost opportunities, it means sticking with a technology that is aging and decaying by the year. Most of the staff (known familiarly as Bellheads) who designed, built, and maintain the old POTS network are retiring, and the phone companies don’t want to bear the increasing costs of maintenance, so reliability is likely to decline. Kline said he would like to find a way to make fiber more attractive, but the benefits are still vaporware.

At this point, the major companies and the smaller competing ones are both cherry picking in different ways. The big guys are upgrading very selectively and even giving up on some areas, whereas the small companies look for niches, as Granite Telecom has. If universal service is to become a reality, a whole different actor must step up to the podium.

A beautiful day in the neighborhood

One hope for change is through municipal and regional government bodies, linked to local citizen groups who know where the need for service is. Freenets, which go back to 1984, drew on local volunteers to provide free Internet access to everyone with a dial-up line, and mesh networks have powered similar efforts in Catalonia and elsewhere. In the 1990s, a number of towns in the US started creating their own networks, usually because they had been left off the list of areas that telecom companies wanted to upgrade.

Despite legal initiatives by the telecom companies to squelch municipal networks, they are gradually catching on. The logistics involve quite a bit of compromise (often, a commercial vendor builds and runs the network, contracting with the city to do so), but many town managers swear that advantages in public safety and staff communications make the investment worthwhile.

The limited regulatory power that cities have over cable companies (a power that is sometimes taken away) is a crude instrument, like a potter trying to manipulate clay with tongs. To craft a beautiful work, you need to get your hands right on the material. Ideally, citizens would design their own future. The creation of networks should involve companies and local governments, but also the direct input of citizens.

National governments and international bodies still have roles to play. Burke pointed out that public safety issues, such as 911 service, can’t be fixed by the market, and developing nations have very little fiber infrastructure. So, we need large-scale projects to achieve universal access.

Several speakers also lauded state regulators as the most effective centers to handle customer complaints, but I think the IP transition will be increasingly a group effort at the local level.

Back to school

Education emerged at the conference as one of the key responsibilities that companies and governments share. The transition to digital TV was accompanied by a massive education budget, but in my home town, there are still people confused by it. And it’s a minuscule issue compared to the task of going to fiber, wireless, and IP services.

I had my own chance to join the educational effort on the evening following the conference. Friends from Western Massachusetts phoned me because they were holding a service for an elderly man who had died. They lacked the traditional 10 Jews (the minyan) required by Jewish law to say the prayer for the deceased, and asked me to Skype in. I told them that remote participation would not satisfy the law, but they seemed to feel better if I did it. So I said, “If Skype will satisfy you, why can’t I just participate by phone? It’s the same network.” See, FCC? I’m doing my part.

March 20 2013

Rethinking games

At a recent board games night hosted by Greg Brown (@practicingruby), we played a game called “Pandemic” that made me rethink the meaning of games. I won’t bother you with a detailed description; it’s enough to say that there are four or five players who take turns, and the goal is to defeat outbreaks of disease.

What makes this game unique is that you’re not playing against the other players, you’re playing against the game itself. It’s almost impossible to win, particularly at higher levels of difficulty (which Greg encourages, even for newbies). But you quickly realize that you don’t have a chance of winning if you don’t cooperate with the other players. The game is all about cooperation and collaboration. The players don’t all have equal abilities; one can move other players’ pieces around on the board, another can create research centers, another can cure larger swaths of disease. On your turn, you could just move and do whatever you think is best; but once you get the hang of it, you spend a good bit of time before each move discussing with the other players what the best strategy is, whether there are other effective ways to accomplish the same goal, and so on. You’re always discussing whether it would be better to solve a problem yourself, or move someone else so they can solve the problem more effectively on their turn.

In some ways, it’s not all that different from a role-playing game, but there is never any advantage to stabbing another player in the back or striking out on your own. But at the same time, even though it’s radically collaborative, it’s challenging. As I said, it’s almost impossible to win, and the game is structured to become more difficult the longer it goes on.

It’s a great example of rethinking gaming and rethinking competition, all in a little game that comes in a box and is played with pawns on a board.

February 08 2012

Tip for B&N: Don't just follow Amazon

This post is part of the TOC podcast series. You can also subscribe to the free TOC podcast through iTunes.


I follow dozens of publishing blogs and tweet streams, but there's one that always rises above the rest for me. Any time I see something from Joseph Esposito (@JosephJEsposito), president of Portable CEO consulting, I make sure I read it. He's a frequent contributor to the Scholarly Kitchen blog, and one of his recent articles there got me thinking about the need for better competition in the publishing industry. I sat down with Joe to discuss Amazon's dominance, what B&N should do to improve its position and much more.

Key points from the full video interview (below) include:

  • "B&N needs an 'MCI solution'" — Amazon is the clear market leader and, as #2, B&N must avoid just following Amazon's lead and come up with a completely new and different product and content model. What B&N is doing with in-store Nook merchandising is great, but they've got to go much further. [Discussed at the 1:00 mark.]
  • Can B&N do anything to disrupt Amazon Prime? — Amazon and anyone else creating a Prime-like service will start to run into the same challenges Netflix has encountered. [Discussed at 4:07.]
  • Broad content repositories vs. narrow, vertical ones — Specific genres lend themselves more to this sort of offering, and each one could have a different pricing model. Safari Books Online is a great example. [Discussed at 5:52.]
  • Pay-for-performance is the only option — Amazon has publicly stated that the Kindle Owner's Lending Library program pays most publishers a flat fee. I strongly believe that's the wrong model, and Joe talks about why the flat fee probably won't be a viable long-term option. [Discussed at 6:45.]
  • Apps vs. HTML5/EPUB — Publishers are starting to figure out that platform-specific investments often aren't wise. Development costs for a single platform, even if that's iOS, are still high, so the future leads to more open, portable solutions. [Discussed at 8:26.]
  • DRM — Joe makes an excellent point when he notes that, "the pro-DRM stance that many publishers have is not really getting them anywhere." [Discussed at 11:05.]
  • Discoverability & recommendations — Discoverability will continue to get worse before it improves, but better integration with the social graph can provide a way forward. [Discussed at 15:06.]

You can view the entire interview in the following video.

TOC NY 2012 — O'Reilly's TOC Conference, being held Feb. 13-15, 2012, in New York City, is where the publishing and tech industries converge. Practitioners and executives from both camps will share what they've learned and join together to navigate publishing's ongoing transformation.

Register to attend TOC 2012



January 30 2012

A discussion with David Farber: bandwidth, cyber security, and the obsolescence of the Internet

David Farber, a veteran of Internet technology and politics, dropped by Cambridge, Mass. today and was gracious enough to grant me some time in between his numerous meetings. On leave from Carnegie Mellon, Dave still intervenes in numerous policy discussions related to the Internet and "plays in Washington," as well as hosting the popular Interesting People mailing list. This list delves into dizzying levels of detail about technological issues, but I wanted to pump him for big ideas about where the Internet is headed, topics that don't make it to the list.

How long can the Internet last?

I'll start with the most far-reaching prediction: that Internet protocols simply aren't adequate for the changes in hardware and network use that will come up in a decade or so. Dave predicts that computers will be equipped with optical connections instead of pins for networking, and the volume of data transmitted will overwhelm routers, which at best have mixed optical/electrical switching. Sensor networks, smart electrical grids, and medical applications with genetic information could all increase network loads to terabits per second.

When routers evolve to handle terabit-per-second rates, packet-switching protocols will become obsolete. The speed of light is constant, so we'll have to rethink the fundamentals of digital networking.
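To get a feel for the scale Dave is describing, here is a back-of-envelope sketch (my own arithmetic, not a figure from the conversation) of how little time a router has for each forwarding decision at terabit rates.

    # Per-packet time budget on a single 1 Tbps port (illustrative arithmetic).
    LINE_RATE_BPS = 1e12

    def ns_per_packet(packet_bytes):
        """Nanoseconds available to handle each packet of the given size."""
        packets_per_second = LINE_RATE_BPS / (packet_bytes * 8)
        return 1e9 / packets_per_second

    for size in (1500, 512, 64):  # full Ethernet payload down to minimum frame
        print(f"{size:>4}-byte packets: {ns_per_packet(size):5.2f} ns each")

Full-size packets leave roughly 12 nanoseconds apiece and minimum-size packets well under one nanosecond, a handful of memory accesses at best, which is the sense in which per-packet processing stops scaling and circuit-like approaches regain their appeal.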

I tossed in the common nostrum that packet-switching was the fundamental idea behind the Internet and its key advance over earlier networks, but Dave disagreed. He said lots of activities on the Internet reproduce circuit-like behavior, such as sessions at the TCP or Web application level. So theoretically we could re-architect the underlying protocols to fit what the hardware and the applications have to offer.

But he says his generation of programmers who developed the Internet are too tired ("It's been a tough fifteen or twenty years") and will have to pass the baton to a new group of young software engineers who can think as boldly and originally as the inventors of the Internet. He did not endorse any of the current attempts to design a new network, though.

Slaying the bandwidth bottleneck

Like most Internet activists, Dave bewailed the poor state of networking in the U.S. In advanced nations elsewhere, 100-megabit per second networking is available for reasonable costs, whereas here it's hard to go beyond 30 megabits per second (on paper!) even at enormous prices and in major metropolitan areas. Furthermore, the current administration hasn't done much to improve the situation, even though candidate Obama made high bandwidth networking a part of his platform and FCC Chairman Julius Genachowski talks about it all the time.

Dave has been going to Washington on tech policy consultations for decades, and his impressions of the different administrations have a unique slant all their own. The Clinton administration really listened to staff who understood technology--Gore in particular was quite a technology junkie--and the administration's combination of judicious policy initiatives and benign neglect led to the explosion of the commercial Internet. The following Bush administration was famously indifferent to technology at best. The Obama administration lies somewhere in between in cluefulness, but despite their frequent plaudits for STEM and technological development, Dave senses that neither Obama nor Biden really has the drive to deal with and examine complex technical issues and insist on action where necessary.

I pointed out the U.S.'s particular geographic challenges--with a large, spread-out population making fiber expensive--and Dave countered that fiber to the home is not the best solution. In fact, he claims no company could make fiber pay unless it gained 75% of the local market. Instead, phone companies should string fiber to access points 100 meters or so from homes, and depend on old copper for the rest. This could deliver quite adequate bandwidth at a reasonable cost. Cable companies, he said, could also greatly increase Internet speeds. Wireless companies are pretty crippled by loads that they encouraged (through the sale of app-heavy phones) and then had problems handling, and are busy trying to restrict users' bandwidth. But a combination of 4G, changes in protocols, and other innovations could improve their performance.

Waiting for the big breach

I mentioned that in the previous night's State of the Union address, Obama had made a vague reference to a cybersecurity initiative (http://www.whitehouse.gov/blog/2012/01/26/legislation-address-growing-danger-cyber-threats) with a totally unpersuasive claim that it would protect us from attack. Dave retorted that nobody has a good definition of cybersecurity, but that this detail hasn't held back every agency with a shot at getting funds for it from putting forward a cybersecurity strategy. The Army, the Navy, Homeland Security, and others are all looking for new missions now that old ones are winding down, and cybersecurity fills the bill.

The key problem with cybersecurity is that it can't be imposed top-down, at least not on the Internet, which, in a common observation reiterated by Dave, was not designed with security in mind. If people use weak passwords (and given current password cracking speeds, just about any password is weak) and fall victim to phishing attacks, there's little we can do with diktats from the center. I made this point in an article twelve years ago. Dave also pointed out that viruses stay ahead of pattern-matching virus detection software.
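As a rough illustration of the password point, here is what an offline attack against a fast hash looks like; the guess rate is my own assumption, not a figure from Dave.

    # Assumed offline cracking rate against a fast, unsalted hash (illustrative).
    GUESSES_PER_SECOND = 1e10

    def worst_case_days(alphabet_size, length):
        """Days needed to exhaust the full keyspace at the assumed guess rate."""
        return alphabet_size ** length / GUESSES_PER_SECOND / 86400

    for alphabet_size, label in ((26, "lowercase letters"), (95, "printable ASCII")):
        for length in (8, 12):
            days = worst_case_days(alphabet_size, length)
            print(f"{label:>17}, {length} chars: {days:14.4f} days worst case")

An eight-character lowercase password falls in well under a minute even in the worst case, which is why no diktat from the center can rescue users who pick weak secrets.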

Security will therefore need to be rethought drastically, as part of the new network that will replace the Internet. In the meantime, catastrophe could strike--and whoever is in the Administration at the time will have to face public wrath.

Odds without ends

We briefly discussed FCC regulation, where Farber tends to lean toward asking the government to forbear. He acknowledged the merits of arguments made by many Internet supporters, that the FCC tremendously weakened the chances for competition in 2002 when it classified cable Internet as a Title I service. This shielded the cable companies from regulations under a classification designed back in early Internet days to protect the mom-and-pop ISPs. And I pointed out that the cable companies have brazenly sued the FCC to win court rulings saying the companies can control traffic any way they choose. But Farber says there are still ways to bring in the FCC and other agencies, notably the Federal Trade Commission, to enforce anti-trust laws, and that these agencies have been willing to act to shut down noxious behavior.

Dave and I shared other concerns about the general deterioration of modern infrastructure, affecting water, electricity, traffic, public transportation, and more. An amateur pilot, Dave knows some things about the air traffic systems that make one reluctant to fly. But there are few simple fixes. Commercial air flights are safe partly because pilots possess great sense and can land a plane even in the presence of confusing and conflicting information. On the other hand, Dave pointed out that mathematicians lack models to describe the complexity of such systems as our electrical grid. There are lots of areas for progress in data science.

May 13 2011

Winners of the writable API competition

Last month we ran a developer competition around the newly released Fluidinfo writable API for O'Reilly books and authors. The three judges — Tim O'Reilly, O'Reilly editor Mike Loukides, and O'Reilly GM Joe Wikert — have today announced the winners.

First prize: Book Chirpa


Mark McSpadden gets first prize for Book Chirpa. Mark wins an OSCON package that includes a full conference pass, coach airfare from within the US, and 4 nights hotel accommodation. Book Chirpa was "built to explore what people on Twitter are saying about O'Reilly books." It can show you the stream of O'Reilly book mentions, trending books, or a virtual library of all O'Reilly books mentioned on Twitter.

Second prize: Skillshelves


Jonas Neubert gets second prize for Skillshelves. Jonas wins his choice of either an iPad 2 or a Xoom tablet. Skillshelves lets you "Show the world what tech topics you are an expert in — simply by making a list of O'Reilly books in your bookshelf."

Third prize: FluidCV


Eric Seidel gets third prize for FluidCV. Eric wins his choice of $500 worth of O'Reilly ebooks and/or videos. FluidCV pulls together information for your CV from tags in Fluidinfo, allowing the dynamic construction of a CV just by tagging relevant objects in Fluidinfo. Tag an O'Reilly book in Fluidinfo and the book cover and associated skill automatically appears in your CV. Eric's own FluidCV can be seen here.

Congratulations to the winners and many thanks to all who entered.

[Disclosure: Tim O'Reilly is an investor in Fluidinfo.]





August 11 2010

What I get and don't get about the Google/Verizon proposal

Nobody knew for a long time what Google and Verizon were cooking up on
the network neutrality front, and after the release of their brief,
two-page roadmap (posted on Scribd as a PDF at
http://www.scribd.com/doc/35599242/Verizon-Google-Legislative-Framework-Proposal,
among other places), nobody still knows. All the
usual Internet observers have had their say, and in general the
assessment is negative.

My first reaction was to ignore the whole thing, mainly because the
language of the agreement didn't match any Internet activity I could
recognize. Some of the false notes struck:

  • The Consumer Protections section keeps using the term "lawful" as if
    there was a regulatory regime on the Internet. Not even the people
    regularly accused of trying to extend government control over the
    Internet (ICANN, WSIS, and the ITU) believe they can define what's
    lawful and make people stick to it.

    If I can send and receive only lawful content, who do Google and
    Verizon think can stop me from exchanging child pornography or
    instructions to blow up buildings? What, in turn, distinguishes lawful
    applications and services from unlawful ones (outside of Saudi Arabia
    and the United Arab Emirates)?

    Deduction: This passage represents no meaningful or enforceable rules,
    but is thrown in to make regulators feel there's a policy where in
    fact there is none.

  • The Transparency section strews around nice, general statements no one
    could complain about--don't we all want our services to tell us what
    they're doing?--but the admonitions are too general to interpret or
    apply.

    For instance, Apple is adamant about its right to determine what apps
    are available to iPhone and iPad buyers. Is that transparency?
    Apparently not, because every Apple developer gnaws his fingernails
    waiting to hear whether and when his app will be accepted into the App
    Store. But I don't see language in the Google/Verizon transparency
    section that covers the App Store at all. They might well say it's not a
    networking issue.

    Fine, let's turn to networking. The carriers maintain that they need
    flexibility and a certain degree of secrecy to combat abuses such as
    spam; see for instance my blog post Consider the economics in network
    neutrality
    (http://www.oreillynet.com/onlamp/blog/2008/04/consider_the_economics_in_netw.html).
    Squaring this complex issue--which the Google/Verizon proposal covers
    in the next item on
    Network Management--with transparency is a dilemma.

    Deduction: we can all say we're transparent and feel good, but life is
    too complex for authorities to be totally transparent about what
    they're transparent about.

  • The worst passage in my view is the one in the Regulatory Authority
    section assigning authority to the FCC for "broadband." That
    ill-defined term, used far too much in Washington, tends to appear in
    the context of universal service. One can regulate broadband by such
    things as providing incentives to build more networks, but the
    Regulatory Authority section sidesteps the more basic questions of who
    gets to regulate the building, interconnecting, and routing through
    networks.

    Deduction: Google and Verizon put this in to encourage the government
    to continue pouring money into the current telcos and cable companies
    so they can build more high-speed networks, but its effect on
    regulation is nil.

Not too inspiring on first impression, but because so many other
people raised such a brouhaha over the Google/Verizon announcement, I
decided to think about it a bit more. And I actually ended up feeling
good about one aspect. The proposal is really a big concession to the
network neutrality advocates. I had been feeling sour about proposals
for network neutrality because, as nice as they sound in the abstract,
the devil is in the details. Network management for spam and other
attacks provides one example.

But the Google/Verizon announcement explicitly denounces
discrimination and mandates adherence to Internet standards. (Of
course, some Internet standards govern discrimination.) It seems to me
that, after this announcement, no network provider can weep and wring
its hands and claim that it would be unable to do business on a
non-discriminatory basis. And network neutrality advocates can cite
this document for support.

But as others have pointed out, the concession granted in the
"Non-Discrimination Requirement" section is ripped away by the
"Additional Online Services" section to "traffic prioritization." This
makes it clear that the "services" offered in that section reach deep
into the network infrastructure where they can conflict directly with
public Internet service. Unless someone acknowledges the contradiction
between the two sections and resolves it in a logical manner, this
document becomes effectively unusable.

What about the other pesky little exemption in the proposal--wireless
networks? Certainly, a lot of computing is moving to mobile devices.
But wireless networks really are special. Not only are they hampered
by real limits on traffic--the networks being shared and having
limited spectrum--but users have limited tolerance for unwanted
content and for fidgeting around with their devices. They don't want
to perform sophisticated control over transmission over content; they
need someone to do it for them.

Anyway, fiber is always going to provide higher bandwidth than
wireless spectrum. So I don't believe wireless will become
dominant. It will be an extremely valuable companion to us as we walk
through the day, saving data about ourselves and getting information
about our environment, but plenty of serious work will go on over the
open Internet.

So in short, I disdain the Google/Verizon agreement from an editor's
point of view but don't mind it as a user. In general, I have nothing
against parties in a dispute (here, the telephone companies who want
to shape traffic and the Internet sites who don't want them to)
conducting talks to break down rigid policy positions and arrive at
compromises. The Google/Verizon talks are fraught with implications,
of course, because Verizon is a wireless provider that
distributes lots of phones with Google's software and services. So I
take the announcement as just one stake in the ground along a large
frontier. I don't see the proposal being adopted in any regulatory
context--it's too vague and limited--but it's interesting for what it
says about Google and Verizon.

April 06 2010

DC Circuit court rules in Comcast case, leaves the FCC a job to do

Today's ruling in Comcast v. FCC will certainly change the
terms of debate over network neutrality, but the win for Comcast is
not as far-reaching as headlines make it appear. The DC Circuit court
didn't say, "You folks at the Federal Communications Commission have
no right to tell any Internet provider what to do without
Congressional approval." It said, rather, "You folks at the FCC didn't
make good arguments to prove that your rights extend to stopping
Comcast's particular behavior."

I am not a lawyer, but to say what happens next will take less of a
lawyer than a fortune-teller. I wouldn't presume to say whether the
FCC can fight Comcast again over the BitTorrent issue. But the court
left it open for the FCC to try other actions to enforce rules on
Internet operators. Ultimately, I think the FCC should take a hint
from the court and stop trying to regulate the actions of telephone
and cable companies at the IP layer. The hint is to regulate them at
the level where the FCC has more authority--on the physical level,
where telephone companies are regulated as common carriers and cable
companies have requirements to the public as well.




The court noted (on pages 30 through 34 of its order, available at
http://pacer.cadc.uscourts.gov/common/opinions/201004/08-1291-1238302.pdf)
that the FCC missed out on the chance to make certain
arguments that the court might have looked on more favorably.
Speaking personally, and as an amateur, I think those arguments would be weak
anyway. For instance, the FCC has the right to regulate activities
that affect rates. VoIP can affect phone rates and video downloads
over the Internet can affect cable charges for movies. So the FCC
could try to find an excuse to regulate the Internet. But I wouldn't
be the one to make that excuse.

The really significant message to the FCC comes on pages 30 and 32.
The court claims that any previous court rulings that give power to
the FCC to regulate the Internet (notably the famous Brand X decision)
are based on its historical right to regulate common carriers (e.g.,
telephone companies) and broadcasters. Practically speaking, this
gives the FCC a mandate to keep regulating the things that
matter--with an eye to creating a space for a better Internet and
high-speed digital networking (broadband).

Finding the right layer

Comcast v. FCC combines all the elements of a regulatory
thriller. First, the stakes are high: we're talking about who
controls the information that comes into our homes. Second, Comcast
wasn't being subtle in handling BitTorrent; its manipulations were
done with a conscious bias, carried out apparently arbitrarily (rather
than being based on corporate policy, it seems that a network
administrator made and implemented a personal decision), and were kept
secret until customers uncovered the behavior. If you had asked for a
case where an Internet provider said, "We can do anything the hell we
want regardless of any political, social, technical, moral, or
financial consequences," you'd choose something like Comcast's
impedance of BitTorrent.

And the court did not endorse that point of view. Contrary to many
headlines, the court affirmed that the FCC has the right to
regulate the Internet. Furthermore, the court acknowledged that
Congress gave the FCC the right to promote networking. But the FCC
must also observe limits.

The court went (cursorily in some cases) over the FCC's options for
regulating Comcast's behavior, and determined either that there was no
precedent for it or (I'm glossing over lots of technicalities here)
that the FCC had not properly entered those options into the case.

The FCC should still take steps to promote the spread of high-speed
networking, and to ensure that it is affordable by growing numbers of
people. But it must do so by regulating the lines, not what travels
over those lines.

As advocates for greater competition have been pointing out for
several years, the FCC fell down on that public obligation. Many trace
the lapse to the chairmanship of Bush appointee Michael Powell. And
it's true that he chose to try to get the big telephone and cable
companies to compete with each other (a duopoly situation) instead of
opening more of a space for small Internet providers. I cover this
choice in a 2004 article. But it's not fair to say Powell had no interest in
competition, nor is it historically accurate to say this was a major
change in direction for the FCC.

From the beginning, when the 1996 telecom act told the FCC to promote
competition, implementation was flawed. The FCC chose 14 points in the
telephone network where companies had to allow interconnection (so
competitors could come on the network). But it missed at least one
crucial point. The independent Internet providers were already losing
the battle before Powell took over the reins at the FCC.

And the notion of letting two or three big companies duke it out
(mistrusting start-ups to make a difference) is embedded in the 1996
act itself.

Is it too late to make a change? We must hope not. Today's court
ruling should be a wake-up call; it's time to get back to regulating
things that the FCC actually can influence.

Comcast's traffic shaping did not change the networking industry. Nor
did it affect the availability of high-speed networks. It was a clumsy
reaction by a beleaguered company to a phenomenon it didn't really
understand. Historically, it will prove an oddity, and so will the
spat that network advocates started, catching the FCC in its snares.

The difficulty software layers add

The term "Internet" is used far too loosely. If you apply it to all
seven layers of the ISO networking model, it covers common carrier
lines regulated by the FCC (as well as cable lines, which are subject
to less regulation--but still some). But the FCC has historically
called the Internet a "service" that is separate from those lines.

Software blurs and perhaps even erases such neat distinctions. Comcast
does not have to rewire its network or shut down switches to control
it. All it has to do is configure a firewall. That's why stunts
like holding back BitTorrent traffic become networking issues and draw
interest from the FCC. But it also cautions against trying to regulate
what Comcast does, because it's hard to know when to stop. That's what
opponents of network neutrality say, and you can hear it in the court
ruling.

The fuzzy boundaries between software regulation and real-world
activities bedevil other areas of policy as well. As
sophisticated real-world processing moves from mechanical devices into
software, it encourages inventors to patent software innovations, a
dilemma I explore in another article
(http://radar.oreilly.com/archives/2007/09/three_vantage_p.html). And in
the 1990s, courts argued over whether encryption
was a process or a form of expression--and decided it was a form of
expression.

Should the FCC wait for Congress to tell it what to do? I don't think
so. The DC Circuit court blocked one path, but it didn't tell the FCC
to turn back. It has a job to do, and it just has to find the right
tool for the job.

February 05 2010

One hundred eighty degrees of freedom: signs of how open platforms are spreading

I was talking recently with Bob Frankston, who has a distinguished
history in computing (http://frankston.com/public/Bob_Frankston_Bio.asp)
that goes back to work on Multics, VisiCalc,
and Lotus Notes. We were discussing some of the dreams of the Internet
visionaries, such as total decentralization (no mobile-system walls,
no DNS) and bandwidth too cheap to meter. While these seem impossibly
far off, I realized that computing and networking have come a long way
already, making normal things that not long ago would have
seemed utopian.



Flat-rate long distance calls

I remember waiting past my bedtime to make long-distance calls, and
getting down to business real quick to avoid high charges.
Conventional carriers were forced into flat-rate pricing by competition
from VoIP (which I'll return to later in the blog). International
calls are still overpriced, but with penny-per-minute cards available
in any convenience store, I don't imagine any consumers are paying
those high prices.



Mobile phone app stores

Not that long ago, the few phones that offered Internet access did so
as a novelty. Hardly anybody seriously considered downloading an
application to their phones--what are you asking for, spam and
fraudulent charges? So the iPhone and Android stores teeming with
third-party apps are a 180-degree turn for the mobile field. I
attribute the iPhone app store once again to competition: the uncovering
of the iPhone SDK by a free software community
(http://www.oreillynet.com/onlamp/blog/2008/01/the_unflappable_free_software.html).



Downloadable TV segments

While the studios strike deals with Internet providers, send out
take-down notices by the ream, and calculate how to derive revenue
from television-on-demand, people are already getting the most popular
segments from Oprah Winfrey or Saturday Night Live whenever they want,
wherever they want.



Good-enough generic devices

People no longer look down on cheap, generic tools and devices. Both
in software and in hardware, people are realizing that in the long run
they can do more with simple, flexible, interchangeable parts than
with complex and closed offerings. There will probably always be a
market for exquisitely designed premium products--the success of Apple
proves that--but the leading edge goes to products that are just "good
enough," and the DIY movement especially ensures a growing market for
building blocks of that quality.


I won't even start to summarize Frankston's own writings
(http://frankston.com/public/), which start with
premises so far from what the Internet is like today that you won't be
able to make complete sense of any one article on its own. I'd
recommend the mind-blowing Sidewalks: Paying by the Stroll
(http://frankston.com/public/?n=Sidewalks) if you want to venture into his world.

But I'll mention one sign of Frankston's optimism: he reminded me that
in the early 1990s, technologists were agonizing over arcane
quality-of-service systems in the hope of permitting VoIP over
ordinary phone connections. Now we take VoIP for granted and are
heading toward ubiquitous video. Why? Two things happened in parallel:
the technologists figured out much more efficient encodings, and
normal demand led to faster transmission technologies even over
copper. We didn't need QoS and all the noxious control and overhead it
entails. More generally, it's impossible to determine where progress
will come from or how fast it can happen.

January 14 2010

Innovation Battles Investment as FCC Road Show Returns to Cambridge

Opponents can shed their rhetoric and reveal new depths to their
thought when you bring them together for rapid-fire exchanges,
sometimes with their faces literally inches away from each other. That
made it worth my while to truck down to the MIT Media Lab for
yesterday's Workshop on Innovation, Investment and the Open Internet
(http://hraunfoss.fcc.gov/edocs_public/attachmatch/DOC-295521A1.pdf),
sponsored by the Federal Communications Commission. In this article
I'll cover the context and background of the hearings, the two kinds of
investment at stake, the gap between the engineers and the business
people, the question of whether network competition is over, and what
the FCC should regulate.

Context and background

The FCC kicked off its country-wide hearing campaign almost two years
ago with a meeting at Harvard Law School, which quickly went wild. I
covered the experience in one article
(http://radar.oreilly.com/archives/2008/02/network-neutrality-how-the-fcc.html)
and the unstated agendas in another
(http://radar.oreilly.com/archives/2008/02/network-neutrality-code-words.html).
With a star cast and an introduction
by the head of the House's Subcommittee on Telecommunications and the
Internet, Ed Markey, the meeting took on such a cachet that the
public flocked to the lecture hall, only to find it filled because
Comcast recruited people off the street to pack the seats and keep
network neutrality proponents from attending. (They had an overflow
room instead.)

I therefore took pains to arrive at the Media Lab's Bartos Theater
early yesterday, but found it unnecessary. Even though Tim Berners-Lee
spoke, along with well-known experts across the industry, only 175
people turned up, in my estimation (I'm not an expert at counting
crowds). I also noticed that the meeting wasn't worth a
mention today in the Boston Globe.

Perhaps it was the calamitous earthquake yesterday in Haiti, or the
bad economy, or the failure of the Copenhagen summit to solve the
worst crisis ever facing humanity, or concern over three wars the US
is involved in (if you count Yemen), or just fatigue, but it seems
that not as many people are concerned with network neutrality as two
years ago. I recognized several people in the audience yesterday and
surmised that the FCC could have picked out a dozen people at random
from their seats, instead of the parade of national experts on the
panel, and still have led a pretty darned good discussion.

And network neutrality is definitely the greased pig everyone is
sliding around. There are hundreds of things one could discuss
in the context of innovation and investment, but various political
forces ranging from large companies (AT&T versus Google) to highly
visible political campaigners (Huffington Post) have made network
neutrality the agenda. The FCC gave several of the movement's leaders
rein to speak, but perhaps signaled its direction by sending Meredith
Attwell Baker as the commissioner in attendance.

In contrast to FCC chair Julius Genachowski, who publicly calls for
network neutrality (a position also taken by Barack Obama during his
presidential campaign:
http://www.barackobama.com/issues/technology/index_campaign.php#open-internet),
Baker has traditionally
espoused a free-market stance. She opened the talks yesterday by
announcing that she is "unconvinced there is a problem" and posing the
question: "Is it broken?" I'll provide my own opinion later in this
article.

Two kinds of investment

Investment is the handmaiden, if not the inseminator, of innovation.
Despite a few spectacular successes, like the invention of Linux and
Apache, most new ideas require funding. Even Linux and Apache are
now represented by foundations backed by huge companies.

So why did I title this article "Innovation Battles Investment"?
Because investment happens at every level of the Internet, from the
cables and cell towers up to the applications you load on your cell
phone.

Here I'll pause to highlight an incredible paradigm shift that was
visible at this meeting--a shift so conclusive that no one mentioned
it. Are you old enough to remember the tussle between "voice" and
"data" on telephone lines? Remember the predictions that data would
grow in importance at the expense of voice (meaning Plain Old
Telephone Service) and the milestones celebrated in the trade press when
data pulled ahead of voice?

Well, at the hearing yesterday, the term "Internet" was used to cover
the whole communications infrastructure, including wires and cell
phone service. This is a mental breakthrough all its own, and one
I'll call the Triumph of the Singularity.

But different levels of infrastructure benefit from different
incentives. I found that all the participants danced around this.
Innovation and investment at the infrastructure level got short shrift
from the network neutrality advocates, whether in the bedtime story
version delivered by Barbara van Schewick or the deliberately
intimidating, breakneck overview by economist Shane Greenstein, who
defined openness as "transparency and consistency to facilitate
communication between different partners in an independent value
chain."

You can explore his papers
(http://www.kellogg.northwestern.edu/faculty/greenstein/images/articles.html)
on your own, but I took this to mean, more or less, that
everybody sharing a platform should broadcast their intentions and
apprise everybody else of their plans, so that others can make the
most rational decisions and invest wisely. Greenstein realized, of
course, that firms have little incentive to share their strategies. He said that
communication was "costly," which I take as a reference not to an expenditure of
money but to a surrender of control and a relinquishing of opportunities.

This is just what the cable and phone companies are not going to do.
Dot-com innovator Jeffrey Glueck, founder of Skyfire
(http://www.skyfire.com/), would like the FCC to
require ISPs to give application providers and users at least 60 to 90
days' notice before making any changes to how they treat traffic. This
is absurd in an environment where bad actors require responses within
a few seconds and the victory goes to the router administrators with
the most creative coping strategy. Sometimes network users just have
to trust their administrators to do the best thing for them. Network
neutrality becomes a political and ethical issue when administrators
don't. But I'll return to this point later.

The pocket protector crowd versus the bean counters

If the network neutrality advocates could be accused of trying to
emasculate the providers, advocates for network provider prerogative
were guilty of taking the "Trust us" doctrine too far. For me, the
best part of yesterday's panel was how it revealed the deep gap that
still exists between those with an engineering point of view and those
with a traditional business point of view.

The engineers, led by Internet designer David Clark, repeated the
mantra of user control of quality of service, the vehicle for this
being the QoS field added to the IP packet header. Van Schewick
postulated a situation where a user increases the QoS on one session
because they're interviewing for a job over the Internet, then reduces
the QoS to chat with a friend.
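
For readers who wonder where that QoS field lives in practice, here is a
minimal sketch (Python on a Linux host, assuming the field in question is the
DSCP bits of the IP header) of an application requesting different treatment
for two sessions, along the lines of van Schewick's interview-versus-chat
scenario; whether the network honors the marking is, of course, exactly what
the policy fight is about.

    # A sketch of per-session QoS marking from an application's point of view.
    # The DSCP values and the interview/chat mapping are my own illustration.
    import socket

    DSCP_EF = 46           # "expedited forwarding" -- e.g., the job interview
    DSCP_BEST_EFFORT = 0   # ordinary traffic -- e.g., chatting with a friend

    def open_marked_socket(dscp):
        """Create a TCP socket whose outgoing packets carry the given DSCP mark."""
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        # IP_TOS covers the whole former TOS byte; DSCP occupies its top six bits.
        s.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, dscp << 2)
        return s

    interview = open_marked_socket(DSCP_EF)
    chat = open_marked_socket(DSCP_BEST_EFFORT)
    print(interview.getsockopt(socket.IPPROTO_IP, socket.IP_TOS))  # 184
    print(chat.getsockopt(socket.IPPROTO_IP, socket.IP_TOS))       # 0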

In the rosy world envisioned by the engineers, we would deal not with
the physical reality of a shared network with our neighbors, all
converging into a backhaul running from our ISP to its peers, but with
the logical mechanism of a limited, dedicated bandwidth pipe (former
senator Ted Stevens can enjoy his revenge) that we would spend our
time tweaking. One moment we're increasing the allocation for file
transfer so we can upload a spreadsheet to our work site; the next
moment we're privileging the port we use for a massively multiplayer game.

The practicality of such a network service is open to question. Glueck
pointed out that users are unlikely ever to ask for lower quality of
service (although this is precisely the model that Internet experts
have converged on, as I report in my 2002 article A Nice Way to Get
Network Quality of Service? at
http://www.oreillynet.com/pub/a/network/2002/06/11/platform.html). He recommends
simple tiers of service--already in effect at many providers--so that
someone who wants to carry out a lot of P2P file transfers or
high-definition video conferencing can just pay for it.

In contrast, network providers want all the control. Much was made
during the panel of a remark by Marcus Weldon of Alcatel-Lucent in
support of letting the providers shape traffic. He pointed out that
video teleconferencing over the fantastically popular Skype delivered
unappealing results over today's best-effort Internet delivery, and
suggested a scenario where the provider gives the user a dialog box
where the user could increase the QoS for Skype in order to enjoy the
video experience.

Others on the panel legitimately flagged this comment as a classic
illustration of the problem with providers' traffic shaping: the
provider would negotiate with a few popular services such as Skype
(which boasts tens of millions of users online whenever you log in)
and leave innovative young services to fend for themselves in a
best-effort environment.

But the providers can't see doing quality of service any other way.
Their business model has always been predicated on designing services
around known costs, risks, and opportunities. Before they roll out a
service, they need to justify its long-term prospects and reserve
control over it for further tweaking. If the pocket protector crowd in
Internet standards could present their vision to the providers in a
way that showed them the benefits they'd accrue from openness
(presumably by creating a bigger pie), we might have progress. But the
providers fear, above all else, being reduced to a commodity. I'll pick up
this theme in the next section.

Is network competition over?

Law professor Christopher S. Yoo is probably the most frequently heard
of the academics who favor network provider prerogatives (though not
at this panel, unfortunately, where he was given only a few minutes).
He suggested that competition has changed since the Internet we knew in
the 1990s, and that it therefore requires a different approach to
providers' funding models. Emerging markets (where growth comes mostly from signing up
new customers) differ from saturated markets (where growth comes
mainly from wooing away your competitors' customers). With 70% of
households using cable or fiber broadband offerings, he suggested the
U.S. market was getting saturated, or mature.

Well, only if you accept that current providers' policies will stifle
growth. What looks like saturation to an academic in the U.S. telecom
field looks like a state of primitive underinvestment to people who
enjoy lightning-speed service in other developed nations.

But Yoo's assertion makes us pause for a moment to consider the
implications of a mature network. When change becomes predictable and
slow, and an infrastructure is a public good--as I think everyone
would agree the Internet is--it becomes a candidate for government
takeover. Indeed, there have been calls for various forms of
government control of our network infrastructure. In some places this
is actually happening, as cities and towns create their own networks.
A related proposal is to rigidly separate the physical infrastructure
from the services, barring companies that provide the physical
infrastructure from offering services (and therefore presumably
relegating them to a maintenance role--a company in that position
wouldn't have much incentive to take on literally ground-breaking new
projects).

Such government interventions are politically inconceivable in the
United States. Furthermore, experience in other developed nations with
more successful networks shows that they are unnecessary.

No one can doubt that we need a massive investment in new
infrastructure if we want to use the Internet as flexibly and
powerfully as our trading partners. But there was disagreement yesterday about
how much of an effort the investment will take, and where it will come
from.

Yoo argued that a mature market requires investment to come from
operating expenditures (i.e., charging users more money, which
presumably is justified by discriminating against some traffic in
order to offer enhanced services at a premium) instead of capital
expenditures. But Clark believes that current operating expenditures
would permit adequate growth. He anticipated a rise in Internet access
charges of $20 a month, which could fund the added bandwidth we need
to reach the Internet speeds of advanced countries. In exchange for
paying that extra $20 per month, we would enjoy all the content we
want without paying cable TV fees.

The current understanding by providers is that usage is rising
"exponentially" (whatever that means--they don't say what the exponent
is) whereas charges are rising slowly. Following some charts from
Alcatel-Lucent's Weldon that showed profits disappearing entirely in a
couple years--a victim of the squeeze between rising usage and slow
income growth--Van Schewick challenged him, arguing that providers can
enjoy lower bandwidth costs to the tune of 30% per year. But Weldon
pointed out that the only costs going down are equipment, and claimed
that after a large initial drop caused by any disruptive new
technology, costs of equipment decrease only 10% per year.
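
Since the gap between those two figures is the crux of the disagreement, a
quick back-of-the-envelope calculation shows how far apart they end up after a
few years of compounding; the five-year horizon is my own choice for
illustration, and the percentages are simply the ones cited above.

    # Back-of-the-envelope: fraction of today's equipment cost remaining after
    # compounding an annual decline (30% per van Schewick, 10% per Weldon).
    def remaining_cost(annual_decline, years=5):
        return (1 - annual_decline) ** years

    for rate in (0.30, 0.10):
        print("%.0f%% yearly decline -> %.0f%% of today's cost in 5 years"
              % (rate * 100, remaining_cost(rate) * 100))
    # 30% yearly decline -> 17% of today's cost in 5 years
    # 10% yearly decline -> 59% of today's cost in 5 years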

Everyone agreed that mobile, the most exciting and
innovation-supporting market, is expensive to provide and suffering an
investment crisis. It is also the least open part of the Internet and
the part most dependent on legacy pricing (high voice and SMS
charges), deviating from the Triumph of the Singularity.

So the Internet is like health care in the U.S.: in worse shape than
it appears. We have to do something to address rising
usage--investment in new infrastructure as well as new
applications--just as we have to lower health care costs that have
surpassed 17% of the gross domestic product.

Weldon's vision--a rosy one in its own way, complementing the
user-friendly pipe I presented earlier from the engineers--is that
providers remain free to control the speeds of different Internet
streams and strike deals with anyone they want. He presented provider
prerogatives as simple extensions of what already happens now, where
large companies create private networks where they can impose QoS on
their users, and major web sites contract with content delivery
networks such as Akamai (represented at yesterday's panel by lawyer
Aaron Abola) to host their content for faster response time. Susie Kim
Riley of Camiant testified that European providers are offering
differentiated services already, and making money by doing so.

What Weldon and Riley left out is what I documented in A Nice Way to
Get Network Quality of Service?
(http://www.oreillynet.com/pub/a/network/2002/06/11/platform.html): Managed networks
providing QoS are not the Internet. Attempts to provide QoS over the
Internet--by getting different providers to cooperate in privileging
certain traffic--have floundered. The technical problems may be
surmountable, but no one has figured out how to build trust and to design
adequate payment models that would motivate providers to cooperate.

It's possible, as Weldon asserts, that providers allowed to manage
their networks would invest in infrastructure that would ultimately
improve the experience for all sites--those delivered over the
Internet by best-effort methods as well as those striking deals. But
the change would still represent increased privatization of the public
Internet. It would create what application developers such as Glueck
and Nabeel Hyatt of Conduit Labs fear most: a thousand different
networks with different rules that have to be negotiated with
individually. And new risks and costs would be placed in the way of
the disruptive innovators we've enjoyed on the Internet.

Competition, not network neutrality, is actually the key issue facing
the FCC, and it was central to their Internet discussions in the years
following the 1996 Telecom Act. For the first five years or so, the
FCC took seriously a commitment to support new entrants by such
strategies as requiring incumbent companies to allow interconnection.
Then, especially under Michael Powell, the FCC did an about-face.

The question posed during this period was: what leads to greater
investment and growth--letting a few big incumbents enter each other's
markets, or promoting a horde of new, small entrants? It's pretty
clear that in the short term, the former is more effective because the
incumbents have resources to throw at the problem, but that in the
long term, the latter is required in order to find new solutions and
fix problems by working around them in creative ways.

Yet the FCC took the former route, starting in the early 2000s. They
explicitly made a deal with incumbents: build more infrastructure, and
we'll relax competition rules so you don't have to share it with other
companies.

Starting a telecom firm is hard, so it's not clear that pursuing the
other route would have saved us from the impasse we're in today. But a
lack of competition is integral to our problems--including the one
being fought out in the field of "network neutrality."

All the network neutrality advocates I've talked to wish that we had
more competition at the infrastructure level, because then we could
rely on competition to discipline providers instead of trying to
regulate such discipline. I covered this dilemma in a 2006 article,
Network Neutrality and an Internet with Vision
(http://lxer.com/module/newswire/view/53907/). But somehow, this kind of competition
is now off the FCC agenda. Even in the mobile space, the FCC offers
spectrum through auctions that permit the huge incumbents to gather up
the best bands. These incumbents then sit on spectrum without doing
anything, a strategy known as "foreclosure" (because it forecloses
competitors from doing something useful with it).

Because everybody goes off in their own direction, the situation pits
against each other two groups that should be cooperating: small ISPs
and proponents of an open Internet.

What to regulate

Amy Tykeson, CEO of a small Oregon Internet provider named
BendBroadband, forcefully presented the view of an independent
provider, similar to the more familiar imprecations by Brett Glass of
Lariat (http://www.brettglass.com/). In their
world--characterized by paper-thin margins, precarious deals with
back-end providers, and the constant pressure to provide superb
customer service--flexible traffic management is critical and network
neutrality is viewed as a straitjacket.

I agree that many advocates of network neutrality have oversimplified
the workings of the Internet and downplayed the day-to-day
requirements of administrators. In contrast, as I have shown, large
network providers have overstepped their boundaries. But to end this
article on a positive note (you see, I'm trying) I'll report that the
lively exchange did produce some common ground and a glimmer of hope
for resolving the differing positions.

First, in an exchange between Berners-Lee and van Schewick on the
pro-regulatory side and Riley on the anti-regulatory side, a more
nuanced view of non-discrimination and quality of service emerged.
Everybody on the panel offered vociferous support for the position that it was
unfair discrimination for a network provider to prevent a user from
getting legal content or to promote one web site over a competing web
site. And this is a major achievement, because those are precisely
the practices that providers like AT&T and Verizon claim the
right to engage in--the practices that spawned the current network neutrality
controversy.

To complement this consensus, the network neutrality folks approved
the concept of quality of service, so long as it was used to improve
the user experience instead of to let network providers pick winners.
In a context where some network neutrality advocates have made QoS a
dirty word, I see progress.

This raises the question of what counts as regulation. The traffic shaping
policies and business deals proposed by AT&T and Verizon are a
form of regulation. They claim the same privilege that large
corporations--we could look at health care again--have repeatedly
tried to claim when they invoke the "free market": the right of
corporations to impose their own regulations.

Berners-Lee and others would like the government to step in and issue
regulations that suppress the corporate regulations. A wide range of
wording has been proposed for the FCC's consideration. Commissioner
Baker asked whether, given the international reach of the Internet,
the FCC should regulate at all. Van Schewick quite properly responded
that the abuses carried out by providers are at the local level and
therefore can be controlled by the government.

Two traits of a market are key to innovation, and came up over and
over yesterday among dot-com founders and funders (represented by Ajay
Agarwal of Bain Capital) alike: a level playing field, and
light-handed regulation.

Sometimes, as Berners-Lee pointed out, government regulation is
required to level the playing field. The transparency and consistency
cited by Greenstein and others are key features of the level playing
field. And as I pointed out, a vacuum in government regulation is
often filled by even more onerous regulation by large corporations.

One of the most intriguing suggestions of the day came from Clark, who
elliptically suggested that the FCC provide "facilitation, not
regulation." I take this to mean the kind of process that Comcast and
BitTorrent went through, of which Sally Shipman Wentworth of ISOC
boasted about in her opening remarks. Working with the IETF (which she
said created two new working groups to deal with the problem), Comcast
and BitTorrent worked out a protocol that should reduce the load of
P2P file sharing on networks and end up being a win-win for everybody.

But there are several ways to interpret this history. To free market
ideologues, the Comcast/BitTorrent collaboration shows that private
actors on the Internet can exploit its infinite extendibility to find
their own solutions without government meddling. Free market
proponents also call on anti-competition laws to hold back abuses. But
those calling for parental controls would claim that Comcast wanted
nothing to do with BitTorrent and started to work on technical
solutions only after getting tired of the feces being thrown its way
by outsiders, including the FCC.

And in any case--as panelists pointed out--the IETF has no enforcement
power. The presence of a superior protocol doesn't guarantee that
developers and users will adopt it, or that network providers will
allow traffic that could be a threat to their business models.

The FCC at Harvard, which I mentioned at the beginning of this
article, promised intervention in the market to preserve Internet
freedom. What we got after that (as I predicted) was a slap on
Comcast's wrist and no clear sense of direction. The continued
involvement of the FCC--including these public forums, which I find
educational--shows, along with the appointment of the more
interventionist Genachowski and the mandate to promote broadband in
the American Recovery and Reinvestment Act, that the commission can't step away
from the questions of competition and investment.

January 06 2010

Four short links: 6 January 2010

  1. How Visa, Using Card Fees, Dominates a Market -- (NY Times) two interesting lessons here. First, that incentives to create a good system are easily broken when three parties are involved (here Visa sets the fees that merchants pay banks, so it's in Visa's interest to raise those fees as high as possible to encourage more banks to offer Visa cards). Second, that value-based charging ("regardless of our costs, we'll charge as much as we can without bankrupting or driving away all of you") sounds great when you're doing the charging but isn't so appealing when you're on the paying end. Visa justifies its fees not on the grounds of cost to provide the service, but rather by claiming that their service makes everything more convenient and so people shop more.
  2. Doing It Wrong (Tim Bray) -- What I’m writing here is the single most important take-away from my Sun years, and it fits in a sentence: The community of developers whose work you see on the Web, who probably don’t know what ADO or UML or JPA even stand for, deploy better systems at less cost in less time at lower risk than we see in the Enterprise. This is true even when you factor in the greater flexibility and velocity of startups. I've been working with a Big Company and can only agree with this: The point is that that kind of thing simply cannot be built if you start with large formal specifications and fixed-price contracts and change-control procedures and so on. So if your enterprise wants the sort of outcomes we’re seeing on the Web (and a lot more should), you’re going to have to adopt some of the cultures and technologies that got them built.
  3. Analytics X Prize -- The Analytics X Prize is an ongoing contest to apply analytics, modeling, and statistics to solve the social problems that affect our cities. It combines the fields of statistics, mathematics, and social science to understand the root causes of dysfunction in our neighborhoods. Understanding these relationships and discovering the most highly correlated variables allows us to deploy our limited resources more effectively and target the variables that will have the greatest positive impact on improvement. The first contest is to predict homicides in Philadelphia. (via mikeloukides on Twitter)
  4. Protecting Cloud Secrets with Grendel (Wesabe blog) -- new open source package that implements Wesabe's policies for safe handling of customer data. It uses OpenPGP to store data, and offers access to the encrypted data via an internal (behind-the-firewall) REST service. The data can only be decrypted with the user's password. Hopefully the first of many standard tools and practices for respecting privacy.

June 18 2009

Road to Peace

Promo for the Bering Strait architectural competition
