February 16 2012

Strata Week: The data behind Yahoo's front page

Here are a few of the data stories that caught my attention this week.

Data and personalization drive Yahoo's front page

Yahoo offered a peek behind the scenes of its front page with the release of the Yahoo C.O.R.E. Data Visualization. The visualization provides a way to view some of the demographic details behind what Yahoo visitors are clicking on.

The C.O.R.E. (Content Optimization and Relevance Engine) technology was created by Yahoo Labs. The tech is used by Yahoo News and its Today module to personalize results for its visitors — resulting in some 13,000,000 unique story combinations per day. According to Yahoo:

"C.O.R.E. determines how stories should be ordered, dependent on each user. Similarly, C.O.R.E. figures out which story categories (i.e. technology, health, finance, or entertainment) should be displayed prominently on the page to help deepen engagement for each viewer."

Screenshot from Yahoo's C.O.R.E. data visualization. See the full visualization here.
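
Yahoo hasn't published C.O.R.E.'s internals, but the quoted description, ordering stories and choosing prominent categories per viewer, maps onto a familiar pattern: score each story by the user's affinity for its category, then sort. Here is a minimal sketch of that pattern; the scoring rule, category names, and data shapes are all illustrative assumptions, not Yahoo's actual model.

    # Illustrative only: C.O.R.E. itself is proprietary. This shows the
    # general shape of per-user story ordering: score each story by the
    # viewer's affinity for its category, then sort. All values invented.

    def order_stories(stories, affinity):
        """Rank stories by the user's affinity for each story's category."""
        return sorted(stories,
                      key=lambda s: affinity.get(s["category"], 0.0),
                      reverse=True)

    stories = [
        {"title": "Markets rally on jobs report", "category": "finance"},
        {"title": "Hands-on with the new tablet", "category": "technology"},
        {"title": "Flu season: what to know", "category": "health"},
    ]
    # In a real system these weights would be learned from click history.
    affinity = {"technology": 0.9, "health": 0.4, "finance": 0.2}

    for story in order_stories(stories, affinity):
        print(story["title"])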

Scaling Tumblr

Over on the High Scalability blog, Todd Hoff examines how the blogging site Tumblr was able to scale its infrastructure, something that Hoff describes as more challenging than the scaling that was necessary at Twitter.

To give some idea of the scope of the problem, Hoff cites these figures:

"Growing at over 30% a month has not been without challenges. Some reliability problems among them. It helps to realize that Tumblr operates at surprisingly huge scales: 500 million page views a day, a peak rate of ~40k requests per second, ~3TB of new data to store a day, all running on 1000+ servers."

Hoff interviews Blake Matheny, distributed systems engineer at Tumblr, for a look at the architecture of both "old" and "new" Tumblr. When the startup began, it was hosted on Rackspace where "it gave each custom domain blog an A record. When they outgrew Rackspace there were too many users to migrate."

The article also describes the Tumblr firehose, noting again its differences from Twitter's. "A challenge is to distribute so much data in real-time," Hoff writes. "[Tumblr] wanted something that would scale internally and that an application ecosystem could reliably grow around. A central point of distribution was needed." Although Tumblr initially used Scribe/Hadoop, "this model stopped scaling almost immediately, especially at peak where people are creating 1000s of posts a second."
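
The "central point of distribution" Hoff describes is a publish/subscribe pipeline: producers write each event once, and every downstream consumer receives its own copy. As a rough illustration of the pattern (not Tumblr's actual firehose, and with all names invented), here is a toy in-process version:

    # Toy central distribution point for a firehose of events: publish
    # once, and every subscriber receives its own copy. This illustrates
    # the pattern only; it is not Tumblr's actual system.
    import queue

    class Firehose:
        def __init__(self):
            self._subscribers = []

        def subscribe(self):
            q = queue.Queue()
            self._subscribers.append(q)
            return q

        def publish(self, event):
            for q in self._subscribers:
                q.put(event)

    hose = Firehose()
    analytics = hose.subscribe()
    indexer = hose.subscribe()

    hose.publish({"type": "post", "blog": "example", "id": 1})
    print(analytics.get())  # each consumer sees every event
    print(indexer.get())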

Strata 2012 — The 2012 Strata Conference, being held Feb. 28-March 1 in Santa Clara, Calif., will offer three full days of hands-on data training and information-rich sessions. Strata brings together the people, tools, and technologies you need to make data work.

Save 20% on registration with the code RADAR20

Visualization creation

Data scientist Pete Warden offers his own lessons learned about building visualizations this week in a story here on Radar. His first tip: "Play with your data" -- that is, before you decide what problem you want to solve or visualization you want to create, take the time to know the data you're working with.

Warden writes:

"The more time you spend manipulating and examining the raw information, the more you understand it at a deep level. Knowing your data is the essential starting point for any visualization."

Warden explains how he was able to create a visualization for his new travel startup, Jetpac, that showed where American Facebook users go on vacation. Warden's tips aren't simply about the tools he used; he also walks through the conceptualization of the project as well as the crunching of the data.
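
In that spirit, "playing with your data" can be as simple as a throwaway script that counts and eyeballs the raw records before any visualization work begins. A minimal sketch, assuming a hypothetical CSV of check-ins with a destination column:

    # "Play with your data": a throwaway pass over the raw records before
    # committing to any visualization. File name and column are hypothetical.
    import csv
    from collections import Counter

    destinations = Counter()
    with open("checkins.csv", newline="") as f:
        for row in csv.DictReader(f):
            destinations[row["destination"]] += 1

    # Eyeball the distribution: surprises here shape what's worth visualizing.
    for place, count in destinations.most_common(10):
        print(f"{place}: {count}")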

Got data news?

Feel free to email me.


December 30 2011

Paying for Parking | 2011-12-29

Parking is too cheap and the price is too sticky. As Tyler wrote in his NYT column:

If developers were allowed to face directly the high land costs of providing so much parking, the number of spaces would be a result of a careful economic calculation rather than a matter of satisfying a legal requirement. Parking would be scarcer, and more likely to have a price – or a higher one than it does now – and people would be more careful about when and where they drove.

The subsidies are largely invisible to drivers who park their cars – and thus free or cheap parking spaces feel like natural outcomes of the market, or perhaps even an entitlement. Yet the law is allocating this land rather than letting market prices adjudicate whether we need more parking, and whether that parking should be free. We end up overusing land for cars – and overusing cars too. You don’t have to hate sprawl…

Slowly things are beginning to change, however, as this excellent piece on parking in LA and parking scholar Donald Shoup describes:

Shoup is not opposed to all parking lots; he’s against cities requiring parking lots. “Would you require every home to come with a pool or every office to include a dining room because someone might want it?” asks Shoup. “Why not let developers build parking where the market demands it and charge its true value?”

…This spring the DOT plans to introduce an $18.5 million smart wireless meter system based on Shoup’s theories. Called ExpressPark, the 6,000-meter array will be installed on downtown streets and lots, along with sensors buried in the pavement of every parking spot to detect the presence of cars and price accordingly, from as little as 50 cents an hour to $6. Street parking, like pork bellies, will be open to market forces. As blocks fill, prices will rise; when occupancy drops, so will rates. In an area like downtown, ideal for Shoup’s progressive pricing, people will park based on how much they’re willing to pay versus how far they are willing to walk to a destination. In a trendy area like Melrose Avenue’s shopping district, where parking on side streets is forbidden to visitors, Shoup would open those residential blocks to market-priced meters, wooing home owners by guaranteeing that meter profits would be turned over to them in the form of property tax deductions. (That benefit could add up to thousands of dollars a year per household.)

Brooklyn’s Park Slope neighborhood is already experimenting with a version of the system, and so are San Francisco, Seattle, and Washington, D.C.

In D.C. you can now pay many parking meters via cell-phone. I’ve used the system and it works well.
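
The pricing rule described in the quoted passage is straightforward to express: nudge the hourly rate up when a block is nearly full and down when spaces sit empty, clamped to the quoted 50-cent and $6 bounds. Here is a minimal sketch; the target occupancy and step size are illustrative assumptions, since the piece doesn't spell out ExpressPark's actual formula.

    # Occupancy-responsive meter pricing as described above: rates rise as
    # a block fills and fall as it empties, clamped to the quoted
    # $0.50-$6.00 range. Target and step are illustrative assumptions.

    def adjust_rate(rate, occupancy, target=0.85, step=0.25):
        """Nudge the hourly rate toward a target share of occupied spots."""
        if occupancy > target:
            rate += step   # block nearly full: raise the price
        elif occupancy < target:
            rate -= step   # spaces sitting empty: lower it
        return min(6.00, max(0.50, rate))

    rate = 1.00
    for occ in [0.95, 0.95, 0.90, 0.60, 0.40]:
        rate = adjust_rate(rate, occ)
        print(f"occupancy {occ:.0%} -> ${rate:.2f}/hour")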

Here are previous MR posts on parking.


May 24 2011

Whoever preaches "just in time" and dismantles #redundancy must expect breakdowns #Tsp commentary S-Bahn: "Zappenduster" (pitch black) – #fail
oAnth via diaspora | 2011-05-24

November 26 2010



But when the pressure in the boiler does rise, when people stop putting up with everything and begin to take their cause into their own hands, the "Spiegel" is immediately on hand to ironize and deflect the first stirrings of resistance. On August 30, 2010, it ran the cover story "Die Dagegen-Republik" ("The Republic of Against"): "Stuttgart 21, nuclear power, school reform: citizens' uprising against politics." Note well: citizens who begin to make politics themselves are, by the twisted and twisting logic of the "Spiegel," "against politics."

The cover was illustrated with the "fat hen," the eagle that adorns the front of the Bundestag chamber, whose original design dates from the early years of the postwar economic miracle. Where its heart would beat, if it were not made of sheet metal, stuck a tomato that had burst on impact. Red spatters all around, so that one could also imagine a shot to the heart. An ironic play on the phantasm of "left-wing violence," which is invoked whenever the aim is to preventively arm up the police.

The cover story inside the magazine opened with a photo of a demonstration against the "Stuttgart 21" project spread across two pages. Writ large within it, a banner: "Bei Abriss. Aufstand." ("Demolition means uprising.") At issue is the demolition of parts of Stuttgart's historic terminus station to make way for an underground through-route, supposedly necessary to position Stuttgart optimally in international rail traffic. According to expert assessments now available, however, the project, which grows more expensive by the year, is fairly superfluous in transport terms.


continue reading at NDS – 20101126

"Bei Abriss Aufstand: Wie der wachsende Bürgerprotest madig gemacht werden soll"

and in full at

October 25 2010

What to consider before shortening links

Chances are, you're reading this article after clicking on a shortened link. And if, like many modern infovores, your online reading is driven by your social network rather than your feed reader, most of the pages you've visited today were mediated by a shortened link.

Link shorteners have become ubiquitous over the last few years, and they're an increasingly important part of the social fabric of the web. But is that a good thing?

Below I explore some of the issues to be aware of, both as a user of link shortening services and a consumer of shortened links.

A brief history of link shorteners

Link shorteners have been around for a number of years. Wikipedia notes that the first link shortening service, TinyURL, launched in 2002. Back then the primary need for shortened links was avoiding line-wrapping issues, which could break long links in some email readers. In the years since, the web development community has recognized the utility of simple, readable URLs that are free of implementation cruft. URLs tend to be tidier than they once were. But the need for even shorter URLs has been driven by constrained user interfaces, either because of hardware issues or artificial constraints imposed by particular services. It's no fun entering long URLs on a mobile device, and who wants to waste tweet space on URL characters?

The sharp rise in the number of shortening services -- there are more than 180 services in this list -- has been accompanied by a race to the bottom: who can generate the shortest URLs by creative use of domain name registration and by compressing URLs into as few characters as possible?
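
Under the hood, the usual way to "compress" a URL is not to compress it at all: the service stores the long URL under an integer database ID and encodes that ID in base 62 (digits plus upper- and lowercase letters). A minimal sketch of the generic technique, not any particular service's scheme:

    # How most shorteners get short: store the long URL under an integer
    # ID and encode the ID in base 62. Six characters cover 62**6
    # (~57 billion) links. Generic technique, not any service's code.
    import string

    ALPHABET = string.digits + string.ascii_lowercase + string.ascii_uppercase

    def encode(n):
        """Encode a non-negative integer as a base-62 string."""
        if n == 0:
            return ALPHABET[0]
        chars = []
        while n:
            n, rem = divmod(n, 62)
            chars.append(ALPHABET[rem])
        return "".join(reversed(chars))

    print(encode(125))      # "21"
    print(encode(10**9))    # a billionth link still fits in six characters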

But while brevity might bring users to a service, it doesn't necessarily bring revenue. Value-added mediation features, such as access to click-through statistics on individual URLs, have given services another dimension.'s live usage statistics and its previously close relationship with Twitter have made it beloved by many. The statistics and other services offer combine web analytics with your social media influence. They answer the question: How much traffic did you drive today?

Issues with URL shorteners

The issues surrounding URL shorteners all follow from the shorteners acting as an intermediary for destination websites.


As more web users find new content via shortened links, certain shorteners may emerge as considerable referral generators for some sites. If a service goes down, traffic could be temporarily blocked.

We don't often think of the web as a long-term medium, but we should. There's very little truly ephemeral content on the web anymore. The 301Works project from the Internet Archive is intended to address the issue of URL shortening services going permanently offline, providing an escrow service for link shorteners with the hope of preserving the integrity of more of the web. And, as recently illustrated by the take-down of, there are more than just financial reasons why a site might go offline. Legal and ethical issues can arise, requiring developers to consider more than just what makes a short or cool-sounding domain name.

Some sites now offer their own domain-specific URL shorteners (e.g., O'Reilly's or Flickr's that don't suffer from the same issues. These are more likely to offer the same resilience and stability as the sites themselves. As a user, you're better off turning to these services where available.


It's hard to tell what's at the end of a shortened link. From a user-interface point of view this can be frustrating. How many times have you followed a link in a tweet only to find it's something you've already read? But that lack of visibility can be more than just frustrating; it's also a possible vector for a phishing attack.

Not everyone pays attention to the URLs they're visiting, and an unwary user can easily be taken advantage of through a seemingly innocuous shortened link. They're a great way to hide a phishing site, exploit scripting vulnerabilities, or just avoid spam blocking. (Eric Hellman has created a nice list of "evil uses for URL shorteners.")
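
One defensive habit is to expand a shortened link before visiting it. The sketch below, using only Python's standard library, sends a single request but refuses to follow the redirect, then reads the destination from the Location header; the short URL shown is hypothetical.

    # Expand a shortened link without visiting the destination: send one
    # request, refuse to follow the redirect, read the Location header.
    import urllib.error
    import urllib.request

    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # returning None makes urllib raise, not follow

    def expand(short_url):
        opener = urllib.request.build_opener(NoRedirect)
        try:
  , timeout=10)
            return short_url  # no redirect: it wasn't a shortened link
        except urllib.error.HTTPError as err:
            if err.code in (301, 302, 303, 307, 308):
                return err.headers["Location"]
            raise

    print(expand(""))  # hypothetical short link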

If a URL shortener is hacked, as has already happened to at least one popular service, then once-innocuous links can suddenly become spam vectors.

Some services do apply a spam filter to shortened links, while others offer a preview mode or tools to increase visibility of the destination site. However, in the latter case, the user of the link usually has to take some additional action before following a link, making these less than ideal.

Again, domain-specific URL shorteners are likely to be more secure, if not more transparent, than third-party services.


URL shorteners inevitably add overhead, requiring additional DNS lookups and HTTP requests. Domain-specific shorteners suffer in this regard just as much as third-party systems. Waiting for an additional web request can be particularly irritating on patchy mobile connections. For many users it won't be clear where performance issues lie: Is it the link shortener or the target service that's slow?
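
If you want to see that overhead for yourself, timing the fetch through the shortener against a direct fetch of the destination gives a rough number. A sketch, with a hypothetical short URL:

    # Rough measurement of shortener overhead: time the fetch through the
    # redirect against a direct fetch of the destination. URL hypothetical.
    import time
    import urllib.request

    def timed_fetch(url):
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=10) as resp:
            resp.read()                # pull the whole body for fair timing
            final_url = resp.geturl()  # where we actually ended up
        return final_url, time.perf_counter() - start

    final_url, via = timed_fetch("")
    _, direct = timed_fetch(final_url)  # fetch the destination directly
    print(f"via shortener: {via:.3f}s  direct: {direct:.3f}s  "
          f"overhead ~ {via - direct:.3f}s")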

The role URL shorteners play in routing increasingly large chunks of Internet traffic makes their performance significant. It also makes them a highly visible target for denial of service attacks.

(Note: Here's an interesting dashboard that provides some insight into up-time and performance for a selection of shortening services.)


An often overlooked issue with URL shorteners is that they have the ability to track the links you're following across the web. Several services hand out tracking cookies as links are followed. It's this facility that underpins the shorteners' ability to offer usage statistics, although none yet offer the ability to track individual users. But, if you're the type of person who is concerned about how your Internet usage is being recorded, then this is yet another avenue to consider.

Summing up

In all likelihood, URL shorteners are here to stay. As users and developers of web services we ought to understand when they are and aren't useful, and what alternatives are available. Domain-specific shortening services avoid many of the issues identified here, but they don't offer the same cross-site analytics that are the unique selling point of third-party services.

Disclosure: O'Reilly uses Pro for its shortened domain, O'Reilly AlphaTech Ventures (OATV) is also an investor in


May 24 2010

When men still wore hats

In 1949, a man named Chalmers Butterfield photographed street life in London in the deep, saturated colors of Kodachrome. Over at How To Be A Retronaut, where rare finds like these are collected, there are two more of Mr. Butterfield's photographs.

(Found via neatorama)

On this, see also:

Color film footage: London in the twenties



February 24 2010


Nearer, my God, to thee

Drunk driving = purgatory.

If Ms. Käßmann were a 17-year-old binge drinker, she would be packed off, flanked by five social workers, on a multi-week therapeutic "maturation experience" holiday to Lanzarote.
If she were the head of any other large corporation, she would get a company car with a chauffeur, so she wouldn't spill so much while driving.
If she were the divorced mother of four with a full-time job whom everyone knows from next door, someone would put an arm around her, at least once a year, when she tears up at the office party after her first little bottle of bubbly.
If she were Jane Average, every staff council would long since have pointed out that alcohol dependency is an illness, not professional misconduct.
But as a woman, as a divorced mother of four at the head of the EKD, she will now be crucified.

Catholic priests from a nearby boarding school secretly poured a shot into her tea, to throw the reporters and prosecutors who wanted to take a closer look at their "tutoring sessions" with angelically beautiful boys onto another trail...

— quoted in full from the blog passe.par.tout, 2010-02-24

January 14 2010

Innovation Battles Investment as FCC Road Show Returns to Cambridge

Opponents can shed their rhetoric and reveal new depths to their
thought when you bring them together for rapid-fire exchanges,
sometimes with their faces literally inches away from each other. That
made it worth my while to truck down to the MIT Media Lab for
yesterday's Workshop on Innovation, Investment and the Open Internet,
sponsored by the Federal Communications Commission. In this article I'll cover:

Context and background

The FCC kicked off its country-wide hearing campaign almost two years
ago with a meeting at Harvard Law School, which quickly went wild. I
covered the experience in one article and the unstated agendas in
another. With a star cast and an introduction
unstated agendas in another. With a star cast and an introduction
by the head of the House's Subcommittee on Telecommunications and the
Internet, Ed Markey, the meeting took on such a cachet that the
public flocked to the lecture hall, only to find it filled because
Comcast recruited people off the street to pack the seats and keep
network neutrality proponents from attending. (They had an overflow
room instead.)

I therefore took pains to arrive at the Media Lab's Bartos Theater
early yesterday, but found it unnecessary. Even though Tim Berners-Lee
spoke, along with well-known experts across the industry, only 175
people turned up, in my estimation (I'm not an expert at counting
crowds). I also noticed that the meeting wasn't worth a
mention today in the Boston Globe.

Perhaps it was the calamitous earthquake yesterday in Haiti, or the
bad economy, or the failure of the Copenhagen summit to solve the
worst crisis ever facing humanity, or concern over three wars the US
is involved in (if you count Yemen), or just fatigue, but it seems
that not as many people are concerned with network neutrality as two
years ago. I recognized several people in the audience yesterday and
surmised that the FCC could have picked out a dozen people at random
from their seats, instead of the parade of national experts on the
panel, and still have led a pretty darned good discussion.

And network neutrality is definitely the greased pig everyone is
sliding around. There are hundreds of things one could discuss
in the context of innovation and investment, but various political
forces ranging from large companies (AT&T versus Google) to highly
visible political campaigners (Huffington Post) have made network
neutrality the agenda. The FCC gave several of the movement's leaders
rein to speak, but perhaps signaled its direction by sending Meredith
Attwell Baker as the commissioner in attendance.

In contrast to FCC chair Julius Genachowski, who publicly calls for
network neutrality (a position also taken by Barack Obama during his
presidential campaign), Baker has traditionally
espoused a free-market stance. She opened the talks yesterday by
announcing that she is "unconvinced there is a problem" and posing the
question: "Is it broken?" I'll provide my own opinion later in this

Two kinds of investment

Investment is the handmaiden, if not the inseminator, of innovation.
Despite a few spectacular successes, like the invention of Linux and
Apache, most new ideas require funding. Even Linux and Apache are
now represented by foundations backed by huge companies.

So why did I title this article "Innovation Battles Investment"?
Because investment happens at every level of the Internet, from the
cables and cell towers up to the applications you load on your cell phone.

Here I'll pause to highlight an incredible paradigm shift that was
visible at this meeting--a shift so conclusive that no one mentioned
it. Are you old enough to remember the tussle between "voice" and
"data" on telephone lines? Remember the predictions that data would
grow in importance at the expense of voice (meaning Plain Old
Telephone Service) and the milestones celebrated in the trade press when
data pulled ahead of voice?

Well, at the hearing yesterday, the term "Internet" was used to cover
the whole communications infrastructure, including wires and cell
phone service. This is a mental breakthrough all its own, and one
I'll call the Triumph of the Singularity.

But different levels of infrastructure benefit from different
incentives. I found that all the participants danced around this.
Innovation and investment at the infrastructure level got short shrift
from the network neutrality advocates, whether in the bedtime story
version delivered by Barbara van Schewick or the deliberately
intimidating, breakneck overview by economist Shane Greenstein, who
defined openness as "transparency and consistency to facilitate
communication between different partners in an independent value chain."

You can explore his papers on your own, but I took this to mean, more
or less, that everybody sharing a platform should broadcast their
intentions and apprise everybody else of their plans, so that others
can make the most rational decisions and invest wisely. Greenstein
realized, of course, that firms have little incentive to share their
strategies. He said that
communication was "costly," which I take as a reference not to an expenditure of
money but to a surrender of control and relinquishing of opportunities.

This is just what the cable and phone companies are not going to do.
Dot-com innovator Jeffrey Glueck, founder of Skyfire, would like the FCC to
require ISPs to give application providers and users at least 60 to 90
days notice before making any changes to how they treat traffic. This
is absurd in an environment where bad actors require responses within
a few seconds and the victory goes to the router administrators with
the most creative coping strategy. Sometimes network users just have
to trust their administrators to do the best thing for them. Network
neutrality becomes a political and ethical issue when administrators
don't. But I'll return to this point later.

The pocket protector crowd versus the bean counters

If the network neutrality advocates could be accused of trying to
emasculate the providers, advocates for network provider prerogative
were guilty of taking the "Trust us" doctrine too far. For me, the
best part of yesterday's panel was how it revealed the deep gap that
still exists between those with an engineering point of view and those
with a traditional business point of view.

The engineers, led by Internet designer David Clark, repeated the
mantra of user control of quality of service, the vehicle for this
being the QoS field added to the IP packet header. Van Schewick
postulated a situation where a user increases the QoS on one session
because they're interviewing for a job over the Internet, then reduces
the QoS to chat with a friend.
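
The "QoS field" the engineers are invoking is a real mechanism: the DSCP bits in the ToS byte of the IP header, which an application can set per socket on most Unix-like systems. Whether any network along the path honors the marking is, of course, exactly the policy question the panel was debating. A minimal sketch (the address and port are placeholders):

    # The "QoS field" under discussion is the DSCP bits of the IP header's
    # ToS byte. An application can set it per socket; whether networks
    # along the path honor it is exactly the dispute in this article.
    import socket

    EF = 46 << 2  # DSCP "Expedited Forwarding" (46), shifted into the ToS byte

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, EF)  # mark packets
    sock.sendto(b"interview audio frame", ("", 5004))

    # Dropping back to best-effort for a casual chat would just be:
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, 0)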

In the rosy world envisioned by the engineers, we would deal not with
the physical reality of a shared network with our neighbors, all
converging into a backhaul running from our ISP to its peers, but with
the logical mechanism of a limited, dedicated bandwidth pipe (former
senator Ted Stevens can enjoy his revenge) that we would spend our
time tweaking. One moment we're increasing the allocation for file
transfer so we can upload a spreadsheet to our work site; the next
moment we're privileging the port we use for a massively multiplayer game.

The practicality of such a network service is open to question. Glueck
pointed out that users are unlikely ever to ask for lower quality of
service (although this is precisely the model that Internet experts
have converged on, as I report in my 2002 article A Nice Way to Get
Network Quality of Service?). He recommends
simple tiers of service--already in effect at many providers--so that
someone who wants to carry out a lot of P2P file transfers or
high-definition video conferencing can just pay for it.

In contrast, network providers want all the control. Much was made
during the panel of a remark by Marcus Weldon of Alcatel-Lucent in
support of letting the providers shape traffic. He pointed out that
video teleconferencing over the fantastically popular Skype delivered
unappealing results over today's best-effort Internet delivery, and
suggested a scenario where the provider gives the user a dialog box
where the user could increase the QoS for Skype in order to enjoy the
video experience.

Others on the panel legitimately flagged this comment as a classic
illustration of the problem with providers' traffic shaping: the
provider would negotiate with a few popular services such as Skype
(which boasts tens of millions of users online whenever you log in)
and leave innovative young services to fend for themselves in a
best-effort environment.

But the providers can't see doing quality of service any other way.
Their business model has always been predicated on designing services
around known costs, risks, and opportunities. Before they roll out a
service, they need to justify its long-term prospects and reserve
control over it for further tweaking. If the pocket protector crowd in
Internet standards could present their vision to the providers in a
way that showed them the benefits they'd accrue from openness
(presumably by creating a bigger pie), we might have progress. But the
providers fear, above all else, being reduced to a commodity. I'll pick up
this theme in the next section.

Is network competition over?

Law professor Christopher S. Yoo is probably the most often heard (not
at this panel, unfortunately, where he was given only a few minutes)
of academics in favor of network provider prerogatives. He suggested
that competition was changing, and therefore requiring a different
approach to providers' funding models, from the Internet we knew in
the 1990s. Emerging markets (where growth comes mostly from signing up
new customers) differ from saturated markets (where growth comes
mainly from wooing away your competitors' customers). With 70% of
households using cable or fiber broadband offerings, he suggested the
U.S. market was getting saturated, or mature.

Well, only if you accept that current providers' policies will stifle
growth. What looks like saturation to an academic in the U.S. telecom
field looks like a state of primitive underinvestment to people who
enjoy lightning-speed service in other developed nations.

But Yoo's assertion makes us pause for a moment to consider the
implications of a mature network. When change becomes predictable and
slow, and an infrastructure is a public good--as I think everyone
would agree the Internet is--it becomes a candidate for government
takeover. Indeed, there have been calls for various forms of
government control of our network infrastructure. In some places this
is actually happening, as cities and towns create their own networks.
A related proposal is to rigidly separate the physical infrastructure
from the services, barring companies that provide the physical
infrastructure from offering services (and therefore presumably
relegating them to a maintenance role--a company in that position
wouldn't have much incentive to take on literally ground-breaking new projects).

Such government interventions are politically inconceivable in the
United States. Furthermore, experience in other developed nations with
more successful networks shows that it is unnecessary.

No one can doubt that we need a massive investment in new
infrastructure if we want to use the Internet as flexibly and
powerfully as our trading partners. But there was disagreement yesterday about
how much of an effort the investment will take, and where it will come from.

Yoo argued that a mature market requires investment to come from
operating expenditures (i.e., charging users more money, which
presumably is justified by discriminating against some traffic in
order to offer enhanced services at a premium) instead of capital
expenditures. But Clark believes that current operating expenditures
would permit adequate growth. He anticipated a rise in Internet access
charges of $20 a month, which could fund the added bandwidth we need
to reach the Internet speeds of advanced countries. In exchange for
paying that extra $20 per month, we would enjoy all the content we
want without paying cable TV fees.

The current understanding by providers is that usage is rising
"exponentially" (whatever that means--they don't say what the exponent
is) whereas charges are rising slowly. Following some charts from
Alcatel-Lucent's Weldon that showed profits disappearing entirely in a
couple years--a victim of the squeeze between rising usage and slow
income growth--Van Schewick challenged him, arguing that providers can
enjoy lower bandwidth costs to the tune of 30% per year. But Weldon
pointed out that the only costs going down are equipment, and claimed
that after a large initial drop caused by any disruptive new
technology, costs of equipment decrease only 10% per year.
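
The disagreement is just compound decay at different rates, which is easy to make concrete. A quick calculation comparing van Schewick's 30%-per-year decline with Weldon's 10%-per-year figure over five years:

    # The disagreement above reduces to compound decay at different rates:
    # start both cost curves at 100 and apply each annual decline for 5 years.
    for label, rate in [("van Schewick, -30%/yr", 0.30),
                        ("Weldon, -10%/yr", 0.10)]:
        cost = 100.0
        path = []
        for year in range(1, 6):
            cost *= (1 - rate)
            path.append(f"yr{year}: {cost:.1f}")
        print(f"{label}: {', '.join(path)}")
    # A 30%/yr decline leaves ~17% of the original cost after five years;
    # a 10%/yr decline leaves ~59%. The gap is the squeeze Weldon fears.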

Everyone agreed that mobile, the most exciting and
innovation-supporting market, is expensive to provide and suffering an
investment crisis. It is also the least open part of the Internet and
the part most dependent on legacy pricing (high voice and SMS
charges), deviating from the Triumph of the Singularity.

So the Internet is like health care in the U.S.: in worse shape than
it appears. We have to do something to address rising
usage--investment in new infrastructure as well as new
applications--just as we have to lower health care costs that have
surpassed 17% of the gross domestic product.

Weldon's vision--a rosy one in its own way, complementing the
user-friendly pipe I presented earlier from the engineers--is that
providers remain free to control the speeds of different Internet
streams and strike deals with anyone they want. He presented provider
prerogatives as simple extensions of what already happens now, where
large companies create private networks where they can impose QoS on
their users, and major web sites contract with content delivery
networks such as Akamai (represented at yesterday's panel by lawyer
Aaron Abola) to host their content for faster response time. Susie Kim
Riley of Camiant testified that European providers are offering
differentiated services already, and making money by doing so.

What Weldon and Riley left out is what I documented in A Nice Way to
Get Network Quality of Service? Managed networks
providing QoS are not the Internet. Attempts to provide QoS over the
Internet--by getting different providers to cooperate in privileging
certain traffic--have floundered. The technical problems may be
surmountable, but no one has figured out how to build trust and to design
adequate payment models that would motivate providers to cooperate.

It's possible, as Weldon asserts, that providers allowed to manage
their networks would invest in infrastructure that would ultimately
improve the experience for all sites--those delivered over the
Internet by best-effort methods as well as those striking deals. But
the change would still represent increased privatization of the public
Internet. It would create what application developers such as Glueck
and Nabeel Hyatt of Conduit Labs fear most: a thousand different
networks with different rules that have to be negotiated with
individually. And new risks and costs would be placed in the way of
the disruptive innovators we've enjoyed on the Internet.

Competition, not network neutrality, is actually the key issue facing
the FCC, and it was central to their Internet discussions in the years
following the 1996 Telecom Act. For the first five years or so, the
FCC took seriously a commitment to support new entrants by such
strategies as requiring incumbent companies to allow interconnection.
Then, especially under Michael Powell, the FCC did an about-face.

The question posed during this period was: what leads to greater
investment and growth--letting a few big incumbents enter each other's
markets, or promoting a horde of new, small entrants? It's pretty
clear that in the short term, the former is more effective because the
incumbents have resources to throw at the problem, but that in the
long term, the latter is required in order to find new solutions and
fix problems by working around them in creative ways.

Yet the FCC took the former route, starting in the early 2000s. They
explicitly made a deal with incumbents: build more infrastructure, and
we'll relax competition rules so you don't have to share it with other companies.

Starting a telecom firm is hard, so it's not clear that pursuing the
other route would have saved us from the impasse we're in today. But a
lack of competition is integral to our problems--including the one
being fought out in the field of "network neutrality."

All the network neutrality advocates I've talked to wish that we had
more competition at the infrastructure level, because then we could
rely on competition to discipline providers instead of trying to
regulate such discipline. I covered this dilemma in a 2006 article,
Network Neutrality and an Internet with Vision. But somehow, this kind
of competition is now off the FCC agenda. Even in the mobile space,
they offer spectrum through auctions that permit the huge incumbents to gather up
the best bands. These incumbents then sit on spectrum without doing
anything, a strategy known as "foreclosure" (because it forecloses
competitors from doing something useful with it).

Because everybody goes off in his own direction, the situation pits two
groups that should be cooperating against each other: small ISPs and
proponents of an open Internet.

What to regulate

Amy Tykeson, CEO of a small Oregon Internet provider named
BendBroadband, forcefully presented the view of an independent
provider, similar to the more familiar imprecations by Brett Glass of Lariat. In their
world--characterized by paper-thin margins, precarious deals with
back-end providers, and the constant pressure to provide superb
customer service--flexible traffic management is critical and network
neutrality is viewed as a straitjacket.

I agree that many advocates of network neutrality have oversimplified
the workings of the Internet and downplayed the day-to-day
requirements of administrators. In contrast, as I have shown, large
network providers have overstepped their boundaries. But to end this
article on a positive note (you see, I'm trying) I'll report that the
lively exchange did produce some common ground and a glimmer of hope
for resolving the differing positions.

First, in an exchange between Berners-Lee and van Schewick on the
pro-regulatory side and Riley on the anti-regulatory side, a more
nuanced view of non-discrimination and quality of service emerged.
Everybody on the panel vociferously supported the position that it was
unfair discrimination for a network provider to prevent a user from
getting legal content or to promote one web site over a competing web
site. And this is a major achievement, because those are precisely
the practices that providers like AT&T and Verizon claim the
right to do--the practices that spawned the current network neutrality debate.

To complement this consensus, the network neutrality folks approved
the concept of quality of service, so long as it was used to improve
the user experience instead of to let network providers pick winners.
In a context where some network neutrality advocates have made QoS a
dirty word, I see progress.

This raises the question of what is regulation. The traffic shaping
policies and business deals proposed by AT&T and Verizon are a
form of regulation. They claim the same privilege that large
corporations--we could look at health care again--have repeatedly
tried to claim when they invoke the "free market": the right of
corporations to impose their own regulations.

Berners-Lee and others would like the government to step in and issue
regulations that suppress the corporate regulations. A wide range of
wording has been proposed for the FCC's consideration. Commissioner
Baker asked whether, given the international reach of the Internet,
the FCC should regulate at all. Van Schewick quite properly responded
that the abuses carried out by providers are at the local level and
therefore can be controlled by the government.

Two traits of a market are key to innovation, and came up over and
over yesterday among dot-com founders and funders (represented by Ajay
Agarwal of Bain Capital) alike: a level playing field, and
light-handed regulation.

Sometimes, as Berners-Lee pointed out, government regulation is
required to level the playing field. The transparency and consistency
cited by Greenstein and others are key features of the level playing
field. And as I pointed out, a vacuum in government regulation is
often filled by even more onerous regulation by large corporations.

One of the most intriguing suggestions of the day came from Clark, who
elliptically suggested that the FCC provide "facilitation, not
regulation." I take this to mean the kind of process that Comcast and
BitTorrent went through, of which Sally Shipman Wentworth of ISOC
boasted about in her opening remarks. Working with the IETF (which she
said created two new working groups to deal with the problem), Comcast
and BitTorrent worked out a protocol that should reduce the load of
P2P file sharing on networks and end up being a win-win for everybody.

But there are several ways to interpret this history. To free market
ideologues, the Comcast/BitTorrent collaboration shows that private
actors on the Internet can exploit its infinite extendibility to find
their own solutions without government meddling. Free market
proponents also call on anti-competition laws to hold back abuses. But
those calling for parental controls would claim that Comcast wanted
nothing to do with BitTorrent and started to work on technical
solutions only after getting tired of the feces being thrown its way
by outsiders, including the FCC.

And in any case--as panelists pointed out--the IETF has no enforcement
power. The presence of a superior protocol doesn't guarantee that
developers and users will adopt it, or that network providers will
allow traffic that could be a threat to their business models.

The FCC hearing at Harvard, which I mentioned at the beginning of this
article, promised intervention in the market to preserve Internet
freedom. What we got after that (as I predicted) was a slap on
Comcast's wrist and no clear sense of direction. The continued
involvement of the FCC--including these public forums, which I find
educational--shows, along with the appointment of the more
interventionist Genachowski and the mandate to promote broadband in
the American Recovery and Reinvestment Act, that it can't step away
from the questions of competition and investment.
