
October 22 2011

Flying jurisdiction ("fliegender Gerichtsstand") in file-sharing cases?

In a file-sharing case pending before the Amtsgericht München, I raised, among other things, an objection to the court's jurisdiction on behalf of the defendant and argued that a so-called "fliegender Gerichtsstand" (flying jurisdiction) cannot be assumed in file-sharing matters.

As expected, a judicial note arrived that somewhat uncritically parroted the case law according to which jurisdiction under § 32 ZPO exists at every place where a website can be accessed as intended.

One may see it that way for websites, but for file sharing over P2P networks? There is, at any rate, no website that could be accessed as intended. Whether the content at issue in these proceedings could actually, and as intended, be downloaded in Munich from the defendant's hard drive is, in my view, rather speculative. It certainly does not work at will and at any time, as with a website. Perhaps the proponents of the flying jurisdiction should recognize that using a P2P network is not necessarily the same as accessing a website.

July 30 2011

OLG München: participation in P2P networks always constitutes infringement on a commercial scale

In a decision of 26 July 2011 (case no. 29 W 1268/11), the OLG München ruled that sharing copyright-protected material via a P2P network is, as a rule, an infringement on a commercial scale.

The key sentence of the court's reasoning reads:

An infringement that consists in offering a file with copyright-protected content on an Internet file-sharing network is, as a rule, of commercial scale within the meaning of § 101(2) UrhG, without any further aggravating circumstances being required.

I recently explained here (under "Update") why I consider this assumption legally incorrect.

If one follows the OLG München, this of course also means that there is no longer any private activity in P2P networks.

May 26 2011

Open Rights Group

Watch the torrent release of The Tunnel for free BEFORE the cinema release! http://vo.do/1ViA

oAnth - via diaspora | 2011-05-27 

April 15 2011

Getting your book in front of 160 million users is usually a good thing

Last week, Megan Lisa Jones launched a promotion for her new book "Captive" in a (seemingly) unlikely forum: BitTorrent, a space commonly associated with "piracy." At about a week into her two-week promotion, I checked in with BitTorrent to see how it was going. In an email interview, BitTorrent spokesperson Allison Wagda said that as of 10 am Tuesday, "Captive" had been downloaded 342,242 times.

Though the environment may feel like a strange bedfellow for publishing, the impressive level of exposure for a new book release can't be denied. The marketing appeal of BitTorrent, Wagda said, is two-fold:

The technology and the audience. For larger downloads, BitTorrent is the fastest, easiest way to distribute and download a file to lots of people. And there's no infrastructure cost. Since we have a built-in massive audience, publishers and creators gain a unique ability to engage with users.

For more on how a platform like BitTorrent could be used by publishers, I turned to Matt Mason, director of innovation at Syrup and author of The Pirate's Dilemma. Our interview follows.

What advantages can be gained by staging a promotion through a platform like BitTorrent?

Matt Mason: The real problem for most authors, to quote Tim O'Reilly, isn't piracy, but obscurity. There are millions of books on Amazon, and the average book in the US sells around 500 copies a year. A lot of authors, including Cory Doctorow, Seth Godin, Paulo Coelho and myself have had success by giving away electronic copies of our books as a way to promote the books. It can spread the message of the book further, boost sales of physical copies, boost ebook sales, and stimulate other opportunities like speaking and consulting engagements.

The great thing about BitTorrent is you are talking to a massive audience — more than 160 million people use it. Research has shown that people who use file-sharing sites are more likely to spend money on content. Whatever you're trying to promote, 160 million people who are big consumers of all kinds of media is a huge opportunity.

Do you think this is a viable promotion/distribution model?

Matt Mason: Absolutely, and it will become more widely used as content creators and distributors wake up to the benefits of BitTorrent. It is quite simply the cheapest and most efficient way to share digital information, because the audience is the server farm. It's a way to create a giant repository of content with no servers. It has a huge user base and it is growing every day. It's not about giving something away for free, but about distributing it in the smartest possible way. In the next five years, I think we'll see all kinds of publishers waking up to this.

What are some of the obstacles environments like BitTorrent face as promotion platforms?

Matt Mason: One of the biggest problems peer-to-peer technologies like BitTorrent have is the stigma of piracy, but P2P is actually a new and better way of distributing information. Piracy has been at the birth of every major new innovation in media, from the printing press to the recording industry to the film industry — all were birthed out of people doing disruptive, innovative things with content that earned them the label "pirate" (including Thomas Edison).

I think of piracy as a market signal — it signifies a change in consumer behavior that the market hasn't caught up with. If an ecosystem like BitTorrent grows to 160 million users, it's not a piracy environment, it's just a new environment. Media is an industry where the customer really is always right. If people are trying to get your content in a new way, the only smart thing to do is to find a sensible way to offer it to them there.





April 11 2011

The quiet rise of machine learning

The concept of machine learning was brought to the forefront for the general masses when IBM's Watson computer appeared on Jeopardy and wiped the floor with humanity. For those same masses, machine learning quickly faded from view as Watson moved out of the spotlight ... or so they may think.

Machine learning is slowly and quietly becoming democratized. Goodreads, for instance, recently purchased Discovereads.com, presumably to make use of its machine learning algorithms to make book recommendations.

To find out more about what's happening in this rapidly advancing field, I turned to Alasdair Allan, an author and senior research fellow in Astronomy at the University of Exeter. In an email interview, he talked about how machine learning is being used behind the scenes in everyday applications. He also discussed his current eSTAR intelligent robotic telescope network project and how that machine learning-based system could be used in other applications.

In what ways is machine learning being used?

Alasdair Allan: Machine learning is quietly taking over in the mainstream. Orbitz, for instance, is using it behind the scenes to optimize caching of hotel prices, and Google is going to roll out smarter advertisements — much of the machine learning that consumers are seeing and using every day is invisible to them.

The interesting thing about machine learning right now is that research in the field is going on quietly as well, because large corporations are tied up in non-disclosure agreements. While there is a large amount of academic literature on the subject, it's hard to tell whether this open research is actually current.

Oddly, machine learning research mirrors the way cryptography research developed around the middle of the 20th century. Much of the cutting edge research was done in secret, and we're only finding out now, 40 or 50 years later, what GCHQ or the NSA was doing back then. I'm hopeful that it won't take quite that long for Amazon or Google to tell us what they're thinking about today.

How does your eSTAR intelligent robotic telescope network work?

Alasdair Allan: My work has focused on applying intelligent agent architectures and techniques to astronomy for telescope control and scheduling, and also for data mining. I'm currently leading the work at Exeter building a peer-to-peer distributed network of telescopes that, acting entirely autonomously, can reactively schedule observations of time-critical transient events in real-time. Notable successes include contributing to the detection of the most distant object yet discovered, a gamma-ray burster at a redshift of 8.2.

A diagram showing how the eSTAR network operates. The Intelligent Agents access telescopes and existing astronomical databases through the Grid. CREDIT: Joint Astronomy Centre. Eta Carinae image courtesy of N. Smith (U. Colorado), J. Morse (Arizona State U.), and NASA.

All the components of the system are thought of as agents — effectively "smart" pieces of software. Negotiation takes place between the agents in the system: each of the resources bids to carry out the work, with the science agent scheduling the work with the agent embedded at the resource that promises to return the best result.

This architectural distinction of viewing both sides of the negotiation as agents — and as equals — is crucial. Importantly, this preserves the autonomy of individual resources to implement observation scheduling at their facilities as they see fit, and it offers increased adaptability in the face of asynchronously arriving data.

The system is a meta-network that layers communication, negotiation, and real-time analysis software on top of existing telescopes, allowing scheduling and prioritization of observations to be done locally. It is flat, peer-to-peer, and owned and operated by disparate groups with their own goals and priorities. There is no central master-scheduler overseeing the network — optimization arises through emerging complexity and social convention.
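As a purely illustrative sketch of the bidding step described above, here is a minimal contract-net style negotiation in Python. The class names, the Observation fields, and the scoring heuristic are my own assumptions for illustration; they are not taken from the eSTAR code.

```python
# Minimal contract-net style negotiation, loosely modelled on the bidding
# step described above. All names and the scoring heuristic are illustrative
# assumptions, not the actual eSTAR implementation.
from dataclasses import dataclass


@dataclass
class Observation:
    target: str          # e.g. a gamma-ray burst position
    deadline_s: float    # how quickly the observation must start


class ResourceAgent:
    """Agent embedded at a telescope; decides for itself what to bid."""

    def __init__(self, name, slew_time_s, sky_quality):
        self.name = name
        self.slew_time_s = slew_time_s
        self.sky_quality = sky_quality  # 0..1, higher is better

    def bid(self, obs: Observation):
        """Return an expected-quality score, or None if the deadline can't be met."""
        if self.slew_time_s > obs.deadline_s:
            return None
        return self.sky_quality * (1 - self.slew_time_s / obs.deadline_s)


class ScienceAgent:
    """Solicits bids and awards the observation to the most promising resource."""

    def schedule(self, obs: Observation, resources):
        bids = [(r, r.bid(obs)) for r in resources]
        bids = [(r, b) for r, b in bids if b is not None]
        if not bids:
            return None
        winner, _ = max(bids, key=lambda rb: rb[1])
        return winner


if __name__ == "__main__":
    telescopes = [
        ResourceAgent("LT", slew_time_s=30, sky_quality=0.9),
        ResourceAgent("FTN", slew_time_s=90, sky_quality=0.95),
    ]
    obs = Observation(target="GRB 090423", deadline_s=60)
    winner = ScienceAgent().schedule(obs, telescopes)
    print("observation awarded to", winner.name if winner else "nobody")
```

Note how the resource agent alone decides what to bid, which mirrors the point above about each facility retaining autonomy over its own scheduling.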

How could the ideas behind eSTAR be applied elsewhere?

Alasdair Allan: Essentially what I've built is a geographically distributed sensor architecture. The actual architectures I've used to do this are entirely generic — fundamentally, it's just a peer-to-peer distributed system for optimizing scarce resources in real-time in the face of a constantly changing environment.

The architectures are therefore equally applicable to other systems. The most obvious use case is sensor motes. Cheap, possibly even disposable, single-use, mesh-networked sensor bundles could be distributed over a large geographic area to get situational awareness quickly and easily. Despite the underlying hardware differences, the same distributed machine learning-based architectures can be used.


At February's Strata conference, Alasdair Allan discussed the ambiguity surrounding a formal definition of machine learning.

This interview was edited and condensed.


February 11 2011

The Locker Project: data for the people

Singly, a new company that made an appearance at the Strata Conference Startup showcase, exists to provide oxygen and commercial support to the open source Locker Project, and the new protocol TeleHash.

With some wonderful serendipity I met Singly on my first night at Strata. The next day, I talked in depth to Jeremie Miller and Simon Murtha-Smith, two of the three Singly co-founders (see later in this post). I also had the opportunity to ask Tim O'Reilly and Roger Magoulas for some of their thoughts on the significance of this project (see below for their comments).

It was a real "pinch myself in case I need to wake up from a dream" experience for me, to stumble across Jeremie Miller with Simon Murtha-Smith sitting behind a handwritten sign demoing Singly at Strata (see my pic opening this post). As Marshall Kirkpatrick noted at ReadWriteWeb:

Jeremie Miller is a revered figure among developers, best known for building XMPP, the open source protocol that powers most of the Instant Messaging apps in the world. Now Miller has raised funds and is building a team that will develop software aimed directly at the future of the web.

Singly, by giving people the ability to do things with their own data, has the potential to change our world. And, as Kirkpatrick notes, this won't be the first time Jeremie has done that.

I was drawn over to the Singly table when an awesome app they were demonstrating caught my eye. Fizz, an application from Bloom, was running on a locker with data aggregated from three different places.

Fizz is an intriguing early manifestation of capabilities never seen before on the web. It provides the ability for us to control, aggregate, share and play with our own data streams, and bring together the bits and pieces of our digital selves scattered about the web (for more about Bloom and Singly, see Tim O'Reilly's comments below). The picture below is my Fizz. The large circles represent people and the small circles represent their status updates. Bloom explained the functionality in a company blog post:

Clicking a circle will reveal its contents. Typing in the search box will highlight matching statuses. This is an early preview of our work and we'll be adding more features in the next few weeks. We'd love to hear your feedback and suggestions.

If you are not already familiar with the Bloom team — Ben Cerveny, Tom Carden, and Jesper Sparre Andersen — go directly to their about page and you will understand why the match of Bloom and the Locker Project is a cause for great delight.

The Locker Project: A whole new way to connect from the protocol up

Singly, the Locker Project, and TeleHash take on and deliver an elegant and open solution to some of the holy grails of the next generation of networked communications. I have written about, and been nibbling at the edges of, some of these grails in various projects of my own for quite a while now. A glance at the monster mash of my pre-Strata post will give you an idea of how important I think Singly is.

That previous post raised the question of how to invert the search pyramid and to transform search into a social, democratic act. But if you are really interested in social search, I suggest staying keyed into what Singly is doing with the Locker Project.

One of Singly's three founders, Simon Murtha-Smith, was building a company called Introspectr, a social aggregator and search product. Another Singly founder, Jason Cavnar, was working on a similar project. And they came together as Singly because social aggregation and search is a very hard problem for one company to solve. They realized that the basic infrastructure needs to be open source and built on an open protocol.

To me, what is so important about the Locker Project is that it is built on a new open protocol, TeleHash. And having the Singly team focused on supplying tools and the trust/security layer for the Locker Project will mean that developers will have the whole stack.

I asked Miller to explain the relationship between TeleHash, the Locker Project and Singly.

Tish Shute: What is TeleHash?

Jeremie Miller: It's a peer-to-peer protocol to move bits of data for applications around. Not file sharing, but it's for actual applications to find each other and connect. So if you had an app and I had an app, whenever we're running that app on our devices, we can actually find those other devices from each other and then connect. Our applications can connect and do something. TeleHash is actually what has led to the Locker project itself.
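As a rough illustration of what "applications find each other and connect" means in practice, here is a toy Python sketch: peers advertise themselves under an application identity and then open a direct socket to whoever they find. TeleHash itself is a peer-to-peer protocol built around a distributed hash table rather than the central in-process directory used here, and none of the names below come from it.

```python
# Toy illustration of "apps find each other and connect directly".
# The in-process DIRECTORY stands in for the lookup a real peer-to-peer
# protocol would do over a distributed hash table; every name here is
# hypothetical and not part of TeleHash.
import socket
import threading
import time

DIRECTORY = {}  # app identity -> (host, port)


def listen(app_id, host="127.0.0.1"):
    """Start a listener for this app instance and advertise its address."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))                    # pick any free port
    srv.listen(1)
    DIRECTORY[app_id] = srv.getsockname()  # "publish" our address

    def serve():
        conn, _ = srv.accept()
        print("received:", conn.recv(1024).decode())
        conn.close()

    threading.Thread(target=serve, daemon=True).start()
    return srv


def connect_and_send(app_id, message):
    """Look up a peer running the same app and talk to it directly."""
    host, port = DIRECTORY[app_id]         # DHT lookup in a real protocol
    with socket.create_connection((host, port)) as c:
        c.sendall(message.encode())


if __name__ == "__main__":
    listen("photo-app@alice")
    connect_and_send("photo-app@alice", "hello from bob's device")
    time.sleep(0.5)                        # give the listener time to print
```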

Tish Shute: So TeleHash led to the Locker Project and the Locker Project led to Singly?

Jeremie Miller: Singly is a company that is sponsoring the open source Locker Project.

TeleHash is a protocol that lets the lockers connect with each other and share things. The locker is like all of your data. It's sort of like a digital person.





Left to right: Jeremie Miller, Jason Cavnar, Simon Murtha-Smith. I took the pic above of all three founders being interviewed by Marshall Kirkpatrick of ReadWriteWeb. I think we will look back on this moment and say it was an inflection point for the web. At least I tweeted that!


Tish Shute: A locker stores bits and pieces of your digital self?

Jeremie Miller: Yes. So TeleHash lets the lockers directly peer-to-peer connect with each other and share things. Singly, as a company, is going to be hosting lockers first and foremost. But the Locker Project is an open source project. You can have a locker in your machine or you can install it wherever you want.

Tish Shute: Will Singly provide the trust layer and hosting?

Jeremie Miller: Yeah. Singly is a company that will host lockers; also, when people build applications that run inside your lockers or use your data, you need to be able to trust them. Maybe initially it's social data you don't care that much about, but once you add browsing history, your health data, your running logs, or your sleep information, it's important to be careful about what you're running inside your locker and sharing.

So Singly will also look at the applications that are available that you can install and actually run them and look at what data they access. It will be able to come back and either certify or vouch for them.

I hope in the long-run, as this grows and builds, that power users may actually be able to buy a small device that they can plug into their home network and that would be their locker. Wouldn't that be cool? This little hard drive that you plug in.

Tish Shute: Architecturally, are TeleHash and the Locker Project related to your work on XMPP?

Jeremie Miller: XMPP and Jabber were designed for the specific purpose of instant messaging, but it was still a federated model in that you still had to go through a central point. It was designed with that in mind — for the communication path to be routed through somewhere. Where I've evolved is that I'm fascinated with truly distributed protocols that are completely decentralized so that things are going peer-to-peer instead of going through any server.

Peer-to-peer has gotten a pretty bad rap over the last 10 years because of file sharing, but the potential for it is awesome. There's so many really good things that can be done with peer-to-peer, and it hasn't gotten used much.

But the other side of the peer-to-peer thing that I think is critically important is the explosion of computing devices around an individual — both in the home and on our person. I look at my home network router and I've got 30 devices in my house on Wi-Fi. That's a lot of devices.

But right now, to work with those devices I'm almost always going through a server somewhere, or through a data center somewhere. That's ridiculous.

Tish Shute: So we need a peer-to-peer network just to manage our own devices?

Jeremie Miller: A peer-to-peer network, yes. My phone should be talking straight to my computer, or to the iPad, or to the washing machine, or to the refrigerator. The applications in my TV should all be talking peer-to-peer. And it should be easy to do that. It shouldn't be that the only way you can do that is to go through a data center somewhere.
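To make the "devices talking straight to each other" idea concrete, here is a minimal, hypothetical Python sketch of two devices finding each other on a home network by UDP broadcast and replying directly, with no data center involved. A shipping product would more likely use something like mDNS/Bonjour, and nothing here comes from the Locker Project.

```python
# Minimal LAN discovery without any server: one device broadcasts a probe,
# any listening device on the same network answers directly. Illustrative
# only; names and the port number are arbitrary choices for this sketch.
import socket

PORT = 50007  # arbitrary port chosen for this sketch


def announce_and_listen():
    """Run on a device willing to be discovered; answers probes forever."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(("", PORT))
    while True:
        data, addr = s.recvfrom(1024)
        if data == b"who-is-there?":
            s.sendto(b"laptop: here, talk to me directly", addr)


def discover(timeout=2.0):
    """Run on the device looking for peers; returns replies from the LAN."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    s.settimeout(timeout)
    s.sendto(b"who-is-there?", ("255.255.255.255", PORT))
    peers = []
    try:
        while True:
            data, addr = s.recvfrom(1024)
            peers.append((addr, data.decode()))
    except socket.timeout:
        pass
    return peers
```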

Note: Updates to the Locker Project will be posted through @lockerproject and GitHub

The Locker Project is not just "one more rebel army trying to undo these big data aggregations"

I discussed the impact TeleHash, the Locker Project and Singly might have on social network incumbents with Roger Magoulas and Tim O'Reilly. Both had insightful comments.

Magoulas pointed out:

I think Singly has Facebook-like aspects, but I think a better description is an app platform that integrates your personal and social network data — including data from Facebook.

Singly is likely to have challenges with some of their data sources, particularly if it gains traction with users. I like the app platform business model, although they face risks getting critical mass and app developer attention. I also like how they plan on using open source connectors to keep up with changing social network platforms.

Jeremie [Miller] has credibility with the open source community and is likely to find cooperating developers. The team seems to bring complementary strengths to the project and you can tell they all work well together.

Tim O'Reilly elaborated on the potential of this platform to bring something new to the ecosystem. Our interview follows:

Tish Shute: Will the Locker Project be able to break the lock of big sites, like Facebook, controlling everyone's data? Sometimes I feel we are stuck in the era of Zyngification, where you have to do what Zynga did and leverage the system in order to gain traction or do anything with social data.

Tim O'Reilly: I don't think breaking the Facebook lock is the objective of the Locker Project. The value of Facebook is having your data there with other people's data. What Singly may be able to do is give people better tools for managing their data. If you can take the data from various sites and manage it yourself, then you can potentially make better decisions about what you're going to allow and not allow. Right now, the interfaces on a lot of these sites make it difficult to understand the implications of making your data available.

If this is done right, it will create a marketplace where people will build interfaces that provide more control over personal data. People will still want to put data on sites for the same reason you put money in the bank: it's more valuable when it's combined with other people's money.

To conceive of this project as one more rebel army trying to undo these big data aggregations is just the wrong way to frame it.

Tish Shute: Framing the question the way you just did — that this is not just one more rebel army — might mean the stage at Strata will be filled with new startups next year. That's what I thought when I found out what the Locker Project and Singly are about: that we're about to see an explosion of creativity with personal and social data.

Tim O'Reilly: The tools we have now are pretty primitive. If we get a better set of tools, I think we'll see a lot of innovation. Some of those startups might be acquired by Facebook or Google, but if those smaller companies give people better visibility and control over their data, that's a good thing.

Tish Shute: I loved the marriage between Singly and Bloom [mentioned above]. It's interesting because Ben Cerveny and the Bloom team haven't really talked a lot about Bloom yet. I gather Bloom is moving toward consumer-facing work with data?

Tim O'Reilly: People think of data visualization as output, and the insight that I think Ben has had with Bloom is that data visualization will become a means of input and control.

I've started to feel that visualization as a way of making sense of complex data is kind of a dead-end. What you really want to do is build feedback loops where people can actually figure something out. Being able to manipulate data in real-time is an important shift. Data visualizations would then become interfaces rather than reports.

This post was edited and condensed. A longer version, featuring additional interviews and analysis, is available at UgoTrade.

January 21 2011

Four short links: 21 January 2011

  1. Proof-of-Concept Android Trojan Captures Spoken Credit-Card Numbers -- Soundminer sits in the background and waits for a call to be placed [...] the application listens out for the user entering credit card information or a PIN and silently records the information, performing the necessary analysis to turn it from a sound recording into a number. Very clever use of sensors for evil! (via Slashdot)
  2. Cloud9 IDE -- open source IDE for node.js. I'm using it as I learn node.js, and it's sweet as, bro.
  3. The Quantified Self Conference -- May 28-29 in Mountain View. (via Pete Warden)
  4. Bram Cohen Demos P2P Streaming -- the creator of BitTorrent is winding up to release a streaming protocol that is also P2P. (via Hacker News)


January 03 2011

2011 Watchlist: 6 themes to track

Now's the time of year for everyone to write about the trends they see in the coming year. I've resisted that in the past, but this year I'll make an exception. We'll see if it becomes a tradition. Here's my quick list of six themes to watch in 2011:

The Hadoop family

Big data is no secret, and it grew so big in 2010 it can hardly count as a "trend" for 2011. Hadoop grew up with big data, and big data grew up with Hadoop. But what I've seen recently is the flowering of the Hadoop platform. It's not just a single tool, it's an ecosystem of tools that interoperate -- and the total is more than the sum of its parts. Watch HBase, Pig, Hive, Mahout, Flume, ZooKeeper, and the rest of the elephantine family in the coming year.

Real time data

Websites may not be "real time" in a rigorous sense, but they certainly aren't static, and they've gone beyond the decade-old notion of "dynamic," in which the same set of inputs produced the same outputs. Sites like Twitter and Facebook change with time; users want to find out what's happening now (or some reasonably relaxed version of now). Most of the tools we have for working with big datasets are batch-oriented, like Hadoop. One of the most exciting announcements of 2010 was the brief glimpse of Google's Percolator, which enables streaming computation on Google-sized datasets. While Percolator is a proprietary product and will probably remain so, I would be willing to bet that there will be an open source tool performing the same function within the next year. Watch for it.
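The batch-versus-streaming distinction drawn here is easy to show in miniature. The Python sketch below contrasts recomputing a word count from scratch with folding each new document into a standing result; it is illustrative only and not modeled on Percolator or Hadoop internals.

```python
# Batch vs. incremental computation of a simple word count.
# Illustrative only; not modeled on Percolator or Hadoop internals.
from collections import Counter


def batch_count(documents):
    """Batch style: reprocess the whole dataset every time it changes."""
    counts = Counter()
    for doc in documents:
        counts.update(doc.split())
    return counts


class IncrementalCount:
    """Streaming style: fold each new document into standing results."""

    def __init__(self):
        self.counts = Counter()

    def update(self, doc):
        self.counts.update(doc.split())
        return self.counts


if __name__ == "__main__":
    docs = ["big data grew up with hadoop", "hadoop grew up with big data"]
    print(batch_count(docs)["hadoop"])       # recomputed from scratch: 2

    stream = IncrementalCount()
    for doc in docs:
        stream.update(doc)                   # updated as each doc arrives
    print(stream.counts["hadoop"])           # same answer, no full rescan: 2
```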





The rise of the GPU


Our ability to create data is outstripping our ability to compute with it. For a number of years, a subculture of data scientists has been using high-performance graphics cards as computational tools, whether or not they need graphics. The computational capabilities that are used for rendering graphics are equally useful for general vector computing. That trend is quickly becoming mainstream, as more and more industries find that they need the ability to process large amounts of data in real time ("real" real time, not web time): finance, biotech, robotics, almost anything that requires real-time results from large amounts of data.

Amazon's decision to provide GPU-enabled EC2 instances ("Cluster GPU Instances") validates the GPU trend. You won't get the processing power you need at a price you want just by enabling traditional multicore CPUs. You need the dedicated computational units that GPUs provide.
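To make the "general vector computing" point concrete, here is a small illustrative Python example of the data-parallel pattern GPUs accelerate, written with NumPy on the CPU for simplicity; GPU toolkits such as PyCUDA or OpenCL express the same element-wise operations across thousands of cores.

```python
# Element-wise, data-parallel arithmetic: the kind of work GPUs excel at.
# NumPy on the CPU is used here only to show the pattern; GPU toolkits
# such as PyCUDA express the same vector operations for the device.
import numpy as np

n = 1_000_000
prices = np.random.rand(n) * 100.0               # e.g. a million instrument prices
quantities = np.random.randint(1, 100, size=n)   # positions held in each

# One vectorized expression instead of a Python loop over a million items;
# on a GPU each element would map naturally onto its own thread.
exposure = prices * quantities
print("total exposure:", exposure.sum())
```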

The return of P2P

P2P has been rumbling in the background ever since Napster appeared. Recently, the rumblings have been getting louder. Many factors are coming together to drive a search for a new architectural model: the inability of our current provider paradigm to supply the kind of network we'll need in the next decade, frustration with Facebook's "Oops, we made a mistake" privacy policies, and even WikiLeaks. Whether we're talking about Bob Frankston's Ambient Connectivity, the architecture of Diaspora, Tor onion routing, or even rebuilding the Internet's client services from the ground up on a peer-to-peer basis, the themes are the same: centralized servers and network infrastructure are single points of control and single points of failure. And the solution is almost always some form of peer-to-peer architecture. The Internet routes around damage -- and in the coming years, we'll see the Internet repair itself. The time for P2P has finally come.

Everything is even more social

2010 was certainly the year of Facebook. But I think that's just the beginning of the social story, rather than the end. I don't think the Internet will ossify into a Facebook-dominated world. Rather, I think we'll see social features incorporated into everything: corporate sites, ecommerce sites, mobile apps, music, and books. Although Apple's Ping is lame, and social music sites (such as MOG) are hardly new, Ping points the way: the incorporation of social features into new kinds of products.

The meaning of privacy

Any number of events this year have made it clear that we need to think seriously about what privacy means. We can't agree with the people who say "There's no such thing as privacy, get over it." At the same time, insisting on privacy in stupidly rigid ways will paralyze the Internet and make it difficult, if not impossible, to explore new areas -- including healthcare, government, sharing, and community. As Tim O'Reilly has said, what's needed isn't legislation, but a social consensus on what should and should not be done with data: how much privacy is reasonably needed, and what forms of privacy we can do without. We're now in a position where solving those problems is not only possible, but necessary. I don't expect much progress toward a solution in the next year, but I do expect to see the meaning of "privacy" discussed seriously.

A few more things

What? No mobile? No HTML5? No JavaScript? Yes, they're certainly going to grow, but I see them as 2010's news. You don't get any points for predicting "Mobile is going to be big in 2011." Duh. I might hazard a guess that HTML5 will become an equal partner to native apps on mobile platforms -- there's a good chance of that, but I'm not convinced that will happen. I am convinced that JavaScript is the language to watch; in the last few years, it has ceased to be a glue language for HTML and come into its own. Node.js is just what was needed to catapult it from a bit player into a starring role.






December 09 2010

Turbulence in the cloud: proprietary clouds or free networks

Proprietary cloud computing puts power over data into the hands of a few.

Read more

November 11 2010

Copyright and P2P networks: heaven or hell for creators?

If the music industry had its way, ever stricter laws and ever new threat scenarios against file sharing would be needed.

Read more

July 16 2010

File sharing: cease-and-desist letters from Waldorf Rechtsanwälte

Recently an above-average number of file-sharing cease-and-desist letters from the Waldorf law firm have landed on my desk, although that may simply be coincidence. The share of films and music among them is roughly equal. In the film sector, Constantin Film in particular (most recently, for example, Pandorum and Jenseits der Angst) stands out for its brisk warning-letter activity, while the large international film rights holders continue to hold back. Smaller (quality) productions such as "Der Knochenmann" (Majestic Filmverleih) are also currently the subject of warning letters. In the music sector, Waldorf is currently sending many warning letters on behalf of Sony Music, among them works by the artists Alicia Keys and AC/DC.

The situation for those on the receiving end of warning letters has by no means improved since the BGH decision, even if the letters occasionally hit 80-year-old pensioners who credibly assure me that they are not AC/DC fans.

May 18 2010

A social network that gives users control over their own data: four students want to take on the US market leader Facebook - and are being showered with money online for doing so.
Studentenprojekt Diaspora - Vier gegen Facebook - Computer - sueddeutsche.de
Reposted from Rollo

April 06 2010

DC Circuit court rules in Comcast case, leaves the FCC a job to do

Today's ruling in Comcast v. FCC will certainly change the
terms of debate over network neutrality, but the win for Comcast is
not as far-reaching as headlines make it appear. The DC Circuit court
didn't say, "You folks at the Federal Communications Commission have
no right to tell any Internet provider what to do without
Congressional approval." It said, rather, "You folks at the FCC didn't
make good arguments to prove that your rights extend to stopping
Comcast's particular behavior."

I am not a lawyer, but to say what happens next will take less of a
lawyer than a fortune-teller. I wouldn't presume to say whether the
FCC can fight Comcast again over the BitTorrent issue. But the court
left it open for the FCC to try other actions to enforce rules on
Internet operators. Ultimately, I think the FCC should take a hint
from the court and stop trying to regulate the actions of telephone
and cable companies at the IP layer. The hint is to regulate them at
the level where the FCC has more authority--on the physical level,
where telephone companies are regulated as common carriers and cable
companies have requirements to the public as well.




The court noted (on pages 30 through 34 of its order, http://pacer.cadc.uscourts.gov/common/opinions/201004/08-1291-1238302.pdf) that the FCC missed out on the chance to make certain arguments that the court might have looked on more favorably. Personally, and as an amateur, I think those arguments would be weak
anyway. For instance, the FCC has the right to regulate activities
that affect rates. VoIP can affect phone rates and video downloads
over the Internet can affect cable charges for movies. So the FCC
could try to find an excuse to regulate the Internet. But I wouldn't
be the one to make that excuse.

The really significant message to the FCC comes on pages 30 and 32.
The court claims that any previous court rulings that give power to
the FCC to regulate the Internet (notably the famous Brand X decision)
are based on its historical right to regulate common carriers (e.g.,
telephone companies) and broadcasters. Practically speaking, this
gives the FCC a mandate to keep regulating the things that
matter--with an eye to creating a space for a better Internet and
high-speed digital networking (broadband).

Finding the right layer

Comcast v. FCC combines all the elements of a regulatory
thriller. First, the stakes are high: we're talking about who
controls the information that comes into our homes. Second, Comcast
wasn't being subtle in handling BitTorrent; its manipulations were
done with a conscious bias, carried out apparently arbitrarily (rather
than being based on corporate policy, it seems that a network
administrator made and implemented a personal decision), and were kept
secret until customers uncovered the behavior. If you had asked for a
case where an Internet provider said, "We can do anything the hell we
want regardless of any political, social, technical, moral, or
financial consequences," you'd choose something like Comcast's
impedance of BitTorrent.

And the court did not endorse that point of view. Contrary to many
headlines, the court affirmed that the FCC has the right to
regulate the Internet. Furthermore, the court acknowledged that
Congress gave the FCC the right to promote networking. But the FCC
must also observe limits.

The court went (cursorily in some cases) over the FCC's options for
regulating Comcast's behavior, and determined either that there was no
precedent for it or (I'm glossing over lots of technicalities here)
that the FCC had not properly entered those options into the case.

The FCC should still take steps to promote the spread of high-speed
networking, and to ensure that it is affordable by growing numbers of
people. But it must do so by regulating the lines, not what travels
over those lines.

As advocates for greater competition have been pointing out for
several years, the FCC fell down on that public obligation. Many trace
the lapse to the chairmanship of Bush appointee Michael Powell. And
it's true that he chose to try to get the big telephone and cable
companies to compete with each other (a duopoly situation) instead of
opening more of a space for small Internet providers. I cover this choice in a 2004 article. But it's not fair to say Powell had no interest in competition, nor is it historically accurate to say this was a major change in direction for the FCC.

From the beginning, when the 1996 telecom act told the FCC to promote
competition, implementation was flawed. The FCC chose 14 points in the
telephone network where companies had to allow interconnection (so
competitors could come on the network). But it missed at least one
crucial point. The independent Internet providers were already losing
the battle before Powell took over the reins at the FCC.

And the notion of letting two or three big companies duke it out
(mistrusting start-ups to make a difference) is embedded in the 1996
act itself.

Is it too late to make a change? We must hope not. Today's court
ruling should be a wake-up call; it's time to get back to regulating
things that the FCC actually can influence.

Comcast's traffic shaping did not change the networking industry. Nor
did it affect the availability of high-speed networks. It was a clumsy
reaction by a beleaguered company to a phenomenon it didn't really
understand. Historically, it will prove an oddity, and so will the
spat that network advocates started, catching the FCC in its snares.

The difficulty software layers add

The term "Internet" is used far too loosely. If you apply it to all
seven layers of the ISO networking model, it covers common carrier
lines regulated by the FCC (as well as cable lines, which are subject
to less regulation--but still some). But the FCC has historically
called the Internet a "service" that is separate from those lines.

Software blurs and perhaps even erases such neat distinctions. Comcast does not have to rewire its network or shut down switches to control it. All it has to do is configure a firewall. That's why stunts
like holding back BitTorrent traffic become networking issues and draw
interest from the FCC. But it also cautions against trying to regulate
what Comcast does, because it's hard to know when to stop. That's what
opponents of network neutrality say, and you can hear it in the court
ruling.

The fuzzy boundaries between software regulation and real-world activities bedevil other areas of policy as well. As sophisticated real-world processing moves from mechanical devices into software, inventors are encouraged to patent software innovations, a dilemma I explore in another article (http://radar.oreilly.com/archives/2007/09/three_vantage_p.html). And in the 1990s, courts argued over whether encryption was a process or a form of expression--and decided it was a form of expression.

Should the FCC wait for Congress to tell it what to do? I don't think
so. The DC Circuit court blocked one path, but it didn't tell the FCC
to turn back. It has a job to do, and it just has to find the right
tool for the job.


February 18 2010

Kerner explains it all: on "illegal file-sharing networks" and devaluation

History is being made – an old piece of wisdom, and true again and again. Viewers of the family channel Sat1 will tonight once again be treated to the talk show of superstar Johannes "Buddy" Kerner. The show is about music downloads: "Legal, illegal – not irrelevant!" Fine, it is important to treat complex topics simply so that many people can understand them, but one still should not work with deliberate untruths or misleading oversimplifications. In principle, anyway. Yet Kerner – at least according to the announcement text – seems to have made it his mission to do exactly that. Or has he?

The announcement says: "Making music available free of charge and downloading via illegal file-sharing networks on the Internet is not permitted, as it violates copyright law." Can't we just lock up these nasty "illegal file-sharing networks" and let only the legal ones roam free? I do find illegal technologies terribly mean.

On downloading, iRights.info says (perhaps a little more nuanced, but only perhaps): using file-sharing networks is not unlawful per se. On the contrary, they can be used for useful and entirely legal purposes. But much of what goes on in file-sharing networks is indeed prohibited. (…) Under current law, however, copies for private use may only be made "insofar as no obviously unlawfully produced or unlawfully made publicly available source is used for the reproduction." In plain terms, this means that files which were recognizably put online unlawfully may not be downloaded either. The legislature assumes that everyone knows, or has to know, that the film or music industry, for example, would not upload files to file-sharing networks. Where that is the case, downloading such files is not permitted. Despite this change in the law, it is still frequently doubtful whether a given source was put online in an "obviously unlawful" way, because many artists, authors and filmmakers – indeed even companies from the entertainment industry – increasingly use the Internet as a distribution medium. In some cases the rights holders themselves put their content into file-sharing networks. In such cases the files were obviously not put online unlawfully; rather, this happened lawfully, and such files may of course be downloaded. More information: Privatkopie und Co, part 3: Download – Tauschbörsen und offizielle Angebote. There is also said to be such a thing as music under a Creative Commons license or a GNU General Public License. But never mind.

Shortly afterwards it says: "Attention: parents are liable as subscribers for illegal offerings in file-sharing networks!" I will call my mother right away and ask her to cancel her Internet connection, lest she end up liable for all those BitTorrent offerings from the South Seas! Or maybe not? After all, this bold claim is backed by an expert – the entirely disinterested chairman of the German phonographic industry associations, Haentjes. He says: "Parents must always be aware that, as holders of an Internet connection, they are liable for copyright infringements committed by their children. And that could get very expensive!" Is that a reference to the music industry's unjustified mass cease-and-desist letters? So perhaps not call Mum after all.

And further: "Since many illegal downloaders do not see anything wrong in downloading MP3s for free, parents should educate their children that they are committing a criminal offence when they load music onto their computers and make it available to other music lovers." Perhaps with a pointer to iRights.info, so that the children learn the actual legal situation and are not buried in propaganda?

And of course the show is fully up to date and happily helps build up a threat scenario – after all, the realization of obsessive dreams is a precious thing: "In the future it is entirely possible that service providers will punish copyright infringements via music downloads by blocking or restricting Internet access." Perhaps someone should also explain to the children the constitutional principles of freedom of information? Or better not – they might educate themselves and end up resisting such plans.

And of course the significance of digitization is not overlooked either: "In retrospect, the invention of the MP3 format was the end of the music industry as we knew it." Correct. "Musical content was decoupled from the traditional carrier medium, the CD. At the same time, production and distribution costs are minimized. The digital sector plays an ever larger role and keeps gaining relevance. Compared with 2006, the number of legal music downloads rose by 53 percent in 2007, to a total of 1.7 billion units." Also correct. "The music industry has meanwhile understood that technological progress is to be seen as a growth opportunity for the sector rather than something to fight, and is working on strategies to exploit that progress profitably." Not quite so correct. The music industry has so far unfortunately failed to create offerings that are better and more consumer-friendly than what BitTorrent clients provide. And what is "rather than something to fight" supposed to mean? Perhaps I am simply misreading the announcement text with all its misrepresentations and threats.

Now it gets more accurate again: "Since the invention of the MP3 format, the music industry has let promising opportunities slip by unused. It likes to blame falling revenues on music piracy conducted in illegal peer-to-peer (P2P) file-sharing networks, and uses this as an explanation for its initial rejection of digitization – yet the music industry has had to deal with large-scale piracy since at least 1960." Well now, an insight!

How does it continue? "Shutting down a particular file-sharing network merely results in its users spreading to other services." Yes, true. And then: "A general rethink and the development of a sense of wrongdoing towards free illegal downloads of music tracks is needed. The music companies must create an attractive legal alternative for their customers in order to set their products apart from the illegal free downloads and convince consumers to buy." Persuasion and alternatives instead of punishments, threats and false figures – a good idea! This keeps getting better. Ah, I had not yet seen the next sentence. It reads: "The illegal file-sharing networks have devalued digital music, since all music tracks could be obtained for free." How mean! Devalued! But thankfully only "digital music" – I never could do much with that electronic stuff anyway. Or does that already mean something else again? Could it not simply be that users had no desire to buy the same music from the music industry a third time (cassette/vinyl, CD, and now MP3s)? Could it not also be that music-loving users keep buying music when they like it? I seem to have heard something along those lines. And there was something about interoperability and copy protection, too. Never mind – take note: digital music is devalued by file-sharing networks. And what do we do now? "The goal must be to give the digital music download a value again." Well, that is a relief. Goal, plan, path and success – everything is in good hands. More information about "illegal file-sharing networks" is, by the way, available from the "Bundesverband der Musikindustrie e.V." and from the information portal of "proMedia Gesellschaft zum Schutz geistigen Eigentums mbH".

Or from iRights.info. That would have the advantage of explanations based on the current legal situation, written for lay readers, free of propaganda, and combining cool analysis with warm practical tips. But only if you want that. Everyone else is cordially invited to believe Johannes B. Kerner's analysis of the current situation tonight. Or will it all turn out differently in the end, with JBK asking critical questions and setting things straight? We'll see. Until then.

January 14 2010

Innovation Battles Investment as FCC Road Show Returns to Cambridge

Opponents can shed their rhetoric and reveal new depths to their
thought when you bring them together for rapid-fire exchanges,
sometimes with their faces literally inches away from each other. That
made it worth my while to truck down to the MIT Media Lab for
yesterday's Workshop on Innovation, Investment and the Open Internet (http://hraunfoss.fcc.gov/edocs_public/attachmatch/DOC-295521A1.pdf), sponsored by the
Federal Communications Commission. In this article I'll cover:

Context and background

The FCC kicked off its country-wide hearing campaign almost two years
ago with a meeting at Harvard Law School, which quickly went wild. I
covered the experience in one article (http://radar.oreilly.com/archives/2008/02/network-neutrality-how-the-fcc.html) and the unstated agendas in another (http://radar.oreilly.com/archives/2008/02/network-neutrality-code-words.html). With a star cast and an introduction
by the head of the House's Subcommittee on Telecommunications and the
Internet, Ed Markey, the meeting took on such a cachet that the
public flocked to the lecture hall, only to find it filled because
Comcast recruited people off the street to pack the seats and keep
network neutrality proponents from attending. (They had an overflow
room instead.)

I therefore took pains to arrive at the Media Lab's Bartos Theater
early yesterday, but found it unnecessary. Even though Tim Berners-Lee
spoke, along with well-known experts across the industry, only 175
people turned up, in my estimation (I'm not an expert at counting
crowds). I also noticed that the meeting wasn't worth a
mention today in the Boston Globe.

Perhaps it was the calamitous earthquake yesterday in Haiti, or the
bad economy, or the failure of the Copenhagen summit to solve the
worst crisis ever facing humanity, or concern over three wars the US
is involved in (if you count Yemen), or just fatigue, but it seems
that not as many people are concerned with network neutrality as two
years ago. I recognized several people in the audience yesterday and
surmised that the FCC could have picked out a dozen people at random
from their seats, instead of the parade of national experts on the
panel, and still have led a pretty darned good discussion.

And network neutrality is definitely the greased pig everyone is
sliding around. There are hundreds of things one could discuss
in the context of innovation and investment, but various political
forces ranging from large companies (AT&T versus Google) to highly
visible political campaigners (Huffington Post) have made network
neutrality the agenda. The FCC gave several of the movement's leaders
rein to speak, but perhaps signaled its direction by sending Meredith
Attwell Baker as the commissioner in attendance.

In contrast to FCC chair Julius Genachowski, who publicly calls for
network neutrality (a position also taken by Barack Obama during his presidential campaign, http://www.barackobama.com/issues/technology/index_campaign.php#open-internet), Baker has traditionally
espoused a free-market stance. She opened the talks yesterday by
announcing that she is "unconvinced there is a problem" and posing the
question: "Is it broken?" I'll provide my own opinion later in this
article.

Two kinds of investment

Investment is the handmaiden, if not the inseminator, of innovation.
Despite a few spectacular successes, like the invention of Linux and
Apache, most new ideas require funding. Even Linux and Apache are
represented now by foundations backed by huge companies.

So why did I title this article "Innovation Battles Investment"?
Because investment happens at every level of the Internet, from the
cables and cell towers up to the applications you load on your cell
phone.

Here I'll pause to highlight an incredible paradigm shift that was
visible at this meeting--a shift so conclusive that no one mentioned
it. Are you old enough to remember the tussle between "voice" and
"data" on telephone lines? Remember the predictions that data would
grow in importance at the expense of voice (meaning Plain Old
Telephone Service) and the milestones celebrated in the trade press when
data pulled ahead of voice?

Well, at the hearing yesterday, the term "Internet" was used to cover
the whole communications infrastructure, including wires and cell
phone service. This is a mental breakthrough all its own, and one
I'll call the Triumph of the Singularity.

But different levels of infrastructure benefit from different
incentives. I found that all the participants danced around this.
Innovation and investment at the infrastructure level got short shrift
from the network neutrality advocates, whether in the bedtime story
version delivered by Barbara van Schewick or the deliberately
intimidating, breakneck overview by economist Shane Greenstein, who
defined openness as "transparency and consistency to facilitate
communication between different partners in an independent value
chain."

You can explore his papers on your own (http://www.kellogg.northwestern.edu/faculty/greenstein/images/articles.html), but I took this to mean, more or less, that everybody sharing a platform should broadcast their intentions and apprise everybody else of their plans, so that others can make the most rational decisions and invest wisely. Greenstein realized, of course, that firms have little incentive to share their strategies. He said that
communication was "costly," which I take as a reference not to an expenditure of
money but to a surrender of control and relinquishing of opportunities.

This is just what the cable and phone companies are not going to do.
Dot-com innovator Jeffrey Glueck, founder of Skyfire (http://www.skyfire.com/), would like the FCC to
require ISPs to give application providers and users at least 60 to 90
days notice before making any changes to how they treat traffic. This
is absurd in an environment where bad actors require responses within
a few seconds and the victory goes to the router administrators with
the most creative coping strategy. Sometimes network users just have
to trust their administrators to do the best thing for them. Network
neutrality becomes a political and ethical issue when administrators
don't. But I'll return to this point later.

The pocket protector crowd versus the bean counters

If the network neutrality advocates could be accused of trying to
emasculate the providers, advocates for network provider prerogative
were guilty of taking the "Trust us" doctrine too far. For me, the
best part of yesterday's panel was how it revealed the deep gap that
still exists between those with an engineering point of view and those
with a traditional business point of view.

The engineers, led by Internet designer David Clark, repeated the
mantra of user control of quality of service, the vehicle for this
being the QoS field added to the IP packet header. Van Schewick
postulated a situation where a user increases the QoS on one session
because they're interviewing for a job over the Internet, then reduces
the QoS to chat with a friend.
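What "the user increases the QoS on one session" means at the packet level is marking traffic, for example via the ToS/DSCP byte in the IP header. The short Python sketch below shows an application marking its own datagrams; the DSCP value is just an example, and whether the network honors the marking is exactly the policy question the panel was arguing over.

```python
# An application marking its own traffic with a DSCP value via the IP
# ToS byte (available where the OS exposes IP_TOS). Routers and ISPs may
# or may not honor the marking -- which is the policy question at issue.
# The value below (Expedited Forwarding, DSCP 46) is only an example.
import socket

EF_DSCP = 46                      # Expedited Forwarding code point
TOS_VALUE = EF_DSCP << 2          # DSCP occupies the top six bits of the ToS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)

# From here on, datagrams sent on this socket carry the marking, e.g. for
# an interactive session the user has chosen to prioritize. The address
# below is a documentation/example address, not a real endpoint.
sock.sendto(b"interview audio frame", ("192.0.2.10", 5004))
```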

In the rosy world envisioned by the engineers, we would deal not with
the physical reality of a shared network with our neighbors, all
converging into a backhaul running from our ISP to its peers, but with
the logical mechanism of a limited, dedicated bandwidth pipe (former
senator Ted Stevens can enjoy his revenge) that we would spend our
time tweaking. One moment we're increasing the allocation for file
transfer so we can upload a spreadsheet to our work site; the next
moment we're privileging the port we use for a massively multiplayer game.

The practicality of such a network service is open to question. Glueck
pointed out that users are unlikely ever to ask for lower quality of
service (although this is precisely the model that Internet experts
have converged on, as I report in my 2002 article "A Nice Way to Get Network Quality of Service?" at http://www.oreillynet.com/pub/a/network/2002/06/11/platform.html). He recommends
simple tiers of service--already in effect at many providers--so that
someone who wants to carry out a lot of P2P file transfers or
high-definition video conferencing can just pay for it.

In contrast, network providers want all the control. Much was made
during the panel of a remark by Marcus Weldon of Alcatel-Lucent in
support of letting the providers shape traffic. He pointed out that
video teleconferencing over the fantastically popular Skype delivered
unappealing results over today's best-effort Internet delivery, and
suggested a scenario where the provider gives the user a dialog box
where the user could increase the QoS for Skype in order to enjoy the
video experience.

Others on the panel legitimately flagged this comment as a classic
illustration of the problem with providers' traffic shaping: the
provider would negotiate with a few popular services such as Skype
(which boasts tens of millions of users online whenever you log in)
and leave innovative young services to fend for themselves in a
best-effort environment.

But the providers can't see doing quality of service any other way.
Their business model has always been predicated on designing services
around known costs, risks, and opportunities. Before they roll out a
service, they need to justify its long-term prospects and reserve
control over it for further tweaking. If the pocket protector crowd in
Internet standards could present their vision to the providers in a
way that showed them the benefits they'd accrue from openness
(presumably by creating a bigger pie), we might have progress. But the
providers fear, above all else, being reduced to a commodity. I'll pick up
this theme in the next section.

Is network competition over?

Law professor Christopher S. Yoo is probably the most frequently heard
of the academics who favor network provider prerogatives (though not at
this panel, unfortunately, where he was given only a few minutes). He suggested
that competition was changing, and therefore requiring a different
approach to providers' funding models, from the Internet we knew in
the 1990s. Emerging markets (where growth comes mostly from signing up
new customers) differ from saturated markets (where growth comes
mainly from wooing away your competitors' customers). With 70% of
households using cable or fiber broadband offerings, he suggested the
U.S. market was getting saturated, or mature.

Well, only if you accept that current providers' policies will stifle
growth. What looks like saturation to an academic in the U.S. telecom
field looks like a state of primitive underinvestment to people who
enjoy lightning-speed service in other developed nations.

But Yoo's assertion makes us pause for a moment to consider the
implications of a mature network. When change becomes predictable and
slow, and an infrastructure is a public good--as I think everyone
would agree the Internet is--it becomes a candidate for government
takeover. Indeed, there have been calls for various forms of
government control of our network infrastructure. In some places this
is actually happening, as cities and towns create their own networks.
A related proposal is to rigidly separate the physical infrastructure
from the services, barring companies that provide the physical
infrastructure from offering services (and therefore presumably
relegating them to a maintenance role--a company in that position
wouldn't have much incentive to take on literally ground-breaking new
projects).

Such government interventions are politically inconceivable in the
United States. Furthermore, experience in other developed nations with
more successful networks shows that such intervention is unnecessary.

No one can doubt that we need a massive investment in new
infrastructure if we want to use the Internet as flexibly and
powerfully as our trading partners. But there was disagreement yesterday about
how much of an effort the investment will take, and where it will come
from.

Yoo argued that a mature market requires investment to come from
operating expenditures (i.e., charging users more money, which
presumably is justified by discriminating against some traffic in
order to offer enhanced services at a premium) instead of capital
expenditures. But Clark believes that current operating expenditures
would permit adequate growth. He anticipated a rise in Internet access
charges of $20 a month, which could fund the added bandwidth we need
to reach the Internet speeds of advanced countries. In exchange for
paying that extra $20 per month, we would enjoy all the content we
want without paying cable TV fees.

The current understanding by providers is that usage is rising
"exponentially" (whatever that means--they don't say what the exponent
is) whereas charges are rising slowly. Following some charts from
Alcatel-Lucent's Weldon that showed profits disappearing entirely in a
couple years--a victim of the squeeze between rising usage and slow
income growth--Van Schewick challenged him, arguing that providers can
enjoy lower bandwidth costs to the tune of 30% per year. But Weldon
pointed out that the only costs going down are equipment, and claimed
that after a large initial drop caused by any disruptive new
technology, costs of equipment decrease only 10% per year.
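
To see why those two figures lead to such different conclusions, here is a
back-of-the-envelope model in Python. The traffic growth rate is my own
assumption, since the providers never state their exponent; the 30% and 10%
declines are the figures van Schewick and Weldon traded. With the optimistic
decline, spending on carriage stays roughly flat; with the pessimistic one, it
roughly triples over five years, which is the squeeze Weldon's charts
dramatized.

    # Back-of-the-envelope model of the "squeeze": traffic grows while the
    # cost of carrying each unit of traffic falls. All parameters are
    # illustrative assumptions, not data presented at the panel.
    TRAFFIC_GROWTH = 0.40         # assumed 40% more traffic per year
    DECLINE_VAN_SCHEWICK = 0.30   # bandwidth costs fall ~30% per year
    DECLINE_WELDON = 0.10         # equipment costs fall only ~10% per year

    def relative_spend(years, growth, decline):
        """Spending needed to carry the traffic, relative to year zero."""
        return (1 + growth) ** years * (1 - decline) ** years

    for years in (1, 3, 5):
        print(years,
              round(relative_spend(years, TRAFFIC_GROWTH, DECLINE_VAN_SCHEWICK), 2),
              round(relative_spend(years, TRAFFIC_GROWTH, DECLINE_WELDON), 2))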

Everyone agreed that mobile, the most exciting and
innovation-supporting market, is expensive to provide and suffering an
investment crisis. It is also the least open part of the Internet and
the part most dependent on legacy pricing (high voice and SMS
charges), deviating from the Triumph of the Singularity.

So the Internet is like health care in the U.S.: in worse shape than
it appears. We have to do something to address rising
usage--investment in new infrastructure as well as new
applications--just as we have to lower health care costs that have
surpassed 17% of the gross domestic product.

Weldon's vision--a rosy one in its own way, complementing the
user-friendly pipe I presented earlier from the engineers--is that
providers remain free to control the speeds of different Internet
streams and strike deals with anyone they want. He presented provider
prerogatives as simple extensions of what already happens now, where
large companies create private networks where they can impose QoS on
their users, and major web sites contract with content delivery
networks such as Akamai (represented at yesterday's panel by lawyer
Aaron Abola) to host their content for faster response time. Susie Kim
Riley of Camiant testified that European providers are offering
differentiated services already, and making money by doing so.

What Weldon and Riley left out is what I documented in A Nice Way to Get
Network Quality of Service?
(http://www.oreillynet.com/pub/a/network/2002/06/11/platform.html): managed networks
providing QoS are not the Internet. Attempts to provide QoS over the
Internet--by getting different providers to cooperate in privileging
certain traffic--have foundered. The technical problems may be
surmountable, but no one has figured out how to build trust and to design
adequate payment models that would motivate providers to cooperate.

It's possible, as Weldon asserts, that providers allowed to manage
their networks would invest in infrastructure that would ultimately
improve the experience for all sites--those delivered over the
Internet by best-effort methods as well as those striking deals. But
the change would still represent increased privatization of the public
Internet. It would create what application developers such as Glueck
and Nabeel Hyatt of Conduit Labs fear most: a thousand different
networks with different rules that have to be negotiated with
individually. And new risks and costs would be placed in the way of
the disruptive innovators we've enjoyed on the Internet.

Competition, not network neutrality, is actually the key issue facing
the FCC, and it was central to their Internet discussions in the years
following the 1996 Telecom Act. For the first five years or so, the
FCC took seriously a commitment to support new entrants by such
strategies as requiring incumbent companies to allow interconnection.
Then, especially under Michael Powell, the FCC did an about-face.

The question posed during this period was: what leads to greater
investment and growth--letting a few big incumbents enter each other's
markets, or promoting a horde of new, small entrants? It's pretty
clear that in the short term, the former is more effective because the
incumbents have resources to throw at the problem, but that in the
long term, the latter is required in order to find new solutions and
fix problems by working around them in creative ways.

Yet the FCC took the former route, starting in the early 2000s. They
explicitly made a deal with incumbents: build more infrastructure, and
we'll relax competition rules so you don't have to share it with other
companies.

Starting a telecom firm is hard, so it's not clear that pursuing the
other route would have saved us from the impasse we're in today. But a
lack of competition is integral to our problems--including the one
being fought out in the field of "network neutrality."

All the network neutrality advocates I've talked to wish that we had
more competition at the infrastructure level, because then we could
rely on competition to discipline providers instead of trying to
regulate such discipline. I covered this dilemma in a 2006 article, href="http://lxer.com/module/newswire/view/53907/">Network Neutrality
and an Internet with Vision. But somehow, this kind of competition
is now off the FCC agenda. Even in the mobile space, they offer
spectrum through auctions that permit the huge incumbents to gather up
the best bands. These incumbents then sit on spectrum without doing
anything, a strategy known as "foreclosure" (because it forecloses
competitors from doing something useful with it).

Because everybody goes off in their own direction, the situation pits
two groups that should be cooperating against each other: small ISPs and
proponents of an open Internet.

What to regulate

Amy Tykeson, CEO of a small Oregon Internet provider named
BendBroadband, forcefully presented the view of an independent
provider, similar to the more familiar imprecations by Brett Glass of
Lariat (http://www.brettglass.com/). In their
world--characterized by paper-thin margins, precarious deals with
back-end providers, and the constant pressure to provide superb
customer service--flexible traffic management is critical and network
neutrality is viewed as a straitjacket.

I agree that many advocates of network neutrality have oversimplified
the workings of the Internet and downplayed the day-to-day
requirements of administrators. In contrast, as I have shown, large
network providers have overstepped their boundaries. But to end this
article on a positive note (you see, I'm trying) I'll report that the
lively exchange did produce some common ground and a glimmer of hope
for resolving the differing positions.

First, in an exchange between Berners-Lee and van Schewick on the
pro-regulatory side and Riley on the anti-regulatory side, a more
nuanced view of non-discrimination and quality of service emerged.
Everybody on the panel offered vociferous exclamations in support of the position that it was
unfair discrimination for a network provider to prevent a user from
getting legal content or to promote one web site over a competing web
site. And this is a major achievement, because those are precisely
the practices that providers like AT&T and Verizon claim the
right to do--the practices that spawned the current network neutrality
controversy.

To complement this consensus, the network neutrality folks approved
the concept of quality of service, so long as it was used to improve
the user experience instead of to let network providers pick winners.
In a context where some network neutrality advocates have made QoS a
dirty word, I see progress.

This raises the question of what counts as regulation. The traffic shaping
policies and business deals proposed by AT&T and Verizon are a
form of regulation. They claim the same privilege that large
corporations--we could look at health care again--have repeatedly
tried to claim when they invoke the "free market": the right of
corporations to impose their own regulations.

Berners-Lee and others would like the government to step in and issue
regulations that suppress the corporate regulations. A wide range of
wording has been proposed for the FCC's consideration. Commissioner
Baker asked whether, given the international reach of the Internet,
the FCC should regulate at all. Van Schewick quite properly responded
that the abuses carried out by providers are at the local level and
therefore can be controlled by the government.

Two traits of a market are key to innovation, and came up over and
over yesterday among dot-com founders and funders (represented by Ajay
Agarwal of Bain Capital) alike: a level playing field, and
light-handed regulation.

Sometimes, as Berners-Lee pointed out, government regulation is
required to level the playing field. The transparency and consistency
cited by Greenstein and others are key features of the level playing
field. And as I pointed out, a vacuum in government regulation is
often filled by even more onerous regulation by large corporations.

One of the most intriguing suggestions of the day came from Clark, who
elliptically suggested that the FCC provide "facilitation, not
regulation." I take this to mean the kind of process that Comcast and
BitTorrent went through, which Sally Shipman Wentworth of ISOC
boasted about in her opening remarks. Working with the IETF (which she
said created two new working groups to deal with the problem), Comcast
and BitTorrent worked out a protocol that should reduce the load of
P2P file sharing on networks and end up being a win-win for everybody.
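
The IETF work in question produced, among other things, a delay-based
approach (LEDBAT) whose core idea fits in a few lines: a bulk-transfer sender
measures queueing delay and backs off before it fills the bottleneck, so P2P
traffic yields to interactive use. The Python sketch below is an illustration
of that idea with made-up constants, not the standardized algorithm or
anything BitTorrent actually ships.

    # Illustrative sketch of delay-based backoff in the spirit of LEDBAT:
    # a bulk-transfer sender grows its window while queueing delay is low and
    # shrinks it as delay approaches a target, so file sharing yields to
    # interactive traffic. Constants are assumptions, not the RFC values.
    TARGET_DELAY = 0.100   # seconds of queueing delay tolerated
    GAIN = 1.0             # how strongly the window reacts

    def update_window(cwnd, base_delay, current_delay, bytes_acked, mss):
        """One congestion-window update per batch of acknowledged bytes."""
        queueing_delay = current_delay - base_delay
        off_target = (TARGET_DELAY - queueing_delay) / TARGET_DELAY
        cwnd += GAIN * off_target * bytes_acked * mss / cwnd
        return max(cwnd, mss)   # never fall below one segment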

But there are several ways to interpret this history. To free market
ideologues, the Comcast/BitTorrent collaboration shows that private
actors on the Internet can exploit its infinite extendibility to find
their own solutions without government meddling. Free market
proponents also call on antitrust law to hold back abuses. But
those calling for parental controls would claim that Comcast wanted
nothing to do with BitTorrent and started to work on technical
solutions only after getting tired of the feces being thrown its way
by outsiders, including the FCC.

And in any case--as panelists pointed out--the IETF has no enforcement
power. The presence of a superior protocol doesn't guarantee that
developers and users will adopt it, or that network providers will
allow traffic that could be a threat to their business models.

The FCC at Harvard, which I mentioned at the beginning of this
article, promised intervention in the market to preserve Internet
freedom. What we got after that (as I predicted) was a slap on
Comcast's wrist and no clear sense of direction. The continued
involvement of the FCC--including these public forums, which I find
educational--shows, along with the appointment of the more
interventionist Genachowski and the mandate to promote broadband in
the American Recovery and Reinvestment Act, that it can't step away
from the questions of competition and investment.
