Net Neutrality Heats Up in Europe

The net neutrality debate isn’t quite over in the US, as we learned today when true believers pushed back against T-Mobile’s offer of a year of uncapped data to Pokémon GO players. T-Mob’s offer makes sense, of course, as everybody who can is climbing aboard the Pokémon bandwagon: even Yelp will now let you filter search results by PokéStop distance.

Yelp filters by PokéStop distance

But no good deed goes unpunished.

In Europe, the Body of European Regulators for Electronic Communications (BEREC) Board of Regulators is conducting a consultation on the detailed guidelines it issues to National Regulatory Authorities (NRAs), which must harmonize the commandments from Brussels with national law. The consultation is, quite frankly, a mess. The basic problem regulators all over the world have with net neutrality is decoding what the term means: picking it apart, then choosing among the parts for elements that have political appeal without being too damaging to the Internet.

At the core, net neutrality is simply an attempt to preserve the web’s status as the pre-eminent Internet application. About 75% of Internet users (judges included) aren’t aware of any difference between the web and the Internet, and this is the pool from which net neutrality advocates are drawn. The web’s nominal inventor, Sir Tim Berners-Lee, joined American professor/advocates Larry Lessig and Barbara van Schewick in an open letter urging BEREC to keep non-web applications in the dog house by banning transmission models that would benefit them.

The letter hits the familiar memes: it denounces those pesky fast lanes, attacks zero-rating plans like the one T-Mobile offers Pokémon GO users, pans application-oriented traffic management, and condemns “specialized services” that run over carrier broadband facilities but don’t actually touch the web. True to form, the open letter accuses carriers of extracting monopoly rents from consumers by charging for specialized services since, in the signers’ minds, congestion would never happen on the Internet if the carriers simply supplied adequate interconnection capacity.

This is wrong, of course. If it were true, there would be no business proposition for CDNs such as Akamai. CDNs are fast lanes for content-based applications such as the web, and it turns out that communication-based applications such as gaming and conferencing need fast lanes too. This all comes down to the fact that the Internet is an application that runs on statistical networks, where service quality depends on how heavily loaded the networks are during any given fraction of a second. The time it takes to get a packet of information from any given web server to any given browser can vary from 50 milliseconds (50 thousandths of a second) to a full second without the user perceiving the network to be congested. Voice delivery times vary as well, but a packet delayed longer than 150 ms misses its playout deadline and is heard as a pop or click. So each application has its own requirements, and the web is just one application among many.
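To see how one and the same delay distribution can satisfy one application while breaking another, here’s a minimal sketch in Python. The delay model and the per-application budgets are my illustrative assumptions drawn from the figures above, not measurements of any real network:

```python
import random

# Illustrative per-application delay budgets (ms), from the text above:
# a web exchange tolerates up to a full second, while a voice packet
# that misses its ~150 ms playout deadline is heard as a pop or click.
BUDGET_MS = {"web": 1000.0, "voice": 150.0}

def one_way_delay_ms():
    """Crude stand-in for statistical network delay: usually near 50 ms,
    with occasional load-driven spikes toward a full second."""
    base = random.gauss(50, 10)
    spike = random.expovariate(1 / 100) if random.random() < 0.05 else 0.0
    return max(1.0, base + spike)

random.seed(1)
delays = [one_way_delay_ms() for _ in range(10_000)]
for app, budget in BUDGET_MS.items():
    late = sum(d > budget for d in delays)
    print(f"{app:>5}: {late / len(delays):.2%} of packets miss the {budget:.0f} ms budget")
```

On this toy model the web user misses essentially nothing, while a small but audible fraction of voice packets arrive too late; that asymmetry is the whole reason per-application handling, and fast lanes, exist at all.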

Martin Geddes explains this in a detailed submission to BEREC that’s a lot more humble and a lot better informed than the web supremacists’ open letter. Geddes (whose work has appeared in these pages here, here, and here) argues quite forcefully that BEREC needs to close the gap between its view of the Internet and the way the Internet actually works:

The guidelines directly contradict and ignore credible research from one NRA (i.e. Ofcom) on traffic management detection. In my view they are not technically implementable, as no fit-for-purpose monitoring system exists. To continue to pretend otherwise is to harm users, service providers and application developers.

Even more concerning, they appear to place members of professional engineering bodies and NRAs in conflict with their duty to protect the public. I am particularly concerned about sections #96 to #123 on “specialised services”, which have very poor technical merit. They appear to be unethical, placing users and enterprises at direct risk of economic harm…

The number of internal contradictions and technical errors within these rules suggests they have not been given proper technical or economic scrutiny. The quality of your advisors seems to be inadequate for the task. Indeed, the weak understanding of networks on which these guidelines have been built has resulted in a flawed consultation.

I consider this as an inappropriate use of industry resources. It is not the job of industry members to provide free scientific education through this retrospective mechanism. In my view this process has seriously damaged BEREC’s credibility and legitimacy as a regulatory body. At this sensitive time for all European institutions, few can welcome such a development.

BEREC draws Geddes’ ire by following a dated view that NRAs can ignore the statistical character of the Internet and demand a “base level of quality” for the web as a precondition for the sale of better-than-web services suited to non-web applications.

One might easily assume BEREC has things pretty much right because the regulators are drawing criticism from both sides. But that assumption is faulty: one side consists of people who understand and care about network engineering, and the other (the web supremacist group) doesn’t.

Net neutrality is flawed in two ways, and Geddes is laser-focused on one of them: it hampers the prospects for non-web uses of the Internet. While the Internet has been closely bound to the web for as long as most people have been using it, its design goals go far beyond the web we have today.

Sadly, the net neutrality regime doesn’t even do a good job of protecting innovation and consumer choice on the web itself.

Geddes touches on this issue in his detailed responses to some of BEREC’s proposed rules:

#43 and #44: You state “…whether it is the ISP that picks winners and losers…” and “Each of these factors may contribute to a material reduction in end-user choice…”

These statements are contradictory. As such, the determination to ban ISPs from being able to “pick winners and losers” cannot be sustained on the basis proposed.

For instance, one ISP might specialise in servicing the needs of users of Apple devices, and give preferential performance to iCloud applications; another to Microsoft users and to their Azure/Office365 applications. This differentiation and diversity would result in a material increase in end-user choice over a homogeneous ISP market with “one-size-fits-all” services.

The underlying false assumption is that any expression of intention by ISPs is by necessity aimed at rent extraction. In competitive markets (e.g. mobile in most EU countries, retail fixed broadband here in the UK), such practises are more likely to be a response to demand for more efficient and effective network resource allocation.

The BEREC supposition that a “baseline level of service” should be a precondition to specialized services is squarely at odds with the market structure of Internet services in the UK and, to a lesser extent, in the rest of the EU. The UK provides for a meaningful level of competition between ISPs by unbundling access to physical networks, wires, and basic digital communication services. This market structure is supposed to enable ISPs to pursue different business models such as the ones cited.

BEREC is oddly applying rhetoric from the US net neutrality debate to a market with a fundamentally different structure. Net neutrality is, after all, an attempt to discipline ISP behavior in a market in which unbundling is not operative. Tim Wu’s seminal 2003 paper, Network Neutrality, Broadband Discrimination, was a reaction to the FCC’s decision to forego unbundling of the DSL and cable networks in favor of facilities-based competition between DSL and cable. This wasn’t an option in Europe because cable TV was not widely deployed outside of Scandinavia, the Low Countries, and Great Britain. Wu claimed that unbundling (AKA “open access”) was inferior to net neutrality in any event:

If we agree with the normative goal of network neutrality, to what degree does the structural remedy of open access actually serve its interest? Might we do better by targeting network neutrality directly with questions of broadband discrimination?

I believe there are several reasons to question the fit between open-access remedies and network neutrality. First, the concept of network neutrality is not as simple as some IP partisans have suggested. Neutrality, as a concept, is finicky, and depends entirely on what set of subjects you choose to be neutral among. A policy that appears neutral in a certain time period, like “all men may vote”, may lose its neutrality in a later time period, when the range of subjects is enlarged.

This problem afflicts the network neutrality embodied in the IP protocols. As the universe of applications has grown, the original conception of IP neutrality has dated: for IP was only neutral among data applications. Internet networks tend to favor, as a class, applications insensitive to latency (delay) or jitter (signal distortion). Consider that it doesn’t matter whether an email arrives now or a few milliseconds later. But it certainly matters for applications that want to carry voice or video. In a universe of applications, that includes both latency-sensitive and insensitive applications, it is difficult to regard the IP suite as truly neutral as among all applications. [pages 148-9]

So net neutrality was motivated by the realization that broadband networks do more than simply provide the web access that “IP neutrality” assumed: they also support non-web applications such as the gaming and conferencing applications I’ve mentioned, and they provide home telephone and TV services as well.
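Wu’s point about latency sensitivity is easy to demonstrate. Here is a minimal sketch (my construction, with assumed numbers, not anything taken from Wu’s paper) of a single “neutral” first-in-first-out link shared by an email burst and a voice stream:

```python
SERVICE_MS = 1           # shared link transmits one packet per millisecond
VOICE_DEADLINE_MS = 150  # assumed playout deadline, per the earlier discussion

# Workload: a 400-packet email burst lands at t=0 while a voice stream
# sends one packet every 20 ms. "Neutral" best-effort FIFO treatment:
# every packet, whatever its class, simply waits its turn.
arrivals = [(0, "email")] * 400 + [(t, "voice") for t in range(0, 1000, 20)]

now = late = total = 0
for arrive_t, kind in sorted(arrivals):
    now = max(now, arrive_t) + SERVICE_MS  # FIFO finish time for this packet
    if kind == "voice":
        total += 1
        late += (now - arrive_t) > VOICE_DEADLINE_MS

print(f"voice packets audibly late: {late}/{total}")
# The email is delivered intact either way; only the voice stream suffers.
```

The identical treatment leaves the email unharmed, since it doesn’t care when each packet lands, while the voice packets queued behind the burst miss their deadlines. That is exactly what Wu means by IP being “only neutral among data applications.”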

So we can’t determine how to regulate “neutrality” until we answer the question of which applications we want to be neutral among.

For Berners-Lee, Lessig, and van Schewick this is a simple question: we want to be neutral among large web sites such as Netflix, YouTube, and Facebook. For engineers and network strategists such as Geddes, it’s a much harder and broader issue.

To understand how much broader the question really is, review the second half of our podcast with Geddes’ colleague Neil Davies on the report he wrote for Ofcom, the UK NRA.