Toward a Better Open Internet Order
The MIT Technology Review has published an op-ed I wrote about the net neutrality controversy, The Neutrality Delusion. I argue that net neutrality is an important aspiration that still lacks clear definition.
I also point out that the obsession with net neutrality has taken policy wonks’ eyes off the ball for the last 15 years, effectively pre-empting discussions about much more important issues such as privacy and security. Most of the comments – 120 at last count – are positive, but not all by any means.
Some new developments since I wrote the article make the arguments even stronger.
The 2015 Open Internet Order is essentially voluntary
The DC Circuit opinion denying an en banc rehearing of the order contains a very revealing admission by the two judges who ruled in the FCC’s favor last year. As AT&T’s Hank Hultquist explains, the order doesn’t apply to all ISPs:
According to the concurrence, which was written by Judges Sri Srinivasan and David S. Tatel (the same judges who wrote the underlying decision btw), “the net neutrality rule applies only to ‘those broadband providers who hold themselves out as neutral, indiscriminate conduits’ to any content of a subscriber’s own choosing,” (quoting the underlying decision). The concurrence goes on to say, “the rule does not apply to an ISP holding itself out as providing something other than a neutral, indiscriminate pathway – i.e., an ISP making sufficiently clear that it provides a filtered service involving the ISP’s exercise of editorial discretion.”
This essentially means that any ISP that doesn’t want to abide by the regulations can simply change its terms of service to clarify that it doesn’t offer a neutral, unfiltered service. Since no ISP is absolutely neutral, one could argue that all could avail themselves of the loophole.
No ISP is absolutely neutral
Some come close by disclosing fairly broad network management practices: Comcast’s Fair Share, for example, is a protocol-agnostic system that deals with congestion by slowing down bandwidth hogs more than ordinary users.
Fair Share was disclosed to the IETF in RFC 6057. The system is fair, but it’s not purely neutral. No FCC has ever made an issue of Fair Share, even though the system it replaced was skewered by the Kevin Martin commission.
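For readers who want the flavor of how a protocol-agnostic scheme works, here’s a minimal sketch of the two-step logic RFC 6057 describes: detect a congested port, then deprioritize – not block – the heaviest recent users until their usage subsides. All thresholds and class names below are illustrative, not Comcast’s production values.

```python
# Sketch of a protocol-agnostic congestion manager in the style of
# RFC 6057 ("Comcast's Protocol-Agnostic Congestion Management System").
# Thresholds and class names are illustrative, not production values.

from dataclasses import dataclass

PORT_CONGESTION_THRESHOLD = 0.80   # fraction of port capacity in use
USER_CONSUMPTION_THRESHOLD = 0.70  # fraction of user's provisioned rate
PRIORITY_NORMAL = "PBE"            # Priority Best Effort (default)
PRIORITY_DEPRIORITIZED = "BE"      # Best Effort (served after PBE)

@dataclass
class User:
    name: str
    provisioned_mbps: float
    recent_mbps: float             # average over the measurement window
    priority: str = PRIORITY_NORMAL

def manage_port(port_utilization: float, users: list[User]) -> None:
    """Deprioritize heavy users only while the shared port is congested.

    Note what is *absent*: no inspection of ports, protocols, or
    applications -- only byte counts. That's what 'protocol-agnostic'
    means, and it's fair without being neutral: heavy users are
    treated differently from light ones.
    """
    congested = port_utilization >= PORT_CONGESTION_THRESHOLD
    for user in users:
        share = user.recent_mbps / user.provisioned_mbps
        if congested and share >= USER_CONSUMPTION_THRESHOLD:
            user.priority = PRIORITY_DEPRIORITIZED
        else:
            user.priority = PRIORITY_NORMAL  # released as usage subsides

users = [User("heavy", 25.0, 22.0), User("light", 25.0, 2.0)]
manage_port(port_utilization=0.9, users=users)
print([(u.name, u.priority) for u in users])
# [('heavy', 'BE'), ('light', 'PBE')]
```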
This is an important distinction because fairness is more important than neutrality. Tim Wu thought neutrality was a great idea when he coined the term in 2003, but even he recognized that the Internet is fundamentally unfair to non-web applications:
As the universe of applications has grown, the original conception of IP neutrality has dated: for IP was only neutral among data applications. Internet networks tend to favor, as a class, applications insensitive to latency (delay) or jitter (signal distortion). Consider that it doesn’t matter whether an email arrives now or a few milliseconds later. But it certainly matters for applications that want to carry voice or video. In a universe of applications, that includes both latency-sensitive and insensitive applications, it is difficult to regard the IP suite as truly neutral as among all applications.
Neutral isn’t fair.
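The arithmetic behind that claim is straightforward. ITU-T G.114 suggests keeping one-way mouth-to-ear delay under roughly 150 ms for good conversational quality; the component numbers in this back-of-envelope budget are illustrative assumptions, but they show how little room is left for queuing:

```python
# Back-of-envelope one-way latency budget for a voice call.
# The 150 ms ceiling comes from ITU-T G.114; the component values are
# illustrative assumptions, not measurements.

BUDGET_MS = 150                     # G.114 guideline for one-way delay

components = {
    "codec + packetization": 30,    # e.g., one 20 ms frame plus lookahead
    "jitter buffer": 40,
    "propagation (cross-country)": 40,
}
fixed = sum(components.values())
print(f"Left for queuing delay: {BUDGET_MS - fixed} ms")   # 40 ms

# Email has no such budget: an extra 40 ms -- or 40 seconds -- of
# queuing is invisible to the user. Identical 'neutral' treatment of
# both flows produces very different outcomes.
```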
What parts of previous Open Internet Orders can be saved?
The preference for neutrality over fairness is a mistake, but one that’s hard to see. Internet unfairness is experienced by real-time applications that find their preference for low latency stymied by applications that insist on sending big chunks of data at the wrong time.
The classic example is one user watching a Netflix stream while another user is trying to hold a video call. Netflix doesn’t stream at a steady rate; it alternates between big chunks of data and longish periods of silence. This is rough on video calls, which want a constant delivery rate.
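A toy simulation makes the effect visible. Here a bursty flow (Netflix-style on/off chunks) and a steady flow (a video call) share a single FIFO bottleneck. All rates and durations are made-up round numbers; the pattern – the call’s latency spikes whenever a chunk download is in progress – is the real phenomenon.

```python
# Toy FIFO-link simulation: a bursty stream and a steady video call
# share one bottleneck. Rates and durations are made-up round numbers;
# the shape of the result, not the values, is the point.

LINK_MBPS = 25.0        # bottleneck capacity
STEADY_MBPS = 1.0       # video call: constant rate
BURST_MBPS = 100.0      # streaming client fetching a chunk at line rate
BURST_ON_MS = 200       # chunk download time
BURST_PERIOD_MS = 1000  # one chunk per second, then silence

backlog_bits = 0.0
worst_delay_ms = 0.0
for t in range(5000):                          # five seconds, 1 ms steps
    arrivals = STEADY_MBPS * 1000              # bits arriving this ms
    if t % BURST_PERIOD_MS < BURST_ON_MS:      # chunk download in progress?
        arrivals += BURST_MBPS * 1000
    backlog_bits = max(0.0, backlog_bits + arrivals - LINK_MBPS * 1000)
    # A call packet arriving now waits behind the whole FIFO backlog.
    delay_ms = backlog_bits / (LINK_MBPS * 1000)
    worst_delay_ms = max(worst_delay_ms, delay_ms)

print(f"worst queuing delay seen by the call: {worst_delay_ms:.0f} ms")
# With these numbers the call's packets wait hundreds of milliseconds
# during each chunk download and essentially zero during the silences.
```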
One solution to the video calling problem is to pay the ISP to guarantee low-latency delivery, but the 2015 Order bans this practice for any ISP that doesn’t opt out of the regulations through the loophole described above. The 2010 Open Internet Order issued by FCC Chairman Genachowski was much better.
It didn’t grant blanket permission for ISPs to sell low-latency services, but it at least left the door open. Overall, the 2010 Order was much more nuanced, with far better economic and technical analysis. But it still leaves a lot to be desired.
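Returning to the low-latency service for a moment: the plumbing for it already exists in the Internet’s own DiffServ architecture; what the 2015 Order forbids is charging to honor it. A minimal sketch, assuming a Linux host, of an application marking its packets for Expedited Forwarding (RFC 3246), the class conventionally used for real-time voice and video; the address and port below are placeholders:

```python
# Marking a UDP socket's packets with the DiffServ Expedited Forwarding
# code point (DSCP 46, RFC 3246). Assumes Linux; whether any network
# along the path honors the mark is exactly the business arrangement
# at issue in the paid-prioritization debate.

import socket

EF_DSCP = 46
TOS_BYTE = EF_DSCP << 2            # DSCP occupies the top 6 bits of TOS

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_BYTE)
sock.sendto(b"voice frame", ("198.51.100.7", 5004))  # placeholder address
```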
Improving future Open Internet Orders
The 2015 Order was a sledgehammer wielded with little regard for facts or consequences. Its economic analysis consisted of a litany of horror stories about ISPs’ incentives for rent-seeking, with no balancing examination of their incentives to keep prices low and quality high.
It’s absurd not to consider the lengths to which ISPs will go to retain old customers, sign up new ones, and keep service calls to a minimum. Regardless of the competition for wired broadband in particular markets, ISPs compete on the stock market against the entire economy.
ISP executives with stock options have incentives to show healthy business growth. Even if Comcast has 65% market share, some 30% of Americans don’t subscribe to wired broadband at home. There’s considerable room for growth.
So the economic analysis needs to be credible. That begins with full consideration of the arguments made by both sides. Genachowski’s orders at least responded to opposing arguments; Wheeler’s pretended all commenters were on board, which they weren’t.
Technical analysis has even more room for improvement
From the Martin FCC’s Comcast order through the two commissions that followed, the technical analysis in Open Internet Orders has been deplorable. At a minimum, orders need to be grounded in the technical realities of today’s Internet: the standards created by the IETF and 3GPP, and the IEEE 802 standards for both Ethernet and Wi-Fi.
Genachowski provides a hint of how to do this in footnote 269 of the 2010 Order:
We recognize that the standards for fourth-generation (4G) wireless networks include the capability to prioritize particular types of traffic, and that other broadband Internet access services may incorporate similar features. Whether particular uses of these technologies constitute reasonable network management will depend on whether they are appropriate and tailored to achieving a legitimate network management purpose.
The footnote is a bit off because it treats bearer classes as “network management” rather than as service definition. Discussions of this sort should also appear in the main body of an order rather than in footnotes. Most importantly, they should influence the regulations themselves.
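What footnote 269 gestures at is concrete and standardized. LTE defines QoS Class Identifiers (QCIs) in 3GPP TS 23.203; the sketch below lists a familiar subset, paraphrased from the standard’s table (consult the spec for authoritative values). The point is that prioritization here defines the service a bearer delivers; it isn’t an ad hoc management tactic.

```python
# A subset of LTE's standardized bearer classes (QoS Class Identifiers),
# paraphrased from the table in 3GPP TS 23.203 -- consult the spec for
# authoritative values. Each QCI *defines a service*: guaranteed bit
# rate or not, scheduling priority, and a packet delay budget.

from typing import NamedTuple

class QCI(NamedTuple):
    gbr: bool             # guaranteed-bit-rate bearer?
    priority: int         # lower number = scheduled first
    delay_budget_ms: int  # target packet delay budget
    example: str

QCI_TABLE = {
    1: QCI(True,  2, 100, "conversational voice"),
    2: QCI(True,  4, 150, "conversational video"),
    4: QCI(True,  5, 300, "buffered streaming video"),
    5: QCI(False, 1, 100, "IMS signalling"),
    9: QCI(False, 9, 300, "default bearer: web, email, bulk data"),
}

for qci, c in QCI_TABLE.items():
    kind = "GBR" if c.gbr else "non-GBR"
    print(f"QCI {qci}: {kind:7s} prio {c.priority} "
          f"delay<={c.delay_budget_ms}ms  ({c.example})")
```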
Sync up with reality
Orders that are out of sync with public standards are doomed to fail. They may please the more belligerent advocates, but they don’t serve the public interest.
The Commission should not run around issuing service bans without demonstrating that it understands the issues; some showing that it has considered its options is appropriate as well.
The FCC is a very powerful agency, capable of causing billions of dollars of economic loss to the US economy. It’s also capable of holding back new technologies for decades, as it did with mobile phones in the 1950s and ’60s. It’s also capable of enabling and accelerating new technologies, as it did with Wi-Fi and licensed cellular.
Administrative agencies don’t do their best work when consumed with settling scores and playing politics. We’re all going to benefit from FCC actions based on balanced assessment, rational analysis, and good old-fashioned American optimism.
Fortunately, we seem to be headed in that direction even if we aren’t quite there yet.