Ofcom Report Raises Questions about Feasibility of Net Neutrality Regulations


In Part 2 of our interview with Neil Davies, the network expert discusses the results of a study commissioned by the British regulator Ofcom and the lessons that can be drawn from it. Ofcom wanted to devise regulations for Internet traffic management, so it asked Davies to write a report assessing the effectiveness of tools that claim to detect differential traffic management. After extensive analysis, Davies found that none of the highly touted tools actually succeeds at detecting most forms of traffic management. The reason these tools fall short is the complexity of the Internet: when there is a delay between two endpoints, the tools cannot pinpoint its cause.
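To see why end-to-end numbers alone are so uninformative, consider the toy simulation below. It is purely illustrative and is not one of the tools evaluated for Ofcom: it generates delay samples from two very different mechanisms, bursty cross-traffic congestion and deliberate rate shaping, and the figures observed at the endpoints carry no indication of which mechanism produced them. All parameters are invented for the sketch.

```python
# Toy illustration only (not one of the tools evaluated for Ofcom): two very
# different mechanisms, bursty cross-traffic congestion and deliberate rate
# shaping, both produce elevated, variable end-to-end delay. The endpoint
# samples carry no label saying which mechanism was responsible.
import random
import statistics

random.seed(1)
BASE_DELAY_MS = 5.0  # hypothetical fixed path delay

def delays_from_congestion(n_packets: int = 200) -> list[float]:
    """Queueing behind bursty cross traffic (no policy involved)."""
    delays, backlog = [], 0.0
    for _ in range(n_packets):
        backlog = max(0.0, backlog + random.gauss(0.0, 2.0))  # reflected random walk
        delays.append(BASE_DELAY_MS + backlog)
    return delays

def delays_from_shaping(n_bursts: int = 25, burst_size: int = 8) -> list[float]:
    """Delay added by a shaper draining each burst at one packet per 2 ms."""
    delays = []
    for _ in range(n_bursts):
        for k in range(burst_size):  # k-th packet in the burst waits k service times
            delays.append(BASE_DELAY_MS + 2.0 * k + random.uniform(0.0, 2.0))
    return delays

for label, samples in [("congestion", delays_from_congestion()),
                       ("shaping", delays_from_shaping())]:
    print(f"{label:>10}: mean {statistics.mean(samples):5.1f} ms, "
          f"max {max(samples):5.1f} ms")
```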

Davies’ findings cast a shadow on net neutrality regulations, which are an attempt to ban behavior that generally can’t be detected, let alone measured, HTF Editor Richard Bennett says. And even if the behavior could be measured, it’s not always anti-consumer. “If I was on a business Skype call from my house while the kids were watching multiple video streams, I would like it to be differentially treated because I would like my voice call not to fail,” Davies explains. “Differential management itself is not necessarily against the end-user’s interest. It could be very much for it.”

Differentiated traffic “obviously has a value to the end user,” and could “potentially garner a price premium,” Davies says. Enabling the network to better run the applications that the user wants gives the network more value. For instance, non-real-time bulk updates don’t require high prioritization, and users should be able to choose which traffic gets priority, Davies says.

Any attempt at measurement could draw the wrong conclusions. Bennett posits: “If I’m slowing down the video streaming because I want the Skype to look better — and my critics only look at the slowing down of the video streaming and they don’t look at the Skype performance — then I’m not being fairly assessed.”

Adds Davies: “And whether that was an intentional malfeasance or not is another question.”
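As a purely hypothetical illustration of both points (that prioritization can serve the end user, and that a fair assessment has to look at every affected flow), the sketch below simulates a strict-priority queue on a shared uplink, with voice packets served ahead of a bulk video burst. The scheduler, traffic pattern, and timing values are all invented for the example; this is not a mechanism described in the Ofcom report.

```python
# Hypothetical strict-priority scheduler on a shared uplink, invented for
# illustration. Class 0 (voice) is always served before class 1 (bulk video).
import heapq

SERVICE_MS = 1.0  # assumed time to serialize one packet on the uplink

def strict_priority(packets):
    """packets: list of (arrival_ms, klass), klass 0 = voice, 1 = video.
    Returns a dict mapping each class to its per-packet queueing delays."""
    pending = sorted(packets)                  # by arrival time
    queue, delays, now, i = [], {0: [], 1: []}, 0.0, 0
    while i < len(pending) or queue:
        # Admit every packet that has arrived by 'now' into the priority queue.
        while i < len(pending) and pending[i][0] <= now:
            arrival, klass = pending[i]
            heapq.heappush(queue, (klass, arrival))
            i += 1
        if not queue:                          # link idle: jump to next arrival
            now = pending[i][0]
            continue
        klass, arrival = heapq.heappop(queue)  # lowest class number goes first
        delays[klass].append(now - arrival)
        now += SERVICE_MS
    return delays

# One voice packet every 20 ms alongside a 30-packet video burst at t = 0.
traffic = [(t * 20.0, 0) for t in range(10)] + [(0.0, 1)] * 30
result = strict_priority(traffic)
for klass, name in [(0, "voice"), (1, "video")]:
    print(f"{name}: worst queueing delay {max(result[klass]):.1f} ms")
```

Reported in isolation, the video delays look like throttling; reported together, they show a voice call kept usable at a modest cost to a transfer that is not time-sensitive.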

Davies’ Ofcom report concluded that one potential solution is for regulators to require a certain “quality floor” that everyone could rely on. Using a “Delta Q” measurement, regulators could decide, for instance, that everyone should have the ability to do telemedicine from their house, which would require a certain level of upstream and downstream quality. “You don’t go to the ISPs and say you must make telemedicine work,” Davies says. “What you do say is ‘You must be able to deliver this quality and quantity.’” For measurement, the most promising approach, Davies says, is passive measurement, in which tools extract information from live networks to determine end-to-end delays. Ultimately there’s a need for more research in this area.
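As a rough sketch of what a regulatory “quality floor” check could look like in code, the fragment below compares measured path quality against minimum thresholds. The threshold values and field names are hypothetical placeholders chosen for illustration; they are not figures from the Ofcom report or a Delta Q specification.

```python
# Hypothetical "quality floor" check. The floor values below are invented
# placeholders, not figures from the Ofcom report or a Delta Q specification.
from dataclasses import dataclass

@dataclass
class PathQuality:
    """Quality observed on a subscriber's path (e.g. via passive measurement)."""
    down_mbps: float
    up_mbps: float
    rtt_ms: float
    loss_pct: float

@dataclass
class QualityFloor:
    """Minimum quality a regulator might require for a class of applications."""
    min_down_mbps: float
    min_up_mbps: float
    max_rtt_ms: float
    max_loss_pct: float

    def is_met_by(self, q: PathQuality) -> bool:
        return (q.down_mbps >= self.min_down_mbps
                and q.up_mbps >= self.min_up_mbps
                and q.rtt_ms <= self.max_rtt_ms
                and q.loss_pct <= self.max_loss_pct)

# Hypothetical floor for interactive video such as a telemedicine session.
telemedicine_floor = QualityFloor(min_down_mbps=3.0, min_up_mbps=1.5,
                                  max_rtt_ms=150.0, max_loss_pct=1.0)

measured = PathQuality(down_mbps=24.0, up_mbps=2.1, rtt_ms=48.0, loss_pct=0.2)
print("floor met:", telemedicine_floor.is_met_by(measured))  # floor met: True
```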

Active management of a network is crucial, as is recognizing what an application’s requirements are, Davies and Bennett agree. Adds Davies: “If you don’t do that – if you say everything has to be treated the same, and you try to legislate in some way, regulate for that … you’re going to say all of the traffic must be carried with the transport characteristics of the most demanding application.” And the economics don’t work when you do that.

Bennett has commended the UK regulator for doing “something the FCC should have done a year ago” when it commissioned a study of traffic management tools and techniques “aimed at advising the regulator of the limits of its knowledge.” In contrast, without engaging in any similar study of traffic management techniques, America’s broadband regulator has been imposing “harsh new rules on the Internet aimed at making it behave more like the old-fashioned telephone network” — a move being challenged in the courts.