Regulators Stoke Gatekeeping Fears

FTC Commissioner Terrell McSweeny and former FCC Chief Counsel Jon Sallet have written a frightening editorial for Wired: “Kill the Open Internet, and Wave Goodbye to Consumer Choice”. If the open Internet is on the brink of death, I’d certainly be worried, not just for “consumer choice” but for the economy, the political system, world peace, and the starving children of the developing world. And so should you.

But the article doesn’t exactly make the case that the Internet’s head is on the chopping block. In fact, it contains so many counterfactual declarations that it makes me happy. The Internet is clearly not in danger if this is all they’ve got.

The article deals with the net neutrality debate in the US. As there is no power in this nation that can bring the Internet to its knees, I’m not worried. But the authors are so poorly informed that I wonder if there isn’t an epidemic of bad information 15 years after this debate kicked off.

Defining Net Neutrality

By way of defining net neutrality, the article claims: “First, for more than a decade, the status quo in the US has been an open internet that supports thriving innovation among websites, apps, and new digital services.” This is too modest. In fact, the status quo in the US has been an open Internet for nearly 30 years, starting with the founding of The World, the planet’s first consumer-oriented Internet Service Provider in 1989.

It goes on to play the monopoly card: “Second, innovators and consumers are dependent on a few large broadband providers that serve as gatekeepers to the internet.” Yet most Internet access comes from mobile devices that happily bounce between multiple cellular and “wired via Wi-Fi” networks each day.

Network choice is certainly constrained: most consumers have two choices for fixed-location connections and no more than 10 mobile networks to choose from. But we also have access to other connections at work and in public locations, which helps a lot.

What Gatekeeping Looks Like

Broadband markets are not what gatekeeping looks like. If we had no more than one or two meaningful choices for vital Internet services, that would be gatekeeping. European consumers know about this kind of gatekeeping, hence their government has issues with a dominant search provider; issues we don’t have in the US.

The EU has fined Google $2.7 billion for abuse of its gatekeeper status over search, but neither the FCC nor the FTC has done any such thing here. The FTC did conduct a cursory examination of Google a few years ago; while staffers wanted to sue the search giant, political appointees would have none of it.

Gatekeepers Don’t Matter Much

McSweeny and Sallet ask us to ignore reality:

Pressure on these broadband companies to deliver better options and prices for consumers is already vanishingly small. That’s true even in mobile broadband, where the presence of four nationwide carriers continues to deliver better results in the form of unlimited data plans and other options.

Prices for mobile broadband are falling, options are increasing, and unlimited data plans are on the rise. The Wall Street Journal says the impact of brutal price competition is rippling through profits, inflation and antitrust law. So what is this, some sort of mirage?

It seems like bait-and-switch. Rather than addressing the issues the EU sees with Google’s real gatekeeper power, the Obama-era regulators urge us to ignore the elephant in the room and worry about the firms whose prices are falling as they struggle in larger and more obvious ways to earn our business.

This requires them to convince us that ISPs harbor unexpressed wishes to corner the market for broadband. While every capitalist firm would certainly like to wield market power, technology has made this increasingly difficult for infrastructure companies like our ISPs.

Why the Internet is Open

The Internet has always been open to consumers, applications, and services because all of the commercial players have learned that openness is best for everyone. We don’t really need regulatory actions to protect a system that is not really under attack.

To claim that the Internet is actually in trouble is to smear particular companies that have never acted in a way that’s contrary to the health of the Internet. So the crisis allegation is a canard even if it’s not misdirection.

The path to profitability in the Internet economy doesn’t flow through raising prices and limiting access. Rather, it depends on acquiring customers first and figuring out how to monetize them later. Because there is increasing competition for ISP services, “later” never comes.

But this is not the case in search, social networks, retail, and video streaming. While mobile prices are falling, video streaming prices are rising.

Not content with being the dominant provider of Internet retail, Amazon is looking for higher prices by taking over Whole Foods, the nation’s high-price grocery (and placebo) chain. And the Internet ads that pay for Google and Facebook cost more than ever. The gatekeeper allegation is misdirected.

Cherry-Picking Broadband Data

The regulators claim we have very limited broadband choice:

Roughly 21 percent of US census blocks have no high-speed landline broadband provider, and 37 percent have only one option. This is no choice at all. For downloading data at 100 Mbps, 88 percent of the country has either no option or just one provider.

This data is cherry-picked. The FCC’s most recent report on wired broadband says 97% of census blocks have two or more providers in the 10–24 Mbps range (see Table 4). The UK’s Ofcom considers 24 Mbps to be “ultra-broadband”, while the Wheeler FCC considered 25 Mbps the minimum speed for “real broadband”.

Accepting that definition, the FCC says 42% have two or more choices. But this data excludes wireless options that are getting faster and cheaper, especially when they use house-mounted antennas.

These services reach half of American homes. And virtually all of the US is reached by satellite services that are now as fast as 25 Mbps.

Beware of Mergers

Faced with an ever-increasing regulatory burden, ISPs are increasingly diversifying into content, a largely unregulated space. The regulators see this as problematic, even though they helped create the market dynamics that made it inevitable:

The incentive for broadband companies to discriminate against online video providers will only grow stronger as the market becomes more competitive, as it has recently with the arrival of services that carry live television channels just like traditional cable operators.

Yet the leaders in live TV over Internet connections are ISPs looking to expand into markets outside their broadband footprints. AT&T’s DirecTV subsidiary offers price-competitive skinny bundles to all comers, not just its broadband customers.

And they want the same deal in cable markets that the cable operators give Netflix users among their own customer bases. And it’s not like Netflix is hurting: its share price went up 2800% during the Obama administration and it now has more than twice as many customers as the largest ISP.

No Real Economic Analysis in 2015 Open Internet Order

The regulators claim the FCC’s 2015 Open Internet Order was loaded with economic and technical analysis:

The 2015 Open Internet Order set forth 16 pages of economic and technological analysis to support the conclusion that “broadband providers (including mobile broadband providers) have the economic incentives and technical ability to engage in practices that pose a threat to Internet openness by harming other network providers, edge providers, and end users.”

But this “analysis” (from paragraphs 78-103 of a 400-page order) consists of references to comments filed by advocacy groups in support of the FCC’s ad hoc “virtuous cycle/circle” hypothesis. Only two of the references are from critics, and they’re misstated.

The 2015 OIO is completely devoid of any support for the virtuous cycle/circle hypothesis either empirically or from the economics literature.

No Real Technical Analysis in the 2015 OIO Either

The Order’s technical analysis consists of a distorted summary of the BITAG report on traffic differentiation (paragraph 85):

Techniques used by broadband providers to identify and select traffic may include approaches based on packet payloads (using deep packet inspection), network or transport layer headers (e.g., port numbers or priority markings), or heuristics (e.g., the size, sequencing, and/or timing of packets). Using these techniques, broadband providers may apply network practices to traffic that has a particular source or destination, that is generated by a particular application or by an application that belongs to a particular class of applications, that uses a particular application- or transport-layer protocol, or that is classified for special treatment by the user, application, or application provider.

In fact, the BITAG report (to which I was a contributor) simply provided an inventory of techniques. It did not say that ISPs use these techniques across network connections (page i):

The ability to treat traffic differentially has been built into Internet protocols from the beginning. The specifications for both IPv4 and IPv6 have included fields to support traffic differentiation since their inception (initially IPv4’s Type of Service or ToS field) to indicate to routers the quality of service desired, in terms of queuing precedence and routing parameters around delay, rate, and reliability. This was changed to more generic service descriptions with the definition of the Differentiated Services Field, and implemented in IPv4 and IPv6. Notably, traffic differentiation in this sense has not been implemented in multi-provider environments, although it is extensively used within specific networks. End to end deployment would require the harmonization and cooperation of a large number, if not all, of the relevant network operators.

But this distortion of the Internet architecture has been endemic at the FCC, which raises doubts about the agency’s ability to understand Internet technology, let alone to police it.
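The differentiation machinery BITAG describes is visible in ordinary socket APIs. As a minimal sketch (the codepoint values are standard assignments from the Differentiated Services specifications, and marking a packet obliges no router to honor the mark), here is how an application can place a DSCP codepoint in the field that began life as IPv4’s ToS byte:

```python
import socket

# The DSCP codepoint occupies the upper six bits of the 8-bit field
# originally defined as IPv4's Type of Service (ToS) byte; the lower
# two bits now carry ECN. Some standard codepoints:
DSCP_BEST_EFFORT = 0   # default forwarding
DSCP_AF41 = 34         # Assured Forwarding, class 4, low drop precedence
DSCP_EF = 46           # Expedited Forwarding (e.g., interactive voice)

def dscp_to_field(dscp: int) -> int:
    """Shift a six-bit DSCP codepoint into the DS field, ECN bits zero."""
    if not 0 <= dscp <= 63:
        raise ValueError("DSCP is a six-bit value")
    return dscp << 2

# Ask the local stack to mark outgoing datagrams as Expedited Forwarding.
# Routers within a cooperating network may queue these ahead of
# best-effort traffic; routers in other networks are free to ignore or
# re-mark them -- which is exactly why BITAG notes that end-to-end
# differentiation would require multi-operator cooperation.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, dscp_to_field(DSCP_EF))

print(hex(dscp_to_field(DSCP_EF)))  # EF (46) sits in the DS field as 0xb8
```

The point of the sketch is that differentiation is a built-in, application-visible feature of the protocols, not a covert gatekeeping tool that ISPs invented.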

Touting Prophylactic Solutions Over Antitrust Enforcement

Finally, the regulators tout prescriptive regulations over traditional antitrust by invoking legal rather than technical or economic opinion:

Supreme Court Justice Anthony Kennedy faced precisely this argument when he wrote the majority opinion in the Supreme Court case upholding requirements that cable systems carry broadcast stations. He wrote that regulation could be preferred to antitrust because of “the considerable expense and delay inherent in antitrust litigation, and the great disparities in wealth and sophistication between [TV stations and cable systems].”

This line of reasoning was predicated on the belief that net neutrality violations would be common without prescriptive rules. But history has shown that ISPs are discouraged from violating the Internet architecture by non-regulatory factors, chief among them the need to gain and retain customers.

In reality, the FCC has only settled one net neutrality violation, the 2005 Madison River incident. With complex technical and economic issues in the mix, any ruling by the FCC is headed for court anyway, and the courts can easily handle one incident every 30 years.

So what we see here is an editorial that repeats common claims and fails to support any of them in a serious way. I like it when attempts at fear-mongering are so easily dismissed.