Reimagining the FCC for the 21st Century
I read an intriguing article on Airbus’s plans for flying cars in Digital Trends. Before the end of the year, Europe’s aircraft maker will be testing both single-passenger and no-passenger flying vehicles. They sound like up-sized, battery-powered drones, right out of The Jetsons. Such vehicles will support a ride-sharing model, probably the only practical way to provide transport in our increasingly populous cities.
19th Century Myths
While Europe’s government-owned aircraft producer tries to visualize the future of aviation, America’s telecom regulator expressed nostalgia for the 19th century in his farewell address. Departing FCC Chairman Tom Wheeler gave a speech at the Aspen Institute arguing that nothing much has changed for communications since the telegraph era:
The idea of an open network goes back as far as the “first-come-first-served” traffic management of the telegraph. Telephone networks’ common carrier status was an extension of this concept warranted by a behavioral legacy and a demonstrated exercise of monopoly power. And, let’s not forget, it was the open telephone network that delivered the early Internet and allowed America to get online. America’s communications history is one of open networks.
The idea that Internet Service Providers must handle all the information they receive in a “first-come-first-served” manner is essentially the extreme version of the open network concept urged on the FCC by fringe advocacy groups such as Free Press (page 4):
The Internet generally operates along a First-In, First-Out (FIFO) model, in which packets are forwarded in the order in which they are received. Prioritization changes this design by increasing the delay of some packets while reducing the delay of others. Therefore, even if prioritization keeps the average packet latency the same (i.e., if DPI does not increase average latency), it will still increase the standard deviation by slowing low priority packets and speeding up high priority.
Great story, but it’s not actually true.
Internet Facts
Differentiation doesn’t necessarily produce service distortion. The Broadband Internet Technical Advisory Group (BITAG) showed how this works in its comprehensive report, “Differentiated Treatment of Internet Traffic.” Because different applications have different network service requirements, it’s desirable to depart from the first-come-first-served model in order to optimize end users’ Quality of Experience:
When differentiated treatment is applied with an awareness of the requirements for different types of traffic, it becomes possible to create a benefit without an offsetting loss. For example, some differentiation techniques improve the performance or quality of experience (QoE) for particular applications or classes of applications without negatively impacting the QoE for other applications or classes of applications. The use and development of these techniques has value.
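To see why the pure first-come-first-served framing is too simple, here is a toy, single-link queueing sketch of my own (not BITAG’s analysis): it compares FIFO with strict priority for a mix in which small, latency-sensitive “voice” packets share a link with large “bulk” packets. The link speed, packet sizes, and traffic mix are invented for the example.

```python
import heapq
import random
from statistics import mean

LINK_RATE = 750_000        # bytes/second (6 Mb/s) -- illustrative only
N_PACKETS = 5_000

random.seed(1)

def make_traffic():
    """Interleave small latency-sensitive packets with large bulk packets.
    Each packet is (arrival_time, size_in_bytes, traffic_class)."""
    pkts, t = [], 0.0
    for _ in range(N_PACKETS):
        t += random.expovariate(500)          # ~500 packets/second offered
        if random.random() < 0.1:             # 10% of packets are "voice"
            pkts.append((t, 200, "voice"))
        else:
            pkts.append((t, 1500, "bulk"))
    return pkts

def simulate(pkts, priority=False):
    """Drain a single transmission queue and return the mean delay
    (queueing + transmission, in milliseconds) for each traffic class.
    With priority=True, queued "voice" packets are always sent first."""
    delays = {"voice": [], "bulk": []}
    queue, i, clock, order = [], 0, 0.0, 0
    while i < len(pkts) or queue:
        # Admit every packet that has arrived by the current clock time.
        while i < len(pkts) and pkts[i][0] <= clock:
            arrival, size, klass = pkts[i]
            rank = (0 if klass == "voice" else 1, order) if priority else (order,)
            heapq.heappush(queue, (rank, arrival, size, klass))
            order += 1
            i += 1
        if not queue:                         # link idle: jump to next arrival
            clock = pkts[i][0]
            continue
        _, arrival, size, klass = heapq.heappop(queue)
        clock += size / LINK_RATE             # transmit the packet
        delays[klass].append(clock - arrival)
    return {k: round(1000 * mean(v), 2) for k, v in delays.items()}

traffic = make_traffic()
print("FIFO    :", simulate(traffic, priority=False))
print("Priority:", simulate(traffic, priority=True))
```

In this toy setup the prioritized packets carry only a small share of the bytes, so the bulk traffic’s average delay barely moves while the small flows’ delay falls sharply, which is the kind of benefit without an offsetting loss the report describes.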
This capability was designed into the Internet in its earliest days:
The ability to treat traffic differentially has been built into Internet protocols from the beginning. The specifications for both IPv4 and IPv6 have included fields to support traffic differentiation since their inception (initially IPv4’s Type of Service or ToS field) to indicate to routers the quality of service desired, in terms of queuing precedence and routing parameters around delay, rate, and reliability. This was changed to more generic service descriptions with the definition of the Differentiated Services Field, and implemented in IPv4 and IPv6. Notably, traffic differentiation in this sense has not been implemented in multi-provider environments, although it is extensively used within specific networks. End to end deployment would require the harmonization and cooperation of a large number, if not all, of the relevant network operators.
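As a concrete sketch of that field in use, the snippet below marks outgoing UDP datagrams with the DiffServ Expedited Forwarding code point by setting the legacy ToS byte on a socket. This is my illustration, not something from the BITAG report: the destination address and port are placeholders, it assumes a platform that exposes IP_TOS (Linux does), and, as the quoted passage notes, the marking only matters where the networks along the path are configured to act on it.

```python
import socket

# DSCP "Expedited Forwarding" (46) occupies the upper six bits of the
# former IPv4 ToS byte, so the byte written to the header is 46 << 2.
DSCP_EF = 46
TOS_BYTE = DSCP_EF << 2     # 0xB8

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_BYTE)

# Placeholder destination; whether any router actually gives the datagram
# precedence depends entirely on the policies of the networks it crosses.
sock.sendto(b"latency-sensitive payload", ("198.51.100.7", 5004))
```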
So the FCC’s open Internet regulations are inconsistent with both Internet theory and Internet practice.
Telegraph Facts
And indeed, the telegraph network did not treat all messages equally. Beginning in the 1920s, telegraph operators gave consumers a choice between higher-priced, full-rate messages with immediate priority and lower-priced Day Letters and Night Letters with deferred delivery:
The basis for computing tolls on telegrams is the minimum charge for ten words. In the regular full rate telegram or the night message, it costs as much to send one word as it does to send ten words. Each additional word above ten is charged for at varying rates according to the original basic charge, which depends upon distance.
With the inauguration of the Night Letter and Day Letter services, however, the original method of computing tolls was somewhat modified. In these services, the ten word minimum is not observed, and fifty words is used as the basis of computation. Additional words are charged for in groups of ten.
Night Letters and Day Letters are known as deferred services. Full rate messages take precedence over them.
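The arithmetic behind those two tariffs is simple enough to sketch. The rates below are invented for illustration (the actual charges varied with distance), but the word-count structure follows the quoted description: a ten-word minimum with per-word extras for full-rate messages, and a fifty-word minimum with extras billed in ten-word groups for Day and Night Letters.

```python
import math

def full_rate_toll(words, base_charge, per_extra_word):
    """Full-rate telegram or night message: the base charge covers up to
    ten words; each word beyond ten is billed individually."""
    return base_charge + max(0, words - 10) * per_extra_word

def letter_toll(words, base_charge, per_ten_word_group):
    """Day Letter or Night Letter: the base charge covers up to fifty
    words; additional words are billed in groups of ten."""
    extra_groups = math.ceil(max(0, words - 50) / 10)
    return base_charge + extra_groups * per_ten_word_group

# Hypothetical rates for a single distance band:
print(round(full_rate_toll(23, base_charge=0.60, per_extra_word=0.04), 2))      # -> 1.12
print(round(letter_toll(73, base_charge=0.75, per_ten_word_group=0.15), 2))     # -> 1.2
```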
Consequently, if the FCC’s open Internet regulations really were based on history, they would not prohibit “paid prioritization,” a practice consistent with both telegraph precedent and Internet architecture. And if Mr. Wheeler were as studious about the history of communications networks as we expect scholars to be, he wouldn’t misstate that history in a way that serves current agendas.
It’s Not Just Technology
The Wheeler address covered the familiar notion of the “network compact”, asserting that nothing but technology has changed in telecom since the heyday of monopoly networks. While technology certainly has changed, some would argue that changes in market structure are more significant. We had monopolies in telegraph and telephone services (after a brief period of fierce competition), but IP networks have always existed in a competitive market (after a brief period of monopoly hangover).
Consequently, it’s disingenuous to maintain that the only difference between today’s networks and those of the 19th and early 20th centuries is technology. The technology we have today has made competition in wireline services viable by enabling formerly distinct networks – cable, telephone, satellite, and mobile – to compete with each other. It has also allowed wireless carriers to build multiple infrastructures in the same territories. So we can buy telephone, television, and Internet service from terrestrial wireless, satellite, and wireline networks in every major market in the US.
Without competition, we relied on regulators to define network obligations in antitrust deals such as the Kingsbury Commitment of 1913. The terms of that deal define Wheeler’s network compact today. While Wheeler’s idea is a reasonable survey from the regulator’s point of view, the fact that he felt the need to articulate such conditions shows a degree of contempt for consumers as well as a willful disregard for current and future market conditions. When markets function well, they outperform regulators.
Charting a Course for the FCC
Wheeler steps down on Friday, so it’s opportune to think about how the FCC would function with stronger leadership. Over the weekend, John Eggerton broke a story on Multichannel News about the course the FCC Transition Team has chosen, “Exclusive: Trump Team Embraces FCC Remake Blueprint.” According to the story, the FCC will be re-organized according to terms offered to Congress by AEI scholars in 2014:
The majority plan was said to dovetail with comments from Eisenach and Layton to Congress in 2014 as AEI scholars. Their two main conclusions were that “the historical silo-based approach to communications regulation is inapposite to the modern communications ecosystem. Second, the Federal Communications Commission’s (“FCC,” or “Commission”) functions are largely duplicative of those of other agencies.”
I have some familiarity with the AEI plan as I was a co-author. While it’s not very detailed, it successfully highlights the fact that the FCC is an agency designed to enforce an obsolete law. The Communications Act identifies a group of “technology silos” that grew up in the monopoly era: old-fashioned telephone service, cable TV, broadcasting, and mobile phone service. It then devises technology-sensitive regulations for each silo, according to the traditional “regulated monopoly” practice. But these regulations cover the service offered over the network rather than the generic service IP provides.
The Communications Act assumes that all of the siloed services need sector-specific regulation by a specialist agency rather than the kind of economy-wide regulation performed by the FTC. To the extent that the former silos compete with each other to provide IP services, they exhibit the same economic characteristics as any other industry. That argues for transferring duties to the FTC, as does the fact that the FTC is already heavily involved in cybersecurity and privacy. The FCC’s efforts in privacy and security are quite laughable, and the joke’s not funny. So transfer as much as possible to the trade commission.
Spectrum is Most of What’s Left
Spectrum, on the other hand, warrants an expert agency with more technical depth than the FCC currently has and far fewer people. Spectrum is not so much a legal issue – the FCC’s expertise – as a technical and economic problem. The FCC has done reasonably good things with spectrum in the last decade, so there’s something to build on. The executive branch has not done as well with spectrum, as it has been buffeted by the winds of fashion and politics.
I’d like to see a new agency created with the spectrum experts from NTIA and FCC assigned to transfer spectrum from obsolete allocations to rational ones. This agency would need to have the power to take spectrum from the Department of Defense, which sets the stage for a showdown. I’ll write more about that later.