GigaOm Review

Three posts on the GigaOm blog deserve your attention, as each highlights a central issue in network policy. Consider this summary and analysis.

Blair Levin, the czar of America’s National Broadband Plan 2.0 (the 1.0 version being the privatization of the Internet in the early ’90s), writes in support of the subsidies Google received from the two Kansas Cities to build its fiber network in “Now It Gets Interesting: How to Build a Social Contract for Broadband.” The current issue is that the incumbents, AT&T and Time Warner Cable, believe they’re entitled to compete with the new entrant on a level playing field, one that the subsidies the cities gave Google have tilted:

As we’ve reported on in the past, Kansas City has rolled out the digital red carpet for Google: giving rights of way, prime office space, expedited permitting, fee waivers, and more. In a notable example, the city charged only $10 per pole for Google to string its cable on municipal utility poles—as opposed to the usual $18.95 per pole rate. But now, local incumbents Time Warner Cable and AT&T want to feel the love, too.

Levin argues that Google made what amount to in-kind contributions to the cities in the form of higher-speed connections and free service to community anchor institutions such as schools and libraries. It’s easy to calculate the value of the free connections, but hard to put a price on the higher-speed connections to homes and small businesses. There’s a huge difference in value between dial-up and DSL, a much smaller difference between DSL and cable, and a smaller difference still between cable and FTTP in a world in which most applications assume that DSL is the base level of connectivity. Given that DSL goes up to 25 Mbps and more today, and the typical Internet server can’t push more than 10 Mbps per the FCC’s broadband performance studies, it would take an economist with better skills than I have to put a price tag on the public benefit of a gigabit per second to every home. We can’t just say it’s worth nothing, of course, but playing fair means that someone will need to establish a fair value to settle the dispute. In any case, the assumption that the U.S. needs to engage in an international race to win bragging rights for high broadband speeds needs to be supported with reference to actual applications that higher throughput will enable. There are some applications I can imagine that might fit the bill, but I’d rather concentrate on the motivations than the mechanisms. However you slice it, we don’t get orders-of-magnitude increases over what consumers want without some public funding, and that has to be balanced against all the other things the public wants to spend its money on.
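Just to show the shape of the calculation rather than settle it, here is a rough back-of-the-envelope sketch in Python. Only the two pole-attachment rates ($18.95 standard versus the $10 charged to Google) come from the quoted GigaOm passage; the pole count, household count, and per-tier value figures are purely hypothetical placeholders that a real economist would have to replace.

```python
# Back-of-the-envelope valuation sketch. Only the pole-attachment rates
# ($18.95 standard vs. $10 charged to Google) come from the GigaOm piece;
# every other number below is a hypothetical placeholder.

STANDARD_POLE_RATE = 18.95   # usual per-pole attachment fee (from the article)
DISCOUNT_POLE_RATE = 10.00   # rate the city charged Google (from the article)
POLE_COUNT = 20_000          # hypothetical number of poles in the build
HOUSEHOLDS = 150_000         # hypothetical households passed by the network

# Hypothetical annual value, per household, of each step up in connectivity.
# The argument above is that these deltas shrink as speeds rise.
tier_value_deltas = {
    "dial-up -> DSL": 300.0,   # huge jump: most applications assume DSL
    "DSL -> cable":   100.0,   # smaller jump
    "cable -> FTTP":   25.0,   # smaller still, given ~10 Mbps server limits
}

pole_subsidy = (STANDARD_POLE_RATE - DISCOUNT_POLE_RATE) * POLE_COUNT
print(f"Implied pole-fee subsidy: ${pole_subsidy:,.2f} (hypothetical pole count)")

for step, delta in tier_value_deltas.items():
    print(f"{step:>15}: ${delta * HOUSEHOLDS:,.0f} per year (hypothetical)")
```

The point of the exercise is simply that the pole-fee subsidy is easy to price, while the value of the speed upgrade depends entirely on the assumed per-tier deltas, which is exactly where the dispute lies.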

Another GigaOm piece, by wireless analyst Tim Farrar, declares the spectrum crunch a myth. This has become a staple of link-bait commentary on the web, a fixation pushed roughly 90% by bloggers, journalists, and other non-engineers. Farrar offers a new twist on the old claim, however: he argues that CTIA’s figures showing a decline in the rate of increase of mobile data over a six-month period prove that the spectrum crunch that started with the iPhone is now over. What has apparently happened, as Farrar acknowledges, is a slowdown in the rate of increase for mobile data after the two large carriers moved from all-you-can-eat pricing to metered pricing. The policy change didn’t have an immediate effect, since it did nothing until phones were replaced, but it was intended to reduce the utilization of mobile data to a more manageable level. Pricing changes don’t make more capacity available in and of themselves, though, even if they do make more cash available for investment in infrastructure upgrades. Nobody should be happy to see a permanent reduction in data usage per user; that much is clear. We don’t know exactly what the applications of the future are, but it’s a pretty good guess they’re going to use more data than Facebook. Check out the lively comments, ignoring those by the trolls.
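To be clear about what the CTIA figures can and cannot show, here is a toy compound-growth calculation with entirely invented numbers: a slower rate of increase still means total traffic keeps climbing, so the pressure on network capacity does not simply vanish.

```python
# Toy illustration with invented numbers: a drop in the *growth rate*
# of mobile data is not a drop in mobile data. Traffic still compounds.

traffic = 100.0        # arbitrary starting volume (petabytes per month, say)
fast_growth = 0.60     # hypothetical 60% growth per half-year (unlimited era)
slow_growth = 0.25     # hypothetical 25% growth per half-year (metered era)

for label, rate in [("unlimited-era growth", fast_growth),
                    ("metered-era growth", slow_growth)]:
    volume = traffic
    for _ in range(4):             # four six-month periods = two years
        volume *= 1 + rate
    print(f"{label}: {volume:,.0f} after two years (started at {traffic:,.0f})")
```

Even under the slower, post-metering growth assumption, the total more than doubles over two years; only the slope changed, not the direction.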

The nugget for the week is a short post by Stacey Higginbotham, “The Internet is Like the Old Soviet Union Except It Works,” on Dennis Weller and Bill Woodcock’s study of Internet peering agreements done for the OECD. Despite the unfortunate title (which Stacey admits wasn’t an apt analogy), the article is useful because the study it links to is worth reading. Weller and Woodcock maintain that 99.5% of “peering agreements” are made on a handshake basis, without written contracts or Service Level Agreements. This doesn’t mean that Internet traffic is free, of course: there are SLAs between ISPs and commercial users, as there are between ISPs and transit providers. But the vast bulk of the infrastructure that the ITU seeks to regulate is currently interconnected on an informal, ad hoc basis, and that works fine for everyone. Check it out and have your eyes opened.