GTT’s Growing Pains Behind Over-Hyped Congestion Claims
Last Wednesday, I filed a story on claims by Free Press’s “Battle For the Net” that AT&T and other ISPs were causing massive Content Delivery Network (CDN) slowdowns at several interconnection points on the US Internet. In the story, I examined the methodology used by Free Press’s Internet Health Test to determine interconnection speeds and found it lacking because it employs routes that would make no sense on an Internet organized for CDN performance. To determine interconnection speed from my office in Denver, the Internet Health Test routes traffic from Denver to Dallas and back to Denver before it even reaches a test server, then back to Dallas and back to Denver again before a measurement is made. The interconnected networks are two of the smallest in the US, neither is a CDN operator, and the convoluted route induces as much delay as a coast-to-coast route.
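To see why that route matters, here’s a back-of-envelope sketch of propagation delay alone. The distances, fiber speed, and route factor are my illustrative assumptions, not figures from the test itself, and real measured delays would also include routing and queueing overhead at every hop:

```python
# Illustrative back-of-envelope sketch (assumed numbers, not from the test):
# compare the fiber propagation delay of the Internet Health Test's
# Denver -> Dallas -> Denver -> Dallas -> Denver path against a direct
# coast-to-coast round trip.

C_FIBER_KM_S = 200_000        # rough speed of light in fiber (~2/3 of c)
ROUTE_FACTOR = 1.5            # fiber paths run longer than great-circle distance

def one_way_ms(km: float) -> float:
    """Propagation delay in milliseconds for one direction."""
    return km * ROUTE_FACTOR / C_FIBER_KM_S * 1000

DENVER_DALLAS_KM = 1_070      # approximate great-circle distance
COAST_TO_COAST_KM = 4_000     # e.g. New York to San Francisco, roughly

# The test path crosses Denver-Dallas four times before a measurement
# completes: out, back, out again, back again.
test_path_ms = 4 * one_way_ms(DENVER_DALLAS_KM)

# A direct coast-to-coast round trip is two coast-to-coast legs.
coast_rtt_ms = 2 * one_way_ms(COAST_TO_COAST_KM)

print(f"convoluted test path: ~{test_path_ms:.0f} ms propagation")
print(f"coast-to-coast RTT:   ~{coast_rtt_ms:.0f} ms propagation")
```

Propagation alone puts the two paths in the same ballpark; add per-hop routing and queueing delay on the doubled-back legs and the convoluted path easily reaches coast-to-coast territory.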
This was a story because UK Guardian writer Sam Thielman rewrote a press release from Battle For the Net (which he called a “study”) and filed the rewrite as a story. Free Press summarily deleted the press release from Google Docs and replaced it with some links to other blog posts. While all this was going on, New America’s Open Technology Institute had a conference call with the FCC concerning the proposed merger of AT&T with DirecTV and filed an ex parte with the FCC reiterating the claims in the now-erased press release. It’s sad to say that this is the state of tech reporting and “public interest” advocacy today.
The smoke has officially cleared, and we now know what was going on beneath the surface of all this remonstration and gnashing of teeth. It turns out I made a pretty sound guess in a comment on Dan Rayburn’s Streaming Media blog: “Seems to me that the real story here is that GTT – a very, very small time transit network – has interconnection issues with several ISPs in several cities. This is probably GTT’s growing pains and isn’t really interesting to many people.”
Rayburn had a conference call with GTT in which they explained their growing pains:
GTT actually turned down capacity at interconnection points as they are shifting their flow of traffic because of acquisitions they have done in the market and consolidating how they connect with ISPs. In the last six years, GTT has acquired five companies (WBS Connect, PacketExchange, nLayer Communications, IP network Tinet from Inteliquent, UNSi) and a few months ago, announced an agreement to acquire their sixth, MegaPath.
As a result of these acquisitions, GTT is rebalancing its network. It has commitments from several ISPs for increased interconnection capacity, but it has essentially been so busy rationalizing its routes and reorganizing that it hasn’t been able to turn up the new capacity yet.
It’s also interesting that no one from the press or the “public interest” community contacted GTT for their side of the story. As Rayburn says:
It should also be noted that GTT told me that no one from the Free Press ever contacted them, before the Free Press laid blame on the ISPs. If they had, GTT would have been able to inform the Free Press of some more details, which the Free Press neglected to even look into.
While Free Press didn’t have time to contact GTT, they had time to rewrite their “study” and OTI, partners with Free Press in the Battle For the Net campaign, had time to give a factually deficient briefing to the FCC on things that didn’t actually happen. Even the poor programmer who works on the M-Lab test system for OTI had to write a blog post explaining that the data doesn’t say who’s to blame:
It is important to note that while we are able to observe and record these episodes of performance degradation, nothing in the data allows us to draw conclusions about who is responsible for the performance degradation.
But that message isn’t the one that OTI delivered to the FCC. OTI pointed the finger of blame in precisely the wrong direction:
The results of the Internet Health Test demonstrate that many consumers are not getting the broadband service they paid for. Interconnection congestion is widespread throughout the United States, even after the Commission adopted the Open Internet Order. M-Lab’s research contradicts AT&T’s recent assertions that the interconnection marketplace is “functioning” and “competitive.” Indeed, at the same time AT&T was making this assertion before the Commission, the Internet Health Test was observing interconnection pairs that appeared to be non-functional. This ongoing conduct is evidence of a market failure in what has historically been a healthy and competitive market for interconnection services. The fact that AT&T customers are suffering more than most underscores the need for close scrutiny of its proposed transaction with DIRECTV.
Knowing what we know now, no honest person can stand behind that analysis. It remains to be seen whether the authors of OTI’s message to the FCC – Sarah Morris and Joshua Stager – will correct these false statements. There’s only one ethical path.