Internet Week in Washington

Congress is back in session with a bang, and the Internet is smack dab in its crosshairs. Just this week, there have been hearings on rogue sites legislation in the Senate Judiciary Committee (Video), a hearing on net neutrality and competition in the House Judiciary’s Subcommittee on Intellectual Property, Competition and the Internet, and a full grilling of the FCC commissioners in the House Energy and Commerce Subcommittee on Communications and Technology (Video).

I took part in a staff briefing last week to help bring new staffers up to speed on the Internet, as did many others, I’m sure. The level of the dialog in the Senate Judiciary and House E & C hearings was pretty good, and while I haven’t heard the House Judiciary hearing yet, the witness list suggests it was good as well; Larry Downes was on it, and he’s one of the best-informed all-around scholars in the country on this subject.

The technical subjects around the rogue sites issue pertain to whether the proposed measure will “break the Internet” as opponents charge, and to a lesser extent, whether such actions as domain seizures will be at all effective. “Breaking the Internet” sounds like a dire prediction until you realize that it’s a charge that has been made rather routinely against every significant change in received Internet practice, from TCP/IP itself on down to the anti-spam blacklists, and it generally goes hand-in-hand with the charge that people will simply circumvent the measure. Astute readers will note that these two charges cancel each other out: If we assume that a given technical measure will be widely circumvented, it’s not particularly consistent to claim that it’s going to “break the Internet.” How does that happen if everyone’s avoiding the measure?

The “breaking the Internet” claim depends on the assumption that people who want to download pirated content – which is apparently everyone, according to the critics – will switch to some as yet non-existent rogue DNS that provides addresses for blacklisted web stores that deal in unlawful content. It’s not really necessary to go that far, of course: editing the right local file (the hosts file, on most systems) is enough to supplement the standard DNS, but people who’ve been tricked into purchasing unlawful content aren’t going to do that, much less switch to a rogue DNS that redirects them to bogus sites instead of the ones they want. What may not be apparent to the Congress – although the truth is certainly starting to dawn on them – is that the so-called “Internet engineers” who signed a letter written for them by the attorneys at the EFF are not the most articulate people in the world, so when they say “this is going to break the Internet” they actually mean something like “I don’t like this because it feels strange.” Any number of IETF standards have been criticized for their potential to “break end-to-end” as well; engineers aren’t known for their facility with the English language.
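To put the circumvention point in concrete terms, here’s a minimal sketch of what a local override looks like; it assumes a Unix-like system and uses a made-up domain name and a documentation IP address. The point is simply that the stock resolver consults the local hosts file before it ever queries the DNS, so restoring a delisted name takes one line of configuration, something a determined pirate will do in seconds and a duped consumer never will.

```python
# A minimal sketch, assuming a Unix-like system; the domain and address below
# are hypothetical. Adding one line to /etc/hosts, e.g.
#
#     203.0.113.10    rogue-site.example
#
# overrides whatever the public DNS says (or refuses to say) about that name,
# because the standard resolver checks the hosts file before asking the DNS.
import socket

try:
    # getaddrinfo() follows the system resolver order (hosts file first), so
    # with the entry above it returns 203.0.113.10 even if the name has been
    # removed from the DNS entirely.
    print(socket.getaddrinfo("rogue-site.example", 443))
except socket.gaierror:
    print("no local override and no DNS record: this is what a seizure looks like to the resolver")
```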

The Senate committee seemed to have an epiphany during the hearing on the question of how to block access to overseas sites that don’t rely on a US registrar willing to comply with today’s ICE enforcement measures: search engine regulation can do the trick. If Google and Bing are willing to delete blacklisted domains from their search results, the rogue sites problem is greatly reduced. This is one case where the Internet’s tendency toward consolidation can be leveraged to address a problem. Witnesses seemed to suggest that the DNS blacklist is not an essential enforcement tool provided search blacklists can be adopted in its place, or in addition to it. If I could have only one tool, it would be stopping the credit card transactions, but multiple tools are better.

The House Communications and Technology subcommittee hearing was a marathon affair that made me happy to have three monitors on my desktop, so I could let it play while working on other things and pause for the dramatic parts. Chairman Walden is extremely well briefed on the subject, and was all over every aspect of the issue that matters. He has the distinction of being a former FCC license holder, so he’s very familiar with the powers of persuasion the Commission holds over the firms it oversees. Walden was especially effective when he brought out the language of the Comcast/NBCU merger conditions that require the merged firm to abide by the Open Internet rules even if they’re struck down in court. That’s not a condition that suggests great confidence in the agency’s authority.

The key issue on the technical side of the debate, although it’s minor compared to the question of FCC authority, is whether the Internet demands an “all packets are equal” rule that bans the practice Free Press and friends label “paid prioritization.” The term is misleading because prioritization is one means of providing Quality of Service for premium or very low cost transit services, but not the only one; the terminology itself commits a “fallacy of composition” on the part of the neutralists (confusing the part with the whole). Chairman Genachowski said that the Internet has never used QoS, which is sorta true and sorta not, depending on which part you examine and during what period of time. QoS is certainly part of the architecture of the Internet and the subject of a number of RFCs. Genachowski’s response indicates that he’s not nearly as well briefed as Walden on the subject of Internet norms and standards, which is unfortunate. The FCC is supposed to be the technical expert, after all, while the Congress is supposed to be the policy arbiter. These two bodies need to realign their roles with their actual statutory relationship, and the best way to do that is for the Congress to update the Communications Act, which is going to take a while.
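Since the point about QoS being baked into the architecture tends to get lost, here’s a minimal sketch, assuming a Linux host and a placeholder destination from the documentation address range, of how an ordinary application asks for the Expedited Forwarding per-hop behavior defined in the DiffServ RFCs (2474 and 3246). Whether routers along the path honor the marking is an operator decision, which is exactly the policy question on the table.

```python
# A minimal sketch, assuming a Linux host; the destination address is a
# placeholder from the 192.0.2.0/24 documentation range.
import socket

DSCP_EF = 46            # Expedited Forwarding code point (RFC 3246)
TOS_EF = DSCP_EF << 2   # DSCP occupies the top six bits of the old TOS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_EF)

# Every datagram sent on this socket now carries the EF marking in its IP
# header; routers along the path may queue it ahead of best-effort traffic
# or ignore the marking entirely. That choice is policy, not protocol.
sock.sendto(b"probe", ("192.0.2.1", 9))
```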

One thing that is certain is that the passage of the FCC’s Open Internet order didn’t clear the air regarding net neutrality; the plague of the greatest public policy boondoggle of the 21st century is still very much with us.