Let’s not be distracted by shiny objects anymore. The Internet still has tremendous promise as well as serious problems to solve, and making it better through continuous experimentation should be the top priority.
When usage, delay tolerance, and loss tolerance are all unknowns, the network can only deliver an unknown level of quality. While this simplifies billing, it doesn’t do justice to the needs of applications, innovation, or investment.
A side effect of switching from the current billing model to a quality-based model is that the unproductive net neutrality debate summarily ends. When users have control over the end-to-end quality of each application transaction, the means used by the provider to deliver the desired quality are unimportant.
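To make the idea of application-controlled quality concrete, here is a minimal sketch of one existing mechanism an application can use today to express a per-transaction quality preference: marking its packets with a DiffServ codepoint (DSCP) via the standard socket API. This is only an illustration of the general principle, not a mechanism the author specifies; the network provider remains free to honor or ignore the marking by whatever means it chooses, which is exactly the point made above.

```python
import socket

# Illustrative example (not from the original text): an application
# signals its desired quality class by setting a DiffServ codepoint.
# How the provider delivers that quality is opaque to the application.

DSCP_EF = 46               # "Expedited Forwarding": low delay/loss, e.g. voice
TOS_VALUE = DSCP_EF << 2   # DSCP occupies the top 6 bits of the IP TOS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)

# Read the marking back to confirm it was applied to the socket.
print(sock.getsockopt(socket.IPPROTO_IP, socket.IP_TOS))
sock.close()
```

A latency-sensitive app (voice, gaming) would mark for low delay, while a bulk transfer would not; under a quality-based billing model, that signal, rather than a one-size-fits-all pipe, is what the user would be paying for.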
Even when the figures for 2016 are taken into account, the numbers show very clearly that Open Internet Orders are a drag on the rate of broadband improvement in the US. The numbers also show that the Title II order did more damage than the 2010 Title I order.
We want our broadband speeds to improve. The data show that the best way to make that happen is to challenge open Internet orders, especially those that classify broadband Internet service under Title II.
What the FCC can do is help keep large swathes of the American population from falling behind, and it can do this by saying yes to network deployment and innovation. A good first step in that process is to let go of the vacuous “virtuous cycle” of networks-plus-apps innovation; that argument is illogical.
We need the ability to offer virtual services that use software-defined networking to merge and coordinate diverse applications over the common Internet resource pool. But the regulatory problem needs to be solved by Congress and the FCC before the engineering can create real services.