When usage, delay tolerance, and loss tolerance are all unknowns, service quality defaults to an unknown level as well. While this simplifies billing, it does no justice to the needs of applications, innovation, or investment.
A side effect of switching from the current billing model to a quality-based one is that the unproductive net neutrality debate summarily ends: when users control the end-to-end quality of each application transaction, the means the provider uses to deliver that quality no longer matter.
Even when the figures for 2016 are taken into account, the numbers show very clearly that Open Internet Orders are a drag on the rate of broadband improvement in the US, and that the Title II order did more damage than the 2010 Title I order.
We want our broadband speeds to improve. The data show that the best way to make that happen is to challenge open Internet orders, especially those that classify broadband Internet service under Title II.
What the FCC can do is help keep large swathes of the American population from falling behind, and it can do this by saying yes to network deployment and innovation. A good first step in that process is to let go of the vacuous "virtuous cycle" of networks-plus-apps innovation; that argument is illogical.
We need the ability to offer virtual services that use software-defined networking to merge and coordinate diverse applications over the common Internet resource pool. But the regulatory problem needs to be solved by Congress and the FCC before the engineering can turn that capability into real services.
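To make the idea concrete, here is a minimal sketch of what per-application quality declarations might look like. Everything in it is an illustrative assumption, not an existing API: the profile names, the numeric tolerances, and the `admit` helper are all hypothetical. The point is simply that each application declares its own usage, delay tolerance, and loss tolerance, and the network checks whether a given path in the shared pool can satisfy them:

```python
from dataclasses import dataclass

# Hypothetical per-application quality profile: the application, not the
# carrier, declares what it needs from the shared resource pool.
@dataclass(frozen=True)
class QualityProfile:
    max_delay_ms: int      # delay tolerance
    max_loss_pct: float    # loss tolerance
    expected_mbps: float   # usage

# Illustrative profiles for a few application types (assumed values).
PROFILES = {
    "voip":      QualityProfile(max_delay_ms=50,   max_loss_pct=1.0, expected_mbps=0.1),
    "gaming":    QualityProfile(max_delay_ms=30,   max_loss_pct=0.5, expected_mbps=1.0),
    "streaming": QualityProfile(max_delay_ms=500,  max_loss_pct=2.0, expected_mbps=5.0),
    "bulk_sync": QualityProfile(max_delay_ms=5000, max_loss_pct=5.0, expected_mbps=20.0),
}

def admit(app: str, path_delay_ms: int, path_loss_pct: float) -> bool:
    """Would a path with these measured characteristics satisfy the app?"""
    p = PROFILES[app]
    return path_delay_ms <= p.max_delay_ms and path_loss_pct <= p.max_loss_pct
```

Under a quality-based model, a check like this, rather than a blanket no-discrimination rule, is what would decide how traffic is handled: a 1000 ms path is fine for bulk synchronization but disqualifying for voice.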
The nice thing about focusing on wireless for the final leg of the extended broadband system is that it doesn't duplicate effort or waste money. Despite the glory of fiber optic networks, people want mobility, so wireless is going to be part of the solution regardless. Why don't we just accept that and concentrate on building the best wireless networks first, filling in with fiber only when and where it's truly needed?