Faced with a choice between stagnating and growing, Novell chose the status quo. Let’s hope Orem doesn’t repeat the error with UTOPIA. It might have been a great idea in 2002, but the visions many of us had of networking in those days were blind to the progress that was possible in wireless. That was a serious miscalculation.
Before the broadband benchmark is adjusted again, the FCC really does need to lay out a methodology for coming up with the numbers. It appears that the 25/3 standard was driven by Netflix’s desire to stream 4K video everywhere.
The tribal forces of the left appear to be forming a drum circle around the idea that rural broadband is entirely screwed up in the US, so we need to create thousands of broadband co-ops to solve the problem over a few decades. I think we can do a lot better, but only if we can forget about the tribal identities and apply some reasoning informed by facts.
End-to-end is part of Internet history, but so is traffic differentiation. On the one hand, some forms of discrimination at the packet level are constructive. Applications have different needs and it’s good for networks to provide them with the type of service they desire.
If net neutrality is what its supporters say it is – the best overall way of setting expectations and managing Internet service agreements – it should be expected to become self-executing at some point. I think we passed that point about ten years ago, but we will see what we will see.
The nice thing about focusing on wireless for the final leg of the extended broadband system is that it doesn’t duplicate effort or waste money. Despite the glory of fiber optic networks, people want mobility, so wireless is going to be part of the solution regardless. Why don’t we just accept that, concentrate on building the best wireless networks first, and fill in with fiber only when and where it’s truly needed?