How to Kill Wi-Fi

As we all know, Wi-Fi is a killer technology that’s been enormously important to innovators, consumers, and the Internet. Yet it’s a very humble technology: all it tries to do is replace the Ethernet cable as a means of connecting to a local network. Despite this humble goal, Wi-Fi is an enormous success; or perhaps it’s a success because its goal is so humble. Wi-Fi has never sought to be all things to all people: in particular, it has never tried to replace the wired Internet, the telephone network, cable TV, or even the cellular network. It simply obviates the need to pull cables to every place where you might use a laptop, iPad, or smartphone, and that turns out to be a very powerful thing.

Bluetooth is a similar technology, with an even more limited range than Wi-Fi and even more devices and users in play. Billions of people use these two unlicensed, low-power wireless systems.

Wi-Fi’s success was never inevitable, however. The regulatory policy that made Wi-Fi possible could easily have gone a different way: the FCC could have required some sort of license to use Wi-Fi, it could have taxed it, or it could have imposed a technology mandate on the Wi-Fi protocols (as regulators in Europe did with cell phones, thereby reducing the Continent to a follower rather than a leader in rolling out new generations of technology). The easiest and surest way for the FCC to have strangled Wi-Fi in its crib would have been to allow a higher transmit power level.

At any radio frequency, more power means more coverage. If you increase the transmit power level, radio waves travel farther before they spread so much that the signal disperses into noise.
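
If you want to see that relationship in numbers, here’s a quick Python sketch under an idealized free-space (inverse-square) propagation model; the `relative_range` helper and the path-loss exponents are illustrative, not taken from any standard:

```python
# Back-of-the-envelope: how far a signal reaches as transmit power grows,
# assuming idealized free-space (inverse-square) propagation. Real walls
# and floors attenuate faster, but the direction of the effect is the same.

def relative_range(power_ratio: float, path_loss_exponent: float = 2.0) -> float:
    """Range multiplier when transmit power is multiplied by power_ratio.

    Received power falls off as P_tx / d**n, so holding the received
    power constant, range scales as power_ratio ** (1/n).
    """
    return power_ratio ** (1.0 / path_loss_exponent)

for ratio in (2, 10, 100):
    free_space = relative_range(ratio)       # n = 2, open air
    indoors = relative_range(ratio, 3.5)     # n of 3-4 is typical with walls
    print(f"{ratio:>3}x power -> {free_space:.1f}x range (free space), "
          f"{indoors:.1f}x (indoors)")
```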

The implications of stronger signals aren’t too hard to understand if you’re willing to think about Wi-Fi from a system perspective. Some folks tout systems that would increase Wi-Fi power, commonly using the term “Super Wi-Fi” for them. Advocates of unlicensed use of the TV White Spaces (TVWS) are especially prone to this error, but any time you see or hear “Super Wi-Fi” you should check your wallet, because you’re about to be ripped off.

“Super Wi-Fi” is actually a system that would prevent Wi-Fi from working in most cases.

When you think through the implications of boosting the signal power level, this should become plainly obvious. The main purpose of Wi-Fi, once again, is replacing an Ethernet cable. The specification for Ethernet cable says it can’t be more than 300 feet long. There are a lot of technical reasons for this, but the original decision – which I was a part of – picked 300 feet because we found that the average maximum length of office telephone wire runs was no more than 200 feet. In other words, most office workers sit within 200 feet of a wiring closet where the cables for their desk phones terminate. Increasing the spec by 50% meant we could be reasonably sure that any office that could be wired for phones could just as easily be wired for Ethernet.

In fact, the original idea for Ethernet over Twisted Pair (StarLAN) used extra telephone wire pairs instead of Cat 5 Ethernet cable. StarLAN was limited to 1 Mbps, so it was quickly replaced by 10BASE-T at 10 Mbps over new wires, and then by a series of faster standards reaching hundreds of gigabits per second. So the layout of office buildings made the rules for Ethernet and Wi-Fi, just as the width of Roman ox carts made the rules for the railways 2,000 years after the ox carts were designed.

Designing Wi-Fi for 300 feet of propagation in any direction from the access point allows the spectrum it uses to be reused every 600 feet. If my access point covers a theoretical 300-foot radius, I can connect from just about anywhere within a 600-foot-diameter circle. Outside my circle, someone else can operate an AP on the same channel as mine without interfering with me, and so on.

Each access point covers about 283,000 square feet – a 300-foot-radius circle – call it 300,000 or less. In the 50th most densely populated American city (Pasadena, CA) there are 5,700 people per square mile, or about one person per 4,900 square feet. So on average, a Wi-Fi access point covers roughly 60 people. They’re not all Wi-Fi users – some 30 percent are not online at all, and others are toddlers – and those who are Wi-Fi users don’t all use Wi-Fi at the same time. The average number of users connected to any Wi-Fi access point is less than one, in fact. During peak usage time, the number of users per channel is probably 10 or so, about a quarter of the potential users.
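
The arithmetic is easy to check. Here’s a short Python sketch of it; the 70 percent online share and the one-quarter prime-time share are the rough assumptions above, not measured data:

```python
import math

# Checking the coverage arithmetic. The density figure is Pasadena's
# (~5,700 people per square mile); everything else follows from circle
# geometry and the text's rough ratios.

RADIUS_FT = 300                                    # Wi-Fi design range
SQFT_PER_SQMI = 5280 ** 2                          # 27,878,400

coverage_sqft = math.pi * RADIUS_FT ** 2           # ~283,000 sq ft
sqft_per_person = SQFT_PER_SQMI / 5700             # ~4,900 sq ft
people_per_ap = coverage_sqft / sqft_per_person    # ~58 people
potential_users = people_per_ap * 0.7              # ~40 (70% online)
peak_active = potential_users * 0.25               # ~10 at prime time

print(f"coverage:        {coverage_sqft:,.0f} sq ft")
print(f"people per AP:   {people_per_ap:.0f}")
print(f"potential users: {potential_users:.0f}")
print(f"peak active:     {peak_active:.0f}")
```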

So Wi-Fi is a system designed to serve a potential user population of about 40 people per channel, with no more than 10 active at a time. In practice it can go higher, but performance suffers as more people share the spectrum and have to wait for each other. In urban apartments, prime-time Wi-Fi usage in the 2.4 GHz band is pretty horrific.

Now what happens if Wi-Fi signals could travel 3,000 feet instead of 300? This would simply require Wi-Fi to transmit at 1 watt – half the power of cellular GSM at 900 MHz (two watts), or ten times as much power as Wi-Fi uses in the 2.4 GHz band (802.11b/g/n) today, a tenth of a watt. That kind of power level seems to be what the TVWS people want.

This doesn’t simply pump up the number of potential users tenfold, to 400: coverage area grows with the square of the radius. In this scenario we’re looking at about 28 million square feet, roughly 5,800 people, 4,000 of them potential users, and 1,000 active during prime time. If 100 users have a hard time sharing a Wi-Fi channel, how much joy would 1,000 users have? You don’t need to be a rocket scientist to see that this would be a problem.
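
For the scaled-up scenario, the same sketch with only the radius changed shows where these numbers come from:

```python
import math

# Same arithmetic with a 3,000-foot radius: range grows 10x, but the
# covered area (and the population sharing the channel) grows 100x.

RADIUS_FT = 3000
SQFT_PER_PERSON = 4900                         # Pasadena-ish, from above

coverage_sqft = math.pi * RADIUS_FT ** 2       # ~28.3 million sq ft
people = coverage_sqft / SQFT_PER_PERSON       # ~5,800
potential_users = people * 0.7                 # ~4,000 (70% online)
peak_active = potential_users * 0.25           # ~1,000 at prime time

print(f"coverage:        {coverage_sqft / 1e6:.1f} million sq ft")
print(f"potential users: {potential_users:,.0f}")
print(f"peak active:     {peak_active:,.0f}")
```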

If you wanted to design a wireless network that shares spectrum among 4,000 potential users, with 1,000 active at a time, you’d need 100 times more spectrum than we have for each Wi-Fi channel. “Super Wi-Fi” not only doesn’t have more spectrum per channel than Wi-Fi, it has a quarter as much: 5 MHz vs. 20 MHz. There’s obviously nothing super about such a system: 100 times more users in a quarter of the spectrum doesn’t translate into peace and harmony.
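
Put side by side, the two systems compare like this in terms of spectrum per active user (the channel widths and user counts are the figures used above):

```python
# Spectrum per active user: a 20 MHz Wi-Fi channel shared by ~10
# prime-time users vs. a 5 MHz "Super Wi-Fi" channel shared by ~1,000.

wifi_hz_per_user = 20e6 / 10         # 2 MHz per active user
super_hz_per_user = 5e6 / 1000       # 5 kHz per active user

print(f"Wi-Fi:         {wifi_hz_per_user / 1e6:.1f} MHz per active user")
print(f"'Super Wi-Fi': {super_hz_per_user / 1e3:.0f} kHz per active user")
print(f"ratio:         {wifi_hz_per_user / super_hz_per_user:.0f}x worse")
```

That’s the 100-times-more-users penalty multiplied by the four-times-less-spectrum penalty: 400 times less spectrum per active user.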

The opportunity, however, is to define a system that uses more power specifically to cover rural areas with lower population density. The critical factor turns out to be people per hertz, and there aren’t as many people in the boondocks as in Pasadena. High-power spectrum for rural areas would be a winner because it would let people who reach the Internet over satellite today get there through a low-latency earth-based system instead. Super-sized Wi-Fi is not a generally useful tool in areas of greater population density, however.
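
Turning the arithmetic around gives a rough, back-of-the-envelope idea of where high power breaks even; under the same illustrative assumptions as above, the answer works out to a density of roughly 56 people per square mile:

```python
import math

# Flip the question: how sparse must an area be before a 3,000-foot
# cell sees no more contention than a 300-foot Wi-Fi cell in Pasadena?
# Same illustrative assumption as above: 70% of people are online.

coverage_sqft = math.pi * 3000 ** 2            # ~28.3 million sq ft
target_potential_users = 40                    # the Wi-Fi design point
people_in_cell = target_potential_users / 0.7  # ~57 people
density_per_sqmi = people_in_cell / coverage_sqft * 5280 ** 2

print(f"break-even density: ~{density_per_sqmi:.0f} people per square mile")
# ~56 people per square mile, which is genuinely rural territory.
```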

So don’t be fooled. The surest way for the FCC to make Wi-Fi fail would be to increase its allowed transmit power, or to increase its coverage by giving it access to “beachfront” 700 MHz spectrum.

Sometimes it’s better to pursue humble goals successfully than to chase grandiose ones that don’t make sense in terms of basic arithmetic.