How to Kill Wi-Fi

As we all know, Wi-Fi is a killer technology that’s been enormously important to innovators, consumers, and the Internet. It’s a very humble technology that only tries to replace Ethernet cables as a means of connecting to a local network, however. Despite this humble goal, Wi-Fi is an enormous success; or perhaps it’s a success because its goal is so humble. Wi-Fi has never sought to be all things to all people: In particular, Wi-Fi has never tried to replace the wired Internet, the telephone network, cable TV, or even the cellular network. It simply obviates the need to pull cables to every place where you might use a laptop, iPad, or smartphone, but that turns out to be a very powerful thing.

Bluetooth is a similar technology, one that has an even more limited range than Wi-Fi and one that has even more devices and users in play. Billions of people use these two unlicensed, low-power wireless systems.

Wi-Fi’s success was never inevitable, however. The regulatory policy that made Wi-Fi possible could easily have gone a different way: the FCC could have required some sort of license to use Wi-Fi, it could have taxed it, or it could have imposed a technology mandate on the Wi-Fi protocols (as regulators in Europe did with cell phones, thereby reducing the Continent to a follower rather than a leader in rolling out new generations of technology). But the easiest and surest way for the FCC to have strangled Wi-Fi in its crib would have been to allow a higher transmit power level.

At any radio frequency, more power means more coverage. If you increase the transmit power level, radio waves travel farther before they spread so much that the signal disperses into noise.

The implications of stronger signals aren’t hard to understand if you’re willing to think about Wi-Fi from a system perspective. Some folks tout systems that would increase Wi-Fi power, commonly using the term “Super Wi-Fi” for them. Advocates of unlicensed use of the TV White Spaces (TVWS) make this error quite frequently, but any time you see or hear “Super Wi-Fi” you should check your wallet, because you’re about to be ripped off.

“Super Wi-Fi” is actually a system that would prevent Wi-Fi from working in most cases.

When you think through the implications of boosting the signal power level, this becomes plainly obvious. The main purpose of Wi-Fi, once again, is replacing an Ethernet cable. The specification for Ethernet cabling says a cable can’t be more than 300 feet long. There are a lot of technical reasons for this, but the original decision – which I was a part of – picked 300 feet because we found that the average maximum length of office telephone wire runs was no more than 200 feet. In other words, most office workers sit within 200 feet of a wiring closet where the cables for their desk phones terminate. Increasing the spec by 50% meant that we could be reasonably sure that any office that could be wired for phones could just as easily be wired for Ethernet.

In fact, the original idea for Ethernet over Twisted Pair (StarLAN) used extra telephone wire pairs instead of Cat. 5 Ethernet cable. The speed of StarLAN was limited to 1 Mbps, so it was quickly replaced by 10BASE-T at 10 Mbps over new wires and a series of other standards up to hundreds of gigabits per second. So the layout of office buildings made the rules for Ethernet and Wi-Fi, just as the width of Roman ox carts made the rules for the railways 2000 years after the ox carts were designed.

Designing Wi-Fi for 300 feet of propagation in any direction from the access point allows the spectrum it uses to be reused every 600 feet. If my access point covers a theoretical 300-foot radius, I can connect from just about any place within a circle 600 feet in diameter. Outside my circle, someone else can operate their AP on the same channel as mine without either of us interfering with the other, and so on and so on.

Each access point covers 300,000 square feet or less. In the 50th most densely populated American city (Pasadena, CA) there are 5700 people per square mile or one person per 4800 square feet. So on average, a Wi-Fi access point covers 60 people. They’re not all Wi-Fi users – some 30 percent are not online at all, and others are toddlers – and those who are Wi-Fi users don’t use Wi-Fi at the same time. The average number of users connected to any Wi-Fi access point is less than one, in fact. During peak usage time, the number of users per channel is probably 10 or so, about a quarter of the potential users.
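This arithmetic can be checked with a short back-of-the-envelope script (a sketch only; the Pasadena density, the 30 percent offline share, and the quarter-active peak are the round numbers used above):

```python
import math

# Coverage of one access point: a circle of 300-foot radius.
radius_ft = 300
area_sq_ft = math.pi * radius_ft ** 2            # ~283,000 sq ft ("300,000 or less")

# Pasadena, CA: ~5,700 people per square mile.
sq_ft_per_person = 5280 ** 2 / 5700              # ~4,900 sq ft per person

people_per_ap = area_sq_ft / sq_ft_per_person    # ~58 people, call it 60

# Roughly 30 percent are not online at all.
potential_users = people_per_ap * 0.7            # ~40 potential users per channel
peak_active = potential_users * 0.25             # ~10 active at peak

print(round(people_per_ap), round(potential_users), round(peak_active))  # → 58 40 10
```
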

So Wi-Fi is a system designed to serve a potential user population of 40 people per channel, with no more than 10 active at a time. In practice it can go higher, but performance suffers as more people share the spectrum and have to wait for each other. In urban apartments, prime time Wi-Fi usage in the 2.4 GHz band is pretty horrific.

Now what happens if Wi-Fi signals could travel 3,000 feet instead of 300? This would simply require Wi-Fi to transmit at 1 watt: half the power of cellular GSM at 900 MHz (two watts), or ten times as much power as Wi-Fi uses in the 2.4 GHz band (802.11b/g/n) today, a tenth of a watt. That kind of level seems to be what the TVWS people want.

This doesn’t simply pump up the number of potential users to 400. In this scenario, we’re looking at 28 million square feet, 4,000 people, or 1,000 active users during prime time. If 100 users have a hard time sharing a Wi-Fi channel, how much joy would 1,000 users have? You don’t need to be a rocket scientist to see that this would be a problem.

If you wanted to design a wireless network that shares spectrum among 4,000 potential users, with 1,000 active at a time, you’d need 100 times more spectrum per channel than Wi-Fi has. “Super Wi-Fi” doesn’t have more spectrum per channel than Wi-Fi; it has a quarter as much, 5 MHz vs. 20. There’s obviously nothing super about such a system: 100 times more users in a quarter of the spectrum doesn’t translate into peace and harmony.
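The lopsidedness of that trade can be sketched in a few lines (assumptions as above: roughly 4,800 square feet per person, 70 percent of people online, a 20 MHz Wi-Fi channel, and the 5 MHz TVWS channel figure used in the text):

```python
import math

def potential_users(radius_ft, sq_ft_per_person=4800, online_share=0.7):
    """Potential online users inside one access point's coverage circle."""
    return math.pi * radius_ft ** 2 / sq_ft_per_person * online_share

wifi_users = potential_users(300)      # ~41: the "40 per channel" case above
super_users = potential_users(3000)    # 10x the radius -> 100x the users

wifi_hz_per_user = 20e6 / wifi_users   # 20 MHz Wi-Fi channel
super_hz_per_user = 5e6 / super_users  # 5 MHz "Super Wi-Fi" channel

# 100x the users in a quarter of the spectrum: 400x less spectrum per user.
print(round(wifi_hz_per_user / super_hz_per_user))  # → 400
```

The 400x figure falls directly out of the two ratios: area (and so users) scales with the square of the radius, while channel width shrinks by a factor of four.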

The opportunity, however, is to define a system that uses more power specifically to cover rural areas with less population density. The critical factor turns out to be people per Hertz, and we know there aren’t as many people in the boondocks as in Pasadena. High-power spectrum for rural areas would be a winner because it would enable people who come to the Internet over satellite today to get there from a low-latency earth-based system. Super-sized Wi-Fi is not a generally useful tool in areas of greater population density, however.

So don’t be fooled. The surest way for the FCC to make Wi-Fi fail would be to increase its allowed transmit power or to increase its coverage by giving it access to “beach front” 700 MHz spectrum.

Sometimes it’s better to pursue humble goals successfully instead of grandiose ones that don’t make any sense in terms of basic arithmetic.

  • Malcolm

    Very nicely put. Doing one (modest) thing, and doing it well, is really good. Thank you!

  • Jeffrey

I think you are missing the point of “Super WiFi”. It’s not really WiFi, and it’s likely not even going to show up in home access points in the near future. Because of its low frequency (and thus long propagation range), it’s actually well suited for backhaul. Imagine a WiFi access point that used “Super WiFi” as its connection to the ISP. That’s a very practical application — check out for an example.

  • Did you read the last three paragraphs, Jeffrey? I specifically said: “The opportunity, however, is to define a system that uses more power specifically to cover rural areas with less population density…High-power spectrum for rural areas would be a winner because it would enable people who come to the Internet over satellite today to get there from a low-latency earth-based system.”

    That’s what the Kenya system is all about. Now that we’ve established that high power systems are good for rural areas, we can move on to the discussion about which licensing model will work best for them and how much spectrum should be allocated. It seems to me that some form of licensing will work best for enabling investors to build rural backhaul for a number of reasons.

    You’re right that Super Wi-Fi is nothing at all like Real Wi-Fi. That’s why I’m so annoyed with the people who use that term.

  • That’s less informational than suppositional, Brough. Why do you suppose Super Wi-Fi will ever be a credible alternative to 4G? None of the people who are trying to build TVWS systems have that as a goal; they’re actually talking about low speed M2M networks and things like that. See Webb’s report in the Aspen Institute Roundtable on Spectrum Policy:

  • Steve Crowley

I agree TVWS based on the high-power IEEE 802.22 standard is best suited for rural areas, and is not much like 4G. Industry Canada has done some analysis confirming this. For very sparsely populated areas, satellite will be better. For suburban and more densely populated areas, there are already other established technologies, and there aren’t that many TV channels available anyway.

    A twist on TVWS is the emerging IEEE 802.11af standard. That may well end up being certified as “Wi-Fi” by the Wi-Fi Alliance. Lower frequencies will propagate better through walls, etc. — problem is (as I wrote here a few days ago) interference propagates even better than coverage at lower frequencies (each wall attenuates it less), and I don’t think one can match the AP densities that can be achieved with 2.4 and 5 GHz Wi-Fi systems. For some applications when coverage is more important than capacity, fine.

Lastly, though it’s not TVWS, there’s talk in the public interest community of crafting a 600 MHz incentive-auction band plan that would have 20 or 30 MHz of contiguous nationwide white space. I expect that spectrum grab won’t get far as part of the incentive auction proceeding. One could craft a 4G system in that spectrum, but I’m not sure what the point would be with other 4G systems nearby in frequency.

  • Skip Pizzi

Still don’t understand the title (or the subhead). While the post does a good job of contrasting TVWS with WiFi, the titling could lead some readers to think that future use of TVWS will somehow hurt existing WiFi service (which is rightly never claimed in the piece per se, but some less careful readers may still take that away based on the title). Maybe a better title would have been “How We Could Have Killed WiFi” or “Why TVWS Isn’t ‘Super WiFi’” — but those might not have gotten me to read it 😉

  • Richard, thanks for pointing out the inherent performance limitations of unlicensed white space networks that employ large cells. One additional issue I see with these large cells is their inefficient use of spectrum. The large cells run completely counter to today’s mobile broadband architectures that emphasize many small cells to maximize capacity. Quite simply, denser cells translate to higher Gbps/square mile/MHz. In this regard, large cell white space networks are inherently inefficient.

Ironically, regular Wi-Fi (non-“Super Wi-Fi”) is efficient because it is a local-area technology, and thus enables extremely high frequency reuse.

    I explore other problems with white space networks in this piece for GigaOM, “White spaces networks are not “super” nor even Wi-Fi,”

  • Re “Now what happens if Wi-Fi signals could travel 3,000 feet instead of 300?”: The RWTH Aachen guys address the inter-AP interference constraints in their 2011 ICC paper “Wi-Fi, but not on Steroids: Performance Analysis of a Wi-Fi-like Network Operating in TVWS under Realistic Conditions”

    Their findings are in line with what Richard’s arguing, I think: “Our results show that operating Wi-Fi hotspots in TVWS might be technologically attractive for outdoor rural areas where user demand is low. However, the extended coverage range in TVWS leads to increased congestion which rapidly limits the system capacity for an outdoor urban deployment with high user density.”