Why LTE Unlicensed Outperforms Wi-Fi

The hottest issue in unlicensed wireless spectrum policy is the conflict between Wi-Fi and LTE Unlicensed (“LTE-U”). Some advocates are pressing for restrictions on LTE-U, even going so far as to claim that LTE-U is a “land-grab” intended to take spectrum away from Wi-Fi-based home networks. It’s worthwhile to put these issues into technical context so we can separate the attention-grabbing headlines from the technical issues.

Wi-Fi and LTE-U are two systems that use unlicensed radio frequencies; other systems that use unlicensed spectrum include cordless phones, baby monitors, wireless microphones, garage door openers, automotive collision-avoidance systems, fitness monitors, and Bluetooth. The largest user of unlicensed spectrum is Bluetooth, followed by Wi-Fi and cordless phones.

There are several frequency bands assigned for unlicensed use: 5 kHz, 900 MHz, 2.4 GHz, 3.6 GHz, 5-6 GHz, and a number of bands at 10 GHz and above. Most tend to be used by similar applications, but there are subtle differences: Polar’s heart rate monitors used by swimmers use 5 kHz because it propagates through water, but its running and cycling sensors use the more popular 2.4 GHz band favored by Wi-Fi, Bluetooth, Polar’s proprietary WIND protocol, and Garmin’s ANT protocol.

Some fitness monitors are dual band so they can work with a variety of fitness computers and watches: the Polar H2 for example. Other manufacturers produce single band sensors that support multiple protocols, such as Bluetooth and Garmin ANT+: Wahoo is active in this space, for example.

Coexistence with other systems and protocols has long been an issue for unlicensed wireless systems: the two big kahunas, Bluetooth and Wi-Fi, operate on the same 2.4 GHz band and can cause interference problems for each other when they’re on the same channel. So both have had to adopt coexistence measures:

 [Bluetooth] 1.2 has made some enhancements to enable coexistence, including Adaptive Frequency Hopping (AFH) and optimizations such as Extended SCO channels for voice transmission within BT. With AFH, a BT device can indicate to the other devices in its piconet about the noisy channels to avoid. Wi-Fi optimization includes techniques such as dynamic channel selection to skip those channels that BT transmitters are using. Access points skip these channels by determining which channels to operate over based on the signal strength of the interferers in the band. Adaptive fragmentation is another technique that is often used to aid optimization. Here, the level of fragmentation of the data packets is increased or reduced in the presence of interference. For example, in a noisy environment, the size of the fragment can be reduced to reduce the probability of interference.
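
To make the adaptive fragmentation idea in that passage concrete, here is a minimal sketch in Python. The function names and the error-rate thresholds are mine, invented for illustration; they don’t come from the Bluetooth or Wi-Fi specifications. The point is simply that smaller fragments spend less time on the air, so each one is less likely to overlap an interfering burst.

    def choose_fragment_size(frame_error_rate, max_fragment=1500, min_fragment=256):
        # Illustrative thresholds: shrink fragments as the observed error rate climbs.
        if frame_error_rate < 0.05:        # clean channel: full-size fragments
            return max_fragment
        if frame_error_rate < 0.20:        # moderate interference: halve the airtime per fragment
            return max_fragment // 2
        return min_fragment                # noisy channel: minimize exposure per fragment

    def fragment(payload, frame_error_rate):
        # Split a payload into fragments sized for the current channel conditions.
        size = choose_fragment_size(frame_error_rate)
        return [payload[i:i + size] for i in range(0, len(payload), size)]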

Neither Wi-Fi nor Bluetooth was required by law to do this; the manufacturers added these features because they make their products more reliable and attractive to potential users. The standards makers for these systems tend to collaborate with each other to resolve coexistence issues in the most effective way; once again, this isn’t required. Anyone is entitled to use the spectrum with an FCC-approved device, but it makes business sense to sell reliable products. Winning FCC approval is simply a matter of conforming to transmit power regulations; there are no FCC rules for Media Access Control (“MAC”) protocols, the chief distinguishing feature between Wi-Fi and Bluetooth.

MAC protocols determine how spectrum is shared between users, so they’re often described as “sharing etiquette.” Most unlicensed MACs for short-distance networks (up to 1,000 feet or so) are “listen before talk” regimes that simply try to avoid colliding with transmissions already in progress.
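
The basic listen-before-talk loop can be caricatured in a few lines of Python. This is a sketch under simplifying assumptions, not the 802.11 state machine: the channel object, its busy() and transmit() methods, and the timing constants are all stand-ins for the real carrier-sense and contention-window rules.

    import random
    import time

    SLOT_TIME_US = 9           # illustrative slot duration, in microseconds
    LISTEN_INTERVAL_US = 34    # illustrative quiet period required before transmitting

    def listen_before_talk(channel, frame, max_backoff_slots=15):
        # Transmit only after the channel has been idle long enough, deferring a
        # random number of slots so that waiting stations don't all collide at once.
        while True:
            if channel.busy():                      # someone else is talking: keep listening
                time.sleep(LISTEN_INTERVAL_US / 1e6)
                continue
            backoff = random.randint(0, max_backoff_slots)
            time.sleep((LISTEN_INTERVAL_US + backoff * SLOT_TIME_US) / 1e6)
            if not channel.busy():                  # still idle after the deferral: go
                channel.transmit(frame)
                return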

Hidden Nodes and Pesky Laws of Physics

Because radio waves don’t propagate instantly – they’re no faster than light, obviously – it’s possible for your phone to be transmitting to the same Wi-Fi access point as mine without my phone being able to hear your phone. This can happen if your transmission has just started and your signal hasn’t reached me yet, and it can also happen if you and I are both on the fringe of the access point’s reception area at opposite extremes, as we would be if you were due north and I were due south with the access point in the middle.

This latter scenario is called the “hidden node” problem and Wi-Fi has specialized protocols to deal with it, the RTS/CTS and CTS-to-self procedures that you might have seen on a Wi-Fi configuration screen.
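
The logic behind RTS/CTS is easy to sketch: before sending a long frame, a station asks the access point for clearance, and because every station in the cell can hear the access point’s reply, even a hidden node knows to keep quiet for the stated duration. The classes below are illustrative inventions, not anything lifted from the 802.11 standard.

    class AccessPoint:
        # The central node that every station in the cell can hear.
        def __init__(self):
            self.stations = []

        def clear_to_send(self, requester, airtime_us):
            # Broadcasting the CTS tells all stations, including ones that cannot
            # hear the requester directly, to defer for the reserved airtime.
            for station in self.stations:
                if station is not requester:
                    station.defer_for_us = airtime_us
            return True

    class Station:
        def __init__(self, access_point):
            self.access_point = access_point
            self.defer_for_us = 0
            access_point.stations.append(self)

        def send(self, frame, airtime_us):
            # RTS: ask the access point to reserve the medium for this frame.
            if self.access_point.clear_to_send(self, airtime_us):
                return "sent %d bytes in %d microseconds" % (len(frame), airtime_us)

CTS-to-self works the same way, except the station broadcasts the reservation on its own behalf rather than waiting for a reply.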

The former scenario – the failure of a signal to propagate to all listeners instantly because of pesky laws of physics – is mitigated by listening long enough for signals from distant devices connected to the same access point to arrive. The listening interval is 10-16 microseconds (millionths of a second) for current Wi-Fi standards. If systems don’t listen for at least this long before transmitting, other Wi-Fi transmitters can fail to receive acknowledgements and be forced to retransmit packets that were correctly received, causing unnecessary congestion.

Increased speeds don’t decrease the listening interval because it’s determined by propagation rather than bit rate. So the interval that was acceptable for 1-2 Mbps Wi-Fi systems is now longer than the time it takes to transmit a maximum-size packet under current standards. This means that Wi-Fi is less than 50% efficient today without a trick that I will get to shortly. For now, let’s just say Wi-Fi is inefficient at high speeds (100 Mbps and above) and long distances (more than 1,000 feet).
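
Rough numbers make the point. Suppose (as an illustration, not a figure from any standard) that each transmission carries a fixed overhead of about 130 microseconds of listening, backoff, preamble, and acknowledgement, and that a packet holds 1,500 bytes. The share of airtime actually spent moving data collapses as the bit rate climbs:

    FIXED_OVERHEAD_US = 130.0   # assumed per-frame cost: listening interval, backoff, preamble, ACK

    def mac_efficiency(bit_rate_mbps, packet_bytes=1500):
        # Fraction of airtime that carries payload at a given bit rate.
        payload_us = packet_bytes * 8 / bit_rate_mbps   # microseconds to clock the bits out
        return payload_us / (payload_us + FIXED_OVERHEAD_US)

    for rate in (2, 11, 54, 100, 600):
        print("%4d Mbps -> %2.0f%% efficient" % (rate, 100 * mac_efficiency(rate)))
    # At 1-2 Mbps the overhead is negligible; at 100 Mbps and above it eats
    # half the airtime or more unless multiple packets share one overhead period.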

This is where LTE-U comes into the picture. Because it’s a variation on the MAC protocol used by mobile phones, it can operate over large distances – miles – without substantial loss of efficiency, and it can also operate more efficiently over short distances at high data rates. It does this because the LTE-U base station acts as coordinator to ensure that spectrum is shared fairly and quickly without unnecessary overhead.
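
Seen as code, the difference is that an LTE-style base station hands out transmission grants instead of letting every device contend for the channel. The sketch below is a naive round-robin scheduler with invented names; real LTE schedulers also weigh channel quality, QoS class, and fairness, but the structural point stands: no airtime is burned on listening, random backoff, or collision recovery.

    from collections import deque

    class BaseStationScheduler:
        # Stand-in for an LTE-U base station allocating capacity slot by slot.
        def __init__(self):
            self.requests = deque()

        def request(self, device_id, bytes_pending):
            # Devices report what they have queued instead of grabbing the channel.
            self.requests.append((device_id, bytes_pending))

        def schedule(self, slot_capacity_bytes):
            # Hand out the next slot's capacity with no on-air contention at all.
            grants = []
            while slot_capacity_bytes > 0 and self.requests:
                device_id, pending = self.requests.popleft()
                granted = min(pending, slot_capacity_bytes)
                grants.append((device_id, granted))
                slot_capacity_bytes -= granted
                if pending > granted:                        # unfinished: back of the line
                    self.requests.append((device_id, pending - granted))
            return grants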

While big companies who have invested heavily in Wi-Fi raise the issue of interference between Wi-Fi and LTE-U, they usually don’t emphasize the more pertinent issue: performance. Even when LTE-U and Wi-Fi are operating on different channels and therefore not interfering with each other, LTE-U will outperform Wi-Fi. This makes LTE-U a more attractive technical offering.

LTE-U has another advantage over Wi-Fi through the use of something called Licensed Assisted Access, or LAA. LAA allows some of the LTE signaling overhead to be carried by licensed spectrum, which greatly improves roaming performance.

When I contributed to the Wi-Fi standards in the 90s and 00s (the trick I mentioned previously that speeds up high-rate transmissions was one of my contributions; it’s called “frame aggregation” because it bundles multiple packets under a single acknowledgement), it was understood that Wi-Fi was not going to do well in mobile scenarios or at high data rates. We were OK with that because a moderate-speed (less than 100 Mbps) nomadic network still had plenty of value. We didn’t try to design a system that would rival cellular because we had a different market niche.
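
Frame aggregation is, in essence, amortization: bundle many packets into one transmission so they share a single overhead period and a single (block) acknowledgement. Extending the illustrative efficiency calculation from earlier, with the same assumed 130-microsecond overhead:

    FIXED_OVERHEAD_US = 130.0   # same illustrative per-transmission overhead as above

    def aggregated_efficiency(bit_rate_mbps, packet_bytes=1500, packets_per_burst=16):
        # Efficiency when a whole burst of packets shares one overhead period.
        payload_us = packets_per_burst * packet_bytes * 8 / bit_rate_mbps
        return payload_us / (payload_us + FIXED_OVERHEAD_US)

    print("600 Mbps, no aggregation:  %2.0f%%" % (100 * aggregated_efficiency(600, packets_per_burst=1)))
    print("600 Mbps, 16-frame bursts: %2.0f%%" % (100 * aggregated_efficiency(600)))
    # Under these assumptions, aggregation lifts efficiency from roughly 13% to about 71%.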

But today’s Wi-Fi operators – the cable companies, Google, and firms like Republic Wireless – want to offer fully mobile, high-speed networks. Their best path forward is probably to embrace LTE-U, both with and without LAA. An alternative is to improve the Wi-Fi Point Coordination Function (PCF), an early attempt to give Wi-Fi some cellular properties that is still part of the Wi-Fi standard. The end result of that exercise would be something very much like LTE-U.

So the path of least resistance is fairly obvious: move up to LTE-U.

Do you agree? Is LTE-U unmistakably the best path forward for future wireless technologies? Or is short shrift being given to emerging Wi-Fi technologies? Sound off in the forum!