Isenberg Defends “Infinite Spectrum Capacity” Claims
Writing on his blog, David S. Isenberg defends the claims about infinite spectrum that he made to New York Times reporter Brian X. Chen:
The pervasive misunderstanding of how we use — and could use — the electromagnetic spectrum was eloquently addressed in layman’s terms by The Myth of Interference, a 2003 Salon article by David Weinberger that I wish every reporter, legislator and wireless policy “expert” would read. The Myth charts a sensible path towards a physically veridical, technology based approach to the use of the electromagnetic spectrum. Central to The Myth’s argument is the fact that current policy, based on the licensing of frequency bands, was shaped by the technology of the 1920s. Today, as The Myth explains, the technologies of wireless communications are vastly expanded, but regulations are still stuck in the 1920s. And the incumbents want to keep them there.
The bad news is that the New York Times article doesn’t cleanly separate myth and analogy from physics and technology. The article mixes literal, physical reality with analogies that foster certain monopoly-preserving conclusions. The article fails to distinguish which is which. It talks about “slices of radio waves,” it confuses licensing with ownership and it treats Wi-Fi offload as if Wi-Fi were an alternative to the use of the electromagnetic spectrum.
Even the headline — “Carriers Warn of Crisis in Mobile Spectrum” — isn’t what the story is about. A more accurate headline would have said, “Experts Question Carrier Spectrum Claims.” The lede is buried in the fourth paragraph: “But is there really a crisis?” The article misses key specifics, such as the fact that in the (now failed) $36 billion T-Mobile merger with AT&T, the spectrum was valued at $6 billion, and the fact that only about 10% of cell towers have fiber-optic backhaul, which means that cellular data capacity can be limited by old DS-1, DS-3 and microwave backhaul technology rather than by wireless electromagnetic receiver overload.
The result is journalism that’s (perhaps unconsciously) shaped by the industry the journalism purports to question. It is as if, when the cellcos claim that 1+1=3 and a few dissident experts — Martin Cooper, David Reed and I — protest that it is 2, the true answer must lie somewhere between 2 and 3.
Don’t get me wrong. I’m glad that the Times is beginning to explore the idea that the spectrum-as-real-estate analogy is a regulatory fiction.
As I have problems with the notion of infinite spectrum and with Reed’s and Isenberg’s claims to “spectrum expertise,” I left the following comment.
I’m not aware of any peer-reviewed academic paper that argues that the information-carrying capacity of RF spectrum is infinite. If there is one, can you please provide a citation?
I’d like to see how to design a system that allows an infinite number of transmitters to communicate with an infinite number of receivers on a particular radio frequency in a given location without interference (the impaired ability of receivers to extract bits from a radio signal), or even simply a very large number of transmitter/receiver pairs. In particular, I’m looking for a coding system that suspends Shannon’s Law.
I’d also like to see information about the credentials of the so-called experts on spectrum cited by this article, as I’m not aware of any academic papers, patent filings, or products that exploit the infinite spectrum concept that can be attributed to the three experts (or to anyone else).
We know that Cooper was one of the eight inventors listed on Motorola’s cellular patents in 1975, and also that Cooper was a co-inventor of ArrayComm’s SDMA patent from the early ’90s, but ArrayComm has not been successful in creating any infinite spectrum systems. SDMA has extensibility limits, as does CDMA. Both achieve simultaneity by subdividing the code space, and that approach does not scale to infinity.
As far as I can tell, neither Reed nor Isenberg has worked with actual spectrum systems in either a research or a product development setting, but I’d love to see evidence to the contrary in the form of academic papers, patents, or products.
Finally, I don’t understand why the claim is made that the carriers invented the spectrum crisis. The National Broadband Plan forecast a spectrum crunch two years ago, and not only did that forecast not come from the carriers, but Isenberg was a member of the team that produced it.
It seems a bit more prudent to base spectrum policy on science than on the speculation of non-experts writing in the popular press. Call it an elitist bias if you wish, but I think science is worthwhile.
So far Isenberg hasn’t responded.
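For readers who want the underlying math: the limit I’m invoking is the Shannon–Hartley theorem, which caps the error-free capacity of any single radio link at

$$ C = B \log_2\!\left(1 + \frac{S}{N}\right) $$

where B is the channel bandwidth in hertz and S/N is the signal-to-noise ratio at the receiver. As a purely illustrative example (the numbers here are assumptions, not carrier data), a 20 MHz channel at 30 dB SNR tops out around 20 MHz × log₂(1001) ≈ 199 Mbit/s, and deployed systems fall short of that bound. The code space is just as finite: a set of length-N Walsh–Hadamard spreading codes contains exactly N mutually orthogonal codes, so CDMA-style simultaneity is bounded too. Finite bandwidth, finite SNR and a finite code space mean finite capacity per link, whatever the coding scheme.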
Spectrum can be infinite if we’re willing to shrink transmission ranges and coverage angles toward the infinitely small, which of course requires infinite resources. That doesn’t exactly make for a compelling commercial solution, though it makes for great fiction.
Building an economically and functionally feasible solution is where physics clamps down on what we can and can’t do.
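To make that concrete, here is a minimal back-of-envelope sketch in Python. The bandwidth, SNR and per-cell cost figures are assumptions chosen for illustration only; the point is that aggregate capacity grows by re-using the same channel across more, smaller cells, and the infrastructure bill grows right along with it.

```python
from math import log2

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley bound for a single link: C = B * log2(1 + S/N)."""
    return bandwidth_hz * log2(1 + snr_linear)

# Illustrative assumptions, not carrier data:
BANDWIDTH_HZ = 20e6          # 20 MHz channel
SNR_LINEAR = 1000            # 30 dB signal-to-noise ratio
COST_PER_CELL_USD = 50_000   # hypothetical build-out cost per small cell

per_cell_bps = shannon_capacity_bps(BANDWIDTH_HZ, SNR_LINEAR)  # ~199 Mbit/s

for n_cells in (1, 10, 100, 1_000):
    aggregate_gbps = n_cells * per_cell_bps / 1e9   # re-use: capacity scales with cell count...
    capex_usd = n_cells * COST_PER_CELL_USD         # ...and so does the infrastructure spend
    print(f"{n_cells:>5} cells: {aggregate_gbps:8.2f} Gbit/s aggregate, ${capex_usd:,} capex")
```

The per-link bound never moves; only the number of links does, which is why “infinite” capacity implies an unbounded number of ever-smaller cells and an unbounded budget.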
Martin Cooper responded:
Apparently Richard Bennett is unfamiliar with the concept of re-use of the radio-frequency spectrum. There are known techniques that mitigate interference between spectrum users sufficiently to enable very large increases in spectrum efficiency. These techniques in no way contradict Shannon. I have never claimed infinite spectrum availability; the projected availability is thousands of times what we have now, or more. So long as the availability of spectrum stays ahead of demand, we will have effectively infinite spectrum.
My reply:
Marty, you may not be aware that I’m one of the people who created the Wi-Fi MAC protocol and improved it over the years, so I’ll forgive you for that shot about spectrum re-use. Re-use is the fundamental idea in cellular, going back to the original Bell Labs patents and their predecessors in the realm of railroad communication systems. The Heinrich Hertz mobile telegraph pioneered the concept in the 1890s.
In the 120 years that engineers have been refining this notion, we haven’t discovered a way to provide seamless roaming across large and small distances at gigabit/sec speeds at no cost. There’s been progress, but it comes in the form of reducing restrictions on a timescale that moves at half the speed of Moore’s Law and at a quarter the speed of similar advances in optics.
The question is whether technology alone – in the form of the scheduled replacement of installed equipment – can free up capacity as fast as apps can consume it. So far, the answer is “no.”