Forecasting Spectrum Demand
There’s been a lot of talk lately regarding the looming spectrum crisis in America, with the FCC and the mobile operators talking it up, and some bloggers, analysts, and buffs talking it down. The FCC released a report in October on the national broadband plan’s spectrum forecast, Mobile Broadband: The Benefits of Additional Spectrum, which reported in part on current trends:
…the amount of data used by wireless consumers is increasing substantially – exhibit 1 below shows an increase of over 450% in the amount of data consumed per line between the first quarter of 2009 and the second quarter of 2010.
Here’s the exhibit in question.
The report shows data consumption increasing 425% per year. So one question about spectrum forecasting is how long this 425% annual increase in data consumed per user would continue, provided the spectrum and other resources were available to satisfy it; the quick sketch below shows how fast that rate compounds.
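As a back-of-the-envelope illustration, here’s a minimal sketch in Python. The starting value of 100 MB per line is an invented placeholder (the exhibit’s underlying data isn’t published), and the 425% increase is read here as a 5.25x annual multiple.

```python
# How fast a 425% annual increase compounds. The starting value is a
# hypothetical placeholder, not a figure from the FCC exhibit.
growth = 1 + 4.25      # a 425% increase means 5.25x each year
usage = 100.0          # assumed starting point, MB per line per month

for year in range(1, 6):
    usage *= growth
    print(f"year {year}: {usage:,.0f} MB/line ({growth ** year:,.0f}x baseline)")
```

At that pace consumption multiplies nearly 4,000-fold in five years, which is one reason nobody expects the current rate to hold for long.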
Another is whether the 425% per year rate indicates an upper limit of demand. Many pundits assume that it does, but this is an actual usage number rather than a demand number, so it’s constrained by the ability of infrastructure to meet demand: This is what people actually used, not what they would have used had more capacity been available. To understand demand, we need a more sophisticated data set than this one: something that shows packet loss, abandoned sessions, and actual video stream rates as opposed to potential stream rates, for starters. Nobody is collecting this kind of data, as far as I know, so this figure takes on more importance than it probably should.
What’s behind it? There are three major dimensions:
- How many people are using smartphones?
- What applications are they running, and what range of consumption do these apps want to achieve?
- How much consumption is possible, given infrastructure capacity?
The overall rate of increase in data consumption has a complex relationship with infrastructure as well, because the most important figure to spectrum planners is peak load, not total transfers. I’d be willing to bet that many parts of the country are effectively running at maximum capacity during their high-demand hours, minutes, and seconds, so easing the spectrum crunch depends most on what happens during those periods. I would therefore add a fourth element to the list concerning peak load; the toy sketch after the list shows how these dimensions interact.
- What is the peak load period and how long does it last?
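To make the relationship among these dimensions concrete, here’s a toy model; every number in it is a hypothetical placeholder rather than FCC data. The structural point is the one made above: observed usage is the minimum of offered demand and deliverable capacity, so usage statistics can silently undercount demand.

```python
# A toy decomposition of the four dimensions above. All figures are
# hypothetical placeholders, not FCC data.

def observed_peak_load(users, demand_per_user_mbps, cells, cell_capacity_mbps):
    """Observed usage is capped by capacity: min(offered load, capacity)."""
    offered = users * demand_per_user_mbps     # what users want at the peak
    capacity = cells * cell_capacity_mbps      # what the cells can carry
    return min(offered, capacity)

# Hypothetical market: 50,000 smartphone users offering 0.2 Mbps each at
# the busy hour, served by 300 cells of 30 Mbps apiece.
offered = 50_000 * 0.2
carried = observed_peak_load(50_000, 0.2, 300, 30.0)
print(f"offered {offered:,.0f} Mbps, carried {carried:,.0f} Mbps")
# offered 10,000 Mbps, carried 9,000 Mbps: the stats understate demand
```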
The main issue with a forecast for spectrum is predicting the final state, when demand for spectrum is at its maximum and won’t increase any more; once we have that figure, we can plan a series of steps to get there as quickly as we need to. If we predict that universal smartphone use doesn’t happen until 2030, for example, we don’t need to make spectrum available much before then.
Smartphone use has doubled in each of the past three years, but it can only double about one and a quarter more times without an increase in the birth rate, as FCC data indicate that 42% of Americans now have a smartphone. So we’re shortly going to see the end of data consumption increases driven by the conversion of dumbphone users into smartphone users.
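The arithmetic behind that claim is a one-liner; the only input is the FCC’s 42% penetration figure.

```python
import math

penetration = 0.42                            # FCC figure cited above
doublings_left = math.log2(1 / penetration)
print(f"{doublings_left:.2f} doublings to full penetration")  # ~1.25
```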
Not all smartphones are equally smart, however. BlackBerry users don’t consume as much data as iPhone users, in part because RIM compresses data inside its network, but mainly because BlackBerry users run a different set of applications than iPhone users. BlackBerry people are likely to use their smartphones for email, text messaging, and light web browsing, while iPhone users have the full range of rich media apps at their disposal, such as streaming, video calling, and mobile AR, and these applications consume a great deal of data; apparently Android users consume even more, and people with dongles more still.
So the conversion notion doesn’t just unpack into dumb-to-smart; it goes from dumbphone to text-based smartphone and then on to media-based smartphone. It’s in the media space that data consumption really skyrockets, of course; that’s why AT&T has seen mobile network traffic increase 5,000% over the past three years and Clearwire sees 4G devices using 280 times as much data as dumbphones.
One attempt to find the upper limit would be to assume that everyone is headed toward an Android experience; if we go this way, we get there in a couple of years, at probably no more than 10 times the consumption we have today. There are problems with this guess, however, as it assumes that today’s Android apps represent the ultimate in data consumption forever, not just today; relax that assumption and forecasts of the upper limit range from 10 times to 35 times today’s consumption.
Assuming there is an upper limit, we then have to consider how it gets satisfied. The choices are offloading to Wi-Fi, increasing spectral efficiency, deploying more towers, and increasing available spectrum for 4G and beyond. You can’t offload all of it to Wi-Fi, because Wi-Fi isn’t a mobile technology; it’s a “nomadic” technology that you can use in a number of places, but you have to be stationary to use it at all. Efficiency increases are nice, but they make capacity available more slowly than devices can use it: Spectral efficiency doubles capacity every 30 months, but Moore’s Law doubles demand every 18 months.
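To see how those two doubling periods diverge, here’s a minimal comparison; the only inputs are the 18- and 30-month figures above.

```python
# Demand doubling every 18 months vs. capacity doubling every 30 months.
for years in (1, 3, 5, 10):
    months = 12 * years
    demand = 2 ** (months / 18)       # Moore's-Law-paced demand
    capacity = 2 ** (months / 30)     # spectral-efficiency-paced capacity
    print(f"{years:2d} yr: demand {demand:7.1f}x, "
          f"capacity {capacity:5.1f}x, gap {demand / capacity:4.1f}x")
# After 10 years the gap is 2^(120/18 - 120/30) = 2^2.67, about 6.3x.
```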
This brings us to reducing cell size, a bricks-and-mortar exercise that never gets cheaper. To split a cell, you need two new towers, each with a base station, power, and backhaul. That’s two building permits, each very expensive, and three times as many base stations and fiber runs as before. Given the costs of adding towers versus the costs of everything else, it’s the least attractive alternative in the entire inventory of tricks.
So demand for mobile capacity is growing at a rapid but ultimately unknowable rate. The most cost-effective way to meet the demand is with more 4G spectrum. There’s some uncertainty in the forecast, as there always will be, because we don’t know what applications people will want to run in 2015 that they aren’t running today. The combination of high demand, the costs of meeting it, and the uncertainty in the forecast suggests that auctions are the best way to address the problem, because they have the nice feature of being able to adjust to changes in demand. If it happens that the most valued high-bandwidth app for mobile broadband is live TV, for example, there are at least two ways to address the need: Mobile operators can offer one-to-one streams or multicast, and broadcasters can offer mobile broadcast. The decision about which way to go should be made by users, and auctions enable them to speak. The nice thing about flexible-use incentive auctions is that they don’t stop with the first sale.
We’re in a situation where the demand for mobile broadband is high, and likely to remain so for several years; it’s also uncertain, so there’s a strong need for flexible systems of allocation that can adapt to changing patterns of usage. The more spectrum is available at auction, the better this challenge can be addressed.
Having done this exercise, albeit in back-of-the-envelope fashion, I wonder what answer we would have gotten had we tried to predict the demand for wireline bandwidth in 1999. Spectrum usage seems to follow a curve like that for memory and other computer resources: Applications will find a way to use what’s available, so the only real issue for stimulating innovation is how to make as much as possible available as quickly as possible, in a form that can be adapted to actual usage.
Regarding the four major dimensions, and leaving offloading aside, I’d add traffic management strategies such as differential charging for peak-load periods, prioritized packet scheduling, or caching content during off hours. (Or, under a fixed-rate plan, the operator could adjust the quality of the user experience throughout the billing period so that system resources fit the budget.) These strategies would reduce peak traffic and, through QoS fees, help pay for the added resources needed to meet demand.
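As a sketch of the differential-charging idea, consider the toy example below; the peak window and the per-megabyte rates are invented for illustration, not anyone’s actual tariff.

```python
from datetime import datetime

PEAK_HOURS = range(17, 22)              # assumed busy period: 5pm-10pm
PEAK_RATE, OFF_PEAK_RATE = 0.10, 0.02   # hypothetical $/MB

def charge(megabytes: float, when: datetime) -> float:
    """Bill traffic at a premium during the peak window."""
    rate = PEAK_RATE if when.hour in PEAK_HOURS else OFF_PEAK_RATE
    return megabytes * rate

print(charge(100, datetime(2011, 3, 1, 18)))  # 10.0 at the peak
print(charge(100, datetime(2011, 3, 1, 3)))   # 2.0 off-peak
```

A 5x peak premium like this one gives deferrable traffic, such as backups and software updates, a reason to wait for the off hours.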
I’d like to know more about the data behind that figure, and why consumption dropped for one quarter. When the report came out in October, I checked the reference but didn’t find the underlying data; I’ve tried again now and still don’t see it.
On the subject of Wi-Fi offloading, progress in femtocell technology bears watching, though femtocells may end up better suited to filling coverage holes than to offloading.
The dynamics of supply and demand on asynchronous networks are much more subtle than the dynamics economists generally consider, and I’ve found that all the forecasters have big holes in their understanding of these things. To take an obvious example, Netflix offers movies in multiple resolutions, up to a maximum of 4.8 Mbps. They serve up the one that best fits network conditions (in Canada they allow the user to set an upper bound), which can vary within a session. Their figures indicate that their average speed into AT&T is just over 2 Mbps. So is the demand for Netflix streaming 2 Mbps, or is it 4.8? The forecasters would all say 2, but that’s because they confuse supply with demand.
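Here’s a small sketch of that distinction; the intermediate rungs of the ladder are invented, but the 4.8 Mbps ceiling is the figure cited above. The client serves the best rate the network will sustain, so the delivered average measures supply, while the top rung is much closer to demand.

```python
# Adaptive streaming picks the best rate the link will sustain. Only
# the 4.8 Mbps top rung is from the post; the others are invented.
LADDER_MBPS = [0.5, 1.0, 1.5, 2.0, 3.0, 4.8]

def served_rate(available_mbps: float) -> float:
    """Highest rung that fits current network conditions."""
    fitting = [r for r in LADDER_MBPS if r <= available_mbps]
    return max(fitting) if fitting else LADDER_MBPS[0]

for link in (1.2, 2.4, 6.0):
    print(f"link {link} Mbps -> served {served_rate(link)} Mbps; "
          f"demand is still {LADDER_MBPS[-1]} Mbps")
```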
Isn’t it also true that, in a mobile network, certain applications such as email consume far more spectrum per megabyte of data transferred than, say, a megabyte of video download?