The FCC’s Privacy Nudge

When consumers face a choice that comes with a default, more often than not they simply accept the default. This fact is well known in the behavioral sciences and in common sense alike.

Congressional History with Default Choices

In a New York Times article on the Senate Judiciary Subcommittee on Antitrust, Competition Policy and Consumer Rights’ 2011 hearing on Google’s monopolistic behavior, Steve Lohr explained the monetary value of defaults:

Indeed, Google made a big bet early in its history: In 2002, it reached a deal with AOL, guaranteeing a payment of $50 million to come from advertising revenue if AOL made Google its automatic first-choice search engine — the one shown to users by default. Today, Google pays an estimated $100 million a year to Mozilla, coming from shared ad revenue, to be the default search engine on Mozilla’s popular Firefox Web browser in the United States and other countries. Google has many such arrangements with Web sites.

AOL and Firefox users can choose other search engines, of course; but Google knows most simply accept the default without giving the options a chance. The billions of dollars of payoffs Google has made to secure default status aren’t illegal. Indeed, the hearing focused mainly on the ways that Google biases search results to promote its own services, another practice that’s not illegal on its face.

Choice Architecture

Behavioral scientists are keenly aware of the ways in which the structure of choices affects consumer behavior. As Lohr explains:

The role of defaults in steering decisions is by no means confined to the online world. For behavioral economists, psychologists and marketers, defaults are part of a rich field of study that explores “decision architecture” — how a choice is presented or framed. The field has been popularized by the 2008 book “Nudge,” by Richard H. Thaler, an economist at the University of Chicago and a frequent contributor to the Sunday Business section, and Cass R. Sunstein, a Harvard Law School professor who is now on leave and is working for the Obama administration. Nudges are default choices.

The United Kingdom established a “Nudge Unit” in 2010 to apply Thaler and Sunstein’s insights to policy goals. Understandably, it was praised for its cleverness and criticized for the creepiness factor that always surrounds novelty.

White House Nudge Unit

Not to be left out, the White House Office of Science and Technology Policy hired its own behavioral scientist, Maya Shankar, to head a “Nudge Unit” helping agencies improve their policy effectiveness. Shankar teaches the use of choice architecture to push consumers in the desired directions:

“When I joined the Office of Science and Technology Policy, in the spring of 2013, I came with the goal to translate research insights from these fields into improvements in federal programs and policies, but I soon came to see the immense value of actually trying to institutionalize the continued translation of this research into policy improvements and policy design,” she said. “So I pitched the idea of creating a team of behavioral science and evaluation experts who could provide to agencies conceptual and technical support that would allow them to design and implement rapid, iterative trials that use behavioral insights.”

The government’s nudging power might be used to make us eat more broccoli, to exercise more, and to quit smoking and drinking. It can also be used to protect the economic interests of particular industry sectors over others, or to improve the profitability of particular firms.

FCC’s Nudge Power

Nudging is at the heart of the current controversy over the FCC’s Privacy NPRM. While the agency insists it only wants to empower consumers to control the data transaction policy employed by their Internet Service Providers, it structures the choices in such a way as to unfairly bias the decision. Under the FCC framework, consumers are nudged to favor White House darlings – the Internet “edge providers” who dominate Internet advertising today – and to disfavor ISPs who wish to become advertising networks in their own right.

When people complain about the inconsistency between the FCC’s approach to opt in versus opt out and the FTC’s approach to the same question, this is what they’re talking about.

Inconsistent Nudging

The FTC allows the firms it regulates to collect non-sensitive data from consumers unless they opt out. But it requires affirmative consent – the opt in – before sensitive data can be collected and sold.

The FCC simply applies opt out to all data that ISPs might collect, regardless of its sensitivity. Hence, the FCC applies its choice framework according to the status of the firm, while the FTC applies it according to the status – the sensitivity – of the data.

Nudging to Entrench the Status Quo

At the Cal Innovates Life, Liberty, and the Pursuit of Privacy discussion in the Senate Russell building Tuesday, Google booster Harold Feld plaintively asked (at 52:20):

What the hell is so hard about asking? We managed to do that just fine when it was telephone services and all the innovation that people wanted to do, like providing security systems or whatever. And Congress said “yeah, you know, if you can get the customer to say ‘sure’ because you’ve made the case to the customer about this, then great, go to town. Innovate. Do the things you want to do.”

Feld’s question answers itself when we bear in mind the billions of dollars Google has spent to become the default search engine because of the power of the nudge. When consumers give up data to the incumbent ad networks by default and only give up the same data to ISPs when they go against the nudge, nothing is going to change.

And that’s what’s so hard about asking: the default determines the results. It’s marvelous to give consumers the ultimate say in the way their data is used, but the ballot shouldn’t come with one of the candidates pre-selected. That’s not the way we roll in America.