FTC calls on Congress to enact baseline privacy legislation and require more transparency from data brokers

Ed Felten has launched a new blog to explain tech to citizens and engage the technology community.

Over a century ago, Supreme Court Justice Louis Brandeis “could not have imagined phones that keep track of where we are going, search engines that predict what we’re thinking, advertisers that monitor what we’re reading, and data brokers who maintain dossiers of every who, what, where, when and how of our lives,” said Federal Trade Commission Chairman Jon Leibowitz yesterday morning in Washington, announcing the release of the final version of the commission’s framework on consumer privacy.

“But he knew that, when technology changes dramatically, consumers need privacy protections that update just as quickly. So we issue our report today to ensure that, online and off, the right to privacy, that ‘right most valued by civilized men,’ remains relevant and robust to Americans in the 21st century as it was nearly 100 years ago.”

What, exactly, privacy means in this digital age is still being defined all around us, reflected in the proliferation of small screens and cameras and the explosion of data. The FTC’s final report, “Protecting Consumer Privacy in an Era of Rapid Change: Recommendations For Businesses and Policymakers,” strongly recommends that Congress draft and pass a baseline consumer privacy law that provides rules of the road for the various entities responsible for protecting sensitive data.

The final report clearly enumerates the same three basic principles that the draft of the FTC’s privacy framework outlined for companies:

  1. Privacy by design, where privacy is “built in” at every stage that an application, service or product is developed
  2. Simplified choice, wherein consumers are empowered to make informed decisions by clear information about how their data will be used at a relevant “time and context,” including a “Do Not Track” mechanism, and businesses are freed of the burden of providing unnecessary choices
  3. Greater transparency, where the collection and use of consumer data is made clearer to the people who own it.

“We are demanding more and better protections for consumer privacy not because industry is ignoring the issue,” said Leibowitz. “In fact, the best companies already follow the privacy principles we lay out in the report. In the last year, online advertisers, major browser companies, and the W3C — an Internet standards-setting group — have all made strides toward putting into place the foundation of a Do Not Track system, and we commit to continue working with them until all consumers can easily and effectively choose not to be tracked. I’m optimistic that we’ll get the job done by the end of the year.”

According to the FTC, the nation’s top consumer watchdog received over 450 comments on the draft online privacy report it released in December 2010. In response to “technological advances” and those comments, the FTC revised the privacy framework in several areas. (For a broad overview of the final FTC privacy framework, read Dan Rowinski’s piece at ReadWriteWeb and the Information Law Group’s summary of the commission report on consumer privacy.)

First, the framework will not apply to “companies that collect and do not transfer only non-sensitive data from fewer than 5,000 consumers a year,” sparing small businesses that burden. Second, the FTC has brought enforcement actions against Google and Facebook since the draft report was issued. Those actions — and the agreements reached — provide a model and guidance for other companies.

Third, the FTC made specific recommendations to companies that offer mobile services, including improved privacy protections and disclosures that are short, clear and effective on small screens. Fourth, the report also outlined “heightened privacy concerns” about large platform providers, such as ISPs, “operating systems, browsers and social media companies,” seeking to “comprehensively track consumers’ online activities.” When asked about “social plug-ins” from such a platform, Chairman Leibowitz offered Facebook’s “Like” button as an example. (Google’s +1 button is presumably another such mechanism.)

Finally, the report also included a specific recommendation with respect to “data brokers,” whom Chairman Leibowitz described as “cyberazzi” on Monday, echoing his remarks at the National Press Club in November 2011. Over at Forbes, Kashmir Hill reports that the FTC officially defined data brokers as those who “collect and traffic in the data we leave behind when we travel through virtual and brick-and-mortar spaces.”

During the press conference, Chairman Leibowitz said that American citizens should be able to see what information data brokers hold about them and “have the right to correct inaccurate data,” much as they do with credit reports. Specifically, the FTC has called on data brokers to “make their operations more transparent by creating a centralized website to identify themselves, and to disclose how they collect and use consumer data. In addition, the website should detail the choices that data brokers provide consumers about their own information.”

While the majority of the tech media’s stories about the FTC report focused on “Do Not Track” prospects and mechanisms, or the privacy framework’s impact on mobile, apps and social media, the reality of this historic moment is that it’s the world’s data brokers that currently hold immense amounts of information about just about everyone “on the grid,” even if they never “Like” something on Facebook, turn on a smartphone, or buy and use an app.

In other words, even though the FTC’s recommendations for privacy by design led TechMeme yesterday, that wasn’t new news. CNET’s Declan McCullagh, one of the closest observers of Washington tech policy in the media, picked up on the focus, writing that the FTC stops short of calling for a new DNT law but “asks Congress to enact a new law that ‘would provide consumers with access to information about them held by a data broker’ such as Lexis Nexis, US Search, or Reed Elsevier subsidiary Choicepoint — many of which have been the subject of FTC enforcement actions in the last few years.” As McCullagh reported, the American Civil Liberties Union “applauded” the FTC’s focus on data brokers.

They should. As Ryan Singel pointed out at Wired, the FTC’s report does “call for federal legislation that would force transparency on giant data collection companies like Choicepoint and Lexis Nexis. Few Americans know about those companies’ databases but they are used by law enforcement, employers and landlords.”

Would we, as Hill wondered, be less freaked out if we could see what data brokers have on us? A good question, and one that may well be answered should the industry coalesce around providing consumers access to their personal data, just as utilities are beginning to do with energy data.

Another year without privacy legislation?

Whether it’s “baseline privacy protections” or more transparency for data brokers, the FTC is looking to Congress to act. Whether it will or not is another matter. The online privacy debate was just about as hot in Washington nearly two years ago as it is today, yet no significant laws were passed. The probability of significant consumer privacy legislation advancing in this session of Congress appears just as low. While at least four major privacy bills have been introduced in the U.S. House and Senate, “none of that legislation is likely to make it into law in this Congressional session, however, given the heavy schedule of pending matters and re-election campaigns,” wrote Tanzina Vega and Edward Wyatt in the New York Times.

The push the FTC gave yesterday was welcomed in some quarters. “We look forward to working with the FTC toward legislation and further developing the issues presented in the report,” said Leslie Harris, president of the Center for Democracy and Technology (CDT), in a prepared release. CDT also endorsed the FTC’s guidance on “Do Not Track” and its focus on large platform providers. Earlier this winter, a coalition of Internet giants, including Google, Yahoo, Microsoft, and AOL, committed to adopt “Do Not Track technology” in most Web browsers by the end of 2012. These companies, which deliver almost 90 percent of online behavioral advertisements, have agreed not to track consumers who opt out of online tracking using the Do Not Track mechanism, which will likely manifest as a button or browser plug-in. All companies that have made this commitment will be subject to FTC enforcement.

By way of contrast, Jim Harper, the Cato Institute’s director of information policy studies, called the framework a “groundhog report on privacy,” describing it as “regulatory cheerleading of the same kind our government’s all-purpose trade regulator put out a dozen years ago.” In May of 2000, wrote Harper, “the FTC issued a report finding ‘that legislation is necessary to ensure further implementation of fair information practices online’ and recommending a framework for such legislation. Congress did not act on that, and things are humming along today without top-down regulation of information practices on the Internet.”

Overall, “industry here has a self-interest beyond avoiding legislation,” said Leibowitz during the press conference. Consumers have very serious concerns about privacy, he went on, alluding to polling data, surveys and conversations, and “better, clearer privacy policies” will lead to people having “more trust in doing business online.”

This FTC privacy framework and the White House’s consumer privacy bill of rights will, at minimum, inform the debates going forward. What happens next will depend upon Congress finding a way to protect privacy and industry innovation. It will be a difficult balance to strike, particularly given concerns about protecting children online and the continued march of data breaches around the country.

Making technology more accessible

I interviewed Princeton professor Ed Felten, the FTC’s chief technologist and co-author of “Government Data and the Invisible Hand” (2009), after yesterday’s press conference at FTC headquarters in Washington, D.C. In December 2010, we spoke about the FTC’s “Do Not Track” proposal after the release of the draft report.

Last Friday morning, Felten launched “Tech at the FTC,” a new blog that he hopes will play a number of different roles in the discussion of technology, government and society.

“It will combine Freedom to Tinker posts,” he said, “some of which were op-ed, some more like teaching. The latter is what I’m looking for: explanations of sophisticated technical information that cross over to a non-technical audience.”

Felten wants to start a conversation that’s “interesting to [the] general public” and “draws them into the discussion” about the intersection of regulation and technology. One aspect of that will be a connected Twitter account, @TechFTC, along with his established social identity, @EdFelten.

Possible future topics include security issues around passwords and the authentication of people in digital environments, both of which Felten finds interesting as they relate to policy. He said that he expects to write about technology stories that are in the news, with the intent of helping citizens understand, at an accessible level, what the takeaway is for them.

Social media and the Internet are “useful to give people a window into the way people in government are thinking about these issues,” said Felten. “They let people see that people in government are thinking about technology in a sophisticated way. It’s easy to fall into the trap of assuming that people in government don’t know about technology. That’s part of the goal: speak to the technical community in their language.”

“Part of my job is to be an ambassador to the technology community, through speaking to and with the public,” said Felten. “The blog will help people know how to talk to the FTC and who to talk to, if they want to. People think that we don’t want to talk to them. Just emailing us, just calling us, is usually the best way to get a conversation started. You usually don’t need a formal process to do this — and those conversations are really valuable.”

In that context, he plans to write more posts like the one that went live Monday morning on the tech highlights of the FTC privacy report, in which he flagged four sections of the framework that the computer science professor thought would be of interest to techies:

  1. De-identified data (pp. 18-22): Data that is truly de-identified (or anonymous) can’t be used to infer anything about an individual person or device, so it doesn’t raise privacy concerns. Of course, it’s not enough just to say that data is anonymous, or that it falls outside some narrow notion of PII. Beyond that, figuring out whether your dataset is really de-identified can be challenging. If you’re going to claim that data is de-identified, you need to have a good reason (the report calls it a “reasonable level of justified confidence”) for claiming that the data does not allow inferences about individuals. What “reasonable” means, and how confident you have to be, depends on how much data there is and what the consequences of a breach would be. But here’s a good rule of thumb: if you plan to use a dataset to personalize or target content to individual consumers, it’s probably not de-identified. (A rough illustration of why stripping names isn’t enough appears in the sketch after this list.)
  2. Sensitive data (pp. 47-48): Certain types of information, such as health and financial information, information about children, and individual geolocation, are sensitive and ought to be treated with special care, for example by getting explicit consent from users before collecting it. If your service is targeted toward sensitive data, perhaps because of its subject matter or target audience, then you should take extra care to provide transparency and choice and to limit collection and use of information. If you run a general-purpose site that incidentally collects a little bit of sensitive information, your responsibilities will be more limited.
  3. Mobile disclosures (pp. 33-34): The FTC is concerned that too few mobile apps disclose their privacy practices. Companies often say that users accept their data practices in exchange for getting a service. But how can users accept your practices if you don’t say what they are? A better disclosure would tell users not only what data you’re collecting, but also how you are going to use it and with whom you’ll share it. The challenging part is how to make all of this clear to users without subjecting them to a long privacy policy that they probably won’t have time to read. FTC staff will be holding a workshop to discuss these issues.
  4. Do Not Track (pp. 52-55): DNT gives users a choice about whether to be tracked by third parties as they move across the web. In this section of the report, the FTC reiterates its five criteria for a successful DNT system, reviews the status of major efforts, including the ad industry’s self-regulatory program and the W3C’s work toward a standard for DNT, and talks about what steps remain to get to a system that is practical for consumers and companies alike. (A server-side sketch of honoring the signal also follows this list.)
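
To make the de-identification point concrete, consider how many people share each combination of “quasi-identifiers” such as ZIP code, birth year and gender: if a combination is unique, a record can point to one person even with names removed. The sketch below uses k-anonymity as a rough, well-known proxy for re-identification risk. It is only an illustration, not the FTC’s test, and the records, field names and threshold are invented for the example.

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k=5):
    """Return True if every combination of quasi-identifier values
    appears at least k times in the dataset.

    A rough, illustrative heuristic (not the FTC's standard) for why
    "we stripped the names" is not the same as de-identification.
    """
    combos = Counter(
        tuple(record[field] for field in quasi_identifiers)
        for record in records
    )
    return all(count >= k for count in combos.values())

# Hypothetical records: ZIP code, birth year and gender are classic
# quasi-identifiers that can single people out without any names.
records = [
    {"zip": "20580", "birth_year": 1975, "gender": "F", "purchase": "book"},
    {"zip": "20580", "birth_year": 1975, "gender": "F", "purchase": "phone"},
    {"zip": "20008", "birth_year": 1990, "gender": "M", "purchase": "shoes"},
]

print(is_k_anonymous(records, ["zip", "birth_year", "gender"], k=2))
# False: the third record is unique on its quasi-identifiers, so this
# "anonymous" dataset could still be linked back to one person.
```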
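
On Do Not Track, the mechanics under discussion are simple at the protocol level: browsers that support the proposed signal send an HTTP header, “DNT: 1”, when the user has opted out, and a site decides what to do with it. The report does not prescribe any particular implementation, and honoring the preference in practice involves far more than skipping a single beacon, but here is a minimal, hypothetical sketch of a server checking the header; the Flask app and the record_visit_for_ads helper are stand-ins invented for the example.

```python
from flask import Flask, request

app = Flask(__name__)

def tracking_allowed():
    """True unless the visitor's browser sent the Do Not Track header (DNT: 1)."""
    return request.headers.get("DNT") != "1"

def record_visit_for_ads(req):
    # Hypothetical stand-in for a third-party analytics or ad-targeting beacon.
    print("recording ad-targeting data for", req.remote_addr)

@app.route("/article")
def article():
    # Only run tracking code for visitors who have not opted out.
    if tracking_allowed():
        record_visit_for_ads(request)
    return "article body"

if __name__ == "__main__":
    # Try it: curl -H "DNT: 1" http://127.0.0.1:5000/article
    app.run()
```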

When asked what the developers and founders of startups should be thinking about with respect to the FTC’s privacy framework, Felten emphasized those three basic principles — privacy by design, simplified choice, greater transparency — and then offered some common sense:

“Start with the basic question of ‘what Section 5 means for you,’” he suggested. “If you make a promise to consumers in your privacy policy, consumers are entitled to rely on that. The FTC has brought cases against companies that made such promises and didn’t hold up their responsibility around privacy. You have a responsibility to protect consumer data. If not, you may find yourself on the wrong side of the FTC Act if there’s a breach and it harms consumers.”
