The convergence of Google, government and privacy

A look at the issues and toolsets driving the online privacy discussion.

Google recently added a new Privacy Tools page. If you follow tech policy in Washington, you couldn’t miss hearing about it, given that advertising for Google’s privacy tools appeared on relevant blogs, email newsletters and periodicals. And if you work, play, shop or communicate online, the issue of online privacy is more relevant to you than perhaps it has ever been before. Companies and governments are gathering unprecedented amounts of data about every click, link and status update you make. The choices they are making now around the use of personal information, identity, authentication and relationships will be fundamental to how we see one another and ourselves as the information age evolves.

This historic moment is why the Gov 2.0 Summit featured a deep dive into online privacy this year. The evolution of online privacy continues to spur debate on and offline. Below, Tim O’Reilly shared his thinking on the complex subject at the Summit:

Why the big push in awareness around the new privacy tools? For once, it’s simple: privacy is a hot issue in Washington, and Google has a major interest in contributing to and informing that conversation.

As one of the most powerful tech companies in the world, Google will inevitably be affected by any new electronic privacy laws from Congress or regulations from the Federal Trade Commission. From potential updates to the law, to the complexities introduced by cloud computing, to the unprecedented collection of user data from search, webmail or advertising (Google’s bread and butter), getting online privacy right will matter.

That’s doubly true because of recent missteps, like the widespread backlash over Google Buzz. With the anticipated launch of Google Me later this year, which is reported to add a social layer to Google’s products, online privacy will become even more relevant. Google is up against the hypergrowth of Facebook, which, with some 500 million users, has grown into one of the world’s top online destinations.

Every time users “like” a page on Facebook instead of linking to a website on the web, that action provides Facebook with information about relevance and connections that isn’t available to Google, at least for now. When Google returns a result, its search algorithms are in part parsing linking behavior. Should Facebook continue to expand its own search, it’s safe to assume that its results will be in part informed by liking behavior. The contrast between hyperlinks and “hyperlikes” is relevant because the future of the Internet is likely to be built on understanding those interactions. In both cases, users (and regulators) have a vested interest in understanding how, where and why their data is being used.

Is it possible to share and protect sensitive information? That will continue to be an open question — and a contentious question — for years to come. For an informed discussion on that topic, watch the conversation embedded below with John Clippinger, of Harvard’s Berkman Center, Loretta Garrison from the FTC and Ellen Blacker from AT&T.

New Google privacy tools

Last week, I attended a press briefing where Jonathan McPhie, Google’s product manager for web history, SSL search and personalized search offerings, walked through Google’s privacy settings.

“It’s on us to educate members of Congress,” said Google spokesman Adam Kovacevich. “Google is an engineering company. We would like to address challenging issues — copyright or free expression for instance — with great tools. The Content ID tool on YouTube is a great example of that. Our first instinct is to try to address them through tools like this.”

One of Google’s most public responses to online privacy concerns came last year, during the FTC privacy roundtables, when the company launched a dashboard to show users information associated with Google accounts. Yahoo also launched a dashboard to provide similar insight into data collection. Both dashboards were designed to provide users with more insight into what data was being collected around which interests. The introduction of these tools was particularly relevant to behavioral advertising, which many publishers are looking to as an important way of offering targeted, high-value marketing.

According to McPhie, there are now more than 100,000 unique visitors every day to Google’s dashboard. Users can view, modify or even export data there, through the company’s “Data Liberation Front.”

McPhie framed the privacy discussion in the context of principles. Reflecting Google’s famous mantra, “Don’t Be Evil,” these online privacy principles should give users “meaningful choices” to protect their personal information.

Google itself has meaningful choices to make: from how it cooperates (or doesn’t) with law enforcement and government agencies that want access to its data, to “firewalling” private information across its multiple services, to monitoring internal controls on employee access.

“We want to help users know what’s associated with their account, so each user is aware from the beginning what’s correlated across the account,” said Kovacevich, who noted that Google publishes the number of requests it gets for user data online. Privacy researcher Chris Soghoian has considered the pros and cons of this tool, with praise for Google. Google is also working with a coalition of technology companies and think tanks on Digital Due Process, an effort advocating for an update of the Electronic Communications Privacy Act to incorporate privacy principles relevant to cloud computing.

Google has also made an effort to make privacy tools more visible, said McPhie, notably on the sparse home page and on every search page. By the time users reach the bottom of the new privacy tools page, he said, they will be more “empowered.” He touted two areas where Google introduced privacy features earlier than competitors: encrypted webmail in January 2010 and encrypted search this spring. “We launched [encrypted.google.com] in May,” said McPhie. “It encrypts communication between Google and the searcher. The concept is simple but implementation is complex, in maintaining performance. The challenge is around latency.”

“Another feature I don’t think people are aware of is the ability to pause,” said McPhie, referring to the ability of users to stop recording their Web History, then resume. Users can also remove items from that Web History if they wish.

Web browsing and privacy

Google’s browser, Chrome, includes an “Incognito Mode” that reduces the information collected during a user’s browsing session. While Incognito Mode won’t help anyone protect information from a colleague or friend if a browser window is left open, it does mean that the browser’s history will not record the session, URLs will not be stored and any cookies on the computer are session-level only. Any downloads made, however, will stick around.
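
To make “session-level only” concrete, here is a minimal TypeScript sketch of the difference between a session cookie and a persistent one, with hypothetical cookie names and values. It illustrates general HTTP mechanics rather than Chrome’s implementation: a cookie set without an Expires or Max-Age attribute is discarded when the session ends, which is effectively what Incognito Mode enforces across the board.

    // A session cookie carries no expiry, so the browser keeps it in memory
    // and discards it when the session (for example, an Incognito window) ends.
    function sessionCookie(name: string, value: string): string {
      return `${name}=${encodeURIComponent(value)}; Path=/; Secure; HttpOnly`;
    }

    // A persistent cookie carries Max-Age (or Expires) and survives restarts
    // in a normal browsing session.
    function persistentCookie(name: string, value: string, maxAgeSeconds: number): string {
      return `${name}=${encodeURIComponent(value)}; Max-Age=${maxAgeSeconds}; Path=/; Secure; HttpOnly`;
    }

    // Hypothetical example: a preference cookie that would normally last 30 days.
    console.log(sessionCookie("prefs", "layout=compact"));
    console.log(persistentCookie("prefs", "layout=compact", 60 * 60 * 24 * 30));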

McPhie also noted that an opt-out option for Google Analytics has been available since April. The tradeoff between integrating analytics and disabling it is one of making a website more useful for its users versus protecting individual user privacy. That frames the debate going on within the government webmaster community, where the recent revamp of federal cookie policy by the Office of Management and Budget officially allowed the use of cookies and analytics.
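
The shape of that tradeoff is easy to see in code. The sketch below is a hypothetical site-side opt-out check in TypeScript, not the mechanism behind Google’s browser-based opt-out; the storage key and script URL are invented for illustration. A site that honors the preference simply gives up analytics data for those visitors.

    // Hypothetical opt-out flag; a real site might use a cookie or a consent banner.
    const OPT_OUT_KEY = "analytics-opt-out";

    function userHasOptedOut(): boolean {
      return window.localStorage.getItem(OPT_OUT_KEY) === "true";
    }

    // Load the analytics script only for users who have not opted out.
    function loadAnalytics(scriptUrl: string): void {
      if (userHasOptedOut()) {
        return; // privacy over data: no measurement for opted-out visitors
      }
      const script = document.createElement("script");
      script.async = true;
      script.src = scriptUrl;
      document.head.appendChild(script);
    }

    loadAnalytics("https://example.com/analytics.js"); // hypothetical URL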

Opting out of analytics doesn’t prevent a website from knowing certain things, like an HTTPS referrer, but “we chose privacy over data,” said McPhie. “That was significant for us.”

Future Google services and online privacy

Geolocation services have attracted notice of late, due to concerns over granular location information. As with other areas, a user chooses how much to share with friends, said McPhie. People have to sign up to see the locations of users in Google’s Latitude program.

“You can detect location or set location manually,” said McPhie, “and even hide location.” He highlighted the monthly emails that remind users that the Latitude service is on. McPhie also noted that feedback from a shelter for battered women, focused on a scenario where Latitude was installed without a user’s knowledge, resulted in a feature addition. “Where possible, within 24 hours if you haven’t used it, a Latitude dialog pops up,” he said.

Online video will present other issues. While users can make an uploaded video “private,” if people have the right link, the video can be viewed. If YouTube does move forward with allowing livestreaming, expect these privacy controls to receive additional scrutiny.

Google’s moves into mobile, television, e-commerce, virtual worlds and even augmented reality will also create new privacy concerns. With each service, the question of where and how privacy is baked in will be important to analyze. On one count, at least, Google seems to have learned an important lesson: more transparency into how data is being used and what tools are available to control it can satisfy the privacy concerns of the majority of users. According to McPhie, only 1 in 7 people who used the online advertising preferences chose to opt out entirely. Providing users with control over their own private data can’t be discounted, either.

That was just one element that Jules Polonetsky, the former privacy officer at DoubleClick and AOL, focused on in his talk at the Gov 2.0 Summit and in our subsequent interview. Both are embedded below.

Questions for Google from the online audience

Before I headed into Google’s Washington offices, I solicited questions from readers. Answers, which I’ve edited for readability, follow.

How long before Google Street View covers my country, especially the cities of Makkah & Madina? — Aamer Trambu (@TVtrambu)

While Google’s staff wouldn’t comment on any plan to roll out Street View in the Middle East, they did emphasize the ability of users to opt out using privacy tools. “Facial recognition, if we were to introduce it,” said McPhie, “would also have controls.”

[I have] concerns over the control of “predicted search” data on Google Instant. How is it stored, associated, protected? — Andrew N (@tsbandito)

“Google Instant works just like normal web searches,” said McPhie. “If you click on a result, press enter or take some other action like clicking on an ad, just like before, it’s recorded in your Web History.” He did highlight a way that Instant is a bit different: when you get a result and you don’t have to click on anything, Google records it as a search if you pause on a term for three seconds.
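
A rough TypeScript sketch of that pause-based behavior follows. It is an illustration of the idea McPhie describes, not Google’s code: recordSearch is a hypothetical stand-in for whatever writes an entry to Web History, and the three-second threshold comes straight from his description.

    const PAUSE_MS = 3000; // three-second pause, per McPhie's description

    let pauseTimer: ReturnType<typeof setTimeout> | undefined;

    // Hypothetical stand-in for logging a query to Web History.
    function recordSearch(query: string): void {
      console.log(`recorded search: ${query}`);
    }

    // Call this on every keystroke: each change resets the timer, so only a
    // query the user rests on for three seconds gets recorded. Clicking a
    // result or pressing enter would record it immediately, as with normal search.
    function onQueryChanged(query: string): void {
      if (pauseTimer !== undefined) {
        clearTimeout(pauseTimer);
      }
      if (query.trim().length === 0) {
        return;
      }
      pauseTimer = setTimeout(() => recordSearch(query), PAUSE_MS);
    }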

What is the ETA on Google turning on encryption for search by default? Do the filtering concerns of schools take priority? — Chris Soghoian (@csoghoian)

For now, you can make encrypted.google.com your home page, suggested McPhie. “For those unfamiliar with the issue, schools have an obligation for funding to provide filtering of pornographic images. The difficulty is that because schools didn’t know what people are searching for, they blocked google.com.”

McPhie focused on issues of search speed as a key consideration in default encryption. “The difficulty of offering encryption by default is that the central challenge is performance,” said McPhie. “There are some features where this is more difficult than text search. Encrypted search doesn’t support images or maps. Before we made this the default, we would need that to be supported as well. As soon as we have that feature parity, we will look into that as well.”

What extent will they be using social data in conglomeration with Web History? — Eric Andersen (@eric_andersen)

“We have a social search function, and that exists as a separate function from Web History,” said McPhie. “There’s a page called social circle and you can go through and see what information is there and edit it. You can say ‘I don’t want this guy promoted in social search.’ I can’t comment on rumors [regarding Google Me].”

How far will Google go to protect user privacy? — Ilesh (@ileshs)

“We abide by the laws in the countries in which we operate,” said Kovacevich. “That doesn’t mean at the very first request for user data that we give it away. From a broad perspective in promoting freedom of speech globally, we are interested in the issue. We’re doing a big conference in Budapest with Central European University.”

I recently heard about mail graphing. What about the data privacy concerns with that? — Meg Biallas (@MegBiallas)

This third-party add-on “is a great example of where we think data belongs to users, and they can use it in creative ways,” said McPhie. You can learn more about mail graphing in a recent edition of Strata Week.

How many of the U.S. government requests for information were made for information on people from outside of the United States? [This was in regard to data requests, not removal requests.] — Annie Crombie (@annieatthelake)

“Honestly, I don’t know,” said Kovacevich. “We track them by the origin of the request.”

How are they going to use the information from what we watch on Google TV? — Tim Carmody (@tcarmody)

“We definitely have a goal to have all Google products and services included in the dashboard if it’s in your account,” said McPhie. “It’s safe to assume if there’s unique information collected via Google TV, it will be included there.”

What about Google’s own access to stored data? Any comment on that case? [This question referred to Google firing an engineer for violating privacy policies.] — Carl Brooks (@eekygeeky)

Google’s spokesman referred me to the company’s public statement on this question, which was published in TechCrunch:

“We dismissed David Barksdale for breaking Google’s strict internal privacy policies. We carefully control the number of employees who have access to our systems, and we regularly upgrade our security controls — for example, we are significantly increasing the amount of time we spend auditing our logs to ensure those controls are effective. That said, a limited number of people will always need to access these systems if we are to operate them properly — which is why we take any breach so seriously.” — Bill Coughran, Senior Vice President, Engineering, Google
