“The Mete and Measure of Privacy,” Cynthia Dwork, Microsoft Research
This was probably the presentation I was least able to follow well, so I’ll do my best to recap in English-major language; please feel free to suggest corrections in the comments. Dwork talked about the importance of being able to draw conclusions about a whole population from a representative data set while maintaining the confidentiality of the individuals in the data set. “Differential privacy” means the outcome of data analysis is nearly equally likely whether or not any given individual joins the data set; how close to “equally likely” can be measured by a parameter called epsilon, with a smaller value being better (i.e. less privacy loss). An epsilon registry could then be created to help us better understand cumulative privacy loss.
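To make the epsilon idea a bit more concrete, here is a small sketch (my own illustration, not from the talk) of the classic Laplace mechanism, the textbook way to achieve differential privacy for a counting query. The noise added to the true answer is scaled to 1/epsilon, so a smaller epsilon means more noise and less privacy loss:

```python
import math
import random

def laplace_noise(scale):
    # Draw one sample from a Laplace(0, scale) distribution
    # via inverse-CDF sampling.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(true_count, epsilon, sensitivity=1.0):
    # Adding or removing one person changes a count by at most
    # `sensitivity` (here 1), so Laplace noise with scale
    # sensitivity/epsilon yields epsilon-differential privacy:
    # the published answer is nearly equally likely whether or
    # not any one individual is in the data set.
    return true_count + laplace_noise(sensitivity / epsilon)
```

For example, publishing `private_count(100, epsilon=0.1)` hides any one person well but may be off by tens, while `epsilon=10` is accurate but leaks more; a registry of the epsilons you’ve “spent” is what lets you track cumulative loss.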
Dwork also talked about targeted advertising. Companies that say “Your privacy is very important to us” have “no idea what they’re talking about” – they don’t know how to (and may have little interest in) keep your data private. And when you hear “Don’t worry, we just want to advertise to you,” remember – your advertiser is not your friend. Advertisers want to create demand where none exists for their own benefit, not for yours. If an advertiser can pinpoint your mood, they may want to manipulate it (keeping you sad, or cheering you up when you are sad for a good reason). During this presentation, someone posted a link on Twitter to this article from The Atlantic, “The Internet’s Original Sin,” which is well worth a read.
Dwork quoted Latanya Sweeney, who asked, “Computer science got us into this mess. Can computer science get us out of it?” Dwork’s differential privacy is one attempt to simplify and solve the problem of privacy loss. Slides from a different but similar presentation are available through SlideServe.
“Protecting Privacy in an Uncertain World,” Betsy Masiello, Senior Manager, Global Public Policy, Google
Masiello’s talk focused on what Google is doing to protect users’ privacy. “It’s hard to imagine that Larry and Sergey had any idea what they were building,” she began. Today, “Everything is mobile…everything is signed in.” Services like Google Now send you a flow of relevant information, from calendar reminders to traffic to weather. In addition to Google, “the average user has 100 accounts online.” It’s impossible to remember that many passwords, especially if they’re good passwords; and even if they’re good passwords, your accounts still aren’t really safe (see Mat Honan’s 2012 article for Wired, “Kill the Password: Why a String of Characters Can’t Protect Us Anymore”).
To increase security, Google offers two-factor authentication. (You can find out what other sites offer 2FA by checking out twofactorauth.org. Dropbox, Skype, many – but not all – banks, Facebook, LinkedIn, Tumblr, and Twitter all support 2FA.) Masiello said that after news of hacks, they see more people sign up for 2FA. “It’s an awareness problem,” she said. In addition to 2FA, Google is encrypting its services, including Gmail (note that the URLs start with https). “E-mail is still the most common way people send private information,” she said, and as such deserves protection.
“Encryption is the 21st century way of protecting our personal information,” said Masiello. Governments have protested companies who have started using encryption, but “governments have all the tools they need to obtain information legally.” As Cory Doctorow has pointed out many times before, it’s impossible to build a back door that only the “good guys” can walk through. Masiello said, “Governments getting information to protect us doesn’t require mass surveillance or undermining security designed to keep us safe.” The PRISM revelations “sparked a very important debate about privacy and security online.” Masiello believes that we can protect civil liberties and national security, without back doors or mass surveillance.
“Getting security right takes expertise and commitment,” Masiello said. She mentioned the paper “Breaking the Web” by Anupam Chander and Uyen P. Le, and said that we already have a good set of guidelines: the OECD Privacy Principles, which include collection limitation, data quality, purpose specification, use limitation, security safeguards, openness, individual participation, and accountability. As for Google, Masiello said, “We don’t sell user data; we don’t share with third parties.” All of the advertising revenue is based on user searches, and it’s possible to opt out of interest-based ads. (Those creepy right-sidebar ads that used to show up in Gmail, having mined your e-mail to produce relevant ads, appear to be gone. And good riddance.)
Finally, Masiello talked about developing and setting industry standards for privacy and security that would facilitate innovation and competition. But privacy isn’t even the main concern in the future: it’s identity – what it means, and how we construct it.
“Sur-veillance, Sous-veillance and Co-veillance,” Lee Rainie, Director of Internet, Science and Technology Research, Pew Research Center
Lee Rainie definitely walked away with the “Most Neutral, Fact-Based Presentation” Award. Rainie has spoken at library conferences in New England before, but I – perhaps unwisely – chose to go to other sessions, so this was the first time I saw him speak, and he was great. Furthermore, all of his slides are available on SlideShare. He started off with a few findings:
1. Privacy is not binary / context matters
2. Personal control / agency matters
3. Trade-offs are part of the bargain
4. The young are more focused on network privacy than their elders (this is only surprising if you haven’t read danah boyd’s excellent It’s Complicated: The Social Lives of Networked Teens, and in fact Rainie quoted boyd a few slides later: “The new reality is that people are ‘public by default and private by effort.'”)
5. Many know that they don’t know what’s going on
6. People are growing hopeless and their trust is failing
The Pew Research Center has found that consumers feel they have lost control over how companies collect and use their data; they have adopted a transactional frame of mind (e.g. giving up control of personal data in exchange for the use of a platform or service). In general, trust in most institutions has gone down, with the exceptions of the military, firefighters, and librarians(!). But there is a pervasive sense of vulnerability, and users want anonymity – mostly from advertisers, hackers, and social connections, rather than the government (see slide below).
This slide supports the argument for privacy, especially against the “nothing to hide” argument: people desire – and deserve – privacy for many reasons, the least of which is to avoid the government or law enforcement. (Mostly, someone on Twitter pointed out, we want to avoid “that guy.”)
As for the future of privacy, people are feeling “hopeless.” Rainie remembered saying, in the early 2000s, “There’s going to be an Exxon-Valdez of data spills…” and there have been many since then, but little has been done to protect consumer privacy. “How do we convince people to have hope?” he asked.
Panel: “What Privacy Does Society Demand Now and How Much is New?” Danny Weitzner (moderator), Kobbi Nissim, Nick Sinai, Latanya Sweeney
Fortunately, the moderator and panelists have different initials. The questions and responses below are paraphrased from the notes I took during the panel session.
DW: What sort of privacy does society demand now? Is privacy different now?
NS: Access to your own data has always been a part of privacy; also the right to correct, erase, and transfer. Your data should be usable and portable.
KN: The ability to collect a lot of data all the time is new. There is a different balance of power (companies have too much).
LS: Privacy and security are just the beginning. Every American value is being changed by technology. Computer scientists aren’t trained to think of social science effects and the power of technology design.
DW: Cryptography and math are a foundation we can trust if implemented properly, as Snowden said this morning.
LS: I dislike choosing between two things. We need a cross-disciplinary approach, a blended approach.
NS: Any great company should constantly be trying to improve user experience. How does privacy/security get integrated into design?
KN: Aim for mathematical solutions/foundations. We need to re-architect economic incentives, regulations, how all the components work together.
DW: Where will the leadership and initiative come from? Government?
KN: Academia, research. We need to find ways to incentivize.
LS: Economic [incentives] or regulations are necessary for privacy by design. They’re all collapsing…every single one of them [Facebook, the IRS] is heading for a major disaster.
DW: People care about control of their data, yet the information environment is increasingly complicated.
LS: Society benefits from technology with certain protections.
KN: Regulations we have today were designed in a completely different era. We may be in compliance, and still we have damaged privacy severely.
DW: Privacy practices and the discussion around them are evolving in the U.S.
LS: A huge dose of transparency would go a long way. This is the new 1776. It’s a whole new world. Technology is redefining society. The Federal Trade Commission could become the Federal Technology Commission.
DW: Are you optimistic? Are we heading toward a positive sense of privacy?
NS: Yes, by nature I’m optimistic, but complexity and user experience (user accounts, passwords) frustrate me. Entrepreneurs do help change the world.
KN: The genie is out of the bottle. This forces us to rethink privacy. Nineteen-fifties privacy has changed and isn’t the privacy we have today, but that doesn’t mean that privacy is dead. Privacy is a sword and a shield.
DW: We’re at the beginning of a long cycle. It’s only been a year [and a half] since Snowden. What do we expect from our government and our companies? How powerful should government and private organizations be? Marketing/advertising issues are trivial compared to bigger issues.
LS: The cost of collecting data is almost zero, so organizations (public and private) collect it and then figure out how to use it later. They should be more selective about collection. If we can expose the harm, it will lead to change.
Question/comment from audience: A lot of people are not aware they’re giving away their privacy (when browsing the internet, etc.).
LS: We need transparency.
NS: We need regulation and consumer protection.