Privacy in a Networked World, III

This is the third and last post about Privacy in a Networked World. See the first post (Snowden and Schneier) here and the second post (John DeLong and John Wilbanks) here.

“The Mete and Measure of Privacy,” Cynthia Dwork, Microsoft Research

This was probably the presentation I was least able to follow well, so I’ll do my best to recap in English-major language; please feel free to suggest corrections in the comments. Dwork talked about the importance of being able to draw conclusions about a whole population from a representative data set while maintaining the confidentiality of the individuals in the data set. “Differential privacy” means the outcome of a data analysis is (almost) equally likely whether or not any one individual joins the data set; how far it can stray from “equally likely” is measured by epsilon, with a smaller value being better (i.e. less privacy loss). An epsilon registry could then be created to help us better understand cumulative privacy loss.
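(Dwork didn’t show code, and the following is my own toy sketch rather than anything from the talk: it’s the textbook “Laplace mechanism” for a simple counting query, one standard way to achieve epsilon-differential privacy. The function names and data are made up; the point is just to show where epsilon lives.)

```python
import random

def laplace_sample(scale):
    # The difference of two independent exponential draws is a Laplace(0, scale) draw.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def noisy_count(records, predicate, epsilon):
    """Counting queries have sensitivity 1: one person joining or leaving the
    data set changes the true count by at most 1. Adding Laplace noise with
    scale 1/epsilon therefore gives epsilon-differential privacy for this query;
    a smaller epsilon means more noise and less privacy loss."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_sample(1.0 / epsilon)

# Toy example: how many people in this made-up data set are over 40?
people = [{"age": a} for a in (23, 45, 37, 61, 52, 29)]
print(noisy_count(people, lambda p: p["age"] > 40, epsilon=0.5))
```

Each run adds fresh noise, and each additional query spends a bit more epsilon; that cumulative spending is exactly what an epsilon registry would make visible.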

Dwork also talked about targeted advertising. Companies that say “Your privacy is very important to us” have “no idea what they’re talking about” – they don’t know how to keep your data private (and may have little interest in doing so). And when you hear “Don’t worry, we just want to advertise to you,” remember – your advertiser is not your friend. Advertisers want to create demand where none exists for their own benefit, not for yours. If an advertiser can pinpoint your mood, they may want to manipulate it (keeping you sad, or cheering you up when you are sad for a good reason). During this presentation, someone posted a link on Twitter to this article from The Atlantic, “The Internet’s Original Sin,” which is well worth a read.

Dwork quoted Latanya Sweeney, who asked, “Computer science got us into this mess. Can computer science get us out of it?” Dwork’s differential privacy is one attempt to simplify and solve the problem of privacy loss. Slides from a different but similar presentation are available through SlideServe.

“Protecting Privacy in an Uncertain World,” Betsy Masiello, Senior Manager, Global Public Policy, Google

Masiello’s talk focused on what Google is doing to protect users’ privacy. “It’s hard to imagine that Larry and Sergey had any idea what they were building,” she began. Today, “Everything is mobile…everything is signed in.” Services like Google Now send you a flow of relevant information, from calendar reminders to traffic to weather. In addition to Google, “the average user has 100 accounts online.” It’s impossible to remember that many passwords, especially if they’re good passwords; and even if they’re good passwords, your accounts still aren’t really safe (see Mat Honan’s 2012 article for Wired, “Kill the Password: Why a String of Characters Can’t Protect Us Anymore”).

To increase security, Google offers two-factor authentication. (You can find out what other sites offer 2FA by checking out twofactorauth.org. Dropbox, Skype, many – but not all – banks, Facebook, LinkedIn, Tumblr, and Twitter all support 2FA.) Masiello said that after news of hacks, they see more people sign up for 2FA. “It’s an awareness problem,” she said. In addition to 2FA, Google is encrypting its services, including Gmail (note that the URLs start with https). “E-mail is still the most common way people send private information,” she said, and as such deserves protection.
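(A quick aside on how those one-time codes actually work: the sketch below uses the pyotp library to walk through the generic TOTP scheme that most authenticator apps implement. It’s an illustration of the general idea, not a description of Google’s internal systems.)

```python
import pyotp  # pip install pyotp

# When you enroll a phone for 2FA, the site and the phone agree on a shared
# secret, usually delivered once as a QR code.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

code = totp.now()         # the six-digit code your authenticator app shows right now
print(code)
print(totp.verify(code))  # the server recomputes the code from the secret and the clock
```

The secret itself is never sent again; both sides just re-derive the current code from the secret and the time, so a stolen password alone isn’t enough to log in.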

“Encryption is the 21st century way of protecting our personal information,” said Masiello. Governments have protested companies who have started using encryption, but “governments have all the tools they need to obtain information legally.” As Cory Doctorow has pointed out many times before, it’s impossible to build a back door that only the “good guys” can walk through. Masiello said, “Governments getting information to protect us doesn’t require mass surveillance or undermining security designed to keep us safe.” The PRISM revelations “sparked a very important debate about privacy and security online.” Masiello believes that we can protect civil liberties and national security, without back doors or mass surveillance.

“Getting security right takes expertise and commitment,” Masiello said. She mentioned the paper “Breaking the Web” by Anupam Chander and Uyen P. Le, and said that we already have a good set of guidelines: the OECD Privacy Principles, which include collection limitation, data quality, purpose specification, use limitation, security safeguards, openness, individual participation, and accountability. As for Google, Masiello said, “We don’t sell user data; we don’t share with third parties.” All of the advertising revenue is based on user searches, and it’s possible to opt out of interest-based ads. (Those creepy right-sidebar ads that used to show up in Gmail, having mined your e-mail to produce relevant ads, appear to be gone. And good riddance.)

Finally, Masiello talked about developing and setting industry standards for privacy and security that would facilitate innovation and competition. But privacy isn’t even the main concern in the future: it’s identity – what it means, and how we construct it.

“Sur-veillance, Sous-veillance and Co-veillance,” Lee Rainie, Director of Internet, Science and Technology Research, Pew Research Center

Lee Rainie definitely walked away with the “Most Neutral, Fact-Based Presentation” Award. Rainie has spoken at library conferences in New England before, but I – perhaps unwisely – chose to go to other sessions, so this was the first time I saw him speak, and he was great. Furthermore, all of his slides are available on SlideShare. He started off with a few findings:

1. Privacy is not binary / context matters

2. Personal control / agency matters

3. Trade-offs are part of the bargain

4. The young are more focused on network privacy than their elders (this is only surprising if you haven’t read danah boyd’s excellent It’s Complicated: The Social Lives of Networked Teens, and in fact Rainie quoted boyd a few slides later: “The new reality is that people are ‘public by default and private by effort.'”)

5. Many know that they don’t know what’s going on

6. People are growing hopeless and their trust is failing

The Pew Research Center has found that consumers feel they have lost control over how companies collect and use their data; they have adopted a transactional frame of mind (e.g. giving up control of personal data in exchange for the use of a platform or service). In general, trust in most institutions has gone down, with the exceptions of the military, firefighters, and librarians(!). But there is a pervasive sense of vulnerability, and users want anonymity – mostly from advertisers, hackers, and social connections, rather than the government (see slide below).

Lee Rainie, slide 30, “Who users try to avoid: % of adult users who say they have used the internet in ways to avoid being observed or seen by…”

This slide supports the argument for privacy, especially against the “nothing to hide” argument: people desire – and deserve – privacy for many reasons, the least of which is to avoid the government or law enforcement. (Mostly, someone on Twitter pointed out, we want to avoid “that guy.”)

As for the future of privacy, people are feeling “hopeless.” Rainie remembered saying, in the early 2000s, “There’s going to be an Exxon-Valdez of data spills…” and there have been many since then, but little has been done to protect consumer privacy. “How do we convince people to have hope?” he asked.

Panel: “What Privacy Does Society Demand Now and How Much is New?” Danny Weitzner (moderator), Kobbi Nissim, Nick Sinai, Latanya Sweeney

Fortunately, the moderator and panelists have different initials. The questions and responses below are paraphrased from the notes I took during the panel session.

DW: What sort of privacy does society demand now? Is privacy different now?

NS: Access to your own data has always been a part of privacy; also the right to correct, erase, and transfer. Your data should be usable and portable.

KN: The ability to collect a lot of data all the time is new. There is a different balance of power (companies have too much).

LS: Privacy and security are just the beginning. Every American value is being changed by technology. Computer scientists aren’t trained to think of social science effects and the power of technology design.

DW: Cryptography and math are a foundation we can trust if implemented properly, as Snowden said this morning.

LS: I dislike choosing between two things. We need a cross-disciplinary approach, a blended approach.

NS: Any great company should constantly be trying to improve user experience. How does privacy/security get integrated into design?

KN: Aim for mathematical solutions/foundations. We need to re-architect economic incentives, regulations, how all the components work together.

DW: Where will the leadership and initiative come from? Government?

KN: Academia, research. We need to find ways to incentivize.

LS: Economic [incentives] or regulations are necessary for privacy by design. They’re all collapsing…every single one of them [Facebook, the IRS] is heading for a major disaster.

DW: People care about control of their data, yet the information environment is increasingly complicated.

LS: Society benefits from technology with certain protections.

KN: Regulations we have today were designed in a completely different era. We may be in compliance, and still we have damaged privacy severely.

LS mentioned HIPAA, NS mentioned the Consumer Bill of Rights, DW mentioned “Privacy on the Books and on the Ground.”

DW: Privacy practices and discussion are/is evolving in the U.S.

LS: A huge dose of transparency would go a long way. This is the new 1776. It’s a whole new world. Technology is redefining society. The Federal Trade Commission could become the Federal Technology Commission.

DW: Are you optimistic? Are we heading toward a positive sense of privacy?

NS: Yes, by nature I’m optimistic, but complexity and user experience (user accounts, passwords) frustrates me. Entrepreneurs do help change the world.

KN: The genie is out of the bottle. This forces us to rethink privacy. Nineteen-fifties privacy has changed and isn’t the privacy we have today, but that doesn’t mean that privacy is dead. Privacy is a sword and a shield.

DW: We’re at the beginning of a long cycle. It’s only been a year [and a half] since Snowden. What do we expect from our government and our companies? How powerful should government and private organizations be? Marketing/advertising issues are trivial compared to bigger issues.

LS: The cost of collecting data is almost zero, so organizations (public and private) collect it and then figure out how to use it later. They should be more selective about collection. If we can expose the harm, it will lead to change.

Question/comment from audience: A lot of people are not aware they’re giving away their privacy (when browsing the internet, etc.).

LS: We need transparency.

NS: We need regulation and consumer protection.

 

Privacy in a Networked World, II

This is the second post about Privacy in a Networked World. The first post, about the conversation between Bruce Schneier and Edward Snowden, is here.

“Privacy in a Networked World,” John DeLong, Director of the Commercial Solutions Center, NSA

Other than the length and volume of applause, it’s difficult to measure an audience’s attitude toward a speaker. I’ll venture, though, that the audience of Privacy in a Networked World was generally pro-Snowden; the attitude toward John DeLong can perhaps be characterized as guarded open-mindedness laced with a healthy dose of skepticism.

DeLong’s talk was both forceful and defensive; he wanted to set the record straight about certain things, but he also knew that public opinion (in that room, at least) probably wasn’t in his favor. (He said repeatedly that he did not want to have an “Oxford-style debate,” though his talk wasn’t set up as a debate in the first place.) “Let’s not confuse the recipe with the cooking,” he said, in a somewhat belabored analogy where the NSA’s work was the cooking and the law was the recipe. (I cook a lot at home, and I’ll just say, I can usually tell when I’m looking at a bad recipe, and opt to adapt it or not make it at all.)

DeLong quoted at length from Geoffrey R. Stone’s “What I Told the NSA.” (Stone was a member of the President’s Review Group in fall 2013, after the Snowden revelations.) Stone’s conclusions were not altogether positive; he found that while the NSA “did its job,” many of its programs were “highly problematic and much in need of reform.” But it’s the Executive Branch, Congress, and the FISA Court who authorized those programs and are responsible for reforming them. Stone added, “Of course, ‘I was only following orders’ is not always an excuse….To be clear, I am not saying that citizens should trust the NSA. They should not. Distrust is essential to effective democratic governance.”

DeLong said, “The idea that the NSA’s activities were unauthorized is wrong, wrong in a magnificent way.” He emphasized that the NSA is not a law enforcement agency, it’s an intelligence agency. He spoke in favor of people with different backgrounds and expertise – lawyers, engineers, mathematicians, privacy experts, etc. – coming together to work out solutions to problems, with respect for each other’s abilities. “Technology,” he said, “always comes back to how we as humans use it.” At present, “We do not have technologies that identify privacy risks….Privacy engineering could be one of the most important engineering feats of our time.”

DeLong talked about rebuilding the nation’s confidence in the NSA. “Confidence is the residue of promises kept,” he said. “More information does not necessarily equal more confidence.” (Someone on Twitter pointed out that much depends on the content of the information.) The talk was a good reminder not to villainize the entire NSA; part of DeLong’s forcefulness was undoubtedly on behalf of his co-workers and staff whom he felt were unfairly maligned. And technology that could identify privacy risks, built by people who have different perspectives and backgrounds, would be excellent. But do we need technology that identifies privacy risks, or do we need stronger oversight and better common sense? Mass surveillance erodes trust in government and hasn’t been terribly effective; what more do we need to know to put a stop to it?

“Privacy and Irony in Digital Health Data,” John Wilbanks, Chief Commons Officer, Sage Bionetworks

John Wilbanks gave a fast-paced, interesting talk about health data. The “irony” in the title of his talk soon became clear when he gave the example of Facebook’s mood manipulation experiment compared to a study of Parkinson’s disease. The sample size for Facebook was many times larger, with a constant flow of information from “participants,” as opposed to a much smaller sample population who filled out a survey and answered questions by phone. “What does our society value?” Wilbanks asked. This question can be answered by another question: “What do we surveil?”

Wilbanks showed a graph representing cardiovascular disease and terrorism: there is 1 death every 33 seconds from cardiovascular disease – “That’s like 9/11 every day” – and yet there’s not nearly the same kind of “surveillance” for health that there is for terrorism. Participating in a research study, Wilbanks said, is like “volunteering for surveillance,” and usually the mechanisms for tracking aren’t as comprehensive as, say, Facebook’s. Of course, privacy laws affect health research, and informed consent protects people by siloing their data; once the study is concluded, other researchers can’t use that data, and there’s no “network effect.”

Informed consent, while a good idea in theory, often leads to incomprehensible documents (much like Terms of Service). These documents are written by doctors, reviewed by lawyers, and edited by committee. Furthermore, said Wilbanks, people in health care don’t usually understand issues of power and data. So, he asked, how do we run studies at internet scale and make them recombinant? How do we scale privacy alongside the ability to do research? Wilbanks demonstrated some ideas to improve on traditional informed consent, which could also allow research subjects to get a copy of their own data and see which researchers are using data from the research in which they participated.

Obviously there are risks to individuals who share their personal health data, but there can be advantages too: more scientists having access to more data and doing more research can lead to more breakthroughs and improvements in the field of medicine.

Last year, Wilbanks talked about privacy and health data on NPR; you can listen to the segment here.

Still to come: Microsoft, Google, Pew, and a panel on “What Privacy Does Society Demand Now and How Much is New?” 

Privacy in a Networked World

This is the first post about Privacy in a Networked World, the Fourth Annual Symposium on the Future of Computation in Science and Engineering, at Harvard on Friday, January 23.

A Conversation between Bruce Schneier and Edward Snowden (video chat)

Bruce Schneier is a fellow at the Berkman Center for Internet & Society, and the author of Data and Goliath. Edward Snowden was a sysadmin at the NSA who revealed the extent of the government’s mass surveillance. The conversation was recorded (no joke) and is available on YouTube.

I have to say it was an incredibly cool feeling when Snowden popped up on the giant screen and was there in the room with us. There was sustained applause when he first appeared and also at the end of the conversation, when he was waving goodbye. Schneier started by asking Snowden about cryptography: What can and can’t be done? Snowden replied, “Encryption…is one of the few things that we can rely on.” When implemented properly, “encryption does work.” Poor cryptography, either through bad implementation or a weak algorithm, means weak security. End points are also weak spots, even if the data in transit is protected; it’s easier for an attacker to get around crypto than to break it.

Snowden pointed out a shift in the NSA’s focus over the last ten years from defense to offense. He encouraged us to ask Why? Is this proper? Appropriate? Does it benefit or serve the public?

The explosion in “passive” mass surveillance (collecting everything in case it’s needed later) is partly because it’s easy, cheap, and simple. If more data is encrypted, it becomes harder to sweep up, and hackers (including the NSA) who use more “active” techniques run a higher risk of exposure. This “hunger for risk has greatly increased” during the War on Terror era. Their targets are “crazy, unjustified….If they were truly risk averse they wouldn’t be doing this…it’s unlawful.”

Snowden said that the NSA “is completely free from any meaningful judicial oversight…in this environment, a culture of impunity develops.” Schneier said there were two kinds of oversight: tactical oversight within the organization (“did we follow the rules?”) and oversight from outside of the organization (“are these the right rules?”). He asked, “What is moral in our society?”

Snowden asked if the potential intelligence that we gain was worth the potential cost. He stated that reducing trust in the American infrastructure is a costly move; the information sector is crucial to our economy. The decrease in trust, he said, has already cost us more than the NSA’s budget. “They are not representing our interests.”

Schneier, using his NSA voice, said, “Corporations are spying on the whole internet, let’s get ourselves a copy!” (This was much re-tweeted.) “Personal information,” he said, “is the currency by which we buy our internet.” (Remember, if you can’t tell what the product is, you’re the product.) It’s “always amusing,” he said, when Google complains about the government spying on their users, because “it’s our job to spy on our users!” However, Schneier thinks that the attitudes of tech companies and standards bodies are changing.

These silos of information were too rich and interesting for governments to ignore, said Snowden, and there was no cost to scooping up the data because until 2013, “people didn’t realize how badly they were being sold up the river.” Schneier said that research into privacy-preserving technologies might increase now that there is more interest. Can we build a more privacy-preserving network, with less metadata?

“We’ve seen that the arguments for mass surveillance” haven’t really held up; there is little evidence that it has stopped many terrorist attacks. Schneier cited an article from the January 26, 2015 edition of The New Yorker, “The Whole Haystack,” in which author Mattathias Schwartz lists several recent terrorist attacks, and concludes, “In each of these cases, the authorities were not wanting for data. What they failed to do was appreciate the significance of the data they already had.”

Unlike during the Cold War, now “we all use the same stuff”: we can’t attack their networks and defend our networks, because it’s all the same thing. Schneier said, “Every time we hoard a zero-day opportunity [knowing about a security flaw], we’re leaving ourselves vulnerable to attack.”


Snowden was a tough act to follow, especially for John DeLong, Director of the Commercial Solutions Center for the NSA, but that’s exactly who spoke next. Stay tuned.

 

Introduction to Cyber Security

This fall, I enrolled in, and completed, my first MOOC (massive open online course), Introduction to Cyber Security at the Open University (UK) through their FutureLearn program. I found out about the course almost simultaneously through Cory Doctorow at BoingBoing and the Radical Reference listserv (thanks, Kevin).

Screen shot from course “trailer,” featuring Cory Doctorow

The free eight-week course started on October 15 and ended on December 5. Each week started with a short video, featuring course guide Cory Doctorow, and the rest of the week’s course materials included short articles and videos. Transcripts of the videos were made available, and other materials were available to download in PDF. Each step of each week included a discussion area, but only some of the steps included specific prompts or assignments to research and comment; facilitators from OU moderated the discussions and occasionally answered questions. Each week ended with a quiz; students had three tries to get each answer, earning successively fewer points for each try.

Week 1: [Security] Threat Landscape: Learn basic techniques for protecting your computers and your online information.
Week 2: Authentication and passwords
Week 3: Malware basics
Week 4: Networking and Communications: How does the Internet work?
Week 5: Cryptography basics
Week 6: Network security and firewalls
Week 7: “When your defenses fail”: What to do when things go wrong
Week 8: Managing and analyzing security risks

The FutureLearn website was incredibly easy to use, with a clean and intuitive design, and each week of the course was broken down into little bite-size chunks so it was easy to do a little bit at a time, or plow through a whole week in one or two sessions. I tended to do most of the work on Thursdays and Fridays, so there were plenty of comments in the discussions by the time I got there.

Anyone can still take the course, so I won’t go too in-depth here, but the following are some tips, facts, and resources I found valuable or noteworthy during the course:

  • Identify your information assets: these include school, work, and personal documents; photos; social media account information and content; e-mail; and more, basically anything you store locally on your computer or in the cloud. What is the value (high/medium/low) of this information to you? What are the relevant threats?
  • Passwords are how we identify ourselves (authentication). Passwords should be memorable, long, and unique (don’t use the same password for different sites or accounts). Password managers such as LastPass or KeePass can help, but that is placing a lot of trust in them. Password managers should: require a password, lock up if inactive, be encrypted, and use 2-factor authentication.
  • Use 2-factor authentication whenever it is available.
  • 85% of all e-mail sent in 2011 was spam.
  • Anti-virus software uses two techniques: signatures (distinctive patterns of data) and heuristics (rules based on previous knowledge about known viruses).
  • The Sophos “Threatsaurus” provides an “A-Z of computer and data security threats” in plain English.
  • The Internet is “a network of networks.” Protocols (e.g. TCP/IP) are conventions for communication between computers. All computers understand the same protocols, even in different networks.
  • Wireless networks are exposed to risks to Confidentiality, Integrity, and Availability (CIA); thus, encryption is necessary. The best option currently is Wi-Fi Protected Access 2 (WPA2).
  • The Domain Name System (DNS) translates domain names to IP addresses.
  • Any data that can be represented in binary format can be encrypted by a computer.
  • Symmetric encryption: 2 copies of 1 shared key. But how to transmit the shared key safely? Asymmetric encryption (a.k.a. public key cryptography) uses two keys and the Diffie-Hellman key exchange. (The video explaining this was very helpful; there is also a toy sketch just after this list.)
  • Pretty Good Privacy (PGP) is a collection of crypto techniques. In the course, we sent and received encrypted e-mail with Mailvelope.
  • Transport Layer Security (TLS) has replaced Secure Sockets Layer (SSL) as the standard crypto protocol to provide communication security over the Internet.
  • Firewalls block dangerous information/communications from spreading across networks. A personal firewall protects the computer it’s installed on.
  • Virtual Private Networks (VPNs) allow a secure connection across an untrusted network. VPNs use hashes, digital signatures, and message authentication codes (MACs).
  • Data loss is often due to “insider attacks”; these make up 36-37% of information security breaches.
  • Data is the representation of information (meaning).
  • The eight principles of the Data Protection Act (UK). Much of the information about legislation in Week 7 was specific to the UK, including the Computer Misuse Act (1990), the Regulation of Investigatory Powers Act (2000), and the Fraud Act (2006).
  • File permissions may be set to write (allows editing), read (allows copying), and execute (run program).
  • Use a likelihood-impact matrix to analyze risk: protect high-impact, high-likelihood data like e-mail, passwords, and online banking data.
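A note on the Diffie-Hellman item above: the course video explained it well, but a toy version in Python makes the trick visible. The numbers below are absurdly small so the arithmetic is easy to follow; real implementations use enormous primes (or elliptic curves), and this sketch is mine, not the course’s.

```python
# Toy Diffie-Hellman key exchange: both parties end up with the same shared
# key without ever transmitting it.
p, g = 23, 5                  # public values: a prime modulus and a generator

alice_secret = 6              # private values, chosen randomly in practice
bob_secret = 15

alice_public = pow(g, alice_secret, p)   # sent over the open network
bob_public = pow(g, bob_secret, p)       # sent over the open network

# Each side combines its own secret with the other's public value...
alice_shared = pow(bob_public, alice_secret, p)
bob_shared = pow(alice_public, bob_secret, p)

# ...and both arrive at the same number, which becomes the shared key.
assert alice_shared == bob_shared
print(alice_shared)  # 2
```

An eavesdropper sees p, g, and the two public values, but working backward to either secret is the hard “discrete logarithm” problem – which is what makes it safe to agree on a key over an untrusted network.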

Now that I’ve gained an increased awareness of cyber security, what’s changed? Partly due to this course and partly thanks to earlier articles, conference sessions, and workshops, here are the tools I use now:

See also this excellent list of privacy tools from the Watertown Free Library. Privacy/security is one of those topics you can’t just learn about once and be done; it’s a constant effort to keep up. But as more and more of our data becomes electronic, it’s essential that we keep tabs on threats and do our best to protect our online privacy.

Why Your Library’s Privacy Policy Matters

Today’s ALA/Booklist webinar, Why Your Library’s Privacy Policy Matters, was led by Cherie L. Givens, author of Information Privacy Fundamentals for Librarians and Information Professionals. The webinar seemed almost like a commercial for the book, because Givens only spoke generally, pointing listeners to the book for further detail. In fairness, it would be difficult to cover the topic of library privacy policies in depth in an hour, but I was still hoping for something slightly more concrete and practical. Nevertheless, here are the points she covered:

  • When drawing up a library privacy policy, make sure you are aware of relevant federal* and state legislation. State legislation (e.g. California) may be stricter than federal legislation.

*Particularly the Children’s Online Privacy Protection Act (COPPA), Family Educational Rights and Privacy Act (FERPA), Protection of Pupil Rights Amendment (PPRA), No Child Left Behind (NCLB), the PATRIOT Act, Foreign Intelligence Surveillance Act (FISA), and National Security Letters (NSLs). (If your library does receive an NSL, the lawyers at the ACLU would love to hear about it.)

  • The Federal Trade Commission (FTC) is a good resource for consumer protection (“We collect complaints about hundreds of issues from data security and deceptive advertising to identity theft and Do Not Call violations”).
  • People should have control over their Personally Identifiable Information (PII), including sensitive personal data such as Social Security Numbers. People should know when, how, and what PII is being communicated to others. It’s always best to collect as little information as possible, only what is necessary; minimize data collection and retention.
  • Every library needs a privacy policy, but the policy is just step one. The next step is to make sure your procedures match the policy, and that you contract for privacy with third parties (vendors) to ensure that they handle patron data according to the same standards.*
  • Perform a privacy audit/assessment: what information do you collect and how do you use it?
  • Look at other libraries’ privacy policies, and the privacy policies of small/medium-sized businesses.
  • The library privacy policy should be visible to users: hand it out with new library cards, post it near computers, keep a copy at the reference desk. (And on the library website?)
  • Privacy is important not just for intellectual freedom, but also for intellectual curiosity.

*I haven’t seen the contract language, but I would imagine this is much more difficult than it sounds, especially if a library is working with Overdrive, which allows patrons to check out Kindle books through Amazon. Amazon is a data-hungry beast.

These fair information practice principles I copied directly from slide 10 of Givens’ presentation:

  • Notice/Awareness: Provide notice of information collection practices before information is collected.
  • Choice/Consent: Give the subjects of data collection options about whether and how their personal information may be used.
  • Access/Participation: Provide access to an individual’s personal information so that the individual can review and correct it.
  • Integrity/Security: The data collector must take reasonable steps to make sure the data is accurate and secure.
  • Accountability or Enforcement/Redress: There must be a mechanism for addressing and resolving complaints for failing to abide by the above four principles.

Lastly, this great article was cited by one of the webinar participants. I remember reading it before (it was a Library Link of the Day on 10/4/14): “Librarians won’t stay quiet about government surveillance,” Washington Post, Andrea Peterson, 10/3/14.

This webinar will be archived with the rest of Booklist’s webinars, probably within the next week.

 

Nothing to hide: Readers’ rights to privacy and confidentiality

One of the first arguments that comes up in the privacy debate – whether the issue at hand is a police search of your vehicle or Amazon keeping a record of every Kindle book you read – is that only people who have “something to hide” care about privacy.

This argument is disingenuous; if the people who make it thought about it for even five minutes, I bet they could come up with a few things about their lives that aren’t illegal, or even morally or ethically wrong, but that they’d like to keep private anyway. Let’s consider the issue of library books, and what the books you check out may reveal about you. (Notice The Anarchist Cookbook is not on the following list. I don’t know the statistics about where terrorists get their bomb-making instructions, but I doubt most of it comes from the public library. There’s this thing called the Internet, you see.)

  • What to Expect When You’re Expecting, or other books that might indicate you’re trying to start a family before you’ve told anyone else.
  • Cracking the New GRE, or other test-prep books for grad school or a planned career change you aren’t ready to tell your current boss about.
  • Managing Your Depression, The Lahey Clinic Guide to Cooking Through Cancer, or other books about medical conditions you or someone close to you may be experiencing.
  • Bankruptcy for Small Business Owners might prove worrisome to your clients or your bank.
  • The Guide to Getting It On, or any books on the topics of sexuality, sexual health, safe sex, etc. (In many libraries, kids can get their own library cards at a young age, and parents aren’t allowed to monitor their accounts.) See also: It Gets Better: Coming Out, Overcoming Bullying, Creating a Life Worth Living, or Transgender Lives, etc.
  • God Is Not Great or other anti-religious texts would likely be poorly received if you’re part of a religious family or community.
  • A Gentle Path Through the Twelve Steps, or other books about personal struggle and recovery.
  • How to Buy a House; How to Sell A House, or other real estate books when you haven’t told anyone you’re thinking of moving.

These are just a few examples of information that people might justifiably want to keep personal and private, but not because of any wrongdoing. And this is why librarians strive to protect patron privacy.

“We protect each library user’s right to privacy and confidentiality with respect to information sought or received and resources consulted, borrowed, acquired or transmitted.” -ALA Code of Ethics

11/1/14 Edited to add: This short graphic novel about privacy and technology from Al Jazeera America expands this idea, looking not just at people’s reading history but about all the information they share, voluntarily or not. Thanks to Library Link of the Day for the link.

"Even if you have nothing bad to hide, giving up privacy can mean giving up power over your life story and competing with others for control."

“Even if you have nothing bad to hide, giving up privacy can mean giving up power over your life story and competing with others for control.”

 

“Maybe we’ve been given a false choice between opting in and giving up control over how that information is used–” “–between sharing and being left out.”

11/3/14 Edited to add: Kevin O’Kelly from the Somerville Public Library reminded me of Glenn Greenwald’s excellent TED Talk, “Why Privacy Matters.” In it, Greenwald says, “People who say that…privacy isn’t really important, they don’t actually believe it, and the way you know that they don’t actually believe it is that while they say with their words that privacy doesn’t matter, with their actions, they take all kinds of steps to safeguard their privacy. They put passwords on their email and their social media accounts, they put locks on their bedroom and bathroom doors, all steps designed to prevent other people from entering what they consider their private realm and knowing what it is that they don’t want other people to know.”

And also: “We as human beings, even those of us who in words disclaim the importance of our own privacy, instinctively understand the profound importance of it. It is true that as human beings, we’re social animals, which means we have a need for other people to know what we’re doing and saying and thinking, which is why we voluntarily publish information about ourselves online. But equally essential to what it means to be a free and fulfilled human being is to have a place that we can go and be free of the judgmental eyes of other people.”

Greenwald is the author of No Place to Hide: Edward Snowden, the NSA, and the U.S. Surveillance State (2014). His TED talk is well worth 20 minutes of your time.

 

NELA 2014: Consent of the Networked

Cross-posted on the NELA conference blog.

Intellectual Freedom Committee (IFC) Keynote: Consent of the Networked: The Worldwide Struggle for Internet Freedom, Rebecca MacKinnon (Monday, 8:30am)

MacKinnon pointed to many excellent resources during her presentation (see links below), but I’ll try to summarize a few of her key points. MacKinnon observed that “technology doesn’t obey borders.” Google and Facebook are the two most popular sites in the world, not just in the U.S., and technology companies affect citizen relationships with their governments. While technology may be a liberating force (as envisioned in Apple’s 1984 Superbowl commercial), companies also can and do censor content, and governments around the world are abusing their access to data.

“There are a lot of questions that people need to know to ask and they don’t automatically know to ask.”

MacKinnon noted that our assumption is that of a trend toward democracy, but in fact, some democracies may be sliding back toward authoritarianism: “If we’re not careful, our freedom can be eroded.” We need a global movement for digital rights, the way we need a global movement to act on climate change. If change is going to happen, it must be through an alliance of civil society (citizens, activists), companies, and politicians and policymakers. Why should companies care about digital rights? “They are afraid of becoming the next Friendster.” The work of a generation, MacKinnon said, is this: legislation, accountability, transparency, and building technology that is compatible with human rights.

It sounds overwhelming, but “everybody can start where they are.” To increase your awareness, check out a few of these links: