Privacy in a Networked World, III

This is the third and last post about Privacy in a Networked World. See the first post (Snowden and Schneier) here and the second post (John DeLong and John Wilbanks) here.

“The Mete and Measure of Privacy,” Cynthia Dwork, Microsoft Research

This was probably the presentation I was least able to follow well, so I’ll do my best to recap in English-major language; please feel free to suggest corrections in the comments. Dwork talked about the importance of being able to draw conclusions about a whole population from a representative data set while maintaining the confidentiality of the individuals in the data set. “Differential privacy” means the outcome of a data analysis is (almost) equally likely whether or not any one individual joins the data set; how close to “equally likely” is quantified by a parameter, epsilon, with a smaller value being better (i.e. less privacy loss). An epsilon registry could then be created to help us track cumulative privacy loss across many analyses.
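To make epsilon a bit more concrete, here is a tiny sketch (my own illustration in Python, not code from Dwork’s talk) of the classic Laplace mechanism: a counting query is answered with noise scaled to 1/epsilon, so a smaller epsilon means more noise added and less privacy lost per query.

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via the inverse-CDF transform."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon):
    """Answer "how many records satisfy predicate?" with epsilon-DP.

    A counting query has sensitivity 1 -- adding or removing one person
    changes the true answer by at most 1 -- so Laplace noise with scale
    1/epsilon yields epsilon-differential privacy for that one query.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

Running the same query repeatedly spends more privacy budget each time — the epsilons add up, which is exactly the cumulative loss an epsilon registry would track.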

Dwork also talked about targeted advertising. Companies that say “Your privacy is very important to us” have “no idea what they’re talking about” – they don’t know how to keep your data private (and may have little interest in doing so). And when you hear “Don’t worry, we just want to advertise to you,” remember – your advertiser is not your friend. Advertisers want to create demand where none exists for their own benefit, not for yours. If an advertiser can pinpoint your mood, they may want to manipulate it (keeping you sad, or cheering you up when you are sad for a good reason). During this presentation, someone posted a link on Twitter to this article from The Atlantic, “The Internet’s Original Sin,” which is well worth a read.

Dwork quoted Latanya Sweeney, who asked, “Computer science got us into this mess. Can computer science get us out of it?” Dwork’s differential privacy is one attempt to simplify and solve the problem of privacy loss. Slides from a different but similar presentation are available through SlideServe.

“Protecting Privacy in an Uncertain World,” Betsy Masiello, Senior Manager, Global Public Policy, Google

Masiello’s talk focused on what Google is doing to protect users’ privacy. “It’s hard to imagine that Larry and Sergey had any idea what they were building,” she began. Today, “Everything is mobile…everything is signed in.” Services like Google Now send you a flow of relevant information, from calendar reminders to traffic to weather. In addition to Google, “the average user has 100 accounts online.” It’s impossible to remember that many passwords, especially if they’re good passwords; and even if they’re good passwords, your accounts still aren’t really safe (see Mat Honan’s 2012 article for Wired, “Kill the Password: Why a String of Characters Can’t Protect Us Anymore”).

To increase security, Google offers two-factor authentication. (You can find out what other sites offer 2FA by checking out twofactorauth.org. Dropbox, Skype, many – but not all – banks, Facebook, LinkedIn, Tumblr, and Twitter all support 2FA.) Masiello said that after news of hacks, they see more people sign up for 2FA. “It’s an awareness problem,” she said. In addition to 2FA, Google is encrypting its services, including Gmail (note that the URLs start with https). “E-mail is still the most common way people send private information,” she said, and as such deserves protection.
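For the curious, here’s a minimal sketch of how the time-based one-time codes behind app-based 2FA are computed under RFC 6238; this is my own illustration, not anything presented at the symposium.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Compute an RFC 6238 time-based one-time password (TOTP).

    `secret_b32` is the base32-encoded shared secret (what the setup
    QR code contains); `at` is a Unix timestamp, defaulting to now.
    """
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at is None else at) // step)
    msg = struct.pack(">Q", counter)           # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                 # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

The server and your phone share the secret once at setup; afterwards each independently derives the same short code from the current 30-second window, so a phished password alone isn’t enough to log in.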

“Encryption is the 21st century way of protecting our personal information,” said Masiello. Governments have protested companies who have started using encryption, but “governments have all the tools they need to obtain information legally.” As Cory Doctorow has pointed out many times before, it’s impossible to build a back door that only the “good guys” can walk through. Masiello said, “Governments getting information to protect us doesn’t require mass surveillance or undermining security designed to keep us safe.” The PRISM revelations “sparked a very important debate about privacy and security online.” Masiello believes that we can protect civil liberties and national security, without back doors or mass surveillance.

“Getting security right takes expertise and commitment,” Masiello said. She mentioned the paper “Breaking the Web” by Anupam Chander and Uyen P. Le, and said that we already have a good set of guidelines: the OECD Privacy Principles, which include collection limitation, data quality, purpose specification, use limitation, security safeguards, openness, individual participation, and accountability. As for Google, Masiello said, “We don’t sell user data; we don’t share with third parties.” All of the advertising revenue is based on user searches, and it’s possible to opt out of interest-based ads. (Those creepy right-sidebar ads that used to show up in Gmail, having mined your e-mail to produce relevant ads, appear to be gone. And good riddance.)

Finally, Masiello talked about developing and setting industry standards for privacy and security that would facilitate innovation and competition. But privacy isn’t even the main concern in the future: it’s identity – what it means, and how we construct it.

“Sur-veillance, Sous-veillance and Co-veillance,” Lee Rainie, Director of Internet, Science and Technology Research, Pew Research Center

Lee Rainie definitely walked away with the “Most Neutral, Fact-Based Presentation” Award. Rainie has spoken at library conferences in New England before, but I – perhaps unwisely – chose to go to other sessions, so this was the first time I saw him speak, and he was great. Furthermore, all of his slides are available on SlideShare. He started off with a few findings:

1. Privacy is not binary / context matters

2. Personal control / agency matters

3. Trade-offs are part of the bargain

4. The young are more focused on network privacy than their elders (this is only surprising if you haven’t read danah boyd’s excellent It’s Complicated: The Social Lives of Networked Teens, and in fact Rainie quoted boyd a few slides later: “The new reality is that people are ‘public by default and private by effort.'”)

5. Many know that they don’t know what’s going on

6. People are growing hopeless and their trust is failing

The Pew Research Center has found that consumers feel they have lost control over how companies collect and use their data; they have adopted a transactional frame of mind (e.g. giving up control of personal data in exchange for the use of a platform or service). In general, trust in most institutions has gone down, with the exceptions of the military, firefighters, and librarians(!). But there is a pervasive sense of vulnerability, and users want anonymity – mostly from advertisers, hackers, and social connections, rather than the government (see slide below).

Lee Rainie, slide 30, “Who users try to avoid: % of adult users who say they have used the internet in ways to avoid being observed or seen by…”

This slide supports the argument for privacy, especially against the “nothing to hide” argument: people desire – and deserve – privacy for many reasons, the least of which is to avoid the government or law enforcement. (Mostly, someone on Twitter pointed out, we want to avoid “that guy.”)

As for the future of privacy, people are feeling “hopeless.” Rainie remembered saying, in the early 2000s, “There’s going to be an Exxon-Valdez of data spills…” and there have been many since then, but little has been done to protect consumer privacy. “How do we convince people to have hope?” he asked.

Panel: “What Privacy Does Society Demand Now and How Much is New?” Danny Weitzner (moderator), Kobbi Nissim, Nick Sinai, Latanya Sweeney

Fortunately, the moderator and panelists have different initials. The questions and responses below are paraphrased from the notes I took during the panel session.

DW: What sort of privacy does society demand now? Is privacy different now?

NS: Access to your own data has always been a part of privacy; also the right to correct, erase, and transfer. Your data should be usable and portable.

KN: The ability to collect a lot of data all the time is new. There is a different balance of power (companies have too much).

LS: Privacy and security are just the beginning. Every American value is being changed by technology. Computer scientists aren’t trained to think of social science effects and the power of technology design.

DW: Cryptography and math are a foundation we can trust if implemented properly, as Snowden said this morning.

LS: I dislike choosing between two things. We need a cross-disciplinary approach, a blended approach.

NS: Any great company should constantly be trying to improve user experience. How does privacy/security get integrated into design?

KN: Aim for mathematical solutions/foundations. We need to re-architect economic incentives, regulations, how all the components work together.

DW: Where will the leadership and initiative come from? Government?

KN: Academia, research. We need to find ways to incentivize.

LS: Economic [incentives] or regulations are necessary for privacy by design. They’re all collapsing…every single one of them [Facebook, the IRS] is heading for a major disaster.

DW: People care about control of their data, yet the information environment is increasingly complicated.

LS: Society benefits from technology with certain protections.

KN: Regulations we have today were designed in a completely different era. We may be in compliance, and still we have damaged privacy severely.

LS mentioned HIPAA, NS mentioned the Consumer Bill of Rights, DW mentioned “Privacy on the Books and on the Ground.”

DW: Privacy practices and discussion are evolving in the U.S.

LS: A huge dose of transparency would go a long way. This is the new 1776. It’s a whole new world. Technology is redefining society. The Federal Trade Commission could become the Federal Technology Commission.

DW: Are you optimistic? Are we heading toward a positive sense of privacy?

NS: Yes, by nature I’m optimistic, but complexity and user experience (user accounts, passwords) frustrates me. Entrepreneurs do help change the world.

KN: The genie is out of the bottle. This forces us to rethink privacy. Nineteen-fifties privacy has changed and isn’t the privacy we have today, but that doesn’t mean that privacy is dead. Privacy is a sword and a shield.

DW: We’re at the beginning of a long cycle. It’s only been a year [and a half] since Snowden. What do we expect from our government and our companies? How powerful should government and private organizations be? Marketing/advertising issues are trivial compared to bigger issues.

LS: The cost of collecting data is almost zero, so organizations (public and private) collect it and then figure out how to use it later. They should be more selective about collection. If we can expose the harm, it will lead to change.

Question/comment from audience: A lot of people are not aware they’re giving away their privacy (when browsing the internet, etc.).

LS: We need transparency.

NS: We need regulation and consumer protection.


Privacy in a Networked World, II

This is the second post about Privacy in a Networked World. The first post, about the conversation between Bruce Schneier and Edward Snowden, is here.

“Privacy in a Networked World,” John DeLong, Director of the Commercial Solutions Center, NSA

Other than the length and volume of applause, it’s difficult to measure an audience’s attitude toward a speaker. I’ll venture, though, that the audience of Privacy in a Networked World was generally pro-Snowden; the attitude toward John DeLong can perhaps be characterized as guarded open-mindedness laced with a healthy dose of skepticism.

DeLong’s talk was both forceful and defensive; he wanted to set the record straight about certain things, but he also knew that public opinion (in that room, at least) probably wasn’t in his favor. (He said repeatedly that he did not want to have an “Oxford-style debate,” though his talk wasn’t set up as a debate in the first place.) “Let’s not confuse the recipe with the cooking,” he said, in a somewhat belabored analogy where the NSA’s work was the cooking and the law was the recipe. (I cook a lot at home, and I’ll just say, I can usually tell when I’m looking at a bad recipe, and opt to adapt it or not make it at all.)

DeLong quoted at length from Geoffrey R. Stone’s “What I Told the NSA.” (Stone was a member of the President’s Review Group in fall 2013, after the Snowden revelations.) Stone’s conclusions were not altogether positive; he found that while the NSA “did its job,” many of its programs were “highly problematic and much in need of reform.” But it’s the Executive Branch, Congress, and FISA who authorized those programs and are responsible for reforming them. Stone added, “Of course, ‘I was only following orders’ is not always an excuse….To be clear, I am not saying that citizens should trust the NSA. They should not. Distrust is essential to effective democratic governance.”

DeLong said, “The idea that the NSA’s activities were unauthorized is wrong, wrong in a magnificent way.” He emphasized that the NSA is not a law enforcement agency, it’s an intelligence agency. He spoke in favor of people with different backgrounds and expertise – lawyers, engineers, mathematicians, privacy experts, etc. – coming together to work out solutions to problems, with respect for each others’ abilities. “Technology,” he said, “always comes back to how we as humans use it.” At present, “We do not have technologies that identify privacy risks….Privacy engineering could be one of the most important engineering feats of our time.”

DeLong talked about rebuilding the nation’s confidence in the NSA. “Confidence is the residue of promises kept,” he said. “More information does not necessarily equal more confidence.” (Someone on Twitter pointed out that much depends on the content of the information.) The talk was a good reminder not to villainize the entire NSA; part of DeLong’s forcefulness was undoubtedly on behalf of his co-workers and staff whom he felt were unfairly maligned. And technology that could identify privacy risks, built by people who have different perspectives and backgrounds, would be excellent. But do we need technology that identifies privacy risks, or do we need stronger oversight and better common sense? Mass surveillance erodes trust in government and hasn’t been terribly effective; what more do we need to know to put a stop to it?

“Privacy and Irony in Digital Health Data,” John Wilbanks, Chief Commons Officer, Sage Bionetworks

John Wilbanks gave a fast-paced, interesting talk about health data. The “irony” in the title of his talk soon became clear when he gave the example of Facebook’s mood manipulation experiment compared to a study of Parkinson’s disease. The sample size for Facebook was many times larger, with a constant flow of information from “participants,” as opposed to a much smaller sample population who filled out a survey and answered questions by phone. “What does our society value?” Wilbanks asked. This question can be answered by another question: “What do we surveil?”

Wilbanks showed a graph representing cardiovascular disease and terrorism: there is 1 death every 33 seconds from cardiovascular disease – “That’s like 9/11 every day” – and yet there’s not nearly the same kind of “surveillance” for health that there is for terrorism. Participating in a research study, Wilbanks said, is like “volunteering for surveillance,” and usually the mechanisms for tracking aren’t as comprehensive as, say, Facebook’s. Of course, privacy laws affect health research, and informed consent protects people by siloing their data; once the study is concluded, other researchers can’t use that data, and there’s no “network effect.”

Informed consent, while a good idea in theory, often leads to incomprehensible documents (much like Terms of Service). These documents are written by doctors, reviewed by lawyers, and edited by committee. Furthermore, said Wilbanks, people in health care don’t usually understand issues of power and data. So, he asked, how do we run studies at internet scale and make them recombinant? How do we scale privacy alongside the ability to do research? Wilbanks demonstrated some ideas to improve on traditional informed consent, which could also allow research subjects to get a copy of their own data and see which researchers are using data from the research in which they participated.

Obviously there are risks to individuals who share their personal health data, but there can be advantages too: more scientists having access to more data and doing more research can lead to more breakthroughs and improvements in the field of medicine.

Last year, Wilbanks talked about privacy and health data on NPR; you can listen to the segment here.

Still to come: Microsoft, Google, Pew, and a panel on “What Privacy Does Society Demand Now and How Much is New?” 

Privacy in a Networked World

This is the first post about Privacy in a Networked World, the Fourth Annual Symposium on the Future of Computation in Science and Engineering, at Harvard on Friday, January 23.

A Conversation between Bruce Schneier and Edward Snowden (video chat)

Bruce Schneier is a fellow at the Berkman Center for Internet & Society, and the author of Data and Goliath. Edward Snowden was a sysadmin at the NSA who revealed the extent of the government’s mass surveillance. The conversation was recorded (no joke) and is available on YouTube.

I have to say it was an incredibly cool feeling when Snowden popped up on the giant screen and was there in the room with us. There was sustained applause when he first appeared and also at the end of the conversation, when he was waving goodbye. Schneier started by asking Snowden about cryptography: What can and can’t be done? Snowden replied, “Encryption…is one of the few things that we can rely on.” When implemented properly, “encryption does work.” Poor cryptography, either through bad implementation or a weak algorithm, means weak security. End points are also weak spots, even if the data in transit is protected; it’s easier for an attacker to get around crypto than to break it.

Snowden pointed out a shift in the NSA’s focus over the last ten years from defense to offense. He encouraged us to ask Why? Is this proper? Appropriate? Does it benefit or serve the public?

The explosion in “passive” mass surveillance (collecting everything in case it’s needed later) is partly because it’s easy, cheap, and simple. If more data is encrypted, it becomes harder to sweep up, and hackers (including the NSA) who use more “active” techniques run a higher risk of exposure. This “hunger for risk has greatly increased” during the War on Terror era. Their targets are “crazy, unjustified….If they were truly risk averse they wouldn’t be doing this…it’s unlawful.”

Snowden said that the NSA “is completely free from any meaningful judicial oversight…in this environment, a culture of impunity develops.” Schneier said there were two kinds of oversight: tactical oversight within the organization (“did we follow the rules?”) and oversight from outside of the organization (“are these the right rules?”). He asked, “What is moral in our society?”

Snowden asked if the potential intelligence that we gain was worth the potential cost. He stated that reducing trust in the American infrastructure is a costly move; the information sector is crucial to our economy. The decrease in trust, he said, has already cost us more than the NSA’s budget. “They are not representing our interests.”

Schneier, using his NSA voice, said, “Corporations are spying on the whole internet, let’s get ourselves a copy!” (This was much re-tweeted.) “Personal information,” he said, “is the currency by which we buy our internet.” (Remember, if you can’t tell what the product is, you’re the product.) It’s “always amusing,” he said, when Google complains about the government spying on their users, because “it’s our job to spy on our users!” However, Schneier thinks that the attitudes of tech companies and standards bodies are changing.

These silos of information were too rich and interesting for governments to ignore, said Snowden, and there was no cost to scooping up the data because until 2013, “people didn’t realize how badly they were being sold up the river.” Schneier said that research into privacy-preserving technologies might increase now that there is more interest. Can we build a more privacy-preserving network, with less metadata?

“We’ve seen that the arguments for mass surveillance” haven’t really held up; there is little evidence that it has stopped many terrorist attacks. Schneier cited an article from the January 26, 2015 edition of The New Yorker, “The Whole Haystack,” in which author Mattathias Schwartz lists several recent terrorist attacks, and concludes, “In each of these cases, the authorities were not wanting for data. What they failed to do was appreciate the significance of the data they already had.”

Unlike during the Cold War, now “we all use the same stuff”: we can’t attack their networks and defend our networks, because it’s all the same thing. Schneier said, “Every time we hoard a zero-day opportunity [knowing about a security flaw], we’re leaving ourselves vulnerable to attack.”


Snowden was a tough act to follow, especially for John DeLong, Director of the Commercial Solutions Center for the NSA, but that’s exactly who spoke next. Stay tuned.


Nothing to hide: Readers’ rights to privacy and confidentiality

One of the first arguments that comes up in the privacy debate – whether the issue at hand is a police search of your vehicle or Amazon keeping a record of every Kindle book you read – is that only people who have “something to hide” care about privacy.

This argument is disingenuous; if the people who make it thought for even five minutes, I bet they could come up with a few things about their lives that aren’t illegal, or even morally or ethically wrong, but that they’d like to keep private anyway. Let’s consider the issue of library books, and what the books you check out may reveal about you. (Notice The Anarchist Cookbook is not on the following list. I don’t know the statistics about where terrorists get their bomb-making instructions, but I doubt most of it comes from the public library. There’s this thing called the Internet, you see.)

  • What to Expect When You’re Expecting, or other books that might indicate you’re trying to start a family before you’ve told anyone else.
  • Cracking the New GRE, or other test-prep books for grad school or a planned career change you aren’t ready to tell your current boss about.
  • Managing Your Depression, The Lahey Clinic Guide to Cooking Through Cancer, or other books about medical conditions you or someone close to you may be experiencing.
  • Bankruptcy for Small Business Owners might prove worrisome to your clients or your bank.
  • The Guide to Getting It On, or any books on the topics of sexuality, sexual health, safe sex, etc. (In many libraries, kids can get their own library cards at a young age, and parents aren’t allowed to monitor their accounts.) See also: It Gets Better: Coming Out, Overcoming Bullying, Creating a Life Worth Living, or Transgender Lives, etc.
  • God Is Not Great or other anti-religious texts would likely be poorly received if you’re part of a religious family or community.
  • A Gentle Path Through the Twelve Steps, or other books about personal struggle and recovery.
  • How to Buy a House; How to Sell A House, or other real estate books when you haven’t told anyone you’re thinking of moving.

These are just a few examples of information that people might justifiably want to keep personal and private, but not because of any wrongdoing. And this is why librarians strive to protect patron privacy.

“We protect each library user’s right to privacy and confidentiality with respect to information sought or received and resources consulted, borrowed, acquired or transmitted.” -ALA Code of Ethics

11/1/14 Edited to add: This short graphic novel about privacy and technology from Al Jazeera America expands this idea, looking not just at people’s reading history but about all the information they share, voluntarily or not. Thanks to Library Link of the Day for the link.

"Even if you have nothing bad to hide, giving up privacy can mean giving up power over your life story and competing with others for control."

“Even if you have nothing bad to hide, giving up privacy can mean giving up power over your life story and competing with others for control.”


“Maybe we’ve been given a false choice between opting in and giving up control over how that information is used–” “–between sharing and being left out.”

11/3/14 Edited to add: Kevin O’Kelly from the Somerville Public Library reminded me of Glenn Greenwald’s excellent TED Talk, “Why Privacy Matters.” In it, Greenwald says, “People who say that…privacy isn’t really important, they don’t actually believe it, and the way you know that they don’t actually believe it is that while they say with their words that privacy doesn’t matter, with their actions, they take all kinds of steps to safeguard their privacy. They put passwords on their email and their social media accounts, they put locks on their bedroom and bathroom doors, all steps designed to prevent other people from entering what they consider their private realm and knowing what it is that they don’t want other people to know.”

And also: “We as human beings, even those of us who in words disclaim the importance of our own privacy, instinctively understand the profound importance of it. It is true that as human beings, we’re social animals, which means we have a need for other people to know what we’re doing and saying and thinking, which is why we voluntarily publish information about ourselves online. But equally essential to what it means to be a free and fulfilled human being is to have a place that we can go and be free of the judgmental eyes of other people.”

Greenwald is the author of No Place to Hide: Edward Snowden, the NSA, and the U.S. surveillance state (2014). His TED talk is well worth 20 minutes of your time.


MLA Conference 2014, Day One (Wednesday), Part One

It’s that time again! This year, the Massachusetts Library Association conference is in Worcester, and once again the lovely and gracious Friends of the Library enabled some of our library staff (myself included) to attend. Here’s my round-up of the first three sessions I went to today, with more to come. Several conference-goers are also on Twitter (#masslib14).

Brand New You: How Libraries Use Branding to Establish Relevance and Engage Users

Anna Popp from the Massachusetts Library System presented on MLS’ experience developing their brand with Walter Briggs of Briggs Advertising. Popp convened a task force and established a clear decision-making protocol (essential, according to Briggs). Popp and Briggs explained that an organization’s brand is evolutionary, not visionary; it’s not the same as its vision or mission statement (‘it’s not what you aim to be, it’s what you are’).

The process involved brainstorming everything about the organization, then crossing out everything that wasn’t unique, with the goal of distilling it down to 3-5 words or phrases – the “brand mantra.” The brand mantra is an internal tool, and is not the same thing as a tagline (e.g., Nike’s brand mantra is “Authentic Athletic Performance,” not “Just Do It.”) MLS came up with “Uniting, Empowering, Library Enhancement.” The brand mantra is “the most important deliverable” from the branding process, more important even than the logo. The logo’s job is not to show or tell what an organization does.

The tagline should be “evocative, inspiring, brief, lyrical” and have “integrity.” The (awesome) MLS tagline is “Stronger together,” which perfectly suits an organization dedicated to building a statewide community of libraries, empowering those libraries, and championing resource sharing.

Briggs finished the presentation by sharing some of his past work. I especially loved the Patten Free Library tagline, “More than you’ll ever know,” and the tangram-like logos (below) for the Curtis Memorial Library (both libraries are in Maine).


The takeaways from this session included: (1) Recognize what people bring to the table, (2) Establish role clarity – who will have an advisory role, who will have a decision-making role?, (3) Let people do their jobs, help when necessary, (4) Prepare to learn something about yourself, (5) Plan ahead, but be prepared for eventualities and opportunities. It may be hard to prove the ROI on a logo, but Popp mentioned the idea of “mindshare”: “in marketing, repetition wins.” Establish your relevance and constantly reaffirm it.

An Agenda for Information Activism: Internet Freedom and Press Freedom Today

Kevin Gallagher stepped up here in place of the original presenter, Josh Stearns, formerly of Free Press. Gallagher clearly knew his stuff, particularly the threat that government mass surveillance poses to journalists and society at large, and he did a good job on short notice. He wasn’t the most comfortable speaker, and his presentation jumped around a little bit; the audience wasn’t all familiar with some of the terms he used or the services he referenced. The presentation had no handouts or visual component (other than the trailer for the upcoming Aaron Swartz documentary, The Internet’s Own Boy). However, privacy is something librarians care deeply about, and this program took a step toward convincing us all to do more research for ourselves, and think about what we can offer patrons, both in terms of tools and education. Here are a few points and links from the session (thanks also to Alison Macrina of Watertown Free Public Library):

  • When the government undermines and weakens Internet security standards for the purposes of surveillance and data-gathering, it makes us all less safe, not more.
  • There are library privacy laws in 48 states and the District of Columbia. Patron privacy and confidentiality is essential for the free pursuit of knowledge.
  • If the government can collect metadata on journalists’ communications, that exposes journalists’ sources, whose confidentiality should be protected.
  • Read the full text of the Guerilla Open Access Manifesto by Aaron Swartz on the Internet Archive.
  • “There is already a war” against whistleblowers, journalists, and activists (examples: Julian Assange, Jeremy Hammond, Edward Snowden, Barrett Brown, Jim Risen).
  • “We need a new Church Committee.”
  • Government agencies and private companies are collecting personal data and metadata. Be aware of what personal data private companies are collecting, and what permissions you are giving when you use services like Facebook. See Terms of Service; Didn’t Read.
  • Use search engines that value privacy, like DuckDuckGo, or use plugins like Ghostery or services like Disconnect.me. Install Tails, an operating system that lets you use the Internet anonymously via Tor.
  • What can we (in libraries) do? Use more privacy and security tools (like HTTPS Everywhere from the EFF). Use free and open software instead of proprietary software (“There’s a free and open alternative to everything”). Make sure patron privacy policies are up to date, and make sure we aren’t collecting any more patron information than necessary. If libraries are receiving federal funds that force compliance with CIPA, make sure you aren’t filtering any more than you have to – or, if possible, don’t accept the strings-attached funds. Host a “crypto party.” Support the USA Freedom Act, make FOIA requests. Remember the Library Bill of Rights.

How We Doin’?: Public Libraries Using LibSat to Gather Patron Feedback

The Los Angeles Public Library uses LibSat.

The Massachusetts Board of Library Commissioners (MBLC) is providing LibSat from Counting Opinions to all Massachusetts libraries for a three-year term. All library directors have the login information, and can pass it on to any of their staff. From what we saw in this session, LibSat is a pretty incredible tool to gather continuous patron feedback about their library experience; data nerds in the room were audibly delighted.

This session began with the proverb "A guest sees more in an hour than the host sees in a year." Patron feedback is valuable to libraries: it offers reminders of how much people appreciate library services and staff, and it presents opportunities for improvement. When patrons rate the library's importance as high but their satisfaction as low, that gap points directly to areas needing attention.

LibSat offers patrons a choice of a short survey (3-5 minutes), a regular survey (5-7 minutes), or an in-depth survey (~15 minutes). Other than possible survey fatigue, there's really no reason MA libraries shouldn't be using this tool. The results could really come in handy when it's time to prepare those annual reports…

Next up:

Working with and Managing Multigenerational Staff/People

Building Intergenerational Collaboration & Programs: Serving People of Different Ages

Last year’s (rather long) MLA posts:

4/24/13: Teaching Technology to Patrons and Staff & Afraid to Advocate? Get Over It! & Library Services and the Future of the Catalog: Lessons from Recent ILS Upgrades & Loaning E-Readers to the Public: Legal and Strategic Challenges

4/25/13: On Life Support, But Not Dead Yet!: Revitalizing Reference for the 21st Century & Authors, Authors, Authors!: Three Local Authors Strut Their Stuff & Analyze Your Collection & Print and Digital Publishing: How Are Publishers, Editors, and Authors Adapting.

404 Day: Action Against Censorship

Join EFF!

Today is 4/04, “a nation-wide day of action to call attention to the long-standing problem of Internet censorship in public libraries and public schools.” Read the Electronic Frontier Foundation (EFF) piece about Internet filtering in libraries and the Children’s Internet Protection Act (CIPA). Though CIPA requires some Internet filtering in libraries that accept federal funding, libraries often go further than is required. (Also, no filter is perfect: some “bad” content will always get through, and plenty of legitimate content will be blocked.)

Barbara Fister at Inside Higher Ed is also eloquent, as always, in her piece "404 Day: Protecting Kids From…What?" She mentions Gretchen Caserotti, a public library director in Meridian, Idaho, who live-tweeted a school board meeting in which Sherman Alexie's The Absolutely True Diary of a Part-Time Indian was, ultimately, removed from the school curriculum. (Book Riot collected these tweets in order.)

Kids don’t need to be protected from literature. They need to engage with it, think about it, and discuss it; if adults can help with that, all the better. But as Fister points out, it would be better to protect kids from, say, hunger, than from “dangerous” books.

Problem Novels or Resilience Literature?

Last month there was a snippet in EarlyWord that caught my eye: librarian Nancy Pearl interviewed YA author Laurie Halse Anderson. Anderson's books, which tackle difficult but real topics such as rape (Speak) and eating disorders (Wintergirls), are occasionally targeted by those who wish to censor them. Pearl asked her about the "problem novel" label that is "often applied to books about teens dealing with real-life situations." Anderson reframed the issue by calling these books "resilience literature" instead, "because the goal of the books is to help strengthen kids facing difficult situations." I think that's a beautiful and apt way to put it.

As my co-worker Rebecca wrote during Banned Books Week last year, “Books are safe spaces to experience new things. New thoughts. New ideas. Different points of view….Books teach us how to empathize with each other, how to stand up for the little guy, and how to recognize the bad guys in our lives….We experience strong emotion alongside our favorite characters – joy, catharsis, loss, excitement. Books are a safe way to learn about life, without all the painful bumps and bruises.”

There is a quote attributed to Mahatma Gandhi: “Your beliefs become your thoughts, your thoughts become your words, your words become your actions, your actions become your habits, your habits become your values, your values become your destiny.” Your thoughts become your words. Your words become your actions. Words are important, and labels are especially so. Anderson’s renaming “problem novels” to “resilience literature” is not only a more accurate term, it also casts these books, and the discussion surrounding them, into a more positive and constructive light.

Speak: Have you read books that fall into this category? What did you think of them?