The gleam in the dark: writing and reading fiction

I’ve been a fan of Lauren Groff’s writing for years, so I was delighted to find this interview with her (via Twitter). The interviewer, Jason Skipper, asked Groff about her research for Monsters of Templeton and Arcadia, and Groff replied, “Research is about following the gleam into the dark.” She followed this beautiful sentence by talking about the difference between “creative” facts that spur one’s imagination and those that dampen the process. This makes sense: just think about researching for a work of historical fiction. Some facts will be fascinating, suggesting plot points all on their own, while others will seem like obstacles to the story.

Skipper then asked Groff about connection, “as a person born on the cusp of the digital age – making you old enough to remember a time without it, and young enough to realize its potential.” I have copied most of Groff’s resonant reply: “We are cuspies, aren’t we? There’s a glow to that time before things went all matrix on us, before everyone was plugged into the mainframe by their fingertips….I do remember people talking more. Nostalgia is dangerous, though, and I can’t tell whether those days actually were more authentically connected, whether they seemed so because I was an adolescent, or whether memory is spackling everything over with a thick layer of pretty-pretty.”

She continued, “In terms of writing, I think what most fiction writers treasure more than anything is the feeling that they’re living for the length of a book inside another person.” This echoes the sentiment in editor Jennifer Jackson’s publicity letter in the ARC of Peter Heller’s The Dog Stars: “[The book] reminded me why I became a reader in the first place: because it is the best chance you will ever have to live another life.” Both author and reader see books as a means of escape and of empathy.

This isn’t a coincidence. In an article entitled “Your Brain on Fiction” in The New York Times earlier this year, professor of cognitive psychology and novelist Keith Oatley suggested that reading produces “a vivid simulation of reality.” The article’s author, Annie Murphy Paul, wrote, “Fiction with its redolent details, imaginative metaphors and attentive descriptions of people and their actions offers an especially rich replica. Indeed, in one respect novels go beyond simulating reality to give readers an experience unavailable off the page: the opportunity to enter fully into other people’s thoughts and feelings.”

Paul continued, “The novel, of course, is an unequaled medium for the exploration of human social and emotional life,” and cited work by Dr. Oatley and Dr. Raymond Mar indicating that “individuals who frequently read fiction seem to be better able to understand other people, empathize with them and see the world from their perspective…novels, stories and dramas can help us understand the complexities of social life.”

Just over a month after the “Your Brain on Fiction” article ran in the Times, the Boston Globe ran a piece by Jonathan Gottschall called “Why Fiction is Good for You.” (Originally, I was going to cite both these pieces in a post called “A spoonful of fiction makes reality go down,” about why kids should be able to read what they want without parents or teachers fearing that the content of the books will damage them somehow; in fact, it seems the opposite is true.) Gottschall reports, “Research consistently shows that fiction does mold us…mainly for the better, not for the worse.” When people read fiction, they imagine themselves in the characters’ lives – which may be completely different from their own. This encourages empathy, and “by enhancing empathy, fiction reduces social friction.”

Imagination leads to understanding; understanding leads to empathy. It turns out – surprise, surprise – that stories are good for us.

10/4/2013 Edited to add: A study published in the journal Science found that after reading literary fiction, “as opposed to popular fiction or serious nonfiction, people performed better on tests measuring empathy, social perception and emotional intelligence.” Read the article, “For Better Social Skills, Scientists Recommend a Little Chekhov,” from the New York Times “Mind” section, in which author Louise Erdrich is quoted.

Rules for Writing Fiction

This two-part article from The Guardian (UK) isn’t new, but it’s worth another read even if you saw it when it was first published. Writers including Elmore Leonard, Margaret Atwood, Philip Pullman, Jonathan Franzen, Anne Enright, Hilary Mantel, Zadie Smith, and many more offer their personal rules for writing fiction. A few that are repeated across many lists include “take long walks,” “avoid adverbs,” and (seems obvious, but…) “write.” Many encourage habit and routine; many also admit it’s fine to break the rules sometimes. Whether you’re a writer or a reader, the lists are fascinating.

Read Part One

Read Part Two

Losing “cite” of what’s important

I’ve been thinking about citation a lot lately. (Chances are, if you aren’t a student and you just read that sentence, you’re already weeping from boredom; but if you are a student, you might have made some sort of frustrated growling sound, or perhaps banged your head against the nearest wall.) I’m in my last semester of grad school in a program where most professors require APA citation (some are flexible – you can use another format as long as you’re consistent about it). As an undergrad I used Chicago style, and in high school I used MLA, so I’ve transitioned a few times, but I was pretty sure I had APA down; after all, attention to detail is what I do. However, two professors this semester have corrected my APA-style citations, which makes me think that other professors just weren’t looking that closely – or that they simply didn’t care as much about the details, as long as sources were acknowledged somehow and the writing was good.

Then this morning, I read an opinion piece in the Chronicle of Higher Education: “Citation Obsession? Get Over It!” The author, a teacher of writing at James Madison University, says that librarians aren’t able to help students with broader information literacy concerns (such as finding, choosing, and evaluating sources) because they are overwhelmed with anxious students needing help with citations. “What a colossal waste,” he writes. “Citation style remains the most arbitrary, formulaic, and prescriptive element of academic writing taught in American high schools and colleges.” He continues, “Now a sacred academic shibboleth, citation persists despite the incredibly high cost-benefit ratio of trying to teach students something they (and we should also) recognize as relatively useless to them as developing writers.”

The author, Kurt Schick, advocates focusing on the function rather than the form of citation. Most citation styles include the same basic information: author, title, publication date, publisher. It would seem that, with the help of a style guide, it wouldn’t be too hard to put that information in the right order – and it wouldn’t be, if all we used were books and articles from academic journals. But information can come from a wide range of sources in a wide range of formats, and there are all kinds of exceptions. As long as the core information is there, allowing others to track down the item in question, does it really matter where the periods and parentheses are? Schick argues that no, it doesn’t – not until it’s time to publish. Until then, the time of professors, librarians, and writing center staff is better spent helping students with essential information literacy and writing skills: “We could…reinvest time wasted on formatting to teach more important skills like selecting credible sources, recognizing bias or faulty arguments, paraphrasing and summarizing effectively, and attributing sourced information persuasively and responsibly.” Students should absolutely understand the importance of acknowledging their use of others’ words and ideas in their writing; no one is arguing that the concept of citation and attribution isn’t essential. The formatting, however, may not be that important.

Until this sea change takes place, however, the form of citation remains paramount for many professors at many institutions. For anyone required to use APA, I highly recommend the Purdue OWL (Online Writing Lab) APA Formatting and Style Guide. Individual college and university libraries and writing centers may have also prepared their own handouts, guides, and tips for citation. There are also citation tools like RefWorks and Zotero, and databases often have a “cite” tool built in; these can certainly be helpful, but they aren’t perfect, and students will still need to proofread for formatting errors.

Know of any other good citation style guides online? Please share in the comments.

Grammar matters.

My dad sent me a link to this article today: “7 Grammar and Spelling Errors That Make You Look Dumb.” I highly recommend it if (a) you’re one of those people who cannot keep “your” and “you’re” straight, or (b) you’re one of those people who can keep “you’re” and “your” straight, and it drives you crazy that your friends, students, coworkers, etc. can’t. Within the article, there’s a link to Grammar Girl, which is also an excellent resource. Another one of my favorites is The Oatmeal, which delivers grammar lessons via comics (usually including dinosaurs or dolphins or something else amusing and lighthearted, yet memorable). Finally, there’s a great Hyperbole and a Half comic on the non-word “alot.”

Like it or not, grammar matters. I try to be non-judgmental and open-minded about a lot of things, but I absolutely judge based on grammar and spelling, and these resources allow me to admit that and justify it a little bit. In the grand scheme of things, is grammar important? Well, as the first article says, yes: potential employers and hiring committees are going to judge, too. Whether it should be important is another question, but for now, it is. Everyone who has to write a cover letter or put together a resume is probably going to be judged on their grammar. Written communication is important – as is attention to detail, especially if you’re applying for a job. So if you’re in category (a), take advantage of the opportunity to learn from comics.

(c) The Oatmeal

Summer InfoLink

I’m still a bit backlogged, with more New Orleans posts to come, as well as a post about the Wikipedia in Higher Education Summit; meanwhile, here’s a short piece I wrote on GSLIS After Dark for this summer’s issue of InfoLink. I’m also in the “Summer Reading” article.

Speaking of summer reading, I just finished Ben Nugent’s American Nerd: The Story of My People, and would definitely recommend it to anyone interested in the evolution and psychology of the nerd. It’s a great blend of well-researched history and the author’s personal experience.

And for those who are looking for a tech-heavy young adult novel set in a dystopian but not-too-distant future (lots of people are looking for that kind of book, right?), try Cory Doctorow’s Little Brother. The main character is well-written and compelling, and you will learn a lot about technology, privacy, and security in a painless way. Warning: it may make you a little paranoid.