
From the Pages of Historically Speaking: An Interview with James M. Banner, Jr. on Being a Historian

"On Being a Historian: An Interview with James M. Banner, Jr."
Conducted by Donald A. Yerxa
Historically Speaking (September 2012)

Historian James Banner's new book Being a Historian: An Introduction to the Professional World of History (Cambridge University Press, 2012) is an insightful and often provocative overview of the current state of the discipline of history. Drawing on more than fifty years' experience both within and outside academic walls, Banner argues that while there is much to celebrate, the discipline needs to acknowledge and confront a number of serious challenges. Banner, the author of many books and essays on history, education, and public affairs, is currently working on a book about revisionist history. Senior editor Donald A. Yerxa interviewed Banner in July 2012.

Donald A. Yerxa: For the benefit of our readers, would you briefly summarize your central argument in Being a Historian?

James M. Banner, Jr.:
The basic one, from which the book descends, is that history is a discipline—a distinct domain of knowledge—pursued in many professions. That is, there's no "history profession," as we colloquially call it, as such. That argument's corollary is that academic history, while still the center of gravity of the discipline, does not embody all of historians' knowledge, institutions, or practices. Of course, we know this, but our terminology and the way we relate the history of the discipline haven't caught up with the facts—much to the cost of reputation, reward, self-respect, and, most importantly, the training of historians. I thus also argue that, while the preparation of historians has substantially improved in recent decades, it remains deficient. That argument, that we have farther to go in preparing historians, is like an organ point in a passage of music, the rumbling contention of the entire book. Finally, I argue that historians (like, I must say, sociologists and biologists, attorneys and engineers) must seek more guidance, not from the idols of the tribe—academic professors—or from within the conventional template of graduate student preparation—how principally to become an academic scholar-teacher—but from within themselves, from their particular dispositions, gratifications, aims, and gifts.
Yerxa: What prompted you to write it? And for whom is it written?

Banner:
Part of the spur was purely personal, as is—isn't it?—all writing. I wanted to try to draw together my reflections, frustrations, and concerns about the entire discipline of history formed over more than a half-century of being a historian. I wanted also to challenge my colleagues to go further in altering the way historians prepare young historians for their professional worlds. And there was a part of me that wished to do what I wish the senior historical organization in the U.S. and the largest and most influential body of historians in the world—the American Historical Association—would periodically do: assess the state of the discipline. And so the book is a kind of evaluation of the condition of the discipline today. But it's also a book with two very specific audiences in mind: first of all aspiring historians, for whom I want to provide a kind of honest, optimistic, yet disenthralled introduction to the discipline they're entering; and second, my more experienced colleagues who ought to be training historians to interact with the larger world as well as with scholars and students and who, I hope, are learning to reach out to that world themselves.

Yerxa: You argue with conviction that it is a mistake to confuse the discipline of history with the profession of history. Why is it such an important distinction to make?

Banner: Simply put, because of the facts. The academic profession is but one of the professions—although, surely, the central one still—in which historians practice their many crafts and apply their great variety of knowledge. Historians also practice history in law and medical schools, in government at all levels, as reporters, in museums and historical societies, and as schoolteachers. These historians, when employed as historians, are all professional historians acting professionally, taking part in the worldwide community of historical discourse and applying historical knowledge in some manner to some purpose. It’s the discipline that binds us, not our places of work, the kinds of work we pursue, the forms our work takes, or the audiences to which we direct that work. Those differ widely. The conventional terminology—“the” history profession—gives pride of place to those who coined the term and have long employed it: academic historians around whom, in the first century of the discipline’s emergence, the world of history gathered. After all, they were the people (mostly men) who created the departments, the standards, the training protocols, the products (mostly books), and the tenure system in which, until the 1960s, most historians were organized. But while historians must still be prepared by academic historians in research universities to master bodies of knowledge and to undertake and produce research scholarship, their employment has long escaped academic walls. In fact, there’s reason to believe that at least half of those now receiving history doctoral degrees, either by choice or necessity (we lack information about that critical matter), do not enter academic work. Consequently, in recent decades we’ve gotten used to distinguishing academic from public historians. That’s fine as far as it goes. But, as I also argue in the book, it’s a weak distinction. Increasingly, historians are hybrids—I’m one of them—who move back and forth between the classroom and other occupations, who write, film, and curate history while holding faculty positions and who teach while working in government or nonacademic institutions. An increasing number of historians are both academic and public historians. So why can’t we just term ourselves historians—colleagues all—and put aside the distinction, perhaps useful but increasingly outmoded, between public and academic historians? read on >>>

Writing Well?

Randall Stephens

The Sokal Hoax is the stuff of academic legend. In its spring/summer 1996 issue, the journal Social Text published Alan Sokal's baroque send-up of po-mo bad writing, to which Sokal gave the absurdly pompous title "Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity." Steven Weinberg wrote about it in the NYRB later that same year:

The targets of Sokal's satire occupy a broad intellectual range. There are those 'postmoderns' in the humanities who like to surf through avant garde fields like quantum mechanics or chaos theory to dress up their own arguments about the fragmentary and random nature of experience. There are those sociologists, historians, and philosophers who see the laws of nature as social constructions. There are cultural critics who find the taint of sexism, racism, colonialism, militarism, or capitalism not only in the practice of scientific research but even in its conclusions. Sokal did not satirize creationists or other religious enthusiasts who in many parts of the world are the most dangerous adversaries of science, but his targets were spread widely enough, and he was attacked or praised from all sides.

Sokal also threw in some hairy theory and clunky sentences. For instance, he wrote seriously about the nonsensical "morphogenetic field" theory. His sophistry was meant to impress. And the editors of Social Text were impressed. In a move that paralleled conceptual art, Sokal, or so unknowing readers thought, was pushing the boundaries of so-called "science."

Do academics in the humanities still prize purple prose and fantastic theories over clear writing and measured analysis? Are scholars stubbornly proud of their bad writing, as if to shout from the rooftops that their work is only to be read and understood by a cabal of fellow scribblers? Can anyone make a case for not rooting out unidentified antecedents, passive voice, misplaced modifiers, lack of agreement, or double negatives? Should there be some kind of writing standard, even for academics? 

To that last question Helen Sword says "yes." The author of Stylish Academic Writing (Harvard University Press, 2012), Sword writes about her project in the WSJ:

Unfortunately, the myth persists, especially among junior faculty still winding their anxious way up the tenure track, that the gates of academic publishing are guarded by grumpy sentries programmed to reject everything but jargon-laden, impersonal prose. In fact, nothing could be further from the truth. Nearly everyone, including the editors of academic journals, would much rather read lively, well-written articles than the slow-moving sludge of the typical scholarly paper.

Surely, scholars in the humanities should consider their audience and what kind of message they are trying to convey. Would any author happily describe his or her work as "inaccessible," "abstruse," or "turgid"? Probably not. Yet plow through many an article in an academic journal or read a random monograph from the shelves of your university library and those words will likely come to mind.

Some years ago in grad school I worked with the labor historian Robert Zieger. Here's one bit of advice he offered undergrads and grad students: "Use vigorous, direct language. Short sentences work. Employ concrete, precise nouns and active verbs, being careful, for example, to find active substitutes for forms of the verb 'to be' and 'to go.' Inexperienced writers often erroneously think that convoluted language, long sentences, and pretentious diction impress teachers." And still . . . many academics seem to think "convoluted language, long sentences, and pretentious diction" will impress or suitably confuse readers. That would not be too far from what George Orwell described in "Politics and the English Language" (1946): “The great enemy of clear language is insincerity. When there is a gap between one’s real and one’s declared aims, one turns as it were instinctively to long words and exhausted idioms, like a cuttlefish squirting out ink.”

But surely one kind of writing doesn't suit all disciplines! And so Sword observes:

Stylishness is in the eye of the beholder, of course, and stylistic preferences can vary significantly across disciplines. Nevertheless, all stylish academics adhere to three key principles that any writer can master: communication, concreteness, and craft.

Rejecting those three, I penned my own bit of over-written, jargon-laden academic unprose. It's exaggerated, I know. But not by much!

The prevailing sequence of hybridity in the post-colonised novel lends itself, interstitially, to notions of conquest, absence, and disquietude of the en-lightened Mastermind. A mind{less}ness prevails, just as order, disorder, and value-induced reasoning through a countless series of dilemmas grows. A closer look bears repeating in rough contexts unlike those aberrant occidentric diodanous boundarylands. The transformation of agency-related modes of being, working, and cleaning demarcate and imbue the singularities of eroticizational ideation. Or, in one scholar's incisive words: 

My growing conviction has been that the encounters and negotiations of differential meanings and values within 'colonial' textuality, its governmental discourses and cultural practices, have anticipated, avant la lettre, many of the problematics of signification and judgement that have become current in contemporary theory—aporia, ambivalence, indeterminacy, the question of discursive closure, the threat to agency, the status of intentionality, the challenge to 'totalizing' concepts, to name but a few.*

And still, we have to ask ourselves, if only it were so simple. . . . 

Do the dominant Deleuzeian somnambulant regimes of some prelinguistic realities reinscribe what some are now, rightly, calling Academies of Texthibitionism? Are gleeful literary curios to blame for our unctuous, precious dreamworld of the deracinated female body? All such questions are implicitly, if not explicitlessly, reiterated and conformed by the worries of a market-driven, late capitalist, zero-sum hegemon—a two-term qualifier revamp if there ever was one, to paraphrase Dioxané Umbriage.

As I have argued elsewhere, and as is made rather clearly in the notes to the notes of chapter 5, the vicissitudanal convergencies of self and the “sane” are only partially related to the singularities of a final Lacanian eroticizational ideation schema. Albeit, a brave one.

Roundup on Writing

From a cafe in Grunerløkka, Oslo
William Zinsser, "Looking for a Model," American Scholar blog, ND

Writing is learned by imitation; we all need models. “I’d like to write like that,” we think at various moments in our journey, mentioning an author whose style we want to emulate. But our best models may be men and women writing in fields different from our own. When I wrote On Writing Well, in 1974, I took as my model a book that had nothing to do with writing or the English language.>>>

PageView Editor, "My Daily Read: [an interview with] Helen Sword," Chronicle, May 2, 2012

Q: What is your greatest criticism of much academic writing?

A. In contrast to Sinclair’s lucid and engaging paper, many academic articles are quite frankly unreadable, not only by disciplinary outsiders but by close colleagues.  Often the problem is simply poor craftsmanship:  perhaps the author has tried to cram three or four major ideas into a single sentence, leaving the reader to do the hard work of disentangling all those nested subordinate clauses.  Another common issue is an excessive allegiance to the discourse of abstraction: it’s not uncommon to find nine, ten, or more spongy abstract nouns (examples: allegiance, discourse, abstraction) cohabiting in a single sentence. The human attention span has trouble coping with that much vagueness.  Stylish academic writers anchor abstract ideas in the physical world, using stories, case studies, metaphors, illustrations, concrete nouns, and vivid verbs, and lots and lots of examples.
>>>

Isabel Kaplan, "Classic Literature Isn't Dead: No Ifs, Ands, or Buts," Huff Post Books, June 5, 2012

THIS JUST IN: Contemporary writers are no longer influenced by classic literature -- or so claim a team of mathematicians from Dartmouth and Wisconsin in a recently published paper entitled, "Quantitative patterns of stylistic influence in the evolution of literature.">>>

Gail Collins, "How Texas Inflicts Bad Textbooks on Us," New York Review of Books, June 21, 2012

No matter where you live, if your children go to public schools, the textbooks they use were very possibly written under Texas influence. If they graduated with a reflexive suspicion of the concept of separation of church and state and an unexpected interest in the contributions of the National Rifle Association to American history, you know who to blame.>>>

Maria Popova, "Ray Bradbury on Facing Rejection ... and Being Inspired by Snoopy," Atlantic Monthly, May 21, 2012

Famous advice on writing abounds—Kurt Vonnegut's 8 tips on how to make a great story, David Ogilvy's 10 no-bullshit tips, Henry Miller's 11 commandments, Jack Kerouac's 30 beliefs and techniques, John Steinbeck's six pointers, and various invaluable insight from other great writers. In Snoopy's Guide to the Writing Life, Barnaby Conrad and Monte Schulz, son of Peanuts creator Charles M. Schulz, bring a delightfully refreshing lens to the writing advice genre by asking 30 famous authors and entertainers to each respond to a favorite Snoopy comic strip with a 500-word essay on the triumphs and tribulations of the writing life.>>>

Real-time

Dan Allosso

Boy oh boy, there are a lot of stories out there! It continues to amaze me how, everyplace I look, there's interesting, compelling history that could potentially turn into serious projects. Yes, okay, maybe I have a short attention span and maybe I need to complete some things (like my dissertation) before taking on any new projects. I’ll give you that.

So what I’m trying to do is get a little bit of info, when I find these topics, so I can get back to them later. In a sense, maybe this is how authors worked in the days when they were doing the final edits on one manuscript while writing the next, proposing the one after that, and looking for the projects after those. In the world of self-publishing, the steps are a little different, but maybe the principle is the same.

The one thing that has really struck me, as I’ve been getting down to writing one project that I’ve been thinking about for a couple years, is how wasteful it is to go over the same ground again and again simply because I didn’t complete the job earlier. I have file folders, backup hard drives, and memory sticks filled with documents. I’ve downloaded hundreds of pdfs from Google or the Internet Archive. I have a stack of index cards nearly four inches high, two partial bibliographies in Endnote and one in Sente. And I have a half dozen outlines and drafts.

It’s good that I’ve been thinking about this project as long as I have been, and it will probably be a better end product because of it. But next time, I’m going to try to be a little more careful about identifying the material I’m collecting, and writing about it as I’m collecting it. In real-time.

Maybe I thought I wasn’t ready to actually start writing this, or maybe I was just lazy – or too excited about the research. You know how it is: one link leads to another, and soon you’ve got gigabytes of great material. But now that it’s writing time, I need to go back over all this material, rediscovering the paths I followed that led me to these records and relearning how they all fit together. Makes me wish I could have been a little more detail-oriented on the front end.

So I’m trying to build a single bibliography for this new, potential project I’ve just discovered. I’m connecting the documents to the entries in Endnote, so I’ll know where they are (and I won’t have to wonder where the most recent ones are!). I’m writing little abstracts and synopses now, so when the time comes I’ll understand how it all fits together and where each record fits in the story. I’ve even got a timeline and a cast of characters that I can add to anytime between now and whenever I really start this project.

Wish I would’ve started this sooner! The original project I came to grad school thinking about is still out there on a back burner. That folder on the backup drive measures about 29 gigabytes, and some of the files date back to 2006. It will be fun revisiting all that stuff someday. But very expensive.

Surviving a Book Edit

Philip White

The title of this blog post might have drawn more readers if it read “Surviving a Shark Attack” or “Surviving a Tsunami” but, though it may lack the same drama, I hope this particular musing will be more useful for the would-be book writer.

I have been working on my book (shameless plug alert!), Our Supreme Task: How Winston Churchill’s Iron Curtain Speech Defined the Cold War Alliance, since March 2009. From its genesis, it has gone through multiple metamorphoses, with entire chapters re-written and axed, new sources discovered and integrated, and days spent at the Churchill Archives Centre, the National Churchill Museum, the Harry S. Truman Library and other archival treasure troves.

When I first settled on September 1 as my manuscript submission date, almost nine months ago, it seemed a lifetime away. After all, I’d already put hundreds or even thousands of hours into the project, had what I thought were five complete chapters (of 11) on my hard drive, and was rolling along with the remainder.

However, the deadline that once seemed so far off soon appeared right before my nose, like the knights caught unawares by Sir Lancelot in Monty Python and the Holy Grail. Liz Murphy, archivist at the National Churchill Museum, had come across a batch of pertinent Churchill letters just days before the deadline, and I was still hurrying to incorporate this new material. I was also hastily acquiring rights for photos from the Potsdam Conference and Churchill’s 1946 visit to the U.S., while trying to cut bloat from certain chapters. Arrggh! I thought I had this under control! How did it become this mad panic?

Anyway, I got the manuscript and images away a couple days early, and took a deep breath. Two weeks later, my editor mailed back a Yellow Pages-sized packet of paper, with red pen to indicate her first-read comments and blue pen to show comments from the second pass. The first eight chapters were smooth sailing, but numbers nine and eleven were anything but – too much detail, too long, too everything other than ready to go to print. So I spent an entire day cutting away, and eventually, after four and a half days of hard work, sent back my response to her comments. In the midst of cutting almost 20,000 words, re-formatting a chapter and putting my pride to the sword, here’s what I learned:

Organization

As I’ve written before on this blog, I am not a naturally organized person. But I’ve developed some habits and systems to force myself to be less haphazard and they’ve proved effective. When I first opened the UPS envelope from PublicAffairs, I laid each chapter face down in its own pile on the kitchen table, with chapter one on the far left. Completed chapters went into a “Sept 2011 edits” folder.

As I moved through each section, I jotted down notes on my tablet to remind me about global changes, such as replacing the use of “C-T Day” (referring to Churchill and Truman’s March 6, 1946 visit to Fulton, Mo.) with “Churchill-Truman Day,” and removing overly complicated numerical details. I then addressed these as I went along. Though the temptation was there to discover the scope of my challenge, I did not so much as peek at chapter two before I was done with chapter one. It was agony. Nonetheless, these simple steps proved highly effective.

Venue

I recently read an old article on David McCullough’s writing, and discovered he works in a converted shed in his back garden. He built this haven so his grandkids wouldn’t have to tiptoe around the house while he was working, and so that he could focus. The bonus disc in the HBO adaptation of John Adams also features this hideaway.

Now, I doubt my tyrannical homeowner’s association would tolerate such a structure even if I could summon the practical muster to build one (my wife will laugh when she reads this, as I can barely hammer a nail into the wall to hang a picture). So, when it came time to hunker down I left the family at home and went to the library at my alma mater, and when it closed, to my local Starbucks. Hey, good enough for Obama’s chief speechwriter, good enough for me. The combo of a large desk and silence at the former and my noise-canceling headphones and enough caffeine to kill a small horse at the latter did the job.

Know Thy Limits

With the aforementioned caffeine coursing through me and my enthusiasm stoked, I wanted to plough through the night on the first day of this process. It wouldn’t be the first time. But about six months ago I “hit the wall,” as a friend and fellow writer describes it – I can no longer work until 3:00 a.m., get up three hours later and repeat as needed. So I stopped at 1:45 a.m. that night, got six hours’ sleep, and then put the stovetop espresso maker back on. I had a lot more clarity in both my main job and the editing process than if I’d pushed myself to the limit of exhaustion.

Balance

Though much of the weekend was a write-off, I spent at least two hours with my sons and wife each day, worked out, and got enough sunshine to replenish my vitamin D levels so I didn’t feel like a cave troll. When pushing hard to make a deadline, it is tempting to shut every other part of your world down, but that’s counter-productive. By making time for myself and those around me, I kept myself focused and emotionally stable when I returned to my labors.

Receptivity

When you’ve spent more than two and a half years on a book, you become too close to it and the people who inhabit its pages, to the detriment of perspective and the authorial agenda – i.e. what to keep and what to discard. In my case, I wanted to honor the time commitment of each person I interviewed by recording as much of their stories on the page as possible. This added depth to the narrative and gave history a human touch, but it also slowed down the pace and distracted the reader (my editor).

Initially, I reacted poorly to the red and blue ink on the page – particularly in the chapters with entire pages crossed out. But once I’d examined my motivation for keeping those passages and recognized that it might not be constructive, I got over myself and forged ahead. That being said, there were certain sections she wanted to cut that I knew should stay, so I retained them and was ready to advocate for them. To develop and maintain a productive relationship with your editor, you must trust them and recognize that their comments are going to make your work better – ego be darned!

Commitment

“Good enough” is not good enough! It’s pointless to put in full effort in the research and writing phases if you’re going to phone it in during editing. Sure, you may be sick of the sight of your manuscript, but you must close out strong for your book to be its best. If you need to ask for a few extra days so you can do another complete read, then do so. Above all, don’t submit your final version until you’re sure you’ve done everything possible to make it a success.

To those of you who’ve also been through book submission and editing, or indeed thesis review, I pose a question: What have you learned about the process and yourself?

Notes from Grad School: Teaching Writing

Dan Allosso

As I’ve prepared this summer to resume my role as a teaching assistant at a large, public, East-Coast research university, I’ve been reflecting on the responsibility that goes with that assignment. Most of the lower-level courses offered to undergraduates by my department fill the university’s “general education” requirement, which means that in addition to the historical or diversity outcomes these classes are designed to achieve, many of them also satisfy the university’s writing requirement. So as well as leading discussions on the readings and answering questions arising from lectures, I am a writing teacher.

I happen to like writing, and I’ve had some experience with writing and teaching prior to becoming a grad student. This experience isn’t completely unique (the grad student in the next office was a journalist), but for those of us who didn’t come with these skills, the university doesn’t really do much to prepare us as writing teachers.

I don’t say this to criticize my particular school. Twenty years ago, when my father was earning his PhD in Comparative Literature at a major West-Coast university, the situation was similar. My Dad, whose main interest was teaching literature to young people, made a career there (following his earlier career as a high school English teacher) and wrote A Short Handbook for Writing Essays about Literature, which has been in constant use there ever since.

Looking at the resources available for people like me, who teach writing outside of English departments, I saw clearly that a concise, practical, nuts-and-bolts writing handbook was as needed today in History as it had been twenty years ago in Comp. Lit. So I started with my father’s manuscript, and tried to expand it for use by social science as well as humanities students. It was a fun opportunity to reflect on the thought process he had gone through in writing his handbook, and then to engage in a sort of dialogue across the years. The advantage for me was that I was also able to email my revisions and expansions to my dad in California and get his reactions.

The result of this summer project is A Short Handbook for Writing Essays in the Humanities and Social Sciences. At 80 pages, it’s about twice as long as the earlier handbook, and it includes topics and examples geared for history students as well as readers of literature (although I’m hoping that exposure to examples from outside their specific fields will help make some of these ideas clearer for readers). I hope to use this in the fall, if I can convince the professor and the other TAs I’m working with to let me test it on our students. I’m also thinking of posting some short YouTube videos covering the main ideas of each chapter. One of my Dad’s original motivations was to get all the basics down, so that he wouldn’t have to repeat himself every time a student came to his office with questions. As I’ve mentioned once or twice before, I think the web offers us an incredible opportunity to reach out to people both inside and outside our classrooms with material they can use in whatever field they pursue.

Acknowledging Shortcomings as a Writer

Philip White

As a writer, you get to know pretty quickly what you’re naturally good at. My good lady wife is the best untrained copy editor I’ve come across and, as Erik Larson says about his other half, my “secret weapon” in writing my next book. A good friend is a grammar wizard, and what’s more (unlike me) he actually enjoys wielding his Chicago Manual of Style and Gregg Reference Manual. Another longtime buddy is the master at describing people and places, and holy crap I hate him for it!

Perhaps you also, in times of honest self-appraisal, realize what your weaknesses are. I lack the skills of the three mentioned above, and wonder how on Earth they acquired such gifts. The answer is obvious – a combination of acquired knowledge and God-given talent. It’s easy to get down on your inadequacies, to despair when staring your shortcomings in the eye, and to covet your neighbor’s grasp of the past participle!

But I’ve found that embracing what I lack and seeking assistance is actually quite a formative experience. By involving the aforementioned people in my writing and editing processes I’m submitting myself to an ongoing skills development program. And if you can check your ego enough to take constructive criticism from your spouse, you can certainly take it from any editor.

Next is to seek people who are willing to provide mentoring. I’m lucky enough to be related to one such person and seek regular guidance from my former college adviser on all matters regarding the written word. Another professor (and author of 15+ books) who’s based on the West Coast has guided me through the tangled web of the publishing industry. The keys to learning from these people? Humility, receptiveness to the opinions of people who know more than me, and a willingness to share my weaknesses openly.

Another way I am constantly trying to improve is by reading the work of writers I admire. This involves poring over all forms of books – history (Rick Atkinson, David McCullough), nonfiction (Larson, John Berendt) and historical fiction (Robert Harris, Juliet Barker), as well as keeping up on the latest features from the WSJ and magazines such as The Atlantic. The third individual I mentioned is the executive editor of a prominent culture magazine, and as he’s kindly put me on the mailing list (thanks again, Luke!), I am confronted by his brilliance once a month.

The idea here is that you become what you behold. So by focusing on those who are skilled writers, I’m hoping that I assimilate some of their powers of description, mastery of pacing, and brevity (hmmm, still working on that one, for sure).

I admit to not having arrived at a place where I can be satisfied with my writing, nor do I ever want to get to such a place. But by surrounding myself with talented people who can teach me something and reading the best work in the genres I dabble in, I’m better than I was yesterday, and tomorrow will be another step along the road to “writing well.”

The Writer’s Toolkit

Philip White

What makes a successful writer? Talent? Certainly. Knowledge of and passion for one’s subject? Absolutely. The ability to find a market for your wares? No doubt.

Yet without the proper tools, a writer, like any craftsperson, will face serious difficulties.

The best communicators through the ages have turned to the latest innovations to help them eke out words and a living. The Roman orator and statesman Cicero relied on his secretary Tiro’s invention of tablet-based shorthand (no, not an iPad 2!), a Royal Quiet Deluxe portable typewriter was Hemingway’s weapon of choice, and Raymond Chandler used a Dictaphone for the first drafts of his screenplays.

In the past few years I—a Twitter-averse, text message-avoiding, Facebook-shunning curmudgeon—have forced myself to find tools that eliminate paper-shuffling inefficiency and allow me to record late night thoughts that invariably evade me the next morning. (Putting such things out of reach of my four-year-old son, Johnny, and his two-year-old brother, Harry, was also a good move). So, here goes with my list of treasured tech tools, which see a lot more use than my dust-collecting hammer, screwdriver or pliers.

Speech Recognition Software + Mic

Winston Churchill tormented many a secretary with late-night transcribing duties and, while she’d make a fine scribe, I doubt my good lady wife, Nicole, would care to record my nocturnal babble. So, I turned to technology—namely, a wireless, Bluetooth-compatible microphone and a copy of Dragon Naturally Speaking. When my mother first purchased its predecessor, Dragon Dictate (complete with a very poorly designed dragon logo), some 15 years ago, this was a crude, ineffective technology that required more coaching than a preening, high-strung NFL wide receiver. Now, it’s quick, user-friendly, and mostly accurate. Although you can’t eliminate the occasional hilarious gaffe—I don’t think Joe Stalin would’ve cared for what my speech recognition software calls him.

Tablet + Stylus

We’ve moved on a little since the aforementioned Tironian notes were in their heyday, but the concept is similar—take a stylus and mark semi-intelligible scribble on a tablet. Now, I know there are a lot of Apple fanboys and girls who will want to string me up for saying this, but, despite its numerous merits, the iPad is not the best thing for the job. That honor goes to the HTC Flyer, whose seven-inch form and handy “Magic Pen” make on-the-go note taking a cinch. The real power is the wireless synch with Evernote, which runs OCR on your notes so you can find certain words later with a full content search from any device. And if it can read my scrawl, it can read anything. Another bonus is the ability to highlight and annotate within e-books downloaded from the Kobo store—as close to marking up a real paper and glue copy as you can get on a slab of aluminum and plastic. Yes, having to pay extra for the stylus at Best Buy is a fine example of tech company money grabbing, but to me, at least, it was worth it.

Olympus Phone Call Recording Thingy

OK, before you think I’m going all 1984 here, I try to tell interviewees that I am going to record our conversation lest they outpace my makeshift shorthand. The technology here is simple: a tiny microphone that sits in my ear and records both sides of any phone conversation on my voice recorder. Just like the NSA (totally kidding, noble overlords). I then connect the recorder via a USB cable and rip the file right into iTunes. Love this gizmo, except the droning of my own voice. (Who, except talk show hosts, politicians, and Charlie Sheen actually likes to hear themselves talk?) There has to be something similar for the iPhone, iPad, iWhatever, too, Applenistas.

How about you, dear readers? Do you have any tech toys/tools that you’d find it hard to live and write without?

Less is More in Elgin Park, and in Writing

Heather Cox Richardson

Michael Paul Smith has been described as the “Mayor of Elgin Park,” a town he has created entirely through photographs posted on the internet. Elgin Park is a town in the American Midwest, constructed as if it were the 1950s, without any inhabitants. In the photographs visible on the web, it appears to be a real town. But Elgin Park exists only in pixels.

My first reaction to the models created by Mr. Smith was to be a bit creeped out. The recreation of a “perfect” model of an imaginary idyllic past, documented in photographs, seems too close for comfort to the world of History as Fantasy that historians so abhor.

But looking at Mr. Smith as an artist rather than looking at his art as historical representation offers an interesting perspective on writing history.

Mr. Smith explains that he works hard to make sure he does not provide too much information in his images. He leaves room for the viewer to project himself or herself into the photograph, using his or her own eyes and emotions to fill in details.

“Things visually ‘read’ better when the amount of information is kept in check,” Mr. Smith notes. “The brain / eye / emotions will fill in the details, even when there is minimum amount of data available. On the other hand, there can be too much information. When that happens, you end up with a literal representation of something and very little room for personal interpretation. The more the viewer can project themselves into something, the more powerful it becomes.”

This struck a chord with me because it is precisely what my wonderful editor hammered home when we worked together on a recent project. She insisted on chopping all my sentences in half. While I worried the resulting simplicity would insult readers by suggesting I thought they were stupid, she held her ground and told me the book itself read better with very simple prose. I came—eventually—to see that long complicated sentences and drawn-out paragraphs commandeer all a reader’s attention, making him or her work at deciphering the mechanics of the prose. This can be useful if the writer’s idea is to focus, as certain theoreticians do, on words and their meaning. But for historians exploring other aspects of our field, it serves no real purpose. With complicated writing, a story never comes to life. Instead, it sits stubbornly on the page, imprisoned in a tangle of words.

Simple sentences, like Mr. Smith’s uncluttered images, free a reader’s mind to fill in the ideas and the emotions of a story.

Laura Ingalls Wilder, the author of the Little House books, was a master of creating evocative scenes with very simple sentences. Here, for example, she describes a department store in Dakota Territory:

The inside of the store was all new, and still smelled of pine shavings. It had, too, the faint starchy smell of bolts of new cloth. Behind two long counters, all along both walls ran long shelves, stacked to the ceiling with bolts of muslin and calicoes and lawns, challis and cashmeres and flannels and even silks.

There were no groceries, and no hardware, no shoes or tools. In the whole store there was nothing but dry goods. Laura had never before seen a store where nothing was sold but dry goods.

At her right hand was a short counter-top of glass, and inside it were cards of all kinds of buttons, and papers of needles and pins. On the counter beside it, a rack was full of spools of thread of every color. Those colored threads were beautiful in the light from the windows. (Little Town on the Prairie, 1941, p. 48).

In nine sentences, she suggests the look and feel of a brand new store, the excitement it generates, and the isolation and poverty in which Laura has always lived. Wilder has left room for her readers to imagine the scene, rather than forcing us to use all our mental energy on her prose.

While a simple style is certainly not the only way to write evocatively, it is one that historians, especially beginning historians, should not shun in the fear that they will look stupid if they don’t write in tangles. As Mr. Smith says, less can often be more.

Standards of Citation and the Internet

Bland Whitley

Why do we cite sources? I imagine that for most of us, annotating work has become second nature to such a degree that we rarely think about why exactly we’re doing it. I’ll stress two main reasons, though I’m sure others could think of different rationales. The first is a kind of reflexive establishment of scholarly bona fides. As undergrad and grad students, we were taught to base our arguments on the sources and authorities we consulted—you may vaguely recall those dreary high school assignments that required some minimum number of sources. All of this remains of course an essential building block in the development of historical understanding. It is through immersion in a variety of sources that we learn to build arguments out of a variety of competing claims and to establish a sense of the relative reliability of different texts and evidence. The second reason grows out of scholars’ relationship with one another. Whether collaborating or arguing, scholars require access to the evidence that informs particular arguments. Although these rationales are not mutually exclusive (they often reinforce one another), the second should command greater respect. Leading other scholars to one’s evidence, so that they can reach similar or very different conclusions, is what citation should deliver. Too often, though, we can all find ourselves practicing a strategy of citation for citation’s sake.

I’ve been thinking about these issues because of an interesting debate that has played out on a couple of listservs during the previous two weeks (H-SHEAR, geared toward historians of the early republic, and SEDIT-L, which serves scholarly editors). Daniel Feller, senior editor of the Papers of Andrew Jackson, kicked things off with an impassioned critique of lazy citations of material culled from the web. Singling out a few different recent works that have quoted passages from important addresses made by Jackson during his presidency, Feller found that the works were citing either internet sites of dubious scholarly quality, one of which was no longer live, or obscure older works that neither improved on contemporary versions of the text nor took advantage of the contextualizing annotations of modern versions. Why should this be the case, Feller asked. It’s not hard to find print versions of the original sources for Jackson’s addresses. Indeed, it’s never been easier, as all can be found either on Google Books, or through the Library of Congress’s American Memory site. Instead of taking a couple of extra minutes to track down better and more useful source material, the authors had stopped searching after finding the desired text on whatever website seemed halfway professional and then cited the link, no matter that such links frequently have the shelf life of a clementine.

The response to Feller’s post has ranged from attaboys from traditionalists who view the internet as little more than a dumping ground/series of tubes for scholarly quacks, to condemnation of yet another attempt by an academic to marginalize “amateurs.” (Why is it that all listserv conversations seem to devolve into a spat between angry researchers impatient with professional norms and defenders of some mythical historical establishment?) One commentator referred to articles that have analyzed the high percentage of historical citations of websites that have become defunct, a phenomenon known as link rot. Another pointed out that citing a website that may soon go dead isn’t really all that different from citing an unpublished conference paper or oral history—in neither case is the source material truly available to anyone else. Feller, of course, wasn’t really criticizing publishing or citing material on the web. He was warning that the proliferation of source material on the web has degraded historians’ citation standards.

There are two issues at work here. First, how do we handle link rot? This is a conundrum with no easy solution. Increasingly, all people interested in history, scholars and aficionados alike, will be getting much of their information from the web. What is our responsibility for ensuring that others can check our source material? If we have a reasonable expectation that a given website might not be around for very long, should we even bother citing it? If source material becomes problematic simply because of the ephemeral nature of the venue on which it is found, however reputable, how do we convey its legitimacy as evidence? The second issue relates to the question of what constitutes an authoritative text. The web has dramatically expanded researchers’ capacity to obtain and analyze primary and secondary sources—public records, newspapers, transcripts or digitized scans of correspondence, and obscure county histories, formerly accessible to only the most dogged and sophisticated researchers, are now readily available to anyone. But the web has done all this at random. The Eye of Google™ gazes upon some works but not others. Outdated and overly restrictive copyright laws prevent the sharing of many works. Researchers looking for specific texts to buttress their arguments encounter (through the workings of the search engine) sources that they otherwise would never have considered consulting. Before, researchers would have learned what specific sources one needed to look up when seeking the text of, say, the electrifying second annual message of Millard Fillmore. Now, enter a few key words, and voilà: http://www.presidency.ucsb.edu/ws/index.php?pid=29492#axzz1LVft8YVF. Maybe you’re more interested in Fillmore’s controversial 3d annual message and prefer it from a printed work? Boom: http://books.google.com/books?id=muPv6F0gm1kC&pg=PA209&dq=%22millard+fillmore%22+%22annual+message%22&cd=8#v=onepage&q=%22millard%20fillmore%22%20%22annual%20message%22&f=false

Is the above http address a legitimate source for citation? It’s a well-done, university-backed website, and I can only assume (having neither the time nor inclination to verify) that the text is presented accurately. I certainly wouldn’t hesitate to direct students to it. So why not? Well, what if UC-Santa Barbara loses the site’s funding, or otherwise decides to pull it, and the page goes dead? Can we depend on other researchers to retrieve it from some archived site (the Internet Archive’s Wayback Machine)? What about the printed source? What of a recent reprint of James D. Richardson (something of the court historian for the nineteenth-century presidency)? Perhaps you’re interested in U.S. relations with Cuba and needed to discuss the Fillmore administration’s rejection of British and French entreaties to forswear annexation of the island. That’s covered in the edition (p. 212), so you could cite it as a source. But beware, Google only offers a summary view of the book. Although you might be accurate in locating Fillmore’s rejection of the British-French tripartite arrangement, you’d be obscuring the incompleteness of the edition you consulted. Rather than helping other researchers, the citation would simply reflect the ease with which specific texts can be found on the web. In cases where the source is not unique (unlike, say, a manuscript letter, diary, or newspaper), citation, when it’s necessary at all, should go beyond merely indicating where one viewed the text. It should point readers to the scholarly apparatus that makes the particular source useful and authoritative.
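For the technically inclined, here is a minimal sketch, in Python, of one way to act on that worry before citing: ask the Internet Archive's public Wayback Machine "availability" API whether a cited URL has an archived snapshot that could be cited alongside the live link. (The Fillmore address above stands in as the cited source; the helper function and its name are my own illustration, not anyone's prescribed workflow.)

import json
import urllib.parse
import urllib.request

def closest_snapshot(url):
    """Return the URL of the closest Wayback Machine snapshot of a page, or None."""
    query = urllib.parse.urlencode({"url": url})
    with urllib.request.urlopen("https://archive.org/wayback/available?" + query) as resp:
        data = json.load(resp)
    snap = data.get("archived_snapshots", {}).get("closest")
    return snap["url"] if snap and snap.get("available") else None

cited = "http://www.presidency.ucsb.edu/ws/index.php?pid=29492"
# Cite the snapshot URL alongside the live link so later readers have a fallback.
print(closest_snapshot(cited) or "No archived copy found; consider saving one before citing.")

Citing the snapshot next to the live link at least leaves a trail if the original host goes dark.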

There’s that word again—authoritative. Now we enter the realm of scholarly editors, who take a special interest in presenting historical and literary texts that are built for the long haul. I’m going to go out on a limb and guess that part of Feller’s justified pique grew out of a realization that not only were the Jacksonian scholars he reviewed citing somewhat dubious sources, they were not consulting The Papers of Andrew Jackson. I experience the same frustration in my work with the Papers of Thomas Jefferson. An all-too-standard pet peeve is coming across recent scholarship that cites, not our series, but Paul Leicester Ford’s earlier edition The Works of Thomas Jefferson. Now, there’s nothing wrong with Ford. If one is looking to quote TJ, many of his famous writings are covered in that edition. But Ford’s project was very different from the comprehensive, annotated approach undertaken by modern documentary editions. Not only do modern editions present text more accurately, they present it in context. The primary subjects’ words appear along with the incoming correspondence that might have prompted them. Annotations connect text to other primary sources, as well as to modern scholarship. There is, in short, a wealth of information, both critical and ancillary, that is useful to readers.

So why do so many people continue to rely on Ford? Because his edition has been scanned into Google Books and therefore is convenient for anyone unwilling or unable to search beyond a desktop. Now, I can understand that a lot of researchers out there may not have the institutional support of a major research library and therefore can find it a challenge to get to modern documentary editions. The volumes are expensive, and the work of getting them online (although ongoing) may not occur quickly enough to satisfy everyone—nor does it necessarily lower the price. Still, it seems to me that the facility of the web has encouraged a kind of entitled sensibility among many researchers, who become miffed when something is not available online for free. The kind of scholarship that fills documentary editions costs money, though. Editions may or may not have the ability to publish online with no expectation of remuneration—university presses do, after all, require some return. The internet, however, has untethered the free consumption of information from its labor-intensive production. Too many researchers, accustomed to getting so much of their information for free from the comfort of the coffee shop, seem increasingly unwilling to do the legwork necessary to gain access to superior sources. Instead they settle for the merely adequate. That’s a shame.

I don’t want to imply that there’s anything wrong with citing material from the web. It’s essential and will increasingly account for much of the information that ends up in our works, particularly as online publication becomes more prominent. We do need to be sensitive to the issue of link rot—the Chicago Manual has some useful hints in this regard, and I am hopeful that archivists and librarians, who are far more advanced in these matters, will come up with some viable solutions. More broadly, the bounty of the internet need not fundamentally alter what we choose to cite as evidence. Standards will and should evolve with the times, but we should not displace one set of works with another simply because the new batch is easily and freely obtainable. Any shift should be based on the responsibility we have to our readers to connect them with the best available sources, print or web-based.

Jane Kamensky on Learning from Fiction

Randall Stephens

Writing history is often a topic of discussion on this blog and animates the pages of Historically Speaking. (See some recent posts on writing here, here, and here.) Have a look at the right-hand column on this page. You'll see posts grouped under "How to Write," "Writing History," "Editing," and more.

In that vein, I'm happy to post below a selection from Jane Kamensky's lead essay on writing in the April issue of Historically Speaking. Kamensky provides useful advice on scene setting, prose, and style, drawing on the lessons of fiction. She brings the wisdom of experience. With Jill Lepore, she authored the novel Blindspot (Spiegel & Grau/Random House, 2008). Kamensky has also written The Exchange Artist: A Tale of High-Flying Speculation and America’s First Banking Collapse (Viking, 2008), a finalist for the 2009 George Washington Book Prize; and Governing the Tongue: The Politics of Speech in Early New England (Oxford University Press, 1997).

Jane Kamensky, "Novelties: A Historian's Field Notes from Fiction,"
Historically Speaking (April 2011)

Here in the twilight of the Enlightenment, academic historians have fallen in love with how little we can know. Over the last fifty years, people, events, even places in the past have grown more obscure to many of us. Compare a work of history written in 1960 to one published in 2010, and you might wonder whether the mists of time have somehow thickened.

Can aspects of the novelist’s imagination help us to cut through the fog? Two years ago, the historian Jill Lepore and I published a novel we wrote together. Set in Boston in 1764, Blindspot started out as a lark, a gift for a friend. It grew into a project that felt important, even urgent, to us as scholars: a different way of knowing and telling the past. What follows are nine lessons learned in that effort to conjure a known and knowable world: a Then as real as Now, in our minds and on our pages.

1. Face It

Most historians suffer from prosopagnosia: face blindness. My co-author and I had written a goodly number of pages when it dawned on us that we had yet to tell our readers what our two first-person narrators looked like. In a novel that is, in large measure, about seeing, such description seemed a matter of duty. Our readers, not to mention our narrators themselves, needed to know how tall Fanny and Jamie stood, the color of their hair, the cut of their proverbial jibs.

How tough could such an accounting be? This was fiction, after all; we answered only to our characters. But confronted with this delectable task, we promptly choked. Their eyes, how they twinkled; their dimples, how merry: it seemed we had naught but rank cliché at our fingertips.

How do you take stock of a human face? Every time you walk into a bus, a bar, or a classroom, you take people’s mettle visually, instantly, almost without thinking. But the sheer narrative terror of that moment made me realize that, as historians, we seldom confront the embodied nature of past individuals. We’re capable of writing the history of the self, or the history of the body, or even the history of sexuality, without crafting characters capable of staring back at us, as a good portrait does.

Writers of fiction give their characters faces and yea, even bodies, in a variety of ways. Consider this description, so thorough and meticulous that it bends in spots toward inventory:

Thomas Cromwell is now a little over forty years old. He is a man of strong build, not tall. Various expressions are available to his face, and one is readable: an expression of stifled amusement. His hair is dark, heavy and waving, and his small eyes, which are of very strong sight, light up in conversation: so the Spanish ambassador will tell us, quite soon. It is said he knows by heart the entire New Testament in Latin, and so as a servant of the cardinal is apt—ready with a text if abbots flounder. . . . [H]e is at home in courtroom or waterfront, bishop’s palace or inn yard. He can draft a contract, train a falcon, draw a map, stop a street fight, furnish a house and fix a jury. He will quote you a nice point in the old authors, from Plato to Plautus and back again. He knows new poetry, and can say it in Italian. He works all hours, first up and last to bed. He makes money and he spends it. He will take a bet on anything.1

Cromwell, of course, is a character from history and from fiction, in this case Hilary Mantel’s magnificent novel, Wolf Hall. Her description begins with a physical body, and a face, courtesy of Hans Holbein’s 1533 portrait. But then she peers through the eyes to the soul, as if she knows the guy, and her reader should, too.

Can historians do anything quite so wonderful? We don’t know the inner life of our subjects the way a novelist can know her characters. After all, a writer of fiction invents the soul whose windows the eyes become. Mantel’s Cromwell isn’t, can’t, and shouldn’t be history’s Cromwell. Thomas Cromwell merely lived; Mantel’s Cromwell soars. Yet almost every line in her description can be fully sourced: to the portrait, to Cromwell’s letters, to contemporaneous descriptions of the man. At bottom, Mantel’s path to knowing Cromwell isn’t all that different from a scholar’s. The magic comes in the author’s moral confidence in what she’s got—and then, of course, in the telling. Biographers, who live a long time with their subjects, offer readers hard-won, hard-working encapsulations of character all the time. Historians, trained to concentrate on the background at the expense of the figure in the portrait, do so less often than we might.

Of course, those who study remoter pasts and less celebrated people rarely even know what their subjects looked like. Yet no matter how obscure the actors, they had eyes and mouths, expressions and gestures that quickened the pulse of loved ones and triggered the loathing of enemies. Even when we cannot see the people we write about—perhaps especially then—we’d do well to remember that they weren’t made of paper, and didn’t pass their fleeting lives in acid-free boxes within temperature-controlled archives. They lived behind faces and within bodies, in heat and in cold, pleasure and pain, experiencing the present from the inside out. Their present became our past, and we’re stuck working from the outside in, from the page to the person. That’s no excuse for confusing the journey with the destination.

2. Taste It

The challenge of “facing” our subjects represents the merest tip of a vast and complex phenomenological iceberg. As a sometime novelist, I spent a lot of time presumptuously tasting, hearing, smelling, seeing, and feeling on my characters’ behalf. Since Blindspot is set in the sweltering summer of 1764, that wasn’t always pleasant.

The novelist is not alone here. In the last two decades the “history of the senses,” pioneered by scholars including Michael Baxandall and John Berger (sight), Alain Corbin (smell), and Richard Rath and Mark Smith (sound), among others, has become a flourishing subfield.2 I admire this work a great deal. But for all its sophistication, the history of the senses is as remote from sensorily rich history as the history of the body is from embodied history.

Because they create rather than discover a world, writers of fiction constantly index and mobilize the senses. Think of Proust’s madeleine, surely the most famous cookie in literature, whose lime-scented crumbs set off a four-page-long reverie that begins in the narrator’s aunt’s kitchen and spreads to encompass “the whole of Combray, and its surroundings, taking their proper shapes and growing solid . . . town and gardens alike, from my cup of tea.”3

In nonfiction writing it can be no coincidence that some of the best sensory-laden storytelling comes from authors not burdened by Ph.D.s. Consider two examples, each describing the day-to-day operations of the print trades in the 18th century. The first comes from a superb work of academic history, Jeffrey L. Pasley’s The Tyranny of Printers:

Though printing had its cerebral and prestigious aspects, it was still a dirty, smelly, physically demanding job. One of the first chores that would be delegated to a young apprentice printer was preparing the sheepskin balls used to ink the type. The skins were soaked in urine, stamped on daily for added softness, and finally wrung out by hand. The work got harder from there, and only a little more pleasant. Supplies of ink were often scarce in America, so printers frequently had to make it on site, by boiling lampblack (soot) in varnish (linseed oil and rosin). If the printing-office staff survived the noxious fumes and fire hazards of making ink, their persons and equipment nevertheless spent much of the workday covered in the stuff.4

This is lucid, economical writing, pointed toward a set of important questions about the role of printers in the emergent public sphere of the early United States.

Now compare Pasley’s to this description, by the journalist Adam Hochschild, of James Phillips’s London print shop, hard by the Bank of England, where a crucial meeting of Granville Sharp’s antislavery society took place in May 1787:

Type would be sitting in slanted wooden trays with compartments for the different letters; the compositors who lined it up into rows, letter by letter, would be working, as the day ended, by the light of tallow candles whose smoke, over the decades, would blacken the ceiling. . . . Around the sides of the room, stacks of dried sheets, the latest antislavery book or Quaker tract, would await folding and binding. And finally, the most distinctive thing about an eighteenth-century print shop was its smell. To ink the type as it sat on the bed of the press, printers used a wool-stuffed leather pad with a wooden handle. Because of its high ammonia content, the most convenient solvent to rinse off the ink residue that built up on these pads was printers’ urine. The pads were soaked in buckets of this, then strewn on the slightly sloping floor, where printers stepped on them as they worked, to wring them out and let the liquid drain away.5

Though the two passages rely on some of the same sources, Hochschild’s version owes as much to Dickens as to Pasley. It is specific and transporting rather than generic and distancing. Key differences reside in the sensory details: one paragraph, three senses. Sight: the blackened ceilings, the smoking tallow candles. Touch: compositors’ fingers flying over cast-metal type, the heft and texture of the wooden-handled pads, the disequilibrium of standing on that sloping floor. And of course smell: the close shop on a warm spring night reeking of piss as well as Enlightenment ideals.

These sensory details give Hochschild’s scene volume. But they do more than that. The sight, feel, and smell of the shop impart a frisson of opposites—these are “unlikely surroundings,” as Hochschild puts it, for a key moment in the transformation of humanitarian thought. Then, quickly, we’re on to the substance of that meeting, an intellectual history drawn from tract literature. Sensory does not mean sensational.

The Allure of Narrative Non-Fiction

Philip White

What on earth is “narrative non-fiction” exactly? John Berendt, one of the genre’s luminaries, gives this explanation on Penguin’s page for his mesmerizing book on the Fenice Opera House fire in Venice, The City of Falling Angels:

I write in the form of what has been called the New Journalism, or Narrative Nonfiction, or even Literary Nonfiction. Simply put, I write true stories in the style of short stories and novels. I use the literary techniques of fiction writers: extended dialogue, detailed descriptions, the imposition of a narrative structure with action moving from scene to scene.

“Detailed descriptions” is something of an understatement. Exhibit A: Berendt’s lyrical representation of Venice’s master glass blower, Signor Seguso, from chapter one: “Signor Seguso waited patiently at the table. He was eighty-six—tall, thin, his posture still erect. A fringe of wispy white hair and flowing eyebrows gave him the look of a kindly sorcerer, full of wonder and surprise. He had an animated face and sparkling eyes that captivated everyone who met him.” You don’t find that kind of thing in your average history book. Too often historians are face-blind, forgetting that the first thing we instinctively look at when we meet someone, the sight that gives babies comfort when they look at their parents, is the visage. After his exploration of Seguso’s face and the response it provokes, Berendt dedicates several paragraphs to the gentleman’s hands. Again, this is not standard fare outside the fiction realm, but it gives his characters a depth that makes them vivid and memorable.

Since a friend enthusiastically pressed his copy of Berendt’s book into my hands in late 2006—I devoured its engaging, atmospheric pages over two extended visits to the sun-soaked summer patio of my local coffee shop, visits that resulted in a wicked sunburn—seeking out standout narrative non-fiction has become a passion. I’ve discovered that there are two variations within the genre. First is the first-person, participatory kind (not to be confused with the irksome Amateur Hour that is “citizen journalism”) that Berendt writes in The City of Falling Angels and his ode to the mysteries of Savannah, Midnight in the Garden of Good and Evil. In books of this sort, the visual descriptions of and reactions to people are exclusive to the writer, and colored by his or her real-life experiences. Continuing the traditions of Stephen Crane, Ernest Hemingway and, later, Hunter S. Thompson and Truman Capote (to name just a few of the past masters), we start to see the world—people, buildings, customs, oddities—through Berendt’s eyes: his Venice and his Savannah.

Second are the third-person historical books penned by Jennet Conant and Erik Larson. These are typically told at a distance from past events, but without seeming detached. In The Irregulars, Conant creates an absorbing profile of beloved children’s author Roald Dahl during his time as a covert British agent in Washington in 1942. Who knew the writer of The BFG (my favorite book as a lad) was a wartime spy? Or that he worked with James Bond author Ian Fleming and future advertising legend David Ogilvy in Britain’s propaganda bureau?

Even more fascinating than Conant’s exploration of Dahl’s espionage and hobnobbing with the Who’s Who of Washington society is how she depicts his growth as a writer. During his time in the U.S., Dahl penned dark, Poe-like short stories for Collier’s, The New Yorker, and Harper’s; his first children’s book, The Gremlins; and, fittingly, “Shot Down Over Libya,” an account of being downed while flying over North Africa early in World War II that the Saturday Evening Post picked up.

Conant also brought inventor, amateur scientist, and Wall Street tycoon Alfred Loomis to life in Tuxedo Park, and in 109 East Palace created arguably the most accessible, human exploration to date of the Manhattan Project and its scientific director, Robert Oppenheimer. Conant’s use of exclusive journals, unpublished manuscripts, and family letters informs her prose with rich personal detail, and she develops the relationships between her protagonists as if writing a screenplay, with compassion, wit, and candor. For her next project, Conant has delved into another little-known facet of a famous person’s life—the covert government work of Julia Child and her husband and friends. Again, who would have thought the TV chef had it in her?

Like Conant and Berendt, Erik Larson is a former journalist, whose reporter’s diligence in fact-finding is reflected in the multi-layered fabric of his art. He also paints his characters—from the brilliant yet vulnerable wireless pioneer Guglielmo Marconi in Thunderstruck to the charismatic and terrible serial killer Dr. H. H. Holmes in The Devil in the White City—with a nuanced brush and an insight into the paradoxes and contradictions that we all exhibit. Indeed, Leonardo DiCaprio was so intrigued by Holmes that he signed on to play the diabolical doc in the movie adaptation of the latter. In these two books, Larson also performs the precarious task of running dual, parallel storylines within the same narrative with the uncanny sprezzatura of a great film director. Though not as complex, his book Isaac’s Storm recreates a sense of authentic time and place—in this case, Galveston, Texas, at the turn of the twentieth century—by balancing the minutiae of his characters’ everyday lives with the overarching social, political, and scientific trends that defined the period. Next up for Larson is the story of an American family living in Nazi Germany. Needless to say, I’ve preordered it.

Have I plunged into hagiography? Perhaps. But I feel that the genuine merit of these three writers and their composition styles is often overlooked, despite the critical acclaim each of their books has received. The useful lessons I have learned from such texts include, first, the realization that microscopic details, which may seem trivial when examined individually, can be woven together to add color to a historical tale. Second, Larson has passed on the wisdom that when written and oral sources fail us, we can turn to photographs to fill in the visual gaps. Third, such volumes demonstrate that recording mannerisms, facial expressions, and turns of phrase on the page makes characters three-dimensional. And finally, this trio proves that the diligence, perspective, and objectivity of historical writing can be successfully fused with the storytelling, imagination, and suspense of fiction. Not an easy task, but it can be done.

Now if only bookstores would dedicate a section to this genre, instead of sticking Conant’s work in history, Berendt’s in journalism, and Larson’s in true crime . . .

Where Should the Thesis Go in a College Essay?

Jonathan Rees

My 11th grade English teacher was named Joan Goodman, and she was very particular about how she wanted us to write our essays. The first sentence was where the thesis went. I’m sure she didn’t put it this way, but the second sentence was where you would repeat the thesis in different words in case the person grading it was too stupid to get it the first time you wrote it. The rest of the first paragraph was for elaborating on your thesis as you began to foreshadow what would appear in the body of the essay.

Ms. Goodman told us that her method was the same method they used to teach writing to the cadets at West Point. I’ve never checked into that, but I believe it because she was equally regimented in the way she drilled her model into our heads. Ours was not to ask why. Ours was just to do or . . . Well, maybe not die, but at least get a grade too low for us to get into the Ivy League schools to which we all aspired. I internalized her method, and it served me well for a very long time, especially in history classes, where I substituted facts for quotes from the novel at hand.

As I don’t write college essays anymore, this structure no longer has a great impact on my own writing. It takes pages, not sentences, for me to get most of my arguments out, and thankfully the blog posts that I write, which are the length of some college essays, usually have no theses in them. (Otherwise, I doubt that I’d enjoy writing them so much.)

I do, however, subject my own students to the Joan Goodman/West Point writing model, even if I pride myself on being a little less martial about it than she was. If you’re writing a paper that’s longer than eight pages, there’s no reason you can’t have one of those flowery introductions that most English teachers seem to love. You’ve got a lot of space to fill. The same goes for people who like to put their thesis at the end of the first paragraph: if it’s going to be a long paper, there’s no reason you can’t use paragraph two to elaborate on what the thesis means and where the rest of the paper is headed.

However, when it comes to the four-to-six-page papers that are the bread and butter of the upper-level undergraduate history course, I might as well be a drill sergeant. Even though I don’t remember Ms. Goodman ever explaining it this way, I have come to see the first sentence as the prime real estate in any college essay. It is not just the only sentence where a student can be assured of their professor’s undivided attention; it is also the perfect place to set up an explanation of what the student is thinking (which has always been my main criterion for grading).

A few weeks ago in my labor history class, I got the worst pushback I’ve ever experienced on this from one of my students. “I’ll give you your first-sentence thesis, but next semester I’m going back to writing it the way I like it,” she told me. While I wish I had been quick-thinking enough to compliment her on her newfound flexibility, my response was slightly different. “I don’t want you to write this way because I tell you so,” I explained. “I want you to write this way because you think it’s the best way to write.”

It was at that point that I started singing. I don’t sing well, so I don’t do it often, but I do think it illustrates my reasoning (not to mention Joan Goodman’s) here well. Imagine an opera singer doing scales: she begins low, gradually gets higher, and ends with a note that catches your attention. The problem with that in a writing context is that every note in a first paragraph should catch your attention. That’s the only way that anyone can make a complex argument well. A good first paragraph, in other words, should be all high notes.

In my experience, students who put their thesis at the end of the first paragraph think their heavy lifting is then over. Without explanation and elaboration, the thesis falls by the wayside for the rest of the paper and I’m left reading mostly book summary. The end-of-paragraph thesis model is too often an excuse to stop thinking. Putting the thesis at the beginning forces students to explain what they mean in some detail before they ever get to the details of the history at hand.

I teach writing not just because I have to, but because I get better papers that way. This, in turn, makes my job more fun. So thank you, Joan Goodman (as well as a few other excellent English teachers in the Princeton, New Jersey public schools). You’re why I take my students’ complaints that I secretly wanted to be an English teacher as the highest form of compliment.