
The Meaning of “When”

Aaron Astor

I was recently asked to join a local committee to plan the centennial celebration of our local school district. The City of Maryville, Tennessee's public system was poised to commemorate its 100th anniversary in 2013 with lots of festivities and a nice, photo-filled book fleshing out the district’s long and storied past. But then the project hit a strange snag.  It came to the attention of the centennial committee chair that 2013 might not actually be the centennial of the Maryville City School system after all.

In something of a panic, the chairwoman sent me an email detailing her extensive search for the “true” date of the school system’s founding. The first schoolhouse appeared in Maryville as early as 1797.  Still, 1913 was important. It was the year Tennessee passed a compulsory education law (southern states were quite late in the game).  It was also the year that Maryville High School first planned its four-year curriculum, though the first class would not graduate until 1919.  However, the state approved a “special school district,” with taxing authority, for the city as early as 1905.  And the first major school buildings were not erected until 1911.  Adding to the confusion is the fact that there had been some semblance of schooling on the site of Maryville High School as early as 1867. Interestingly enough, it was known as the Maryville Freedmen’s Institute, and it served the relatively small ex-slave population of the county. As a final irony, the high school’s nickname is, you guessed it, the Rebels, despite the staunchly pro-Union leanings of Maryville and East Tennessee during the Civil War. The commemorative volume will surely delve into that oft-controversial piece of history.

But the question of dates persisted.  Before we could get into the thorny questions surrounding the school’s nickname, or the warm and fuzzy memories of graduating classes in years gone by, we had to determine if this was even the right time to do it.  And if we “discovered” that 1913 was the wrong founding date, should we then change our school district seal, which has the 1913 date on it?

And so the question boiled down to the meaning of “when”—as in, when was the school system founded? And more importantly, why does that matter?

Amusingly, this very same question, the meaning of “when,” came up when an old friend and colleague from grad school, Greg Downs of City College of New York, came down to the University of Tennessee and delivered a fascinating lecture on the “Ends of the Civil War.”  As he pointed out in colorful detail, the question of when the Civil War ended is a very difficult one to resolve. Lee’s surrender? Johnston’s surrender? The surrender of the last general in Texas? President Johnson’s declaration of an end to the state of war in 1866? Or myriad other moments in between? (See Heather Cox Richardson’s recent post on a related matter.)  As Downs pointed out, this was not a mere antiquarian question. It had real legal consequences. After all, if a state of war still existed, the US government could still apply martial law.  Local court systems could still be suspended. And so on. The question of “when” was inextricably bound to the questions of “why,” “how,” and, of course, “so what?”

Maryville's first school, founded in 1797
My answer to the Maryville City Schools Centennial Committee chair, then, was that it is up to us, as historians, to decide “when” the school system was founded. As long as we could make a compelling argument for why a given date made sense, there was no reason we couldn’t stick with it. The chairwoman was clearly relieved to hear from a professional historian that it was OK for us to “pick” a date. Any date, as she also concluded, would be somewhat arbitrary.

Much of the historical profession focuses on how events unfold, why they take the shape they do, what their significance is for later history, and how people of the time—and perhaps later on in the collective memory—make sense of those events. But it is quite rare that we really interrogate the “when” part of history, except insofar as we “complicate” earlier chronologies.   The reality is that every time we select a set of dates to bookend a historical phenomenon—a war, a revolution, a religious awakening, the establishment of the Maryville City School district—we are making a profoundly important argument about the very significance of the event itself.    
______________

Aaron Astor is Associate Professor of History at Maryville College in Tennessee, just a few miles from Great Smoky Mountains National Park.  He is author of Rebels on the Border: Civil War, Emancipation, and the Reconstruction of Kentucky and Missouri, 1860-1872 (LSU Press, 2012), which examines the transformation of grassroots black and white politics in the western border states during the Civil War era.  He earned his PhD in History at Northwestern University in 2006 and lives in Maryville with his wife, Samantha, and two children, Henry and Teddy.

The Day the Archives Walked in the Door

Eric B. Schultz

Alan Lomax (left) with Richard Queen of Soco
Junior Square Dance Team at the Mountain
Music Festival, Asheville, North Carolina,
mid-century. Courtesy of the Library of Congress.
I loved Randall’s latest post, which mixed music and archives, and reminded me how tricky it can be to capture and preserve historic “sound.”  It also brought to mind the story of Alan Lomax (1915-2002), one of America’s great music folklorists and archivists.  From 1937 to 1942, Lomax was a director in the newly-formed Archive of Folk Song in the Library of Congress, eventually collecting and preserving thousands of important and unique field recordings.

In 1938, Lomax sat Jelly Roll Morton (Ferdinand Joseph LaMothe, 1890-1941) down in a small auditorium at the Library and asked him if he knew how to play “Alabama Bound.”  Morton was in the twilight of his career, many years removed from his formative days in New Orleans, and prone to invention, including a birthday that made him old enough to have, as he proclaimed, “invented jazz.”  Lomax was skeptical of Morton in particular and of jazz in general, which he saw at the time as a destructive force threatening to overwhelm his beloved American folk music.

Morton began playing “Alabama Bound” and Lomax was stunned, saying later that Morton “played me the most beautiful ‘Alabama Bound’ that I had ever heard.” 

Recognizing suddenly the talent that had walked into his archives, Lomax charged upstairs, secured fifty blank aluminum discs from his boss, the Chief of the Music Division, grabbed a bottle of whiskey from his office to place on Morton’s piano for motivation, and returned to the auditorium.  Lomax’s next question, “Jelly Roll, where did you come from and where did it all begin?” would result in over twelve hours of recordings that included tales of New Orleans, names of the many musicians Morton remembered (or wanted us to think he remembered), and wonderful, marvelous music.  The recordings today can be purchased in an award-winning CD boxed set, or online in digital form.

Those who listen will know that Morton was not only a fount of knowledge and a gifted musician, but most appreciative of the whiskey, commenting throughout his interviews on its high quality.  (Ah, for the days when one could grab a loose bottle of whiskey from his office for the unexpected guest.)

The following year, Frederic Ramsey and Charles Edward Smith published Jazzmen, the essential building block for much of the written jazz history to follow.  Interviewing “every living jazz musician who could contribute factual material,” the authors collected stories and first-hand accounts, all of which turned out to be colorful and instructive, and some of which even turned out to be true.  They were diligent in their quest, moving from “the dives of Harlem, Chicago and New Orleans, to the rice fields of Louisiana, to Storyville, the now legendary red-light district of New Orleans, to reform schools, even to the last stopping place of at least two jazz pioneers, a hospital for the insane.”  In particular, they located and relied heavily upon Willie Geary “Bunk” Johnson, a brilliant early New Orleans cornetist who was rediscovered driving a truck for $1.75 a day during rice season in Louisiana, and nearly starving the rest of the year. 

It was the creation of this oral history by Ramsey and Smith that also led to the rediscovery of “King” Buddy Bolden, acknowledged by some of the very early New Orleans musicians as the first to play music that would come to be called jazz.  Morton’s famous assessment of Bolden: he was “the blowingest man since Gabriel.”

It seems clear from Lomax’s writings that in 1938 jazz had not yet attained status as a great American art form.  By 1950, however, he had begun to appreciate its power, writing, “Perhaps nothing in human history has spread across the earth so far, so fast as this New Orleans music.  Thirty years after its genesis it was as popular and understandable in New York, Paris, Prague, and Shanghai as in its own hometown.”

Thanks to Lomax’s fine ear and musical open-mindedness, and Morton’s superb rendition of “Alabama Bound,” we can download today on iTunes (speaking of things that spread rapidly across the earth!) Jelly Roll’s history and recordings of the great American art form.  Then, unlike Randall’s “cold basements with little sunlight,” we can sit in our sunny family rooms or apartments and enjoy selections from one of America’s finest musical archives.

San Francisco, the 1906 Earthquake, and the Progressive Era

Heather Cox Richardson

Recently, workers in San Francisco unearthed the ruins of the old city hall, destroyed in the 1906 earthquake.

That event has always fascinated me: not just the destruction caused by the quake itself, but the sense it has always given me that San Francisco in 1906 was a perfect example of Progressive Era America. Some of the worst destruction was caused not by the shaking earth, but by the fires that broke out in the cracked gas lines. As much as 90% of the damage in San Francisco has been attributed to the fires. The image of the flames roaring up the streets of the city is one of those details that makes history human . . . and, in this case, horrific.


Here, it seems to me, is a perfect image of both the potential of the Progressive Era to improve people’s lives—in this case, with gas lighting—and the deadly danger of that potential.

San Francisco has become for me the quintessential Progressive Era city for another reason, too. In 1905, a photographer attached a camera to a trolley car traveling along Market Street. The result was a nine-minute recording of urban life before the reforms of the Progressive Era. There are no stop signs, no traffic lights. Children are playing in the streets and running in front of the cars. People are walking, horses are pulling carts, and automobiles are in a free-for-all on undivided roads. It makes you realize how much of the world we take for granted today was, in fact, a product of the efforts of reformers to draw up some rules to make the modern world safer.

This film teaches really well (there are versions with music available on YouTube, too).

It’s chilling to realize that most of the people in the film, going about their errands on busy Market Street in 1905, awoke to horrifying shaking at 5:12 AM on April 18, 1906, watched the City Hall crumble, and ran from the billowing flames.

The Cold War Mentality of "A Nation at Risk"

Steven Cromack

“Our nation is at risk,” declared a 1983 report released by the National Commission on Excellence in Education.  The fallout from this short, simple report was astounding.  Its lucid words indicted the American education system and sparked national panic.  Schools across the country scrambled to assess their own standards, revised them, and implemented standardized tests.  Twenty-nine years later, however, as far as the state of American education is concerned, not all hope is lost.  The document was a product of a Cold War mentality: the Commission examined America’s schools under a microscope of fear. Was the United States losing the Cold War?  Only through education, through advancement in math, science, and literacy, could the Land of the Free defeat the Communist threat.  “A Nation at Risk,” its language, and its implications all reflect Cold War dogma. Examined in the era of globalization, the document makes clear that the Commission’s Cold War mindset failed to recognize that, in the midst of the conflict, America’s schools were not failing; instead, they were shaping the very global competition in which the United States finds itself in 2012.

Many Americans believed, as did the Commission, that the United States was not the great giant of innovation it once was.  The Commission asserted, “Our once unchallenged preeminence in commerce, industry, science, and technological innovation is being overtaken by competitors throughout the world.”  Like most Americans, the members of the Commission believed that America was falling far behind the lurking Communists.  America’s greatness, in their eyes, was drowning in its own falling standards.  The Commission echoed national fears: “The educational foundations of our society are presently being eroded by a rising tide of mediocrity that threatens our very future as a Nation and a people.”  Furthermore, they captured the cynicism of the American public with the declaration, “What was unimaginable a generation ago has begun to occur—others are matching and surpassing our educational attainments.”  It seemed to the Commission, and many others, that America was losing the Cold War. 

The language used in the subsequent paragraphs continued to examine the American educational system through the Cold War lens.  “We have squandered the gains in achievement made in the wake of the Sputnik challenge,” the Commission avowed, reaffirming the idea that America had fallen behind the Soviets.  The United States failed to maintain a competitive edge in science and industry.  Ultimately, the Commission argued that the underlying cause of this loss was the faltering education system.  Its members claimed, “We have dismantled essential support systems which helped make those gains possible.  We have, in effect, committed an act of unthinking, unilateral educational disarmament.”  Here, the Commission dropped the phrase that no politician during President Reagan’s first term dared use: “disarmament.”  Many fretted that disarmament would lead to defeat.  Dismantling armaments meant weakening the state.  In using disarmament as a metaphor for not stimulating education, the Commission highlighted its Cold War ideology.  It was a metaphor that reflected the period.
   
With the end of the Cold War came a new way to view an increasingly globalized world.  The economic boom of the 1990s, however illusory in part, led to a rise in per capita income.  How was such a boom possible just as a supposedly “functionally illiterate” generation entered the workforce?  In his book Catching Up or Leading the Way, Yong Zhao asked the question: If America is indeed a nation at risk, and if education is always on the decline, how does the United States maintain its competitiveness?  The Global Competitiveness Index rates nations on the level of prosperity brought to their citizens.  In 2007, the United States ranked number one of 131 countries (41).  Furthermore, the years between 1993 and 2003 saw a 40 percent increase in college graduations.  That decade also saw a 1 percent increase in the number of graduates holding science and engineering jobs (42).

In 2011, David von Drehle published an article in Time magazine titled “Don’t Bet against the United States.”  Like Zhao, von Drehle examined the concept of a “Nation at Risk” in the era of globalization and saw what the Commission, with its Cold War mentality, could not.  He argued that self-doubt drove the United States throughout the Cold War: from Nixon declaring that America was worse off since Eisenhower left office, to the “crisis of confidence” exuded by Carter.  It was easy to blame schools.  But, von Drehle asserted, “fallen trees don’t prove the forest is dying” (35).  Yes, reform is necessary, but America is not in decline; it just needs to refocus itself in the world it has created.  He concluded, “When more people in more countries are free to rise, to invent, to communicate, to dissent, it’s not the doom of U.S. leadership.  It is the triumph of the American way.”

This Cold War mindset meant that the Commission could not see America’s education system, uncertainty and all, as one of the country’s greatest strengths.  The American education system is nowhere near perfect.  But the United States must now refocus on its schools in order to maintain a competitive edge and to drive the competition that the future holds.

Wartime Swimsuits Storm the Beaches

This piece is cross-posted from Iron as Needed.

Nicole White


"During World War II, the pinup girl became popular. And wearing a skimpy swimsuit was patriotic -- it was considered doing your part for the war effort."
                                                    -Anna Cole, swimwear designer

Ava Gardner, Actress/Pin-up Girl
As enthusiastic crowds flock to sandy beaches this summer, swimsuits will be disappearing from store racks at a rapid rate. Once very modest and made of impractical fabrics such as wool, women's beachwear has drastically evolved since the early 1900s.

In the 1920s, Coco Chanel popularized the "sun tan" when she spent a bit too much time on the French Riviera and returned with a sun-kissed glow. Chanel's accidental tan was reason enough for women everywhere to adopt lying in the sun as a new form of relaxation. This hot new trend did wonders for the fashion world of swimwear.
Esther Williams Poolside in 1944
It was in the early 1940s, when war rationing extended to fabric, that the two-piece swimsuit baring some midriff really took off. Designers shortened tops and removed the extra skirt panel covering the thighs to save on fabric consumption, but still kept the navel strategically covered with a high-waisted bottom. Wartime pin-up girls like Ava Gardner, Esther Williams, and Rita Hayworth gained attention and heightened the popularity of these swimsuits among young females. 
Cole of California
Wartime Swimsuit Ad 
Fred Cole, a silent film actor and founder of Cole of California, transformed his family's knit underwear business into a swimwear success by bringing Hollywood glamour to the beach. During the war, Cole of California also made parachutes for the Air Force and marketed this tidbit in their swimsuit ads to boost sales among patriotic Americans.

When asked about upcoming swim fashions for an issue of The Evening Independent published on November 15, 1945, Cole said, "We want to keep 'em bare, but flattering. We want 'em functional, but beautiful. And the average figure is bad." Not sure if he'd get away with the latter part of that statement in today's society, but honest, nonetheless. The article went on to say, "With the average figure in mind, Mr. Cole does swim shorts in elasticized shirred treatments which have the effect of a girdle."
After the war was over, French designer Louis Reard debuted the bikini, which exposed much more skin than its predecessor. He named it the "bikini" after the Bikini Atoll in the Pacific Ocean, the site of U.S. nuclear tests. Simultaneously in 1946, Jacques Heim, another French designer, came out with his version, called it the "atome" (French for atom), and dubbed it "the world's smallest bathing suit." Reard then advertised his suit as "smaller than the world's smallest bathing suit." It was still considered improper to reveal one's navel in the '40s, so although the bikini was available, it was not worn by the masses until much later.
My Version of
Norma Kamali's Design
Norma Kamali's Fringed 40s Pin-up
Swimsuit on net-a-porter.com
Retro 40s pin-up style swimsuits are making a comeback this season. Designers such as Yves Saint Laurent, Chloé, and Dolce & Gabbana have all perfected the high-waisted two-piece delights. One of my favorite websites to virtually "window" shop is net-a-porter.com. I was recently looking for a retro swimsuit and stumbled across the most exquisite one I had ever laid eyes on, by Norma Kamali. After seeing its shocking price of $1,500 (and no, your eyes are not deceiving you), I knew I'd have to attempt sewing it myself. I purchased black swimsuit fabric and 17 yards of fringe. I had no idea how tedious sewing layer upon layer of fringe would be, or how challenging perfecting the fit would prove, until I started cutting and sewing. After the many hours spent constructing this suit, I now understand why it is listed for $1,500. Actually, quite a bargain after all! The most ironic thing about this little treasure is that net-a-porter.com states, "To get the best from this Norma Kamali piece we advise you do not wear it to swim." Happy lounging (I wouldn't dare set foot in the water wearing mine)!

When Is It Time to Stop Teaching Something?

Jonathan Rees

Those of us who teach the second half of the American survey course face a problem that only historians of the recent past ever seem to face: our period keeps expanding.  Until there’s some kind of mass meeting where we historians all decide to move the dividing line in a two-course US survey sequence from 1865 or 1877 to 1900 or thereabouts, what counts as “1877 to the Present” will only get larger.  This poses some problems for those of us who’d like to keep our courses current.

When I started teaching in the late 1990s, 1989 (with the fall of the Berlin Wall and all that) was a natural time to stop.  A few years ago, I rearranged my entire survey course to bring it up to September 11, 2001, without actually covering that day, since everyone I was teaching still remembered it perfectly.  Well, those days have changed.  Listening to my students talk, I realized it was time to recount the events of that day, and at least a few of the ones following it, because my students were barely cognizant of what was happening at the time yet have been living in its shadow ever since.

Besides needing to make room for the near present, I’ve been trying to update some of my other lectures from further back in light of recent scholarship.  When I first started talking about the 1970s, it was all Watergate all the time.  After all, It Seemed Like Nothing Happened was the first book on that decade to come out after it ended.  Jefferson Cowie has absolutely torpedoed that stereotype forever.  I’ve also tried to include some of the absolutely amazing material that’s been written about the rise of conservatism in recent years by people like Kim Phillips-Fein and Bethany Moreton.

My problem, therefore, has not been what to include in the new lectures I’ve been writing.  My problem has been what to cut out.  Cover new ground in any depth and something has to go.  Since I’ve also tried to redesign my course to include less lecturing, some of these cuts have been quite painful.

For example, I used to work for Stanley Kutler.  If you know Stan, you know that he was the first academic historian to write a book about Watergate.  When you get Stan to talk about Nixon, he won’t stop.  Therefore, I picked up an enormous amount of information about Watergate almost by osmosis.  I’ve cut my Watergate coverage down from a lecture all its own to about ten minutes.  It just doesn’t seem as important as it once did, anyway. 

Another subject from the survey class I used to cover in much greater detail is the New Deal.  That was two lectures:  First New Deal in the first one, Second New Deal in the second.  The Depression got a lecture all its own.  It still does, but I’ve got the New Deal down to one lecture by simply admitting to myself that the long string of Alphabet Soup programs that history teachers have been teaching since about the time that Roosevelt died is actually rather boring.  I now cover the programs that I think were crucial (NIRA, Social Security, NLRA, and a couple of others) and let my students read about most of the rest.

Similarly, I used to have one lecture for the Populists and another lecture for the Progressives.  Maybe that’s because I was taught by so many political historians as an undergraduate and graduate student, but I’d rather be talking about scholarship that dates from after I was born, thank you very much.  If I enjoy it, I think they’ll enjoy it more.  Just because you learned it is no reason that you have to cover the exact same material that your professors did. 

Ultimately, I think the question of coverage is the key here.  As Lendol Calder has been saying for years, our survey courses do not have to turn us all into walking encyclopedias.  (In fact, if we do our jobs right, many of our students will come back for more in upper-level courses.)  Since covering everything will only get harder as time marches on, perhaps it’s best to change your approach before defeat becomes inevitable.

Jonathan Rees is Professor of History at Colorado State University – Pueblo.  He blogs mostly about technological and academic labor matters at More or Less Bunk, but still writes about history there every once in a while.

Was There a Schlieffen Plan?

Steven Cromack

In a 1999 journal article published in War in History, historian Terence Zuber dropped a bombshell on the academic community.  He argued that the Schlieffen Plan, the supposed German attack plan in World War I, was a post-war construction written by the generals to justify why the Germans lost the war.  He based his argument not only on the primary sources that have been available since the war, but also on new sources that surfaced after the fall of the Berlin Wall.  Zuber’s individual pieces of evidence are circumstantial; taken together, however, they make a compelling case.  A few years later, he published a book on the topic with Oxford University Press (Inventing the Schlieffen Plan: German War Planning, 1871-1914).

“The Schlieffen Plan” was the attack plan supposedly articulated by Count Alfred von Schlieffen, Chief of the German General Staff.  It was Germany’s roadmap to war: if all went according to “the Plan,” Germany would deliberately start the war on its own terms in 1916.  It called for the rapid building of railroads across the country from west to east.  The attack would consist of the right wing invading Belgium and swinging wide around Paris to strike the city from the west, while the left flank remained stationary in Lorraine to hold off the likely French counterattack.  In Schlieffen’s eyes, France would surrender before letting anything happen to Paris.  Then, with France out of the war, the German army would use its new railroads to move troops across the country to the Eastern Front and knock out Russia.  As history “happened,” when entangling alliances ignited the so-called “powder keg” and launched the war earlier than the Germans had hoped, the Schlieffen Plan fell apart.  Schlieffen died, and his successor, Helmuth von Moltke the Younger, not only inherited the Plan but also altered it, or failed to understand it.  Moltke moved troops away from the West to bolster the Russian front.  “And the rest,” they say, “is history.”

Zuber challenged that history.  He wrote that there was no mention of a “Schlieffen Plan” before 1920.  Instead, he argued that when one historian wrote that Germany had employed the wrong strategy, the generals and other members of the General Staff, Kuhl, Ludendorff, Foerster, and Groener, countered with the myth that Schlieffen had conveyed his master plan to Moltke, but that Moltke had failed to understand it.  One should note that historians have based their histories of the war on the accounts of Ludendorff and others.

According to Zuber, Schlieffen did have some contingency plans, although they remained in his possession until he died and were not locked in the vault with the rest of the German war plans.  Zuber insisted that on its own the Plan, or Denkschrift, was a nightmare: poorly organized, and calling for troop numbers that never existed.  Schlieffen’s war games, as evident in his writings and handwritten diagrams, did not resemble the master plan, or anything close to it.  Zuber based this argument on a newly discovered German staff memorandum prepared by Major Wilhelm Dieckmann, a German officer whose task was to write a history of the war and who therefore had access to many of Schlieffen’s notes and war plans before Allied bombing during World War II destroyed them.  According to Zuber, Dieckmann’s manuscript revealed that Schlieffen’s “Plan” intended to keep the East strong and to hold off the French by defeating their fortification line.  Schlieffen never envisioned swinging wide around Paris and defeating the French army.  If this is true, then the Schlieffen Plan as we know it is wrong.

Zuber’s article and subsequent publications provoked a fifteen-year debate in War in History, especially between Zuber and historian Terence Holmes of Swansea University.  The debate over whether there was or was not a Schlieffen Plan continues to this day.  It has not, however, reached high school history textbooks, or even undergraduate classes on European history.  It seems that historians are having trouble grappling with Zuber’s uncomfortable argument.  And why wouldn’t they?  He insists, after all, that the academy has gotten World War I wrong for the last hundred years.  Such an assertion changes the interpretation and sequence of events.  Zuber writes that he seeks to “establish German military history according to the standard of Leopold von Ranke: ‘as it actually was.’”  He therefore concluded his article, “There never was a ‘Schlieffen Plan.’”

For those interested in the heated debate in War in History thus far, here is the “Roundup” from Zuber’s website (http://www.terencezuber.com/):

T. Zuber, 'The Schlieffen Plan Reconsidered', in: War in History, 1999; 3, pp. 262-305.

T. Holmes, 'A Reluctant March on Paris', in: War in History, 2001; 2, pp. 208-32.

T. Zuber, 'Terence Holmes Reinvents the Schlieffen Plan', in: War in History, 2001; 4, pp. 468-76.

T. Holmes, 'The Real Thing', in: War in History, 2002; 1, pp. 111-20.

T. Zuber, 'Terence Holmes Reinvents the Schlieffen Plan - Again', in: War in History, 2003; 1, pp. 92-101.

R. Foley, 'The Origins of the Schlieffen Plan', in: War in History, 2003; 2, pp. 222-32.

T. Holmes, 'Asking Schlieffen: A Further Reply to Terence Zuber', in: War in History, 2003; 4, pp. 464-79.

T. Zuber, 'The Schlieffen Plan Was an Orphan', in: War in History, 2004; 2, pp. 220-25.

R. Foley, 'The Real Schlieffen Plan', in: War in History, 2006; 1, pp. 91-115.

T. Zuber, 'The "Schlieffen Plan" and German War Guilt', in: War in History, 2007; 1, pp. 96-108.

A. Mombauer, 'Of War Plans and War Guilt: The Debate Surrounding the Schlieffen Plan', in: Journal of Strategic Studies, XXVIII, 2005.

T. Zuber, 'Everybody Knows There Was a "Schlieffen Plan": A Reply to Annika Mombauer', in: War in History, 2008; 1, pp. 92-101.

G. Gross, 'There Was a Schlieffen Plan: New Sources on the History of German War Planning', in: War in History, 2008; 4, pp. 389-431.

T. Zuber, 'There Never Was a "Schlieffen Plan"' (in preparation).

T. Holmes, 'All Present and Correct: The Verifiable Army of the Schlieffen Plan', in: War in History, 2009; 16 (1), pp. 98-115.

T. Zuber, 'The Schlieffen Plan's "Ghost Divisions" March Again: A Reply to Terence Holmes' (in preparation).

Beatles History Roundup

Steve Marinucci, "Beatles' Apple has nothing to fear from 'Strange Fruit' film," Examiner, April 21, 2012

It finally hits the street Tuesday, but the release of "Strange Fruit: The Beatles' Apple Records,” the new unauthorized documentary on the history of the record label founded by the Beatles, has had a few rough spots.

Amazon.com and at least one other dealer stopped selling it, but others continued to take orders. And though a few writers suggested it might not get released, it's been available all along direct from the distributor.
>>>

Jon Friedman, "Myth-busting, from The Beatles to Bowie’s Ziggy Stardust," Wall Street Journal, April 19, 2012

Ken Scott has as many great stories to tell as anyone in the rock and roll world. And he isn’t shy about sharing them.

Talk about being a fly on the wall. Scott was the engineer on The Beatles’ White Album, among other sessions by the fabled band, and the producer on David Bowie’s classic 1972 album, “The Rise and Fall of Ziggy Stardust and the Spiders from Mars.”
>>>

"Beatles unseen photos to be sold," BBC, April 22, 2012

Unseen photos of the Beatles are to go up for sale after lying in a family album for almost half a century.

The 20 black-and-white images show the band as they made their first film, A Hard Day's Night, in March 1964 at the Scala Theatre in London.
>>>

"Sir Peter Blake recreates The Beatles' 'Sgt. Pepper's Lonely Hearts Club Band' cover," Uncut, April 2012

The Beatles' iconic 'Sgt. Pepper's Lonely Hearts Club Band' album cover has been redesigned by original sleeve designer Peter Blake on his 80th birthday.

Noel Gallagher, Amy Winehouse, late Joy Division frontman Ian Curtis, The Rolling Stones frontman Mick Jagger and Paul Weller all feature in the new collage entitled 'Vintage Blake.'
>>>

Looking East to the Past

Randall Stephens

"Ostalgie." That's East German nostalgia for the quaint days of communism--drab, bunker-like, late-Stalinist architecture, watches that don't work, tiny little cars, and the romance of scarcity. I encountered a slice of that for the first time in the dark comedy film Good Bye Lenin! Since then I've been curious about this strange sort of public memory. A little like being nostalgic about
the Dust Bowl? (Watch the East German National Anthem scene from Top Secret, embedded here.)

A related travel piece that recently appeared in a UK paper caught my attention: Stephen McClarence, "Trains and Trabants," Yorkshire Post, 6 March 2011.

FEW cities are as haunted by their past and their politics as Berlin. Around almost every corner there’s a reminder of its turbulent 20th century history – and one of the most potent of those reminders is being celebrated, if that’s the word, this year.

August sees the 50th anniversary of the building of the Berlin Wall and I’ve come to explore what remains of it and the divisions it created across the city. It’s not, however, a wholly sober trip. I’ve signed up for a tour in a Trabant, the notorious national car of East Germany, and there’s the prospect of mainland Europe’s biggest department store. . .

The nostalgia flourishes in a movement called Ostalgie – “nostalgia for the East” – which is celebrated at the absorbing DDR Museum. It explores East Berlin life before the Wall came down in 1989. Visitors poke around a recreated 1970s flat, with its floral wallpaper, net curtains, cassette player, copies of Sputnik magazine... and radiators, we’re alarmed to see, exactly like our own at home.

In the late 1990s Daphne Berdahl wrote an influential essay on the phenomenon: "'(N)Ostalgie' for the Present: Memory, Longing, and East German Things," Ethnos: Journal of Anthropology 64:2 (1999). After the fall of the Berlin Wall, objects and goods from the East became instant camp for West Germans. But the shoddy products of the East also gave some weird, sentimental comfort. "In this business of Ostalgie," writes Berdahl, "East German products have taken on new meaning when used the second time around. Now stripped of their original context of an economy of scarcity or an oppressive regime, these products largely recall an East Germany that never existed. They thus illustrate not only the way in which memory is an interactive, malleable, and highly contested phenomenon, but also the processes through which things become informed with a remembering--and forgetting--capacity" (198).

Sounds like a fun way to get at "memory" vs "history"!

Jeremy Bangs's Experience with Sundown Towns

Randall Stephens

Americans born after the 1950s may not know that there ever were such things as sundown towns. James W. Loewen writes about them in Sundown Towns: A Hidden Dimension of American Racism (New Press, 2005). "'Don’t let the sun go down on you in this town.' We equate these words with the Jim Crow South," summarizes the book's website, "but, in a sweeping analysis of American residential patterns, award-winning and bestselling author James W. Loewen demonstrates that strict racial exclusion was the norm in American towns and villages from sea to shining sea for much of the twentieth century."

Jeremy Bangs emailed me the other day about his experience of growing up in Illinois and Kansas and being all too aware of such villages and hamlets that dotted the Midwest. Bangs is the founder of the Leiden American Pilgrim Museum and the author of Strangers and Pilgrims, Travellers and Sojourners (General Society of Mayflower Descendants, 2009). Jeremy's father, Carl Bangs, was a leading authority on the life and theology of Jacobus Arminius. (I read his biography of Arminius when I first launched into grad school. I was hooked on history.) With Jeremy's permission, I include below his reminiscence about sundown towns.

When my father moved ca. 1952-53 from Chicago to Bourbonnais, IL, to teach religion and philosophy at Olivet Nazarene College, he first heard about the existence of such towns, including some in the immediate area. Like others on the faculty who were ordained, he sometimes was invited to preach, and, as director of the college's brass choir (which he started), he sometimes was asked to take students to perform in Sunday morning services in Nazarene churches in Illinois as fundraisers for the college.

He refused to accept invitations in towns that were known as sundown towns. He always explained that he would be happy to accept the church's invitation once the village or town had reversed its policy. He was not alone in this vocal opposition to the racist custom, and I think that he and his colleagues made this an issue that was discussed (and officially opposed) by area councils of churches. Further, we did not drive through such towns, taking the long way around, whatever that might be, when going somewhere farther away. Since most roads there were laid out in square patterns, there was always a way to drive around. I think that at least one town officially changed their policy, after being shamed from the pulpits. As I recall, that must have been Manteno, IL, because I know that it was once a sundown town but in later years we did drive through Manteno. (We moved to Kansas City in 1961.)

In Kansas City, looking for a house to buy, my parents were shocked to find out that certain suburbs excluded blacks as well as Jews. This was possible with so-called covenants in the deeds, by which a buyer obliged himself to refuse to sell to blacks or Jews. The suburbs were among those developed and operated by the J. C. Nichols Company, who also built Kansas City's Plaza. My parents refused to buy in such restricted areas and refused to buy from that company. My father also instigated legal action on this point against the J. C. Nichols Company, which led to some retaliation, although I've forgotten what form that took. I think that eventually this action resulted in the company's being forced to drop the policy and restrictive covenants, even before legal changes made them explicitly illegal. As in Illinois, my father was one among numerous clergymen who tried to bring an end to these restrictions.

An Interview with Hilary Spurling on Pearl Buck

Randall Stephens

The January issue of Historically Speaking will soon be up on the Project Muse site. In the meantime, here's a selection from my interview with Hilary Spurling, which appears in the issue.

In 1932 Pearl S. Buck, daughter of American missionaries in China, won the Pulitzer Prize for her novel The Good Earth (John Day, 1931). Set in rural China, the book chronicles the struggles of a peasant and his slave-wife. In 1937 the novel was adapted into a Hollywood film. One year later Buck was awarded the Nobel Prize for Literature.

The British biographer Hilary Spurling—Ivy: The Life of I. Compton-Burnett (Knopf, 1984); The Girl from the Fiction Department: A Portrait of Sonia Orwell (Counterpoint, 2004); Matisse the Master: A Life of Henri Matisse: The Conquest of Colour, 1909-1954 (Knopf, 2007), winner of the Whitbread Book of the Year Award and the Los Angeles Times Biography prize—takes up Buck’s story in Pearl Buck in China: Journey to The Good Earth (Simon & Schuster, 2010). Focusing on Buck’s first four decades, Spurling stresses the American author’s intimate familiarity with China. Historically Speaking editor Randall Stephens recently spoke to Spurling about her study of Buck.


Randall Stephens: Why has Pearl Buck’s reputation suffered in recent years?

Hilary Spurling: She had many critics in her own day, too. Her success was so sudden and so enormous, and she was a complete outsider. She was a nobody. She had no contacts, no backup and no track record. When The Good Earth was published at the beginning of the 1930s by a publisher about to go bankrupt, it immediately became not just an American best seller but a global best seller. She won a Pulitzer Prize right away, and a few years later she won the Nobel Prize. Literary critics said, what is The Good Earth? An agricultural history of China? They couldn’t make heads or tails of it.

She had lived in China all her life. Buck never felt at home in America, in the 1930s or afterward. Brought up in China, she spoke Chinese fluently and her childhood friends were Chinese. When she was young, she didn’t know any other Westerners. She imagined she would spend her whole life in China.

When she lived in the United States, Buck was alienated by American politics. She knew and loved China so well that she became in a certain sense an apologist for China, trying to explain China to the West. As America moved sharply to the right in the 1950s, Buck was branded as a communist sympathizer. In communist China, however, she was a public enemy, and her books were forbidden. People were punished and humiliated if they’d even heard of her, let alone met her. She got flak from all sides throughout the second half of her life.

I wrote the biography partly because I wanted to write about China. I wanted to explore China and how it reached the state it is in now. In addition, I think that Pearl Buck is one of the great Americans of the 20th century, and I hope that Americans might look at her again and recognize her achievements as quite extraordinary.

Stephens: How would you sum up her relevance today?

Spurling: Think of how Western attitudes toward China have changed in the last four or five years. When I wrote the proposal for this book four years ago, publishers both in America and in England weren’t very interested. They saw no trace of the shock and awe I think we all feel now on both sides of the Atlantic about China. We understand now that China is a very large part of the future for all of us. And I think it no exaggeration to say that Buck was initially responsible for the West’s change in outlook. Nobody since Marco Polo in the 13th century has opened up the East to the West as much as Buck did. We can see the seeds of our attitudes toward China now in what she wrote so long ago. She was the first to foresee China’s future as a superpower. She was a young woman then. It’s extraordinary to see that as early as 1925 she understood that China would become the leader of Asia and that America needed to cultivate its relationship with China.

Stephens: Has she been reassessed in China today?

Spurling: Yes, she is being reassessed again on quite a large scale. In my lifetime she was officially regarded as a public enemy of the whole Chinese nation. Schoolchildren were told to denounce her. One of them was the Chinese novelist Anchee Min, who told me that as a student she was chosen to denounce Pearl Buck. She asked her teachers if she could read The Good Earth because she said it would give her more grounds. They said no Chinese citizen could read it because it was too toxic.

I finished my book last November, and that very month Buck was named on Chinese state radio—which of course is an official voice of the Chinese state—as one of the top ten international friends of China. I see that as a measure of the rate at which China and China’s opinions are changing. People now want to read Buck. The Good Earth was her farewell to China, the last thing she wrote before she left that country for good. No one else in the West was in a position to write such a book, and no Chinese person either. The Chinese writers who were Pearl’s contemporaries—young intellectuals, many of whom she knew and whose battles she fought—shaped and trained her far more than any American writers would. But the last thing such Chinese writers wanted to do was write about the village life they had struggled so hard to escape from. Roughly 85% of Chinese people were illiterate peasants. Chinese intellectuals at the time yearned to go to Beijing or Shanghai and to write about Eugene O’Neill.

Elvis and the American Dream

Heather Cox Richardson

Forty years ago today, Elvis Presley showed up at the gates of the White House with two bodyguards and handed the guards a letter addressed to President Nixon. He said he knew the president was busy, but was hoping he could say a quick hello and present the president with a gift.

One can only imagine the flurry of astonished commentary in the White House when news arrived that Elvis wanted to drop in for a chat. An aide skimmed Elvis’s letter and sent a quick memo to Chief of Staff H. R. Haldeman. “The thrust of Presley’s letter is that he wants to become a ‘Federal agent at large’ to work against the drug problem. . . . Drug culture types, the hippie elements, the SDS, and the Black Panthers are people with whom he can communicate since he is not part of the establishment.” The aide warned that it would be a bad idea to push Elvis off on the Vice President, since “it will take very little of the President’s time and it can be extremely beneficial for the President to build some rapport with Presley.”

While the aide was right that Presley wanted a federal badge, the thrust of his letter was not that he could talk to members of the counterculture. The gist of his note was that, more than anything, he wanted legitimacy. Elvis wanted to achieve the American Dream—not to be rich and famous (although he certainly was), but to be respectable.

Elvis had been an enormously talented young man with pretty moderate ambitions, in part because his horizons were so limited that he couldn’t see beyond stability and respectability. He wanted to take care of his parents; he wanted a job and a nice house. When his career took off, he bought Graceland, and decorated it in the fanciest way he could imagine—not with fine antiques and expensive art, but with a wall of mirrors and a carpeted ceiling.

Elvis seemed to be the epitome of the American dream. And perhaps he was, but not in the way that concept is usually used. As Elvis’s career rose, his control over his own success slipped away. Elvis’s life made it clear that even a man with such superlative talent could never rise to security without an education and connections. He took his financial advice from his father, a man who went to jail for altering a $4 check. He took career advice from a manager who was taking 50% of his earnings by the time the singer died (the going rate was 10%) and who pushed him constantly to make more and more money. By 1970, Elvis’s talent had become a commodity over which he had little control. Rather than enabling him to achieve the American dream, his ability was destroying him. His grueling schedule had him increasingly dependent on prescription drugs, and his marriage was falling apart.

What Elvis wrote to Nixon was that he craved solid middle-class respectability. “I . . . admire you and have great respect for your office,” he wrote. Countercultural figures might call the president and his advisors “the establishment,” but “I call it American and I love it.” “I can and I will be of any service that I can to help the country out,” Elvis wrote. He and President Nixon had something in common, and the singer made sure to point it out: “I was nominated this coming year one of America’s Ten Most Outstanding Young Men,” “I believe that you, Sir, were one of the Top Ten Outstanding Men of America also.”

Well over a hundred of Elvis’s records had gone gold, platinum, or multi-platinum, but when Elvis met with President Nixon at 12:30, he felt obliged to explain to the president who he was. And he didn’t focus on his music; he focused on the law, respectability, family, government. The first thing the singer did was to show the president his collection of police badges. He gave President Nixon some Presley family photos and a commemorative World War II-era Colt .45, and warned him that the Beatles had been fomenting anti-Americanism.

Then, as the White House notes from the meeting relate:

“Presley indicated to the President in a very emotional manner that he was ‘on your side.’ Presley kept repeating that he wanted to be helpful, that he wanted to restore some respect for the flag which was being lost. He mentioned that he was just a poor boy from Tennessee who had gotten a lot from his country, which in some way he wanted repay. . . . At the conclusion of the meeting, Presley again told the President how much he supported him, and then, in a surprising, spontaneous gesture, put his left arm around the President and hugged him.”

Nixon’s people managed to get Presley a special badge from the Bureau of Narcotics and Dangerous Drugs. The badge was a symbol of what Elvis wanted, but it couldn’t give him the middle-class respectability that was at the center of his American dream. It couldn’t buy him the economic understanding that would enable him to rearrange his business affairs, or admission to a professional culture of lawyers and agents whose knowledge would protect him from his parasitic manager.

Ironically, it also couldn’t stop Elvis from dying of drugs only seven years later, sad proof that all the talent in the world could not produce success if it were not protected by education and connections.

The Original Live-Blogging: On the Ground with Antinuclear Activists in 1979

Morgan Hubbard

Professionals of all stripes know that some workdays are better than others. Much of the historian's task is tedious and thankless—slogging through reams of records that may or may not be important for the puzzle she's looking to solve, or the argument she's looking to build. Some days end up being totally useless.

Some days, though, are windfalls.

The Alternative Energy Coalition was an alliance of anti-nuclear and environmental activists in New England in the late 1970s. The AEC, together with the better-known Clamshell Alliance, staged a series of “actions” against the Seabrook Station Nuclear Power Plant in eastern New Hampshire. Some were more intense than others, and a few made national headlines. When the AEC dissolved in the early 1980s—members moved to affiliated groups or dispersed—the organization's papers lay fallow. In the 2000s, they were given to the University of Massachusetts-Amherst Special Collections.

Many of the AEC's records are run-of-the-mill: mostly catalogs, routine correspondence, heaps of newspaper clippings. But tucked into one folder, fastened unceremoniously with a single staple, was something remarkable, a historian's goldmine: a sheaf of papers on which AEC protesters had logged, hour by hour, the events of a massive blockade action at the Seabrook plant in October 1979. The document is special not only for the intensity of its scribbled notes (“8:30 a.m., police dogs and water hoses are visible. Action before noon.” “10:00. Verified macing and clubbing.”) It's special because it's a step closer to historical reality than historians can usually get.

Most of the time, we have to reconstruct the past obliquely. But documents like this allow us to witness the past almost as its participants did, play-by-play, on the ground. Obviously, there's much missing from a written account: the smell of teargas, the overhead chop-chop-chop of police helicopters, the fervency of a protest at the height of the anti-nuclear movement's momentum. But documents like this—a live-blogging before there was such a thing—demonstrate that with the right sources, we can recapture some of the potency and urgency of the past.

Whiteness and WASPishness

Randall Stephens

Race in America has had a complicated history, to say the least. And in the 20th and 21st centuries, historians have grappled with the subject in a variety of ways. Scholars have analyzed the economic and social underpinnings of race and racism. They've scrutinized the power of race as a social construction. They've looked into how race and racism have changed over decades and centuries. They've examined the long-term effects of race-based slavery. And they've looked into how "whiteness" developed as a counter to blackness and as a marker distinguishing whites from non-whites.

In 2001 Eric Arnesen wrote a piece on "Whiteness and the Historians’ Imagination" for International Labor and Working-Class History. Historically Speaking published the essay in modified form in 2002. (I remember that this essay was quite useful for me as I slogged my way through historiography in grad school. And it's still a great read today.)

"The weaknesses of whiteness scholarship," observed Arnesen, "are particularly evident in its treatment of a subject that historians of the United States have chronicled for decades—the hostile encounter between Irish immigrants and African-Americans in the antebellum North."

Arnesen continued:

Attempting nothing short of a paradigmatic revolution, whiteness scholars suggest that the necessarily prior question is, 'how did the Irish become white?' To pose this question is to assert that 19th-century Irish immigrants to the United States were not white upon their arrival—that is, they were not seen as white by the larger American society and did not see themselves as white. Over time, whiteness scholars argue, they became white. Yet early and mid-19th-century commentators on the Irish did not speak of whiteness per se but invoked a more diverse discursive apparatus, weaving considerations of religion (which virtually vanish in the considerations of the whiteness scholars), notions of innate and observed character and behavior, and yes, race too into their anti-Irish commentaries. Therefore whiteness historians must assume the role of interpreter, translating the 19th-century vernacular of race and group inferiority into the late 20th-century idiom of whiteness. >>>

In the November 2010 issue of Historically Speaking, Kevin Schultz casts a similarly critical eye on the neologism WASP. In "The Waspish Hetero-Patriarchy: Locating Power in Recent American History," Schultz traces the term's origins and later ubiquity. Some excerpts from his illuminating essay:

According to the Oxford English Dictionary, the term WASP was coined only in 1962, when the civil rights movement required an oppositional force against which it could protest. Rather than simply choose “white” (which would come later, and subsequently destroy much of our historical memory), the social landscape of the early 1960s demanded a more inclusive term. WASP is what social scientists came up with.

What came before WASP as a descriptive term for the dominant American social and cultural presence? During the first three decades of the 20th century, the term “Anglo-Saxon” took on importance as a national social identity. The term has deep roots; the Anglo-Saxons are famed for a 5th-century invasion of the British Isles. But at the beginning of the 20th century there was a need in America for a term that would differentiate the social elite from all others. Anglo-Saxon served that need. . . .

Today, at the beginning of the 21st century, historians have shown that it has been politically, socially, and economically useful in our society to be seen as white, to possess an identity affiliated with whiteness. But for the first three or four decades of the 20th century, whiteness was not enough. One had to be considered Anglo-Saxon to achieve this status. And, like today’s “whiteness,” the phrase “Anglo-Saxon” lost any specific meaning once it became a social necessity. Everyone tried to claim it, and it lost all coherence. Webster’s Revised Unabridged Dictionary of 1913, for instance, defined Anglo-Saxon as “a person of English descent in its broadest sense.” In 1925 Sinclair Lewis described his hero Martin Arrowsmith as “a Typical Pure-bred Anglo-Saxon American, which means that he was a union of German, French, Scotch, Irish, perhaps a little Spanish, conceivably a little of the strains lumped together as ‘Jewish,’ and a great deal of English, which is itself a combination of primitive Briton, Celt, Phoenician, Roman, German, Dane and Swede.” Clearly the definition of Anglo-Saxon had become diffuse, even if the term’s importance had not.

Nevertheless, the abbreviated construction WASP picked up steam in 1963 as the civil rights movement progressed, appearing in that year in the New York Times and the New Statesman, although in each case the editors required a definition of the term to ensure that readers understood what it meant. Daniel Bell also paused to define WASP in his 1964 review of Nathan Glazer and Daniel Moynihan’s Beyond the Melting Pot. By the middle of the 1960s, though, WASP no longer needed an accompanying definition. After just a few years in circulation, everyone knew what it meant. Baltzell himself did a good deal to popularize the term in his 1964 book The Protestant Establishment: Aristocracy and Caste in America.

And we’ve been stuck with WASP ever since. Interestingly, the term became calcified as a permanent part of the American lexicon even as its sociological and historical accuracy demanded further refinement. Instead of continuing to be sociologically useful, it became a type, often to be castigated and derided. One reason for this might be that the sociological utility of all the markers inherent in the word WASP declined precipitously in the middle 1960s. . . .

And yet despite the rise and fall of the recognized power holders, the term WASP has remained calcified in its mid-century conception. Today, it typifies a stuffy type, someone in the cultural and social elite who disdains things popular and dresses preppy. It’s a much-maligned class identity more than it is a sociological descriptor. Indeed, the distinctions marked out by being a WASP are not necessarily bound by whiteness, Anglo-Saxonism, or Protestantism anymore. They signify an attitude and a manner of living. They have become commodified as “preppy” or what used to be called “Ivy League.” Today, it signifies an elitist snoot, a highbrow more than an actual white Anglo-Saxon Protestant. For example, it’s not hard to imagine someone describing the manner of, say, W.E.B. Du Bois—who wasn’t white, Anglo-Saxon, or technically Protestant—as being WASPish.

Read more of Schultz's interesting critique here.

Roundup: Historians

Neal Ascherson, "Liquidator," London Review of Books, August 19, 2010.

Seven years after his death, Hugh Trevor-Roper’s reputation is still a cauldron of discord. He would have enjoyed that. Steaming in the mix are the resentments of those he expertly wounded, the awe of colleagues at the breadth and depth of his learning, dismay at his serial failures to complete a full-length work of history, delight in the Gibbonian wit and elegance of his writing and – still a major ingredient – Schadenfreude over his awful humiliation in the matter of the Hitler diaries. >>>

"Cambridge University connects communities with Domesday," news.bbc.co.uk, August 10, 2010.

When William the Conqueror wanted to consolidate his power over his new English subjects he created the Domesday Book.

It was a comprehensive list of who owned all property and livestock.

Now Cambridge University historians have digitised the information in an interactive website.

"It's possible for anyone to do in a few seconds what it has taken scholars weeks to achieve," said Dr Stephen Baxter, a co-director of the project.

PASE Domesday was launched on 10 August 2010 and is the result of collaboration between scholars from Cambridge University and King's College, London.

Tax collection?

The Domesday Book was collated between 1085 and 1086.

Most historians believe it is some sort of tax book for raising revenue.

Dr Baxter, a medieval historian from King's College, London, has a different theory.

He argues its real purpose was to confer revolutionary new powers on King William.

"The inquest generated some pretty useful tax schedules," he explained. "But the book gave him something altogether more powerful." >>>

Daniel Hernandez, "Mayas protest monument to Spanish conquistadors," La Plaza, LA Times blogs, August 11, 2010.

The city of Merida on Mexico's Yucatan peninsula is reviewing a petition to remove a recently built public monument to the city's colonial founder, a figure whom some indigenous Mayas regard as a violent conquistador. The municipal government accepted the petition from a coalition of Mayan organizations to reconsider the monument and statues depicting Francisco de Montejo, known as "El Adelantado," and his son, also named Francisco and known as "El Mozo." The younger Montejo established Merida in 1542, on the site of the former Maya city of T'ho. >>>

"Tony Judt, Historian And Author, Dies At 62," NPR, August 8, 2010.

Much to his presumed irritation, historian Tony Judt, who died on Friday, might be remembered for one word: anachronism.

That's what he called the idea of a Jewish state in Israel in a widely read essay in the New York Review of Books. But Tony Judt was, first and foremost, an intellectual historian.

His book Postwar, about the history of Europe after 1945, became an instant classic. And he made it his mission to try to unpack the nuances of 20th-century history. >>>

Faye Fiore, "Guardians of the nation's attic," Los Angeles Times, August 8, 2010.

When Paul Brachfeld took over as inspector general of the National Archives, guardian of the country's most beloved treasures, he discovered the American people were being stolen blind.

The Wright Brothers 1903 Flying Machine patent application? Gone.

A copy of the Dec. 8, 1941 "Day of Infamy" speech autographed by Franklin D. Roosevelt and tied with a purple ribbon? Gone. >>>

Dispatches from the Historical Society Conference, Day 3: Plenary Session on America's Enduring Two-Party System

Randall Stephens

The Historical Society's 2010 conference came to a close at George Washington University with a final plenary session on Saturday night dealing with the nature of America's two-party system. (Listen to the audio file embedded below. It will take a moment to load. The quality is not the greatest, but the words can be made out.) Heather Cox Richardson (University of Massachusetts Amherst) introduced Michael Barone (American Enterprise Institute), who spoke on “The Enduring Character of America’s Political Parties in Times of Continual Change.” These two parties, ancient in the world of modern politics, have long diverged sharply, said Barone. Deeply consistent themes have defined the Democratic and Republican parties since the mid-19th century: the two have represented very different constituencies and upheld rather distinct political ideas. For instance, Barone described the outsider aspect of the Democratic Party, which tended to represent immigrants, saloon keepers, and others on the margins. The party of Roosevelt, populated by interest groups and factions, Barone remarked, lacked the cohesion of the Republican Party.



Commentators Sean Wilentz (Princeton University) and Leo Ribuffo (George Washington University) both praised Barone's extensive knowledge of political history, but each had serious critiques of Barone's key arguments. Ribuffo thought Barone overemphasized the differences between the parties. The two parties were, argued Ribuffo, less like a donkey and an elephant and more like kissing cousins, even incestuous cousins at times. Wilentz argued that Barone had not paid appropriate attention to class. Wilentz and Ribuffo also questioned Barone's insider-outsider thesis. The white democracy of the South hardly fit that pattern. At other points the commentators took issue with the continuities Barone saw.

The lively discussion was a fitting end to an intellectually engaging, vibrant conference that gave attendees much to ponder about the state of the profession and the future of historical inquiry.