
A Leadership Legacy: Happy 138th, Winston

Philip White

November 30 was Winston Churchill’s birthday. 138 years after his birth, historians, politicians, and the public are as fascinated as ever by this most iconic of British Prime Ministers. Of course, as with every major historical figure, the amount of one-sided deconstructionism has increased over the past few years, no more useful to the reader than one-sided hagiography. The truth, of course, lies somewhere in the middle–a deeply flawed (aren’t we all!) larger-than-life figure who botched a lot of decisions–notably his resistance to home rule for India and his well-meaning but ill-conceived support of Edward VIII during the 1936 abdication crisis–but who got the big things right.

[Image: Ivor Roberts-Jones statue of Churchill, Oslo, Norway]

Among the latter was Churchill’s foresight over the divisions between the democratic West and the Communist East. Since the inception of Communism and its violent manifestation in the Russian Revolution, Churchill had despised the movement, calling it a “pestilence.” Certainly, his monarchical devotion was part of this, but more so, Churchill believed Communism destroyed the very principles of liberty and freedom that he would devote his career to advancing and defending. Certainly, with his love of Empire, there were some inconsistencies in his thinking, but above all, Churchill believed that the individual should be able to make choices and that systemic freedom–of the press, of religion, of the ballot–must be upheld for individuals to enact such choices. That’s why he vowed to “strangle Bolshevism in its cradle,” though his plan to bolster anti-Communist forces was quickly shot down by Woodrow Wilson and David Lloyd George as another of “Winston’s follies.”


In this case, his plan to oppose Communism was indeed unrealistic. There were a small number of British, Canadian, and American troops and a trickle of supporting materiel going to aid the White Russians toward the end of World War I, but once the Armistice was signed on November 11, 1918, the Allied leaders wanted to get their boys home, not commit more to a seemingly hopeless cause.

But over the next three decades, Churchill’s ideas on how to deal with Communism became more informed, more realistic and, arguably, more visionary. Though he reluctantly accepted Stalin as an ally when Hitler turned on Russia in the fateful summer of 1941, Churchill’s pragmatism and public admiration of the Marshal did not blind him to the ills of the Communist system. The Percentages Agreement he signed with Stalin at their late-1944 meeting has since been blamed for hastening the fall of democratic Eastern Europe. But what Churchill was actually doing there was recognizing that the Communist takeover of the region was a fait accompli, while securing Stalin’s agreement to leave the Greek Communists largely to their own devices after World War II. Though Moscow did supply arms and it took the Marshall Plan to prop up the anti-Communist side in Greece, Stalin largely honored this pledge.

He was not so good as his word on many other things, however. Among the promises he made to Churchill and FDR was to include the London Poles (exiled during the war) in a so-called representative government in Poland. In fact, the Communist puppet Lublin Poles ran the new regime after the war, and the old guard was either shunned or killed. Horrifyingly, many of the leaders of the Polish Underground were taken out by Stalin’s henchmen, and others were held in former Nazi camps that the Red Army had supposedly “liberated.” At the Potsdam Conference in July 1945, Stalin showed that his vows at Yalta had been mere lip service to the British and American leaders. He made demands for bases in Turkey, threatened the vital British trade route through the Suez Canal, and refused to withdraw troops from oil-rich Iran.

Churchill, still putting his faith in personal diplomacy, believed he could reason with Stalin, particularly if Harry Truman backed him up. But halfway through the Potsdam meeting, the British public handed the Conservative Party its second-worst defeat in one of the most surprising General Election results. Churchill was out as Prime Minister and Clement Attlee was in. Off Attlee went to Germany to finish the dialogue with Truman and Stalin. Churchill feared he was headed for political oblivion.

Yet, after a few weeks of moping, he realized that he still had his pen and, as arguably the most famous democratic leader of the age (only FDR came close in global renown), his voice. And so it was that he accepted an invitation to speak at a most unlikely venue in March 1946 – Westminster College in Fulton, Missouri – not least because of the postscript Truman added to Westminster president Franc “Bullet” McCluer’s invitation, offering to introduce Churchill in the President’s home state. There he described the need for a “special relationship” between the British Commonwealth and the United States to check the spread of expansionist Communism and the encroachment of the “iron curtain” into Europe.


[Image: Philip White speaking at the National Churchill Museum, Fulton, Missouri, Nov 11, 2012]

As I explained when I spoke at the National Churchill Museum on, fittingly, Armistice Day last month, this metaphor entered our lexicon and was embodied in the Berlin Wall–the enduring image of the standoff. Yet the “special relationship” outlived this symbol, as did the principles of leadership Churchill displayed in his brave “Sinews of Peace” speech (the real title of what’s now known as the “Iron Curtain” address). Churchill was willing to speak a hard truth even when he knew it would be unpopular and then, a few days later, after a police escort was needed to get him into New York’s Waldorf Astoria Hotel as demonstrators yelled “GI Joe is home to stay, Winnie, Winnie, go away,” to boldly declare, “I do not wish to withdraw or modify a single word.” His critics again called him an imperialist, an old Tory and, as Stalin put it, a warmonger–the same insults he had endured when sounding the alarm about Hitler in the mid- to late 1930s. And in 1946, just as in the 1930s, Churchill was right.

Not only did Churchill define the Communist-Democratic divide, he also had a plan for what to do about it. Though his more ambitious ideas, including shared US-UK citizenship, did not come to fruition, the broader concepts were embodied in the creation of NATO, European reconciliation, and the Marshall Plan. He also understood not just the Communist system he criticized but the democratic one it threatened, and, the day after the anniversary of Jefferson’s inaugural address, gave a memorable defense of the principles that were, he said, defined by common law and the Bill of Rights. This is something leaders of any political persuasion must be able to do–to articulate what they and we stand for, and why.

As I think of Churchill just after his birthday, that’s what I’m focusing on: vision, understanding and bravery. Such leadership principles will be just as valid 138 years from now as they were on that sunny springtime afternoon in Fulton.

Roundup on Presidential Politics and History


"Historian reflects on George McGovern's enduring impact on presidential politics," Public Radio International, October 22, 2012

McGovern, an icon of liberalism, was a senator and representative from South Dakota, serving from 1957 to 1981. Princeton University professor Julian Zelizer said McGovern played a key role in changing the rules of political conventions.
>>>

"Everything you need to know about presidential debate history," The Week, October 14, 2012

When were the first debates held? The seven encounters between Abraham Lincoln and Stephen A. Douglas in 1858 are widely considered to be the first "presidential" debates — even though they took place two years before the men were actually running for president.
>>>

Sarah Rainsford, "Cubans remember missile crisis 'victory,'" BBC, October 16, 2012

The countryside around San Cristobal is littered with traces of the Cuban missile crisis, when the world came the closest yet to nuclear war.

It was here that the Soviet Union installed dozens of nuclear missiles, pointing at America. Fifty years on, a local guide called Stalin took me to explore what remains of that history.
>>>

Joseph Crespino, "Moderate White Democrats Silenced," NYT, October 2, 2012

Part of the story of working-class whites in the Deep South lies in the demise of the moderate white Democrats who used to win their votes. And that story is wrapped up very much in the history of voting rights and redistricting.
>>>

Christopher Benfey, "The Empty Chair that Keeps Me Awake at Night," NYRBlog, October 17, 2012

I have no idea what Clint Eastwood had in mind when he dragged an empty chair up to the stage at the Republican Convention in Tampa last August. Maybe he was thinking, as some have suggested, of some bygone exercise in a Lee Strasberg acting class. “Please, Clint. Talk to the chair. You are Hamlet and the chair is Ophelia. Please. Just talk to her.” Or maybe a marriage counselor had used an empty chair to teach the tight-lipped gunslinger from Carmel how to empathize with his wife. “Go ahead, Clint, make her day. Tell her what you’re feeling.”
>>>

The Malleability of “the Law”

Steven Cromack

Are Supreme Court decisions important? Most intellectuals, lawyers, teachers, professors, and anyone with an eighth-grade education would exasperatedly answer, “Of course! What a stupid question.” The Supreme Court hands down “The Law of the Land.” In Dred Scott v. Sandford (1857), the Supreme Court fueled the tension between North and South over the fugitive slave law. The decision in Minor v. Happersett (1875) posited that women are citizens, but that citizenship does not mean suffrage. Brown v. Board of Education (1954) declared segregation in American public schools unconstitutional. Finally, Citizens United v. Federal Election Commission (2010) opened the floodgates for political contributions and significantly changed the nature of American politics. These cases have had an important impact on our society—increasing tension (and even war) between branches, parties, regions, and social classes. Or have they?

The fundamental question lurking below the surface is: Are these decisions truly that significant in themselves, or is their significance historically constructed? In other words, are Supreme Court decisions an end in themselves, or are they a means to an end crafted not by the Court, but by historians, lawyers, academics, or historical figures seeking to advance their own agendas? In studying the “history” of such decisions, and how intellectuals and public figures use them for their own purposes, one arrives at the uncomfortable notion that the law is malleable for those who choose to pursue a political agenda.

2003 was the bicentennial of Marbury v. Madison (1803), the fundamental Supreme Court case that set in motion “judicial review” and established the legitimacy of the Court. Or so constitutional scholars tell us. In an article titled “The Rhetorical Uses of Marbury v. Madison: The Emergence of a ‘Great Case,’” Davison Douglas, dean of the William and Mary law school and professor of law, tracked the evolution of the decision and pointed out that “between 1803 and 1887, the Court never once cited Marbury for the proposition of judicial review” (376). How is it that such a “landmark case” was not mentioned or invoked by the nineteenth-century Court?

Douglas contended that only when the Court struck down the income tax as unconstitutional in Pollock v. Farmers’ Loan & Trust Co. (1895) did defenders of the Court’s decision make Marbury relevant. Douglas wrote, “In the struggle to defend the Court’s actions, judicial review enthusiasts elevated the Marbury decision—and Chief Justice Marshall—to icon status to fend off attacks that the Court had acted in an unwarranted fashion” (377). Douglas then outlined how politicians, populists, and other figures trashed the Court’s decision, and how the Warren Court used Marbury as a shield against segregationists and as a sword to assert its power (409). Thus, the Marbury decision was not an end in itself, but a means to an end.

It is common knowledge that the Plessy v. Ferguson (1896) decision espoused the “separate but equal” doctrine. American history textbooks, law books, and historians overwhelmingly cite it as a landmark decision. Yet on May 19, 1896, the day following the decision, The New York Times, the nation’s leading paper, reported on this monumental case with only this brief notice: “No. 210 Homer Adolph Plessy vs. J.H. Ferguson, Judge & c- In error to the Supreme Court of Louisiana. Judgment affirmed, with costs.” For those living at the time, the Plessy decision was not that important. The South did not use the Court’s decision to implement Jim Crow, nor did the decision legalize the Jim Crow laws; Jim Crow came long before the 1896 decision. The New York Times did not mention the case again until the 1950s, when Brown v. Board of Education (1954) was making its way through the courts.

If one examines the history and evolution of a Supreme Court decision, one finds that decisions are not ends in themselves, but means to an end for others not on the Court. This is evident in examining the context of our major court cases and tracking their usage across time. Historians, from generation to generation, construct, tear down, and then reconstruct a narrative. While “deconstruction” has had its moment in the sun, it nevertheless has some validity for history: the narrative changes from generation to generation with those who interpret the past. If history is malleable, then the law is malleable, and that is a scary concept.

Abolitionism's Two Formulations

Donald A. Yerxa

Last night Andrew Delbanco gave the Alexis de Tocqueville Lecture on American Politics at Harvard’s Center for American Political Studies. His “Abolition and American Culture” was a provocative interdisciplinary assessment of antebellum abolitionists (“the originals”) that also explored abolition as an enduring American cultural dynamic. Without detracting from the originals’ accomplishments, Delbanco believes that a more measured approbation, one acknowledging the “limits of the abolitionist imagination,” is needed. Their sacred rage, uncompromising fervor, and furious certitude, he noted, indeed broadened the horizons of the possible in American society—no small thing! But this needs to be considered in light of the fact that it took the pragmatic Lincoln and a very bloody Civil War to end slavery.

One of the four scholars Harvard invited to respond to Delbanco was THS board member Wilfred McClay. In the interest of full disclosure, I should say that not only is Bill McClay a dear friend; he is also, in my opinion, one of the finest historical essayists of his generation. And this was an ideal venue for his formidable skills. McClay observed that the Puritan-abolitionist style seems prone to a “strange combination of moral grandeur and nannying coerciveness.” Agreeing with Delbanco in the main, McClay stressed the importance of the abolitionists’ millenarian religious fervor: “No religion. No abolitionism; it’s that simple.” And like Delbanco, McClay appreciates that abolitionism is amenable to two formulations. The prophetic moral clarity that has single-mindedly named evil as just that also exhibits overbearing and coercive tendencies that seemingly blind it to “the limits of human intentionality and the abyss of unknowable consequences.”

It was not hard to imagine Reinhold Niebuhr looking down on the proceedings last night at Harvard with a smile. And I was also reminded of David Brion Davis’s claim that history is a kind of moral philosophy, teaching by example. Single-minded devotion to noble ends stirs the moral imagination, but it also breeds a moral certitude that flirts with godlike mastery, which in some religious traditions is humanity’s besetting—even "original"?—sin. Much to ponder not only as we reflect on the 19th-century American experience, but also as we consider our present state of affairs.

Judge Ye Not?

Randall Stephens

I assigned John Lewis Gaddis’s The Landscape of History: How Historians Map the Past for my history methods course. It did not go swimmingly. Students were perplexed and overwhelmed by the technical terms, bewildered by Gaddis’s one-man crusade against the social sciences, and distracted by the deluge of citations and the flurry of references to everything from cubism to chaos theory to postmodern nominalism. Rough going.

Still, we managed to glean something from the book, which I find to be a fascinating work, as insightful as it is provocative. (That gap between what students like and what profs like.) Students got into his map analogies even if their eyes glazed over as they read of "a preference for parsimony in consequences" (105).

We spent some time delving into a theme from Gaddis’s last chapter. Historians, he writes, without being too grandiose, have some role in liberating the past. "To the extent that we place our subjects in context, we also rescue the world that surrounded them," he notes (140). Earlier he observes that "History happens to historians as well as everyone else. The idea that the historian can and should stand aloof from moral judgment unrealistically denies that fact" (128).

It got me thinking. How and why do historians make judgments about the past? Is it even possible to withhold judgment or bracket it? Selection of material in itself is a kind of judgment.

Anyhow, my mind went back to historians' wisdom on the matter. I excerpt below bits on judgment/liberation in history:

Lord Acton, "The Study of History," 1895. (To be distinguished from Lord Action, an aristocratic, late-Victorian superhero.)

But the weight of opinion is against me when I exhort you never to debase the moral currency or to lower the standard of rectitude, but to try others by the final maxim that governs your own lives, and to suffer no man and no cause to escape the undying penalty which history has the power to inflict on wrong. The plea in extenuation of guilt and mitigation of punishment is perpetual. At every step we are met by arguments which go to excuse, to palliate, to confound right and wrong, and reduce the just man to the level of the reprobate. The men who plot to baffle and resist us are, first of all, those who made history what it has become. They set up the principle that only a foolish Conservative judges the present time with the ideas of the past; that only a foolish Liberal judges the past with the ideas of the present.

Herbert Butterfield, The Whig Interpretation of History (1931; 1965), 107-108.

It is the natural result of the whig historian’s habits of mind and his attitude to history--though it is not a necessary consequence of his actual method--that he should be interested in the promulgation of moral judgements and should count this as an important part of his office. His preoccupation is not difficult to understand when it is remembered that he regards himself as something more than the inquirer. By the very finality and absoluteness with which he has endowed the present he has heightened his own position. For him the voice of posterity is the voice of God and the historian is the voice of posterity. And it is typical of him that he tends to regard himself as the judge when by his methods and his equipment he is fitted only to be the detective. His concern with the sphere of morality forms in fact the extreme point in his desire to make judgements of value, and to count them as the verdict of history. By a curious example of the transference of ideas he, like many other people, has come to confuse the importance which courts of legal justice must hold, and the finality they must have for practical reasons in society, with the most useless and unproductive of all forms of reflection--the dispensing of moral judgements upon people or upon actions in retrospect.

Marc Bloch, The Historian's Craft (1954; 1992), 115-116.

Are we so sure of ourselves and of our age as to divide the company of our forefathers into the just and the damned? How absurd it is, by elevating the entirely relative criteria of one individual, one party, or one generation to the absolute, to inflict standards upon the way in which Sulla governed Rome, or Richelieu the States of the Most Christian King! Moreover, since nothing is more variable than such judgments, subject to all the fluctuations of collective opinion or personal caprice, history, by all too frequently preferring the compilation of honor rolls to that of notebooks, has gratuitously given itself the appearance of the most uncertain of disciplines. Hollow indictments are followed by vain rehabilitations. Robespierrists! Anti-Robespierrists! For pity's sake, simply tell us what Robespierre was.

E. P. Thompson, The Making of the English Working Class (1963), 12.

I am seeking to rescue the poor stockinger, the Luddite cropper, the "obsolete" hand-loom weaver, the "utopian" artisan, and even the deluded follower of Joanna Southcott, from the enormous condescension of posterity.

Joyce Appleby, Lynn Hunt, and Margaret Jacob, Telling the Truth about History (1994), 66-67.

Even though nothing could have been further from his intention, Hegel opened the way to relativism, that is, the idea that truth depends on historical circumstances. If truth is revealed over time, then any truth, moral, scientific, or political, also changes over time and is never permanent. What seems to be true today may not be true in the conditions of tomorrow; what is true for some people is not true for others. Thus, even as Hegel's views lent great prestige to history, now conceived as an essential framework for philosophy, they also created potential problems for the idea of historical truth itself. Were there no absolute moral standards that transcended the particularities of time and place? Was the role of historians simply limited to explaining how previous people had thought and acted without passing judgment on those thoughts and actions?

Stephen Prothero, "Belief Unbracketed: A Case for the Religion Scholar to Reveal More of Where He or She Is Coming From," Harvard Divinity Bulletin (Winter/Spring 2004).

Since Jonestown, religion has shown its dark side repeatedly—with Heaven's Gate, at Waco, and on 9/11. In each case, we Religious Studies scholars have been largely irrelevant to the public debates. True, we drew out the parallels between the Heaven's Gate website and medieval Daoist immortality texts. But we could not explain what produced the worst mass suicide on American soil. No surprise, then, that radio and television producers turned instead to self-styled "cult experts" to explain what happened when Heaven's Gate swung shut. And to experts on the Middle East rather than Islamicists when it came to parsing Islam as "a religion of peace."

Robert Orsi, "A 'Bit of Judgment,'" Harvard Divinity Bulletin (Winter/Spring 2004).

Predictable judgments occlude their implication in power, but this becomes clearer if we think about what a "little bit of judgment" looks like in relation to religious practices that subvert normative modernity or that are simply uncomfortable to the good hearted. It's one thing to come out boldly "for" ecological responsibility. What about "for" speaking in tongues and creating a religious environment in which one's children are expected to speak in tongues as a sign of their religious status? But apart from the boldness and deliciousness of judgment, how exactly does a scholar's being "for" or "against" the practices, say, of rural Pentecostals help us understand the nature of relationships in this world, the press of authority, the meanings of gender and class, the experience of kinship? Wouldn't battering, sneering, and castigating keep us from approaching ways of loving and being that are unfamiliar to us, ways of being and loving which we cannot imagine ourselves being and loving?