Showing posts with label Modern U.S.

Economics in the Age of Fracture

Dan Allosso

I’ve started re-reading Daniel T. Rodgers’ Age of Fracture.  I glanced at it in the final run-up to my PhD comps, but it didn’t make much of an impression.  Then Jane Kamensky mentioned it during her closing talk at the Historical Society’s recent conference, and I thought I ought to pick it up again.

This closer reading led me to a couple of thoughts.  First, that there’s probably a whole lot more in many of those books we powered through in grad school; it’s probably worth revisiting some of them and digesting them slowly.  Second, what doesn’t seem relevant when you’re under the gun and trying to absorb the historiography of a field may be really useful when you’re thinking about teaching or writing – especially for the public.

I’ve only scratched the surface of Age of Fracture so far, but it strikes me as a very interesting attempt to argue a complicated point for a more-or-less general audience.  This fascinates me, since I think historians really need to help us all understand how we got to where we are today.  I hope to pick up some ideas about how to do this, especially about where the boundary is between assertion and explication: how much of an argument you can carry with an authoritative voice and how much you need to demonstrate with evidence and analysis.  At one point, for example, Rodgers says, “What precipitates breaks and interruptions in social argument are not raw changes in social experience, which never translate automatically into mind. What matters are the processes by which the flux and tensions of experience are shaped into mental frames and pictures that, in the end, come to seem themselves natural and inevitable: ingrained in the very logic of things” (Kindle Locations 125-127).  This is an interesting claim; very close to the idea I just found in Giambattista Vico’s New Science (another book I picked up as a result of the conference), “Every epoch,” Vico wrote, “is dominated by a ‘spirit’, a genius, of its own. Novelty, like beauty, recommends certain faults which, after fashion changes, become glaringly apparent. Writers, wishing to reap a profit from their studies, follow the trend of their time” (quoted in Anthony Grafton’s Introduction).  It’s a provocative idea, and it obviously has a lineage – but is it true?  And can it be used to explain social change over time?
GDP Growth, 1923-2008 (Source: Wikimedia).
Another thing Rodgers does, in the early pages of Age of Fracture, is to provide a schematic for a “historiography” of the field of Economics.  Beginning with Alfred Marshall (Principles of Economics, 1890), Rodgers traces the development of economic thinking (and college economic teaching) through Paul Samuelson (Economics, 1948), and then into the variety of competing interpretations resulting from the failure of macro-economic prediction in the 1970s and 80s.  Along the way, Rodgers mentions many of the relevant texts in this development: popular texts such as Milton Friedman’s Capitalism and Freedom and F. A. Hayek’s Road to Serfdom as well as academic titles like Ronald Coase’s “The Problem of Social Cost” and Richard Posner’s Economic Analysis of Law.  It would be interesting to organize a syllabus around these titles, and read them one after another.  In addition to tracing the development of economic theory, the themes of such a class might be to examine whether theory or contingency really moved society, and more importantly to test the point made above by Rodgers and Vico: to see if the explanations offered by economists in their historical moments contain “faults which, after fashion changes, become glaringly apparent.”

Life and Debt in the US

Randall Stephens

Has the United States ever defaulted on its debt? Yes, twice: in 1790 and again in 1933. Both cases are quite different from the current situation in D.C. (More on that below.)

The second of those had to do with the repayment of gold obligations. When "President Roosevelt and the Congress decided that it was a good idea to depreciate the currency in the economic crisis of the time," writes Alex J. Pollock, "they also decided not to honor their unambiguous obligation to pay in gold."

Arthur Schlesinger dealt with the matter in his Coming of the New Deal, 1933-35. The administration, wrote Schlesinger, aimed to break loose from foreign economic entanglements. Here's Schlesinger:

From the viewpoint of classical theory, Roosevelt's decision to abandon the international gold standard was, indeed, a wanton step. When Britain had left gold in 1931, it had at least done so because the pressure on its gold reserves left it no alternative. But, despite Roosevelt's professed fears about a raid on American gold by Dutch banking interests, United States gold stocks were, in fact, capable of meeting normal foreign demands. The presidential decision seemed therefore to have a more sinister implication. It meant that American monetary policy was no longer to be the quasi-automatic function of an international gold standard; that it was to become instead the instrument of conscious national purpose. More than that, the step involved the repudiation of obligations to pay in gold long written into the "gold clause" of public and private contracts--an act which damaged all creditors who had hoped to make a killing out of the increase in the value of the dollar (203).

Long before, in 1790, the United States defaulted on its international and domestic obligations. The first government of the new nation enacted the Funding Act of 1790, which allowed Alexander Hamilton, secretary of the treasury, to take on the war debts of individual states. It was intended, in part, to create confidence in the new government. Altogether it amounted to $21.5 million of assumed debt. According to John Carney over at CNBC: "Prior to the passage of the Funding Act, much of the debt was expected to default. It traded at deep discounts to face value. Once the act was passed, the value of the debt skyrocketed—because bondholders were sure they would be repaid by the new federal government. In fact, quite a lot of money was made by people who bought the state debt in anticipation of the Funding Act or with early notice that it had passed. Even at the time of the Founding, traders were profiting from informational asymmetries." That positive outcome had to do with the fact that the federal government was not itself in debt, but was only assuming state debt. That's why, says Carney, "the bonds rallied after the passage of the act."
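The profit mechanism Carney describes is simple to see with a toy calculation. The sketch below uses invented numbers, not figures from the post, just to illustrate how buying debt at a deep discount and holding it through a rally toward face value produces the outsized returns early speculators enjoyed:

```python
# Hypothetical illustration of the Funding Act speculation.
# All prices here are invented for the example, not historical data.

def speculator_return(face_value, purchase_price, post_act_price):
    """Simple return on buying discounted debt that later rallies.

    face_value is included only for context; the return depends on
    the purchase price and the later market price.
    """
    profit = post_act_price - purchase_price
    return profit / purchase_price

# Suppose $100 of face value traded at $20 before the act
# and rallied to $90 once federal repayment seemed assured.
r = speculator_return(face_value=100, purchase_price=20, post_act_price=90)
print(f"Return: {r:.0%}")  # Return: 350%
```

A speculator with early notice of the act's passage, in other words, stood to multiply an investment several times over, which is why "informational asymmetries" mattered so much.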

Some weeks ago historian Julian Zelizer reflected on the political troubles that make the current economic crisis different. "There was a time when Congress worked differently," he observes. "During the committee era, which lasted from the 1910s through 1970s, bipartisan dealmakers were the kings of Capitol Hill. Legislating was seen as an art, and producing policy was the objective." Zelizer, writing on July 5th, hoped for a return to the deal making of recent history. That didn't happen, but a deal has been struck, nonetheless. Zelizer fittingly concludes: "But the fact that we have another example of what should be a routine decision turning into high-stakes gamesmanship should be a stark reminder that we need Congress to work better than this."

American Political Partisanship in Historical Perspective

Heather Cox Richardson

Peter Orszag had an article last week at Bloomberg arguing that political partisanship in America has increased dramatically in recent years because Americans have self-segregated their housing according to political leanings. Once in like-minded groups, he suggests, they tend to reinforce each other and drift toward extremes as individuals try to outdo each other in enthusiasm for their political affiliations.

This is an interesting theory. It suggests our political inclinations are beyond our control and that society has spiraled into extremism for reasons we cannot stop.

It would certainly be quite interesting to Americans of the 1840s and 1850s, whose partisanship was so extreme that congressmen took guns to the House of Representatives to protect themselves, settlers in Kansas and Missouri murdered each other in their beds, and millions of men killed off several hundred thousand of each other before deciding to call it quits. Who knew that when they moved West, setting up shelters wherever they found good land, antebellum Americans were unconsciously segregating into political neighborhoods?

There is a much more obvious and more plausible explanation than political segregation for the increases in political partisanship that have occurred with pretty cyclical regularity in American history. It is an explanation that suggests that partisanship and compromise are both deeply imbedded in the American political tradition.

Rising politicians need to be able to attract attention. To that end, they need to distinguish themselves from the successful politicians who hold power. When those senior politicians have emphasized compromise, aspiring politicians have attacked them and advocated more extreme positions. Extremism begets extremism until the system becomes utterly dysfunctional. At that point, aspiring younger politicians can attract attention by advocating not extremism, but compromise.

This cycle of compromise to partisanship to extremism to compromise has turned over again and again in American history.

To see how this works, let’s look at the first generation of professional politicians in America: the Jacksonians of the 1830s. The men who wanted to put Andrew Jackson in the White House needed a way to garner support for a man who was widely regarded as volatile and a rather dim bulb. How could they elevate him when men like Henry Clay, the Great Compromiser, controlled the political scene? By viciously attacking Clay and men of his ilk with unfounded accusations, decrying compromise as weakness, and building a constituency that despised the very art of compromise Clay performed so well.

How then could the next generation of politicians opposed to Jackson’s Democratic Party build its own constituency? By attacking the Democrats, of course.

As political leaders squared off, the newspapers that supported them echoed their rhetoric. There, and not in unconsciously politically segregated communities, individual editors turned up the heat of extremism. Each tried to outdo the competition to draw readership and the advertising dollars readers attracted. Partisanship rose as voters learned to value conflict rather than compromise.

As members of each party more and more often characterized their opponents as corrupt, dangerous, and evil, compromise became increasingly unthinkable.

We know how that turned out.

But the need for politicians to distinguish themselves from their predecessors can serve compromise as well as conflict. When partisanship has become more important than actual governing, the government ceases to function in any sort of a competent way. Astute younger politicians then can build careers by promising to compromise with opponents to create solutions that make the government work again. Theodore Roosevelt, for example, recognized voters were frustrated by the extremism of the late nineteenth century that had paralyzed government just when the nation was desperate for solutions to the crises of industrialism. Roosevelt created an image of himself as bipartisan, willing to side with Democrats even against members of his own party to do what was good for the country (a position that infuriated old-fashioned Republicans and Democrats both). Roosevelt was not alone, though. His construction of a politics of compromise was part of the reaction of his generation to the partisanship of the previous generation. That premium on compromise produced the bipartisan Progressive Era.

Which, in turn, was followed by the growing extremism of the 1920s . . . and so on.

These political swings have been part of American society since at least the 1830s. They are not about living quarters. While housing patterns may reflect the current political values of the national culture, Americans are not first self-segregating politically and then self-integrating politically every few generations. What they are doing, though, is listening to their political leaders, reading the news, watching TV, and now, using the internet. What they hear drives their attitudes toward politics.

Far from being a reflection of living patterns created without our conscious control, partisanship and compromise are both deliberate decisions made by political leaders.

A Blast from Our Tech Past

Randall Stephens

Is it true that one-third of the world's population will have watched the royal wedding? Wow. . . (And you thought you could get a break from hearing about THE event of 2011. Nope.)

In the scope of modern history, live broadcasts and recording technology are such recent developments.

This video from YouTube is an example of primitive video tech. President Eisenhower's 1958 address deals with communication and the challenges of the future. Ironic that he seems to be having such trouble getting just the right words out! (It might count as one of the worst presidential speeches of modern history.) The script takes a decidedly World's Fair tone--the progress rhetoric that will inform amusement parks like Epcot. The World of Tomorrow!!

According to a little history of TV site that the FCC has put together:

In 1956 the Ampex quadruplex videotape replaced the kinescope; making it possible for television programs to be produced anywhere, as well as greatly improving the visual quality on home sets. This physical technology led to a change in organizational technology by allowing high-quality television production to happen away from the New York studios. Ultimately, this led much of the television industry to move to the artistic and technical center of Hollywood with news and business operations remaining on the East Coast.

In 1957 the 1st practical remote control, invented by Robert Adler and called the "Space Commander," was introduced by Zenith. This wireless, ultrasound remote followed and improved upon wired remotes and systems that didn't work well if sunlight shone between the remote and the television.

This "Golden Age" of television also saw the establishment of several significant technological standards. These included the National Television Standards Committee (NTSC) standards for black and white (1941) and color television (1953). In 1952 the FCC made a key decision, via what is known as the Sixth Report and Order, to permit UHF broadcasting for the 1st time on 70 new channels (14 to 83). This was an essential decision because the Nation was already running out of channels on VHF (channels 2-13). That decision gave 95% of the U.S. television markets three VHF channels each, establishing a pattern that generally continues today.

Thus the "Golden Age" was a period of intense growth and expansion, introducing many of the television accessories and methods of distribution that we take for granted today. But the revolution – technological and cultural – that television was to introduce to America and the world was just beginning.

To see how this techsplosion would later impact modern families, watch the charming little BBC series Electric Dreams (2009), now airing on PBS. Gotta get back in time, minus the DeLorean: "The Sullivan-Barnes family from Reading are a thoroughly modern family who own the latest in 21st century gadgetry. In a unique experiment they were stripped of all their modern tech and their own home was taken back in time so that they could live with the technology of earlier decades. The family lived a year per day starting in 1970 right up to the year 2000."

Forum on Daniel Rodgers's Age of Fracture in April Issue of Historically Speaking

Randall Stephens

Over at the U.S. Intellectual History blog there's been a lively discussion and debate generated by Daniel Rodgers's new book Age of Fracture. We add a little to that here with a selection from the forum on the book in the new issue of Historically Speaking. (View the full April 2011 issue on Project Muse. Use your college, university, or library connection for full access.)

AMERICA IN AN AGE OF FRACTURE: A FORUM

Historians and other observers of postwar America note the dramatic social and political changes underway since the 1960s. It was, as Daniel Rodgers puts it, an age marked by discontinuity, shifting party allegiances, and social fracture. An intellectual and cultural historian, Daniel Rodgers is Henry Charles Lea Professor of History and director of the Shelby Cullom Davis Center for Historical Studies at Princeton University. He is the author of four books, including: The Work Ethic in Industrial America, 1850-1920 (University of Chicago Press, 1978), winner of the Organization of American Historians’ Frederick Jackson Turner Prize; Contested Truths: Keywords in American Politics (Basic Books, 1987); and Atlantic Crossings: Social Politics in a Progressive Age (Harvard University Press, 1998), which won the American Historical Association’s Beer Prize and the Organization of American Historians’ Hawley Prize. His latest book, The Age of Fracture (The Belknap Press of Harvard University Press, 2011), argues that in the 1970s Americans began to think of the country in terms of choice-making individuals rather than as a society shaped by classes, interests, and norms. Historically Speaking asked Bruce Schulman, Melani McAlister, Michael Kimmage, and Donald Critchlow to comment on Rodgers’s short essay about this shift in American thought. Their comments are followed by Rodgers’s response. (Citations of Rodgers’s book are in parentheses.)

Daniel Rodgers, "Age of Fracture," Historically Speaking (April 2011)

The history of the United States in the last quarter of the 20th century seems to constitute, at first glance, a maze of paradoxes. It was the age of Reagan, conservative partisans said at the time: a moment of deep political reversal. Peggy Noonan, one of the most gifted of Reagan’s speechwriters, joined the White House staff to be present at “the Reagan revolution.”1 Political managers dreamed of a realignment election that would change the very frame of partisan politics, and political scientists pored over election results to see if they could discern that one had occurred. A sense of the world as shifting rapidly beneath one’s feet was widespread, but, for all of Reagan’s symbolic prominence, the notion of a clear political watershed turned out to be an illusion.

Party allegiances shifted dramatically in the last quarter of the century, particularly among white southern voters; a new and highly energized conservative political movement came into being; but the realignment election that would give the Republican Party a permanent majority failed to occur. Closely fought elections, divided governance, and an increasingly divided electorate have marked the last three decades’ politics, not a new consensus. Even Reagan himself, Noonan wistfully admitted, “wasn’t a revolutionary; he wasn’t a missile drawn to the heat of a new idea.”2 The battles over taxes and regulation that Reagan’s election precipitated represented no revolutionary break with history. Even the “culture wars” and their partisan mobilization of religious loyalties replayed long-standing strains in 20th-century American politics.

A stronger argument for discontinuity can be made for the structures of the late 20th-century economy. The global economic crisis of the 1970s with which the era began, buffeted by oil price shocks and inflation, was a prelude to wide-ranging transformations in the domestic and global economies. Squeezed by new cost pressures, compromises between labor and management unraveled. Union membership spiraled precipitously downward. Manufacture went global in search of cheaper labor. Finance capitalism emerged out of the crisis stronger than ever before, fueled by new and more exotic investment instruments and new investor ambitions for corporate restructuring. The derogatory “age of greed” label reflects a simplistic reading of the moral tone of the era, but the phrase had its structural basis, as David Harvey and others have argued, in the collapse of the Fordist economy of the middle years of the century.3

And yet the age of materialism, global markets, ascendant financial capitalism, new political ambitions, and an intensely politicized punditry was also, and in many ways more fundamentally, a period of deep transformations in social thought. It was here, on the terrain in which Noonan thought her hero Reagan to have been least adept, that the discontinuities of the era were most pronounced. A whole vocabulary of concepts that had once seemed the common sense of social thought weakened and new languages took their place. The age of Reagan, the “age of greed,” was simultaneously, and more importantly, an age of transformation in ideas.

“A war of ideas,” conservatives often called it, but it was much less structured by partisan polarities than has often been understood. Ideas flowed quickly into politics through more aggressively partisan think tanks and more aggressively partisan funders of books and university research. But ideas simultaneously slipped across the political blocs, often incongruously and unpredictably. Deregulation was a radical project before it became a conservative one. The first practical school voucher proposals were the work of liberal social scientists. Libertarian ideas infiltrated social thought, leaving a trail across both Left and Right. >>> read on

See also the four comments and Rodgers's response:

Bruce Schulman, "Daniel Rodgers's (New) Consensus History"

Melani McAlister, "Popular Media in an Age of Fracture"

Michael Kimmage, "A Response to Age of Fracture"

Donald T. Critchlow, "On a Darkling Plain"

Daniel Rodgers, "A New Consensus?"

Economic History: State of the Field in Historically Speaking

"But with the slow menace of a glacier, depression came on," Frances Perkins lamented in 1934. "No one had any measure of its progress; no one had any plan for stopping it. Everyone tried to get out of its way."

How did this thing happen, and when will it end? Common questions back in the Dirty Thirties.

Today, journalists, historians, policymakers, and so many others are grasping for some handle on the current economic slump. What's the historical context of economic trouble? What went wrong? Could it have been avoided? "Some brilliant scholar has to write a comprehensive history of modern economics," says David Brooks in the NYT, "because the evolution of this field is clearly one of the most consequential things happening in the world today." Brooks speculates: "One gets the sense, at least from the outside, that the intellectual energy is no longer with the economists who construct abstract and elaborate models. Instead, the field seems to be moving in a humanist direction."*

Others disagree. Over a month ago Diane Coyle penned an essay for the Chronicle on how "Economics Is on the Verge of a Golden Age." "An astonishing explosion of creativity and intellectual progress has been under way for years in a number of areas," observes Coyle. "Consider competition economics (should the Department of Justice challenge the Google Books settlement on antitrust grounds?), the application of game theory or the use of market design (what's the best system for matching newly qualified doctors or Ph.D.'s to jobs?), development economics, the economics of technological change and network markets (what prices should mobile-phone companies charge for access to one another's networks?), and the study of long-term growth."*

The latest issue of Historically Speaking (April 2010) features a forum on "The Neglected Field of Economic History?" Senior editor Donald Yerxa organized the forum with a generous grant from the Earhart Foundation. I paste below Yerxa's intro to the forum and short excerpts from each essay. (Read the full forum and other material from the new issue of HS at Project Muse.)

No graduate student in history in the 1970s could escape economic history. One of the major professional debates of that era—about the cliometrics of Robert Fogel and Stanley Engerman’s Time on the Cross—went well beyond historiographical interpretation to encompass seemingly fundamental differences over the nature of historical methodology. But where does economic history stand now? In this, the third in our series of four forums, we asked several leading economic historians to assess the state of their field. Robert Whaples gets our conversation started with the forum’s lead essay. Philip Hoffman, Deirdre McCloskey, Joel Mokyr, and Werner Troesken respond, followed by a rejoinder from Whaples.

"Is Economic History a Neglected Field of Study?"
Robert Whaples

In the fall of 2008 and early 2009 it looked to many weary and wary workers, investors, policy makers, and analysts as though the U.S. economy was about to fall off a cliff into an abyss as bottomless as the Great Depression. What on Earth was going on? Everyone wanted to know, and many turned to history—economic history—for answers. The press burgeoned with interviews and insights from economic historians who were called to Washington and New York to offer advice. Indeed, Christina Romer, an economic historian from University of California, Berkeley, whose pioneering early research examined historical trends in economic volatility and who has done influential research on the causes of the Great Depression and the recovery from it, was tapped by President Barack Obama to be chair of the Council of Economic Advisors. And Ben Bernanke, a former Princeton University economist and author of Essays on the Great Depression (2000), held—and still holds—the most powerful economic policy making position in the world as chair of the Federal Reserve.

In these turbulent times, it became obvious to almost everyone that understanding economic history is useful, indeed essential, and economic historians are indispensable. And yet many economic historians have the sense that their discipline is a neglected field, a field on the margins, caught in a no man’s land between two disciplines: ignored and underappreciated by economists and misunderstood, feared, and perhaps even despised by historians. Most economic historians sense that the discipline has almost always been on the margins and that this marginalization has increased appreciably since the end of a brief golden age that glimmered during the 1960s and into the 1970s.

To understand this situation, I’ll begin—as economic historians almost always begin—by doing some counting. . . . read on>>>

"Response to Robert Whaples"
Philip T. Hoffman

To make the picture even more depressing, Whaples (being the good economic historian that he is) backs up his assertions with solid evidence. One could easily add to it. To judge by the titles of articles in mainstream history journals (the American Historical Review, the Journal of American History, the Journal of Modern History, Past and Present), interest in economic history is vanishing.1 Dissertations in economic history in history departments are rare.2 And citations suggest that major works of economic history can pass unnoticed by the history profession even when they address issues that once fascinated many non-economic historians.3

My personal experience, if it is worth anything, suggests much the same. Older historians I know who were trained in the 1970s may not write economic history, but they do seem willing to pay attention to it. They also seem open to borrowing from the social sciences and to the possibility of generalization—in other words, to the notion that what they have unearthed in the archives is not necessarily a special case. . . . read on>>>

"One More Step: An Agreeable Reply to Whaples"
Deirdre N. McCloskey

I agree with every word of Robert Whaples’s elegant and well-grounded essay.1 Whaples doesn’t say things until he has the goods—and as he says, we people from the economic side tend to think of the goods as numbers. It’s very true, as he also says, that our numerical habits have repelled the history-historians, especially since they have in turn drifted further into non-quantitative studies of race, class, and gender (it is amusing that the young economic historian Whaples quotes gets the holy trinity slightly wrong, substituting “ethnicity,” a very old historical interest, for “class,” a reasonably new one; it is less amusing that historians believe they can adequately study race, class, and gender without ever using numbers, beyond pages 1, 2, 3).

But it’s also true, as is shown by the fierce and ignorant quotations he reports from other economists and economic historians, that quantitative social scientists don’t get the point of the humanities. “Whenever I read historians,” said a young economic historian to Whaples, “my response is: How can you say that without a number? Do you have a number?” Many social scientists, and especially those trained as economists, believe adamantly that, as Lord Kelvin put it in 1883, “when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely in your thoughts advanced to the state of Science.” The young economists nowadays believe this so fervently that rather than deviating ever from their faith they insist on collecting sometimes quite meaningless numbers (such as what is known as “statistical significance,” or what they are pleased to call “calibrations” of a hypothetical model unbelievable on its face). . . . read on>>>

"On the Supposed Decline and Fall of Economic History"
Joel Mokyr

Much like the West, the field of economic history has experienced endless lamentations of its imminent decline and fall. Whaples’s basic argument that economic historians as a group are disrespected by economists and feared and despised by historians is typical of this kind of premature eulogy. The Cliometric Revolution had all been so promising back in the 1970s, and now all we are good for is telling a few stories about past economic crises to entertain our fellow economists or supply them with a telling historical anecdote to decorate the first paragraph of some technical paper. How bad are things, really?

It has never been easy to be an economic historian. Much like Jews in their diaspora, they belong simultaneously in many places and nowhere at all. They are perennial minorities, often persecuted, exiled, accustomed to niche existences, surviving by their wits and by (usually) showing solidarity to one another. They must work harder, and know more. . . . read on>>>

"Toward a Richer, More Diverse Intellectual Marketplace? A Response to Whaples"
Werner Troesken

Mostly I agree with Robert Whaples. Economic history is a neglected field in both economics and history. I have only two concerns. First, Whaples quotes a historian who characterizes cliometrics as generating “trivial” and “unreliable” results. I spent nearly fifteen years in a history department producing work in cliometrics. While I often felt isolated, which is the reason I left, my experience was nothing at all like that implied by the quotation. With a few unimportant exceptions, I always felt that my colleagues in history respected my work. I realize that my experience might not be representative, but I want to offer that qualification up front. Second, I think Whaples overstates the degree to which economists reject historical evidence and the broader enterprise that cliometricians call economic history. Although economic history could be held in higher esteem by economists than it currently is, there is evidence to suggest that economic history still has its place in economics departments.

But, whatever my quibbles, Whaples raises an important question: What is it about the field of economic history that undermines its position among both economists and historians? What follows is a crude and preliminary attempt to answer this question. . . . read on>>>

"Is Economic History a Neglected Field of Study? Final Thoughts"
Robert Whaples

There is considerable good sense in the comments of my four colleagues. I certainly didn’t mean to suggest that economic history is “ready for hospice care” and “doomed to extinction,” or to deliver a “eulogy.” Rather, my fundamental point, which all seem to agree on, is that, despite manifest evidence that economic historians continue to produce a high-quality product that more historians and economists should go out and read, the current amount of output in the economic history industry is below the social optimum. The demand is too low.

I don’t blame economic historians for this. Collectively, we are not as haughty as some of my quotes may suggest. And although we may not have all the breadth, polish, and ability to marshal evidence suggested by my commentators, economic historians are immensely practical. . . . read on>>>

Some Historical Perspectives on National Healthcare

Randall Stephens

Teddy Roosevelt campaigned on health care in his 1912 Progressive Party bid for the presidency. Makes sense. He needed a little medical assistance after surviving a gunshot wound/assassination attempt in Milwaukee: "Friends, I shall ask you to be as quiet as possible. I don't know whether you fully understand that I have just been shot; but it takes more than that to kill a Bull Moose. . . . The bullet is in me now, so that I cannot make a very long speech, but I will try my best."

The Progressive Party platform of 1912 declared: "We favor the union of all the existing agencies of the Federal Government dealing with the public health into a single National health service without discrimination against or for any one set of therapeutic methods, school of medicine, or school of healing with such additional powers as may be necessary to enable it to perform efficiently such duties in the protection of the public from preventable diseases as may be properly undertaken by the Federal authorities . . ."

Two decades later, FDR seems to have concluded that a national health bill was one bill too many for his already ambitious alphabet-soup initiatives.

It then fell to FDR's accidental successor to give it a go. "In my message to the Congress of September 6, 1945, there were enumerated in a proposed Economic Bill of Rights certain rights which ought to be assured to every American citizen." So began President Harry Truman's address to Congress on November 19, 1945.

One of them was: "The right to adequate medical care and the opportunity to achieve and enjoy good health." Another was the "right to adequate protection from the economic fears of . . . sickness . . . ."

Millions of our citizens do not now have a full measure of opportunity to achieve and enjoy good health. Millions do not now have protection or security against the economic effects of sickness. The time has arrived for action to help them attain that opportunity and that protection. . . .

Our programs for public health and related services should be enlarged and strengthened. The present Federal-State cooperative health programs deal with general public health work, tuberculosis and venereal disease control, maternal and child health services, and services for crippled children. >>>

Didn't work out. Some Republicans equated it with communism and the American Medical Association came out against it.

Indeed, longstanding opposition to federal involvement in health care squelched most efforts. The American Medical Association again stood firm, fiercely opposing Lyndon Johnson's landmark 1965 Medicare program; the AMA's chief players had backed Barry Goldwater's 1964 campaign, making their criticism loud and clear. And in the early 1960s, Ronald Reagan warned that after the passage of such a national program, "We will awake to find that we have socialism."

Calvin Woodward offers some useful historical context on the current battle in the Sunday LA Times.

To history, it is likely to be judged alongside the boldest acts of presidents and Congress in the pantheon of domestic affairs. Think of the guaranteed federal pensions of Social Security, socialized medicine for the old and poor, the civil rights remedies to inequality.

Change is coming, it now appears, but in steps, not overnight. . . .

In contrast, on June 30, 1966, after a titanic struggle capped by the bill signing a year earlier, President Lyndon Johnson launched government health insurance for the elderly with three simple words, as if flicking a switch: "Medicare begins tomorrow." >>>

See also: the Boston Globe's (Associated Press) rather interesting timeline of health care legislation; A PBS timeline from Healthcare Crisis; and Jonathan Chait, "Health Care Reform And History," TNR, March 19, 2010.

“Machte Alle Kaput”: The Malmedy Massacre, December 17, 1944


The following is an excerpt from
World’s Bloodiest History: Massacre, Genocide, and the Scars They Left on Civilization by Joseph Cummins (by permission, Fair Winds Press ©2009). Cummins is the author of a variety of books, including War Chronicles: From Chariots to Flintlocks; War Chronicles: From Flintlocks to Machine Guns; History’s Great Untold Stories; and Anything for a Vote: Dirty Tricks, Cheap Shots, and October Surprises in U.S. Presidential Campaigns. He has also edited two anthologies for Lyons Press: Cannibals: Shocking True Stories of the Last Taboo on Land and at Sea and The Greatest Search and Rescue Stories Ever Told, and written a novel, The Snow Train. He lives in Maplewood, New Jersey.

The Belgian farmer, whose name was Henri Lejoly, was surprised at the nonchalance of the U.S. troops. Standing in the barren field outside of the town of Malmedy on that cold early afternoon in the winter of 1944, they smoked and joked with each other. Some of them had placed their hands on their helmets in a casual token of surrender to the Waffen-SS troops of Kampfgruppe Peiper—the mechanized task force commanded by the brilliant young German Colonel Jochen Peiper—as it passed by, but beyond that they seemed remarkably unconcerned.

The offhand behavior of the roughly 115 U.S. prisoners may have been because the men came from Battery B of the 285th Field Artillery Observation Battalion. This was an outfit whose job was to spot enemy artillery emplacements and transmit their locations to other U.S. units. It had seen relatively little frontline duty and was filled with green replacements.

Most of the SS troops, including Jochen Peiper, had seen extensive duty in the grim killing fields of the Eastern Front. As Kampfgruppe Peiper passed by these Americans, an SS soldier suddenly stood up in the back of his halftrack, aimed his pistol, and fired it twice into a group of U.S. prisoners. One of them crumpled to the ground. Terrified U.S. soldiers in the field suddenly began to run. Then a German machine gun at the back of another halftrack opened up and U.S. prisoners fell screaming to the ground. Within a matter of a few minutes, the field was covered with quickly coagulating pools of blood and writhing bodies. Then the SS men began to walk among the injured and the dead, pistols out.

“A Greater Risk”
The Battle of the Bulge was the largest battle ever fought in the history of the U.S. infantry and one of the bloodiest battles of World War II, which was the most costly war in human history. The U.S. troops suffered 81,000 casualties, which included 18,000 dead, while their German opponents were hit with 70,000 casualties, including 20,000 dead. The battle lasted forty days in December and January of 1944–45, in atrocious winter weather that was the worst seen in the Ardennes region of Belgium in twenty years, and could easily have resulted in a devastating loss for Allied forces, one that might have stalemated a war that they seemed well on their way to winning.

With all of these matters of great importance, why has so much attention been paid to the killing of eighty-four U.S. soldiers in a small field on December 17, 1944? The Germans of Kampfgruppe Peiper, seventy of whom were convicted in a war crimes tribunal after the war, were surprised—executing prisoners was standard fare on the Eastern Front. So, too, were many U.S. soldiers who had done battle in the Pacific, where the Japanese treated U.S. POWs with casual brutality. Perhaps one reason for the attention paid to the Malmedy Massacre is that many Americans at the time, including, possibly, those of Battery B standing in the field that day, thought that, against the Germans at least, they were fighting a “civilized” war with adversaries who shared the same racial heritage as thousands of GIs.

Another reason for the focus on Malmedy is that, as word spread like wildfire through the U.S. frontline ranks in the immediate aftermath of the killings, U.S. soldiers vowed to take no prisoners. Within a few weeks of Malmedy, one U.S. unit had machine-gunned sixty German prisoners to death in a small Belgian village called Chenogne. As even the official U.S. military history of the Battle of the Bulge states: “It is probable the Germans attempting to surrender in the days immediately following [the killings at Malmedy] ran a greater risk.”

“The Ghost Front”
In a sense, the Allied war against the Germans since the D-Day landings of June 6, 1944, had gone almost too well. After a fierce fight in Normandy, the Americans and British had broken out of their beachheads at the end of July and sent the Wehrmacht reeling backwards, ceding vast areas of France and Belgium to the U.S. armored divisions of the First and Third Armies and the British Twenty-first Army Group. But such was the speed of the Allied advance that outfits began to outrun their supply lines. By late fall, the sixty-five Allied divisions operating in northeastern Europe were facing shortages of vital supplies, especially fuel, and their offensive had sputtered to a halt.

Digging in for the winter, the Americans and British sought to consolidate their gains and build up fuel supplies for a massive push into Germany in the early spring. The Allied lines were weakest along a 100-mile stretch from southern Belgium into Luxembourg, a place where U.S. commander Omar Bradley took what he called a “calculated risk” by placing only six U.S. divisions—about 60,000 men—three of which were untried in battle and three of which were exhausted from months of heavy combat.

This area covered the rugged and desolate Ardennes Forest and was mountainous and remote. As December 1944 began, the Ardennes fell prey to the worst winter weather it had experienced in a generation, with temperatures hovering below 0°F/−17°C for days at a time. Snow blanketed the little towns, vacation chateaus, and deep forests of the area. The area was so thinly held by GIs billeted (if they were lucky) in Belgian inns and private homes that it was called “the Ghost Front.” The GIs knew that their German enemies were out there in the snow and fog, but believed that they would never attempt a serious attack in such conditions.

But that is exactly what the Germans did, in a massive counteroffensive personally planned by Adolf Hitler. His goal was to punch through this weakly held part of the Allied line and send his armored divisions streaking toward Antwerp. Once he had captured this vital port, he could force the Allies to sue for peace. With the greatest of secrecy, aided by winter weather that kept Allied planes on the ground, he assembled a huge force of 250,000 men, 1,400 tanks, and 2,000 artillery guns on the eastern edge of the Ardennes. And, at 5:30 a.m. on December 16, this blitzkrieg struck the unsuspecting Americans.

Jochen Peiper
Spearheading the German attack was a remarkable twenty-nine-year-old SS colonel named Jochen Peiper. Peiper was the commander of Kampfgruppe Peiper, the leading battle formation of the First SS Panzer Division—he had been personally picked by Adolf Hitler to be the point man of the Sixth Panzer Army’s drive to seize the bridges of the Meuse River and capture Antwerp. Holder of the Knight’s Cross with Oak Leaves, one of Germany’s highest military decorations; an ardent Nazi; and a hardened veteran of fighting in France, Italy, and on the Eastern Front, Peiper was admired by his soldiers but known as a brutal fighter. He had probably ordered the attack by his unit that caused the deaths of forty-three Italian civilians in the village of Boves, Italy, in 1943, and in numerous actions against partisans in Russia his unit deliberately burned villages and killed Russian civilians.

And on the morning of December 17, the second day of the German attack, he was a frustrated man. Because of heroic and determined resistance by elements of the U.S. 99th Infantry Division, his task force, which consisted of 117 tanks, 149 halftracks, and 24 artillery pieces, was already twelve hours behind schedule. Time is always important in military operations, but in the Ardennes in December 1944 it was the most crucial factor facing Peiper, and by extension the entire Wehrmacht. They had to reach the bridges over the Meuse before the sky cleared and Allied planes, which enjoyed almost total air superiority, could turn their tanks into smoldering wrecks, blocking the narrow roads and ending Germany’s last chance of saving itself from total defeat.

“You Know what to Do with the Prisoners”
At around 8 a.m. on December 17, a convoy carrying Battery B, 285th Field Artillery Observation Battalion, set out from Schevenhütte, on the border of Germany and Belgium, on its way to St. Vith, Belgium, which was about to become a focal point of one of the great clashes in the Battle of the Bulge. The convoy consisted of about 130 men in some thirty jeeps, weapons carriers, and trucks, and was led by Captain Roger Mills and Lieutenants Virgil Lary and Perry Reardon.

The day was clear and cold, with temperatures well below freezing, and a light dusting of snow on the ground. Battery B reached the Belgian town of Malmedy around noon. After passing through the town, the convoy was stopped on its eastern edge by Lt. Colonel David Pergrin, in charge of a company of combat engineers who were all that were left to defend Malmedy. Pergrin warned Mills and Lary that a German armored column had been seen approaching from the southeast. He advised them to go to St. Vith by another route, but Mills and Lary refused, perhaps because ahead of them were several members of Battery B who had been laying down road markers, and they did not wish to abandon them, or perhaps simply because the route they were to take was stated in their orders.

For whatever reason, Battery B proceeded along its designated route until it came to a crossroads about 2.5 miles (4 km) east of Malmedy, which the Belgians called Baugnez but the Americans referred to as Five Points, because five roads intersected there. Shortly after it passed this crossroads, the column began to receive fire from two German tanks 1,000 yards (0.9 km) down the road. These tanks were the spearhead of Kampfgruppe Peiper, led by Lieutenant Werner Sternebeck, and their 88-mm guns and machine guns easily tore up the U.S. column. Sternebeck and his tanks proceeded down the road, pushing burning and wrecked U.S. jeeps and trucks out of the way and firing their machine guns at U.S. soldiers cowering in ditches—a tactic, Sternebeck later told historian Michael Reynolds, meant to get the Americans to surrender.

Sternebeck then sent the Americans, numbering about 115 in all, marching with their hands held high back to the crossroads at Five Points. He assembled the prisoners in a field there and waited with his tanks and halftracks for further orders. The delay upset Peiper. Racing to the front of the German column, he upbraided Sternebeck for engaging Battery B—because the noise might alert more powerful U.S. combat units nearby—and told him to keep moving. Sternebeck moved out, followed closely by Peiper, and the long line of Kampfgruppe Peiper began to pass the Americans standing in the field, some of whom had begun to relax, put their hands down, and light cigarettes.

After an hour or so, Peiper left an SS major named Werner Poetschke in charge of the prisoners. However, at around 4 o’clock that afternoon, soldiers from the SS 3rd Pioneer company were detailed to permanently guard the prisoners. According to testimony at the war crimes trial, Major Poetschke was heard by a U.S. soldier who understood German telling a Sergeant Beutner: “You know what to do with the prisoners.”

“The Germans Killed Everybody!”
Sergeant Beutner then stopped a halftrack that held a 75-mm cannon and attempted to depress its barrel low enough to aim at the prisoners in the field. When the gun crew was unable to do this, Beutner gave up in disgust and waved the halftrack on, much to the relief of the now edgy and nervous Americans in the field. But then another German unit came by and those Americans who could speak German heard a lieutenant in this unit give the order: “Machte alle Kaput!” Kill the Americans. At first, the Germans present merely stared at the officer, but then Pfc. George Fleps, an ethnic German from Romania, stood up in his halftrack and fired twice at the crowd of Americans.

The Americans in the rear of the group began to run away, even as an officer, thinking that the Germans would shoot anyone they saw escaping, yelled “Stand fast!” In fact, this is what happened. Seeing Americans fleeing, a machine gun on the back of a halftrack opened up, cutting down those who stood in the field as well as those trying to escape.

To this day it is uncertain whether the Germans would have shot the Americans had they not tried to run; many German soldiers present later claimed they were merely killing escaping prisoners, though surviving Americans distinctly remembered the German order to kill coming before any of the POWs tried to escape. What the Germans did next reinforces the belief that they intended to kill the Americans from the beginning. As the GIs lay moaning on the ground, SS men walked among them, kicking men in the testicles or in the head. If a man moved, the SS men would casually lean over and shoot him in the head. Some survivors later testified that the Germans laughed as they did this.

Lejoly, who was a German sympathizer, nevertheless could not believe his eyes as he watched one SS man allow a U.S. medic to bandage a wounded soldier, after which the German shot both men dead. Eleven Americans fled to the café nearby, but the Germans set it on fire and then gunned down the men as they ran out. As this killing was going on, the German column continued to pass through Five Points, and soldiers on halftracks chatted and pointed. Some fired into already dead Americans, as if to practice their aim.

Amazingly enough, some sixty Americans were still alive in the field after the machine-gunning. As the SS massacred the survivors, the men still able to move realized they had no choice but to try to escape; they rose and ran as fast as they could to the back of the field, heading for a nearby woods. The Germans swept them with rifle and machine-gun fire but made little attempt to chase after them. Perhaps forty made good their escape into the deepening dusk. Most attempted to make their way back to Malmedy, some wandering for days before they arrived. Early that evening, however, three escapees encountered a patrol led by Colonel Pergrin, who had heard the shooting and was coming to investigate. The men, covered with blood, were hysterical.

“The Germans killed everybody!” they shouted at Pergrin.

Aftermath of the Massacre
That evening, Pergrin sent back word to 1st Army Headquarters that there had been a massacre of some type at Malmedy. The area around Five Points was so hotly contested that it was not until nearly a month after the massacre, on January 14, that the U.S. Army was able to recover the bodies of the 84 men who had been killed in that field. Autopsies conducted on the frozen corpses showed that forty-one men had been shot in the head at close range and another ten had had their heads bashed in with rifle butts. Nine still had their arms raised above their heads.

By the time the war ended, the U.S. public knew all about the Malmedy massacre and clamored for revenge. On May 16, 1946, a year after the end of hostilities in Europe, Peiper and seventy of his men were placed on trial for war crimes connected with the massacre. The trials were deliberately held on the site of the Dachau concentration camp, to garner maximum symbolism from the event.

Not all of the presumed guilty could be punished—both Major Poetschke and Sergeant Beutner died in action during the war. But at the end of the proceedings, all seventy of the SS men, as well as Peiper, had been convicted of war crimes by a six-man panel of U.S. officers. Forty-three of them, including Peiper, were sentenced to die by hanging, twenty-two to life imprisonment, and the rest to ten- to twenty-year sentences.

However, the trials were tainted by later testimony that the SS men had been tortured by U.S. interrogators before their trials. All of the death sentences were commuted to imprisonment and, in 1956, Jochen Peiper became the last member of the group to walk out of jail. Peiper, who was murdered in France in 1976 by a shadow group of anti-Nazi terrorists who called themselves “the Avengers,” always claimed that he did not give express orders to kill the prisoners at Malmedy, and he probably did not.

Though we may never completely know the truth surrounding the Malmedy massacre, there is no doubt that, in the end, the deaths there stiffened U.S. resolve to destroy the Nazis, and the hated SS, wherever they found them.

Feelings, Nothing More than Feelings

Randall Stephens

In "C (for Crisis)" (London Review of Books, 6 Aug 2009) Eric Hobsbawm looks at a significant shift in historical studies. "There is a major difference," he observes, "between the traditional scholar’s questions about the past–‘What happened in history, when and why?’–and the question that has, in the last 40 years or so, come to inspire a growing body of historical research: namely, ‘How do or did people feel about it?’"

Hobsbawm uses Richard Overy's The Morbid Age: Britain between the Wars to pose some questions about the change in focus: "Though this type of research is fascinating, especially when done with Overy’s inquisitiveness and surprised erudition, it presents the historian with considerable problems. What does it mean to describe an emotion as characteristic of a country or era; what is the significance of a socially widespread emotion, even one plainly related to dramatic historical events? How and how far do we measure its prevalence?"

I hadn't thought of things in these terms. But, on reflection, this change in emphasis to a national emotional experience does seem to be a major trend. Take, for example, the relatively new research on the 1960s and 1970s. Much of it, not surprisingly, focuses on the political and emotional turmoil of those two decades. Three books in particular, all sophisticated works of history, fit that pattern: Philip Jenkins, Decade of Nightmares: The End of the Sixties and the Making of Eighties America (2006); Rick Perlstein, Nixonland: The Rise of a President and the Fracturing of America (2008); Andreas Killen, 1973 Nervous Breakdown: Watergate, Warhol, and the Birth of Post-Sixties America (2006). (I used the last of these in my modern America course and it worked very well.)

Hobsbawm gives good background on the development of the "how-did-people-feel?" genre. The pioneer, he says, is Jean Delumeau’s history of fear in Western Europe from the 14th to the early 18th century, La Peur en Occident (1978), which "describes and analyses a civilisation ‘ill at ease’ within ‘a landscape of fear’ peopled by ‘morbid fantasies’, dangers and eschatological fears." In the end, Hobsbawm criticizes Overy's The Morbid Age for not doing what it set out to do: "Overy’s book, however acute in observation, innovative and monumental in its exploration of archives, demonstrates the necessary oversimplifications of a history built around feelings. Looking for a central ‘mood’ as the keynote of an era does not get us closer to reconstructing the past than ‘national character’ or ‘Christian/Islamic/Confucian values.’"

Hobsbawm's review made me wonder about other books that might fit into the genre. And it made me question the strengths and weaknesses of tracking national "feelings" or "moods."