Reassessing American ‘Heroes’

American “heroes” often were hailed in their time but are viewed differently through the lens of history, as is happening to racist presidents Andrew Jackson and Woodrow Wilson, notes Lawrence Davidson.

By Lawrence Davidson

It seems as though some of the heroes of the United States are losing their bright reputations. It’s just as well, for they are really bad examples for us all. Of course, you might ask, if that is the case, why were they heroes in the first place?

Part of the reason might be that the negative nature of their attitudes and actions was simply not widely known, owing to both the primitive state of communication and the prevailing racist ideologies of their times.

Because conditions and outlooks change, the status of many heroes is provisional – admired in a specific place and a relatively limited time. The American heroes I am thinking of may well have seemed exemplary for their day. However, by today’s standards those times were marked by open bigotry and imperial/colonial ambitions. Let’s hope that we are outgrowing such attitudes.

Consider past luminaries associated with political office and the exercise of power. Despite their celebrated actions, their social attitudes are anathema by modern standards. Thus, while some may still see them as heroes, others certainly have come to see them as scoundrels. That is not the sort of balance that promotes a permanently heroic reputation. Standards change and so does the balance of perceptions.

Against this background let’s take up the recent challenges to the hero status of two past U.S. presidents: Andrew Jackson and Woodrow Wilson.

Andrew Jackson (aka Old Hickory)

Andrew Jackson (1767-1845) was the seventh U.S. president, serving from 1829 to 1837. His fame is based on misleading legends and the fact that his face has been on the $20 bill since 1928 (which is ironic, because Jackson always opposed the issuance of paper money).

There are two deceptive beliefs about Jackson that have fostered his “great man” image. One is that he was the “common man’s” president, a notion that grew up largely because he was the first president to come from west of the Appalachian Mountains – an area that was then thought of as the “frontier.”

Actually, while born poor and orphaned during the Revolutionary War, Jackson became a wealthy man by the age of 30, lived in a rural mansion on his Tennessee cotton plantation and owned hundreds of slaves, from whose labor his wealth derived. This made him the sort of “self-made man” Americans love to admire.

The second deceptive belief is that he was a great soldier. This is based on his bloody victory at the Battle of New Orleans at the end of the War of 1812, and his brutal campaign against the Indian tribes along the Florida border.

Actually, Jackson’s victory at New Orleans had much more to do with his opponent’s misjudgments and fatally out-of-date tactics than with his own military skills. The British marched their men straight toward the American defenses in the open formations developed for the Napoleonic Wars. They allowed themselves to become overly exposed, which led to the disproportionate number of British casualties compared with those of Jackson’s forces.

Jackson’s subsequent behavior as the officer leading the campaign against Seminole and other Indian tribes was characterized by genocidal brutality and insubordination. He consistently disobeyed the orders of his superiors.

Nonetheless, all of this helped win him the presidency in the election of 1828 and, a hundred years later, a place on the $20 bill. However, the real Andrew Jackson was a racist and the Nineteenth Century equivalent of a “thug in a suit.”

He saw himself as above the law, which is always particularly dangerous in a democratic leader. This was most clearly seen in his very public disregard of the Supreme Court’s decision favoring the right of the Cherokee Indians to remain on their land in the state of Georgia. Jackson ignored the decision despite its having the force of law, and used the U.S. Army to forcibly remove the Cherokees – not the last time a president would make himself a criminal to much popular acclaim.

There are still some today who protest against any public airing of these accusations, calling them “libels against Old Hickory.” However, that has not prevented a reexamination of Jackson’s hero status, and as a result, the man’s true nature and actions are being met with the condemnation they deserve. By 2020 Jackson’s face will no longer appear on the front of the $20 bill. He will be demoted to the bill’s reverse side.

Woodrow Wilson (aka The Schoolmaster)

Woodrow Wilson (1856-1924) was the 28th president of the United States, serving from 1913 to 1921. He was also president of Princeton University from 1902 to 1910 and governor of New Jersey from 1911 to 1913. His fame is based on the flawed notion that he was a great champion of democratic government. After all, he led the United States into World War I to “make the world safe for democracy.”

There was only one very big problem with Wilson’s conception of democracy – it was a deeply racist one. As it turns out Wilson was a Southerner transplanted to the U.S. North. He was born in Virginia and spent a good part of his formative years in Georgia and South Carolina. Not all white Southerners of his time and class were racists, but Wilson certainly was.

There is plenty of evidence of Wilson’s racist state of mind. Here are some examples: as president of Princeton, he refused to admit African-American applicants; as U.S. president, he rejected demands to desegregate the U.S. military (desegregation was finally achieved under Harry Truman in 1948); and at the Paris Peace Conference, he restricted his famous World War I pledge of national independence and democratic government for the peoples of the defeated Central Powers (the German and Ottoman empires) to the white populations of eastern Europe. He thus abandoned the peoples of the Middle East to the imperial rule of Britain and France.

But times have changed. In November 2015, Wilson’s racist legacy finally broke into the open when Princeton University’s African-American students, seeking an improved racial atmosphere on campus, occupied the university president’s office. Among their demands were that Wilson’s name be removed from campus buildings, that a mural depicting him in one of the university dining halls be taken down, and that the name of the Woodrow Wilson School of Public and International Affairs be changed. So far, Princeton has agreed only to remove the mural. But that at least is a beginning.

New Heroes

There are plenty of other U.S. heroes of political renown and aggressively poor judgment, such as Teddy Roosevelt, the 26th president of the U.S., who, while founding U.S. national parks and wildlife preserves, managed to find time to help engineer the Spanish-American War and the imperial seizure of Cuba and the Philippines.

More recently there was John Kennedy, the 35th president of the U.S. He was handsome and young and, in the early 1960s, an inspiration to the nation’s youth. However, he initiated the catastrophic U.S. intrusion into Vietnam and, taking up Teddy Roosevelt’s mantle, launched his own invasion of Cuba.

You might argue that all of the above were men of their times, and you would have a point. However, conditions have changed and with them laws and mores. Today’s professed standards of behavior really demand that we start questioning the appropriateness of these figures as national heroes. Their demotion will, hopefully, help us maintain a more humane and principled standard for our times.

All this means that we are in need of newer, more culturally and historically relevant heroes – men and women such as Martin Luther King, Nelson Mandela, Desmond Tutu, Albert Einstein, Susan B. Anthony, Rachel Carson, Angela Davis, Cesar Chavez, Daniel Berrigan, Noam Chomsky and Daniel Ellsberg, to name but a few.

Certainly one may be able to find skeletons in the closets of these people, but they will not override the humanitarian achievements that make them relevant heroes for our time.

Each of us should give serious consideration to the promotion of new heroes. And, the resulting lists can be easily customized to one’s own ideals and goals. With such an effort we help define ourselves and help make our time better than a very flawed past.

Lawrence Davidson is a history professor at West Chester University in Pennsylvania. He is the author of Foreign Policy Inc.: Privatizing America’s National Interest; America’s Palestine: Popular and Official Perceptions from Balfour to Israeli Statehood; and Islamic Fundamentalism.

How a Classic Movie Fueled US Racism

A century ago, there was a surge in lynching and other white racist violence against blacks across the American South, combined with a burst of Confederate pride, actions and attitudes fueled by the widely acclaimed movie, “The Birth of a Nation,” as William Loren Katz recalls.

By William Loren Katz

By an odd coincidence, during the first week of Black History Month this February, Time magazine ran an article on the 100th anniversary of the first public showing of the movie classic The Birth of a Nation. This 22-reel, 3-hour and 10 minute silent film was Hollywood’s first blockbuster, first great historical epic, first full-length film (when most ran for minutes, not hours), and the first to introduce modern cinematic techniques that still keep audiences enthralled.

Time noted the movie’s problem. From its casting and content to its dramatic conclusion it was unabashedly racist. (Spoiler alert: It ends with its armed KKK heroes riding to save “white civilization” from “black barbarians.”)

This first major box office hit charged a staggering $2 admission, had a special musical score played by an orchestra of 30 at each showing, and reached 50 million people before sound films appeared in 1927. Its millions in profits built Hollywood and made movies a major U.S. industry. Beyond profits, it aimed to educate the public in the values of white supremacy. Thomas Dixon, author of the novel and play on which the movie was based, stated that his goal “was to revolutionize Northern sentiments by a presentation of history that would transform every [white] man in the audience into a good Democrat!” [Back then, Democrats in the South represented the interests of white supremacy and segregation, while Republicans were still viewed as the Party of Lincoln.]

In many cities the showing stirred racist violence against African-Americans, and no wonder. White actors put on blackface and played evil African-Americans who were grasping for political power over white people, except when they were intent on raping white women. It projected an air of authenticity by using pictures of Abraham Lincoln and others from the Civil War, and quotes by noted historians such as President Woodrow Wilson.

The Birth of a Nation focused on the period of Reconstruction after the Civil War, when formerly enslaved men were allowed to vote and hold office in 11 Southern states. Dixon, a young former Baptist minister, was in love with the gallant Ku Klux Klan stories he had heard as a child and decided to write a book, a play, and a movie.

Dixon described Reconstruction as a clash between white good and black evil, when African-American men under the protection of three constitutional amendments and 25,000 federal troops were elected to office in Southern states.

But his film omits a lot: With white allies, black elected officials helped rewrite the constitutions of Mississippi and South Carolina, elected 22 black congressmen, including two senators from Mississippi, a state Supreme Court justice in South Carolina, and a host of state representatives, sheriffs, mayors, and other local officials in 10 states.

This coalition managed to introduce the South’s first public school system, and bring economic, political, and prison reforms to their states, including laws to help the poor of both races and to end racial injustice. Nonetheless, black legislators did not challenge segregation in Southern education, business or personal life.

After about half a dozen years, as the federal government largely sat silent, these governments were overthrown by KKK violence and systematic election fraud. In 1877, the federal government caved in, made a deal with former slaveholders and withdrew all troops. A democratic experiment was overthrown and white supremacy reigned again.

The Birth of a Nation sought to erase any memories of the role of African-Americans and the unity they forged with whites to bring democracy to Southern states. The film’s lesson: Race relations must remain in the hands of those who once owned, “understood,” and controlled black people. And white violence is justified to ensure this noble end.

When the movie was shown at the White House, President Wilson called it “history written in lightning.” When it was shown to members of the Supreme Court, Chief Justice Edward White proudly confided to author Thomas Dixon, “I rode with the Klan, sir.”

The movie also stirred the first large nationwide NAACP-led protests and boycotts. So many black (and white) people marched on theaters that some mayors ordered the removal of lynching and other scenes, or cancelled showings.

African-American and other historians exposed the movie’s lies, distortions, and omissions. But the most forceful challenge came in 1935 when W. E. B. Du Bois, the great African-American scholar, wrote Black Reconstruction, a thorough history of that era and a documented refutation of the film’s bigoted premise and distortions.

But old racial lies have a high survival rate. In 1950, I was a senior at Syracuse University taking a course on the Civil War and Reconstruction when I read Du Bois’s book as an assignment and wrote a highly favorable report. My professor returned it to me with one word on top: “Nuts.” One of the final exam questions for the course asked students: “Justify the actions of the Ku Klux Klan.” Thomas Dixon, Woodrow Wilson and Chief Justice White would have done well.

A hundred years after the premiere of The Birth of a Nation, the promise of Reconstruction remains unfulfilled. But this has been a century of antiracist struggle, and it has yielded important results. Sometimes we glimpse these symbolically.

In 1915, President Wilson was screening and praising a film that celebrated the Ku Klux Klan and was filled with images of grotesque racist stereotypes. In 2015, President Obama invited the black director Ava DuVernay to the White House to screen Selma, a film that shows how African-Americans fought for the right to vote through courageous activism, facing down murderous white violence.

No doubt, voting rights are under attack once again. This time not by robed Klansmen, but by well-dressed, well-educated members of Congress, the Supreme Court, Northern and Southern state legislatures, and their fabulously wealthy backers.

So it’s worth remembering that racism comes in different guises. But it’s unthinkable that a film with the racial politics of Selma could have been shown in the White House 100 years ago. And that progress is something to celebrate.

William Loren Katz is the author of 40 books on African-American history and has been associated with New York University as an instructor and Scholar in Residence since 1973. An interview with Katz about his life teaching and writing history is available on his website. He wrote this column for the Zinn Education Project.

Learning No Lessons About War

Americans like to think of themselves as a peace-loving people but their record has been one of war-making with the pace of interventions picking up in recent decades as the U.S. military and intelligence services are dispatched around the world, notes ex-State Department official William R. Polk.

By William R. Polk

America appears to be on the brink of another war. This time the conflict is likely to involve Syria and/or Iraq (which U.S. troops just left in 2011). If we jump into one or both of these wars, that will bring the number of significant military operations since American independence from Great Britain to about 200, according to my count.

Not all, of course, were officially “wars.” There also have been many “proactive” interventions, regime-change undertakings, covert-action schemes and search-and-destroy missions. In addition, the United States has provided weapons, training and funding for a variety of non-American military and quasi-military forces throughout the world, including five new African countries in recent months.

History and contemporary events show that we Americans are a warring people. So we should ask: what have we learned about ourselves, our adversaries and the process in which we have engaged? The short answer appears to be “very little.”

As both a historian and a former policy planner for the U.S. government, I will very briefly here illustrate what I mean by “very little.” (I will expand on this thesis in an upcoming book to be called A Warring People.)

I begin with us, the American people. There is overwhelming historical evidence that war is popular with us. Politicians from our earliest days as a republic, indeed even before when we were British colonies, could nearly always count on gaining popularity by demonstrating valor. Few successful politicians were pacifists.

Even supposed pacifists found reasons to engage in the use of force. Take the man most often cited as a peacemaker or at least a peace-seeker, President Woodrow Wilson. He promised to “keep us out of war,” by which he meant avoiding a big, expensive European war, the so-called Great War, better known now as the First World War.

Before becoming president, however, Wilson approved the American conquest of Cuba and the Philippines and described himself as an imperialist; then, as president, he occupied Haiti, sent the Marines into the Dominican Republic and ordered the cavalry into Mexico.

In 1917, Wilson plunged the United States into the European conflict on the side of Great Britain, France and their allies. In 1918, he put American troops into Russia following the Bolshevik Revolution.

Many Reasons

The purpose and explanation of our wars have varied. Many U.S. conflicts, particularly those against the Native Americans, would today be classified as war crimes. But strong justifications can be mounted for the Revolutionary War, the First World War and the Second World War. One could argue the United States had no real choice on the Civil War and, perhaps, the War of 1812. An argument can be made in defense of the Korean War, too.

However, it is the middle grouping of America’s wars that seems to me the most important to understand. I see them like this: Some military ventures were really misadventures, in the sense that they were based on misunderstandings or deliberate misinformation.

I think most students of history would put the Spanish-American, Vietnam, Iraq and a few other wars in this category. Basically, the government lied to us: the Spaniards did not blow up the USS Maine; the Gulf of Tonkin incident was not a dastardly attack on innocent U.S. ships; and Iraq was not about to attack with nuclear or chemical weapons, which it did not have.

But we citizens listened uncritically. We did not demand the facts. It is hard to avoid the charge that we were complicit, lazy or ignorant. And afterwards, we did not hold our government to account.

Several wars and other forms of intervention were justified by supposed local or regional requirements of the Cold War. We told one another that the “domino theory” in Indochina was real. So, any hint of Communist subversion or even criticism of U.S. policy sent us racing off to protect almost any form of political association that pretended to be on “our” side.

And we believed or feared that even countries that had little or no connection with one another would topple at the first touch of Communist contamination — even before their neighbors appeared to be in trouble. Therefore, regardless of their domestic political style — monarchy, dictatorship or democracy — these governments had to be protected.

Our “protection” often included threats of invasion, paramilitary operations, subversion, bribery and direct intervention all in support of our proclaimed intent to keep them free, at least from Soviet control or influence. A partial list of such conflicts includes Guatemala, Nicaragua, Brazil, Chile, Italy, Greece, Syria, Lebanon, Iran, Indonesia, Vietnam and various African countries.

Some interventions involved acquisition of their resources or protection of U.S. economic assets; Guatemala, Chile, Iraq, Iran and Indonesia come to mind. Few if any of these conflicts were to establish peace or even to bring about ceasefires. Those tasks we usually left to the United Nations or regional associations.

High Costs

The costs of all these conflicts have been high. Just counting fairly recent interventions – say, since World War II – they have cost America over 100,000 fatalities and some multiple of that in wounded; they have cost “the others” – both “enemies” and “friends” – even larger multiples of those numbers. The monetary cost, both to them and to us, is perhaps beyond counting; figures range upward from $10 trillion.

Beyond the staggering costs, the rate of success of these foreign conflicts has been low. Failure to accomplish the desired or professed outcome is shown by the fact that within a few years of the initial American intervention, the condition that precipitated U.S. involvement had recurred. This rate of failure has dramatically increased in recent years.

That is because we are operating in a world with heightened political sensitivities and global public awareness of these events. Today even poor, weak, uneducated and corrupt nations are galvanized by the actions of foreigners. Whereas before, a few members of the native elite made the decisions, today we face various national “fronts,” including political parties, tribes and independent opinion leaders. So the “window of opportunity” for foreign interventions that can be carried out in relative anonymity is now often shut.

I will briefly focus on five aspects of this transformation:

–Nationalism has been and remains the predominant way of political thought of most of the world’s people. Its power has long been strong (even when we called it by other names). It was given impetus by the emergence of Communism and popular demands for more equitable economic structures. Religion has also played a part. Today, nationalism in Africa, much of Asia and parts of Europe is increasingly magnified by the rebirth of Islam in the salafiyah movement.

Attempts to crush these nationalist-ideological-religious-cultural movements militarily have generally failed. When foreigners arrive on the scene, local inhabitants tend to put aside their mutual hostilities to unite against the outsiders. The U.S. saw this vividly and painfully in Somalia. The Russians saw it in Chechnya and the Chinese among the Uighur peoples of Xinjiang (former Chinese Turkistan).

–Outside intervention has usually weakened local moderate or conservative forces or at least those more stable tendencies within national movements. People espousing the most extreme positions are more likely to prevail against the invaders. Thus, particularly in protracted hostilities, extremists are more likely to take charge than their moderate domestic rivals.

We have seen this tendency in each of the guerrilla wars in which we have gotten involved. Look, for instance, at the insurgent movements in Syria and Iraq. (For my analysis of the philosophy and strategy of the Muslim extremists, see my essay “Sayyid Qutub’s Fundamentalism and Abu Bakr Naji’s Jihadism” on my website.)

What is true of the resistance movements is even more evident in the effects on civic institutions and practices within an embattled society. In times of acute national danger, the “center” does not hold. Centrists get caught between the insurgents battling both the outsiders and the regimes which may be seen as puppets of the foreigners.

Insurgents have to destroy many of the traditional social and governmental bonds to “win.” Thus, in Vietnam, for example, doctors and teachers who interfaced between the French- or U.S.-backed governments and the general population were prime targets for the Vietminh in the 1950s.

And, as the leaders of governments against whom the insurgents are fighting become more desperate, they more aggressively suppress their perceived rivals and critics, often driving these political activists, journalists and judges into the arms of the radicals. And, as the regime’s grip on power weakens, the foreign-backed leaders scramble to create safe havens for themselves by stealing money and sending it abroad. Thus, the institutions of government are weakened and the range of enemies widens.

Over the last half century or so, prominent examples of this pattern have been Vietnam and Afghanistan.

In Vietnam at least by 1962, senior members of the U.S.-backed regime had essentially given up the fight and were preparing to bolt the country. The army commanders were so focused on earning money that they sold U.S.-supplied bullets and guns to the Vietminh.

In Afghanistan, the U.S.-backed regime’s involvement in the drug trade, its draining of the national treasury into foreign private bank accounts (as even President Hamid Karzai has described) and its “pickpocketing” of hundreds of millions of dollars from aid projects are well documented. [See, for example, the monthly reports of the Special Inspector General for Afghanistan Reconstruction.]

Short Memories 

–America’s institutional memory of programs, events and trends is shallow, usually no longer than a decade. Thus, we repeat policies even when the record clearly shows that they did not work when previously tried. And we address each challenge as though it is unprecedented. We forget the American folk saying: When you find yourself in a hole, the best course of action is to stop digging.

It isn’t only that the U.S. government (and the thousands of “experts,” tacticians and strategists it hires) does not “remember” past mistakes; it often rejects the obvious lessons and decides that what it needs is a bigger shovel to dig even deeper.

–Despite America’s immigrant origins, we are a profoundly insular people. Few of us have much appreciation of non-American cultures and even less empathy for them. Within a generation or so, few immigrants can even speak the language of their grandparents. Many even shun their ethnic origins.

For example, at the end of the Second World War, despite many Americans being of German or Italian or Japanese descent, the U.S. government was markedly deficient in people who could help implement policies in those defeated countries.

Americans are even more alienated from other important world cultures. When I began to study Arabic, there were said to be only five Americans not of Arab origin who knew the language. Beyond language, broader cultural understanding petered out to near zero.

Today, after the expenditure of significant government subsidies to universities (in the National Defense Education Act) to teach “strategic” languages, the situation should be better. But, while we now know much more, I doubt that we understand people from Islamic societies much better.

Take Somalia as an example. Somalia was not, as the media put it, a “failed state”; it was and is a “non-state.” That is, the Somalis do not base their effective identity on being members of a nation-state. Like almost everyone in the world before recent centuries, they think of themselves as members of clans, tribes, ethnic or religious assemblies or territories. It is we, not they, who have redefined their political identity.

We forget that the nation-state is a concept that was born in Europe only a few centuries ago and became accepted only late in the Nineteenth Century in Germany and Italy. The idea of nationhood has remained fragile even in many parts of Europe, such as the former Yugoslavia and today’s Ukraine.

For the Somalis, it is still an alien construct. So, not surprisingly, the U.S. attempt to force them or entice them to shape up and act within our definition of statehood has not worked. And Somalia is not alone. If we peek under the flags of Indonesia, Burma, Pakistan, Afghanistan, Iraq, Congo, Mali, Sudan and other nation-states we find powerful forces of separate ethnic nationalisms.

These tensions are often made worse by arbitrarily drawn borders often dating back to colonial times when Western powers divided up the spoils of their overseas conquests and by the sophisticated tools of repression that Europe and the United States provide to many national governments.

When these governments fail to acquire legitimacy in the eyes of significant political or tribal groupings, violence often results, sometimes leading to long, debilitating conflicts with the local regimes serving essentially as proxies for Western interests, a process with ancient and troubling roots.

Since Roman times, foreign rulers have sought to save money by governing through local agents who would do the dirty work of keeping order and extracting wealth. Centuries later, the British imperialists used the Copts to collect taxes from Egyptians and assigned the Assyrians the task of controlling the Iraqi Sunnis. In the modern era, the United States has groomed local elites to manage populations within America’s broad spheres of influence.

The echoes of those years continue to reverberate in the Third World today. Ethnic, religious and economic jealousies rooted in those arrangements still abound. Americans may not be sensitive to them, but to many local inhabitants these memories remain painful.

Vast Reach

–Finally, as today’s preeminent nation-state, America has a vast reach. There is practically no area of the world where the U.S. does not have one sort of interest or another, with over a thousand military bases in more than a hundred countries. The United States also trains, equips and subsidizes dozens of armies and even more paramilitary or “special” forces.

While these economic and geopolitical interests are a source of strength and wealth, they also generate conflicts between what Americans may wish to accomplish in one country and what we think we need to accomplish in another. At the very least, handling or balancing these diverse aims within acceptable means and at a reasonable cost is a challenge, one that we seem less and less able to meet.

Take, for example, Iraq. As a corollary of U.S. hostility toward Saddam Hussein, President George W. Bush and his administration essentially turned Iraq over to Hussein’s enemies, the Iraqi Shia Muslims. (For details, see my Understanding Iraq, New York: HarperCollins, 2005, 171 ff.)

There was some justification for this policy. The Shia community has long been Iraq’s majority and because they were Saddam’s enemies, some “experts” naively thought they would become “our friends.” But immediately two negative aspects of Bush’s policy became evident.

First, the Shiites took revenge on the Sunni Muslim community and thus threw the country into a vicious civil war. What the U.S. called “pacification” often amounted to “ethnic cleansing,” as Shiites and Sunnis violently separated into their own enclaves.

Second, the Shiite Iraqi leaders (the marjiaah) made common cause with coreligionist Iranians with whom the United States had strained relations. Before the U.S. invasion of Iraq in 2003, Bush had clumsily lumped Sunni-ruled Iraq and Shiite-ruled Iran into his artificial “axis of evil.”

At several points in the U.S. military occupation of Iraq, there were opportunities to shift toward a more coherent, more moral and safer policy. But it seemed that few of the U.S. authorities even grasped the problem; certainly they did not find ways to work toward a solution of the dysfunctional policy.

When I was on the State Department’s Policy Planning Council in the early 1960s, we saw our objective as making the world at least somewhat safer, even if not exactly safe for democracy. We surely made many significant mistakes (and our advice often was not heeded by our superiors), but I would argue that we worked within a more coherent framework than the U.S. government has in recent years.

Increasingly, it seems that Washington is in a mode of leaping from one crisis to the next without having understood the first or anticipating the second. I see no strategic vision; only tactical jumps and jabs.

Constitutional Restraint

So what to do? At the time of the writing of the American Constitution, Gouverneur Morris, the principal author of the famous Preamble, remarked that one of the Framers’ goals was “to save the people from their most dangerous enemy, themselves.”

He and other delegates to the Constitutional Convention were especially frightened by the dangers of militarism and tried to restrain the temptation toward unnecessary warfare by imposing checks and balances, such as dividing the war-making powers between the Executive and the Legislature.

The nation’s early leaders, including Presidents George Washington and John Adams, certainly did not look to the military to solve problems of policy. They would have agreed, I feel sure, that very few of the problems that America faced could be solved by military means. They did what they could to keep the young country out of the ongoing conflict between France and England.

I believe many of the Framers would be horrified by the national security state that the United States has become and the gunslinger mentality of careless military actions that has taken hold.

Over the past several decades, the U.S. has frequently been misled by the successes of its postwar policies toward Germany and Japan, which helped those two countries embark upon a new era.

Perhaps consequent to those successes, when the U.S. decided to destroy the regimes of Saddam Hussein and Muammar Gaddafi, little thought was given to what would follow. U.S. policymakers just assumed that things would get better, but they did not. Instead, the societies imploded.

Had U.S. forces invaded Iran as another “regime change” experiment, the results also would have been a moral, legal and economic disaster. By now Americans should know that we should not make proactive war on foreign nations.

Beyond the practical effects, the United States has sworn not to engage in aggressive warfare as part of the treaty creating the United Nations. In short, we need to be law-abiding, and we should look before we leap. We should weigh several factors.

The first is to be realistic: there is no switch we can flip to change our capacities. To look for quick and easy solutions is part of the problem, not part of the solution.

The second is a matter of will and the costs and penalties that attach to it. We would be more careful in foreign adventures if we had to pay for them in both blood and treasure as they occurred, that is, “in real time.” We now avoid this by borrowing money abroad and by inducing or bribing vulnerable members of our society and foreigners to fight for us.

All our young men and women should know that they will be obliged to serve if we get into war, and we should not be able to defer to future generations the costs of our ventures. We should agree to pay for them through immediate taxes rather than foreign loans.

The third is to demand accountability. Our government should be legally obligated to tell us the truth. If it does not, the responsible officials should be prosecuted in our courts and, if they violate our treaties or international law, they should have to go before the International Court of Justice. We now let them off scot-free.

Punishment is reserved for some “culprits” like the guards at Abu Ghraib prison who get caught carrying out distasteful policies and for “leakers” like Pvt. Bradley (now Chelsea) Manning who reveal secret activities to the public.

Fourth, in the longer term, the only answer to the desire for better policy is better public education. For a democracy to function, its citizens must be engaged. They cannot be usefully engaged if they are not informed. Yet few Americans know even our own laws on our role in world affairs. Probably even fewer know the history of our actions abroad — that is, what we have done in the past with what results and at what cost.

Ignorance of the World

And as a people we are woefully ignorant about other peoples and countries. Polls indicate that few Americans even know the locations of other nations. And beyond geography, there is nearly a blank page when it comes to other people’s politics, cultures and traditions.

Isn’t it time we picked up the attempt made by such men as Sumner Welles (with his An Intelligent American’s Guide to the Peace and his American Foreign Policy Library), Robert Hutchins, James Conant and others (with the General Education programs in colleges and universities) and various other failed efforts to make us a part of humanity?

On the surface, at least, resurrecting these programs is just a matter of (a small amount of) money. But results won’t come overnight. Our education system is stodgy, our teachers are poorly trained and poorly paid, and we, the consumers, are distracted by quicker, easier gratifications than learning about world affairs.

I had hoped that we would learn from Vietnam and other failures, but we did not. The snippets of information which pass over our heads each day do not and cannot make a coherent pattern. Absent a matrix into which to place “news,” it is meaningless.

We are like a computer without a program. When we do get data, we lack the means to “read” it. To us, it is just gibberish.

Our biggest challenge therefore comes down to us: unless or until we find a better system of teaching, an awareness that we need to learn and a desire to acquire the tools of citizenship, we cannot hope to move toward a safer, more enriching future.

William R. Polk is a veteran foreign policy consultant, author and professor who taught Middle Eastern studies at Harvard. President John F. Kennedy appointed Polk to the State Department’s Policy Planning Council where he served during the Cuban Missile Crisis. His books include: Violent Politics: Insurgency and Terrorism; Understanding Iraq; Understanding Iran; Personal History: Living in Interesting Times; Distant Thunder: Reflections on the Dangers of Our Times; and Humpty Dumpty: The Fate of Regime Change.

Reasons for Intellectual Conformity

In theory, many people hail the idea of independent thinking and praise the courage of speaking truth to power. In practice, however, the pressure of “group think” and the penalties inflicted on dissidents usually force people into line even when they know better, as Lawrence Davidson notes.

By Lawrence Davidson

World Wars I and II created watershed moments in the lives of Western intellectuals, defined here as those who are guided by their intellect and critical thinking and who understand various aspects of the world mainly through ideas and theories which they express through writing, teaching and other forms of public address.

Just how were they to respond to the call of patriotic duty that seduced the vast majority of citizens to support acts of mass slaughter? What constituted a proper response is often debated. How most of them did respond is a matter of historical record.

During the world wars most intellectuals on all sides of the conflicts uncritically lent their talents to their government’s war efforts. Some did so as propagandists and others as scientists. Some actually led their nations into the fray, as was the case with President Woodrow Wilson.

Wilson held a doctorate from Johns Hopkins University, had taught at Bryn Mawr and Wesleyan, and became president of Princeton University. Eventually he was elected President of the United States and, having taken the nation to war, sanctioned the creation of a massive propaganda machine under the auspices of the “Committee on Public Information.” He also supported the passage of the Sedition Act of 1918 to suppress all anti-war sentiments.

Wilson never experienced combat, but another intellectual, the British poet Siegfried Sassoon, did so in the trenches of the Western Front. After this experience he wrote, “war is hell and those who initiate it are criminals.” No doubt that was his opinion of the intellectual President Woodrow Wilson.

In 1928, the French philosopher and literary critic Julien Benda published an important book, The Betrayal of the Intellectuals. In this work Benda asserted that it is the job of the intellectual to remain independent of his or her community’s ideologies and biases, be they political, religious or ethnic. Only by so doing could he or she defend the universal practices of tolerance and critical thinking that underpin civilization.

Not only were intellectuals to maintain their independence, but they were also obligated to analyze their community’s actions and, where necessary, call them into question.

However, as the memory of the intellectuals’ complicity in World War I faded, so did the memory of Benda’s standard of behavior. By World War II it held little power against the renewed demands of national governments for citizens to rally around the flag.

Thus, in that war, with even greater atrocities being committed, most intellectuals either supported the slaughter or remained silent. Some became fascists, others communists, and all too many once more lent their talents to propaganda machines and war industries in all the fighting states.

As a result, the debate over the proper role of the intellectual in relation to power and ideology continues to this day. It is not a question that needs a world war to be relevant. There are any number of ongoing situations where nationalism, ethnicity or religious views spark intolerance and violence. And with each of them the intellectuals, particularly those whose home states are involved, have to make the same age-old choice: Do they follow Woodrow Wilson’s path or that of Julien Benda?

Fate of the Jewish Intellectual

This problem has recently been raised in reference to the seemingly endless Palestinian-Israeli conflict. On April 14, Eva Illouz, a professor of sociology at Hebrew University, published an article in the Israeli newspaper Haaretz entitled, “Is It Possible to Be a Jewish Intellectual?”

In this piece, she sets forth two opposing positions: one is the Zionist/Israeli demand for the primacy of “ahavat Israel,” or the “love of the Jewish nation and people” – the claim that all Jews have a “duty of the heart” to be loyal to the “Jewish nation.” The other position is that of the lone intellectual (here her model is the philosopher Hannah Arendt), whose obligation is to maintain the “disinterested intelligence” necessary to, if you will, speak truth to power.

Illouz explains that Zionists have a “suspicion of critique” and use “the memorialization of the Shoah” (the Holocaust) and “ahavat Israel” to mute it, adding: “The imperative of solidarity brings with it the injunction to not oppose or express publicly disagreement with official Jewish bodies.”

It is within this context that she can ask if it is still possible to be a Jewish intellectual, at least as portrayed by Julien Benda. Illouz’s conclusion is that it has become exceedingly difficult to be so, particularly in the diaspora communities, where the demands for Jewish solidarity are particularly “brutal.”

Illouz is unhappy with this situation. While she feels the allure of “ahavat Israel,” she ultimately supports the position of the independent-mindedness of Benda’s thinker. She insists that the “contemporary Jewish intellectual has an urgent task to unveil the conditions under which Jewish solidarity should or should not be accepted, debunked or embraced. In the face of the ongoing, unrelenting injustices toward Palestinians and Arabs living in Israel, his/her moral duty is to let go, achingly, of that solidarity.”

Primacy of Group Solidarity 

While the portrayal of the intellectual as a thinker insisting on and practicing the right of critical thinking about society and its behavior is an ancient one (consider Socrates here), such behavior is not common in practice. This, in turn, calls Benda’s notion of a proper intellectual into question.

Thus, the description of an intellectual offered at the beginning of this essay (which is in line with common dictionary definitions) does not reference any particular direction of thought. For instance, in practice there is nothing that requires an intellectual to think about societal or government behaviors, much less take a critical public position on such matters.

And, no doubt, there are many very talented minds who, deeply involved in aesthetic matters or certain branches of scientific, linguistic, literary or other pursuits, do not involve themselves with issues of the use or abuse of power.

In addition, one might well be judged an intellectual and be a supporter or even a perpetrator of criminal policies and actions. Woodrow Wilson might fall within this category, as might Henry Kissinger, Condoleezza Rice and many others.

Indeed, from a historical perspective most people of high intellect have sought to serve power and not critique or question it. This is quite in line with the fact that most non-intellectuals accept the word of those in power as authoritative and true.

According to Eva Illouz, this reflects the primacy of group solidarity over truth. She is correct in this judgment. That, no doubt, is why the independent-minded, outspoken intellectuals demanding moral integrity and responsibility from those in power are so rare, be they Jewish or gentile.

Lawrence Davidson is a history professor at West Chester University in Pennsylvania. He is the author of Foreign Policy Inc.: Privatizing America’s National Interest; America’s Palestine: Popular and Official Perceptions from Balfour to Israeli Statehood; and Islamic Fundamentalism.

A History of False Fear

It’s always hard to get someone to speak honestly when his or her livelihood depends on not telling the truth. With the military-industrial-surveillance complex, that reality is multiplied by the billions of dollars and the many careers at stake, Joe Lauria writes.

By Joe Lauria

Despite the deep embarrassment and outrage caused by continuing revelations of the National Security Agency’s abuse of power, meaningful reform is unlikely because at heart the Edward Snowden story is about money and political power. And Snowden has threatened both.

President Obama is considering adopting some NSA reforms recommended by a White House panel. But don’t bet on him going too far.

Federal District Court Judge Richard Leon’s ruling that the controversial NSA programs are “almost Orwellian” and may be unconstitutional is encouraging. Most telling was Leon’s statement that the abusive NSA practices have not stopped one terrorist attack. But don’t count on the government to suddenly start telling the truth about the real level of the terrorist threat.

False fear is what the entire operation is built on. If the disturbing NSA programs are ultimately judged unjustified and unconstitutional and have to be shut down or curtailed, billions of dollars in contracts and careers would be at stake. And that’s why the government will continue to exaggerate the terrorism threat while pursuing Snowden.

It is the government’s last line of defense: that the NSA must do these things to protect the American people from what is really a minimal threat. “National security” is the justification to collect every American’s phone records, emails and Internet traffic and millions of other people’s around the globe.

But is it the nation’s security Snowden has risked, or the interests of a relatively few wealthy and powerful contractors and government officials? Terrorism exists. But are false fears of a rare attack being whipped up to link those powerful interests with the entire population’s, in order to win its support?

First there were the color-coded terror alerts. Obama did away with those. But we still take our shoes off at the airport and get x-rayed. Tom Ridge, the first Homeland Security chief, said he was pressured to raise the terrorism alert for political reasons. He ran an entirely new $40 billion-a-year department, with its own security force and private contracts, created because of a single major attack.

When Boston was hit, only the second significant attack in decades, paramilitary police shut down the whole city and marched innocent people out of their homes at gunpoint. Many of what the government trumpets as disrupted plots over the past few years have actually been engineered by FBI informants, stoking more unnecessary fear. And politicians, law enforcement and the media constantly chatter about terrorism, as if the next attack could happen any minute.

A device goes off every day in Iraq, Pakistan and Syria. Britain endured an IRA bombing campaign. But there’s nothing like that in the U.S. In fact, you are nine times more likely to choke to death, eight times more likely to be killed by a cop, 1,048 times more likely to die in a car crash and 87 times more likely to drown than die in a terrorist attack.

Put another way, your risk of dying from a fireworks accident is 1 in 652,046. The risk of dying from terrorism is 14 times smaller. The State Department says only 17 Americans were killed by terrorists in 2011, and that includes deaths in Iraq and Afghanistan.

A History of Hype

Hyping fear that results in profit and political power unfortunately has a long history in the United States. Mass hysteria against imagined threats for the gain of a few is ingrained in American culture.

Playwright Arthur Miller criticized the anti-communist hype of McCarthyism in The Crucible, showing that orchestrated fear about phantom threats in order to benefit a select group of people reaches back to America’s Puritan past.

To get the people behind a war that was of no concern to them, but rather to a powerful and wealthy few, President Woodrow Wilson created the Creel Committee. It was a propaganda ministry that became the precursor of modern public relations. It whipped up American fear and hatred of Germans and anyone who opposed the war.

Wilson’s repressive 1918 Sedition Act then made it a crime to use “disloyal, profane, scurrilous, or abusive language” about the government, the flag or armed services during World War I.

As Major General Smedley Butler said about the First World War: “Beautiful ideals were painted for our boys who were sent out to die. This was the ‘war to end wars.’ This was the ‘war to make the world safe for democracy.’ No one told them that dollars and cents were the real reasons. No one mentioned to them, as they marched away, that their going and their dying would mean huge war profits.” About American motives for entering the war, Butler said:

“The normal profits of a business concern in the United States are six, eight, ten, and sometimes twelve per cent. But wartime profits, ah! that is another matter, twenty, sixty, one hundred, three hundred, and even eighteen hundred percent, the sky is the limit. All that traffic will bear. Uncle Sam has the money. Let’s get it. Of course, it isn’t put that crudely in wartime. It is dressed into speeches about patriotism, love of country, and ‘we must all put our shoulder to the wheel,’ but the profits jump and leap and skyrocket, and are safely pocketed.”

Butler said the du Ponts’ average 1910-1914 profit of $6 million a year soared to $58 million a year from 1914 to 1918. “Take one of our little steel companies that so patriotically shunted aside the making of rails and girders and bridges to manufacture war materials,” he wrote of Bethlehem Steel, whose average annual profits soared from $6 million to $49 million. Profits soared for a host of other industries, feasting on the taxpayers.

Fearing the Russians

After the Second World War, careers were built on the same kind of hysteria about communism that we are now seeing about terrorism. The Soviet Union was devastated by the war. Yet U.S. administrations inflated Moscow’s military capabilities to get more military spending from Congress. That enriched a military industry that had pulled the U.S. out of the Depression.

Once the war was over the economy tanked again and there was widespread fear of a new Depression. Overblowing the Soviet threat saved the aircraft industry and military spending jumpstarted the post-war economy.

To build up this new, lucrative national security state, Truman instituted the first peacetime draft and transformed the Executive Branch, giving it much more power than the Constitution intended. In July 1947, Truman changed the country probably for good by signing the National Security Act. It set up the Defense Department, the National Security Council and the CIA. In 1952 he wrote a classified letter establishing the NSA.

A phony “missile gap” with the Soviets, bogus claims to Congress (admitted by Gen. Lucius Clay) that Moscow was planning war, and McCarthy’s communist witch hunt were among the tactics used. They cemented the surveillance state at home and the Cold War abroad, both yielding power for politicians and profits for military contractors.

With the end of the Cold War, the exaggerated terrorist threat became a convenient replacement for the Soviet Union. False fears of Saddam Hussein’s links to the 9/11 attack whipped up support for the illegal 2003 invasion of Iraq, which also did not threaten the U.S., creating a boondoggle for a plethora of new military contractors.

We saw attacks on French culture, including wine poured down sewers, hyped by the news media, because France opposed the war.

James Bamford, our most experienced writer on the National Security Agency, points out that when you drive down the Baltimore-Washington Parkway past Fort Meade, behind the trees on your right is the vast campus of the NSA. But across the street on your left are the offices of the handful of private-sector contractors that have made a bundle off the so-called War on Terror.

An estimated 80 percent of the NSA’s approximately $10 billion annual budget goes to these contractors. Personnel change hands too. James Clapper, the current director of national intelligence, was an executive at Snowden’s former employer, Booz Allen Hamilton. Mike McConnell left Booz Allen to be the first DNI and then returned to the firm after he left government. Ex-CIA director James Woolsey works there as well. The company is owned by the Carlyle Group, one of the biggest military contractors. Their incomes depend on the programs Snowden is exposing.

That stretch of the Parkway and a collection of military contractors near the Pentagon in northern Virginia form the nexus of the military-industrial complex, fueled by exaggerated fear, that President Dwight Eisenhower warned could threaten American democracy.

Truman’s Admission

Less well known is President Truman’s astounding admission. The man who was as responsible as anyone for hyping the Cold War wrote after reflecting on his life:

“The demagogues, crackpots and professional patriots had a field day pumping fear into the American people. Many good people actually believed that we were in imminent danger of being taken over by the Communists and that our government in Washington was Communist riddled. So widespread was this campaign that it seemed no one would be safe from attack. This was the tragedy and shame of our time.”

The Soviet Union at least had a massive standing army and a nuclear arsenal. It fought proxy wars with the U.S., mostly in Africa and Asia. Terrorists do not have such capabilities.

Yet the government and established media (there are media careers at stake too) hammer into us that terrorists pose an existential threat to the United States and that unconstitutional surveillance and perpetual war are therefore justified.

The rare public figure will admit the hype. Zbigniew Brzezinski, President Jimmy Carter’s national security adviser, testified to Congress in 2007 that it was a “simplistic and demagogic narrative” to compare the threat of Islamic terrorism to either Nazism or Stalinism. “Most Muslims are not embracing Islamic fundamentalism,” he said. “Al Qaeda is an isolated fundamentalist Islamist aberration.”

A more realistic danger than terrorism to Americans is other Americans with guns. There are nearly 3,000 deaths by gunfire every month in the United States. That is one 9/11 every 30 days. Yet terrorism is hyped and gun violence is explained away.

That’s because of money too. As the bodies from Columbine, Aurora and Newtown pile up, the gun manufacturers’ lobby, the National Rifle Association, plays down the role of guns because it is bad for business.

NSA director Gen. Keith Alexander says the reason there are so few terrorist attacks is the very NSA programs Snowden has exposed. He testified before Judge Leon’s ruling that at least 50 terrorist plots have been disrupted since 9/11 because of NSA surveillance. Alexander gave details about only a handful. What isn’t known is how many of these plots were actually FBI sting operations, initiated and carried out by the feds using informants.

As Federal Judge Colleen McMahon said about one of these stings: “The essence of what occurred here is that a government, understandably zealous to protect its citizens from terrorism, came upon a man [the supposed terrorism ringleader] both bigoted and suggestible, one who was incapable of committing an act of terrorism on his own.

“It [the F.B.I.] created acts of terrorism out of his fantasies of bravado and bigotry, and then made those fantasies come true. The government did not have to infiltrate and foil some nefarious plot; there was no nefarious plot to foil.”

Having covered Susan Rice as the U.S. ambassador at the U.N. since 2009, I asked her through her spokesman the following question as she prepared to leave to become National Security Advisor last summer:

“A country like Pakistan suffers a terrorist attack nearly every day but terrorism inside the U.S. has fortunately been very rare before and after 9/11. Do you believe the U.S. exaggerates the threat of terrorism, which has justified controversial NSA programs, and if so, in your new job will you work for a more realistic assessment of the terrorism threat?”

It is not surprising she wouldn’t answer. It is hard to know how many of the elite who benefit financially and politically from the surveillance state and perpetual war believe the terrorism hype themselves.

But one thing is certain. They have to keep the fear going, make an example of Snowden and stop future leaks. Their careers may depend on it.

Joe Lauria is a veteran foreign-affairs journalist based at the U.N. since 1990. He has written for the Boston Globe, the London Daily Telegraph, the Johannesburg Star, the Montreal Gazette, the Wall Street Journal and other newspapers.

How America Became an Empire

Exclusive: Director Oliver Stone and historian Peter Kuznick offer a major reexamination of modern American history in “The Untold History of the United States,” which has many strengths amid a few shortcomings, writes Jim DiEugenio in this first of a two-part review.

By Jim DiEugenio

The title of Oliver Stone’s “The Untold History of the United States” is a bit of a misnomer, both as a book and a Showtime series. It’s more precisely a reinterpretation of official U.S. history over the past century or so. You might call it “The Little Understood Back Story of America’s Imperial Era.”

The 750-page book, which seems to be more the work of Stone’s collaborator, American University history professor Peter Kuznick, picks up the tale around the time of the Spanish-American War at the end of the 19th Century, with the U.S. conquest and occupation of the Philippines.

The Showtime series, some of which is now on YouTube, is narrated by Stone and begins, more or less, with the gathering clouds of World War II and the events that led to the dropping of atomic bombs on Hiroshima and Nagasaki.

What’s relatively “untold” about this history is the impact of some little remembered decisions, such as the Democratic Party’s replacing Vice President Henry Wallace with Missouri Sen. Harry Truman in 1944, and some ideologically suppressed memories, like how the Soviet Union broke the back of Adolf Hitler’s Third Reich in World War II.

While much of this context is interesting, even revelatory for a contemporary audience, if you were expecting Stone to push the envelope with new historical disclosures on important events, such as John F. Kennedy’s presidency and his assassination, you might find the material a tad thin and disappointing.

The chief point of the book and the series, at least the first halves that I’m dealing with here, is that U.S. history could have gone in a very different and a much more positive direction if the United States had not locked itself on a course toward worldwide empire.

For instance, Stone and Kuznick imply that if Franklin Roosevelt had lived longer, or if his favored subordinate, Henry Wallace, had succeeded him as President, the worst aspects of the Cold War might have been averted.

If the United States under Harry Truman hadn’t picked up the mantle of Western imperialism from the diminished European powers, millions of lives might have been saved; the United States might have more effectively addressed its own economic and social problems; and many people in the Third World might not have been so profoundly alienated from the U.S.

Stone and Kuznick suggest that an alternative future was available to the United States, but that political, economic and ideological pressures sent the nation down a path that transformed the Republic into an Empire.

The Back Story

The back story of the Stone-Kuznick collaboration dates back to 1996, when Kuznick started an American University history class entitled “Oliver Stone’s America.” That first year, Stone made an appearance as a guest lecturer.

Kuznick and Stone then decided to cooperate on a TV documentary about the dropping of the atomic bomb on Hiroshima. This idea somehow grew into this ten-hour mini-series and its companion book. [New York Times, Nov. 22, 2012]

In an appearance with Stone on Tavis Smiley’s program, Kuznick said this history is told from the point of view of the victims, implying that it was written from the bottom up. Not so.

The book is not a sociological history written from a socio-economic perspective covering things like the plight of minorities. It does touch on those issues, but that is not its prime focus by any means.

The book’s real focus is on America’s foreign relations of the 20th Century and on the key figures who shaped or failed to shape those policies. One of the volume’s major tasks is to re-evaluate two people: Harry Truman and Henry Wallace.

This is an important historical issue because Truman replaced Wallace as Vice President in 1944 and then became President in 1945 when Roosevelt died. If Truman had not replaced Wallace, Wallace would have become President and might have shaped the post-war period very differently, with less antagonism toward the Soviet Union.

Wallace had been Secretary of Agriculture during the New Deal. And according to Arthur Schlesinger, he was very good in that position. (Stone and Kuznick, p. 91) He was then Roosevelt’s personal choice for VP in 1940.

According to the authors, FDR said he would refuse to run for President for an unprecedented third term unless Wallace joined him on the ticket. (pgs. 92-93) By all indications, Wallace was a populist.

For instance, the book contrasts the famous Henry Luce quote about the 1900s being the American Century with Wallace’s reply that it must be “the century of the Common Man.” (p. 101) The authors then contrast Wallace’s view of the Soviet Union, which was much closer to Roosevelt’s during the war, with that of Truman’s belligerence.

The Rise of Truman

How did Truman replace Wallace on the ticket in the first place? FDR’s health was already failing in 1944. This meant two things to the party bosses: 1.) He would not make it through a fourth term, and 2.) They had to stop the too-liberal Wallace from becoming President.

Realizing that Roosevelt was in a weakened state, the bosses enacted what came to be known as “Pauley’s Coup”, since it was led by California millionaire and party treasurer Edwin Pauley. (pgs. 139-40) Pauley was also running the convention and was good friends with Sen. Truman.

Pauley’s group put together a list of alternative candidates to Wallace. Truman was the name that was least objectionable to everyone. In spite of the backroom dealings, Wallace still almost survived.

Sen. Claude Pepper of Florida approached the podium to place his name in nomination. If that had been done, Wallace surely would have won by popular acclamation. But before Pepper could do so, the session was adjourned. (p. 143)

For two reasons, the authors see this as a turning point. First, they feel that the atomic bombs would never have been dropped on Japan if Wallace had become President at FDR’s death. And second, they feel that the Cold War would never have gone into high gear with Wallace in the White House.

There is certainly a lot of evidence in support of those two arguments. Truman was not really well versed in foreign policy at the time he became President. FDR had largely acted as his own Secretary of State.

And, during the war, Roosevelt had tried to serve as a kind of bumper between Stalin and the hard-line anti-communist Winston Churchill. Roosevelt and Cordell Hull, his cooperative Secretary of State, managed to hold off the hardliners, including Churchill. This arrangement fell apart once Hull retired in late 1944 and Roosevelt died in April 1945.

Suddenly, the thinly qualified Truman was in the White House and was much more malleable in the guiding hands of the hardliners. Little about Truman qualified him for the extraordinary geopolitical and moral issues he would face.

Truman had failed at three businesses before he became the creation of Missouri political boss Tom Pendergast, who started Truman off as a judge, though Truman had never graduated from law school. Pendergast then got Truman elected to the U.S. Senate.

When Roosevelt died, Truman felt overwhelmed, since he had only been VP for three months. Because Roosevelt had been ill during those months, the two men did not see each other very much.

The Hardliners Emerge

Once Roosevelt was dead, the hardliners on the Russia issue took over, including Secretary of State James F. Byrnes, Navy Secretary James Forrestal, Gen. Leslie Groves, and Churchill.

Truman began to favor Churchill and England in the allied relationship, something Roosevelt had tried to avoid. (Stone and Kuznick, p. 182) Byrnes, a South Carolina politician with little foreign policy experience, told Soviet Foreign Minister V. M. Molotov that Truman planned on using the atomic bomb to get the USSR to comply with American demands on post-war behavior. (ibid, p. 184)

Wallace, who stayed on as Secretary of Commerce, was being marginalized. Truman nominated financier Bernard Baruch to head the U.S. delegation to the United Nations Atomic Energy Commission, the body charged with the international control of atomic energy. Baruch laid down terms that all but eliminated the Soviets from joining in the effort.

Finally, Truman invited Churchill to America to make his famous “Iron Curtain” speech in March 1946. As the authors note, it was that militant, bellicose speech which “delivered a sharp, perhaps fatal blow to any prospects for post-war comity.” (p. 191)

A few months later, Henry Wallace tried to counter the sharpness of Churchill’s speech at Madison Square Garden. There, appearing with Paul Robeson and Claude Pepper, Wallace pleaded for a foreign policy that tried to understand the fears of Russia, that tried to meet her halfway. After all, he argued, Russia had been invaded twice by Germany in less than 30 years and had suffered over 20 million dead by the blitzkrieg alone.

Wallace also asked that America not follow the British imperial model in the developing world. And he added that the proper body to have far-flung foreign bases around the world was the United Nations, not the United States. (p. 201)

The speech was sharply criticized in the mainstream press as being a straight right cross to the chin of Byrnes. Even though Truman had read the speech in advance, he fired Wallace, thus eliminating one of the few remaining voices for a more conciliatory approach toward the Soviet Union. (pgs. 202-04)

The ouster of Wallace also was the death knell for any hope that FDR’s more balanced strategy toward the World War II alliance would survive into the post-war era. The same month as Wallace’s speech, Elliott Roosevelt published an article in Look detailing how Truman and Churchill had derailed his father’s plans for peace after the war. (ibid, p. 200) Churchill feared Wallace so much that he placed spies around him. (p. 138)

This aspect of the Stone-Kuznick book directly ties into the decision to use the atomic bomb. The first point to recall is one that is mentioned by the authors in passing, that the Germans had abandoned their atomic bomb research. Yet, that research was the reason that FDR approved the Manhattan Project in the first place. (p. 134)

Therefore, by 1944-45, when the testing of this devastating new weapon was approaching, the bomb’s raison d’être, serving as a deterrent to a German bomb, had disappeared. But Truman still used it on the remaining Axis belligerent, Japan.

Why Hiroshima and Nagasaki?

The question has always been: Was it necessary to use the bomb to induce Japan into surrendering? Or were diplomacy and a second-front invasion by Russia in 1945 enough to get a surrender without either the bomb or an American invasion? (A particularly good polemic against using the bomb is the late Stewart Udall’s The Myths of August.)

Soviet leader Josef Stalin had promised Roosevelt that he would open up a second front against Japan three months after Germany was defeated, and Stalin kept his promise. On Aug. 8, two days after the first U.S. atomic bomb was dropped on Hiroshima and one day before the second bomb destroyed Nagasaki, the Soviets launched a three-pronged invasion of Japanese-held Manchuria.

The Soviet invasion was so successful that the Manchurian emperor was captured, and the offensive spread to Korea, Sakhalin Island and the Kuril Islands. Stone and Kuznick note that Japan, which had already suffered devastating fire-bombings of major cities, seemed less concerned about the destruction of Hiroshima and Nagasaki than the dramatic loss of territory to an old enemy, the Russians. Emperor Hirohito announced Japan’s surrender on Aug. 15, after the Russian offensive had secured Manchuria.

The book also notes that in the war’s final months, the hardliners in Truman’s administration, like Byrnes, insisted on an “unconditional surrender” by Japan. To the Japanese, this meant the emperor had to go and that Japanese society would have to be completely restructured.

Yet, there were voices outside the White House, like Gen. Douglas MacArthur, who advised Truman to let the Japanese keep the emperor in order to make it easier for them to surrender. MacArthur was confident that maintaining the emperor would be a help and not a hindrance to rebuilding the country.

The irony of this protracted argument is that, after Hirohito’s announcement of surrender, the allies did let the emperor stay. And he reigned until his death in 1989. Indeed, Hirohito had been looking for a way to surrender since June 1945.

Today it seems fairly clear that the combination of the Soviet invasion and an altering of the unconditional surrender terms could have avoided the hundreds of thousands of deaths and maimings brought on by the two atomic bombs, and perhaps forestalled the dawn of the atomic age.

However, both Byrnes and the military commander of the Manhattan Project, Leslie Groves, admitted that they wished to use the weapons not so much to induce Japan to surrender, but to warn the Russians what they were now up against in the post-World War II world.  (Stone and Kuznick, p. 160)

As wiser men like Wallace foresaw, this threat backfired. Stalin ordered a ratcheting up of his scientific team to hurry along the Soviet version of the bomb. (ibid, p. 165)

Misreading the Soviets

Truman also miscalculated regarding the Soviet capability to duplicate the U.S. development of a nuclear bomb. When Truman asked the scientific supervisor of the Manhattan Project, Robert Oppenheimer, how long it would take for the Russians to come up with their version of the bomb, Oppenheimer said he was not sure. Truman said, “I’ll tell you. Never.”  (p. 179)

The Russians exploded their first atomic bomb just four years later. The nuclear arms race was off and running.

The other major argument in support of Truman’s decision to drop the A-Bombs on two Japanese cities has been that lives were saved by avoiding a U.S. invasion of the Japanese mainland, a project codenamed Downfall whose first phase was scheduled to begin in November 1945. In other words, there were still several months to negotiate Japan’s surrender.

The hurried-up decision to use the bomb seems to stem from the fact that Truman had told Stalin at the Potsdam Conference that the U.S. now had the weapon. (Stone and Kuznick, pgs. 162-65) So, just four days after the conclusion of Potsdam, the first bomb was dropped on Hiroshima. Then, one day after the Russians invaded Manchuria, the second bomb was dropped on Nagasaki.

Still, Stone and Kuznick recognize that their historically well-supported view is considered contrarian to mainstream U.S. history. That’s because the political and historical establishment has tried to prop up Truman as something like a good-to-near-great President.

The reason that people like George Will and Condoleezza Rice do so is fairly obvious. To them, the Cold War and the nuclear arms race were things to be thankful for. But the national mythology about Harry Truman goes further. One needs only consider the enormous success of David McCullough’s 1992 biography of the man, titled simply Truman. For me, and others, this was a 990-page appeal for Truman’s canonization.

To figure that out, one only has to compare how many pages McCullough spent on Truman’s dramatic come-from-behind victory in the presidential race of 1948 (a lot) versus how many he spent on the decision to drop the atomic bomb (far fewer). But McCullough’s book was met with great acclaim. It became a huge bestseller and was made into a TV movie, establishing McCullough as the successor to Stephen Ambrose as the agreed-upon historian for the MSM.

A Misleading Claim

The problem with the acclaim is that, as it turned out, McCullough cheated on a key point in defending Truman’s decision to use the A-Bomb. As Stone and Kuznick show, in both their book and film, Truman always (unconvincingly) maintained that the reason he dropped the bombs was to avoid an American invasion of the Japanese home islands. Truman thought that hundreds of thousands of American lives, at times he said a million, would have been lost in an amphibious assault.

In his book, McCullough tried to back up Truman’s claim, by citing a memorandum by Thomas Handy of Gen. George Marshall’s staff saying that an invasion of Japan could cost anywhere from 500,000 to a million lives. McCullough added that this memo showed “that figures of such magnitude were then in use at the highest levels.” (McCullough, Truman, p. 401)

This memo would certainly fortify Truman’s ex post facto defense. The problem is that when writer Philip Nobile went looking for the document, he couldn’t find it. McCullough had left it out of his footnotes, an omission that grew more suspicious when Stanford historian Barton Bernstein revealed that no such memo by Handy exists.

Bernstein, an acknowledged authority on Truman, told Nobile that the memo in question was actually written by former President Herbert Hoover, who was no military expert and failed to sign it. Clipped to the Hoover memo was a critique of Hoover by Handy. The critique repudiated Hoover’s estimates as being too high.

In other words, McCullough presented in his book the opposite of Handy’s meaning. Making it even worse for McCullough is the fact that Bernstein had exposed all this Handy/Hoover mishmash twice before, once in a periodical and once in a book, and that was five years before McCullough’s book was published. (Click here for Nobile’s article.)

Yet this shoddy scholarship, if that is what it was, gets ignored in this battle over, as journalist Robert Parry has termed it, the stolen historical narrative of America.

Reconsidering the Eastern Front

Another major theme of the Stone/Kuznick book is that, contrary to what textbooks and Hollywood films like Saving Private Ryan imply, World War II in Europe was not actually won by the Americans. Or the British. It was really won by the Russians.

The story of Operation Barbarossa, Hitler’s massive invasion of the Soviet Union, has been relatively ignored in high school texts, although college texts have been improving on this of late. There is little doubt today among serious military historians that the German defeats on the Eastern Front were the primary reason for the fall of the Third Reich.

In the last 20 years, with the fall of the Soviet Union, much good work has been done out of the Russian archives, allowing historians to etch into the saga of World War II the huge military campaigns on the Russian front from 1941-43. This has allowed for the proper crediting of Marshal Georgy Zhukov, the commander most responsible for thwarting Germany’s invasion of the Soviet Union.

For his battlefield successes, Zhukov deserves to be mentioned with the likes of Eisenhower, MacArthur and Montgomery as one of the icons of World War II. Yet, because he was Russian, he is generally ignored.

But it was Zhukov who wisely advised Stalin to abandon Kiev in 1941 and convinced Stalin that Leningrad was the key to their defense. It also was Zhukov whom Stalin sent to save Moscow after the original commanding officer, S. M. Budyonny, could not be located. And, most importantly, it was Zhukov who commanded the counteroffensive at Stalingrad, now widely considered the turning point of World War II. It was also Zhukov who advised the strategy that stopped the last German offensive in 1943 at the great tank battle at Kursk.

As the book notes, Hitler had assembled an invasion force of nearly four million men to attack Russia in 1941, still the largest invasion in the history of warfare. At one point, the Russians were facing about 200 divisions of the Wehrmacht. The British and Americans never faced even close to that many.

Moreover, the Eastern Front accounted for 95 percent of all Wehrmacht casualties from 1941-44, as five major battles were fought there: Kiev, Leningrad, Moscow, Stalingrad, and Kursk. After Stalingrad and Kursk, the Germans were so decimated they could launch no more offensives in the East.

The rest of the war in Europe was essentially anti-climactic. The Soviet victories on the Eastern Front had doomed the Nazis, not the fabled battles at Normandy and elsewhere on the Western Front.

Stone and Kuznick note that Stalin pressed for a second front almost immediately after the German invasion of the Soviet Union, and Roosevelt agreed. But Churchill argued for a delay in opening up a second front in France in 1942. Instead he wanted to open up a front in North Africa, which would lead to Egypt and the Middle East, thereby preserving British interests in oil and its colonial mandates there.

As a side effect, the Russians would bear the brunt of the Nazi war machine longer. (Stone and Kuznick, pgs. 104-05) In the Showtime version, Truman is quoted as saying that in his opinion if Germany was winning the battle, America ought to help Russia. He then added that if Russia started to win, the U.S. should help Germany. Truman said the idea was to kill off as many from each country as possible. This is the man David McCullough has beatified.

Assessing Wilson

Earlier in their book, Stone and Kuznick also trained their guns on another overrated president, Woodrow Wilson. Like Truman, who actually tried to join the Ku Klux Klan at one time, Wilson also was a racist who screened D. W. Griffith’s film glorifying the Klan, Birth of a Nation, in the White House.

Wilson, although ostensibly a Democrat and a progressive reformer, was really a wolf in sheep’s clothing. He once wrote, “There is nothing in which I am more interested than the fullest development of the trade of this country and its righteous conquest of foreign markets.” (Stone and Kuznick, p. 2)

Wilson also clearly favored America getting into World War I on the side of the British. As the book notes, and as Secretary of State Robert Lansing tried to conceal, the Lusitania was carrying arms to England when she was torpedoed by a German U-boat. (Stone and Kuznick, p. 6) The House of Morgan also had guaranteed so many loans to England during the war that it would have been disastrous for the American banking system if England had been defeated.

Then, once in the war, Wilson did all he could to stifle dissent against it. He set up a propaganda arm called the Committee on Public Information, headed by newspaperman George Creel. Creel’s committee also propagandized against the Russians by spreading the lie that both Trotsky and Lenin were German agents. (ibid, p. 9)

The coercion of public opinion became an enduring part of American war culture. Professors who dissented from the war were fired from Columbia University. Socialist politician Eugene Debs was imprisoned. Anti-German attitudes were encouraged and fostered by Creel’s outfit, leading to lynchings. (ibid, pgs. 11-16)

And when it was all over, Wilson largely failed to secure his sacred Fourteen Points, the basis on which Versailles was supposed to be an honorable peace, a peace, as Wilson termed it, for all time.

As the authors note, one reason Wilson failed at Versailles was that he did not make the Fourteen Points part and parcel of the United States entering the war in the first place. If he had, he would have had much more leverage.

Although Jon Wiener of The Nation has said the Stone-Kuznick book ignores or discounts the influence of Wall Street on historical events, that is not really accurate. In their discussion of the Eisenhower years, for instance, the authors sketch in the background of the Dulles brothers: John Foster, who was Ike’s Secretary of State, and Allen, who became Director of the CIA.

Both men came from the giant corporate law firm Sullivan and Cromwell. There John was managing partner and Allen was senior partner. Their interest in corporate affairs influenced the decisions the brothers made while in government. (Stone and Kuznick, pgs. 253-54)

I actually think this subject merited more space, since one can make a good case that when Allen Dulles came to power at the Agency, he more or less revolutionized the CIA and the uses to which it would be put. And this could not have been done without the help of his brother at State: Foster was personally friendly with Ike, and he would at times remove ambassadors from countries that resisted the siren song of covert action, which the brothers found so enthralling.

The Guatemalan Coup

Although I wish the authors had done more with this issue of covert action, the book does a good job in its description of the first two famous overthrows that the Dulles brothers managed, i.e. in Iran in 1953 and in Guatemala in 1954. The second account is one of the best summaries I have read.

Before he left office Guatemalan president Jacobo Arbenz accurately stated, “The United Fruit Company, in collaboration with the governing circles of the United States, is responsible for what is happening to us.” He then warned, also accurately, that Guatemala would now descend into “twenty years of fascist bloody tyranny.”

After the Guatemalan coup, John Foster Dulles applauded the victory of democracy over Soviet communism and stated the Guatemalans themselves had cured the situation. (Stone and Kuznick, p. 265)

In this chapter on the Fifties, the book also accurately states that McCarthyism was in reality fueled by FBI Director J. Edgar Hoover (ibid, pgs. 231-34), and that its real objective was to eliminate the Left in the United States so there would never be a viable socialist or communist party here.

I wish Stone and Kuznick had explicitly noted that it was not illegal to be a communist in the United States at the time of McCarthy. Therefore, what happened in the Fifties was a collapse of the whole civil liberties system that should have protected McCarthy’s victims from government-directed repression.

For me, the most disappointing chapter in the first half of the book is on John F. Kennedy. The first third of this chapter wraps up the Eisenhower years, devoting attention to Ike’s Farewell Address and its warning about “the military-industrial complex.” But the authors do not mention the U-2 incident, which wrecked the May 1960 Paris summit and may have led to what Eisenhower said in that address. (Stone and Kuznick, p. 289)

The book offers a fairly simplistic account of Kennedy’s political career prior to 1960, calling him a Cold War liberal who ran in 1960 as a hawk. This was the first time I felt the book really fell down in its scholarship, because to make this label stick, the authors omit Kennedy’s battles with Eisenhower and the Dulles brothers in the Fifties over issues like Vietnam and Algeria.

The authors then say that, under Kennedy, foreign policy was still in the hands of the Establishment figures from the Council on Foreign Relations, without saying that Kennedy was never in the CFR. Although the book does mention Kennedy’s try for a cease-fire in Laos, it completely ignores his efforts to beat back the colonialists in Congo and Indonesia in 1961.

Misreading Mongoose

The authors say Operation Mongoose against Cuba began in November 1961 and that one of the objectives was to assassinate Fidel Castro. (Stone and Kuznick, p. 304) I was really surprised to see that in a book co-authored by Oliver Stone, since the operation did not actually go into effect until February 1962, when CIA officer Ted Shackley arrived in Miami to take over the JM/Wave station. (William Turner and Warren Hinckle, Deadly Secrets, p. 126)  And as the CIA Inspector General’s report on the Castro assassination plots reveals, the killing of Castro was never part of the Mongoose operation.

The book then blames the Missile Crisis on Mongoose. (Stone and Kuznick, p. 304) Yet anyone can see by reading The Kennedy Tapes that Soviet leader Nikita Khrushchev’s agenda was really to attain a first-strike capability in order to deal with the question of Berlin. (May and Zelikow, p. 678)

The discussion of Kennedy and Vietnam is also disappointing. The book states that Kennedy was intent on standing up to the communists in Vietnam (Stone and Kuznick, p. 304), to which I would reply, “With what? Fifteen thousand advisers against the combined forces of both the Viet Cong and North Vietnam?”

I was surprised to see some of the sourcing in this chapter. In addition to citing JFK’s purported mistress Mimi Alford, much of it relied on books like David Halberstam’s obsolete and discredited The Best and the Brightest and New York Times correspondent Tim Weiner’s Legacy of Ashes. There was not one footnote to John Newman’s milestone book JFK and Vietnam, or to works based on the declassified record like James Blight’s Virtual JFK. This baffles me.

And the authors fail to mention a wonderful meeting which could have provided an ironic cap to the chapter on Kennedy (which at least does end with Kennedy seeking détente with the Russians and Cubans).

This meeting was occasioned by Harry Truman’s op-ed in the Washington Post on Dec. 22, 1963, a month after JFK’s assassination. In that essay, Truman wrote that the CIA had strayed far afield from the mission he had originally envisioned for it, i.e. an emphasis on objective intelligence gathering and analysis.

It turns out that ex-CIA Director Allen Dulles, who at the time was on the Warren Commission investigating JFK’s murder, was so upset by the op-ed’s implication that he personally visited Truman at his home in April 1964. Dulles tried to get Truman to retract the criticism.

Dulles tried to persuade Truman that newspaper articles at the time of JFK’s assassination saying the CIA had taken over Vietnam policy from Kennedy were wrong.  (James DiEugenio, Destiny Betrayed, Second Edition, pgs. 379-81)  That would have made an ironic and symmetrical tie between Truman, Kennedy and the Dulles brothers.

But despite my various concerns about its shortcomings, there is much to like in this book. The second part deals with the period from the Johnson administration to Barack Obama’s first term. Stay tuned.

Jim DiEugenio is a researcher and writer on the assassination of President John F. Kennedy and other mysteries of that era. His new book is Destiny Betrayed (Second Edition) from Skyhorse Publishing.

Stumbling into Disastrous Wars

The pundits say America’s economic angst will trump worries about war in the Nov. 6 election. However, as Americans learned a decade ago, careless foreign policies can have disastrous consequences, a lesson that ex-CIA analyst Paul R. Pillar also traces back one and two centuries.

By Paul R. Pillar

As this year’s presidential campaign turns to debates about certain foreign conflicts and controversies with the potential for sucking the United States into war, here is an anniversary-based fact that does not seem to have received notice, certainly nothing like the attention the Cuban missile crisis received this month upon its semicentennial.

The presidents of the United States who were elected 200 and 100 years ago both led the nation into war. Both did so despite earlier indications of personal hesitation and reservation in doing so.

The United States entered the War of 1812 against Britain during the final year of James Madison’s first term. The impetus for war came principally from congressional war hawks from the West and South such as Henry Clay and John C. Calhoun.

When Madison sent to Congress in June 1812 what became known as his war message, it did not explicitly ask for a declaration of war. Instead it only listed the maritime and other grievances that the nation had against the British. Congress did declare war, and in “Mr. Madison’s War” Madison would become the only U.S. president to be chased out of the White House by foreign troops.

A century later, as the European powers sank into the carnage of World War I, President Woodrow Wilson said to his aide Colonel House, “Madison and I are the only Princeton men that have become presidents. The circumstances of 1812 and now run parallel. I sincerely hope they will not go further.”

Wilson was elected to a second term in 1916 aided by the slogan, “He kept us out of war.” But only a few months later he asked for and received a congressional declaration of war. The United States was on the winning side of that war, but mismanagement of the aftermath set the stage for another ghastly European war two decades later.

Notwithstanding the many differences, there are some parallels between the circumstances of a century and two centuries ago and those of today. Let us sincerely hope the parallels will not go further.

Paul R. Pillar, in his 28 years at the Central Intelligence Agency, rose to be one of the agency’s top analysts. He is now a visiting professor for security studies at Georgetown University. (This article first appeared as a blog post at The National Interest’s Web site. Reprinted with author’s permission.)