Bob Parry: Holding Government and Media to Account

A memorial was held on Saturday for Robert Parry, the late founder and editor of this web site.  Among the speakers paying tribute to Bob was Joe Lauria, the new editor of Consortium News.

By Joe Lauria

If you watch Bob’s various talks available on YouTube you’ll see that he was often asked why he started Consortium News. Bob says, essentially, that he got fed up with the resistance he faced from editors who put obstacles in the way of his stories, often of great national significance. One editor at Newsweek told him they were suppressing a story for “the good of the country.” The facts he’d unearthed went too far in exposing the dark side of American power. His editor was speaking, of course, about what was for the good of the rulers of the country, not the rest of us. As we just heard from John Pilger, Bob created a consortium for journalists who ran up against similar obstruction from their editors: a place for them to publish what they could not get published in the mainstream.

Sixteen years after Bob launched Consortium News with Sam and Nat I became one of those journalists. I’d had similar experiences. When I covered the diplomacy at the U.N. leading up to the 2003 invasion of Iraq for a Canadian chain that published the Montreal Gazette, Ottawa Citizen, and other papers, I gave equal weight in my stories to the German, French and Russian opposition on the Security Council to the invasion.  So the chain’s foreign editor called me up one day from Ottawa to berate me for not supporting the war effort in my reporting.
He told me his son was a Marine. I told him I was certain he was proud of him, but my job was not to support the war but to report objectively on what was happening at the Security Council. The Bush administration never got its resolution. But it invaded anyway. It was illegal under international law, as Kofi Annan finally said after being pressured by a BBC interviewer. Annan was then hounded to the point of a near nervous breakdown by the likes of then UN Ambassador John Bolton, who, unfortunately, has since gotten a promotion. I, on the other hand, was fired on the day of the invasion.

Later, while covering the U.N. for The Wall Street Journal, I found that several of my stories were suppressed or inconvenient facts were edited out. One was a story, twice rejected, on a declassified Defense Intelligence Agency document that predicted the rise of ISIS back in 2012 but was ignored in Washington. It said the U.S. and its allies in Europe, Turkey and the Gulf were supporting a Salafist principality in eastern Syria that could turn into an Islamic State. Such a story would undermine the government’s war on terrorism.

In another instance, my editors repeatedly removed from my stories, on the UN vote on Palestine’s observer status, a line indicating that 130 nations had already recognized Palestine. At that point I realized the Journal had an agenda—not to neutrally report complex international events from multiple sides, but to promote US interests abroad. So I turned to Bob and he accepted a piece from me on that Palestine issue in late 2011, the first of many of my articles that he eventually published.

Bob was without doubt the best editor I’ve ever had. He was the only one who really understood—or accepted—what I was writing about.

Bob was a supreme skeptic, but he never descended to cynicism. His legacy, which I am committed to carrying on, was one of a principled, non-partisan approach to journalism. He took a neutral stance reporting on international issues, which some wrongly saw as anti-American. Bob knew never to take a government official’s word for it, especially an intelligence official. He knew people in all governments lie. But there are two other parties involved: the press and the public. He understood that the press had to act as a filter, to verify and challenge government assertions, before passing them on to the public. Bob became distraught, and in his last piece poignantly said so, about the state of American journalism, where careerism and vanity had aligned the profession with those in power, a power through which too many reporters seem to live vicariously.

The press’ power is distinct from the government’s: it is the power to hold government accountable on behalf of the public. Bob understood that the mainstream media’s greatest sin was the sin of omission: leaving out of a story, or marginalizing, points of view at odds with a U.S. agenda, but vital for the reader to comprehend a frighteningly complex world.

The viewpoints of Iranians, Palestinians, Russians, North Koreans, Syrians and others are never fully reported in the Western media, though the supposed mission of journalism is to tell all sides of a story. It’s impossible to understand an international crisis without those voices being heard. Routinely or systematically shutting them out also dehumanizes people in those countries, making it easier to gain popular support in the U.S. to go to war against them.

The omission of such news day after day in newspapers and on television adds up over the decades to what Bob called the Lost History of post-war America. It is a dark side of American history—coups overthrowing democratically-elected leaders, electoral interference, assassinations and invasions. Omitting that history, as it continues to unfold nearly every day, gives the American people a distorted view of their country, an almost cartoonish sense of America’s supposed morality in international affairs, rather than it just pursuing its interests, too often violently, as all great powers do.

These things aren’t normally mentioned in polite society. But Bob Parry built his extraordinary career telling those truths. And I’m going to do my damnedest to continue, and honor, his legacy.

Thank you.




America’s Complicated Relationship with International Human Rights Norms

The U.S. has long had a love-hate relationship with international norms, having taken the lead in forging landmark human rights agreements while brushing off complaints over its own abuses, Nat Parry explains.

By Nat Parry

American exceptionalism – the notion that the United States is unique among nations due to its traditions of democracy and liberty – has always been the foundation of the nation’s claim to moral leadership. As a country founded on ideals that are today recognized the world over as fundamental principles of international norms, the U.S. utilizes its image as a human rights champion to rally nations to its cause and assert its hegemony around the world.

Regardless of political persuasion, Americans proudly cite the influence that the founding principles laid out in the Declaration of Independence and the Bill of Rights have had on the rest of the world, with 80 percent agreeing that “the United States’ history and its Constitution … makes it the greatest country in the world” in a 2010 Gallup poll. Respecting these principles on the international level has long been considered a requisite for U.S. credibility and leadership on the global stage.

Much of this sentiment is an enduring testament to U.S. leadership following World War II, a period in which international legal principles of human rights and non-aggression were established, as well as the four decades of the Cold War, in which the “free world,” led by the United States, faced off against “totalitarian communism,” led by the Soviet Union.

During those years of open hostility between East and West, the U.S. could point not only to its founding documents as proof of its commitment to universal principles of freedom and individual dignity, but also to the central role it played in shaping the Charter of the United Nations and the Universal Declaration of Human Rights.

Fourteen Points and Four Freedoms 

While the U.S. didn’t fully assume its position as moral arbiter until after the Allied victory in World War II, its role in these matters had already been well-established with Woodrow Wilson’s professed internationalism. As expressed in his famous “Fourteen Points,” which sought to establish a rationale for U.S. intervention in the First World War, the United States would press to establish an international system based on “open covenants of peace, openly arrived at, after which there shall be no private international understandings of any kind but diplomacy shall proceed always frankly and in the public view.”

Wilson had seen the First World War as evidence that the old international system established by the Europeans had failed to provide necessary security and stability, and sought to replace the old diplomacy with one based on cooperation, communication, liberalism and democracy.

Speaking on this issue throughout his presidency, he consistently advocated human rights and principles of self-determination.

“Do you never stop to reflect just what it is that America stands for?” Wilson asked in 1916. “If she stands for one thing more than another, it is for the sovereignty of self-governing peoples, and her example, her assistance, her encouragement, has thrilled two continents in this Western World with all the fine impulses which have built up human liberty on both sides of the water.”

These principles were expanded upon by subsequent American administrations, and especially by President Franklin Delano Roosevelt. In his January 1941 State of the Union address, Roosevelt spelled out what he called “the Four Freedoms,” which later became the foundation for the Universal Declaration of Human Rights.

“In the future days,” he said, “which we seek to make secure, we look forward to a world founded upon four essential human freedoms.”

He continued: “The first is freedom of speech and expression – everywhere in the world. The second is freedom of every person to worship God in his own way – everywhere in the world. The third is freedom from want – which, translated into world terms, means economic understandings which will secure to every nation a healthy peacetime life for its inhabitants – everywhere in the world. The fourth is freedom from fear – which, translated into world terms, means a world-wide reduction of armaments to such a point and in such a thorough fashion that no nation will be in a position to commit an act of physical aggression against any neighbor – anywhere in the world.”

Following the Allied victory over the Axis powers, FDR’s widow Eleanor Roosevelt took her late husband’s vision and attempted to make it a reality for the world through the Universal Declaration of Human Rights. Chairing the Commission on Human Rights, a standing body of the United Nations constituted to undertake the work of preparing what was originally conceived as an International Bill of Rights, Eleanor Roosevelt pushed to ensure that FDR’s “four freedoms” were reflected in the document.

Under Roosevelt’s leadership, the Commission decided that the declaration should be a brief and inspirational document accessible to common people, and envisioned it to serve as the foundation for the remainder of an international bill of human rights. It thus avoided the more difficult problems that had to be addressed when the binding treaty came up for consideration, namely what role the state should have in enforcing rights within its territory, and whether the mode of enforcing civil and political rights should be different from that for economic and social rights.

As stated in its preamble, the Universal Declaration of Human Rights is “a common standard of achievement for all peoples and all nations, to the end that every individual and every organ of society, keeping this Declaration constantly in mind, shall strive by teaching and education to promote respect for these rights and freedoms and by progressive measures, national and international, to secure their universal and effective recognition and observance, both among the peoples of Member States themselves and among the peoples of territories under their jurisdiction.”

Much of the language in the Declaration echoed language contained in the founding documents of the United States, including the Declaration of Independence and the Bill of Rights. Whereas the U.S. Declaration of Independence articulates the “unalienable right” to “life, liberty and the pursuit of happiness,” the Universal Declaration of Human Rights states that “everyone has the right to life, liberty and security of person.”

While the First Amendment to the U.S. Constitution prohibits Congress from “abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble,” the UDHR provides that “everyone has the right to freedom of opinion and expression” and that “everyone has the right to freedom of peaceful assembly and association.” Whereas the Eighth Amendment forbids “cruel and unusual punishments,” the UDHR bars “cruel, inhuman or degrading treatment or punishment.”

Although the United States made it clear that it could not support a legally binding UDHR, it readily endorsed the final document as a political declaration, one of 48 nations to vote in favor of the Declaration at the UN General Assembly in December 1948. With no votes in opposition and just eight abstentions – mostly from Eastern Bloc countries including the Soviet Union, Yugoslavia and Poland – the Declaration served as a defining characteristic of the contrast between East and West in those early days of the Cold War.

A Small Problem 

There was of course one small problem. Despite the United States formally embracing “universal human rights” on the international stage, its respect for those rights domestically was considerably lacking. Throughout the country and especially in the South, African Americans endured racist segregation policies and were routinely denied the right to vote and other civil rights.

Lynching, while not as pervasive as its heyday earlier in the century, was still a major problem, with dozens of blacks murdered with impunity by white lynch mobs throughout the 1940s.

In 1947, the National Association for the Advancement of Colored People (NAACP) filed an “Appeal to the World” petition in the United Nations that denounced racial discrimination in the United States as “not only indefensible but barbaric.” The American failure to respect human rights at home had international implications, argued the NAACP. “The disenfranchisement of the American Negro makes the functioning of all democracy in the nation difficult; and as democracy fails to function in the leading democracy in the world, it fails the world,” read the NAACP petition.

The NAACP’s appeal provoked an international sensation, with the organization flooded with requests for copies of the document from the governments of the Soviet Union, Great Britain and the Union of South Africa, among others. According to NAACP chief Walter White, “It was manifest that they were pleased to have documentary proof that the United States did not practice what it preached about freedom and democracy.”

The U.S. delegation to the UN refused to introduce the NAACP petition to the United Nations, fearing that it would cause further international embarrassment. The Soviet Union, however, recommended that the NAACP’s claims be investigated. The Commission on Human Rights rejected that proposal on December 4, 1947, and no further official action was taken.

According to W.E.B. DuBois, the principal author of the petition, the United States “refused willingly to allow any other nation to bring this matter up.” If it had been introduced to the General Assembly, Eleanor Roosevelt would have “probably resign[ed] from the United Nations delegation,” said DuBois. This was despite the fact that she was a member of the NAACP board of directors. While Roosevelt’s commitment to racial justice may have been strong, it was clear that her embarrassment over the U.S.’s failures to respect the “four freedoms” at home was even stronger.

It was in this context that the United States endorsed the Universal Declaration of Human Rights in 1948. That year also marked the beginning of tentative steps the U.S. began making towards respecting basic rights within its borders.

On July 26, 1948, President Harry Truman signed Executive Order 9981, which ended segregation in the U.S. Armed Forces. The next month, the Democratic Party included a civil rights plank in its platform. “The Democratic Party,” read the platform adopted at the 1948 Democratic National Convention, “commits itself to continuing its efforts to eradicate all racial, religious and economic discrimination.”

While there was clearly a domestic motivation for embracing the cause of civil rights (presidential adviser Clark Clifford had presented a lengthy memorandum to President Truman in 1947 which argued that the African-American vote was paramount for winning the 1948 election), there was also a strong international component to the Democratic Party’s support for civil rights.

UN Bragging Rights 

In addition to its civil rights plank, the 1948 Democratic platform included a wholehearted endorsement of the recently established United Nations, and expressed “the conviction that the destiny of the United States is to provide leadership in the world toward a realization of the Four Freedoms.” But the Democrats recognized that the U.S. had a long way to go toward realizing those four freedoms at home.

“We call upon the Congress to support our President in guaranteeing these basic and fundamental American Principles: (1) the right of full and equal political participation; (2) the right to equal opportunity of employment; (3) the right of security of person; (4) and the right of equal treatment in the service and defense of our nation,” the platform stated.

The Democratic platform also proudly pointed to the accomplishment of organizing the United Nations: “Under the leadership of a Democratic President and his Secretary of State, the United Nations was organized at San Francisco. The charter was ratified by an overwhelming vote of the Senate. We support the United Nations fully and we pledge our whole-hearted aid toward its growth and development.”

For its part, the Republican Party also embraced the fledgling UN, stating in its 1948 platform that “Our foreign policy is dedicated to preserving a free America in a free world of free men. This calls for strengthening the United Nations and primary recognition of America’s self-interest in the liberty of other peoples.” While the Democrats pointed to the president’s leadership for helping establish the UN, the Republicans also wanted to make sure that they received due credit. Their party platform listed “a fostered United Nations” as one of the main accomplishments of the Republican Congress, despite “frequent obstruction from the Executive Branch.”

As “the world’s best hope” for “collective security against aggression and in behalf of justice and freedom,” the Republicans pledged to “support the United Nations in this direction, striving to strengthen it and promote its effective evolution and use.” The UN “should progressively establish international law,” said the Republicans, “be freed of any veto in the peaceful settlement of international disputes, and be provided with the armed forces contemplated by the Charter.”

As a major component of the progressive establishment of international law, the Universal Declaration of Human Rights was to be codified into legally binding treaties.

Although the Declaration was endorsed by the U.S. and 47 other countries in December 1948, the two corresponding legally binding covenants to define the obligations of each state required nearly two decades of further work. The International Covenant on Civil and Political Rights and the International Covenant on Economic, Social and Cultural Rights were ready for ratification in 1966, some 18 years later.

The United States became a signatory to both covenants on Oct. 5, 1977. It ratified the ICCPR on June 8, 1992, but to this date has not fully subscribed to the ICESCR, one of just seven countries in the world not to ratify the agreement.

Cold War Context 

Throughout those years, the U.S. was engaged in an intense ideological battle with the Soviet Union, in which human rights were used as a rhetorical weapon by each side against the other. While American leaders chastised the Soviets for their failures to respect fundamental liberties, including freedom of religion, freedom of speech and freedom of association, the USSR could readily point to the blatant institutionalized racism that plagued American society.

Racial discrimination belied America’s rhetoric about democracy and equality, making the U.S. cause of freedom look like a sham especially to people of color in Africa, Asia, and Latin America. The Soviets enthusiastically exploited the issue, imbuing their anti-capitalist propaganda with tales of horrors suffered by African Americans.

So, in 1954, when the U.S. Supreme Court ruled in the case of Brown v. Board of Education of Topeka that segregated schools were unconstitutional and ordered that school integration proceed “with all deliberate speed,” the case was trumpeted by the American establishment as evidence of the great strides being made toward full equality for all citizens.

At times, racial discrimination in the United States caused such international embarrassment that the State Department would pressure the White House to intervene. In 1957, for example, when a Federal District Court ordered the all-white Central High School in Little Rock, Arkansas, to allow African-American students to attend, Governor Orval Faubus declared that he would refuse to comply with the decree. Several hundred angry and belligerent whites confronted nine African-American students who attempted to enter the school on September 4, 1957.

The National Guard, called up by Faubus, blocked the students from entering the school. Pictures of the angry mob, the frightened African-American students, and armed National Guardsmen were seen all over the world, and the Soviets eagerly seized on the propaganda.

Secretary of State John Foster Dulles informed President Dwight Eisenhower that the Little Rock incident was damaging the United States’ credibility abroad, and could cost the U.S. the support of other nations in the UN. Eisenhower attempted to negotiate a settlement with Faubus, but when that failed, he sent in federal troops. The nine African-American students were finally allowed to attend Central High under the armed protection of the United States military.

The developing international human rights project led to deep ideological divisions in the United States, with some conservatives, especially in the South, concerned that the national government would use international human rights law to promote national civil rights reforms. Arguing that the civil rights question was beyond the scope of Congress’s authority and concerned about the constitutional power of treaties, conservatives launched several attempts in the 1950s to amend the U.S. Constitution to limit the government’s ability to subscribe to treaties.

Those failed efforts to amend the Constitution were based on the premise that the federal government had no say in the matters of states and localities in regulating race relations, and that since Article VI of the Constitution grants treaties the status of “supreme law of the land,” the U.S. would find itself subjected to the whims of the international community on these matters.

Those fears would prove unfounded, since the U.S. didn’t formally subscribe to the International Covenant on Civil and Political Rights until 1977, long after most of the relevant domestic civil rights legislation had been adopted, but the right-wing opposition to U.S. submission to international norms had become thoroughly established as American conservative orthodoxy.

Nat Parry is co-author of Neck Deep: The Disastrous Presidency of George W. Bush.




The King Assassination Case and the Mueller Probe

Fifty years after the King assassination, Americans still have a hazy view of the House Select Committee on Assassinations’ findings, an ambiguous understanding that may end up characterizing American views on Robert Mueller’s probe as well, Bob Katz explains.

By Bob Katz

What is our official conclusion about the Martin Luther King assassination? Or rather, after all this time, is there an “official” conclusion? The answer to that goes beyond mere historical curiosity. For the murky ambiguities that define this case, coupled with an evident fondness among Americans for simplified, easy-reader versions of wrenching events, could well foreshadow the ultimate outcome of another critical probe 50 years later – Special Counsel Robert Mueller’s investigation into alleged collusion between Donald Trump and the Russian government to sway the outcome of Election 2016.

When it comes to the April 4, 1968 assassination of Dr. King, James Earl Ray is the name that pops up first in the minds of most Americans, as well as in Google searches and history textbooks. An oft-convicted thief, Ray managed to elude a massive international manhunt for two months before being captured in London while trying to board a plane to Brussels. Questions concerning his finances, travels, and possible collusion with others have always surrounded the case, although Ray’s culpability is widely assumed.

The House Select Committee on Assassinations, the most comprehensive formal investigation into King’s murder, and the only one with subpoena power, concluded in 1979 that “there is a likelihood that James Earl Ray assassinated Dr. Martin Luther King Jr. as a result of a conspiracy.”

Ray never stood trial. Soon after his arrest he pled guilty. Three days later, he attempted to withdraw the plea, a quest that consumed much of the rest of his life. The HSCA report, therefore, stands as the single most authoritative interpretation of the case, and the closest thing we have to a definitive last word. Yet relatively few Americans have heard of the HSCA or, if they have, know much at all about its findings.

On the occasion of the 50th anniversary of King’s assassination, it’s worth asking what’s behind this erasure, this gradual airbrushing of the HSCA findings from the historical record. It could happen again, after all: the virtual deletion from public memory of an official investigation into a crucial national mystery. (Just saying.)

House Select Committee on Assassinations

The HSCA spent two years in the late 1970s investigating the King assassination as well as that of President Kennedy. Funded by Congress and headed by Robert Blakey, a Notre Dame law professor and former Justice Department official with an expertise in organized crime prosecutions, the HSCA had its own professional staff and unprecedented access to police and intelligence agency files.

On August 16, 1978, James Earl Ray was brought to the Rayburn Office Building on Capitol Hill to testify. His appearance, some ten years after the murder that traumatized the country and snuffed out one of America’s leading voices for peace and justice, was intensely anticipated.

Every major news outlet, print and electronic, was present. Rev. Jesse Jackson, who had been at King’s side on the balcony of the Lorraine Motel that fateful spring evening, took what probably counted as a box seat, behind Ray, as close as he could get. I too was there, in the gallery, working with a public interest group that monitored the hearings.

Flanked by seven U.S. Marshals, Ray entered the hearing room to stone silence as spectators and media were commanded to remain seated and stationary. He calmly raised his right hand to take the oath, this unassuming figure already a peer of John Wilkes Booth and Lee Harvey Oswald in the pantheon of American villains. Seeing Ray in person was like seeing a ghost.

But this ghost was stripped of all standard trappings of creepiness. There was no eerie musical soundtrack accompanying his entrance. He wore a striped tie with a blue-green checkered sport coat that might have made a positive impression on a Missouri parole board in the 1950s. His dark hair was combed in a wave and tapered above the ears to reveal graying sideburns. With darting eyes and a tight-lipped grimness, he appeared just handsome enough to have landed an audition for the role of a petty burglar in a “Law & Order” episode.

Peppered with questions from the committee chair, Louis Stokes of Ohio, Ray nervously gave answers with varying degrees of forthrightness concerning his racial animus (he professed none and investigators also found little evidence of this); his finances while on the run (smuggling and odd jobs were his explanation – the HSCA believed Ray and one of his brothers robbed an Alton, Illinois, bank of $27,000 in July 1967); and accomplices (Ray insisted that a blond Latino named “Raoul” directed much of his activity, including the purchase of a rifle and a road trip that brought him to Memphis on April 3 – investigators believed Ray’s brothers John and Jerry, both petty criminals, assisted him).

It was, alas, no ghost story. There was no “aha!” moment of reckoning, no Hollywood ending.

A Disappointingly Obscure Scoundrel

Regarding its investigation of a conspiracy, the HSCA explicitly implicated a St. Louis lawyer named John Sutherland who’d been active in such segregationist groups as the St. Louis Citizens Council, the Southern States Industrial Council, and the American Independent Party of George Wallace. Within these networks, Sutherland was reported to have circulated a “serious” offer to have King killed, coupled with the promise of a $50,000 reward.

Sutherland, who died in 1970 and was never interrogated, proved a disappointingly obscure scoundrel for story-telling purposes. And the HSCA, commendably circumspect, employed language that was hardly meant to excite headlines:

“James Earl Ray may simply have been aware of the offer and acted with a general expectation of payment after the assassination; or he may have acted, not only with an awareness of the offer, but also after reaching a specific agreement, either directly or through one or both brothers, with … Sutherland. The legal consequences of the alternative possibilities are, of course, different. Without a specific agreement with the Sutherland group, the conspiracy that eventuated in Dr. King’s death would extend only to Ray and his brother(s); with a specific agreement, the conspiracy would also encompass Sutherland and his group.”

The upshot: no riveting narrative arc, no snappy logline. The HSCA findings have thus been consigned to history’s dustbin, invisible to all but scholars and buffs, doomed by poor ratings. It was a classic show biz failure, a failure to recognize that its attention-deficit audience – we the people – prefers explanations that are neatly wrapped and sound-bite succinct.

Obviously the HSCA was handicapped by strict adherence to the known facts, which turned out to be convoluted and puzzling. No scriptwriter with blockbuster dreams would ever want to be so confined. “Inspired by a true story,” whatever that means, is where the real action is.

Which brings us to the Mueller probe. It may yet yield high-profile trials for dastardly offenses, and wouldn’t that be nice. Absent an A-list conviction, the Mueller investigation seems susceptible to the same factors that effectively sidelined the King findings. Too many confounding footnotes, too many loose threads, and an assortment of two-bit bad guys standing in, but for whom?

All available box office evidence suggests that Americans crave political dramas that are sharply plotted, easy to follow and seamlessly resolved. The ambiguous kind? Not so much. The truth, in the long run, may not be an ideal vehicle for maximizing audience share.

If in the end Mueller demonstrates only that vile crimes were perpetrated with craven or treasonous intent by despicable actors plausibly though not provably affiliated with the White House, what will be the popular understanding of the Trump-Russia-election saga ten years, twenty years from now? Especially when a far less complicated account – NO COLLUSION! – gets blasted from the loudest megaphone known to humankind.

Bob Katz was involved in monitoring the HSCA investigation and was present for James Earl Ray’s testimony. He is the author of several books, and his writing has appeared in the New York Times, Boston Globe and Chicago Tribune, as well as Consortium News. His most recent book is The Whistleblower: Rooting for the Ref in the High-Stakes World of College Basketball (see BobKatz.info).




In Case You Missed…

Some of our special stories in February focused on the release of the so-called “Nunes Memo,” the US system of perpetual warfare, and the growing risk of confrontations in Syria, North Korea and Iran.

“Outpouring of Support Honors Robert Parry” Feb. 1, 2018

“U.S. Media’s Objectivity Questioned Abroad” by Andrew Spannaus, Feb. 2, 2018

“Nunes Memo Reports Crimes at Top of FBI and DOJ” by Ray McGovern, Feb. 2, 2018

“‘Duck and Cover’ Drills Exacerbate Fears of N. Korea War” by Ann Wright, Feb. 3, 2018

“Do We Really Want Nuclear War with Russia?” by Robert Parry, Feb. 4, 2018

“Recipe Concocted for Perpetual War is a Bitter One” by Robert Wing and Coleen Rowley, Feb. 4, 2018

“WMD Claims in Syria Raise Concerns over U.S. Escalation” by Rick Sterling, Feb. 4, 2018

“Connecticut Court Decision Highlights U.S. Educational Failures” by Dennis J. Bernstein, Feb. 5, 2018

“Understanding Russia, Un-Demonizing Putin” by Sharon Tennison, Feb. 6, 2018

“Did Al Qaeda Dupe Trump on Syrian Attack?” by Robert Parry, Feb. 6, 2018

“No Time for Complacency over Korea War Threat” by Jonathan Marshall, Feb. 7, 2018

“‘This is Nuts’: Liberals Launch ‘Largest Mobilization in History’ in Defense of Russiagate Probe” by Coleen Rowley and Nat Parry, Feb. 9, 2018

“A Note to Our Readers” by Nat Parry, Feb. 10, 2018

“How Establishment Propaganda Gaslights Us Into Submission” by Caitlin Johnstone, Feb. 12, 2018

“Budget Woes Sign of a Dysfunctional Empire” by Jonathan Marshall, Feb. 13, 2018

“The Right’s Second Amendment Lies” by Robert Parry, Feb. 16, 2018

“NYT’s ‘Really Weird’ Russiagate Story” by Daniel Lazare, Feb. 16, 2018

“Russians Spooked by Nukes-Against-Cyber-Attack Policy” by Ray McGovern and William Binney, Feb. 16, 2018

“Nunes: FBI and DOJ Perps Could Be Put on Trial” by Ray McGovern, Feb. 19, 2018

“U.S. Empire Still Incoherent After All These Years” by Nicolas J.S. Davies, Feb. 20, 2018

“Time to Admit the Afghan War is ‘Nonsense’” by Jonathan Marshall, Feb. 22, 2018

“Donald Trump v. the Spooks” by Annie Machon, Feb. 23, 2018

“Selective Outrage Undermines Human Rights in Syria” by Jonathan Marshall, Feb. 23, 2018

“The Mueller Indictments: The Day the Music Died” by Daniel Lazare, Feb. 24, 2018

“Growing Risk of U.S.-Iran Hostilities Based on False Pretexts, Intel Vets Warn” by Veteran Intelligence Professionals for Sanity, Feb. 26, 2018

“Who Benefits from Russia’s ‘Peculiar’ Doping Violations?” by Rick Sterling, Feb. 26, 2018

To produce and publish these stories – and many more – costs money. And except for some book sales, we depend on the generous support of our readers.

So, please consider a tax-deductible donation either by credit card online or by mailing a check. (For readers wanting to use PayPal, you can address contributions to our PayPal Giving Fund account, which is named “The Consortium for Independent Journalism”).




Capitalism’s Process of Universal Commodification

The Marvel/Disney movie “Black Panther” is the latest example of an idea with anti-capitalist origins being co-opted for corporate commodification and profit, explains Lawrence Davidson in this analysis.

By Lawrence Davidson

Paradoxical Profit

Unless regulated, capitalism operates as a wide-open market system. If a demand exists or can be created and a profit made, that demand will be met. As a consequence, capitalism has the capacity to commercialize almost anything, including its detractors and even its enemies.

Here are some examples:

Che Guevara, the iconic Marxist revolutionary. He was young and handsome when he served at the side of Fidel Castro during the Cuban Revolution in the 1950s. Today, most people outside of Cuba know of him only as an image on T-shirts, backpacks and posters. He has been immortalized at a profit by the economic system he despised.

Wall-E, a 2008 animated movie about an “adorable robot” left behind on Earth after mankind abandons the planet. It seems that humans have reduced their home to a garbage heap, and Wall-E (short for “Waste Allocation Load Lifter Earth-Class”) has the job of cleaning the place up. Ironically, the movie warns us of the dangers of commercialism while still managing to gross $533.3 million worldwide. Half of that came from audiences in the U.S., the homeland of “shop till you drop.”

Apple’s “Think Different” sales campaign. This promotion of Apple products opens with the line, “Here’s to the crazy ones.” This is followed by images of Einstein, Bob Dylan and Martin Luther King, among others – folks who, the commercial tells us, are “rebels and misfits and have no respect for the status quo.” Apple was promoting its groundbreaking computer products using the images of some people who really didn’t believe in a capitalist system. Nonetheless, the campaign became iconic and arguably helped the company “change the world” – just not in the direction some of those “crazy ones” would have liked.

Graffiti Art. A 2016 documentary film, Wall Writers: Graffiti In Its Innocence, depicts the early days of graffiti art (the 1960s and 1970s) as a sometimes-illegal wall-writing phenomenon. It explains that the original “wall writers” were anonymous people seeking recognition, mainly among their own kind. There was no thought that this activity was giving rise to an art form, and certainly not that it could be a vehicle to riches. But the graffiti phenomenon exploded across the United States and soon spread to England. By 1973 it was sufficiently in the public mind to be used as a successful movie title, “American Graffiti.” Soon after that (by the 1980s), some of the best graffiti had acquired recognized artistic value and was integrated into the art market. Today some of it sells for millions of dollars.

The Weather Underground is an online site that “provides local & long range weather forecasts, weather reports, maps & tropical weather conditions for locations worldwide.” But where does that name come from? It is taken from a radical anti-capitalist youth group known as the “Weathermen.” This group broke off from the Students for a Democratic Society (SDS), itself a radical group, in 1970. After doing so, the radical Weathermen declared war on the U.S. government.

It seems that both the meteorological Weathermen and the radical political Weathermen began at the University of Michigan. That is where the meteorologists got their start in 1995, working with the university’s internet weather database. It is also where, in 1970, the SDS split apart and the radical Weather Underground was founded. By the way, the split came about democratically, through a vote of assembled members. I know. I was there (on the non-Weathermen side). Maybe some of the future meteorologists were there as well, and that is how they chose the name.

The Weather Underground forecasting organization is now owned by The Weather Company, which in turn is owned by IBM.

Black Panther – The Movie and the Party

Market capitalism will also seek to profit from aspects of culture – even those parts that are marginalized, for instance, African American culture and the concept of Black pride.

In 1971 we got the Shaft movies, featuring a cool Black private eye who plays tough on the streets of New York City. This gave rise to myriad other “blaxploitation” films. The Shaft productions mimicked White equivalents using Black actors and Black backdrops. They demonstrated that African Americans are part of a dominant culture which cannot be uniquely Black. In truth, it is a White culture that has been modified over time by its minority components – Black, Asian, Latin, Native American, etc. – into a hybrid that is uniquely American.

I don’t want to be misunderstood here. African Americans can take great pride in their movies and other arts. Black actors, screenwriters, directors and producers are as competent as their White counterparts. However, they and their Black audience are still captive to that preexisting hybrid cultural canvas on which they, and other minorities, are led to play out their creativity.

This brings us to the latest, and perhaps most spectacular, example of this dilemma: the Disney company’s movie Black Panther. I have two comments on this worldwide commercial success (the movie has grossed over a billion dollars).

— As with some of the examples given in the first part of this analysis, the movie exploits for profit an organization that was anti-capitalist. The giveaway is the title itself. The “Black Panther” is closely linked in cultural memory to the Black Panther Party, a radical organization created in the 1960s to provide for the needs of poor Black neighborhoods (the group originated the idea of the school breakfast program) and to protect residents both from criminals and from the police, who were viewed as racist occupiers. The Black Panther Party became the target of violent attacks by agents of the U.S. government and was eventually destroyed.

— Having been rendered safe through its destruction, the Black Panther Party’s image could be reworked and reintroduced into the prevailing culture. The Disney movie does just this. This is not to say that the film is without merit. Its depiction of strong Black women, Black scientists and technicians, and the able and successful Black civilization of Wakanda is inspiring. On the other hand, the savagery that is part of Wakanda’s succession process is problematic.

Overall, the movie is formulaic. It is a familiar good-vs.-bad scenario. There is a not entirely unsympathetic arch-villain, minor bad guys who come around to be good guys, and competitive tension in the good-guy camp. We have gangsters, government agents and almost non-stop violence. Nothing particularly original here. Nor is there anything original, and certainly nothing radical, about the film’s answer to the problems of poor African Americans – an outreach center in a needy urban neighborhood. By the way, this “cinematically portrayed help effort” works primarily because of the fantasy that there is a Black superpower backing it up. When real Black Panthers tried the same sort of outreach in the 1960s, they were arrested and sometimes murdered.

It is worrisome that the enthusiasm for the movie is based on a fantasy that essentially makes the tragedy of the real-life Black Panthers disappear. As the culture war now being fought in the U.S. between often racist ultra-conservatives and besieged progressives shows, we need change in the real world. Fantasy can give you a momentary lift and sense of pride, but in the end the real world’s problems are still there.

In 1983, the Irish novelist Iris Murdoch remarked that “we live in a fantasy world, a world of illusion. The great task in life is to find reality.” Che Guevara, many graffiti artists, some of the “crazy ones” depicted in Apple’s “Think Different” campaign, the original radicals of the Weather Underground, and the members of the Black Panther Party all knew what reality was. They wanted to change it without recourse to fantasy. And each time, their attempts were stymied by a system that judged human needs solely in terms of monetary profit. This is brilliantly demonstrated by the fact that the images of these enemies of the system have been re-presented to us within the context of profitable fantasy. The process has been remarkably successful and remarkably lucrative. It is also depressing and, in terms of social progress, represents a road to nowhere.

Lawrence Davidson is a history professor at West Chester University in Pennsylvania. He is the author of Foreign Policy Inc.: Privatizing America’s National Interest; America’s Palestine: Popular and Official Perceptions from Balfour to Israeli Statehood; and Islamic Fundamentalism. He blogs at www.tothepointanalyses.com.




Behind Colin Powell’s Legend – My Lai

From the Archive: With media focus on the 50th anniversary of the Vietnam War’s My Lai massacre, Colin Powell’s role as a military adviser has continued to elude scrutiny, so we’re republishing a 1996 article by Robert Parry and Norman Solomon.

By Robert Parry and Norman Solomon (first published in 1996)

On March 16, 1968, a bloodied unit of the Americal division stormed into a hamlet known as My Lai 4. With military helicopters circling overhead, revenge-seeking American soldiers rousted Vietnamese civilians — mostly old men, women and children — from their thatched huts and herded them into the village’s irrigation ditches.

As the round-up continued, some Americans raped the girls. Then, under orders from junior officers on the ground, soldiers began emptying their M-16s into the terrified peasants. Some parents desperately used their bodies to try to shield their children from the bullets. Soldiers stepped among the corpses to finish off the wounded.

The slaughter raged for four hours. A total of 347 Vietnamese, including babies, died in the carnage that would stain the reputation of the U.S. Army. But there also were American heroes that day in My Lai. Some soldiers refused to obey the direct orders to kill.

A pilot named Hugh Clowers Thompson Jr. from Stone Mountain, Ga., was furious at the killings he saw happening on the ground. He landed his helicopter between one group of fleeing civilians and American soldiers in pursuit. Thompson ordered his helicopter door gunner to shoot the Americans if they tried to harm the Vietnamese. After a tense confrontation, the soldiers backed off. Later, two of Thompson’s men climbed into one ditch filled with corpses and pulled out a three-year-old boy whom they flew to safety.

A Pattern of Brutality

While a horrific example of a Vietnam war crime, the My Lai massacre was not unique. It fit a long pattern of indiscriminate violence against civilians that had marred U.S. participation in the Vietnam War from its earliest days when Americans acted primarily as advisers.

In 1963, Capt. Colin Powell was one of those advisers, serving a first tour with a South Vietnamese army unit. Powell’s detachment sought to discourage support for the Viet Cong by torching villages throughout the A Shau Valley. While other U.S. advisers protested this countrywide strategy as brutal and counter-productive, Powell defended the “drain-the-sea” approach then — and continued that defense in his 1995 memoirs, My American Journey.

After his first one-year tour and a series of successful training assignments in the United States, Maj. Powell returned for his second Vietnam tour on July 27, 1968. This time, he was no longer a junior officer slogging through the jungle, but an up-and-coming staff officer assigned to the Americal division.

By late 1968, Powell had jumped over more senior officers into the important post of G-3, chief of operations for division commander, Maj. Gen. Charles Gettys, at Chu Lai. Powell had been “picked by Gen. Gettys over several lieutenant colonels for the G-3 job itself, making me the only major filling that role in Vietnam,” Powell wrote in his memoirs.

But a test soon confronted Maj. Powell. A letter had been written by a young specialist fourth class named Tom Glen, who had served in an Americal mortar platoon and was nearing the end of his Army tour. In a letter to Gen. Creighton Abrams, the commander of all U.S. forces in Vietnam, Glen accused the Americal division of routine brutality against civilians. Glen’s letter was forwarded to the Americal headquarters at Chu Lai where it landed on Maj. Powell’s desk.

“The average GI’s attitude toward and treatment of the Vietnamese people all too often is a complete denial of all our country is attempting to accomplish in the realm of human relations,” Glen wrote. ”Far beyond merely dismissing the Vietnamese as ‘slopes’ or ‘gooks,’ in both deed and thought, too many American soldiers seem to discount their very humanity; and with this attitude inflict upon the Vietnamese citizenry humiliations, both psychological and physical, that can have only a debilitating effect upon efforts to unify the people in loyalty to the Saigon government, particularly when such acts are carried out at unit levels and thereby acquire the aspect of sanctioned policy.”

Glen’s letter contended that many Vietnamese were fleeing from Americans who “for mere pleasure, fire indiscriminately into Vietnamese homes and without provocation or justification shoot at the people themselves.” Gratuitous cruelty was also being inflicted on Viet Cong suspects, Glen reported.

“Fired with an emotionalism that belies unconscionable hatred, and armed with a vocabulary consisting of ‘You VC,’ soldiers commonly ‘interrogate’ by means of torture that has been presented as the particular habit of the enemy. Severe beatings and torture at knife point are usual means of questioning captives or of convincing a suspect that he is, indeed, a Viet Cong…

“It would indeed be terrible to find it necessary to believe that an American soldier that harbors such racial intolerance and disregard for justice and human feeling is a prototype of all American national character; yet the frequency of such soldiers lends credulity to such beliefs. … What has been outlined here I have seen not only in my own unit, but also in others we have worked with, and I fear it is universal. If this is indeed the case, it is a problem which cannot be overlooked, but can through a more firm implementation of the codes of MACV (Military Assistance Command Vietnam) and the Geneva Conventions, perhaps be eradicated.”

Glen’s letter echoed some of the complaints voiced by early advisers, such as Col. John Paul Vann, who protested the self-defeating strategy of treating Vietnamese civilians as the enemy. In 1995, when we questioned Glen about his letter, he said he had heard second-hand about the My Lai massacre, though he did not mention it specifically. The massacre was just one part of the abusive pattern that had become routine in the division, he said.

Maj. Powell’s Response

The letter’s troubling allegations were not well received at Americal headquarters. Maj. Powell undertook the assignment to review Glen’s letter, but did so without questioning Glen or assigning anyone else to talk with him. Powell simply accepted a claim from Glen’s superior officer that Glen was not close enough to the front lines to know what he was writing about, an assertion Glen denies.

After that cursory investigation, Powell drafted a response on Dec. 13, 1968. He admitted to no pattern of wrongdoing. Powell claimed that U.S. soldiers in Vietnam were taught to treat Vietnamese courteously and respectfully. The Americal troops also had gone through an hour-long course on how to treat prisoners of war under the Geneva Conventions, Powell noted.

“There may be isolated cases of mistreatment of civilians and POWs,” Powell wrote in 1968. But “this by no means reflects the general attitude throughout the Division.” Indeed, Powell’s memo faulted Glen for not complaining earlier and for failing to be more specific in his letter.

Powell reported back exactly what his superiors wanted to hear. “In direct refutation of this [Glen’s] portrayal,” Powell concluded, “is the fact that relations between Americal soldiers and the Vietnamese people are excellent.”

Powell’s findings, of course, were false. But it would take another Americal hero, an infantryman named Ron Ridenhour, to piece together the truth about the atrocity at My Lai. After returning to the United States, Ridenhour interviewed Americal comrades who had participated in the massacre.

On his own, Ridenhour compiled this shocking information into a report and forwarded it to the Army inspector general. The IG’s office conducted an aggressive official investigation and the Army finally faced the horrible truth. Courts martial were held against officers and enlisted men implicated in the murder of the My Lai civilians.

But Powell’s peripheral role in the My Lai cover-up did not slow his climb up the Army’s ladder. Powell pleaded ignorance about the actual My Lai massacre, which pre-dated his arrival at the Americal. Glen’s letter disappeared into the National Archives — to be unearthed only years later by British journalists Michael Bilton and Kevin Sims for their book Four Hours in My Lai. In his best-selling memoirs, Powell did not mention his brush-off of Tom Glen’s complaint.

MAM Hunts

Powell did include, however, a troubling recollection that belied his 1968 official denial of Glen’s allegation that American soldiers “without provocation or justification shoot at the people themselves.” After mentioning the My Lai massacre in My American Journey, Powell penned a partial justification of the Americal’s brutality. In a chilling passage, Powell explained the routine practice of murdering unarmed male Vietnamese.

“I recall a phrase we used in the field, MAM, for military-age male,” Powell wrote. “If a helo spotted a peasant in black pajamas who looked remotely suspicious, a possible MAM, the pilot would circle and fire in front of him. If he moved, his movement was judged evidence of hostile intent, and the next burst was not in front, but at him. Brutal? Maybe so. But an able battalion commander with whom I had served at Gelnhausen (West Germany), Lt. Col. Walter Pritchard, was killed by enemy sniper fire while observing MAMs from a helicopter. And Pritchard was only one of many. The kill-or-be-killed nature of combat tends to dull fine perceptions of right and wrong.”

While it’s certainly true that combat is brutal, mowing down unarmed civilians is not combat. It is, in fact, a war crime. Neither can the combat death of a fellow soldier be cited as an excuse to murder civilians. Disturbingly, that was precisely the rationalization that the My Lai killers cited in their own defense.

But returning home from Vietnam a second time in 1969, Powell had proved himself the consummate team player.

For more on Colin Powell’s real record, please check out the “Behind Colin Powell’s Legend” series.




‘Hostiles’ and Hollywood’s Untold Story

Hollywood’s recent attempt to depict Frontier life captures the reality of “hostiles” shooting various weapons at one another, but the real history is more interesting, Jada Thacker explains in this essay.

By Jada Thacker

A theatrical poster for the recent American Western movie “Hostiles” depicts its principal characters – a Frontier widow, a hardboiled Indian fighter, and an Indian chief – with a helpful blurb stating the story’s theme with the subtlety of a striking rattlesnake: “We are all hostiles.”

Some critics think the movie somehow ought to have been a different one – that it should have included a bit more of this, or a bit less of that…whatever. Maybe they have a point. Though it hardly seems fair to ding “Hostiles” for being an imperfect example of the ideal Frontier fantasy.

But it is fair to criticize a movie for being a perfect example of a movie genre that consistently ignores the most essential themes of the American Frontier. “Hostiles” succeeds brilliantly as the latest addition to a very long list of movies that focus laser-like attention on hostile Frontier characters, rather than on the consequences of Frontier hostility.

The American Frontier was not, as Hollywood formerly portrayed it, merely a canvas background prop for a violent soap box drama starring Cowboys & Indians – or, as more recently re-imagined, an ethnic melodrama featuring white Bad Guys versus Noble Indian resistance.

Nor can the American Frontier be considered a particularly hostile place without expunging from history the slaughter-grounds of Cannae, Verdun, Stalingrad, or even America’s own Gettysburg – each of which produced more bloated corpses than any number of Wild Wests. In an encyclopedia of human violence, the massacres at the Little Bighorn and Wounded Knee would be relegated to a footnote.

Yet, the significance of the American Frontier endures. William Faulkner was not referring to the Frontier experience when he said, “The past is never dead. It’s not even past,” but he was right.

Unacknowledged by the silver screen, contemporary America remains as hostile as it ever was to the Frontier dwellers of tee-pees, log cabins, wigwams, or army outposts. Every American today who rages at corrupt and incompetent government, who counts out their pennies for rent or mortgage, or who despairs of the growth-driven, mechanized rape of the American landscape can thank the American Frontier experience for their trouble.

Frontier Anarchy

No government existed in North America at the time of European contact. The societies that pre-existed there lived in a condition of anarchy.

Although the term “anarchy” is used casually to denote a condition of chaos, it literally refers only to a society without government (from the Greek: a [without] + archy [rulers]). Anarchy is the voluntary self-organization of people without the use of authoritative force. Thus, anarchy does not denote an absence of social order, but only the absence of a forcible social order.

Anarchy is not an exception to human organization, but the rule – if we can forgive the pun. All non-governmental organizations are anarchic, voluntary associations: sports teams, business entities, civic groups, church congregations, trade unions, symphony orchestras, and marriages included. American Indian societies had thrived just so without authoritative force for some 20,000 years before Europeans appeared to set things straight.

Immediately upon European arrival, the Frontier materialized as a lethal No Man’s Land where the alien hierarchical order of government clashed catastrophically with indigenous anarchy. At issue was not just the survival of hostile individuals, but the survival of fundamentally hostile political cultures.

Unlike anarchy, government has nothing to do with the voluntary self-organization of society. Nobody ever volunteers to be arrested, pay fines, go to jail, or be executed – or pay the taxes necessary for doing so to others. And no such elements of coercion existed in North America prior to the importation of European authoritarianism. (When so-called “democratic government” later purported to banish British tyranny, it made certain to keep prisons and capital punishment intact.)

Moviegoers, no less than movie-makers and history textbooks, blithely assume that Indian leaders wielded the same authority as did government officials in white society. Not so. Indians had no officials because they had no offices. Indian chiefs led by example and inspiration only; they possessed no more coercive ability than a scoutmaster or a captain of a football team.

In any event, Indians had no written laws that begged enforcement. Anarchic political culture does not depend on the enforcement of rules and regulations, but upon free consent to them. A Wikipedia article summarizes the Abenaki people’s consensual customs:

“Group decision-making was done by a consensus method. The idea is that every group (family, band, tribe, etc.) must have equal say, so each group would elect a spokesperson. Each smaller group would send the decision of the group to an impartial facilitator.

“If there was a disagreement, the facilitator would tell the groups to discuss again. In addition to the debates, there was a goal of total understanding for all members. If there was not total understanding, the debate would stop until there was understanding.

“When the tribal members debate issues, they consider the Three Truths: Peace: Is this preserved? Righteousness: Is it moral? Power: Does it preserve the integrity of the group?

“These truths guide all group deliberations, and the goal is to reach a consensus. If there is no consensus for change, they agree to keep the status quo.”

Not all Indian self-organization was this formal, but it all was intensely democratic. The hierarchical European political culture which ruled by indelible law, dictated by police and military forces and financed by forcible taxation, decidedly was not.

The collision of anarchy and government in America was not a melodramatic struggle between “good” and “evil.” But it did involve a spiritual choice – between a circle and a pyramid.

The Indian way was represented by a circle or hoop, symbolized physically by the Puebloan people’s kiva, a circular, ceremonial meeting place. The Lakota and other tribes conceived of universal order as a hoop. The symbolic meaning is one of balance and equality, with each member of society located equidistant from a common core. Indian leaders did not occupy the position of “top dog” or “king of the hill” but as central mediators among equals.

In contrast, all civilizations – including the white civilization that hovered in the wings of the Frontier stage – are pyramidal structures. In pyramidal culture, authority resides at the apex and flows only downward, forcibly if necessary. While pyramidal culture was not unique to the colonizing European culture of the day – Ancient Egyptians and Aztecs expressed their pyramidal culture in stone, just as current organization charts express our pyramids on paper – it was utterly foreign to the Indian consciousness.

So-called “Indian Nations” were conceptual fallacies that did not in fact exist. Even the famous Iroquois League, or Haudenosaunee, was not an example of “Indian government” and certainly not of pyramidal structure. It was a decentralized, voluntary confederacy – a hooplike “League of Peace” (ca. 1140–1784) of its six constituent tribes – not a hierarchical command-and-control structure that dominated Indian society.

Frontier Economics

Lest the Right-Libertarians among us applaud too loudly the absence of Big Government (or any government) in Indian society, the central conflict between white and red men (a term Indians used to describe themselves) was a contest between individualistic and collective property rights.

To be clear, Indians had a keen sense of territorial sovereignty. But this did not include personal property ownership, which was both unknown and anathema to the Indian way. T. R. Fehrenbach, a notable commentator on Frontier culture and author of the encyclopedic Comanches: The History of a People, put it simply:

“Hypocrisy was perhaps inevitable in a people [whites] who convinced themselves that they were creating something new in the New World, while actually carrying out the most primordial form of conquest.”

But then he adds:

“Amerindians resisted all sincere imitation of their conquerors. Broken warriors refused to become economic men, to accept the concept of private property or the discipline of incessant labor.”

Quite frankly, the Comanche people (the Nermernuh) of whom Fehrenbach spoke were without doubt the most rapacious Indians that whites ever encountered. (Other Indians were intimidated by them, too, and for good reason, a point “Hostiles” duly observes.) Alongside hunting buffalo, raiding and stealing constituted the raison d’être of their predatory society.

In fact, hostility and theft generally characterized Indian between-group behavior both before and after European arrival; Indians did not need the presence of whites to justify their elevation of lethal larceny to an art form. By the same token, European pioneers needed no particular excuse to exterminate Indians, or each other, while committing Grand Theft Continent.

Ironically, armed robbery was the primary economic activity whites and Indians shared in common. “Making a killing” by “hostile takeovers” of others’ property is not a new pony trick invented by corporate raiders.

But the ruthless exploitation of one’s own kinsmen and their resources is something else. This was as unthinkable to tribal peoples as it was premeditated by the bringers of civilization. The privatization of shared resources proved to be the profound and irreconcilable issue that separated the two peoples’ concepts of economic justice.

Even in abject defeat, Indians never shared the whites’ notion that the land’s resources could, or should, be monopolized as private property. Since Indians perceived themselves essentially as children of the Earth, private ownership of land made no more sense to them than a child claiming to own its parents.

Unlike whites, the Indian concept of territory was communal. What they possessed in common they defended in common. Their view of communal property rights flowed naturally from their egalitarian culture, which did not tolerate landlords or economic class distinctions.

Within any Indian band, no privileged economic class could exist simply because there was no hierarchical power structure to sustain one. Since no Indian had the power to control the food supply of another, they were liberated at birth from the private monopolization of the “means of production.” Possession of property was not justified by individual privilege but was their common birthright.

Thus, Indian society was devoid of both private property and the State. This is inconvenient news for today’s Marxists and Right-Libertarians, alike.

Indian society repudiated the Right-Libertarian (anarcho-capitalist) notion that individual liberty requires the sanctity of private property ownership. No humans have exercised more individual liberty, nor owned less private property, than American Indians. Ownership of private property – which cannot and does not exist in the absence of government-sanctioned privilege – would not have conferred any liberty to Indians they did not already possess.

At the other end of the economic spectrum, Indian society also belied the Marxian notion that economics is determined to evolve from capitalism, through socialism, to the ideal of communism. In reality, American Indians had beaten Marx to the punchline 20,000 years before he set pen to paper.

In modern parlance, Indians were communists long before communism was cool. Contemporary Indians may disavow Marx as an industrial materialist with no respect for their spiritual way; that doesn’t mean their people were not original communists, but only that they are not Marxists.

Marx was the latecomer – and then he got it all backwards. The American Frontier experience graphically demonstrated that humanity was not advancing toward a stateless, economic Utopia but was rooting out and laying waste to prehistoric communism wherever it still persisted.

All “isms” aside, reality reveals that whoever exercises effective ownership of a place rules it for their benefit. First and foremost, the Frontier was a place of a hostile and involuntary transfer of economic property from communal Indian ownership into the itchy palms of the private white owners who usually stood at the apex of an authoritarian pyramid.

Frontier Ecology

Pre-contact Indians lived in Stone Age societies. They possessed no metal implements, and the highest level of tool technology available to them employed only stone, bone, and clay.

In Stone Age Economics, Marshall Sahlins famously referred to Stone Age people as the “original affluent society” – not because they possessed much material wealth, but rather because they required so little and because their modest needs were so readily fulfilled when compared to the far greater requirements of us Moderns.

On the other hand, we would be mistaken to believe Indians were conscious “environmentalists.” Like any society, theirs took from nature what was needed for survival. Stone Age people had no reason to conserve that which was beyond their power to despoil.

As Sahlins’s “original affluence” implies, the trick to achieving environmental sustainability does not lie in not taking what is needed, but in not needing to take more than the environment can afford. “What the environment can afford” is known in ecology-speak as carrying capacity.

More formally stated, carrying capacity is the ability of the environment to sustain a given population of organisms indefinitely. “Sustain” usually means “to feed” and “indefinitely” simply means “with no end in sight.” Thus, a given number of organisms that continues to live (and reproduce) within the means of its food-energy supply is “ecologically sustainable.”

In any event, “living sustainably” should not be conceptualized as “living in harmony with nature.” Nature is not a Barbershop Quartet. Nature is nothing if not a relentless, biological gang fight encompassing every organism on the planet. Each organism will lose the fight eventually, only to decompose into the itinerant molecules from which it was temporarily pasted together.

In fact, the natural danse macabre preserves ecological balance at the expense of harmony. Any cosmic harmony on the American Frontier existed only under the influence of mezcal and peyote.

Moreover, just because an organism manages to survive individually does not imply that it lives in a sustainable society. Sustainability requires that a given number of organisms must be able to survive indefinitely. No environmental carrying capacity can sustain too many needy organisms, or even a few organisms that consume more food-energy than the environment can replace.

By any measure, however, American Indians had been living sustainably for millennia before Europeans waded ashore with their metallurgy, animal husbandry, intensive agriculture, literacy – and their marked tendency toward epidemic plagues, famine, industrialized warfare, and commercial-grade slavery. Upon arrival, the benighted invaders found practically nothing to remind them of their ecologically stressed homelands, which they had abandoned.

Nowhere in America did the colonizers find the privation, starvation, social depravity, and ecological wastage that characterized their soil-ravaged and forest-denuded homeland. Having accidentally stumbled upon a Stone Age population that lived sustainably, civilized Europeans set about at once to destroy it, as they had done at home. Indeed, had Europeans possessed a sustainable culture, they would not have needed to ditch their depleted continent in search of lootable resources elsewhere.

The supreme irony of the Old-World invasion was that Europeans never realized the “savages” inhabiting the Americas were practically identical to their own ancestors, though a couple of hundred generations removed. Ecologically, the European invasion did not represent the wave of the future, but a retrogression to their own Edenic past.

The environmental devastation that had taken several thousand years to accomplish in Europe was replicated in three centuries in the Americas. Such was the price and the speed of the “progress” achieved on the American Frontier.

Frontier Armageddon

The Frontier did not disappear just because the westward movement had run out of geographical space, its few Indian survivors having been herded into open-air prisons. Rather, the Frontier itself was destroyed by the westward migration of the Industrial Revolution – a truly monstrous creation of unrelenting factory toil, rolling on steel rails, powered by steam, and financed by perpetual human servitude to debt.  

The terminal theme of the Frontier was not to be man’s conquest of nature, or even of man’s conquest of other men, but instead the industrial conquest of humanity. Metastasizing far beyond the “primordial form of conquest” of Indians by hypocritical whites, this final act of destruction was so complete that not even whites survived it.

A Stone Age world bound by blood kinship, loyalty, courage, intuition and revenge was within a single lifetime displaced by the depersonalized tyranny of contract law, freight schedules, time zones, taxes, universal debt and ‘no trespassing’ signs. Proud Indian warriors, brave Texas Rangers, indomitable pioneer sod-busters – all alike swept away only to be reincarnated by industrialized karma as sweatshop wage-slaves, coal mining troglodytes, and corporate lackeys.

After this cataclysm, we can rely on Hollywood to remind us now and again that the Frontier was where some hostile hombres ran amok shooting various weapons at one another – as if that is not the daily fare of modern-day America. The theatrical poster blurb “We are all hostiles” could be a permanent contemporary subtitle to American civilization.

But the American Frontier was not a blurb or a subtitle. It was a war that raged westward for 300 years before its place was lost to history. Yet, the ultimate loss of the Frontier was not by those fortunate few who once lived within the warzone; the greater loss was to those unfortunate multitudes who were fated to live thereafter without it. And that would be us.

Possibly lost to us forever has been our egalitarian self-determination, our common possession of the means of survival, our ecological sustainability, and our sense of the primacy of personal human worth. These hallmarks of human society have been eradicated so thoroughly that even celluloid fables of our own history betray hardly a trace of their multi-millennial existence. Unwilling to recall such a way of life, we retell only tales of hostility that surrounded its death.

But lest old acquaintance be forgot and never brought to mind, Americans everywhere now commemorate the first day of each calendar month with a nagging sense of loss – as befits the date on which the rent is due in this erstwhile Land of the Free.

Jada Thacker, Ed.D is the author of Essential Themes of American History. He teaches collegiate Political Science and History courses in Texas. jadathacker@sbcglobal.net




In Case You Missed…

Some of our special stories in January highlighted misrepresented historic events, analyzed shortcomings of the Democratic Party, and remembered Robert Parry’s legacy.

“Giving War Too Many Chances” by Nicolas J.S. Davies, Jan. 3, 2018

“Missing the Trump Team’s Misconduct” by J.P. Sottile, Jan. 9, 2018

“Pesticide Use Threatens Health in California” by Dennis J. Bernstein, Jan. 10, 2018

“Trump Lashes Pakistan over Afghan War” by Dennis J. Bernstein, Jan. 11, 2018

“The FBI Hand Behind Russia-gate” by Ray McGovern, Jan. 11, 2018

“Haiti and America’s Historic Debt” by Robert Parry, Jan. 12, 2018

“Why Senator Cardin Is a Fitting Opponent for Chelsea Manning” by Norman Solomon, Jan. 16, 2018

“Trump Ends Protections for El Salvador” by Dennis J. Bernstein, Jan. 18, 2018

“An Update to Our Readers on Editor Robert Parry” by Nat Parry, Jan. 19, 2018

“Regime Change and Globalization Fuel Europe’s Refugee and Migrant Crisis” by Andrew Spannaus, Jan. 20, 2018

“‘The Post’ and the Pentagon Papers” by James DiEugenio, Jan. 22, 2018

“Foxes in Charge of Intelligence Hen House” by Ray McGovern, Jan. 22, 2018

“A National Defense Strategy of Sowing Global Chaos” by Nicolas J.S. Davies, Jan. 23, 2018

“George W. Bush: Dupe or Deceiver?” by Robert Parry, Jan. 23, 2018

“Tom Perez, the Democratic Party’s Grim Metaphor” by Norman Solomon, Jan. 25, 2018

“The Struggle Against Honduras’ Stolen Election” by Dennis J. Bernstein, Jan. 26, 2018

“Unpacking the Shadowy Outfit Behind 2017’s Biggest Fake News Story” by George Eliason, Jan. 28, 2018

“Robert Parry’s Legacy and the Future of Consortiumnews” by Nat Parry, Jan. 28, 2018

“Assault on the Embassy: The Tet Offensive Fifty Years Later” by Don North, Jan. 30, 2018

“Will Congress Face Down the Deep State?” by Ray McGovern, Jan. 30, 2018

“Treasury’s ‘Kremlin Report’ Seen as Targeting Russian Economy” by Gilbert Doctorow, Jan. 31, 2018

“Mass Surveillance and the Memory Hole” by Ted Snider, Jan. 31, 2018

“How Trump and the GOP Exploit Israel” by Jonathan Marshall, Jan. 31, 2018

To produce and publish these stories – and many more – costs money. And except for some book sales, we depend on the generous support of our readers.

So, please consider a tax-deductible donation either by credit card online or by mailing a check. (For readers wanting to use PayPal, you can address contributions to our PayPal Giving Fund account, which is named “The Consortium for Independent Journalism”).




Billy Graham: An Old Soldier Fades Away

Evangelist Billy Graham, who counseled presidents and stirred controversy with inflammatory statements on gay rights, opposition to Martin Luther King’s tactics of civil disobedience, and support for U.S. wars, died Wednesday. Cecil Bothwell reflects here on his life and legacy.

By Cecil Bothwell

“We are selling the greatest product on earth. Why shouldn’t we promote it as effectively as we promote a bar of soap?” – Billy Graham, Saturday Evening Post, 1963 

Billy Graham was a preacher man equally intent on saving souls and soliciting financial support for his ministry. His success at the former is not subject to proof and his success at the latter is unrivaled. He preached to millions on every ice-free continent and led many to his chosen messiah.

When Graham succumbed to various ailments this week at the age of 99 he left behind an organization that is said to have touched more people than any other Christian ministry in history, with property, assets and a name-brand worth hundreds of millions. The address lists of contributors alone comprise a mother lode for the Billy Graham Evangelical Association, now headed by his son and namesake, William Franklin Graham, III.

Graham also left behind a United States government in which religion plays a far greater role than before he intruded into politics in the 1950s. The shift from secular governance to “In God We Trust” can be laid squarely at this minister’s feet.

Graham’s message was principally one of fear: fear of a wrathful god; fear of temptation; fear of communists and socialists; fear of unions; fear of Catholics; fear of homosexuals; fear of racial integration and above all, fear of death. But as a balm for such fears, he promised listeners eternal life, which he said was readily claimed through acceptance of Jesus Christ as one’s savior.

Furthermore, he assured listeners that God loved us so much that He created governments, the most blessed form being Western capitalist democracy. To make this point, he frequently quoted Romans 13, particularly the first two verses. In the New American Standard Version of the Bible, they read, “Let every person be in subjection to the governing authorities. For there is no authority except from God, and those which exist are established by God. Therefore he who resists authority has opposed the ordinance of God; and they who have opposed will receive condemnation upon themselves.”

The question of whether this was actually the recorded word of God or a rider inserted into the bill by Roman senators with rather more worldly aims never dimmed Graham’s insistence that all governments are the work of the Almighty. Almost perversely, he even endorsed the arrest of a woman who lofted a Christian banner during his Reagan-era visit to Moscow, opting for the crack-down of “divine” authority over the civil disobedience of a believer.

Governments, he reminded his Moscow listeners, do God’s work.

Based on that Biblical mandate for all governments, Graham stood in solid opposition to the work of Dr. Martin Luther King, Jr. In his Letter from Birmingham Jail, all but addressed to Graham, King noted, “We should never forget that everything Adolf Hitler did in Germany was ‘legal’ and everything the Hungarian freedom fighters did in Hungary was ‘illegal.’ … If today I lived in a Communist country where certain principles dear to the Christian faith are suppressed, I would openly advocate disobeying that country’s anti-religious laws.”

Finger on the Pulse of American Fear

Fear is the stock in trade of most evangelists, of course, comprising the necessary setup before the pitch. As historian William Martin explained in his 1991 account of Graham’s early sermons, “even those whose personal lives seemed rich and fulfilling must live in a world filled with terror and threat. As a direct result of sinful humanity’s rebellion against God, our streets have become jungles of terror, mugging, rape, and death. Confusion reigns on campuses as never before. Political leaders live in constant fear of the assassin’s bullet. Racial tension seems certain to unleash titanic forces of hatred and violence. Communism threatens to eradicate freedom from the face of the earth. Small nations are getting the bomb, so that global war seems inevitable. High-speed objects, apparently guided by an unknown intelligence, are coming into our atmosphere for reasons no one understands. Clearly, all signs point to the end of the present world order. …

“Graham’s basic mode of preaching in these early years was assault. … Then, when he had his listeners mentally crouching in terror, aware that all the attractively labeled escape routes—alcohol, sexual indulgence, riches, psychiatry, education, social-welfare programs, increased military might, the United Nations—led ultimately to dead ends, he held out the only compass that pointed reliably to the straight and narrow path that leads to personal happiness and lasting peace.”

Columnist and former priest James Carroll had much the same take, noting that “Graham had his finger on the pulse of American fear, and in subsequent years, anti-communism occupied the nation’s soul as an avowedly religious obsession. The Red Scare at home, unabashed moves toward empire abroad, the phrase ‘under God’ inserted into the Pledge of Allegiance, the scapegoating of homosexuals as ‘security risks,’ an insane accumulation of nuclear weapons, suicidal wars against postcolonial insurgencies in Asia—a set of desperate choices indeed. Through it all, Billy Graham was the high priest of the American crusade, which is why U.S. presidents uniformly sought his blessing.”

While Carroll had most of that right, the record suggests that, over and over again, it was Graham who sought presidential blessing, rather than the other way around. Letters enshrined in the presidential and Graham libraries reveal a preacher endlessly seeking official audience. As Truman said, years after his presidency, “Well, I hadn’t ought to say this, but he’s one of those counterfeits I was telling you about. He claims he’s a friend of all the presidents, but he was never a friend of mine when I was president.”

Of course, politicians have often brandished fear as well, and the twin streams of fear-based politics and fear-based religion couldn’t have been more confluent. Communist infiltrators, missile gaps and the domino effect each took their turn, as did the Evil Empire and, more recently, Saddam, Osama bin Laden and an amorphous threat of global terrorism.

In light of the Biblical endorsement of rulers, Graham supported police repression of Vietnam war protesters and civil rights marchers, opposed Martin Luther King’s tactic of civil disobedience, supported South American despots, and publicly supported every war or intervention waged by the United States from Korea forward.

A Pro-War Christian

Born on a prosperous dairy farm and educated at Wheaton College, Graham first gained national attention in 1949 when the publishing magnate William Randolph Hearst, searching for a spiritual icon to spread his anti-communist sentiments, discovered the young preacher holding forth at a Los Angeles tent meeting. Hearst wired his editors across the nation, “puff Graham,” and he was an instant sensation.

Hearst next contacted his friend and fellow publisher Henry Luce. Their Wall Street ally, Bernard Baruch, arranged a meeting between Luce and Graham while the preacher was staying with the segregationist Governor Strom Thurmond in the official mansion in Columbia, South Carolina. Luce concurred with Hearst about Graham’s marketability, and Time and Life were enlisted in the job of selling the soap of salvation to the world. Time alone has run more than 600 stories about Graham.

The man who would become known as “the minister to presidents” offered his first military advice in 1950. On June 25, North Korean troops invaded South Korea and Graham sent Truman a telegram. “MILLIONS OF CHRISTIANS PRAYING GOD GIVE YOU WISDOM IN THIS CRISIS. STRONGLY URGE SHOWDOWN WITH COMMUNISM NOW. MORE CHRISTIANS IN SOUTHERN KOREA PER CAPITA THAN ANY PART OF WORLD. WE CANNOT LET THEM DOWN.”

It was the first time Graham encouraged a president to go to war, and with characteristic hyperbole: Korea has never topped the list of Christian-leaning nations. Subsequently, Graham gave his blessing to every conflict under every president from Truman to the second Bush, and most of the presidents, pleased to enjoy public assurance of God’s approval, made him welcome in the White House.

Graham excoriated Truman for firing General Douglas MacArthur and supported the general’s plan to invade China. He went so far as to urge Nixon to bomb dikes in Vietnam – knowing that it would kill upward of a million civilians – and he claimed to have sat on the sofa next to G.H.W. Bush as the bombs began falling in the first Gulf War (though Bush’s diary version of the evening somehow excludes Graham, as does a White House video of Bush during the attack).

According to Bush’s account, in a phone call the preceding week, Graham quoted poetry that compared the President to a messiah destined to save the world, and in the next breath called Saddam the Antichrist. Bush wrote that Graham suggested it was his historical mission to destroy Saddam.

Through the years, Graham’s politics earned him some strange bedfellows. He praised Senator Joseph McCarthy and supported his assault on Constitutional rights, then scolded the Senate for censuring McCarthy for his excesses. He befriended oil men and arms manufacturers. He defended Nixon after Watergate, right up to the disgraced president’s resignation, and faced public scorn when tapes were aired that exposed the foul-mouthed President as a schemer and plotter.

Nixon’s chief of staff, Bob Haldeman, reported on Graham’s denigration of Jews in his posthumously published diary—a claim Graham vehemently denied until released tapes undid him in 2002. Caught with his prejudicial pants down, Graham claimed ignorance of the hour-and-a-half-long conversation in which he led the anti-Semitic attack.

As reported by the Associated Press on March 2, 2002:

“Although I have no memory of the occasion, I deeply regret comments I apparently made in an Oval Office conversation with President Nixon . . . some 30 years ago,” Graham said in a statement released by his Texas public relations firm. “They do not reflect my views, and I sincerely apologize for any offense caused by the remarks.”

Whether or not the comments reflect Graham’s views at the time or thirty years later, it is his defense that bears much closer scrutiny. What were we to make of a preacher who insisted that his words didn’t reflect his beliefs? Were we to believe him then or later, on other matters?

Graham was a political operative, reporting to Kennedy on purported communist insurgencies in Latin America, turning over lists of activist Christians to the Republican party, conferring regularly with J. Edgar Hoover and networking with the CIA in South America and Vietnam. He was even assigned by Nixon’s operatives to talk George Wallace out of a second run for the White House.

To accomplish the latter, he phoned Wallace as he was coming out of an anesthetic stupor after one of his numerous post-assassination-attempt surgeries. While the long-suffering gunshot victim asked the minister to pray for him, the minister asked him not to make a third-party bid for the presidency. “I won’t do anything to help McGovern,” Wallace replied.

There are many who would argue that the good that Graham did outweighs whatever political intrigue he embraced, and even the several wars he enthusiastically endorsed. To the extent that bringing people to Christ is of benefit to them, an untestable hypothesis, he was successful with his calls to come forward. He accrued hundreds of millions of dollars which were used to extend his ministry and thereby bring more people to “be saved,” which is self-justifying but fails as evidence of goodness.

Billy Graham Freeway

If Christian beliefs about the hereafter prove correct, we will all presumably discover what good he accomplished, or what chance for salvation we missed, in the sweet by and by.

In talking to one of his biographers, Graham recalled his mood during his fire and brimstone declamations, “I would feel as though I had a sword, a rapier, in my hand, and I would be slashing deeper and deeper into the consciences of the people before me, cutting away straight to their very souls.”

In that regard, Graham’s largest and most lasting monument is a highway cut through Beaucatcher Mountain, blasted through a majestic land form that once bisected Asheville, North Carolina. He helped convince recalcitrant landowners to permit the excavation and construction through the cut of the short stretch of Interstate highway subsequently named the Billy Graham Freeway.

Downwind residents report that the weather has permanently shifted due to the gaping mountain maw and the future of the highway that transects the city continues to be one of the most divisive issues in that southern metropolis.

“Straight to their very souls,” indeed.

In every way, Graham was the spiritual father of today’s right-wing religious leaders who now dominate the national conversation. If he cloaked his suasion in public neutrality it was the hallmark of an era in which such intrusion was deemed unseemly. If today’s practitioners are less abashed, it is in many ways reflective of the secure foundation Graham built within Republican and conservative circles.

Graham endorsed and courted Eisenhower and compared a militaristic State of the Union speech to the Sermon on the Mount, fanned anti-Catholic flames in the Nixon-Kennedy contest, backed Johnson and then Nixon in Vietnam, lobbied for arms sales to Saudi Arabia during the Reagan years, conveyed foreign threats and entreaties for Clinton and lent his imprimatur to G.W. Bush as he declared war on terrorism from the pulpit of the National Cathedral.

Billy Graham approved of warriors and war, weapons of mass destruction (in white, Christian hands) and covert operations. He publicly declaimed the righteousness of battle with enemies of American capitalism, abetted genocide in oil-rich Ecuador and surrounds and endorsed castration as punishment for rapists. A terrible swift sword for certain, and effective no doubt, but not much there in the way of turning the other cheek.

Graham will be cordially remembered by those who found solace in his golden promises and happy homilies, but the worldly blowback from his ministry is playing out in Iraq and Afghanistan, Chechnya and Korea, the Philippines and Colombia – everywhere governments threaten human rights and pie in the sky is offered in lieu of daily bread.

In the words of Graham’s ministerial and secular adversary, Dr. King, “I had hoped that the white moderate would understand that law and order exist for the purpose of establishing justice and that when they fail in this purpose they become the dangerously structured dams that block the flow of social progress.”

Farewell Reverend Graham. Let justice roll.

Prize-winning investigative reporter Cecil Bothwell is author of The Prince of War: Billy Graham’s Crusade for a Wholly Christian Empire, (Brave Ulysses Books, 2007) and Whale Falls: An exploration of belief and its consequences (Brave Ulysses Books, 2010).




How the Washington Post Missed the Biggest Watergate Story of All

The Watergate scandal may have been rooted in Richard Nixon’s alleged efforts to sabotage the 1968 Paris peace talks, but this story has never fully been told – partly because the Washington Post remained silent on it, explains Garrick Alder.

By Garrick Alder

Steven Spielberg’s film The Post is still running in theaters, lauding the Washington Post, Katharine Graham and Ben Bradlee as fearless exposers of official secrets about government wrongdoing. But previously overlooked evidence now reveals for the first time how the Washington Post missed the most serious leak in newspaper history, and as a result history itself took a serious wrong turn. Consequently, this is a story that was also missed by Spielberg, and missed by Alan Pakula in his 1976 film about The Washington Post’s role in Watergate, All The President’s Men.

Spielberg’s 2018 film tells the story of the “Pentagon Papers” affair of 1971, in which a huge number of Defense Department documents were leaked by RAND Corporation employee Daniel Ellsberg, whose conscience would not allow him to stay silent about the carnage in Vietnam. The Washington Post took on Richard Nixon and won – a victory for press freedom that has been enshrined in the mythos of the mass media. But in fact, the Washington Post had inadvertently let Nixon off the hook.

The newspaper had been told by an unbeatable source – one might almost say, an “unimpeachable” source – that the president had committed treason against America in time of war and had then conspired to destroy the damning evidence of his own crime. It is no exaggeration to say that if the Washington Post had printed what it had been told, simmering domestic discontent over the Vietnam War would have become an incendiary mix with national disgust over Nixon’s conduct in office.

At the height of the Watergate scandal, in summer 1974, Secretary of State Henry Kissinger tried to tell the world about Nixon’s sabotage of the 1968 Paris peace talks, talks which – had they succeeded – could have spared the nation six more years of futile slaughter. Nixon would have gone down with the blame for Vietnam squarely on his shoulders – ultimately, perhaps, providing America with much-needed catharsis. Kissinger leaked his knowledge of Nixon’s treason to Washington Post reporter Bob Woodward. Woodward fumbled the pass and no story ever appeared.

The first trace of desperation is recorded on the White House tape of June 17, 1971, just four days after the first newspaper story about the Pentagon Papers (in the New York Times). Nixon is heard telling White House chief of staff HR Haldeman: “God damn it, get in and get those files. Blow the safe and get them.” Nixon’s aides were used to occasionally turning a deaf ear to their boss’s more outrageous orders.

Indeed a fortnight later (June 30, 1971) Nixon had to hammer home his demands once more: “I want Brookings … just break in, break in, and take it out. Do you understand? You’re to break into the place, rifle the files, and bring them in.” Twenty-four hours later, Nixon issued the same demand even more emphatically: “Did they get the Brookings Institute raided last night? No? Get it done. I want it done. I want the Brookings Institute safe cleaned out.” What was in the safe at the Brookings Institute?

In a July 24, 1974 memorandum, Woodward set out what he could recall of an interview with Nixon aide John Ehrlichman, in which the Brookings break-in was discussed:

At president’s direction E[hrlichman] said he talked to Brookings and about secrecy there; did it several times; right after Pentagon Papers. Also about Brookings a meeting in San Clemente about 12 July 71 ‘undoubtedly discussed it’ (w/ Dean) the discussions were an effort to get the so-called “bombing halt” papers back.

The “bombing halt papers” were what Nixon told his cronies he wanted to retrieve – evidence that his predecessor Lyndon Johnson had stopped bombing in Vietnam in a last-minute attempt to swing the 1968 election to the Democratic Party. But this was just another Nixon lie to conceal his true motivations, and Ehrlichman essentially admitted as much to Bob Woodward during the same interview, when describing his attempts to access the Brookings Institute’s Vietnam records via official bureaucratic channels: “Buzhardt decided what we not get to see [sic] So it was admittedly a hit and miss process. … in terms of ? what he got to see; not the whole story; but the Brookings matter was not necessarily what he was looking for. Wouldn’t elaborate on that.” (emphasis added)

Filed in the Woodward-Bernstein collection at the University of Texas, among the July 24, 1974 Ehrlichman interview notes, is a second typed memorandum from Woodward, addressed to his colleague Carl Bernstein. Its significance has been overlooked for nearly 45 years. The memo is undated, but from part of its contents its creation can be pinned down to a period of approximately 35 days at the height of the Watergate scandal, immediately prior to July 24, 1974, when the Supreme Court ordered Nixon to hand over the White House tapes.

Woodward’s memo begins: “First and most important, my source said that the President personally ordered the break-in at Brookings.” This was correct, although the tapes of Nixon’s orders were at this stage still in the sole possession of the White House.

Woodward’s source knew what he was talking about. After some discussion about how Charles Colson had reacted to the President’s order to burgle the Brookings Institution, whereas other aides had simply ignored what they regarded as another of Nixon’s impetuous outbursts, Woodward got to the point of his source’s information:

“I quizzed him for a while, and while I don’t remember exactly what he answered in each instance, the impression left was that these papers related to secret U.S. negotiations with Hanoi, Russia and China. The ‘Other stuff,’ my source said, really provided the impetus for the administration’s panic reaction to the Pentagon Papers, not the Pentagon Papers themselves.” (emphases added)

As can be seen, the exact information passed on by Woodward’s source was already a fading memory by the time this memo was typed up. Even so, the import is clear. Woodward’s source knew exactly why Nixon wanted a break-in at the Brookings Institute, and which documents Nixon wanted to seize.

Woodward’s notes state that his source told him “several times that the picture the public had of [Pentagon Papers leaker Daniel] Ellsberg was still distorted … all he would hint at was that Ellsberg’s activities were very questionable.”

He also mentioned to Woodward the supposed existence of “material that the [Nixon] administration had gathered about Ellsberg’s behavior while in Vietnam.” This corresponds closely with claims that had been made in the White House soon after Daniel Ellsberg’s leak of the Pentagon Papers had been published.

In his 2000 tell-all biography, The Arrogance of Power: The Secret World of Richard Nixon, Anthony Summers wrote: “Kissinger, who knew Ellsberg, fed the president’s spleen with a torrent of allegations. Ellsberg may have been ‘the brightest student I ever had,’ he told Nixon, but he was ‘a little unbalanced.’ He supposedly ‘had weird sexual habits, used drugs,’ and, in Vietnam, had ‘enjoyed helicopter flights in which he would take potshots at the Vietnamese below.’ Ellsberg had married a millionaire’s daughter and – Kissinger threw in for good measure – had sex with her in front of their children.”

Other information known to Woodward’s source included the existence of “a document – he gave the number as NSSCM 113 on declassification. We did not get further than that.” It is somewhat surprising that Woodward was able to recall the number of this document so exactly, when his recollection of the nature of the papers Nixon wanted from Brookings was so hazy. The document Woodward’s source was directing him toward was NSSM 113 (just one letter different; NSSM standing for “National Security Study Memorandum”). Dated January 15, 1971, NSSM 113 was titled “Procedures for Declassification and Release of Official Documents” and was written by Henry Kissinger.

Finally, Woodward mentions that “My source also confirmed that Kissinger was for a unit to plug security leaks” (i.e., that Kissinger had supported the formation of Nixon’s “plumbers” team).

Several known factors bear on the reliability of Woodward’s information concerning the Brookings break-in plan. Woodward’s source repeated rumours about Ellsberg that Kissinger was circulating in the White House; like Kissinger, Woodward’s source claimed to have knowledge of Ellsberg’s private life; Woodward’s source knew the document number and nature of a (then undisclosed) memorandum concerning national security that had been written by Kissinger; and the source was able to give solid information about Kissinger’s private attitude toward Nixon’s creation of the plumbers.

Only a very small number of White House figures could have been privy to this precise set of information in mid-1974, and perhaps only one. Woodward’s source was Nixon’s National Security Advisor and Secretary of State, Henry Kissinger. Still alive in 2018, Kissinger has maintained public silence about his knowledge of Nixon’s Vietnam treason for half a century.

It is incomprehensible that neither Woodward nor Bernstein appears to have grasped the significance of what Kissinger was telling them: the allegations against Nixon had swirled ever since he won the Presidency. On January 12, 1969, the Washington Post itself had carried a profile of Nixon’s go-between, Anna Chennault, which stated: “She reportedly encouraged Saigon to ‘delay’ in joining the Paris peace talks in hopes of getting a better deal if the Republicans won the White House.” Chennault was reported as making no comment on the allegations, which were entirely accurate.

Woodward and Bernstein had been handed the skeleton key that would have unlocked the entire Watergate affair. The reporters had been told – by no less a figure than Nixon’s National Security Advisor, Henry Kissinger – about the real motive behind Nixon’s plan to burgle the Brookings Institution. It was to destroy the evidence that Nixon had conspired to prolong a war with an official enemy of the United States in order to win the presidency in 1968; after which he deliberately prolonged – even escalated – the Vietnam War. And – for reasons that might never be known – Woodward and Bernstein stayed silent.

Bob Woodward and Henry Kissinger were contacted for comment on the specific disclosures made in this article. Neither of them replied.

This is an abridgement of an article first published by Lobster Magazine (www.lobster-magazine.co.uk). Republished with permission. All rights reserved by the author.