Stretching Charges of Anti-Semitism

Hard-line Israeli defenders have tried to shut down protests over how the Palestinians have been treated by accusing critics of “anti-Semitism” and by labeling dissenting Jews as “self-hating.” These intimidating tactics are now common on U.S. college campuses, Lawrence Davidson writes.

By Lawrence Davidson

Can criticism of Israel, particularly a) criticism of Israel’s treatment of the Palestinian people and b) criticism of the state ideology of Zionism that justifies that treatment, be labeled anti-Semitic?

This is not a hypothetical query. An affirmative answer to this question is being advocated by influential Zionist lobbies in the United States. The question is of particular importance on the nation’s college and university campuses.

At places like the University of California campuses at Berkeley and Santa Cruz, and at Rutgers University in New Jersey, Zionist students are now threatening to sue these institutions for failing to prevent an “atmosphere of anti-Semitic bigotry” allegedly created by the presence of pro-Palestinian student groups and faculty.
 
One might ask: isn’t it a stretch to assert that protesting Israeli and Zionist behavior is the same as anti-Semitism? Common sense certainly tells us that it is.

Unfortunately, we are not dealing with situations that are ruled by common sense. What we are facing here is the issue of ideologues bred to a specific perceptual paradigm and their insistence that others conform to it.
 
Here is an example: Take an American kid from a self-conscious Jewish home. This kid does not represent all American Jewish youth, but does typify, say, 20 percent of them. He or she is taught about the religion and also about recent history and the near annihilation of the Jews of Europe. He or she is sent to Hebrew school, and maybe a yeshiva as well.

Most of our hypothetical student’s friends will be Jewish and of similar background. Between home, friends and school the student might well find himself or herself in something of a closed universe.

Throughout this educational process, Judaism and its fate in the modern world are connected with Israel and its survival. The Arabs, and particularly the Palestinians, are transformed into latter-day Nazis. In addition, Israel’s state ideology of Zionism becomes assimilated into the credos of the religion. Soon our hypothetical student cannot tell the difference between the two.

Then, having come of age, our student goes off to college or university. Now he or she is no longer in a closed world. The result can be culture shock and an uncomfortable feeling that the student is on a campus where vocal and assertive debate about Israel and its behavior sounds like an attack on the Jewish religion.

Our student complains to the ZOA (Zionist Organization of America), Hillel, AIPAC or some similar organization, and we are off down a road toward censorship and/or litigation.

Lawsuits are lodged (particularly if the ZOA is involved), donors swear that they will no longer support the institution, legislators bang on desks at the state capitol, and boards of directors want to know what is going on and what the institution’s president is going to do about it.
 
Sweet Reason
 
There have been a number of efforts to use sweet reason to work out some of these problems before they get too explosive. For instance, in 2006 there was concern over the efforts of various pro-Palestinian campus groups to promote an academic boycott of Israel. Is such advocacy anti-Semitic? Should campuses allow it?

After all, those who espouse academic boycott have a good deal of evidence of criminal activities on the part of Israeli universities. At that time the American Association of University Professors (AAUP) sought to clarify the issues by arranging a roundtable discussion on academic boycott between those who stood pro and con.

This sounded like a good idea. But, no, the Zionist side did not like the list of discussants on the pro side and tried to censor the list. The AAUP resisted that move, so the Zionist side pressured the donors subsidizing the proposed roundtable to pull their support. The whole thing collapsed. It seemed the Zionists were not going to discuss the topic except on their own terms.
 
Just recently there has been a similar attempt at sweet reason. A heated debate is now taking place over whether Title VI of the Civil Rights Act of 1964 (which bars federal funds from institutions that discriminate) can be applied to schools that allow criticism of Israel that the Zionists claim is anti-Semitic.

If so, those same Zionists, whose influence is strong in Congress, can use Title VI as a club to threaten colleges and universities with the loss of financial support unless they shut down the criticism. This, of course, equates to censorship and an attack on free speech.
 
Once more the AAUP, which opposes the use of Title VI in such situations, approached the American Zionists in an effort to find a compromise position. Professor Cary Nelson, head of the AAUP, managed to enter into negotiations with Kenneth Stern, the “anti-Semitism expert” of the American Jewish Committee (AJC).

The two of them worked out a common position which, after consultation with others in each organization, was signed and released to the public. What did this document say? For our needs, here are its most important points:
 
1. Title VI is not an appropriate instrument to use when trying to “protect” Jewish students from “anti-Israel events, statements and speakers.” To use Title VI this way amounts to censorship.
 
2. Regarding how to know when activities are anti-Semitic, the document said, “Six years ago the European Monitoring Centre on Racism and Xenophobia (EUMC) created a working definition of anti-Semitism … while clearly stating that criticism of Israel in the main is not anti-Semitic, [it] gives some examples of when anti-Semitism may come into play, such as holding Jews collectively responsible for the acts of the Israeli state, comparing Israeli policy to that of the Nazis, or denying to Jews the right of self-determination (such as by claiming that Zionism is racism).

“In recent years the U.S. Department of State and the U.S. Commission on Civil Rights have embraced this definition too. It is entirely proper for university administrators, scholars and students to reference the working definition in identifying definite or possible instances of anti-Semitism on campus.”

3. So, censorship and Title VI should be avoided, but the “working definition” should be used to make judgments as to how best to “wrestle with ideas” while at the same time “combating bigotry.”
 
This letter was signed by both Cary Nelson as President of the AAUP and Kenneth Stern as the Director of the anti-Semitism and extremism sub-division of the American Jewish Committee. Released in early August, the document was repudiated by the AJC within days.

On Aug. 9, David Harris, the AJC’s executive director, “apologized” for the joint declaration, said it was “ill advised” and blamed a breakdown in the AJC’s “system of checks and balances” for the slip-up. Kenneth Stern is now on an unscheduled sabbatical and cannot be reached for comment.
 
This is, of course, a replay of the 2006 situation and just goes to show that it is the hard-right ideologues who are in charge on the Zionist side. These people have a worldview that allows for no compromise. Censorship is exactly what they want, and Title VI is as good a weapon to wield as any.

What could Kenneth Stern possibly have been thinking? There is no room for sweet reason here.
 
AAUP’s Mistake
 
This is not the end of the story. There is something wrong with the fact that the AAUP was so quick to endorse the EUMC working definition of anti-Semitism (a definition, by the way, that Kenneth Stern had a hand in writing).

Consider these two statements from the above AAUP-AJC declaration each of which, according to the “working definition,” can be seen as anti-Semitic: 1) “holding Jews collectively responsible for the acts of the Israeli state” and 2) “denying to Jews the right of self-determination (such as by claiming that Zionism is racism).”

As we are about to see, the first statement has hidden facets to it and the second defies historical reality.
 
Statement 1:
 
It is absolutely the case that the Jews should not be held collectively responsible for the actions of Israel. But it should be pointed out that it is just such collective responsibility that Zionists insist upon.

Zionist ideology demands that Israel be recognized as representing world Jewry. Zionists expect that, in return, all Jews will identify with and actively support Israel, feeling at one with the “Jewish state.” They classify those Jews who do not recognize their collective responsibility to Israel as somehow deficient or perhaps “self-hating” Jews.

So let us get this straight: if holding Jews collectively responsible for the acts of Israel is anti-Semitic, what does that make the Zionists?
 
Statement 2:
 
a. That Jews have some sort of natural right to political self-determination is highly questionable. How about Protestants, Catholics, Hindus, Buddhists, ad infinitum? Just how far do we want to push this claim of political self-determination for religious faiths?

Oh, but the Zionists insist that Jews are not just adherents to a particular faith; they are a “people.” Well, that is an opinion. It just doesn’t happen to be the opinion of millions of other Jews who see Judaism as a religion pure and simple. Of course, if the latter are vocal about this they run the risk of being labeled “self-hating.”
 
b. And who, except of course the Zionists, says that Zionism is a desirable vehicle for the expression of this alleged right of self-determination?

Let us face it. Israel and its Zionist ideology were born of the will of a small minority of Jews, almost exclusively from Central and Eastern Europe, most of whom were secularists, and almost all of whom carried within their heads the poisoned perceptions of European imperialist bigotry, an outlook which still characterizes the state they set up.

That is why, in practice, Zionism has resulted in a prima facie racist environment in Israel. And now we are told that, according to the “working definition,” pointing out the link between Zionism and racism is an act of anti-Semitism!
 
Given this close reading of parts of the “working definition,” the AAUP really ought to rethink its apparent support of the document. It is a position that can only give impetus to the very censorship the AAUP dreads.
  
One has come to expect twisted logic from the Zionists. Actually, one can expect this sort of thinking from any band of ideologues. Their blinkered vision, incapable of seeing around the corners of their prejudices, guarantees that most of what comes out of their mouths and their pens is sophistry.
 
However, what is one to do when folks you count on as rational and careful thinkers, like the leadership of the AAUP, get caught short this way? What is one to do when flawed reasoning and spurious assumptions start to be translated into criteria for government administrative decisions?

What can you do when a fifth of the Congress decides to take a break and visit one of the most racist places on the planet and you risk being labeled an anti-Semite for decrying this fact?

Well, you have a good laugh, have a good cry, and then go post your assessment of the situation on your website. Then you get a bit drunk. Finally, you repeat ten times “I will never stay silent.”

Lawrence Davidson is a history professor at West Chester University in Pennsylvania. He is the author of Foreign Policy Inc.: Privatizing America’s National Interest; America’s Palestine: Popular and Official Perceptions from Balfour to Israeli Statehood; and Islamic Fundamentalism.




Orange Jumpsuits / Double Standards

Exclusive: The U.S. news media regularly rallies the American public to outrage when a U.S. adversary or some unpopular group is linked to a heinous crime. But a different standard applies to U.S. allies even when there is strong evidence of a similar offense, observes Robert Parry. 

By Robert Parry

In Great Britain, convicted looters are being dressed in orange jumpsuits and made to clean up areas damaged by recent riots. The law-and-order crowd on both sides of the Atlantic cheers this “riot payback scheme,” even when applied to offenders who only grabbed some bottled water or received a stolen pair of running shorts from a friend. 

After all, the phrase “zero tolerance” was made for moments when the poor and the powerless break the rules.

By contrast, these same British authorities will take no action against officials from the former government of Prime Minister Tony Blair, who joined with President George W. Bush’s team in making a bloody mess out of Iraq in clear violation of international law.

Indeed, if the architects of the Iraq War were put in orange jumpsuits and forced to fix the devastation of Iraq, one might see more justice in humiliating the British looters.

But it is impermissible to envision an orange-clad chain gang at work in Iraq consisting of Blair, Bush and their subordinates, the likes of Dick Cheney, Donald Rumsfeld, George Tenet, Jack Straw, Elliott Abrams and a host of neoconservatives, including many big-time media pundits. For such important people, different rules apply.

There also will be no special tribunal set up to deal with these former U.S./U.K. officials (and their allied propagandists) whose aggressive war in Iraq got hundreds of thousands killed. Such courts, it seems, are reserved for international law violators from weak states in Eastern Europe, Africa and Asia.

At the Nuremberg Tribunal after World War II, jurists from the United States and Great Britain made the specific point that the rules being established, including prohibitions against “aggressive war,” were to be applied to the victors, not just the vanquished.

U.S. Supreme Court Justice Robert Jackson, who represented the United States at Nuremberg, stated that holding Nazi leaders responsible was not just a case of victor’s revenge but a desire to establish a precedent against aggressive war in the future.

“Let me make clear,” Jackson said, “that while this law is first applied against German aggressors, the law includes, and if it is to serve a useful purpose, it must condemn aggression by any other nations, including those which sit here now in judgment.”

But it seems Justice Jackson had it wrong. Based on what has happened in the six-plus decades since Nuremberg, an objective observer would have to conclude that the punishment of the Nazis, including the death penalty for some, was indeed a case of victor’s revenge. When leaders of the former Allied powers engage in crimes like “aggressive war,” nothing happens to them.

Indeed, today’s tribunals, such as the International Criminal Court and special courts to handle acts of terrorism, target offenders from weak nations or from unpopular groups. These judicial bodies turn a blind eye to similar crimes committed by or protected by powerful governments.

Evil Libyans

So, while longtime Libyan dictator Muammar Gaddafi and his inner circle seem destined for prosecution by the ICC (if they’re not simply executed by NATO-backed rebels), it’s unthinkable to suggest that Bush, Blair and their inner circles be dragged before the ICC for their role in precipitating the far greater slaughter in Iraq.

You see, while it’s a crime against humanity when Gaddafi kills insurrectionists in Libya, it is perfectly okay when U.S. and British authorities slaughter “militants” opposed to Western occupation of their countries, whether Iraq or Afghanistan. Any “collateral damage” from Gaddafi’s attacks is inexcusable, but “collateral damage” from U.S. missile strikes is shrugged off.

You have similar rules for terrorism. Acts of terrorism against the powerful or their friends must be punished, even if the evidence is thin to invisible and even if the wrong people get blamed. However, acts of terrorism by friends of the powerful require the sort of perfect evidence that doesn’t exist in the real world. Those terrorists rarely get nailed.

Take, for example, the case of right-wing Cubans Luis Posada Carriles and Orlando Bosch. They were clearly implicated as the masterminds of the in-flight bombing of a Cubana Airlines plane in 1976, killing 73 people.

However, under the protection of Miami’s politically powerful Cuban community and the Bush Family, the CIA-trained Posada and Bosch have been allowed to live out their golden years in freedom and comfort. For them, no evidence, not even contemporaneous U.S. intelligence reports and self-incriminating statements, was enough to justify holding these defiant terrorists accountable.

Meanwhile, international tribunals have relied on the skimpiest circumstantial evidence to bring charges against Arabs who are viewed with disdain by Western governments and media. To this day, U.S. journalists ignore the implausibility of Libyan intelligence agent Ali al-Megrahi’s 2001 conviction by a Scottish court for the 1988 bombing of Pan Am 103 over Lockerbie, Scotland.

The special Scottish court convicted Megrahi in the deaths of the 270 people while acquitting a second Libyan in what appeared to be more a political compromise than an act of justice. One judge told Dartmouth government professor Dirk Vandewalle about “enormous pressure put on the court to get a conviction.”

Following Megrahi’s dubious guilty verdict, Libya was coerced into accepting “responsibility” for the bombing to get punitive international sanctions lifted. Despite agreeing to pay reparations to the victims’ families, Libyan officials continued to deny having a role in the bombing.

Then, after the testimony of a key witness against Megrahi was discredited, the Scottish Criminal Cases Review Commission agreed in 2007 to reconsider his conviction out of a strong concern that it was a miscarriage of justice. However, due to more political pressure, that review was proceeding slowly in 2009 when Scottish authorities agreed to release Megrahi on medical grounds.

Megrahi dropped his appeal in order to gain an early release in the face of a terminal cancer diagnosis, but that doesn’t mean he was guilty. He has continued to assert his innocence and an objective press corps would reflect the doubts regarding his conviction.

Instead, American journalists from all hues on the ideological spectrum routinely blame Gaddafi for the Lockerbie bombing and cite it as justification for NATO’s bombing campaign that has killed many young Libyan soldiers (and a number of civilians) while paving the way for anti-Gaddafi rebels to reach Tripoli.

Hariri Bombing

A similar lack of objectivity has applied to the work of a special United Nations tribunal investigating the 2005 assassination of former Lebanese Prime Minister Rafik Hariri. Earlier this month, the tribunal unsealed an indictment accusing four members of Lebanon’s militant group Hezbollah of carrying out the bomb attack that killed Hariri and 21 others.

However, the prosecutors acknowledged that they had no smoking gun or even any direct evidence tying the accused to the crime. Instead, the indictment cited a complex analysis of cell-phone usage attributed to the defendants, though it wasn’t clear how the prosecutors linked the suspects to the various phones.

In many ways, the case had the look of “rounding up the usual suspects,” including Mustafa Amine Badreddine, whose slain brother-in-law Imad Moughnieh was linked to the 1983 bombing of the U.S. Marine barracks in Beirut, an attack that the U.S. media frequently identifies as “terrorist” even though it followed the Reagan administration’s military intervention in the Lebanese civil war.

When the Hariri indictment was unsealed on Aug. 17, the U.S. media again was quick to treat the dubious allegations against the four defendants as credible, since Hezbollah is an unpopular group among U.S. and Israeli officials.

But Hezbollah leaders noted that the indictment lacked any hard evidence and that Israeli intelligence had penetrated Lebanon’s phone service, raising doubts about the reliability of the cell-phone records. (Two senior employees of one cell-phone company were arrested in 2010 for spying.)

Hezbollah denounced the charges as an American-Israeli scheme to discredit the organization and vowed to protect the defendants from arrest.

In the Western press coverage of the indictment, there also was little note that the tribunal’s earlier investigation had reached a very different conclusion, fingering Syrian intelligence for the Hariri killing. That preliminary finding in 2005 had received uncritical front-page treatment in the New York Times and other leading U.S. news outlets, since Syria was another bête noire.

Back then, Consortiumnews.com and Der Spiegel were two of the few news organizations that pointed to what seemed like a rush to judgment by the tribunal’s German investigator, Detlev Mehlis. Some of Mehlis’s witnesses appeared unreliable and promising leads had not been followed up.

When two of those key witnesses were discredited, Mehlis’s initial report was essentially withdrawn by the U.N. tribunal and he quit his post. But the fact that the case collapsed was largely ignored by the U.S. news media, which instead kept referring to Syria’s presumed guilt.

Now, Syria’s presumed guilt has simply been replaced by Hezbollah’s presumed guilt with little or no acknowledgement that a new batch of “usual suspects” has filled in for the old bunch.

Murder Mystery

The complex Hariri murder mystery began on Feb. 14, 2005, when an explosion destroyed a car carrying Hariri through the streets of Beirut. Twenty-one other people also died.

Because Syria was then on President George W. Bush’s hit list for “regime change,” and because Syria was considered a front-line enemy of Israel, speculative evidence of Syrian guilt was an easy sell to the U.S. news media.

So, when Mehlis’s preliminary report was issued in fall 2005, there was little U.S. media skepticism about its assertions of guilt regarding Syrian leaders and their Lebanese allies.

“There is probable cause to believe that the decision to assassinate former Prime Minister Rafik Hariri could not have been taken without the approval of top-ranked Syrian security officials and could not have been further organized without the collusion of their counterparts in the Lebanese security services,” declared Mehlis’s report on Oct. 20, 2005.

Despite the curiously vague wording (“probable cause to believe” that the decision “could not have been taken without the approval” and “without the collusion”), Bush immediately termed the findings “very disturbing” and called for the UN Security Council to take action against Syria.

The U.S. press joined the stampede in assuming Syrian guilt. On Oct. 25, 2005, a New York Times editorial said the UN investigation had been “tough and meticulous” in establishing “some deeply troubling facts” about Hariri’s murderers. The Times demanded punishment of top Syrian officials and their Lebanese allies.

But Mehlis’s investigative report was anything but “meticulous.” Indeed, it read more like a conspiracy theory than a dispassionate pursuit of the truth.

As a wealthy businessman with close ties to the Saudi monarchy, Hariri had many enemies who might have wanted him dead for his business or political dealings. The Syrians were not alone in having a motive to eliminate Hariri.

Indeed, after the assassination, a videotape was delivered to al-Jazeera television on which a Lebanese youth, Ahmad Abu Adass, claimed to have carried out the suicide bombing on behalf of Islamic militants angered by Hariri’s work for “the agent of the infidels” in Saudi Arabia.

However, Mehlis relied on two witnesses, Zuhair Ibn Muhammad Said Saddik and Hussam Taher Hussam, to dismiss the videotape as part of a disinformation campaign designed to deflect suspicion from Syria. (The new indictment also rejects Adass as the suicide bomber.)

Mehlis spun a narrative of a Syrian conspiracy to kill Hariri, implicating four pro-Syrian Lebanese security officials who were jailed on suspicion of involvement in Hariri’s murder. Everything was falling neatly into place.

As a new U.S. press hysteria built over another case of pure evil traced to the doorstep of an American adversary in the Muslim world, holes in the UN report were mostly ignored. At Consortiumnews.com, we produced one of the few critical examinations of what had the look of a rush to judgment. [See “The Dangerously Incomplete Hariri Report.”]

Crumbling Case

Much like the Bush administration’s Iraqi WMD claims, which the Times also had touted uncritically, Mehlis’s Hariri case against the Syrians soon began to crumble.

One witness, Saddik, was identified by the German newsmagazine Der Spiegel as a swindler who boasted about becoming “a millionaire” from his Hariri testimony. The other one, Hussam, recanted his testimony about Syrian involvement, saying he lied to the Mehlis investigation after being kidnapped, tortured and offered $1.3 million by Lebanese officials.

Mehlis soon stepped down, as even the New York Times acknowledged that the conflicting accusations had given the investigation the feel of “a fictional spy thriller.” [NYT, Dec. 7, 2005]

Mehlis’s replacement backed away from the Syrian accusations. Belgian investigator Serge Brammertz began entertaining other investigative leads, examining a variety of possible motives and a number of potential perpetrators.

“Given the many different positions occupied by Mr. Hariri, and his wide range of public and private-sector activities, the [UN] commission was investigating a number of different motives, including political motivations, personal vendettas, financial circumstances and extremist ideologies, or any combination of those motivations,” Brammertz’s own interim report said, according to a UN statement on June 14, 2006.

In other words, Brammertz had dumped Mehlis’s single-minded theory that had pinned the blame on senior Syrian security officials.

Still, the U.S. news media barely mentioned the shift in the UN probe. Virtually nothing appeared in the U.S. press to alert the American people that the distinct impression they had been given in 2005, that the Syrian government had engineered a terrorist bombing in Beirut, was now a whole lot fuzzier.

In 2009, the UN tribunal examining Hariri’s murder and other terrorist acts in Lebanon acknowledged that it lacked the evidence to indict the four Lebanese security officials who had been held without formal charges since 2005. Finally, Judge Daniel Fransen of the special international tribunal ordered the four imprisoned officials released.

In a similar situation, say one that involved a U.S. ally, the release would have been viewed as proof of innocence. In this case, however, the New York Times refused to acknowledge that Mehlis’s initial case against Syria had been weak. Instead, the Times blamed “the legal pitfalls of a divisive international trial.” [NYT, April 30, 2009]

It remained common practice for the New York Times and the rest of the mainstream U.S. news media to continue citing the Mehlis report and referring to “Syrian officials implicated in Mr. Hariri’s killing” without providing more context.

Keeping Up the Pressure  

That pattern continued in 2010 with a New York Times op-ed article, “A U.N. Betrayal in Beirut” by Michael Young, portraying Mehlis as a hero and his replacement, Brammertz, as an incompetent stooge serving a supposed UN cabal to protect Syria.

The online version of Young’s op-ed linked to a 2005 story that trumpeted Mehlis’s initial report, but cited no articles describing the subsequent collapse of Mehlis’s case. (In 2009, Brammertz was replaced by Canadian prosecutor Daniel Bellemare, who brought the current indictment.)

Even in the newly released indictment, there remain gaps around a central piece of evidence, the white Mitsubishi Canter Van that was identified as the vehicle carrying the bomb. According to Mehlis’s initial report, a Japanese forensic team matched 44 of 69 pieces of the van’s wreckage to Canter parts manufactured by Mitsubishi Fuso Corp. and even identified the specific vehicle.

So, the van’s chain of possession would seem to be a crucial lead in identifying the killers. But Mehlis issued his first report suggesting Syrian guilt before that trail had been followed.

At that point, Mehlis only stated that the Japanese forensic team had learned that the van had been reported stolen in Sagamihara City, Japan, on Oct. 12, 2004. A subsequent update to Mehlis’s report added some more intriguing clues about the van, tracking its arrival in the Middle East to port facilities in the United Arab Emirates.

The newly released indictment says the van then found its way into a car showroom in the northern Lebanese city of Tripoli where it was purchased with cash by two unidentified men. The indictment asserts, again without any clear proof, that the buyers were collaborating with the four defendants.

While the evidence against the four Hezbollah members remains murky, what is clear is that Lebanon is regarded by the United States and its regional allies as an important battleground in their geopolitical struggle with Iran.

According to classified State Department cables released by WikiLeaks, Saudi Arabia even discussed a military intervention in Lebanon in 2008 under cover of UN peacekeepers.

On May 10, 2008, Saudi Foreign Minister Prince Saud al-Faisal told U.S. Ambassador David Satterfield that a joint U.S.-Saudi “security response” might be needed against Hezbollah to counter its “military challenge to the Government of Lebanon,” according to a U.S. embassy cable.

“Specifically, Saud argued for an ‘Arab force’ to create and maintain order in and around Beirut, which would be assisted in its efforts and come under the ‘cover’ of a deployment of UNIFIL troops from south Lebanon.

“The US and NATO would need to provide movement and logistic support, as well as ‘naval and air cover.’ Saud said that a Hizballah victory in Beirut would mean the end of the Siniora government and the ‘Iranian takeover’ of Lebanon.”

The cable indicates how high the stakes are in the Lebanese political struggles and how powerful the motivation is to use propaganda to discredit U.S. adversaries there.

Between those propaganda imperatives and the inherent double standards regarding how the U.S. news media addresses crimes by the United States and its allies versus those allegedly committed by U.S. adversaries, it shouldn’t be surprising that an objective observer might lose faith in what’s regularly presented to the American public.

The drumbeat is already building for new sanctions against Hezbollah to force it to turn over the four defendants to the special tribunal, much as Libya was pressured to surrender Megrahi to the special Scottish court which then succumbed to apparent political influence to convict him.

On Aug. 17, the Washington Post published an op-ed by David M. Crane and Carla Del Ponte (two prosecutors in cases involving human rights crimes in Sierra Leone, the former Yugoslavia and Rwanda) demanding strong support from the international community for the Hariri tribunal.

The pair cited a statement by the tribunal’s president, Italian jurist Antonio Cassese, declaring how important it is “to entrench the notion that democracy cannot survive without the rule of law, justice and respect for fundamental human rights.”

That standard apparently applies to weak countries and to movements considered unpopular in the West, but not to the United States, other big powers or CIA-connected terrorists who find safe haven in places like Miami.

It’s as if Washington’s enemies should expect to get fitted for orange jumpsuits, while it would be wrong to subject U.S. officials and their friends to such humiliations.


Robert Parry broke many of the Iran-Contra stories in the 1980s for the Associated Press and Newsweek. His latest book, Neck Deep: The Disastrous Presidency of George W. Bush, was written with two of his sons, Sam and Nat, and can be ordered at neckdeepbook.com. His two previous books, Secrecy & Privilege: The Rise of the Bush Dynasty from Watergate to Iraq and Lost History: Contras, Cocaine, the Press & ‘Project Truth’ are also available there.




Having a Voice in Global Debates

Just as more and more issues require a global response, political pressures in the United States are building against American participation in international bodies designed to address these concerns. R. Spencer Oliver, an American who is secretary general of one such organization, says U.S. representatives must be on hand to engage in the debate.

By R. Spencer Oliver

In a rare moment this summer, the seats of the United States delegation were empty when more than 220 parliamentarians from the world’s largest regional security organization gathered in Belgrade for the opening of their annual meeting.

The 17-seat U.S. delegation has the most votes in the 55-nation Organization for Security and Co-operation in Europe Parliamentary Assembly, a body of which Sen. Benjamin L. Cardin, D-Maryland, is an elected vice president.

Unfortunately, with the House in session and the Senate working several days the week of July 4th, only two voting Americans, Cardin and Sen. Jeanne Shaheen, D-New Hampshire, were present for part of this summer’s meeting.

It was the poorest showing ever for the Americans in the Assembly’s 20-year history, down from a high of 13 U.S. members present in two of the last three years.

Regardless of the worthy reasons precluding member attendance, the low turnout weakens the U.S. argument for greater participation from Russia and other post-Soviet countries in the OSCE. Russia sent nine delegates to the meeting.

Political debate is like sport: you can’t win if you don’t show up. The low turnout from the Americans cost Rep. Robert B. Aderholt, R-Alabama, his vice-chairmanship of the Assembly’s human rights committee, a post in which he was an active contributor for the last two years.

Without Aderholt, Cardin is left as the sole American in the Assembly’s elected leadership, and his vice-presidential term expires next year.

At a time when the U.S. Congress and the European Union have moved to sanction nations for their violations of human rights, forums like the OSCE become all the more important. Bringing parliamentarians and diplomats together for five days of meetings, debates and votes can create new streams for dialogue where others may have dried up.

In fact, nowhere other than the OSCE can members interact with their elected counterparts from Russia, Central Asia, the Balkans and Western Europe.

Take the case of Belarus, where opposition figures have been repeatedly imprisoned for exercising freedoms of assembly and expression. The country became a priority topic accounting for at least three hours of discussion in Belgrade, and for much of it a member of the Belarusian parliament was present, never hiding from the criticism but instead sitting and speaking right next to the German MP demanding that prisoners be released.

The U.S. delegation is known to welcome meetings of its own, often with people with whom its members do not see eye to eye. Their jam-packed schedules keep members of Congress busy from the moment of landing, always squeezing the most out of these diplomatic opportunities.

Here a bilateral meeting with the Russian delegation, there a visit with a head of state, in this case Serbia’s Boris Tadic. The trips end up being as important for the lasting international relationships they forge as for the substantive ideas the members discuss when together.

Despite the small bench for the Americans this month, Cardin proved more than able to keep up the multi-tasking tradition at the Belgrade meeting. Juggling no fewer than four substantive issues in various committees, he made sure the delegation, though without House members, was still wholly invested in the multilateral process.

Cardin seemed to be everywhere: successfully pushing an amendment on extractive industry transparency, speaking about investigating organ trafficking, and promoting a colleague’s cyber security measure. He was equally active behind the scenes, where his lobbying contributed to the narrow defeat of a resolution that would have called for giving the Palestinian Authority partner status in the OSCE.

But as often as we saw Cardin speaking in Belgrade, fellow parliamentarians repeatedly were asking about his colleagues, especially Aderholt and Rep. Chris Smith, R-New Jersey, the Assembly’s special representative on human trafficking.

Smith’s Victims of Trafficking and Violence Protection Act, the world’s first major anti-trafficking measure, took off internationally thanks largely to his work at an Annual Session where he gave his colleagues copies of the bill. They went home, translated it, and made it law, beginning a global network that still works to combat modern-day slavery.

This type of robust activity is the hallmark of U.S. participation, but if it continues to fall to only one or two members to do all the heavy lifting, it becomes harder to sustain and weakens the international forum at a time when legislators most need to benefit from each other’s experiences.

R. Spencer Oliver is secretary general of the Organization for Security and Co-operation in Europe Parliamentary Assembly.




US/Israel Can Respect Palestinian Rights

The clock is ticking on what could be the next explosion in the Middle East, if Palestinians press their demand for United Nations recognition as a state and the United States and Israel continue to spurn this acknowledgement of Palestinian rights. But Adil E. Shamoo says this political bomb can be defused.

By Adil E. Shamoo

If conditions do not change quickly by the time of the U.S.-promised veto of Palestinian statehood at the United Nations on Sept. 20, the Palestinian-Israeli conflict could explode into a new uprising with hundreds of deaths.

The recent attack by Palestinian extremists on a bus in the southern Israeli resort town of Eilat and the eager over-reaction of Israeli Prime Minister Benjamin Netanyahu are a harbinger of what is to come.

The uprising will bring the United States into sharp conflict with not only the Palestinians but also the rest of the Arab world. A new Arab spirit is demanding that the rest of the world, especially the United States, treat Arabs with equal respect and dignity.

The Palestinians will ask the upcoming UN General Assembly to vote for “non-member state” status for the Palestinians on Sept. 20. Since this resolution bypasses the Security Council, the promised U.S. veto will not be operative.

The least desirable choice for the United States is to vote no in the General Assembly. It would isolate the United States from the rest of the world community, which is expected to agree to the Palestinians’ sought-after status.

With the United States at its lowest popularity in the Arab world, this further isolation would only create additional challenges as the Arab Spring turns cloudy and many long-term challenges complicate U.S.-Arab relations.

The Palestinians have struggled for over 60 years to regain their rights, economic justice, and dignity. They have tried peaceful confrontation, military action, terrorism, and negotiation — without any success.

The 1.5 million Palestinians in Gaza live in an open-air prison with the highest unemployment (45 percent) in the world, near-starving conditions, and little or no medical care. Israel even stops humanitarian flotillas from reaching Gaza.

Another 1.5 million Palestinians live in Israel as second-class Israeli citizens. Do the Israelis consider the Palestinians as equal human beings?

The Israelis paint the conflict at every step as an existential threat. Israel has legitimate security concerns, which have been addressed as part of successive deals.

The existential threat may have been true in the first few decades of Israel’s existence. However, most reasonable observers and many Israelis know that a demilitarized Palestinian state is not an existential threat.

Israel has the upper hand militarily, and it has used it with a vengeance to suppress Palestinian aspirations. The Israelis are engaged in a policy of open-ended negotiation while confiscating and resettling Palestinian land.

President Obama has attempted to move the negotiations forward slightly by endorsing the blueprint used by previous administrations, namely the 1967 borders with mutually agreed land swaps. But the Obama administration remains as reluctant as its predecessors to pressure its Israeli ally to negotiate in good faith.

The Israeli lobby remains powerful on Capitol Hill, the State Department is staffed by strong supporters of Israel, and the U.S. media features very few voices representing Arab concerns. It’s no surprise that U.S. policies rarely reflect Arab views.

Israel’s policy has increased its isolation in the Middle East and the rest of the world, everywhere in fact except in the United States.

Turkey used to be the closest ally of Israel in the Middle East. But after the killing of nine Turkish citizens (one also having U.S. citizenship) in the Gaza flotilla raid last year and Israel’s refusal to apologize, the relationship between the two countries could not be any colder.

Playing Catch Up

U.S. foreign policy toward the Arab world has not changed to catch up with the Arab Spring.

The Arab Spring is a result of centuries of occupation and indignity. Arabs are now more educated and more connected to the outside world. But instead of working with this new generation, the United States is trying to leverage its relationships with military contacts in Arab militaries to indirectly maneuver the Arab Spring in a way to sustain U.S. interests.

Arabs can easily see the inconsistency of a U.S. policy that supports the overthrow of Libya’s Muammar Gaddafi while taking no action in Bahrain and remaining silent about Saudi Arabia’s oppression.

The Arab Spring has forced the Arab people to face their reality of occupation, colonization, and U.S. and Western support of their corrupt regimes.

The current crises in several Middle Eastern countries, such as those in Syria, Yemen, Bahrain, Iraq, Jordan, and Iran, are destabilizing the area. The U.S. veto of the Palestinian statehood resolution at the UN will further aggravate a difficult situation.

This destabilization can become further inflamed if the Palestinian-Israeli conflict deteriorates into another massacre of the Palestinians by Israeli forces. Arab anger can easily be directed against the United States.

As a primary issue among Arabs, the Palestinian-Israeli conflict remains a barometer that shows the willingness of the United States to grant Arabs equal respect. At this tenuous time in the Middle East, the killing of innocent Palestinian civilians by the Israeli military with U.S. acquiescence is explosive.

But the United States can do something to change the situation. It can acknowledge the new realities in the Arab world by recognizing Palestinian self-determination at the UN. Treating Arabs as equals rather than a people to be manipulated for political and economic gain is a lesson of the Arab Spring that the United States can still learn.

Adil E. Shamoo, a senior analyst for Foreign Policy In Focus, writes on ethics and public policy. He is the author of the forthcoming book Equal Worth: When Humanity Will Have Peace. He can be reached at ashamoo@umaryland.edu.




Making Airport Screening Saner

In the decade since 9/11, airports have invested a fortune in heightened security against terrorism while alienating millions of passengers with procedures that demean and delay. Retired prosecutor William John Cox suggests some improvements to the system.

By William John Cox

Google the phrase “TSA stupidity” and you will find that almost one-and-a-half million websites have something to say about the subject. 

If the United States is to avoid another major terrorist attack on its air transportation system without placing greater restrictions on the civil liberties of air travelers, the Transportation Security Administration (TSA) had better get smart.

Everyone who travels by air in the United States has a depressing story to tell about airport screening.

Media stories of a gravely ill 95-year-old grandmother forced to remove her adult diaper before being allowed on a plane and viral videos showing terrified children being intimately touched by TSA agents are more than depressing. They are a chilling commentary on the police state increasingly accepted by the American public in the name of security.

Air travelers dare not complain. TSA standards focus additional scrutiny on travelers who are “very arrogant” and express “contempt against airport passenger procedures.”

Is such repression the only choice? Or, can TSA officers be trained to exercise the necessary discretion to detect would-be terrorists, while allowing innocent travelers to swiftly and safely pass through screening?

A reasonable and practical balance in airport security screening policy must be achieved before another terrorist attack results in even greater repression.

Today’s TSA

Shocked that poorly trained airport security guards allowed terrorists armed with box cutters to board and use four passenger airplanes as flying missiles of mass destruction, Congress established the TSA two months after 9/11.

Fifty thousand Transportation Security Officers (TSO) were quickly hired and rushed through one-week training courses. Although these officers are now federal employees and receive improved training, they are still security guards. Even so, as “officers” of Homeland Security, they exercise great power over the flying public.

TSA transformed contract screening guards into quasi-law enforcement officers and provided uniform training and policies; however, the TSA was organized as a top-down directed organization which allows very little discretion to individual officers. 

Its “one size fits all” approach to screening results in well-intended but outrageous conduct by its agents.

In an attempt to prevent collective bargaining and to avoid adding Democratic-leaning permanent workers to the federal bureaucracy, the Republican-controlled Congress exempted TSA employees from most federal civil service laws. 

Instead, the Secretary of Homeland Security and the TSA administrator were given virtually unlimited authority to create a personnel system. This action was to have a number of unintended consequences.

Although legislation has been introduced to bring TSA officers into the federal civil service, the TSA administrator retains absolute control over the personnel system. Exercising this power, John Pistole, the administrator appointed by President Barack Obama, granted some bargaining rights earlier this year.

While Pistole’s order provides greater job protection to officers, it does nothing to improve the existing TSA personnel selection system. As presently constituted, the employment process perpetuates mediocrity and limits the ability of TSA managers to hire and promote the most qualified officers.

Currently TSA job applicants primarily use the Internet to identify job announcements for TSA airport operations at more than 450 airports, complete applications and take an online test to measure their ability to operate screening equipment.

All English-speaking U.S. citizens over the age of 18 with a high school diploma, a GED, or one year of experience as a security officer or x-ray technician, meet the basic requirements for TSA officers, as long as they are current in their payment of income taxes and child support.

The main problem is that, once applicants meet these minimum requirements and pass a physical examination, drug screening and perfunctory background investigation, they are lumped together with all other applicants in a hiring pool for each job site.

Unlike general civil service rules, there are no ranked lists of the most qualified applicants within these pools.

Under the personnel standards established by the TSA administrator, local managers are required to select officers from the hiring pool based on the earliest applicant first, irrespective of their additional qualifications. 

Thus, a local TSA manager must hire a high-school dropout with a GED and no experience who applied one day before a college graduate with a degree in criminal justice and who earned his or her way through college working for the campus police department. 
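To make the contrast concrete, here is a minimal sketch of the two hiring rules, written in Python purely for illustration. The applicant fields and the scoring metric are hypothetical inventions, not features of any actual TSA system; the point is only the difference between “earliest application wins” and the ranked-list approach used in general civil service.

    # Hypothetical illustration only: contrasts the "earliest applicant first"
    # rule described above with a civil-service-style ranked list.
    from dataclasses import dataclass

    @dataclass
    class Applicant:
        name: str
        application_day: int       # lower number = applied earlier
        qualification_score: int   # invented metric; higher = more qualified

    pool = [
        Applicant("GED holder, no experience", application_day=1, qualification_score=2),
        Applicant("Criminal-justice graduate", application_day=2, qualification_score=9),
    ]

    # Rule the article describes: hire strictly by application date.
    hired_by_date = min(pool, key=lambda a: a.application_day)

    # Ranked-list alternative: hire by qualifications.
    hired_by_rank = max(pool, key=lambda a: a.qualification_score)

    print(hired_by_date.name)   # GED holder, no experience
    print(hired_by_rank.name)   # Criminal-justice graduate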

While some managers conduct oral interviews of candidates, only in rare cases are they allowed to reject candidates who meet the minimum qualifications.

Laboring under a flawed selection process and making the best of available candidates, TSA has identified three basic ways to achieve mission effectiveness: baggage inspection, passenger screening and, most recently, behavior observation.

Although not every checked bag is hand-inspected, passengers are not allowed to lock baggage unless special TSA locks are used. As a result, most bags are inspected by officers who are either working alone or under limited supervision.

There have been some recent improvements in baggage security; however, the New York Press reports that “according to Transportation Security Administration records, press reports and court documents, . . . approximately 500 TSA officers” have been “fired or suspended for stealing from passenger luggage since the agency’s creation.”

Every passenger is personally screened before boarding commercial aircraft and the majority of TSA officers are deployed to handle this task. Having a mission in which officers “literally touch passengers” and their most private possessions “requires a workforce of the best and brightest,” according to Nico Melendez, TSA Public Affairs Manager of the Pacific Region.

Unfortunately, because of low hiring standards and minimum training, many, if not most screening officers possess poor people skills and manage to offend a large portion of the flying public on a daily basis.

Seeking to emulate the Israeli model of “identifying the bomber, rather than the bomb,” TSA deployed Behavior Detection Officers (BDO) in 2007 under its Screening of Passengers by Observation Techniques (SPOT) program. 

Officers randomly ask passengers questions, such as “Where are you traveling?”, while looking for facial cues that might indicate deception or terrorist intent, leading to additional questioning and closer inspection of baggage.

Thousands of BDOs are now working in hundreds of airports and the program is being expanded; however, they are generally selected from screening personnel and only given two weeks of training before being deployed.

There has been no scientific validation of the program and, although there have been hundreds of criminal arrests, most have been for documentation issues, such as immigration violations and outstanding warrants.

Would improved procedures for selecting TSA officers better ensure the safety of the flying public and reduce the incidence of civil rights violations?

Building a Better TSA

The essential question is whether TSA officers are security guards or police officers when it comes to the manner in which they lay hands on the bodies and belongings of passengers. The difference between the two roles lies in the manner and extent to which they make decisions.

Security guards with minimal training cannot be expected to exercise discretion in critical matters. They are told exactly what or what not to do. The result is that screaming children are being felt up by strangers and the sick and elderly are publicly humiliated.

On the other hand, even with the “mandatory” criminal laws passed in the past 30 years, America’s free society still requires the exercise of arrest, prosecution and sentencing discretion in the criminal justice system, if there is to be individual justice in an individual case.

TSA must rethink the manner in which its officers are hired and trained to allow greater discretion, without an unacceptable rise in the risk of a terrorist attack.

The TSA has been moving in this direction with its “risk-based intelligence-driven screening process”; however, its steps have been hesitant and unsure, as it has staggered from incident to increasingly negative incident.

TSA official Melendez believes the key to successful screening is a workforce capable of implementing a risk-based screening process based upon updated software and equipment and ready access to an improved database.

So, how can a marginally trained group of 50,000 security guards be converted into a professional workforce, one which has the intellectual ability and training to use sophisticated detection equipment and computer databases, and which allows TSA officers to decide which sick person or young child should be allowed to proceed without a mandatory body search?

Selection. A former high-level TSA manager, who declined to be publicly identified, firmly believes that TSA could build an elite organization, if local managers were simply allowed to rank the hiring pools by qualifications, rather than having to hire the candidate who filed the earliest application.

Certainly there is a need to avoid discrimination in hiring and to create a “diverse and inclusive” workforce that is reflective of the public it serves; however, police departments have used a civil service process for decades that involves testing and interviews to establish priority lists to ensure the employment and promotion of the most qualified candidates.

Among the federal law enforcement agencies, the FBI moves applicants through a multi-phase selection process in which advancement depends upon “their competitiveness among other candidates”; Secret Service applicants must pass several examinations and a series of in-depth interviews; and ATF applicants who pass entrance exams and assessment tests have to successfully complete a “field panel interview.”

The current recession and high unemployment rate have resulted in a gigantic pool of highly qualified and well-educated people who are looking for work. At the same time, TSA has been experiencing a fairly high turnover of employees, even though it offers a generous salary and benefit package.

Given all of this, there is a golden opportunity to improve the quality of the TSA workforce, particularly as it relates to the ability of its officers to exercise discretion.

A recent informal survey of airport car rental employees revealed that all of them were college graduates; however, they generally earned less and had fewer benefits than the TSA officers who worked in the same building.

In fact, most national car rental companies require all applicants to have college degrees.

Avis says, “College graduates, start your engines” in its attempt to attract “energetic pro-active college graduates who are eager to accelerate their careers in a fast-paced environment.” Enterprise “prefers” college degrees since applicants will “be involved in a comprehensive business skills training program that will help you make crucial business decisions.”

Clearly it is neither necessary nor appropriate for all TSA applicants to be college graduates; however, local TSA managers should be allowed to consider levels of education, as well as length and quality of relevant experience, in establishing priority lists for hiring replacement officers and for promoting officers to supervisory or BDO positions.

Revised personnel policies that rank applicants by qualifications for these advanced positions would also allow TSA managers to directly hire more qualified candidates, such as retired police officers, for positions requiring a higher level of decision making.

Training. Currently, most training of TSA officers is conducted through online applications of standardized instruction. 

While such training may be adequate to communicate rule-based procedures to security guards, it is inadequate to teach the more finely nuanced insights required for officers to safely exercise discretion in individual cases.

Behavior Detection Officers and supervisors are currently selected from the ranks of TSOs and receive as little as two weeks of additional training upon promotion. However, a successful risk-based screening process involving critical thinking requires more intensive development and training.

Obviously, TSA can’t fire 50,000 officers and start over from scratch, but surely there is a way to safely maintain the basic security-guard approach to screening while allowing for higher levels of discretion in the process.

Assuming that TSA managers are allowed to promote officers more effectively and to select supervisors and Behavior Detection Officers from outside the organization, and further that TSA improves the training of supervisors and BDOs, those officers could begin to exercise the quality of discretion that would allow small children and elderly grandmothers to safely pass through security without impermissible assaults.

TSA should consider establishing regional training academies at the larger facilities around the country to provide newly appointed supervisors and BDOs with classroom training in the nature of policy, the concept of rational profiling and the exercise of security discretion in a free society.

Policy. Policies, as differentiated from procedures and rules, are intended as broad guidelines for the exercise of discretion, allowing decision makers some flexibility in their application.

The exercise of critical discretion will fail in the absence of effective policies. This was recognized by the National Advisory Commission on Criminal Justice Standards and Goals in its Report on the Police in 1973:

“If police agencies fail to establish policy guidelines, officers are forced to establish their own policy based on their understanding of the law and perception of the police role. Errors in judgment may be an inherent risk in the exercise of discretion, but such errors can be minimized by definitive policies that clearly establish limits of discretion.”

We are all aware of the insidious and repressive nature of racial profiling as practiced by some law enforcement agencies. Indeed, one criticism of the TSA Behavior Detection program was that Newark BDOs, known as “Mexican hunters,” concentrated on Hispanic-appearing individuals, resulting in a large number of arrests for immigration violations.

Well-considered policies can allow BDOs to productively direct their attention to the most suspicious candidates for extended questioning, rather than to mindlessly and repetitively ask every single traveler where they are going.

With improved policy guidance and greater discretion, BDOs might actually identify and stop a real threat, but they will only offend even more travelers if they continue to follow rote procedures.

Perhaps most importantly, such policies can provide commonsense guidelines for qualified decision makers at each screening station to allow obviously harmless grandmothers and children to avoid intrusive body contact, while focusing attention on those individuals more likely to be terrorists.

The Right Direction

According to TSA 101, a 2009 overview of the TSA, the agency seeks to evolve itself “from a top-down, follow-the-SOP culture to a networked, critically-thinking, initiative-taking, proactive team environment.”

TSA Administrator Pistole wants “to focus our limited resources on higher-risk passengers while speeding and enhancing the passenger experience at the airport.”

On June 2, Pistole testified before Congress that “we must ensure that each new step we take strengthens security. Since the vast majority of the 628 million annual air travelers present little to no risk of committing an act of terrorism, we should focus on those who present the greatest risk, thereby improving security and the travel experience for everyone else.”

It appears TSA is moving in the right direction, and Pistole may be the person to keep it on course. Prior to his appointment by Obama in May 2010, he served as the Deputy Director of the FBI and was directly involved in the formation of terrorism policies.

Most significantly, his regard for civil rights was suggested by his approval of FBI policy placing limits on the interrogation of captives taken during the “war on terror.” The policy prohibited agents from sitting in on coercive interrogations conducted by third parties, including the CIA, and required agents to immediately report any violations.

One can hope that TSA Administrator Pistole will exercise his authority to bring about improved selection and training of TSA personnel and will promulgate thoughtful screening policies, achieving a safer and less stressful flying experience for everyone.

William John Cox is a retired prosecutor and public interest lawyer, author and political activist. He authored the portions of the Police Task Force Report on the role of the police and policy formulation for the National Advisory Commission on Criminal Justice Standards and Goals in 1973. His efforts to promote a peaceful political evolution can be found at VotersEvolt.com, his writings are collected at WilliamJohnCox.com and he can be contacted at u2cox@msn.com.




The World at a Tipping Point

America and the world seem precariously balanced between those who wish to deny the many problems facing mankind and those who insist that the human race address the multiple crises confronting the planet. Winslow Myers sees reason to hope that the world will tip in a positive direction.

By Winslow Myers

The brilliance of the “Mad Men” television series lies in the crackerjack acting and script, but even more in the way the series dramatizes the paradigm shift of American women from gross subjugation to rough equality.

In an early episode, protagonist Don Draper reluctantly allows his wife to consult a (male) psychiatrist, and then calls the doctor, who casually violates confidentiality.

The series explains much about how the males of my generation often haplessly misunderstood — or deliberately ignored — the autonomous subjectivity of females.

This raises two questions: What blindnesses operating in the present cultural moment might be illuminated by talented scriptwriters as they look back from the perspective of 2040?

And second, what is the vision that orients us as we work to ensure that there will be a future to look back from in 2040?

American politics in 2011, in the run-up to the next presidential election, seems to operate in a weird bubble of denial, the engine of which is politicians pandering for votes. No one gets to be a President or Senator by emphasizing such unvarnished truths as:

–Oil and coal companies wield too much power, slowing or preventing altogether an incentivized transition to clean and sustainable forms of energy generation.

–People of wealth and large corporations do not pay their fair share of taxes, and as long as Congress is in thrall to lobbyists, reform of the tax code toward simplicity, transparency and fairness will be slow in coming.

–Some American financial institutions characterized as “too big to fail” are insufficiently regulated, making money off the misfortunes of ordinary citizens, intensifying the grotesque differences between the incomes of the super-rich and all the rest of us.

–The wars in Iraq and Afghanistan are obscenely expensive stalemates that have not increased our security, and may have created more terrorists than they have killed.

–Nuclear weapons have become completely useless as instruments of deterrence.

–The U.S. defense budget is bloated and lacks accountability.

–Global climate instability is clearly being intensified, if not caused, by human activity.

–The U.S. military is the world’s biggest user of fossil fuels and its biggest polluter, even as it plans to fight wars caused by the same extreme climate events that are presently intensifying chaos and dislocation for millions.

–The debt ceiling of nations may be negotiated or engineered, but the debt that comes from the unsustainable assault of too many humans on the living systems of the Earth is non-negotiable.

Coral reefs are dying; the oceans are polluted with plastic; many fish species have been harvested almost to extinction; tropical rain forests are still being put to the torch or the saw; polar icecaps and mountain glaciers continue to melt at faster than expected rates.

But there is also good news, about which we do not hear enough from our candidates:

–There are millions of non-governmental organizations springing up around the world that agree upon the values of human rights for all, eco-sustainability, nonviolence and democratic structures, constituting what entrepreneur and ethicist Paul Hawken calls the largest mass movement in history.

One important new organization is Awakening the Dreamer, which offers citizens a free half-day seminar that awakens us to the real challenges we face, and the real possibility of meeting them.

–War just might be a dying institution. Wars of decolonization or proxy wars between superpowers have scaled back to zero since the end of the Cold War. While still horrible, contemporary wars kill fewer civilians and soldiers than some of the conflagrations of the not-too-distant past.

Still, this optimism about war fails to account for the continued presence of massive numbers of nuclear weapons, the ever-increasing effects of climate change upon the poorer nations, global population growth, and the unpredictable element in current events.

New World

We find ourselves waking up in a whole new world, where rich and poor occupy the same leaky boat in a polluted sea.

Ensuring the future requires a fundamental shift in thinking from “I am separate” to “We are one”, a paradigm shift from measuring our economic success quantitatively to finding new qualitative criteria.

From turning reflexively toward war to moving aggressively to prevent war. From grotesquely large military budgets to humanitarian aid that directly meets human needs. From candidates who deny global warming to candidates who advocate for a reorientation of priorities on the level of a planetary Marshall Plan.

None of this will happen unless we all get involved, and push and question and become an active force that leaders cannot ignore.

This is the time when candidates are spending the most time listening to ordinary citizens. The questions we ask can be powerful agents of a new awakening.

If that came to pass, we might someday enjoy a TV series that looked back through the decades to dramatize the gradual end of our delusions.

It might make us wince at the “windy militant trash” (Auden) of present political discourse just as we wince at the dated chauvinism of the “Mad Men” era, but we might also be celebrating how far we had come.

Meanwhile we have a long way to go, baby.

Winslow Myers, the author of Living Beyond War: A Citizen’s Guide, serves on the Board of Beyond War (www.beyondwar.org), a non-profit educational foundation whose mission is to explore, model and promote the means for humanity to live without war.




The Dangerous Reagan Cult

Exclusive: Ronald Reagan’s anti-government philosophy inspires Tea Party extremists to oppose any revenue increase, even from closing loopholes on corporate jets. Democrats try the spin that “even Reagan” showed flexibility on debt and taxes. But Robert Parry says it is the “Reagan cult” that is at the heart of America’s crisis.

By Robert Parry

In the debt-ceiling debate, both Republicans and Democrats wanted Ronald Reagan on their side. Republicans embraced the 40th president’s disdain for government and fondness for tax cuts, while Democrats noted that “even Reagan” raised the debt limit many times and accepted some tax increases.

But Reagan, possibly more than any other political leader, deserves the blame for the economic/political mess that the United States now finds itself in. He was the patriarch of virtually every major miscalculation that the country has made over the past three decades.

It was Reagan who slashed taxes on the rich to roughly their current level; he opened the flood gates on deficit spending; he accelerated the decline of the middle class by busting unions and slashing support for local communities; he disparaged the value of government regulations; he squandered money on the Pentagon; he pushed more militaristic strategies abroad; and he rejected any thoughtful criticism of past U.S. foreign policies.

Reagan also created what amounted to a “populist” right-wing cult that targeted the federal government as the source of nearly all evil. In his First Inaugural Address, he famously declared that “government is not the solution to our problem; government is the problem.”

It is that contempt for government that today is driving the Tea Party extremists in the Republican Party. Yet, as with many cults, the founder of this one was somewhat more practical in dealing with the world around him, thus explaining some of Reagan’s compromises on the debt ceiling and taxes.

But once the founder is gone, his teachings can become definitive truth to the disciples. Flexibility disappears. No deviation is permitted. No compromise is tolerated.

So, at a time when government intervention is desperately needed to address a host of national problems, members of this Reagan cult apply the teachings of the leader in the most extreme ways. Since “government is the problem,” the only answer is to remove government from the equation and let the corporations, the rich and the magical “market” dictate national solutions.

It is an ironic testament to Ronald Reagan’s enduring influence that America’s most notable “populist” movement, the Tea Party, insists that tax cuts for the wealthy must be protected, even minor ones like tax loopholes for corporate jets. Inside the Tea Party, any suggestion that billionaire hedge-fund managers should pay a tax rate equal to that of their secretaries is anathema.

Possibly never in history has a “populist” movement been as protective of the interests of the rich as the Tea Party is. But that is because it is really a political cult dedicated to the most extreme rendering of Ronald Reagan’s anti-government philosophy.

Astro-Turf ‘Populists’

Granted, the Tea Party also can be viewed as an astro-turf outfit financed by billionaires like the Koch brothers and promoted by billionaire media mogul Rupert Murdoch. But Election 2010 proved that the movement is capable of putting like-minded politicians into office, especially when discouraged elements of the American Left choose to sit on the sidelines.

During the debt-ceiling battle, the GOP’s Tea Party caucus showed it was strong enough to block any compromise that included a revenue increase. The thinking is that the “evil” government must be starved even if that means defending indefensible tax loopholes and shoving the world’s economy to the brink of catastrophe.

The Tea Party’s rabid enforcement of the Reagan orthodoxy instills such fear among top Republicans that every one of the eight presidential hopefuls at a recent Iowa debate vowed to reject a deal that would include just $1 of higher taxes for each $10 in spending cuts. Even supposed moderates like Mitt Romney and Jon Huntsman threw up their hands.

But the Reagan cult reaches far beyond the Republican Party. Last February, a Gallup poll found that Americans rate Reagan the greatest president ever, with a five-percentage-point lead over Abraham Lincoln.

These days, virtually no one in Washington’s political or media circles dares to engage in a serious critique of Reagan’s very checkered record as president. It’s much easier to align yourself with some position that Reagan took during his long career, much like a pastor selectively picking a Bible passage to support his theological argument.

When negative national trends are cited, such as the decline of the middle class or the widening gap between rich and poor, the self-censorship demands that Reagan’s name not be spoken. Instead, there are references to these problems deepening “over the past three decades,” without mentioning whose presidency got things going big time.

Creating an Icon

And there is a self-interested reason for this hesitancy. The Republicans and the Right have made it a high priority to transform Reagan into an icon and to punish any independent-minded political figure or journalist who resists the group think.

The first step in this process occurred in the late 1980s, with aggressive cover-ups of Reagan’s crimes of state, such as scandals over the Iran-Contra arms-for-hostages affair, Contra-cocaine trafficking, and the Iraq-gate support of dictator Saddam Hussein.

Faced with furious Republican defenses of Reagan and his inner circle, most Democrats and mainstream journalists chose career discretion over valor. By the time Bill Clinton was elected in 1992, the refrain from Democrats and Washington pundits was to “leave that for the historians.”

Those who didn’t go along with the cover-ups, like Iran-Contra special prosecutor Lawrence Walsh, were subjected to ridicule from both the right-wing and mainstream media, from both the Washington Times and the Washington Post. Journalists who challenged the implausible Reagan cover-ups also found themselves marginalized as “conspiracy theorists.”

Leading Democrats decided it made more sense to look to the future, not dwell on the past. Plus, acquiescing to the cover-ups was a way to show their bipartisanship.

However, Republicans had other ideas. Having pocketed the concessions regarding any serious investigations of Reagan and his cohorts, the Republicans soon went on the offensive by investigating the heck out of President Clinton and his administration.

Then, having stirred up serious public doubts about Clinton’s integrity, the Republicans trounced the Democrats in the 1994 congressional elections. With their new majorities, the Republicans immediately began the process of enshrining Reagan as a national icon.

By and large, the Democrats saw these gestures, like attaching Reagan’s name to National Airport, as another way to demonstrate their bipartisanship.

But Republicans knew better. They understood the strategic value of elevating Reagan’s legacy to the status of an icon. If everyone agreed that Reagan was so great, then it followed that the hated “guv-mint” must be that bad.

More Accommodations

Increasingly, Democrats found themselves arguing on Republican ground, having to apologize for any suggestion that the government could do anything good for the country. Meanwhile, the Clinton-era stock market boom convinced more Americans that the “market” must know best.

Going with that flow, President Clinton signed a Republican-sponsored bill that removed Depression-era regulations in the Glass-Steagall Act, which had separated commercial and investment banks. With the repeal, the doors were thrown open for Wall Street gambling.

In the short run, lots of money was made, encouraging more Americans to believe that the government and its “safety net” were indeed anachronisms for losers. People with any gumption could simply day-trade their way to riches.

Reagan, it seemed, was right all along: government was the problem; the “free market” was not only the solution but it could “self-regulate.”

That was the political/media environment around Election 2000 when the wonkish Vice President Al Gore ran against the brash Texas Gov. George W. Bush, who came across to many as another version of Ronald Reagan, someone who spoke simply and disdained big government.

Though Gore could point to the economic successes of the Clinton years, including a balanced federal budget and the prospect of the total elimination of the federal debt, the major media mocked him as a know-it-all nerd who wore “earth-toned sweaters.” Meanwhile, mainstream journalists swooned over Bush, the regular guy.

Still, Gore eked out a narrow victory in the national popular vote and would have carried the key state of Florida if all legally cast votes had been counted. But Bush relied on his brother’s administration in Florida and his father’s friends on the U.S. Supreme Court to make sure that didn’t happen. Bush was declared the winner in Florida and thus the new president. [For details, see Neck Deep.]

In retrospect, Election 2000 was a disastrous turning point for the United States, putting into the highest office in the land an unqualified ne’er-do-well who had lost the election.

But this outrage against democracy was largely accepted because of the muscular right-wing machine, the on-bended-knee mainstream media and the weak-kneed Democrats, a political/media dynamic that Reagan had helped create and had left behind.

The progress that the Clinton administration had made toward putting the U.S. financial house in order was quickly undone as Bush pushed through two massive tax cuts benefiting mostly the rich and waged two open-ended wars financed with borrowed money.

Years of Reaganism also had taken their toll on the government’s regulatory structures. Reagan had consistently appointed regulators who were hostile to the very concept of regulating, such as Anne Gorsuch at the Environmental Protection Agency and James Watt at Interior. He also elevated Alan Greenspan, a “free market” admirer of Ayn Rand, to be chairman of the Federal Reserve Board.

In the 1980s, the looting of America was underway in earnest, but the elites of Washington and New York saw little to protest since they were getting a cut of the plunder. The real losers were the average Americans, especially factory workers who saw their unions broken or their jobs shipped overseas under the banner of “free trade.”

Feeling Good

But many Americans were kept entranced by Reagan’s feel-good magic.

Taking office after a difficult decade of the 1970s, when America’s defeat in Vietnam and the Arab oil price hikes had shaken the nation’s confidence, Reagan simply assured everyone that things would work out just fine and that no excessive sacrifice was in order. Nor should there be any feelings of guilt, Reagan made clear.

By the late 1970s, it was widely accepted even among many Republicans that the Vietnam War had been an abomination. But Reagan simply rebranded it a “noble cause,” no reason for any serious self-reflection on America’s imperial role in the world.

Reagan then allied the United States with “death-squad” regimes all over Latin America and across the Third World. His administration treated the resulting carnage as a public-relations problem that could be managed by challenging the patriotism of critics.

At the 1984 Republican National Convention, Reagan’s United Nations Ambassador Jeane Kirkpatrick labeled Americans who dared criticize U.S. foreign policy as those who would “blame America first.”

To continue this sort of verbal pummeling on those who continued to get in the way, Reagan credentialed a bunch of thuggish intellectuals known as the neoconservatives.

For the rest of the country, there were happy thoughts about “the shining city on a hill” and “morning in America.”

In reality, however, Reagan had set the stage for the tragedies that would follow. When George W. Bush grabbed power in 2001, he simply extended the foreign and economic policies of the Republican cult leader: more tax cuts, more militarism, less regulation, more media manipulation.

Soon, the gap between rich and poor was widening again. Soon, the United States was at open war in two countries and involved in secret wars in many others. Soon, the nation was confronted with new scandals about torture and deception. Soon, the federal budget was flowing with red ink.

And near the end of Bush’s presidency, the de-regulated excesses of Wall Street pushed the country to the brink of a financial cataclysm. Bush supported a bail-out to save the bankers but didn’t do much for the millions of Americans who lost their jobs or their homes.

Second Thoughts?

One might have thought that the financial crack-up in 2008 (plus the massive federal deficits and the botched wars in Iraq and Afghanistan) would have confronted the Reagan cult with an existential crisis of faith. It would seem obvious that Reagan’s nostrums just didn’t work.

However, after only a brief interregnum of Barack Obama, the Republicans seem poised to restore the Reagan cult to full power in the United States. The new apparent GOP frontrunner, Texas Gov. Rick Perry, is already being hailed in the Washington Post as “The Texas Gipper.”

The Washington Times (yes, Rev. Sun Myung Moon’s right-wing propaganda sheet is still around) fairly cooed over Perry’s tough attacks on Obama, depicting America’s first black president as someone who apologizes for America and isn’t deserving of its soldiers in uniform.

“One of the powerful reasons for running for president of the United States is to make sure every man and woman who puts on the uniform respects highly the president of the United States,” Perry said. “We are indignant about a president who apologizes for America.”

As far as Perry is concerned, America has nothing to apologize for.

These are themes right out of Ronald Reagan’s playbook. And it appears likely that Election 2012 will be fought over terrain defined by Reagan, even though he left office in 1989 and died in 2004.

It is already clear that President Obama will be on the defensive, trying to justify a role for the federal government in America and explaining why the Reaganesque policy of low taxes on the rich must finally be reversed. Obama also is certain to shy away from any serious examination of how U.S. foreign policy went so wrong, so as not to be labeled “apologist-in-chief.”

Rick Perry, or whatever other Republican gets the party’s nomination, will hold the high ground of Reagan’s lofty standing among the American people. The GOP nominee can continue blaming “guv-mint” for the nation’s problems and promising another “morning in America” if only the nation further reduces the size of “guv-mint.”

With Democrats also trying to associate themselves with the “greatest president ever,” it appears doubtful that any serious effort will be made to explain to the American people that the charming Reagan was the pied piper who led them to their current demise.


Robert Parry broke many of the Iran-Contra stories in the 1980s for the Associated Press and Newsweek. His latest book,Neck Deep: The Disastrous Presidency of George W. Bush, was written with two of his sons, Sam and Nat, and can be ordered at neckdeepbook.com. His two previous books, Secrecy & Privilege: The Rise of the Bush Dynasty from Watergate to Iraq and Lost History: Contras, Cocaine, the Press & ‘Project Truth’ are also available there.




Strange Death of American Revolution

At the heart of the American experiment was always a tension between oligarchy and democracy, with the oligarchs usually holding the upper hand. However, in recent decades, the struggle has taken a curious turn with the oligarchs largely obliterating the people’s memory of the true democratic cause, writes Jada Thacker.

By Jada Thacker 

Most Americans know Jack London as the author of The Call of the Wild. Few have ever read his 1908 novel, The Iron Heel, which pits what London calls “the Oligarchy” (aka The Iron Heel) against the American working class, resulting in armed revolution.

The Oligarchy, London explains, is the ruling elite whose immense concentration of capital has empowered it to transcend capitalism itself. The Iron Heel is thus an allegorical tale of a fascist state whose hydra-headed business monopolies have seized control of all facets of production, consumption and national security.

London was not the lone American revolutionary author of his generation. Looking Backward by Edward Bellamy, Caesar’s Column by Ignatius Donnelly, and the less militant Progress and Poverty by Henry George all assumed that some version of democratic-socialist Revolution was just around the corner of history, or if not, then ought to be.

As late as the 1930s (and briefly during the anti-Vietnam War period), many Americans still thought “The Revolution” was in the offing. But those days have passed, and no one today speaks seriously of any such thing.

Why not?

The Traditional Oligarchy   

“Oligarchy” means “rule by the few.” It is an ugly word in its pronunciation as well as in its implied meaning.

Moreover, it is a tainted word because it is used often by “dangerous radicals” to describe the people they wish to see blindfolded and stood against a wall. Nonetheless, it is the proper word to describe the current practice of governance in the United States.

This, of course, is not a new development.

American civil government originated not, as certain champions of Locke’s social contract would have it, to secure to each citizen his equal share of security and liberty, but rather to secure for the oligarchs their superior position of power and wealth.

It was for precisely this reason the United States Constitution was written not by a democratically-elected body, but by an unelected handful of men who represented only the privileged class.

Accordingly, the Constitution is a document which prescribes, not proscribes, a legal framework within which the economically privileged minority makes the rules for the many.

There is nothing in the Constitution that limits the influence of wealth on government. No better example of this intentional oversight exists than the creation of the first American central bank. It is worth a digression to examine this scheme, as it was the precedent for much yet to follow.

The very first Congress incorporated a constitutionally unauthorized central banking cartel (the Bank of the U.S.) before the Bill of Rights was even ratified, a sequence of events which eloquently reveals the priorities of the new government.

The bank was necessary in order to carry out a broader plan: the debts of the new nation would be paid with money loaned by the wealthy, and the people were to be taxed to pay the money back to the wealthy, with interest.

The 1791 Whiskey Tax, which penalized small-scale distillers in favor of commercial-scale distilleries, was passed to underwrite this scheme of bottom-up wealth redistribution. When frontiersmen predictably rebelled against the tax, they were literally shackled and dragged on foot through the snowbound Allegheny Mountains to appear in show trials at the national capital, where they were condemned to death.

Socialist bureaucrats were not the culprits here: the 16,000 armed militiamen that crushed the rebels were led in person by two principal Founding Fathers, President George Washington and Treasury Secretary Alexander Hamilton, the author of both the central bank and the whiskey tax legislation.

(After the disproportionate tax drove small producers out of competition, Washington went into the whiskey-distilling business, becoming by the time of his death the largest whiskey-entrepreneur in Virginia, if not the nation.)

This should be a “text-book” example of how oligarchy works, but such examples are rarely admitted in textbooks. Instead, the textbooks assure us that the Founders established the nation upon the principles of “liberty and justice for all,” words that do not appear in any founding document.

Fortunately, for the sake of candor, Hamilton made his support of oligarchy quite clear at the Constitutional Convention when he said, “All communities divide themselves into the few and the many. The first are the rich and well born, the other the mass of the people. … The people are turbulent and changing; they seldom judge or determine right. Give therefore to the first class a distinct, permanent share in the government.”

Who Were “We the People”?

Despite the “We the People” banner pasted on the Preamble, the Constitution, including the Bill of Rights, does not guarantee anyone the right to vote, nor did it prevent the wealthy from making laws denying that right to “the mass of the people.”

Any belief that the Founders countenanced “democracy” would, at a logical minimum, require that term to appear at least one time within the Constitution or any of its 27 Amendments, which it conspicuously does not.

Without some constitutional guarantee of democracy, government maintains the practice of oligarchy by default. Despite pretensions of Republicanism, even among the followers of Jefferson, the new nation was ruled by “the rich and well born” few for a generation before the specter of democracy even began to rear its head.

And so it was that the oligarchic social contract described in Rousseau’s Discourse on Inequality remained the actual basis upon which the American socioeconomic order was founded, not the Lockean version first fantasized by Jefferson in the Declaration of Independence and then summarily excluded from the Constitution by the Federalists.

Since money, then as now, buys both property and power, it was only logical that democracy would make its first appearance on the 19th century American frontier, where there was very little money, but much property, to be had.

The fact that the property mostly had been stolen was beside the point: possession of it now conferred the right to vote for the first time upon a majority of people who had no money. Thus, but for a limited time only, common Americans began to feel they were in charge of their future. 

For a few short decades, America actually became what it now believes it always was: a democratic Republic, largely free from Big Business, Big Government and Big Religion.

True, the majority of the people still could not vote, slavery still existed, and American Indians were being ravaged, but things were looking up for free, white males as the frontier expanded beyond the grasp of the old-money power of the traditional Eastern oligarchy.

Until the middle of the century when the war came, that is.

The Industrial Oligarchy

The coming struggle did not develop, as many had feared, between the Old East and the New West, nor even between haves and the have-nots. Following the tradition of our remarkably un-revolutionary “American Revolution,” the contest was again a proxy war fought by the common man, but led by factions of the wealthy.

In essence, it was a colonial war that would determine whether the Southern oligarchy of Property or the Northern oligarchy of Money would dominate the resources of the vast American Empire west of the Mississippi.

In practice, however, it was a war not so much between men as machines. When the Northern oligarchy, whose money commanded both more men and more machines, won the contest, it emerged as a political monopoly in possession of both the fastest growing industry and the mightiest military on Earth.

Requiring only a four-year period of gestation from Fort Sumter to Appomattox, America’s first “military-industrial complex” was born as a result of war, rather than in anticipation of it.

Facing no immediate foreign threat, the military component of the complex soon devolved into an occupation force for the subjugated South and an invasion force for the soon to be subjugated West. Meanwhile, the industrial arm expanded beyond all precedent, exploiting its political monopoly to lavish public subsidies on favored industries, which reciprocated by buying government offices wholesale.

Cloaked in its guise as the Emancipator of Man and the Savior of the Nation, the nationalist-corporate State had arrived. It was to become a super-oligarchy, controlled increasingly by the monopolists of capital, both foreign and domestic; its mission was nothing less than to monopolize what remained of the means of production: the land and labor of the world’s richest continent.

It was this that London termed “the Iron Heel.” It was not free-market capitalism. It was a corporatist monopoly far beyond anything envisioned by the traditional, landed oligarchy. It was not controlled by statesmen in frocked coats, or by generals, or government apparatchiks, but by the denizens of the nation’s boardrooms, untouched and untouchable by democratic vote.

It was, in fact, a domestic version of the British Empire.

It did not take long for those under its heel to realize there was only one power on Earth ultimately capable of opposing it: democratic collectivization.

But when reformers made peaceful attempts to rally American farmers, miners and industrial labor, they were defeated by political chicanery, divisive media propaganda and state-sanctioned violence. When they dared employ violence, they were simply outgunned.

Fantasies of a democratic Revolution became the last refuge for those who held out hope for social and economic justice.

Revolution How?

Yet the violent military destruction of the U.S. government was not seriously entertained by any who had witnessed the burning of the Southern cities and the utter destruction of Dixieland.

Indeed, in the dystopic novels The Iron Heel and Caesar’s Column, violent revolution proves initially suicidal for the working class. And, though Looking Backward celebrates the emergence of a national-socialist state, the off-stage Revolution that produced utopia is reported as having been miraculously bloodless.

No doubt, American democratic reformers believed in sacrifice for the common good, but even the fringe anarchists among them were not Kamikazes.

The problem lay not in government, per se, but in the oligarchy that controlled the levers of power to benefit its own interests (a lesson contemporary government-hating reformers would do well to learn).

Although American utopians before and at the turn of the 20th century seemed to assume the Revolution would soon arrive,  its intended purpose would not be to destroy American government wholesale and rebuild it anew.

The Revolution would restore the principal virtues of Jefferson’s Declaration and the Lockean social contract, which recognized the Natural Right of revolution, over those of the extant Constitution foretold by Rousseau, which did not.

The crushing irony of the fantasized democratic Revolution lay not in its intention to replace the American system of governance with a foreign statist ideology, but in its effort to establish for the first time a guarantee of domestic social justice most Americans erroneously believed already existed.

Having no clue that the Constitution had not guaranteed any rights not already exercised by Americans at the time of its ratification, a gullible public majority assumed the purpose of a counterrevolution would be to take their supposed constitutional rights away.

Moreover, the popular majority in the decades after Appomattox was dominated by victorious Union war veterans, who were encouraged to believe they had subjugated the South in the service of human liberty. Thus patriotism, now implicitly defined as allegiance to the Nation State, became the staunchest ally of the victorious industrial oligarchs.

With the Spanish-American War, America entered for the first time into the international sweepstakes of the second great Western colonization.

When the resultant Philippine War erupted in an unapologetic attempt to deprive Filipinos of democratic self-determination, it was this same sense of patriotic self-glorification that allowed American boys to herd thousands of doomed Filipinos into disease-ridden concentration camps.

Meanwhile, President William McKinley — having narrowly defeated the democratic-populist electoral threat two years previously — was so far removed from reality he reportedly had to refer to a map to discover where the Philippine atrocities were committed. Today, of course, nobody seems to know.      

But it would be Democrat Woodrow Wilson, despite his cameo appearance as a progressive president, who would possibly do more to undermine worldwide democratic reform than any other American in history, including Ronald Reagan.

Starting in the 1890s, American middle-class progressives had begun to make some measurable progress, not in promoting Revolution against the oligarchy, but in using the power of the ballot to at least regulate some of society’s undemocratic flaws. Wilson was elected in part to promote the progressive cause.

But Wilson, having nominally stood against American entry into the largest war in human history, suddenly caved to the demands of bankers who feared losing billions in defaulting loans if the Allied cause foundered for lack of American support.

Over the span of a few weeks, Wilson thus reversed two years of principled neutrality, torpedoing more human progress than any number of German U-Boats.

Oddly, Wilson seemed to understand perfectly the result of his betrayal. On the night before he asked Congress to compel the nation into its first world war, he criticized his own decision to a confidant:

“Once lead this people into war,” he said, “and they’ll forget there was ever such a thing as tolerance. To fight, you must be brutal and ruthless, and the spirit of ruthless brutality will enter into the very fiber of national life, infecting the Congress, the courts, the policeman on the beat, the man in the street.”

And so it did.

Patriotic Oligarchy

War propaganda and the “rally ‘round the flag” mentality of wartime America not only distracted Americans from the project of progressive reform, but split them into two antagonistic factions: those who supported the war to “export democracy” worldwide, and those who believed the war, itself, was a betrayal of universal progressive principle.

More important, however, the war inevitably conferred more power and credibility on the oligarchs. Under cover of newly manufactured patriotism, an Espionage Act was passed, rivaled only by the founding Federalists’ Sedition Act in its totalitarian suppression of free speech.

As a result, prominent socialist labor leaders such as Eugene Debs and Bill Haywood were arrested on the specious charges of speaking their minds and sentenced to 10 and 20 years, respectively.

The engineered Red Scare following the Great War further decimated the ranks of American democratic-socialist reformers.

Soon the socialist IWW labor union was hounded out of existence; Sacco and Vanzetti were executed amid world-wide protest; draconian anti-immigration law was passed; and 9,000 armed miners lost the Battle of Blair Mountain after the intervention of the U.S. Army, all serious setbacks to those who hoped for any sort of democratic Revolution.

None of these events was reported by the corporate-dominated press as American workers’ opposition to oligarchy, but rather as foreign-inspired sedition against an All-American democracy.

Then, at long last, the Revolution came, but it was not American.

For a very short while, the Bolshevik Revolution seemed to promise hope. But Lenin died in 1924, and the rise of Stalin to power within the Bolshevik Party doomed any hope of its fidelity to egalitarian principles.

At home, the dismissal of Wilson’s Fourteen Points by American isolationists helped cement progressive cynicism as their expectations for a “world made safe for democracy” seemed to have failed domestically as well as abroad.

As American culture embraced the feverish consumerism and urban moral vacuity of the Roaring Twenties, renewed democratic activism languished. Even the progressive constitutional reform amendments (income tax, direct election of senators, Prohibition, and women’s suffrage) seemed too little to revive the spirit of social reform dulled first by abandoned neutrality, then again by abandoned war goals.

By the late 1930s, with Stalin’s anti-democratic brutality fully exposed, the democratic-socialist cause was a dead letter for all but the most radical reformers in America.

Heroes’ Warnings Ignored or Worse

Yet even then, America’s most highly decorated soldier, the once popular Marine Major General Smedley Darlington Butler, wrote a 1935 book entitled War Is a Racket. Having earned two Medals of Honor and more in service to the oligarchy, Butler, it seems, had learned something about the “honor” of American war making.

“I spent 33 years and four months in active military service,” he said, “and during that period I spent most of my time as a high class muscle man for Big Business, for Wall Street and the bankers. In short, I was a racketeer, a gangster for capitalism.”  

One need not imagine why his is not now a household name even among U.S. Marines.

Then there was another World War and another Red Scare. The Soviets got the Bomb; China went “Red.” McCarthyist America, it appeared, went temporarily insane.

Almost immediately came yet another war, now in Korea. With it, came the permanent Cold War, and with it, a permanent Red Scare. America’s temporary insanity lapsed into chronic psychosis.

The once-fantasized Revolution, now tarred with the brush of Soviet and Chinese despotism and sidetracked by the incessant paranoia of nuclear holocaust, was never seriously considered again by the American working class.

The more Americans were rallied to defend the corporate nation state, the less able were its citizens to appreciate the structural flaws in its national charter. The collectivism of organized state violence had trumped the collectivism of democratic reform. 

Instead of a Revolution that would force the ruling elite to rewrite the social contract to represent the socially cooperative, “combinative” nature of man, as London and so many others had predicted, it was the people who were forced to sign “loyalty oaths” to a corporatist state bent on perpetual war and perpetual fear of perpetual war.

This dangerous state of affairs was poignantly detailed by an American working-class war hero at the height of the second Red Scare in 1951. Despite the ongoing war in Korea, General Douglas MacArthur found time to blow the whistle on patriotic oligarchy.

He said, “It is part of the general pattern of misguided policy that our country is now geared to an arms economy which was bred in an artificially induced psychosis of war hysteria and nurtured upon an incessant propaganda of fear. [S]uch an economy … renders among our political leaders almost a greater fear of peace than is their fear of war.”

Ten years later, another working-class war hero, President Dwight D. Eisenhower, reiterated MacArthur’s warning of “an artificially induced psychosis of war hysteria” in his 1961 farewell address to the American people.

Eisenhower famously warned that the oligarchy, what he originally styled “the military-industrial-congressional complex,” was conspiring to lead the nation into needless wars for power and for profit.

Did Americans heed the warnings of their own famed military heroes? Some did.

Eisenhower’s successor, John F. Kennedy, gave action to these words and refused to be goaded into an invasion of Cuba only weeks after Eisenhower’s warning. The next year Kennedy again refused to order the Pentagon’s planned invasion of Cuba during the missile crisis.

The year after that, Kennedy resolved to withdraw all American military advisors from the ever-tightening noose of war in Southeast Asia. At the same time, he privately vowed to withdraw all American forces from Vietnam following the next general election.

Weeks later, he was murdered. He would be the last American president to openly defy the military-industrial complex.

Only nine months after Kennedy’s assassination, Congress abdicated its constitutional responsibility. Eschewing a declaration of war, it nevertheless authorized open-ended military aggression against the country of North Vietnam, all on the strength of carefully crafted, now-acknowledged lies, known as the Gulf of Tonkin affair.

If America failed to defeat the global communist threat in Vietnam, we were told, all would be lost. Americans would become communist slaves. Presumably to forestall this future loss of liberty, over two million Americans were then forced against their will to serve in the armed forces during an unprovoked military invasion of Southeast Asia.

Nine years of utterly senseless combat ensued before the United States abandoned the war effort in humiliation, having caused the death of over 58,000 Americans and about two million Vietnamese.

Yet a generation after our inglorious military failure, we had not become communist slaves: on the contrary, Vietnam had been accorded Most Favored Nation trade status as American boys queued up in shopping malls to buy sports shoes, produced in American-subcontracted Vietnamese sweatshops, by girls too young to date.

The war drums and the profits beat on.

After 45 years, the $13 trillion Cold War stumbled to a close with the political and economic implosion of the Soviet Union. But it was an event predicted not to result in peace:

“Were the Soviet Union to sink tomorrow under the waters of the ocean,” said George F. Kennan in 1987, “the American military-industrial establishment would have to go on, substantially unchanged, until some other adversary could be invented.”

Kennan, the Cold War author of our “containment strategy,” knew whereof he spoke.

Kennan’s predicted “invention” arrived on cue. Simultaneously with the fall of the Soviet Union arrived the First Gulf War. Then, after the 9/11 terrorist attack, the Cold War was reinvented, permanently it seems, as the Afghanistan War.

It soon was augmented concurrently by the Iraq War, founded, like the Vietnam War, upon yet more carefully crafted, now-acknowledged lies. These seemingly endless conflicts have been joined by an openly secret war waged on the lawless frontiers of Pakistan, and more recently by aerial wars in Libya, Yemen, Somalia, and elsewhere.

 “No nation,” James Madison had said, “could preserve its freedom in the midst of continual warfare.” Ironically, this 1795 nugget of wisdom came from one of our founding oligarchs, who, in 1812, led the United States of America into the first senseless war it did not win.

He ended up proving his own point. Two years after the British burned the White House, Madison renewed Hamilton’s central banking cartel brainchild in order to pay the war debt loaned at interest by the rich.

The Conscripted Revolution

So what of the glorious Revolution, foretold as inevitable by some of our forefathers, many of whom witnessed the 20th century arrive with the eyes of hyphenated slaves: squalid immigrant-laborers, peasant-sharecroppers, or the imprisoned peonage-patrons of the “company store”?

Despite the violence (and it was legion) deployed against those who preached faith in a rejuvenated social contract, the long-awaited democratic Revolution was not crushed by force. It was simply drafted into the service of the corporate-state.

Instead of rebelling against the oligarchy during the second decade of the 20th century, as Jack London foretold fictionally, Americans instead allowed their rulers to register a fourth of the nation’s population for the draft. 

Over two and one half million men eventually were pressed into service to fight a war “to make the world”, though not their own homeland, “safe for democracy.”

But when the nation failed to win the peace on its stated terms, the people also failed to perceive that the oligarchy had won it on theirs. Flush with war profits, the moneyed class then indulged itself in a decade-long binge of market-driven hysteria which ended, predictably, in the global Great Depression.

This, as it happened, was a blessing in disguise for American democracy.

The governmental and economic reforms made under the New Deal constituted, perhaps for the first time in human history, a re-conceptualization of national government as a guarantor of social justice.

No longer was the principal purpose of American government to be the perpetuation of an oligarchy. Democracy would provide the protection of the “mass of the people” from the depredations of “the rich and the well born”, the corporations, and the privileged few who control them.

Jefferson’s nebulous “Life, Liberty and the Pursuit of Happiness” were redefined concretely by Roosevelt’s Four Freedoms. Much more important, Madison’s Bill of Rights, despised as it was by many of the Federalist aristocrats that penned our inadequate Constitution, would at last encompass economic, instead of merely political, guarantees of right.

President Franklin Roosevelt told us:   

“We have come to a clear realization of the fact that true individual freedom cannot exist without economic security and independence. ‘Necessitous men are not free men.’ People who are hungry and out of a job are the stuff of which dictatorships are made.

“In our day these economic truths have become accepted as self-evident. We have accepted, so to speak, a second Bill of Rights under which a new basis of security and prosperity can be established for all, regardless of station, race, or creed.

“Among these are:

“The right to a useful and remunerative job in the industries or shops or farms or mines of the nation

 “All of these rights spell security. And after this war is won we must be prepared to move forward, in the implementation of these rights, to new goals of human happiness and well-being.”

This, then, was perhaps the pivotal moment in American democracy. This was no manifesto posted by foreign anarchists. It was no dormitory pipe dream of campus intellectuals. It was a gauntlet thrown down at the feet of the American oligarchy by the most popular and most victorious American leader of the century.

It was a promise never before made to the American people. 

That was in 1944. The war, and Roosevelt’s life, ended in 1945.

The next year saw 4,985 labor strikes, involving 4.6 million workers. In no year before, nor since, have so many Americans called themselves to action in an attempt to force corporations to extend a living wage to labor. But the oligarchy, fearing guarantees of security that threatened both its power and its profits, immediately counterattacked.

The very next year, 1947, saw the roll-back of workers’ rights and the establishment of a new and more consolidated “National Military Establishment,” replete with novel organizations such as the CIA and the U.S. Air Force, followed in 1949 by NATO, America’s first permanent international military alliance since 1778. And for the first time in history, Americans continued to be conscripted into military service with no impending war on the national horizon.

Thereafter, Franklin Roosevelt’s Revolutionary vision of an Economic Bill of Rights, proudly proclaimed to a long-suffering people, was relegated to the garage sale of Great Ideas. Not so, however, for America’s glorious wars, without which another generation of Americans might have recalled the rationale for London’s now-forgotten Revolution.  

The Revolution Disremembered

America reveled in its superstar status in the years immediately following the Second World War, its working-class children of the Great Depression desiring nothing so much as to put the ordeal behind them.

Having “fought the good fight,” Americans wanted only “what was coming to them.” As it happened, they allowed someone else to tell them what that would be.

American workers had produced the war machines and manned them, but they had not profited personally in the process; indeed, half a million had surrendered their lives, and millions of others their liberties, their wages, and their savings to the war effort.

For them, the war was something never to be repeated. They did not perceive, in the relief of peace, that the owners of the war industries had learned a far different lesson.

The corporate giants had become fabulously wealthy because of the war. It was not a lesson they would forget. Thereafter, for every subsequent war the American people were glad to put behind them, the “military-industrial complex” had already laid the foundation for yet another.

Americans tended to interpret victory in WWII as a validation of their own wartime propaganda: that America was the land of the free and the home of the brave. Having defeated despotism overseas, Americans fantasized the home front to be an example of egalitarian virtue, the envy of a world we had helped to bomb flat.

In the minds of Americans, we had become the permanent Good Guys on planet Earth, no matter whom we were told to bomb, invade or overthrow next, or whatever pretext was given for doing so. Being by definition always right, Americans imagined we could do no wrong.

But something crucial was lost amid the triumphalism, the battle fatigue, and the self-flattery of postwar American culture.

As mostly white American veteran-workers escaped to suburbia from hardscrabble farms and claustrophobic city neighborhoods, they forgot the final battle had yet to be won. They lost sight of the fact that the Four Freedoms, the Economic Bill of Rights, and the New Deal in general stood only as notes scribbled hastily in the margins of the Constitution, but never finalized in a new social contract.

For all of the democratic justice the New Deal reforms had produced, the structural relationship of “the mass of the people” to the “rich and well born” remained precisely as it had been when Hamilton first argued successfully to retain oligarchy in the federal Constitution.

Once isolated in sterile suburbia, America repressed its collective memory. We somehow forgot that the democratic Revolutionary banner had not first been raised by Marxists, but by American farmers in rebellions against oligarchs, led in turn by Bacon, Shays, and the Whiskey Tax rebels.

The same banner had been taken up in turn by American agrarian populists, urban progressives and democratic reformers of every stripe.

We as a people seemed to forget how, in the generations before Pearl Harbor, thousands of American militiamen and deputized goons had machine-gunned and bayoneted striking workers from Massachusetts to Seattle; how corporate interests had conspired to overthrow the White House with an armed coup d’état; how differences in race, class, ethnicity, gender, and national origin had all been and still are exploited by the ruling elite to divide and conquer democratic challenges to its power.

The rebellious, democratic spirit that had survived centuries of suppression, violence and poverty would not survive the American retreat to suburbia, where Americans traded Revolution for revolving credit. For in this diaspora to the temporary economic Fantasyland that Americans now call home (for those who still have a home), we left our history behind us.

How the oligarchy, now the corporate-security state, finally triumphed over the last shred of hope in a democratic Revolution is a story whose last chapter has recently been sent to the print shop of history.

Let it suffice to say that it transpired while a majority of Americans sat, conveniently stupefied, watching corporate-sponsored war news on a television manufactured by an outsourced American job.

It would not have surprised Jack London if the democratic Revolution he envisioned had failed in its first attempt, as he himself had imagined in The Iron Heel. What he did not imagine was that state-sponsored violence would co-opt a people’s revolution.

Amongst all the wars and the rumors of war, after the manufactured patriotism, the decades of incessant fear and profitable lies, it is no wonder that London’s Revolution had not been defeated at the barricades. For in the end, it had simply been forgotten.

But let us remember the Revolution was forgotten by a nation continually at war. If a vast multitude of us are today unemployed, debt-ridden, homeless and desperate, it is past time we recall the major reason why.

Having never heard of Jack London’s novel of rebellion against oligarchy, today’s children, if they are lucky, read his tale The Call of the Wild instead. It is a poignant story about an abused dog that ultimately, despairingly, turns its back on a cruel and vicious civilization.

Our children are told it is London’s most important work.

Perhaps, by now, it is. 

Jada Thacker, Ed.D., is a Vietnam infantry veteran and author of Dissecting American History: A Theme-Based Narrative. He teaches U.S. History at a private educational institution in Texas. He may be contacted at jadathacker@sbcglobal.net.




Should Christians Defend the Rich?

Republican presidential contenders Texas Gov. Rick Perry and Minnesota Rep. Michele Bachmann profess their Christian fundamentalist faith, but denounce efforts by the government to restrain the power of the rich. The Rev. Howard Bess looks at this enduring contradiction between Christianity’s principles and its alliance with the wealthy.

By the Rev. Howard Bess 

Today in America, we have an unholy concentration of wealth in the bank accounts of the few.  This concentration of wealth is not earned wealth, but wealth acquired by manipulation of the economic system, the abuse of labor and the evil of inheritance. 

What has taken place is also not merely the result of a benign economic system; it is the evil of greed at work. Parallel to this corrupt system is a view among too many confessing Christians that the Book of James, with its emphasis on good works, not just faith, doesn’t belong in the New Testament of the Bible.

Recently, I reread the Book of James and reviewed the history of this five-chapter epistle, as I pondered the controversies that have surrounded it in Christian church history. I found James’s words challenging and exhilarating in their insistence that Christians do good in the world.

Yet, over the centuries, many church leaders have doubted that the Book of James was worthy of inclusion in the New Testament. It was clearly not written by one of the disciples of Jesus, nor by the James who was thought to be a younger brother of Jesus. The best scholars today simply say we don’t know who wrote this collection of sayings.

Because of its emphasis on good works, the Book of James is criticized as “too Jewish” in its perspective and divergent from Paul’s writings about salvation by faith and faith alone. In the 16th Century, Martin Luther, the leader of the Protestant Reformation, concluded that James was not worthy of inclusion in the New Testament collection.

Contradicting Paul’s teachings on faith and faith alone, James states very plainly that faith without good works lacks value.

“What does it profit, my brethren, if a man says he has faith but has no works? Can his faith save him?” James asks. “So faith by itself, if it has no works, is dead.”  

Often moving from issue to issue without clear connections, much like the Old Testament book of Proverbs, the Book of James takes on a variety of questions relating to what is necessary for a true Christian faith. If there is a central theme, it could be characterized as “what does a Godly life look like?”

The writer leaves us with snapshot after snapshot of that life. What is never in doubt is that a confessed faith must be matched by behavior patterns that are consistent with that faith.

In James’s writings, jealousy, bitterness and selfish ambition all come under criticism. They are relegated to the unspiritual and devilish.

War and greed are treated at some length and tied together by the author, who leaves no doubt that a true Christian faith is completely incompatible with both. There is also no place for gossip among the people of God.

The Book of James can best be understood in its moment of early Christian history. The audiences for whom James wrote were third and fourth generation Christians. 

Understandably, the first generations of Christians were absorbed in trying to figure out who Jesus truly was and the significance of his death. They were aggressively evangelistic and spread the new religion with amazing rapidity.

In addition, early Christian believers were apocalyptic, convinced they would be translated into the next life without suffering death. By the time of James, reality had set in. Christians were going to live out their years and pass away just as people had before Jesus.

Recognizing that fact, James had the courage to ask the crucial question for Christians: How are we to live our lives?

Rereading the Book of James was a reminder of the writings and work of Walter Rauschenbusch, a Baptist minister who taught at Rochester Theological Seminary in upstate New York in the early 20th Century. His most famous book, Christianity and the Social Crisis, published in 1907, set in motion the Christian social gospel movement in America.

Observing that dominant Christian churches were allied with the powerful and the wealthy, Rauschenbusch called for a new social order that addressed the evils of concentration of wealth in the hands of the few. He noted how child labor and other abuses made the wealthy even wealthier.

As I reread the Book of James, I realized that James was challenging the social evils of his own day, evils that were being commonly embraced by confessing Christians. In his messages to his fellow Christians, he railed against confessing believers who gave deference to the rich.

Walter Rauschenbusch was merely restating the message of James for the 20th Century. Like James, he was speaking primarily to his own fellowship of believers, knowing full well that John D. Rockefeller was a prominent member of his own denomination. 

It is worthy of note that the great American civil rights leader, the Rev. Martin Luther King Jr., credited Walter Rauschenbusch as one of his mentors in the Christian faith. In his Letter from Birmingham Jail, King pointed his finger not at racists but at fellow clergy who counseled patience toward racial bigots.

James, Rauschenbusch and King all spoke as deeply religious people and used the language of faith. They called sin sin and evil evil.

However, in today’s America, we do not have someone like a James, a Walter Rauschenbusch or a Martin Luther King Jr. to speak the Truth to power.

The Rev. Howard Bess is a retired American Baptist minister, who lives in Palmer, Alaska.  His email address is hdbss@mtaonline.net.      




Does Israel Teach Anti-Arab Bigotry?

Israel is experiencing a protest movement for “social justice” as are other countries in the Middle East and Europe. But the Israeli version seeks a more equitable society for Jewish citizens while sidestepping the plight of Palestinians, what Lawrence Davidson sees as the result of intense anti-Arab indoctrination.

By Lawrence Davidson

Over the last ten years, there have been periodic outbursts of rage over the alleged anti-Semitic nature of Palestinian textbooks. Most of these episodes have been instigated by an Israeli-based organization called the Center for Monitoring the Impact of Peace (aka, the Institute for Monitoring Peace and Cultural Tolerance in School Education).

However, the Center’s conclusions have been corroborated only by other Israeli institutions such as Palestinian Media Watch. And, not surprisingly, almost all independent investigations examining the same issue have come up with very different conclusions.

These non-Zionist sources include The Nation magazine, which published a report on Palestinian textbooks in 2001; the George Eckert Institute for International Textbook Research, reporting in 2002; the Israel/Palestine Center for Research and Information, reporting in 2004; and the U.S. State Department Report of 2009. They all found that Palestinian textbooks did not preach anti-Semitism.

According to one Israeli journalist, Akiva Eldar, the Center does sloppy work. It “routinely feeds the media with excerpts from ‘Palestinian’ textbooks that call for Israel’s annihilation [without] bothering to point out that the texts quoted in fact come from Egypt and Jordan.”

Nathan Brown, a professor of political science at George Washington University who did his own study on the subject in 2000, said Palestinian textbooks now in use, which replaced older ones published in Egypt and Jordan, do not teach anti-Semitism, but “they tell history from a Palestinian point of view.”

It may very well be this fact that the Zionists cannot abide, and so they purposely mistake a Palestinian viewpoint for anti-Semitism.

Here is another not very surprising fact: When it comes to choosing which set of reports to support, American politicians will almost always go with the Zionist versions. Take then-Sen. Hillary Clinton who, in 2007, denounced Palestinian textbooks, saying they “don’t give Palestinian children an education, they give them an indoctrination.”

How did she know? Well, Israel’s Palestinian Media Watch told her so, and she did not have the foresight to fact-check the assertion before going public.

While the Palestinian textbooks don’t teach hatred of Jewish Israelis, the reality of daily life under occupation surely does. Those “facts on the ground,” and not the textbooks, supply the most powerful form of education for Palestinian youth.
 
Although in 2009 the U.S. State Department found that Palestinian textbooks were not the products of anti-Semites, there will be yet another Department-sponsored “comprehensive and independent” study in 2011. This time, the investigation will look at “incitement” caused by bias in both Israeli and Palestinian textbooks.

When this happens, one can only hope the investigators take a look at the work of the Israeli scholar Nurit Peled-Elhanan. She is a professor of language and education at Hebrew University in Jerusalem and also the daughter of the famous Israeli general turned peace activist, Matti Peled.

Peled-Elhanan has recently written a book titled Palestine in Israeli School Books: Ideology and Propaganda in Education. The book, which will be published this month in the United Kingdom, covers the content of Israeli textbooks over the past five years and concludes that Palestinians are never referred to as such “unless the context is terrorism.” Otherwise, they are referred to as Arabs.

And Arabs are collectively presented as “vile and deviant and criminal, people who do not pay taxes, people who live off the state, who don’t want to develop. … You never see [in the textbooks] a Palestinian child or doctor or teacher or engineer or modern farmer.”

In contrast, she finds that Palestinian textbooks, even while telling history from a Palestinian point of view, “distinguish between Zionists and Jews”; they tend to take a stand “against Zionists, not against Jews.”
 
Peled-Elhanan makes a link between what Israeli children are taught and how they later behave when drafted into the country’s military services.

“One question that bothers many people is how do you explain the cruel behavior of Israeli soldiers towards Palestinians, an indifference to human suffering, the inflicting of suffering. … I think the major reason for that is education.”

Historically, the mistreatment of Palestinians, including the periodic massacre of them, is taught to Israelis as something that is “unfortunate” but ultimately necessary and “good” for the survival of the state. In Peled-Elhanan’s opinion, Palestinian terrorist attacks are “the direct consequence of the oppression, slavery, humiliation and the state of siege imposed on the Palestinians.”
 
This Israeli process of educating children to hate and to feel prejudice is, of course, exactly what the Zionists accuse the Palestinians of doing. It turns out that all this time, while leveling charges of incitement at the Palestinian educational process, the Israelis have been practicing the same sort of indoctrination on their own children.

This revelation fills Peled-Elhanan with despair. For Israel, she laments, “I only see the path to fascism.”
  
Making Choices

Keeping the theme of education in mind, let us shift attention to the unprecedented protests now going on in Israel. For the last two weeks, massive demonstrations have hit all of Israel’s major cities. “Tent cities” have sprung up in some 40 locations. All of these protests are demanding “social justice.”

What, in this case, does social justice mean? It means addressing all the legitimate, standard-of-living problems that beset most of the Israeli demonstrators: soaring costs of food and housing, declining social services and the like. All of these are the predictable consequences of unregulated capitalism and neo-liberal governments.
 
A significant number of Israelis have decided that this lack of social justice has gone on long enough. A recent poll shows that 88 percent of the citizenry supports the protests.

However, this is not entirely a good thing. In order to maintain such support, coming as it does from almost all sections of Israeli political life, the protest leaders now endeavor to remain “non-political” and “rooted squarely in the mainstream consensus.”

This is, of course, naive. The Israelis live in a skewed “democratic” political environment with a right-wing government that is not going to acquiesce to their demands, except to throw them an occasional bone, unless the protesters can command the votes to shape the outcome of elections. Like it or not, that is the way their system works.
 
There are other problems. In order to be “rooted in the mainstream consensus,” the protest leaders are staying away from the issue of social justice for the Palestinians. In Israel proper, that means turning their backs on the plight of over 20 percent of the population.

What sort of social justice is that? Well, it is social justice as defined by people educated in the system described by Nurit Peled-Elhanan. That is why the protest leaders can happily solicit the support of Naftali Bennett, the thoroughly despicable leader of the colonial/settler movement, but not any of the leaders of the Arab-Israeli community.
 
By not taking a social-justice-for-all stand, the protest movement’s leaders have registered their acceptance of the “justice for Jews only” system in which they were educated. This in itself is a political act, one that will make them vulnerable to being picked apart with pseudo-solutions that offer some of them a little while denying others a lot.

Already, as reported by Haaretz, dozens of members of the Knesset have petitioned Prime Minister Benjamin Netanyahu to “solve the housing crisis by building in the West Bank.” Soon thereafter, the government announced approval for “1,600 more settler homes” in East Jerusalem, with 2,700 more to come later.

That is the sort of solution this protest movement will get unless it can overcome the education/indoctrination and go into politics in a way that applies social justice to all citizens.
  
In all societies, there are two major goals for education: one is vocational and the other is acculturation. That is, one important purpose of education is to prepare young people for the job market; the other is to train them to be “good citizens.”

What this latter goal means depends on the society one is raised in. In the old Soviet Union, becoming a good citizen meant being acculturated to a nationalist brand of communism, as is still the case today in China. In the United States, it means becoming a believer in the American version of freedom, both political and economic. And, in Israel, being a good citizen means becoming a believing Zionist.
 
The objective of acculturation means that education always has had, and probably always will have, a strong dose of indoctrination attached to it. That the Zionists should find it shocking that the Palestinians want to use education for their own version of indoctrination and acculturation is a sheer double standard.

And, finally, that the leaders of the protest movement in Israel so pointedly exclude the plight of the Palestinians is testimony to the success of their own education/indoctrination within the apartheid model.
 
You see, most of us really are what we are educated to be.

Lawrence Davidson is a history professor at West Chester University in Pennsylvania. He is the author of Foreign Policy Inc.: Privatizing America’s National Interest; America’s Palestine: Popular and Official Perceptions from Balfour to Israeli Statehood; and Islamic Fundamentalism.