US/Israel Can Respect Palestinian Rights

The clock is ticking on what could be the next explosion in the Middle East, if Palestinians press their demand for United Nations recognition as a state and the United States and Israel continue to spurn this acknowledgement of Palestinian rights. But Adil E. Shamoo says this political bomb can be defused.

By Adil E. Shamoo

If conditions do not change quickly by the time of the promised U.S. veto of Palestinian statehood at the United Nations on Sept. 20, the Palestinian-Israeli conflict could explode into a new uprising with hundreds of deaths.

The recent attack by Palestinian extremists on a bus in the southern Israeli resort town of Eilat and the eager over-reaction of Israeli Prime Minister Benjamin Netanyahu are harbingers of what is to come.

The uprising will bring the United States into sharp conflict with not only the Palestinians but also the rest of the Arab world. A new Arab spirit is demanding that the rest of the world, especially the United States, treat Arabs with equal respect and dignity.

On Sept. 20, the Palestinians will ask the upcoming UN General Assembly to vote for “non-member state” status. Since this resolution bypasses the Security Council, the promised U.S. veto will not be operative.

The least desirable choice for the United States is to vote no in the General Assembly. Doing so would isolate the United States from the rest of the world community, which is expected to support the status the Palestinians seek.

With the United States at its lowest popularity in the Arab world, this further isolation would only create additional challenges as the Arab Spring turns cloudy and many long-term challenges complicate U.S.-Arab relations.

The Palestinians have struggled for over 60 years to regain their rights, economic justice, and dignity. They have tried peaceful confrontation, military action, terrorism, and negotiation — without any success.

The 1.5 million Palestinians in Gaza live in an open-air prison with the highest unemployment (45 percent) in the world, near-starving conditions, and little or no medical care. Israel even stops humanitarian flotillas from reaching Gaza.

Another 1.5 million Palestinians live in Israel as second-class Israeli citizens. Do the Israelis consider the Palestinians as equal human beings?

The Israelis paint the conflict at every step as an existential threat. Israel has legitimate security concerns, which have been addressed as part of successive deals.

The existential threat may have been true in the first few decades of Israel’s existence. However, most reasonable observers and many Israelis know that a demilitarized Palestinian state is not an existential threat.

Israel has the upper hand militarily, and it has used it with a vengeance to suppress Palestinian aspirations. The Israelis are engaged in a policy of open-ended negotiation while confiscating and resettling Palestinian land.

President Obama has attempted to move the negotiations forward slightly by endorsing the blueprint used by previous administrations, namely the 1967 borders with mutually agreed land swaps. But the Obama administration remains as reluctant as its predecessors to pressure its Israeli ally to negotiate in good faith.

The Israeli lobby remains powerful on Capitol Hill, the State Department is staffed by strong supporters of Israel, and the U.S. media features very few voices representing Arab concerns. It’s no surprise that U.S. policies rarely reflect Arab views.

Israel’s policy has increased its isolation in the Middle East and the rest of the world, everywhere in fact except in the United States.

Turkey used to be Israel’s closest ally in the Middle East. But after the killing of nine Turkish citizens (one of whom also held U.S. citizenship) in the Gaza flotilla raid last year and Israel’s refusal to apologize, the relationship between the two countries could not be any colder.

Playing Catch Up

U.S. foreign policy toward the Arab world has not changed to catch up with the Arab Spring.

The Arab Spring is a result of centuries of occupation and indignity. Arabs are now more educated and more connected to the outside world. But instead of working with this new generation, the United States is trying to leverage its contacts in Arab militaries to steer the Arab Spring in ways that sustain U.S. interests.

Arabs can easily see the inconsistency of a U.S. policy that supports the overthrow of Libya’s Muammar Gaddafi while taking no action in Bahrain and remaining silent about Saudi Arabia’s oppression.

The Arab Spring has forced the Arab people to face their reality of occupation, colonization, and U.S. and Western support of their corrupt regimes.

The current crises in Syria, Yemen, Bahrain, Iraq, Jordan, and Iran are destabilizing the region. A U.S. veto of the Palestinian statehood resolution at the UN will further aggravate a difficult situation.

This destabilization can become further inflamed if the Palestinian-Israeli conflict deteriorates into another massacre of the Palestinians by Israeli forces. Arab anger can easily be directed against the United States.

As a primary issue among Arabs, the Palestinian-Israeli conflict remains a barometer that shows the willingness of the United States to grant Arabs equal respect. At this tenuous time in the Middle East, the killing of innocent Palestinian civilians by the Israeli military with U.S. acquiescence is explosive.

But the United States can do something to change the situation. It can acknowledge the new realities in the Arab world by recognizing Palestinian self-determination at the UN. Treating Arabs as equals rather than a people to be manipulated for political and economic gain is a lesson of the Arab Spring that the United States can still learn.

Adil E. Shamoo, a senior analyst for Foreign Policy In Focus, writes on ethics and public policy. He is the author of the forthcoming book Equal Worth: When Humanity Will Have Peace. He can be reached at

Making Airport Screening Saner

In the decade since 9/11, airports have invested a fortune in heightened security against terrorism while alienating millions of passengers with procedures that demean and delay. Retired prosecutor William John Cox suggests some improvements to the system.

By William John Cox

Google the phrase “TSA stupidity” and you will find that almost one-and-a-half million websites have something to say about the subject. 

If the United States is to avoid another major terrorist attack on its air transportation system without placing greater restrictions on the civil liberties of air travelers, the Transportation Security Administration (TSA) had better get smart.

Everyone who travels by air in the United States has a depressing story to tell about airport screening.

Media stories of a gravely ill 95-year-old grandmother forced to remove her adult diaper before being allowed on a plane and viral videos showing terrified children being intimately touched by TSA agents are more than depressing. They are a chilling commentary on the police state increasingly accepted by the American public in the name of security.

Air travelers dare not complain. TSA standards focus additional scrutiny on travelers who are “very arrogant” and express “contempt against airport passenger procedures.”

Is such repression the only choice? Or, can TSA officers be trained to exercise the necessary discretion to detect would-be terrorists, while allowing innocent travelers to swiftly and safely pass through screening?

A reasonable and practical balance in airport security screening policy must be struck before another terrorist attack results in even greater repression.

Today’s TSA

Shocked that poorly trained airport security guards allowed terrorists armed with box cutters to board and use four passenger airplanes as flying missiles of mass destruction, Congress established the TSA two months after 9/11.

Fifty thousand Transportation Security Officers (TSO) were quickly hired and rushed through one-week training courses. Although these officers are now federal employees and receive improved training, they are still security guards. Even so, as “officers” of Homeland Security, they exercise great power over the flying public.

TSA transformed contract screening guards into quasi-law-enforcement officers and provided uniform training and policies. However, the TSA was organized as a top-down organization that allows individual officers very little discretion.

Its “one size fits all” approach to screening results in well-intended but outrageous conduct by its agents.

In an attempt to prevent collective bargaining and to avoid adding Democratic-leaning permanent workers to the federal bureaucracy, the Republican-controlled Congress exempted TSA employees from most federal civil service laws. 

Instead, the Secretary of Homeland Security and the TSA administrator were given virtually unlimited authority to create a personnel system. This action was to have a number of unintended consequences.

Although legislation has been introduced to bring TSA officers into the federal civil service, the TSA administrator retains absolute control over the personnel system. Exercising this power, John Pistole, the administrator appointed by President Barack Obama, granted some bargaining rights earlier this year.

While Pistole’s order provides greater job protection to officers, it does nothing to improve the existing TSA personnel selection system. As presently constituted, the employment process perpetuates mediocrity and limits the ability of TSA managers to hire and promote the most qualified officers.

Currently, TSA job applicants primarily use the Internet to identify job announcements for TSA airport operations at more than 450 airports, complete applications, and take an online test to measure their ability to operate screening equipment.

All English-speaking U.S. citizens over the age of 18 with a high school diploma, a GED, or one year of experience as a security officer or x-ray technician meet the basic requirements for TSA officers, as long as they are current in their payment of income taxes and child support.

The main problem is that, once applicants meet these minimum requirements and pass a physical examination, drug screening and perfunctory background investigation, they are lumped together with all other applicants in a hiring pool for each job site.

Unlike general civil service rules, there are no ranked lists of the most qualified applicants within these pools.

Under the personnel standards established by the TSA administrator, local managers are required to select officers from the hiring pool based on the earliest applicant first, irrespective of their additional qualifications. 

Thus, a local TSA manager must hire a high-school dropout with a GED and no experience who applied one day before a college graduate with a degree in criminal justice who earned his or her way through college working for the campus police department.

While some managers conduct oral interviews of candidates, only in rare cases are they allowed to reject candidates who meet the minimum qualifications.

Laboring under a flawed selection process and making the best of available candidates, TSA has identified three basic ways to achieve mission effectiveness: baggage inspection, passenger screening and, most recently, behavior observation.

Although not every checked bag is hand-inspected, passengers are not allowed to lock baggage unless special TSA locks are used. As a result, most bags are inspected by officers working either alone or under limited supervision.

There have been some recent improvements in baggage security; however, the New York Press reports that “according to Transportation Security Administration records, press reports and court documents, . . . approximately 500 TSA officers” have been “fired or suspended for stealing from passenger luggage since the agency’s creation.”

Every passenger is personally screened before boarding commercial aircraft and the majority of TSA officers are deployed to handle this task. Having a mission in which officers “literally touch passengers” and their most private possessions “requires a workforce of the best and brightest,” according to Nico Melendez, TSA Public Affairs Manager of the Pacific Region.

Unfortunately, because of low hiring standards and minimum training, many, if not most screening officers possess poor people skills and manage to offend a large portion of the flying public on a daily basis.

Seeking to emulate the Israeli model of “identifying the bomber, rather than the bomb,” TSA deployed Behavior Detection Officers (BDO) in 2007 under its Screening of Passengers by Observation Techniques (SPOT) program. 

Officers randomly ask passengers questions, such as “Where are you traveling?” while looking for facial cues that might indicate deception or terrorist intent, leading to additional questioning and closer inspection of baggage.

Thousands of BDOs are now working in hundreds of airports and the program is being expanded; however, they are generally selected from screening personnel and only given two weeks of training before being deployed.

There has been no scientific validation of the program and, although there have been hundreds of criminal arrests, most have been for documentation issues, such as immigration violations and outstanding warrants.

Would improved personnel selection procedures for TSA officers better ensure the safety of the flying public and reduce the incidence of civil rights violations?

Building a Better TSA

The essential question is whether TSA officers are security guards or police officers when it comes to the manner in which they lay hands on the bodies and belongings of passengers. The difference between the two roles lies in the manner and extent to which they make decisions.

Security guards with minimal training cannot be expected to exercise discretion in critical matters. They are told exactly what or what not to do. The result is that screaming children are being felt up by strangers and the sick and elderly are publicly humiliated.

On the other hand, even with the “mandatory” criminal laws passed in the past 30 years, America’s free society still requires the exercise of arrest, prosecution and sentencing discretion in the criminal justice system, if there is to be individual justice in an individual case.

TSA must rethink the manner in which its officers are hired and trained to allow greater discretion, without an unacceptable rise in the risk of a terrorist attack.

The TSA has been moving in this direction with its “risk-based intelligence-driven screening process”; however, its steps have been hesitant and unsure, as it has staggered from incident to increasingly negative incident.

TSA official Melendez believes the key to successful screening is a workforce capable of implementing a risk-based screening process based upon updated software and equipment and ready access to an improved database.

So, how can a marginally trained group of 50,000 security guards be converted into a professional workforce, one with the intellectual ability and training to use sophisticated detection equipment and computer databases, and with the discretion to decide which sick person or young child should be allowed to proceed without a mandatory body search?

Selection. A former high-level TSA manager, who declined to be publicly identified, firmly believes that TSA could build an elite organization if local managers were simply allowed to rank the hiring pools by qualifications, rather than having to hire the candidate who filed the earliest application.

Certainly there is a need to avoid discrimination in hiring and to create a “diverse and inclusive” workforce that is reflective of the public it serves; however, police departments have used a civil service process for decades that involves testing and interviews to establish priority lists to ensure the employment and promotion of the most qualified candidates.

Among the federal law enforcement agencies, the FBI moves applicants though a multi-phase selection process in which advancement depends upon “their competitiveness among other candidates”; Secret Service applicants must pass several examinations and a series of in-depth interviews; and ATF applicants who pass entrance exams and assessment tests have to successfully complete a “field panel interview.”

The current recession and high unemployment rate have resulted in a gigantic pool of highly qualified and well-educated people who are looking for work. At the same time, TSA has been experiencing a fairly high turnover of employees, even though it offers a generous salary and benefit package.

Given all of this, there is a golden opportunity to improve the quality of the TSA workforce, particularly as it relates to the ability of its officers to exercise discretion.

A recent informal survey of airport car rental employees revealed that all of them were college graduates; however, they generally earned less and had fewer benefits than the TSA officers who worked in the same building.

In fact, most national car rental companies require all applicants to have college degrees.

Avis says, “College graduates, start your engines” in its attempt to attract “energetic pro-active college graduates who are eager to accelerate their careers in a fast-paced environment.” Enterprise “prefers” college degrees since applicants will “be involved in a comprehensive business skills training program that will help you make crucial business decisions.”

Clearly it is neither necessary nor appropriate for all TSA applicants to be college graduates; however, local TSA managers should be allowed to consider levels of education, as well as length and quality of relevant experience, in establishing priority lists for hiring replacement officers and for promoting officers to supervisory or BDO positions.

Revised personnel policies that rank applicants by qualifications for these advanced positions would also allow TSA managers to directly hire more qualified candidates, such as retired police officers, for positions requiring a higher level of decision making.

Training. Currently, most training of TSA officers is conducted through online applications of standardized instruction. 

While such training may be adequate to communicate rule-based procedures to security guards, it is inadequate to teach the more finely nuanced insights required for officers to safely exercise discretion in individual cases.

Behavior Detection Officers and supervisors are currently selected from the ranks of TSOs and receive as little as two weeks of additional training upon promotion. However, a successful risk-based screening process involving critical thinking requires more intensive development and training.

Obviously, TSA can’t fire 50,000 officers and start over from scratch, but surely there is a way to safely maintain the basic security-guard approach to screening while allowing for higher levels of discretion during the process.

Assuming that TSA managers are allowed to more effectively promote officers and to select supervisors and Behavior Detection Officers from outside the organization, and further that TSA could improve the training of supervisors and BDOs, they could begin to exercise the quality of discretion which would allow small children and elderly grandmothers to safely pass through security without impermissible assaults.

TSA should consider establishing regional training academies at the larger facilities around the country to provide newly appointed supervisors and BDOs with classroom training in the nature of policy, the concept of rational profiling, and the exercise of security discretion in a free society.

Policy. The concept of policy, as differentiated from procedures and rules, is that policies are intended as broad guidelines for the exercise of discretion allowing decision makers some flexibility in their application.

The exercise of critical discretion will fail in the absence of effective policies. This was recognized by the National Advisory Commission on Criminal Justice Standards and Goals in its Report on the Police in 1973:

“If police agencies fail to establish policy guidelines, officers are forced to establish their own policy based on their understanding of the law and perception of the police role. Errors in judgment may be an inherent risk in the exercise of discretion, but such errors can be minimized by definitive policies that clearly establish limits of discretion.”

We are all aware of the insidious and repressive nature of the racial profiling practiced by some law enforcement agencies. Indeed, one criticism of the TSA Behavior Detection program was that Newark BDOs known as “Mexican hunters” concentrated on Hispanic-appearing individuals, resulting in a large number of arrests for immigration violations.

Well-considered policies can allow BDOs to productively direct their attention to the most suspicious candidates for extended questioning, rather than to mindlessly and repetitively ask every single traveler where they are going.

With improved policy guidance and greater discretion, BDOs might actually identify and stop a real threat, but they will only offend even more travelers if they continue to follow rote procedures.

Perhaps most importantly, such policies can provide commonsense guidelines for qualified decision makers at each screening station, allowing obviously harmless grandmothers and children to avoid intrusive body contact while focusing attention on those individuals more likely to be terrorists.

The Right Direction

According to TSA 101, a 2009 overview of the TSA, the agency seeks to evolve itself “from a top-down, follow-the-SOP culture to a networked, critically-thinking, initiative-taking, proactive team environment.”

TSA Administrator Pistole wants “to focus our limited resources on higher-risk passengers while speeding and enhancing the passenger experience at the airport.”

On June 2, Pistole testified before Congress that “we must ensure that each new step we take strengthens security. Since the vast majority of the 628 million annual air travelers present little to no risk of committing an act of terrorism, we should focus on those who present the greatest risk, thereby improving security and the travel experience for everyone else.”

It appears TSA is moving in the right direction, and Pistole may be the person to keep it on course. Prior to his appointment by Obama in May 2010, he served as the Deputy Director of the FBI and was directly involved in the formation of terrorism policies.

Most significantly, his regard for civil rights was suggested by his approval of FBI policy placing limits on the interrogation of captives taken during the “war on terror.” The policy prohibited agents from sitting in on coercive interrogations conducted by third parties, including the CIA, and required agents to immediately report any violations.

One can hope that TSA Administrator Pistole will exercise his authority to bring about improved selection and training of TSA personnel and will promulgate thoughtful screening policies achieving a safer and less stressful flying experience for everyone.

William John Cox is a retired prosecutor and public interest lawyer, author and political activist. He authored the portions of the Police Task Force Report on the role of the police and policy formulation for the National Advisory Commission on Criminal Justice Standards and Goals in 1973. His efforts to promote a peaceful political evolution can be found at, his writings are collected at and he can be contacted at

The World at a Tipping Point

America and the world seem precariously balanced between those who wish to deny the many problems facing mankind and those who insist that the human race address the multiple crises confronting the planet. Winslow Myers sees reason to hope that the world will tip in a positive direction.

By Winslow Myers

The brilliance of the “Mad Men” television series lies in the crackerjack acting and script, but even more in the way the series dramatizes the paradigm shift of American women from gross subjugation to rough equality.

In an early episode, protagonist Don Draper reluctantly allows his wife to consult a (male) psychiatrist, and then calls the doctor, who casually violates confidentiality.

The series explains much about how the males of my generation often haplessly misunderstood — or deliberately ignored — the autonomous subjectivity of females.

This raises two questions: what blindnesses operating in the present cultural moment might be illuminated by talented scriptwriters as they look back from the perspective of 2040?

And second, what is the vision that orients us as we work to ensure that there will be a future to look back from in 2040?

American politics in 2011, in the run-up to the next presidential election, seems to operate in a weird bubble of denial, the engine of which is politicians pandering for votes. No one gets to be a President or Senator by emphasizing such unvarnished truths as:

–Oil and coal companies exercise too much power to slow or prevent altogether an incentivized transition to clean and sustainable forms of energy generation.

–People of wealth and large corporations do not pay their fair share of taxes, and as long as Congress is in thrall to lobbyists, reform of the tax code toward simplicity, transparency and fairness will be slow in coming.

–Some American financial institutions characterized as “too big to fail” are insufficiently regulated, making money off the misfortunes of ordinary citizens, intensifying the grotesque differences between the incomes of the super-rich and all the rest of us.

–The wars in Iraq and Afghanistan are obscenely expensive stalemates that have not increased our security, and may have created more terrorists than they have killed.

–Nuclear weapons have become completely useless as instruments of deterrence.

–The U.S. defense budget is bloated and lacks accountability.

–Global climate instability is clearly being intensified, if not caused, by human activity.

–The U.S. military is the biggest user of fossil fuels and polluter in the world, even as it plans to fight wars caused by the same extreme climate events that are presently intensifying chaos and dislocation for millions.

–The debt ceiling of nations may be negotiated or engineered, but the debt that comes from the unsustainable assault of too many humans on the living systems of the Earth is non-negotiable.

Coral reefs are dying; the oceans are polluted with plastic; many fish species have been harvested almost to extinction; tropical rain forests are still being put to the torch or the saw; polar icecaps and mountain glaciers continue to melt at faster than expected rates.

But there is good news too, about which we do not hear enough from our candidates:

–There are millions of non-governmental organizations springing up around the world that agree upon the values of human rights for all, eco-sustainability, nonviolence, and democratic structures, the largest mass movement in history, says entrepreneur and ethicist Paul Hawken.

One important new organization is Awakening the Dreamer, which offers citizens a free half-day seminar that awakens us to the real challenges we face, and the real possibility of meeting them.

–War just might be a dying institution. Wars of decolonization and proxy wars between superpowers have dwindled to zero since the end of the Cold War. While still horrible, contemporary wars kill fewer civilians and soldiers than some of the conflagrations of the not-too-distant past.

Still, this optimism about war does not account for the continued presence of massive numbers of nuclear weapons, the ever-increasing effects of climate change upon the poorer nations, global population growth, or the unpredictable element in current events.

New World

We find ourselves waking up in a whole new world, where rich and poor occupy the same leaky boat in a polluted sea.

Ensuring the future requires a fundamental shift in thinking from “I am separate” to “We are one”, a paradigm shift from measuring our economic success quantitatively to finding new qualitative criteria.

From turning reflexively toward war to moving aggressively to prevent war. From grotesquely large military budgets to humanitarian aid that directly meets human needs. From candidates who deny global warming to candidates who advocate for a reorientation of priorities on the level of a planetary Marshall Plan.

None of this will happen unless we all get involved, and push and question and become an active force that leaders cannot ignore.

This is the time when candidates are spending the most time listening to ordinary citizens. The questions we ask can be powerful agents of a new awakening.

If that came to pass, we might someday enjoy a TV series that looked back through the decades to dramatize the gradual end of our delusions.

It might make us wince at the “windy militant trash” (Auden) of present political discourse just as we wince at the dated chauvinism of the “Mad Men” era, but we might also be celebrating how far we had come.

Meanwhile we have a long way to go, baby.

Winslow Myers, the author of Living Beyond War: A Citizen’s Guide, serves on the Board of Beyond War, a non-profit educational foundation whose mission is to explore, model and promote the means for humanity to live without war.

The Dangerous Reagan Cult

Exclusive: Ronald Reagan’s anti-government philosophy inspires Tea Party extremists to oppose any revenue increase, even from closing loopholes on corporate jets. Democrats try the spin that “even Reagan” showed flexibility on debt and taxes. But Robert Parry says it is the “Reagan cult” that is at the heart of America’s crisis.

By Robert Parry

In the debt-ceiling debate, both Republicans and Democrats wanted Ronald Reagan on their side. Republicans embraced the 40th president’s disdain for government and fondness for tax cuts, while Democrats noted that “even Reagan” raised the debt limit many times and accepted some tax increases.

But Reagan, possibly more than any other political leader, deserves the blame for the economic and political mess in which the United States now finds itself. He was the patriarch of virtually every major miscalculation that the country has made over the past three decades.

It was Reagan who slashed taxes on the rich to roughly their current level; he opened the flood gates on deficit spending; he accelerated the decline of the middle class by busting unions and slashing support for local communities; he disparaged the value of government regulations; he squandered money on the Pentagon; he pushed more militaristic strategies abroad; and he rejected any thoughtful criticism of past U.S. foreign policies.

Reagan also created what amounted to a “populist” right-wing cult that targeted the federal government as the source of nearly all evil. In his First Inaugural Address, he famously declared that “government is not the solution to our problem; government is the problem.”

It is that contempt for government that today is driving the Tea Party extremists in the Republican Party. Yet, as with many cults, the founder of this one was somewhat more practical in dealing with the world around him, thus explaining some of Reagan’s compromises on the debt ceiling and taxes.

But once the founder is gone, his teachings can become definitive truth to the disciples. Flexibility disappears. No deviation is permitted. No compromise is tolerated.

So, at a time when government intervention is desperately needed to address a host of national problems, members of this Reagan cult apply the teachings of the leader in the most extreme ways. Since “government is the problem,” the only answer is to remove government from the equation and let the corporations, the rich and the magical “market” dictate national solutions.

It is an ironic testament to Ronald Reagan’s enduring influence that America’s most notable “populist” movement, the Tea Party, insists that tax cuts for the wealthy must be protected, even minor ones like tax loopholes for corporate jets. Inside the Tea Party, any suggestion that billionaire hedge-fund managers should pay a tax rate equal to that of their secretaries is anathema.

Possibly never in history has a “populist” movement been as protective of the interests of the rich as the Tea Party is. But that is because it is really a political cult dedicated to the most extreme rendering of Ronald Reagan’s anti-government philosophy.

Astro-Turf ‘Populists’

Granted, the Tea Party also can be viewed as an astro-turf outfit financed by billionaires like the Koch brothers and promoted by billionaire media mogul Rupert Murdoch. But Election 2010 proved that the movement is capable of putting like-minded politicians into office, especially when discouraged elements of the American Left choose to sit on the sidelines.

During the debt-ceiling battle, the GOP’s Tea Party caucus showed it was strong enough to block any compromise that included a revenue increase. The thinking is that the “evil” government must be starved even if that means defending indefensible tax loopholes and shoving the world’s economy to the brink of catastrophe.

The Tea Party’s rabid enforcement of the Reagan orthodoxy instills such fear among top Republicans that every one of the eight presidential hopefuls at a recent Iowa debate vowed to reject a deal that would include just $1 of higher taxes for each $10 in spending cuts. Even supposed moderates like Mitt Romney and Jon Huntsman threw up their hands.

But the Reagan cult reaches far beyond the Republican Party. Last February, a Gallup poll of Americans cited Reagan as the greatest president ever, with a five percentage point lead over Abraham Lincoln.

These days, virtually no one in Washington’s political or media circles dares to engage in a serious critique of Reagan’s very checkered record as president. It’s much easier to align yourself with some position that Reagan took during his long career, much like a pastor selectively picking a Bible passage to support his theological argument.

When negative national trends are cited, such as the decline of the middle class or the widening gap between rich and poor, the self-censorship demands that Reagan’s name not be spoken. Instead, there are references to these problems deepening “over the past three decades,” without mentioning whose presidency got things going big time.

Creating an Icon

And there is a self-interested reason for this hesitancy. The Republicans and the Right have made it a high priority to transform Reagan into an icon and to punish any independent-minded political figure or journalist who resists the groupthink.

The first step in this process occurred in the late 1980s, with aggressive cover-ups of Reagan’s crimes of state, such as scandals over the Iran-Contra arms-for-hostages affair, Contra-cocaine trafficking, and the Iraq-gate support of dictator Saddam Hussein.

Faced with furious Republican defenses of Reagan and his inner circle, most Democrats and mainstream journalists chose career discretion over valor. By the time Bill Clinton was elected in 1992, the refrain from Democrats and Washington pundits was to “leave that for the historians.”

Those who didn’t go along with the cover-ups, like Iran-Contra special prosecutor Lawrence Walsh, were subjected to ridicule from the right-wing and mainstream media alike, from the Washington Times to the Washington Post. Journalists who challenged the implausible Reagan cover-ups also found themselves marginalized as “conspiracy theorists.”

Leading Democrats decided it made more sense to look to the future, not dwell on the past. Plus, acquiescing to the cover-ups was a way to show their bipartisanship.

However, Republicans had other ideas. Having pocketed the concessions regarding any serious investigations of Reagan and his cohorts, the Republicans soon went on the offensive by investigating the heck out of President Clinton and his administration.

Then, having stirred up serious public doubts about Clinton’s integrity, the Republicans trounced the Democrats in the 1994 congressional elections. With their new majorities, the Republicans immediately began the process of enshrining Reagan as a national icon.

By and large, the Democrats saw these gestures, like attaching Reagan’s name to National Airport, as another way to demonstrate their bipartisanship.

But Republicans knew better. They understood the strategic value of elevating Reagan’s legacy to the status of an icon. If everyone agreed that Reagan was so great, then it followed that the hated “guv-mint” must be that bad.

More Accommodations

Increasingly, Democrats found themselves arguing on Republican ground, having to apologize for any suggestion that the government could do anything good for the country. Meanwhile, the Clinton-era stock market boom convinced more Americans that the “market” must know best.

Going with that flow, President Clinton signed a Republican-sponsored bill that removed Depression-era regulations in the Glass-Steagall Act, which had separated commercial and investment banks. With the repeal, the doors were thrown open for Wall Street gambling.

In the short run, lots of money was made, encouraging more Americans to believe that the government and its “safety net” were indeed anachronisms for losers. People with any gumption could simply day-trade their way to riches.

Reagan, it seemed, was right all along: government was the problem; the “free market” was not only the solution but it could “self-regulate.”

That was the political/media environment around Election 2000 when the wonkish Vice President Al Gore ran against the brash Texas Gov. George W. Bush, who came across to many as another version of Ronald Reagan, someone who spoke simply and disdained big government.

Though Gore could point to the economic successes of the Clinton years, including a balanced federal budget and the prospect of the total elimination of the federal debt, the major media mocked him as a know-it-all nerd who wore “earth-toned sweaters.” Meanwhile, mainstream journalists swooned over Bush, the regular guy.

Still, Gore eked out a narrow victory in the national popular vote and would have carried the key state of Florida if all legally cast votes were counted. But Bush relied on his brother’s administration in Florida and his father’s friends on the U.S. Supreme Court to make sure that didn’t happen. Bush was declared the winner in Florida and thus the new president. [For details, see Neck Deep.]

In retrospect, Election 2000 was a disastrous turning point for the United States, putting into the highest office in the land an unqualified ne’er-do-well who had lost the election.

But this outrage against democracy was largely accepted because of the muscular right-wing machine, the on-bended-knee mainstream media and the weak-kneed Democrats, a political/media dynamic that Reagan had helped create and had left behind.

The progress that the Clinton administration had made toward putting the U.S. financial house in order was quickly undone as Bush pushed through two massive tax cuts benefiting mostly the rich and waged two open-ended wars financed with borrowed money.

Years of Reaganism also had taken its toll on the government’s regulatory structures. Reagan had consistently appointed regulators who were hostile to the very concept of regulating, such as Anne Gorsuch at the Environmental Protection Agency and James Watt at Interior. He also elevated Alan Greenspan, a “free market” admirer of Ayn Rand, to be chairman of the Federal Reserve Board.

In the 1980s, the looting of America was underway in earnest, but the elites of Washington and New York saw little to protest since they were getting a cut of the plunder. The real losers were the average Americans, especially factory workers who saw their unions broken or their jobs shipped overseas under the banner of “free trade.”

Feeling Good

But many Americans were kept entranced by Reagan’s feel-good magic.

Taking office after the difficult decade of the 1970s, when America’s defeat in Vietnam and the Arab oil price hikes had shaken the nation’s confidence, Reagan simply assured everyone that things would work out just fine and that no excessive sacrifice was in order. Nor should there be any feelings of guilt, Reagan made clear.

By the late 1970s, it was widely accepted even among many Republicans that the Vietnam War had been an abomination. But Reagan simply rebranded it a “noble cause,” no reason for any serious self-reflection on America’s imperial role in the world.

Reagan then allied the United States with “death-squad” regimes all over Latin America and across the Third World. His administration treated the resulting carnage as a public-relations problem that could be managed by challenging the patriotism of critics.

At the 1984 Republican National Convention, Reagan’s United Nations Ambassador Jeane Kirkpatrick labeled Americans who dared criticize U.S. foreign policy as those who would “blame America first.”

To continue this sort of verbal pummeling on those who continued to get in the way, Reagan credentialed a bunch of thuggish intellectuals known as the neoconservatives.

For the rest of the country, there were happy thoughts about “the shining city on a hill” and “morning in America.”

In reality, however, Reagan had set the stage for the tragedies that would follow. When George W. Bush grabbed power in 2001, he simply extended the foreign and economic policies of the Republican cult leader: more tax cuts, more militarism, less regulation, more media manipulation.

Soon, the gap between rich and poor was widening again. Soon, the United States was at open war in two countries and involved in secret wars in many others. Soon, the nation was confronted with new scandals about torture and deception. Soon, the federal budget was flowing with red ink.

And near the end of Bush’s presidency, the de-regulated excesses of Wall Street pushed the country to the brink of a financial cataclysm. Bush supported a bail-out to save the bankers but didn’t do much for the millions of Americans who lost their jobs or their homes.

Second Thoughts?

One might have thought that the financial crack-up in 2008 (plus the massive federal deficits and the botched wars in Iraq and Afghanistan) would have confronted the Reagan cult with an existential crisis of faith. It would seem obvious that Reagan’s nostrums just didn’t work.

However, after only a brief interregnum of Barack Obama, the Republicans seem poised to restore the Reagan cult to full power in the United States. The new apparent GOP frontrunner, Texas Gov. Rick Perry, is already being hailed in the Washington Post as “The Texas Gipper.”

The Washington Times (yes, Rev. Sun Myung Moon’s right-wing propaganda sheet is still around) fairly cooed over Perry’s tough attacks on Obama, depicting America’s first black president as someone who apologizes for America and isn’t deserving of its soldiers in uniform.

“One of the powerful reasons for running for president of the United States is to make sure every man and woman who puts on the uniform respects highly the president of the United States,” Perry said. “We are indignant about a president who apologizes for America.”

As far as Perry is concerned, America has nothing to apologize for.

These are themes right out of Ronald Reagan’s playbook. And it appears likely that Election 2012 will be fought over terrain defined by Reagan, even though he left office in 1989 and died in 2004.

It is already clear that President Obama will be on the defensive, trying to justify a role for the federal government in America and explaining why the Reaganesque policy of low taxes on the rich must finally be reversed. Obama also is certain to shy away from any serious examination of how U.S. foreign policy went so wrong, so as not to be labeled “apologist-in-chief.”

Rick Perry, or whichever other Republican gets the party’s nomination, will hold the high ground of Reagan’s lofty standing among the American people. The GOP nominee can continue blaming “guv-mint” for the nation’s problems and promising another “morning in America” if only the nation further reduces the size of “guv-mint.”

With Democrats also trying to associate themselves with the “greatest president ever,” it appears doubtful that any serious effort will be made to explain to the American people that the charming Reagan was the pied piper who led them to their current demise.


Robert Parry broke many of the Iran-Contra stories in the 1980s for the Associated Press and Newsweek. His latest book, Neck Deep: The Disastrous Presidency of George W. Bush, was written with two of his sons, Sam and Nat. His two previous books, Secrecy & Privilege: The Rise of the Bush Dynasty from Watergate to Iraq and Lost History: Contras, Cocaine, the Press & ‘Project Truth’, are also available.

Strange Death of American Revolution

At the heart of the American experiment was always a tension between oligarchy and democracy, with the oligarchs usually holding the upper hand. However, in recent decades, the struggle has taken a curious turn with the oligarchs largely obliterating the people’s memory of the true democratic cause, writes Jada Thacker.

By Jada Thacker 

Most Americans know Jack London as the author of The Call of the Wild. Few have ever read his 1908 novel, The Iron Heel, which pits what London calls “the Oligarchy” (aka The Iron Heel) against the American working class, resulting in armed revolution.

The Oligarchy, London explains, is the ruling elite whose immense concentration of capital has empowered it to transcend capitalism itself. The Iron Heel is thus an allegorical tale of a fascist state whose hydra-headed business monopolies have seized control of all facets of production, consumption and national security.

London was not the lone American revolutionary author of his generation. Looking Backward by Edward Bellamy, Caesar’s Column by Ignatius Donnelly, and the less militant Progress and Poverty by Henry George all assumed that some version of democratic-socialist Revolution was just around the corner of history, or, if not, that it ought to be.

As late as the 1930s (and briefly during the anti-Vietnam War period), many Americans still thought “The Revolution” was in the offing. But those days have passed, and no one today speaks seriously of any such thing.

Why not?

The Traditional Oligarchy   

“Oligarchy” means “rule by the few.” It is an ugly word in its pronunciation as well as in its implied meaning.

Moreover, it is a tainted word because it is used often by “dangerous radicals” to describe the people they wish to see blindfolded and stood against a wall. Nonetheless, it is the proper word to describe the current practice of governance in the United States.

This, of course, is not a new development.

The origin of American civil government was not, as certain champions of Locke’s social contract would have it, to secure to each citizen his equal share of security and liberty, but rather to secure for the oligarchs their superior position of power and wealth.

It was for precisely this reason the United States Constitution was written not by a democratically-elected body, but by an unelected handful of men who represented only the privileged class.

Accordingly, the Constitution is a document which prescribes, not proscribes, a legal framework within which the economically privileged minority makes the rules for the many.

There is nothing in the Constitution that limits the influence of wealth on government. No better example of this intentional oversight exists than the creation of the first American central bank. It is worth a digression to examine this scheme, as it was the precedent for much yet to follow.

The very first Congress incorporated a constitutionally unauthorized central banking cartel (the Bank of the U.S.) before it bothered to ratify the Bill of Rights, a sequence of events which eloquently reveals the priorities of the new government.

The bank was necessary in order to carry out a broader plan: the debts of the new nation would be paid with money loaned by the wealthy, and the people were to be taxed to pay the money back to the wealthy, with interest.

The 1791 Whiskey Tax, which penalized small-scale distillers in favor of commercial-scale distilleries, was passed to underwrite this scheme of bottom-up wealth-redistribution. When frontiersmen predictably rebelled against the tax, they were literally shackled and dragged on foot through the snowbound Allegheny Mountains to appear in show-trials at the national capital, where they were condemned to death.

Socialist bureaucrats were not the culprits here: the 16,000 armed militiamen who crushed the rebels were led in person by two principal Founding Fathers, President George Washington and Treasury Secretary Alexander Hamilton, the author of both the central bank and the whiskey tax legislation.

(After the disproportionate tax drove small producers out of competition, Washington went into the whiskey-distilling business, becoming by the time of his death the largest whiskey-entrepreneur in Virginia, if not the nation.)

This should be a “text-book” example of how oligarchy works, but such examples are rarely admitted in textbooks. Instead, the textbooks assure us that the Founders established the nation upon the principles of “liberty and justice for all,” words that do not appear in any founding document.

Fortunately, for the sake of candor, Hamilton made his support of oligarchy quite clear at the Constitutional Convention when he said, “All communities divide themselves into the few and the many. The first are the rich and well born, the other the mass of the people. … The people are turbulent and changing; they seldom judge or determine right. Give therefore to the first class a distinct, permanent share in the government.”

Who Were “We the People”?

Despite the “We the People” banner pasted on the Preamble, the Constitution, including the Bill of Rights, does not guarantee anyone the right to vote, nor did it prevent the wealthy from making laws denying that right to “the mass of the people.”

Any belief that the Founders countenanced “democracy” would, at a logical minimum, require that term to appear at least one time within the Constitution or any of its 27 Amendments, which it conspicuously does not.

Without some constitutional guarantee of democracy, government maintains the practice of oligarchy by default. Despite pretensions of Republicanism, even among the followers of Jefferson, the new nation was ruled by “the rich and well born” few for a generation before the specter of democracy even began to rear its head.

And so it was that the oligarchic social contract described in Rousseau’s Discourse on Inequality remained the actual basis upon which the American socioeconomic order was founded, not the Lockean version first fantasized by Jefferson in the Declaration of Independence and then summarily excluded from the Constitution by the Federalists.

Since money, then as now, buys both property and power, it was only logical that democracy would make its first appearance on the 19th century American frontier, where there was very little money, but much property, to be had.

The fact that the property mostly had been stolen was beside the point: possession of it now conferred the right to vote for the first time upon a majority of people who had no money. Thus, but for a limited time only, common Americans began to feel they were in charge of their future. 

For a few short decades, America actually became what it now believes it always was: a democratic Republic, largely free from Big Business, Big Government and Big Religion.

True, the majority of the people still could not vote, slavery still existed, and American Indians were being ravaged, but things were looking up for free, white males as the frontier expanded beyond the grasp of the old-money power of the traditional Eastern oligarchy.

Until the middle of the century when the war came, that is.

The Industrial Oligarchy

The coming struggle did not develop, as many had feared, between the Old East and the New West, nor even between the haves and the have-nots. Following the tradition of our remarkably un-revolutionary “American Revolution,” the contest was again a proxy war fought by the common man, but led by factions of the wealthy.

In essence, it was a colonial war that would determine whether the Southern oligarchy of Property or the Northern oligarchy of Money would dominate the resources of the vast American Empire west of the Mississippi.

In practice, however, it was a war not so much between men as machines. When the Northern oligarchy, whose money commanded both more men and more machines, won the contest, it emerged as a political monopoly in possession of both the fastest growing industry and the mightiest military on Earth.

Requiring only a four-year period of gestation from Fort Sumter to Appomattox, America’s first “military-industrial complex” was born as a result of war, rather than in anticipation of it.

Facing no immediate foreign threat, the military component of the complex soon devolved into an occupation force for the subjugated South and an invasion force for the soon to be subjugated West. Meanwhile, the industrial arm expanded beyond all precedent, exploiting its political monopoly to lavish public subsidies on favored industries, which reciprocated by buying government offices wholesale.

Cloaked in its guise as the Emancipator of Man and the Savior of the Nation, the nationalist-corporate State had arrived. It was to become a super-oligarchy, controlled increasingly by the monopolists of capital, both foreign and domestic; its mission was nothing less than to monopolize what remained of the means of production: the land and labor of the world’s richest continent.

It was this that London termed “the Iron Heel.” It was not free-market capitalism. It was a corporatist monopoly far beyond anything envisioned by the traditional, landed oligarchy. It was not controlled by statesmen in frocked coats, or by generals, or government apparatchiks, but by the denizens of the nation’s boardrooms, untouched and untouchable by democratic vote.

It was, in fact, a domestic version of the British Empire.

It did not take long for those under its heel to realize there was only one power on Earth ultimately capable of opposing it: democratic collectivization.

But when reformers made peaceful attempts to rally American farmers, miners and industrial labor, they were defeated by political chicanery, divisive media propaganda and state-sanctioned violence. When they dared employ violence, they were simply outgunned.

Fantasies of a democratic Revolution became the last refuge for those who held out hope for social and economic justice.

Revolution How?

Yet the violent military destruction of the U.S. government was not seriously entertained by any who had witnessed the burning of the Southern cities and the utter destruction of Dixieland.

Indeed, in the dystopic novels The Iron Heel and Caesar’s Column, violent revolution proves initially suicidal for the working class. And, though Looking Backward celebrates the emergence of a national-socialist state, the off-stage Revolution that produced its utopia is reported as having been miraculously bloodless.

No doubt, American democratic reformers believed in sacrifice for the common good, but even the fringe anarchists among them were not Kamikazes.

The problem lay not in government, per se, but in the oligarchy that controlled the levers of power to benefit its own interests (a lesson contemporary government-hating reformers would do well to learn).

Although American utopians before and at the turn of the 20th century seemed to assume the Revolution would soon arrive, its intended purpose would not be to destroy American government wholesale and rebuild it anew.

The Revolution would instead restore the principal virtues of Jefferson’s Declaration and the Lockean social contract, including the Natural Right of revolution, over those of the extant Constitution foretold by Rousseau, which recognized no such right.

The crushing irony of the fantasized democratic Revolution lay not in its intention to replace the American system of governance with a foreign statist ideology, but in its effort to establish for the first time a guarantee of domestic social justice most Americans erroneously believed already existed.

Having no clue that the Constitution had not guaranteed any rights not already exercised by Americans at the time of its ratification, a gullible public majority assumed the purpose of a counterrevolution would be to take their supposed constitutional rights away.

Moreover, the popular majority in the decades after Appomattox was dominated by victorious Union war veterans, who were encouraged to believe they had subjugated the South in the service of human liberty. Thus patriotism, now implicitly defined as allegiance to the Nation State, became the staunchest ally of the victorious industrial oligarchs.

When the Spanish-American War arrived, America entered for the first time into the international sweepstakes of the second great Western colonization.

When the resultant Philippine War erupted in an unapologetic attempt to deprive Filipinos of democratic self-determination, it was this same sense of patriotic self-glorification that allowed American boys to herd thousands of doomed Filipinos into disease-ridden concentration camps.

Meanwhile, President William McKinley — having narrowly defeated the democratic-populist electoral threat two years previously — was so far removed from reality he reportedly had to refer to a map to discover where the Philippine atrocities were committed. Today, of course, nobody seems to know.      

But it would be Democrat Woodrow Wilson, despite his cameo appearance as a progressive president, who would possibly do more to undermine world-wide democratic reform than any other American in history, including Ronald Reagan.

Starting in the 1890s, American middle-class progressives had begun to make some measurable progress, not in promoting Revolution against the oligarchy but in using the power of the ballot to at least regulate some of society’s undemocratic flaws. Wilson was elected in part to promote the progressive cause.

But Wilson, having nominally stood against American entry into the largest war in human history, suddenly caved to the demands of bankers who feared losing billions in defaulting loans if the Allied cause foundered for lack of American support.

Over the span of a few weeks, Wilson thus reversed two years of principled neutrality, torpedoing more human progress than any number of German U-Boats.

Oddly, Wilson seemed to understand perfectly the result of his betrayal. On the night before he asked Congress to compel the nation into its first world war, he criticized his own decision to a confidant:

“Once lead this people into war,” he said, “and they’ll forget there was ever such a thing as tolerance. To fight, you must be brutal and ruthless, and the spirit of ruthless brutality will enter into the very fiber of national life, infecting the Congress, the courts, the policeman on the beat, the man in the street.”

And so it did.

Patriotic Oligarchy

War propaganda and the “rally ‘round the flag” mentality of wartime America not only distracted Americans from the project of progressive reform, but split them into two antagonistic factions: those who supported the war to “export democracy” worldwide, and those who believed the war, itself, was a betrayal of universal progressive principle.

More important, however, the war inevitably conferred more power and credibility on the oligarchs. Under cover of newly manufactured patriotism, an Espionage Act was passed, rivaled only by the founding Federalists’ Sedition Act in its totalitarian suppression of free speech.

As a result, prominent socialist labor leaders such as Eugene Debs and Bill Haywood were arrested on the specious charges of speaking their minds and sentenced to 10 and 20 years, respectively.

The engineered Red Scare following the Great War further decimated the ranks of American democratic-socialist reformers.

Soon the socialist IWW labor union was hounded out of existence; Sacco and Vanzetti were executed amid world-wide protest; draconian anti-immigration law was passed; and 9,000 armed miners lost the Battle of Blair Mountain after the intervention of the U.S. Army, all serious setbacks to those who hoped for any sort of democratic Revolution.

None of these events was reported by the corporate-dominated press as American workers’ opposition to oligarchy, but rather as foreign-inspired sedition against an All-American democracy.

Then, at long last, the Revolution came, but it was not American.

For a very short while, the Bolshevik Revolution seemed to promise hope. But Lenin died in 1924, and the rise of Stalin to power within the Bolshevik Party doomed any hope of its fidelity to egalitarian principles.

At home, the dismissal of Wilson’s Fourteen Points by American isolationists helped cement progressive cynicism as their expectations for a “world made safe for democracy” seemed to have failed domestically as well as abroad.

As American culture embraced the feverish consumerism and urban moral vacuity of the Roaring Twenties, renewed democratic activism languished. Even the progressive constitutional reform amendments (income tax, direct election of senators, Prohibition, and women’s suffrage) seemed too little to revive the spirit of social reform dulled first by abandoned neutrality, then again by abandoned war goals.

By the late 1930s, with Stalin’s anti-democratic brutality fully exposed, the democratic-socialist cause was a dead letter for all but the most radical reformers in America.

Heroes’ Warnings Ignored or Worse

Yet even then, in 1935, America’s most highly decorated soldier, the once-popular Marine Major General Smedley Darlington Butler, wrote a book entitled War Is a Racket. Having earned two Medals of Honor and more in service to the oligarchy, Butler, it seems, had learned something about the “honor” of American war making.

“I spent 33 years and four months in active military service,” he said, “and during that period I spent most of my time as a high class muscle man for Big Business, for Wall Street and the bankers. In short, I was a racketeer, a gangster for capitalism.”  

One need not imagine why his is not now a household name even among U.S. Marines.

Then there was another World War and another Red Scare. The Soviets got the Bomb; China went “Red.” McCarthyist America, it appeared, went temporarily insane.

Almost immediately came yet another war, now in Korea. With it, came the permanent Cold War, and with it, a permanent Red Scare. America’s temporary insanity lapsed into chronic psychosis.

The once-fantasized Revolution, now tarred with the brush of Soviet and Chinese despotism and sidetracked by the incessant paranoia of nuclear holocaust, was never seriously considered again by the American working class.

The more Americans were rallied to defend the corporate nation state, the less able were its citizens to appreciate the structural flaws in its national charter. The collectivism of organized state violence had trumped the collectivism of democratic reform. 

Instead of a Revolution that would force the ruling elite to rewrite the social contract to represent the socially cooperative, “combinative” nature of man, as London and so many others had predicted, it was the people who were forced to sign “loyalty oaths” to a corporatist state bent on perpetual war and perpetual fear of perpetual war.

This dangerous state of affairs was poignantly detailed by an American working-class war hero at the height of the second Red Scare in 1951. Despite the ongoing war in Korea, General Douglas MacArthur found time to blow the whistle on patriotic oligarchy.

He said, “It is part of the general pattern of misguided policy that our country is now geared to an arms economy which was bred in an artificially induced psychosis of war hysteria and nurtured upon an incessant propaganda of fear. [S]uch an economy renders among our political leaders almost a greater fear of peace than is their fear of war.”

Ten years later, another working-class war hero, President Dwight D. Eisenhower, reiterated MacArthur’s warning of “an artificially induced psychosis of war hysteria” in his 1961 farewell address to the American people.

Eisenhower famously warned that the oligarchy, what he originally styled “the military-industrial-congressional complex,” was conspiring to lead the nation into needless wars for power and for profit.

Did Americans heed the warnings of their own famed military heroes? Some did.

Eisenhower’s successor, John F. Kennedy, gave action to these words and refused to be goaded into an invasion of Cuba only weeks after Eisenhower’s warning. The next year Kennedy again refused to order the Pentagon’s planned invasion of Cuba during the missile crisis.

The year after that, Kennedy resolved to withdraw all American military advisors from the ever-tightening noose of war in Southeast Asia. At the same time, he privately vowed to withdraw all American forces from Vietnam following the next general election.

Weeks later, he was murdered. He would be the last American president to openly defy the military-industrial complex.

Only nine months after Kennedy’s assassination, Congress abdicated its constitutional responsibility. Eschewing a declaration of war, it nevertheless authorized open-ended military aggression against the country of North Vietnam, all on the strength of carefully crafted, now-acknowledged lies known as the Gulf of Tonkin affair.

If America failed to defeat the global communist threat in Vietnam, we were told, all would be lost. Americans would become communist slaves. Presumably to forestall their future loss of liberty, over two million Americans were then forced against their will to serve in the armed forces during an unprovoked military invasion of Southeast Asia.

Nine years of utterly senseless combat ensued before the United States abandoned the war effort in humiliation, having caused the death of over 58,000 Americans and about two million Vietnamese.

Yet a generation after our inglorious military failure, we had not become communist slaves: on the contrary, Vietnam had been accorded Most Favored Nation trade status as American boys queued up in shopping malls to buy sports shoes, produced in American-subcontracted Vietnamese sweatshops, by girls too young to date.

The war drums and the profits beat on.

After 45 years, the $13 trillion Cold War stumbled to a close with the political and economic implosion of the Soviet Union. But it was an event predicted not to result in peace:

“Were the Soviet Union to sink tomorrow under the waters of the ocean,” said George F. Kennan in 1987, “the American military-industrial establishment would have to go on, substantially unchanged, until some other adversary could be invented.”

Kennan, the Cold War author of our “containment strategy,” knew whereof he spoke.

Kennan’s predicted “invention” arrived on cue. Simultaneously with the fall of the Soviet Union arrived the First Gulf War. Then, after the 9/11 terrorist attack, the Cold War was reinvented, permanently it seems, as the Afghanistan War.

It soon was augmented by the Iraq War, founded, like the Vietnam War, upon yet more carefully crafted, now-acknowledged lies. These seemingly endless conflicts have been joined by an openly secret war waged on the lawless frontiers of Pakistan, and more recently by aerial wars in Libya, Yemen, Somalia, and elsewhere.

 “No nation,” James Madison had said, “could preserve its freedom in the midst of continual warfare.” Ironically, this 1795 nugget of wisdom came from one of our founding oligarchs, who, in 1812, led the United States of America into the first senseless war it did not win.

He ended up proving his own point. Two years after the British burned the White House, Madison renewed Hamilton’s central banking cartel brainchild in order to pay the war debt loaned at interest by the rich.

The Conscripted Revolution

So what of the glorious Revolution, foretold as inevitable by some of our forefathers, many of whom witnessed the 20th century arrive with the eyes of hyphenated slaves: squalid immigrant-laborers, peasant-sharecroppers, or the imprisoned peonage-patrons of the “company store”?

Despite the violence (and it was legion) deployed against those who preached faith in a rejuvenated social contract, the long-awaited democratic Revolution was not crushed by force. It was simply drafted into the service of the corporate-state.

Instead of rebelling against the oligarchy during the second decade of the 20th century, as Jack London foretold fictionally, Americans instead allowed their rulers to register a fourth of the nation’s population for the draft. 

Over two and one-half million men eventually were pressed into service to fight a war “to make the world” (though not their own homeland) “safe for democracy.”

But when the nation failed to win the peace on its stated terms, the people failed also to perceive that the oligarchy had won it on theirs. Flush with war profits, the moneyed class then indulged itself in a decade-long binge of market-driven hysteria which ended, predictably, in the global Great Depression.

This, as it happened, was a blessing in disguise for American democracy.

The governmental and economic reforms made under the New Deal constituted, perhaps for the first time in human history, a re-conceptualization of national government as a guarantor of social justice.

No longer was the principal purpose of American government to be the perpetuation of an oligarchy. Democracy would provide the protection of the “mass of the people” from the depredations of “the rich and the well born”: the corporations and the privileged few who control them.

Jefferson’s nebulous “Life, Liberty and the Pursuit of Happiness” were redefined concretely by Roosevelt’s Four Freedoms. Much more important, Madison’s Bill of Rights, despised as it was by many of the Federalist aristocrats who penned our inadequate Constitution, would at last encompass economic, instead of merely political, guarantees of right.

President Franklin Roosevelt told us:   

“We have come to a clear realization of the fact that true individual freedom cannot exist without economic security and independence. ‘Necessitous men are not free men.’ People who are hungry and out of a job are the stuff of which dictatorships are made.

“In our day these economic truths have become accepted as self-evident. We have accepted, so to speak, a second Bill of Rights under which a new basis of security and prosperity can be established for all, regardless of station, race, or creed.

“Among these are:

“The right to a useful and remunerative job in the industries or shops or farms or mines of the nation …

 “All of these rights spell security. And after this war is won we must be prepared to move forward, in the implementation of these rights, to new goals of human happiness and well-being.”

This, then, was perhaps the pivotal moment in American democracy. This was no manifesto posted by foreign anarchists. It was no dormitory pipe dream of campus intellectuals. It was a gauntlet thrown down at the feet of the American oligarchy by the most popular and most victorious American leader of the century.

It was a promise never before made to the American people. 

That was in 1944. The war, and Roosevelt’s life, ended in 1945.

The next year saw 4,985 labor strikes, involving 4.6 million workers. In no year before, nor since, have so many Americans called themselves to action in an attempt to force corporations to extend a living wage to labor. But the oligarchy, fearing guarantees of security that threatened both its power and its profits, immediately counterattacked.

The very next year, 1947, saw the roll-back of workers’ rights and the establishment of a new and more consolidated “National Military Establishment,” replete with a novel organization called the CIA, the U.S. Air Force, and NATO, America’s first permanent international military alliance since 1778. And for the first time in history, Americans continued to be conscripted into military service with no impending war on the national horizon.

Thereafter, Franklin Roosevelt’s Revolutionary vision of an Economic Bill of Rights, proudly proclaimed to a long-suffering people, was relegated to the garage sale of Great Ideas. Not so, however, for America’s glorious wars, without which another generation of Americans might have recalled the rationale for London’s now-forgotten Revolution.  

The Revolution Disremembered

America reveled in its superstar status in the years immediately following the Second World War, its working-class children of the Great Depression desiring nothing so much as to put the ordeal behind them.

Having “fought the good fight,” Americans wanted only “what was coming to them.” As it happened, they allowed someone else to tell them what that would be.

American workers had produced the war machines and manned them, but they had not profited personally in the process; indeed, half a million had surrendered their lives, and millions of others their liberties, their wages, and their savings to the war effort.

For them, the war was something never to be repeated. They did not perceive, in the relief of peace, that the owners of the war industries had learned a far different lesson.

The corporate giants had become fabulously wealthy because of the war. It was not a lesson they would forget. Thereafter, for every subsequent war the American people were glad to put behind them, the “military-industrial complex” had already laid the foundation for yet another.

Americans tended to interpret victory in WWII as a validation of their own wartime propaganda: that America was the land of the free and the home of the brave. Having defeated despotism overseas, Americans fantasized the home front to be an example of egalitarian virtue, the envy of a world we had helped to bomb flat.

In the mind of Americans, we had become the permanent Good Guys on planet Earth, no matter whom we were told to bomb, invade or overthrow next, or whatever pretext was given for doing so. Being by definition always right, Americans imagined we could do no wrong.

But something crucial was lost amid the triumphalism, the battle-fatigue, and the self-flattery of postwar American culture.

As mostly white American veteran-workers escaped to suburbia from hardscrabble farms and claustrophobic city neighborhoods, they forgot the final battle had yet to be won. They lost sight of the fact that the Four Freedoms, the Economic Bill of Rights, and the New Deal in general stood only as notes scribbled hastily in the margins of the Constitution, but never finalized in a new social contract.

For all of the democratic justice the New Deal reforms had produced, the structural relationship of “the mass of the people” to “the rich and well born” remained precisely as it had been when Hamilton first argued successfully to retain oligarchy in the federal Constitution.

Once isolated in sterile suburbia, America repressed its collective memory. We somehow forgot that the democratic Revolutionary banner had not first been raised by Marxists, but by American farmers in rebellions against oligarchs led in turn by Bacon, Shays, and Whiskey Tax rebels.

The same banner had been taken up in turn by American agrarian populists, urban progressives and democratic reformers of every stripe.

We as a people seemed to forget how, in the generations before Pearl Harbor, thousands of American militiamen and deputized goons had machine-gunned and bayoneted striking workers from Massachusetts to Seattle; how corporate interests had conspired to overthrow the White House with an armed coup d’état; how differences in race, class, ethnicity, gender, and national origin had all been, and still are, exploited by the ruling elite to divide and conquer democratic challenges to its power.

The rebellious, democratic spirit that had survived centuries of suppression, violence and poverty would not survive the American retreat to suburbia, where Americans traded Revolution for revolving credit. For in this diaspora to the temporary economic Fantasyland that Americans now call home (for those who still have a home), we left our history behind us.

How the oligarchy, now the corporate-security state, finally triumphed over the last shred of hope in a democratic Revolution is a story whose last chapter has recently been sent to the print shop of history.

Let it suffice to say that it transpired while a majority of Americans sat, conveniently stupefied, watching corporate-sponsored war news on a television manufactured by an outsourced American job.

It would not have surprised Jack London if the democratic Revolution he envisioned had failed in its first attempt, as he himself had imagined in The Iron Heel. What he did not imagine is that state-sponsored violence would co-opt a people’s revolution.

Amongst all the wars and the rumors of war, after the manufactured patriotism, the decades of incessant fear and profitable lies, it is no wonder that London’s Revolution had not been defeated at the barricades. For in the end, it had simply been forgotten.

But let us remember the Revolution was forgotten by a nation continually at war. If a vast multitude of us are today unemployed, debt-ridden, homeless and desperate, it is past time we recall the major reason why.

Having never heard of Jack London’s novel of rebellion against oligarchy, today’s children, if they are lucky, read his tale, The Call of the Wild, instead. It is a poignant story about an abused dog that ultimately, despairingly, turns its back on a cruel and vicious civilization.

Our children are told it is London’s most important work.

Perhaps, by now, it is. 

Jada Thacker, Ed.D, is a Vietnam infantry veteran and author of Dissecting American History: A Theme-Based Narrative. He teaches U.S. History at a private educational institution in Texas.

Should Christians Defend the Rich?

Republican presidential contenders Texas Gov. Rick Perry and Minnesota Rep. Michele Bachmann profess their Christian fundamentalist faith, but denounce efforts by the government to restrain the power of the rich. The Rev. Howard Bess looks at this enduring contradiction between Christianity’s principles and its alliance with the wealthy.

By the Rev. Howard Bess 

Today in America, we have an unholy concentration of wealth in the bank accounts of the few.  This concentration of wealth is not earned wealth, but wealth acquired by manipulation of the economic system, the abuse of labor and the evil of inheritance. 

What has taken place also is not merely the result of a benign economic system; it is the evil of greed at work. Parallel to this corrupt system is a view among too many confessing Christians that the Book of James, with its emphasis on good works, not just faith, doesn’t belong in the New Testament of the Bible.

Recently, I reread the Book of James and reviewed the history of this five-chapter epistle, as I pondered the controversies that have surrounded it in Christian church history. I found James’s words challenging and exhilarating in their insistence that Christians do good in the world.

Yet, over the centuries, many church leaders have doubted that the Book of James was worthy of inclusion in the New Testament. It was clearly not written by one of the disciples of Jesus, nor by the James who was thought to be a younger brother of Jesus. The best scholars today simply say we don’t know who wrote this collection of sayings.

Because of its emphasis on good works, the Book of James is criticized as “too Jewish” in its perspective and divergent from Paul’s writings about salvation by faith and faith alone. In the 16th Century, Martin Luther, the leader of the Protestant Reformation, concluded that James was not worthy of inclusion in the New Testament collection.

Contradicting Paul’s teachings on faith and faith alone, James states very plainly that faith without good works lacks value.

“What does it profit, my brethren, if a man says he has faith but has no works? Can his faith save him?” James asks. “So faith by itself, if it has no works, is dead.”  

Often moving from issue to issue without clear connections, much like the Old Testament book of Proverbs, the Book of James takes on a variety of questions relating to what is necessary for a true Christian faith. If there is a central theme, it could be characterized as “what does a Godly life look like?”

The writer leaves us with snapshot after snapshot of that life. What is never in doubt is that a confessed faith must be matched by behavior patterns that are consistent with that faith.

In James’s writings, jealousy, bitterness and selfish ambition all come under criticism. They are relegated to the unspiritual and devilish.

War and greed are treated at some length, tied together by the author, who leaves no doubt that a true Christian faith is completely incompatible with war and greed. There is also no place for gossip among the people of God.

The Book of James can best be understood in its moment of early Christian history. The audiences for whom James wrote were third and fourth generation Christians. 

Understandably, the first generations of Christians were absorbed in trying to figure out who Jesus truly was and the significance of his death. They were aggressively evangelistic and spread the new religion with amazing rapidity.

In addition, early Christian believers were apocalyptic, convinced they would be translated into the next life without suffering death. By the time of James, reality had set in. Christians were going to live out their years and pass away just as people had before Jesus.

Recognizing that fact, James had the courage to ask the crucial question for Christians: How are we to live our lives?

Rereading the book of James was a reminder of the writings and work of Walter Rauschenbusch, a Baptist minister who taught at Rochester Divinity School in upstate New York in the early 20th Century. His most famous book was entitled Christianity and the Social Crisis, published in 1907. It set in motion the Christian social gospel movement in America.  

Observing that dominant Christian churches were allied with the powerful and the wealthy, Rauschenbusch called for a new social order that addressed the evils of concentration of wealth in the hands of the few. He noted how child labor and other abuses made the wealthy even wealthier.

As I reread the Book of James, I realized that James was challenging the social evils of his own day, evils that were being commonly embraced by confessing Christians. In his messages to his fellow Christians, he railed against confessing believers who gave deference to the rich.

Walter Rauschenbusch was merely restating the message of James for the 20th Century. Like James, he was speaking primarily to his own fellowship of believers, knowing full well that John D. Rockefeller was a prominent member of his own denomination. 

It is worthy of note that the great American civil rights leader, the Rev. Martin Luther King Jr., credited Walter Rauschenbusch as one of his mentors in the Christian faith. In his Letter from Birmingham Jail, King pointed his finger not at racists but at fellow clergy who counseled patience toward racial bigots.

James, Rauschenbusch and King all spoke as deeply religious people and used the language of faith. They called sin sin and evil evil.

However, in today’s America, we do not have someone like a James, a Walter Rauschenbusch or a Martin Luther King Jr. to speak the Truth to power.

The Rev. Howard Bess is a retired American Baptist minister, who lives in Palmer, Alaska.

Does Israel Teach Anti-Arab Bigotry?

Israel is experiencing a protest movement for “social justice” as are other countries in the Middle East and Europe. But the Israeli version seeks a more equitable society for Jewish citizens while sidestepping the plight of Palestinians, what Lawrence Davidson sees as the result of intense anti-Arab indoctrination.

By Lawrence Davidson

Over the last ten years, there have been periodic outbursts of rage over the alleged anti-Semitic nature of Palestinian textbooks. Most of these episodes have been instigated by an Israeli-based organization called the Center for Monitoring the Impact of Peace (aka, the Institute for Monitoring Peace and Cultural Tolerance in School Education).

However, the Center’s conclusions have been corroborated only by other Israeli institutions such as Palestinian Media Watch. And, not surprisingly, almost all independent investigations examining the same issue have come up with very different conclusions.

These non-Zionist sources include The Nation magazine, which published a report on Palestinian textbooks in 2001; the George Eckert Institute for International Textbook Research, reporting in 2002; the Israel/Palestine Center for Research and Information, reporting in 2004; and the U.S. State Department Report of 2009. They all found that Palestinian textbooks did not preach anti-Semitism.

According to one Israeli journalist, Akiva Eldar, the Center does sloppy work. It “routinely feeds the media with excerpts from ‘Palestinian’ textbooks that call for Israel’s annihilation [without] bothering to point out that the texts quoted in fact come from Egypt and Jordan.”

Nathan Brown, a professor of political science at George Washington University who did his own study on the subject in 2000, said Palestinian textbooks now in use, which replaced older ones published in Egypt and Jordan, do not teach anti-Semitism, but “they tell history from a Palestinian point of view.”

It may well be this fact that the Zionists cannot abide, leading them purposefully to mistake a Palestinian viewpoint for anti-Semitism.

Here is another not very surprising fact: When it comes to choosing which set of reports to support, American politicians will almost always go with the Zionist versions. Take then-Sen. Hillary Clinton who, in 2007, denounced Palestinian textbooks, saying they “don’t give Palestinian children an education, they give them an indoctrination.”

How did she know? Well, Israel’s Palestinian Media Watch told her so, and she did not have the foresight to fact-check the assertion before going public.

While the Palestinian textbooks don’t teach hatred of Jewish Israelis, the reality of daily life under occupation surely does. Those “facts on the ground” and not the textbooks supply the most powerful form of education for Palestinian youth.

Although in 2009 the U.S. State Department found that Palestinian textbooks were not the products of anti-Semites, there will be yet another Department-sponsored “comprehensive and independent” study in 2011. This time, the investigation will look at “incitement” caused by bias in both Israeli and Palestinian textbooks.

When this happens, one can only hope the investigators take a look at the work of the Israeli scholar Nurit Peled-Elhanan. She is a professor of language and education at Hebrew University in Jerusalem and also the daughter of the famous Israeli general turned peace activist, Matti Peled.

Peled-Elhanan has recently written a book titled Palestine in Israeli School Books: Ideology and Propaganda in Education. The book, which will be published this month in the United Kingdom, covers the content of Israeli textbooks over the past five years and concludes that Palestinians are never referred to as such “unless the context is terrorism.” Otherwise, they are referred to as Arabs.

And Arabs are collectively presented as “vile and deviant and criminal, people who do not pay taxes, people who live off the state, who don’t want to develop. … You never see [in the textbooks] a Palestinian child or doctor or teacher or engineer or modern farmer.”

In contrast, she finds that Palestinian textbooks, even while telling history from a Palestinian point of view, “distinguish between Zionists and Jews”; they tend to take a stand “against Zionists, not against Jews.”

Peled-Elhanan makes a link between what Israeli children are taught and how they later behave when drafted into the country’s military services.

“One question that bothers many people is how do you explain the cruel behavior of Israeli soldiers towards Palestinians, an indifference to human suffering, the inflicting of suffering. … I think the major reason for that is education.”

Historically, the mistreatment of Palestinians, including the periodic massacre of them, is taught to Israelis as something that is “unfortunate” but ultimately necessary and “good” for the survival of the state. In Peled-Elhanan’s opinion, Palestinian terrorist attacks are “the direct consequence of the oppression, slavery, humiliation and the state of siege imposed on the Palestinians.”

This Israeli process of educating children to hate and to feel prejudice is, of course, exactly what the Zionists accuse the Palestinians of doing. It turns out that all this time, while leveling charges of incitement at the Palestinian educational process, the Israelis have been practicing the same sort of indoctrination on their own children.

This revelation fills Peled-Elhanan with despair; she laments, “I only see the path to fascism” for Israel.

Making Choices

Keeping the theme of education in mind, let us shift attention to the unprecedented protests now going on in Israel. For the last two weeks, massive demonstrations have hit all of Israel’s major cities. “Tent cities” have sprung up in some 40 locations. All of these protests are demanding “social justice.”

What, in this case, does social justice mean? It means addressing all the legitimate, standard-of-living problems that beset most of the Israeli demonstrators: soaring costs of food and housing, declining social services and the like. All of these are the predictable consequences of unregulated capitalism and neo-liberal governments.

A significant number of Israelis have decided that this lack of social justice has gone far enough. A recent poll shows that 88 percent of the citizenry supports the protests.

However, this is not entirely a good thing. In order to maintain such support, coming as it does from almost all sections of Israeli political life, the protest leaders now endeavor to remain “non-political” and “rooted squarely in the mainstream consensus.”

This is, of course, naive. The Israelis live in a skewed “democratic” political environment with a right-wing government that is not going to acquiesce to their demands, except to throw them an occasional bone, unless the protesters can command the votes to shape the outcome of elections. Like it or not, that is the way their system works.

There are other problems. In order to be “rooted in the mainstream consensus,” the protest leaders are staying away from the issue of social justice for the Palestinians. In Israel proper, that means turning their backs on the plight of over 20 percent of the population.

What sort of social justice is that? Well, it is social justice as defined by people educated in the system described by Nurit Peled-Elhanan. That is why the protest leaders can happily solicit the support of Naftali Bennett, the thoroughly despicable leader of the colonial/settler movement, but not any of the leaders of the Arab-Israeli community.

By not taking a social-justice-for-all stand, the protest movement leaders have registered their acceptance of the “justice for Jews only” system to which they were educated. This in itself is a political act which will make them vulnerable to being picked apart with pseudo-solutions that offer some of them a little while denying others a lot.

Already, as reported by Haaretz, dozens of members of the Knesset have petitioned Prime Minister Benjamin Netanyahu to “solve the housing crisis by building in the West Bank.” Soon thereafter, the government announced approval for “1,600 more settler homes” in East Jerusalem, with 2,700 more to come later.

That is the sort of solution this protest movement will get unless it can overcome the education/indoctrination and go into politics in a way that applies social justice to all citizens.

In all societies, there are two major goals for education: one is vocational and the other is acculturation. So, one important reason for education is to prepare young people for the job market. The other is to educate them to be “good citizens.”

What this latter goal means depends on the society one is raised in. In the old Soviet Union, becoming a good citizen meant being acculturated to a nationalist brand of communism, as is still the case today in China. In the United States, it means becoming a believer in the American version of freedom, both political and economic. And, in Israel, being a good citizen means becoming a believing Zionist.

The objective of acculturation means that education always has, and probably always will have, a strong dose of indoctrination attached to it. That the Zionists should find it shocking that the Palestinians want to use education for their version of indoctrination and acculturation is a sheer double standard.

And, finally, that the leaders of the protest movement in Israel so pointedly exclude the plight of the Palestinians is testimony to the success of their own education/indoctrination within the apartheid model.

You see, most of us really are what we are educated to be.

Lawrence Davidson is a history professor at West Chester University in Pennsylvania. He is the author of Foreign Policy Inc.: Privatizing America’s National Interest; America’s Palestine: Popular and Official Perceptions from Balfour to Israeli Statehood; and Islamic Fundamentalism.

Life in an Age of Looting

The ugly scenes of rioting and arson in Great Britain are a preview of the societal breakdown that can be expected from today’s staggeringly inequitable economic/political system, where stock-market sharpies get away with plundering pension funds but the poor get nailed for looting consumer goods, observes Phil Rockstroh.

By Phil Rockstroh

As the poor of Britain rise in a fury of inchoate rage, and as stock exchanges worldwide experience manic upswings and panicked swoons, the financial elite (and their political operatives) are arrayed in a defensive posture, even as they continue their worldwide, full-spectrum offensive vis-à-vis The Shock Doctrine.

Concurrently, corporate mass media types fret over the reversal of fortune and trumpet the triumphs of the self-serving agendas of Wall Street and corporate swindlers, even as they term a feller fleeing through the streets of North London, in ill-gotten possession of a flat-screen television, a mindless thug.

According to the through-the-looking-glass cosmology of mass media elitists, when a poor person commits a crime of opportunity, his actions are a threat to all we hold dear and sacred, but, when the hyper-wealthy of the entrenched looter class abscond with billions, those criminals are referred to as our financial leaders.

Regardless of the propaganda of “free market” fantasists, the great unspeakable in regard to capitalism is that its wealth, by and large, is generated for a ruthless, privileged few by the creation of bubbles.

When those bubbles burst, the resultant economic catastrophe inflicts a vastly disproportionate amount of harm upon those — the laboring and middle classes — who generate grossly inequitable amounts of capital for the elitists of the fraudster class … by having the life force drained from them by the vampiric set-up of the gamed system.

Woody Guthrie summed up the situation in these two (unfortunately) ageless stanzas:

“Yes, as through this world I’ve wandered
I’ve seen lots of funny men;
Some will rob you with a sixgun,
And some with a fountain pen.

“And as through your life you travel,
Yes, as through your life you roam,
You won’t never see an outlaw
Drive a family from their home.”
–excerpt from “Pretty Boy Floyd.”

At present, though, U.S. bank vaults contain little tangible loot for a Pretty Boy Floyd-type outlaw to boost. How would it be possible for an old-school bank robber such as Floyd to make off with a haul of funneled electrons?

Here’s the lowdown: The Wall Street fraudsters of the swindler class want to refill their coffers and line their pockets (that is, offshore accounts) with Social Security and Medicare funds. That’s the nature of the unfolding scam, folks. Oligarchic rule has always been a system defined by legalized looting that leaves a wasteland of want, deprivation and unfocused rage in its wake.

Consequently, in the U.K. (and beyond): When poor people’s hopes dry up, cities become a tinderbox of dead dreams, and we should not be stricken with shock and consternation when these degraded places are set aflame, nor should we be surprised when the bribed, debt-beholden and commercial media propaganda-bamboozled middle-class (who helped create the wasteland with their arid complicity) cry out (predictably) for police-state tactics to quell the fiery insurrection.

There have been incidents in which a fire has smoldered for years in an abandoned, sealed-off mineshaft, and then the fire, traveling through the tunnels of the mine and up the roots of dead, dried trees, has caused a dying forest to bloom into flames.

The rage that sparks a riot can proceed in a similar manner — and the insular, sealed-off nature of a nation’s elite and the willful ignorance of its middle-class will only make the explosion of pent-up rage more powerful when it reaches the surface.

We exist in a culture that, day after day, inundates its have-nots with consumerist propaganda, and then, when the social order breaks down, its wealthy and bourgeoisie alike express outrage when the poor steal consumer goods — as opposed to going out and looting an education and a good job.

Under Disaster Capitalism, the people in the underclass have had economic violence inflicted upon them since birth, yet the corporate-state mass media doesn’t seem to notice the situation, until young men burn down the night. Then media elitists wax indignant, carrying on as if these desperate acts are devoid of cultural context.

A mindset has been instilled in these young men and boys that they are nothing sans the accoutrements of consumerism. Yet when they loot an iPhone, as opposed to creating economy-shredding derivative scams, we’re prompted by the corporate media to become indignant.

When the slow-motion, elitist-manipulated mob action known as our faux democratic/consumerist culture deprives people of their basic human rights and personal dignity — then, in turn, we should not be shocked when a mob of the underclass fails to bestow those virtues upon others.

The commercial mass media’s narrative of narrowed context (emotional, anecdotal and unreflective in nature) serves as a form of corporate state propaganda, promulgated to ensure the general population continues to rage against the symptoms rather than the disease of neoliberalism.

Falsely framing those who state that the deprivations of neoliberalism factor into the causes of uprisings, insurrections and riots as apologists for violence and destruction is as preposterous as claiming one is an apologist for dry rot when he points out structural damage to a house due to a leaking roof.

Because of the elements of inverted totalitarianism, inherent within the structure of corporate state capitalism and internalized within the general population by constant, commercial media reinforcement, one should not be surprised when a sizable portion of the general populace is inclined to support police-state tactics to quell social unrest among the disadvantaged of the population.

Keep in mind: When watching the BBC or the corporate media, one is receiving a limited narrative (tacitly) approved by the global power elite, created by informal arrangements among a careerist cartel comprised of business, governmental and media personality types who have a vested interest in maintaining the status quo, even if, in doing so, they serve as operatives of a burgeoning police state.

Accordingly, you can’t debate fascist thinking with reason or empathetic imagination, e.g., the self-righteous (and self-serving) pronouncements of mass media representatives or the attendant outrage of the denizens of the corporate state in their audience — their umbrage engineered by the emotionally laden images with which they have been relentlessly pummeled and plied — because their responses will be borne of (conveniently) lazy generalizations, given impetus by fear-based animus.

Through it all, veiled by disorienting media distractions and political legerdemain, we find ourselves buffeted and bound by the predicament of paradigm lost that constitutes the onset of the unraveling of the present order.

“The kings of the world are growing old,
and they shall have no inheritors.
Their sons died while they were boys,
and their neurasthenic daughters abandoned
the sick crown to the mob.”
–Rainer Maria Rilke, excerpt from “The Kings of the World”

Yet there is abundant evidence that, even as people worldwide are rising up against inequity and exploitation, the economic elite have little inclination to so much as glimpse the plight of those from whose life blood their immense riches have been wrung, or to hear the admonition of the downtrodden that they are weary of life on their knees and are awakening to the reality that the con of freedom of choice under corporate state oligarchy is, in fact, a life shackled to the consumerism-addicted, debt-indentured servitude that comprises the structure of the neoliberal, global company store.

“The rotten masks that divide one man
From another, one man from himself
They crumble
For one enormous moment and we glimpse
The unity that we lost, the desolation
…Of being man, and all its glories
Sharing bread and sun and death
The forgotten astonishment of being alive”
–Octavio Paz, excerpt from “Sunstone”

Accordingly, the most profound act of selfless devotion (commonly called love) in relation to a society gripped by a sociopathic mode of being is creative resistance. Submission is madness. Sanity entails subversion. The heart insists on it; otherwise, life is only a slog to the graveyard; mouth, full of ashes; heart, a receptacle for dust.

Phil Rockstroh is a poet, lyricist and philosopher bard living in New York City.

US Lost Its Way from Omaha Beach

Exclusive: Visiting Omaha Beach and the nearby American cemetery of World War II dead recalls a moment in time when the United States sacrificed to stop a global epidemic of madness. But Robert Parry discovered that those memories also underscore how the United States has since lost its way.

By Robert Parry

My pilgrimage to the World War II beaches of Normandy was a reminder to me of what the United States meant to the world not that long ago, and of the troubling contrast with today.

Before heading to Omaha Beach, the iconic heart of D-Day heroism, I spent several hours at Caen’s World War II museum, where you descend an inclined walkway into the murderous madness that engulfed Europe in the 1930s.

It is still hard to imagine that a racist fanatic like Adolf Hitler could gain control of Germany, then one of the world’s most advanced civilizations, and that he could win over enough Germans to undertake various forms of mechanized slaughter.

There was, of course, a long history in Europe of such butchery, from the Roman conquests more than two millennia ago, through the Christian religious wars of the middle of the last millennium to the wholesale killing of World War I.

Indeed, that European tendency to periodically sink into bloody barbarism was the historical backdrop of the American Revolution.

In creating a new Republic, the Founders tried to inoculate the United States against some of those viruses by prohibiting a national religion, restraining the Executive’s war-making powers and cautioning against entangling alliances.

But the more integrated world of the 20th Century made isolationism a difficult approach.

Hitler’s Rise

By the 1930s, Europe had gotten itself into another fix with global implications. The German business elite had decided that Hitler was the man who could stop the rise of Bolshevism and regain some of Germany’s lost pride and territories from World War I.

Great Britain and France made some appeasing gestures toward Hitler by restoring land that had been stripped from Germany, but that only encouraged Hitler’s megalomania.

Soon, Hitler’s aggression against Poland pushed matters too far, shoving Europe into yet another war. France soon fell to Germany’s military might and Great Britain struggled under an unprecedented aerial bombardment focused on civilian targets.

Quietly assisted by President Franklin Roosevelt, Great Britain withstood the air campaign, causing Hitler to turn his attention to the Soviet Union.

Meanwhile, half a world away, Germany’s fascist allies in Japan were expanding their own empire and chafing against American power in the Pacific. After Japan’s surprise attack on Pearl Harbor, the United States entered the war against Japan and its Axis allies.

As the war expanded, so did the carnage, involving both armies and civilians.

Under the cover of war, Hitler advanced his genocidal goal of exterminating European Jews, whom he used as scapegoats for Germany’s troubles. The Nazis’ use of roving execution squads gave way to the construction of industrial-style killing factories.

The future of humanity looked exceedingly bleak.

However, by 1944, Hitler’s forces had suffered a devastating defeat at Stalingrad and were being driven back by the Soviet Red Army on the eastern front. The United States and Great Britain had mounted a successful invasion of Italy, but progress northward was slow and bloody.

A Western Front

The world’s attention turned to the coast of France, where an amphibious assault was anticipated, though the Germans were unsure where. The assault came on June 6, 1944, along the Normandy coast farther west than Hitler had expected.

British and American forces carried out the major landings, with other allied countries and resistance movements contributing what they could.

The U.S./British high command considered the most important landing sites to be those designated Utah (for the Americans) and Gold, Juno and Sword (for the British and Canadians).

But the commanders feared that there was too much territory between those principal targets, so an American landing was also ordered under the bluffs of what was designated Omaha Beach.

As the invasion got underway, the landings at Omaha Beach proved particularly bloody with some 3,000 American troops dying in a desperate struggle to overcome well-entrenched Germans controlling the high ground.

Finally, Allied beachheads were established and the Germans were driven back, but the fighting across Normandy raged for more than two months. The losses were heavy on all sides.

Victory at Last

By the spring of 1945, the Red Army from the east and the U.S./British forces from the west had put an end to Hitler’s Third Reich. The crazed dictator committed suicide in his Berlin bunker.

The defeat of German fascism also stopped Hitler’s extermination plans, though not before nearly six million Jews and many other “undesirables” were put to death.

A year later, the Nuremberg Tribunal punished some of Germany’s leading war criminals and established what were to be principles for a future peaceful world.

A visit to Normandy is a reminder of how important the United States was in stopping the madness.

The most lasting reminder of this American contribution is the cemetery at St. Laurent-sur-Mer, where more than 9,300 U.S. servicemen are buried under row upon row of white crosses and the occasional Star of David.

After the war had ended, the American dead were collected from across much of Europe. Their families were given the choice of repatriating the bodies or having them interred at this American cemetery near where they had died, including many with the date June 6, 1944.

The cemetery, which overlooks a section of Omaha Beach, has become a point of pilgrimage for many Americans, although during my visit on Aug. 5 there seemed to be even more French visitors paying their respects than Americans.

The whole Normandy region retains an appreciation for Americans, unlike some other parts of France where Americans often find the French standoffish or haughty. Today, the long sandy stretch along Normandy’s north coast is still called Omaha Beach in honor of the Americans who died there.

Going west from Omaha Beach toward Utah Beach, there are other tributes to the American liberators.

In the little village of Sainte-Mère-Église, a dramatic moment is recalled from the 82nd Airborne’s assault on the night of June 5, 1944, when paratrooper John Steele’s parachute got entangled on the church steeple and he played dead for hours before being disentangled and taken down.

Looking up at the church today, a replica of Steele and his parachute hangs there. Inside the church, a stained-glass window commemorates the American paratroopers, whose death toll of about 4,000 was even higher than the fatalities at Omaha Beach.

A Dark Turn

While war should never be romanticized, and U.S. history is replete with its own acts of bloody inhumanity, it is difficult for an American to come away from a visit to Normandy without a lump in the throat over the necessary, if brutal, actions that occurred here.

Something truly evil had gained a powerful foothold in the world and had to be stopped. But the tragedy is also what happened next, how the United States became corrupted by much of the same viciousness that the Nazis and their Axis allies had unleashed.

Over Germany and Japan, the Allies undertook their own terror bombings of civilian centers, such as Dresden and Tokyo. On Aug. 6 and 9, 1945, President Harry Truman chose to drop atomic bombs on two nearly defenseless Japanese cities, Hiroshima and Nagasaki, rather than negotiate a peace with Japan.

After World War II, the United States engaged in a fierce competition with its erstwhile ally, the Soviet Union, again rebuffing possible openings for accommodation. Opting for a new kind of empire, the United States even collaborated with ex-Nazis and similarly brutal fascists in the Third World.

In Indochina, the U.S. military killed in the millions, and in Latin America, Washington allied itself with vicious dictators trained in the dark arts of assassination and torture.

To feed the national hunger for energy, American leaders sided with authoritarian leaders across the Middle East, just as long as those despots ensured a steady supply of oil.

Part of the Problem

Instead of seeing the Americans as liberators who were part of a solution, many people around the world came to view Americans as just the new imperialists on the block, as part of the problem.

When U.S. interference in the Middle East and Central Asia led to the emergence of al-Qaeda and its 9/11 attacks in 2001, President George W. Bush told the confused American public that the terrorists simply “hate our freedoms.” Many Americans were then duped into believing that Iraq was somehow behind 9/11, even though no Iraqis were involved in those attacks.

Thus, a majority of Americans enthusiastically supported Bush’s unprovoked invasion of that Arab country, a violation of the Nuremberg Tribunal’s prohibition against aggressive war.

Some Americans were caught up in a frenzy of waving the American flag; others unfurled the Christian banner or the Star of David for a renewed “clash of civilizations” with Islam. Bigotry against Muslims has become an accepted part of political thought across much of the U.S. heartland and is eagerly promoted by the still-influential neoconservatives.

Today, the United States seems to be leading the world into a new Dark Age, where science and fact are forced to take a back seat to religious and ideological beliefs, where free-market extremism mixes with jingoism, militarism and Christian fundamentalism.

Tea Party Madness

America’s most prominent “populist” movement, the Tea Party, is remarkable in that its central tenet is to make sure taxes on rich people are kept low and none of their tax loopholes, even for corporate jets, are closed.

Though the Tea Party denies that it is racist or has any similarities to the old-line fascist parties of Europe, it appears particularly energized by its hatred of America’s first black president, having pushed false claims about Barack Obama’s Kenyan birth.

At its core, the Tea Party seems driven by a profound contempt for the necessity of democratic government as a counterbalance to the excesses of corporate power. The Tea Party amounts to a movement to shift power over U.S. society to corporate overlords.

Most recently, the Tea Party and its Republican allies shoved the United States to the brink of default, making the faith and credit of the country a hostage to right-wing demands for trillions of dollars in spending cuts but no revenue increases.

After the strategy proved successful, with President Obama and congressional Democrats bowing to the spending-cuts-only approach to prevent a default, Republicans gloated over their hostage strategy.

Senate Minority Leader Mitch McConnell, R-Kentucky, said the GOP/Tea Party approach proved the debt-ceiling limit was “a hostage that’s worth ransoming” and had “set the template for the future.”

The cumulative impact of right-wing American policies is also pushing the world toward what may be another cataclysm.

The Right’s anti-government, anti-regulation movement combined with Ayn Rand’s “greed-is-good” approach to economics played key roles in Wall Street’s financial collapse in 2008 and the resulting global recession.

Now, the Right’s austerity demands are squeezing the embattled middle class even more, setting the stage for worsening social unrest, which is already provoking renewed racial tensions in Europe and prompting demands for more “law and order.”

Key political forces in the United States seem determined to ignore the lessons of the 1920s and 1930s and force some post-modern “Clockwork Orange” replay of those troubled times.

After spending time in Normandy and recalling the sacrifice that so many Americans made to stop one lethal virus of madness, it is disconcerting to see the United States emerging as a principal carrier of another.

[For more on these topics, see Robert Parry’s Secrecy & Privilege and Neck Deep, now available in a two-book set for the discount price of only $19. For details, click here.]

Robert Parry broke many of the Iran-Contra stories in the 1980s for the Associated Press and Newsweek. His latest book, Neck Deep: The Disastrous Presidency of George W. Bush, was written with two of his sons, Sam and Nat. His two previous books, Secrecy & Privilege: The Rise of the Bush Dynasty from Watergate to Iraq and Lost History: Contras, Cocaine, the Press & ‘Project Truth’ are also available there.


The Bible’s Clash with Today’s Reality

Among Republican presidential hopefuls, several such as Rep. Michele Bachmann and Gov. Rick Perry have stressed their commitment to fundamentalist Christianity, which bases its approach to cultural issues on a literal reading of the Bible. But the Rev. Howard Bess notes that many of those ancient traditions are repugnant to modern society.

By the Rev. Howard Bess

The essential messages of the Bible are justice, peace, love, reconciliation and hope — messages that have the power to operate in every age and every culture. But the list of clashes between the Bible and modern culture is long.

For instance, the Bible reflects an absurd understanding of the structure of the universe; it shows little understanding of physical and mental illnesses; and it was and is on the wrong side of patriarchal authority, marriage, equality for women, homosexuality, slavery, and the rights of an older son.

That is because the Bible is a collection of writings by many authors who wrote in ever-changing circumstances in ancient times. Today’s Bible readers live in circumstances that could not have been imagined by the original writers.

Family, social, economic and government structures today are completely different from those of the authors of the original writings. The place of women in Bible settings is a prime example of this dilemma, since that status during early Judaism is defined in the property codes of Leviticus.

Women were property owned by men. They were bought and sold. The most famous example of this law is the story of Jacob and Laban.

Jacob was moving back to his family’s home territory east of Palestine when he arrived at a watering hole and inquired where he could find an uncle named Laban, who was Jacob’s mother’s brother.

As providence would have it, a daughter of Laban, Rachel, appeared at the watering hole with some sheep. Jacob’s first cousin was beautiful and Jacob decided that Rachel was the girl of his dreams. He wanted her as a wife.

The next step in the process was to make a deal with Uncle Laban for the purchase of Rachel. The price was seven years of work as his uncle’s slave. Jacob worked the seven years and thought that the beautiful Rachel was his.

However, Laban switched products. When Jacob woke up from his wedding night, he discovered that he had slept not with Rachel, but with an older sister named Leah. Laban calmly explained that he had no choice. By custom he could not sell off a younger daughter until after he had sold his oldest daughter.

Jacob and Uncle Laban made a new deal. Jacob would work another seven years to get the wife he wanted. He worked the seven years and got Rachel.

The deeply embedded cultural code reflected in this tale eventually became Leviticus law, following a pattern in which established social customs typically get codified into binding law.

The Bible standard of male ownership of women was still fully in force in First Century CE culture, at the times of the New Testament writings. The place of a woman was determined by her ownership.

A common misunderstanding about the women who became followers of Jesus is that many were prostitutes. They were, in fact, women who for some reason no longer had an owner and thus were completely vulnerable in the male-dominated society.

A woman such as Mary from Magdala is an example. She was not a “loose” woman but a victim of a cruel male-dominated society. Such women attached themselves to Jesus to escape their plight. They called Jesus “Lord,” and he gave them a new understanding of the value of their lives.

Even in modern times, the Biblical standard of male ownership of women has been difficult to overcome.

I grew up in a small Midwestern farm community where the largest and dominant religious group in the area called themselves Apostolic Christians. Among Apostolic Christians in the 1930s, a man got a wife through negotiation with a young lady’s father.

There were no dating procedures. Their wedding was a celebration of the transfer of ownership of a woman from her father to her husband. They carried on this practice because they made ancient cultural rules a part of their faith practice, seeing themselves as being faithful to Bible standards.

In today’s world, women have carved out very different roles for themselves than the roles assigned to them by the Bible, both Old and New Testaments. Today’s women have made it clear they will never again submit to the cultural practices found in the Bible.

Christians have rightfully seen the necessity of translating the Bible from language to language to facilitate an understanding of the Bible messages. However, most Christians have not understood the necessity of translating the Bible messages from culture to culture.

To be effective the Christian message must be freed from the cultural shackles found in the Bible.

The Rev. Howard Bess is a retired American Baptist minister who lives in Palmer, Alaska.