A US Media Lost in Propaganda

There was once a time, perhaps just a brief moment, when American journalists were cynical and responsible enough to resist being jerked around by U.S. government propaganda. But that time has long since passed, if it ever existed, a reality that William Blum describes.

By William Blum

Vulgar, crude, racist and ultra-sexist though he is, Donald Trump can still see how awful the American mainstream media is.

I think one of the main reasons for Donald Trump’s popularity is that he says what’s on his mind and he means what he says, something rather rare amongst American politicians, or politicians perhaps anywhere in the world. The American public is sick and tired of the phony, hypocritical answers given by office-holders of all kinds.

When I read that Trump had said that Sen. John McCain was not a hero because McCain had been captured in Vietnam, I had to pause for reflection. Wow! Next the man will be saying that not every American soldier who was in the military in Vietnam, Afghanistan and Iraq was a shining hero worthy of constant media honor and adulation.

When Trump was interviewed by ABC-TV host George Stephanopoulos, former aide to President Bill Clinton, he was asked: “When you were pressed about [Russian president Vladimir Putin’s] killing of journalists, you said, ‘I think our country does plenty of killing too.’ What were you thinking about there? What killing sanctioned by the U.S. government is like killing journalists?”

Trump responded: “In all fairness to Putin, you’re saying he killed people. I haven’t seen that. I don’t know that he has. Have you been able to prove that? Do you know the names of the reporters that he’s killed? Because I’ve been, you know, you’ve been hearing this, but I haven’t seen the name. Now, I think it would be despicable if that took place, but I haven’t seen any evidence that he killed anybody in terms of reporters.”

Or Trump could have given Stephanopoulos a veritable heart attack by declaring that the American military, in the course of its wars in recent decades, has been responsible for the deliberate deaths of many journalists. In Iraq, for example, there’s the Wikileaks 2007 video, exposed by Chelsea Manning, of the cold-blooded murder of two Reuters journalists; the 2003 U.S. air-to-surface missile attack on the offices of Al Jazeera in Baghdad that left three journalists dead and four wounded; and the American firing on Baghdad’s Hotel Palestine the same year that killed two foreign news cameramen.

It was during this exchange that Stephanopoulos allowed the following to pass his lips: “But what killing has the United States government done?”

Do the American TV networks not give any kind of intellectual test to their newscasters? Something at a fourth-grade level might improve matters.

Prominent MSNBC newscaster Joe Scarborough, interviewing Trump, was also baffled by Trump’s embrace of Putin, who had praised Trump as being “bright and talented.” Putin, said Scarborough, was “also a person who kills journalists, political opponents, and invades countries. Obviously that would be a concern, would it not?”

Putin “invades countries”? Well, now there even I would have been at a loss as to how to respond. Try as I might, I don’t think I could have thought of any countries the United States has ever invaded. [Editor’s Note: Sarcasm aside, Blum has compiled comprehensive lists of U.S. invasions and interventions in his books, including Killing Hope: U.S. Military and CIA Interventions Since World War II.]

To his credit, Trump responded: “I think our country does plenty of killing, also, Joe, so, you know. There’s a lot of stupidity going on in the world right now, Joe. A lot of killing going on. A lot of stupidity. And that’s the way it is.”

As to Putin killing political opponents, this too would normally go unchallenged in the American mainstream media. But earlier this year, I listed seven highly questionable deaths of opponents of the Ukraine government, a regime put in power by the United States, which is used as a club against Putin.  This of course was non-news in the American media.

So that’s what happens when the know-nothing American media meets up with a know-just-a-bit-more presidential candidate. Ain’t democracy wonderful?

Trump has also been criticized for saying that immediately after the 9/11 attacks, thousands of Middle Easterners were seen celebrating outdoors in New Jersey in sight of the attack location. An absurd remark, for which Trump has been rightfully vilified; but not as absurd as the U.S. mainstream media pretending that it had no idea what Trump could possibly be referring to in his mixed-up manner.

For there were in fact people seen in New Jersey apparently celebrating the planes crashing into the World Trade Center towers. But they were Israelis, which would explain all one needs to know about why the story wasn’t in the headlines and has since been “forgotten” or misremembered.

On the day of the 9/11 attacks, Israeli Prime Minister Benjamin Netanyahu was asked what the attacks would mean for U.S.-Israeli relations. His quick reply was: “It’s very good. Well, it’s not good, but it will generate immediate sympathy (for Israel).” There’s a lot on the Internet about these Israelis in New Jersey, who were held in police custody for months before being released. So here too mainstream newspersons do not know enough to enlighten their audience.

Russian Propaganda?

There is a Russian website [inosmi = foreign mass media] that translates propagandistic russophobic articles from the Western media into Russian and publishes them so that Russians can see with their own eyes how the Western media lies about them day after day.

There have been several articles lately, based on polls, showing that anti-Western sentiment is increasing in Russia, and blaming it on “Putin’s propaganda.” This is rather odd, because who needs propaganda when Russians can read the Western media themselves and see firsthand all the lies it puts forth about them and its demonizing of Putin?

There are several political-debate shows on Russian television that invite Western journalists or politicians; one frequent guest is a really funny American journalist, Michael Bohm, who keeps regurgitating all the Western propaganda as he argues with his Russian counterparts.

It’s pretty surreal to watch him display the worst political stereotypes of Americans: arrogant, gullible, and ignorant. He stands there and lectures high-ranking Russian politicians, “explaining” to them the “real” Russian foreign policy, and the “real” intentions behind their actions, as opposed to anything they say. The man is shockingly irony-impaired. It is as funny to watch as it is sad and scary.

The above was written with the help of a woman who was raised in the Soviet Union and now lives in Washington. She and I have discussed U.S. foreign policy on many occasions. We are in very close agreement as to its destructiveness and absurdity.

Just as in the first Cold War, one of the basic problems is that Exceptional Americans have great difficulty in believing that Russians mean well. Apropos this, I’d like to recall the following written about the late George Kennan:

“Crossing Poland with the first US diplomatic mission to the Soviet Union in the winter of 1933, a young American diplomat named George Kennan was somewhat astonished to hear the Soviet escort, Foreign Minister Maxim Litvinov, reminisce about growing up in a village nearby, about the books he had read and his dreams as a small boy of being a librarian.

“We suddenly realized, or at least I did, that these people we were dealing with were human beings like ourselves,” Kennan wrote, “that they had been born somewhere, that they had their childhood ambitions as we had. It seemed for a brief moment we could break through and embrace these people.”

It hasn’t happened yet.

Kennan’s sudden realization brings George Orwell to mind: “We have now sunk to a depth at which the restatement of the obvious is the first duty of intelligent men.”

William Blum is an author, historian, and renowned critic of U.S. foreign policy. He is the author of Killing Hope: U.S. Military and CIA Interventions Since World War II and Rogue State: A Guide to the World’s Only Superpower, among others. [This article originally appeared at the Anti-Empire Report, http://williamblum.org/.]




How Debt Conquered America

Special Report: America presents itself to the world as “the land of the free” but for the vast majority it is a place of enslaving indebtedness, a reality for much of “the 99%” that has deep historical roots hidden or “lost” from our history, as Jada Thacker explains.

By Jada Thacker

Since its center-stage debut during the Occupy Wall Street movement, “the 99%,” a term emblematic of the extreme economic inequality confronting the vast majority, has become commonplace. The term was coined by anthropology professor David Graeber, an Occupy leader and author of the encyclopedic Debt: The First 5,000 Years, published just as the Occupy movement captured headlines.

What Graeber’s monumental work did not emphasize specifically, and what most Americans still do not appreciate, is how debt was wielded as the weapon of choice to subjugate the 99% in the centuries before the Occupy protesters popularized the term. Like so many aspects of our Lost History, the legacy of debt has been airbrushed from our history texts, but not from our lives.

The original 99% in America did not occupy Wall Street in protest. They occupied the entire Western Hemisphere as original inhabitants of North and South America. After 20,000 years of Occupy Hemisphere, an Italian entrepreneur appeared, having pitched an investment opportunity to his financial backers in Spain.

Soon after Columbus launched his business enterprise on the pristine beaches of the New World, each native discovered there above the age of puberty was required to remit a “hawk’s bell’s worth” of gold dust to the Spaniards every two weeks. The hands of all those failing to do so were cut off and strung about their necks so that they bled to death, thus motivating the compliance of others.

Bartolome de las Casas, a contemporary slave-owning priest-turned-reformer, reported three million natives were exterminated by Spanish entrepreneurship in only 15 years. His population figures were guesstimates, but modern researchers confirm that 80 to 90% of the Taino people in the Hispaniola-Cuba region died within 30 years of Spanish contact, the majority from disease.

In the century following Hernan Cortés’s extreme “hostile takeover” of the admittedly brutal Aztec regime (1519-21), the native population of the entire region also declined by 90%. The same story generally followed the Spanish march across Central and South America.

Spanish conquistadors rationalized that their colonial business model, however brutal, was morally necessary: without religious conversion to the Church, pagan natives would have been condemned to an everlasting Christian hell. Ostensibly to save pagan souls, Spaniards destroyed pagan persons with the draconian encomienda system, in essence a debt-based protection racket.

The encomienda dated from the Roman occupation of Iberia (Spain), but had more recently metastasized from the practice of Christians exacting tribute from Muslims during the so-called Spanish Reconquista, which ended the year Columbus sailed. Under this medieval debt obligation, the native 99% were deemed to owe their labor and resources (not their land, which was expropriated by the Crown) to Spaniards in exchange for “protection” and religious education.

The system’s legitimacy in the New World depended upon the useful fiction that the native labor force was not composed of sentient human beings. Thus, it was not lawful to impose encomienda upon persons of mixed-race (mestizo) presumably because they had enough European blood to be considered human.

In practice, this debt-labor system devolved into slavery and butchery of the most brutal sort imaginable, as witnessed by de las Casas. Though the encomienda was eventually abolished, it was replaced only by the hacienda system.

Haciendas were Spanish plantations on which natives worked as landless peasants, owing a share of their produce to the landowner for the privilege of living lives comparable only to those of Southern plantation slaves in the U.S. a century or two hence.

Spanish mines were the scene of even worse atrocities. In Open Veins of Latin America, Eduardo Galeano details the horrors: native mothers in the notorious Bolivian Potosi silver mine murdered their own children to save them from lives spent as slave troglodytes.

Although some Potosi miners were nominally “free” laborers, they worked under a debt-peonage system that forbade them to leave the mine while still indebted to employers who loaned them the tools of their trade. Not even death extinguished their debt: upon the death of the indebted miner, his family was required to repay the debt with their own perpetually-indebted labor.

The tragedy of the Spaniards’ devastation of untold millions of native lives was compounded by seven million African slaves who died during the process of their enslavement. Another 11 million died as New World slaves thereafter.

The Spanish exploitation of land and labor continued for over three centuries until the Bolivarian revolutions of the Nineteenth Century. But even afterward, the looting continued for another century to benefit domestic oligarchs and foreign business interests, including those of U.S. entrepreneurs.

Possibly the only other manmade disasters as irredeemable as the Spanish Conquest in terms of loss of life, destruction and theft of property, and impoverishment of culture were the Mongol invasions of the Thirteenth and Fourteenth centuries. The Mongols and the Spaniards each inflicted a human catastrophe fully comparable to that of a modern, region-wide thermonuclear war.

The North American Business Model

Unlike Spaniards, Anglo-American colonists brought their own working-class labor from Europe. While ethnic Spaniards remained at the apex of the Latin American economic pyramid, that pyramid in North America would be built largely from European ethnic stock. Conquered natives were to be wholly excluded from the structure.

While contemporary North Americans look back at the Spanish Conquest with self-righteous horror, most do not know the majority of the first English settlers were not even free persons, much less democrats. They were in fact expiration-dated slaves, known as indentured servants.

They commonly served 7 to 14 years of bondage to their masters before becoming free to pursue independent livelihoods. This was cold comfort, indeed, for the 50% of them who died in bondage within five years of arriving in Virginia, this according to American Slavery, American Freedom: The Ordeal of Colonial Virginia by the dean of American colonial history, Edmund S. Morgan.

Also disremembered is that the Jamestown colony was founded by a corporation, not by the Crown. The colony was owned by shareholders in the Virginia Company of London and was intended to be a profit-making venture for absentee investors. It never made a profit.

After 15 years of steady losses, Virginia’s corporate investors bailed out, abandoning the colonists to a cruel fate in a pestilential swamp amidst increasingly hostile natives. Jamestown’s masters and servants alike survived only because they were rescued by the Crown, which was less motivated by Christian mercy than by the tax it was collecting on each pound of the tobacco the colonists exported to England.

Thus a failed corporate start-up survived only as a successful government-sponsored oligarchy, which was economically dependent upon the export of addictive substances produced by indentured and slave labor. This was the debt-genesis of Anglo-American colonization, not smarmy fairy tales featuring Squanto or Pocahontas, or actor Ronald Reagan’s fantasized (and plagiarized) “shining city upon a hill.”

While the Spaniards’ ultimate goal was to command native labor from the economic apex, the Anglo-American empire would replace native labor with its own disadvantaged 99%. The ultimate goal of Anglo colonization was not so much to put the natives under the lash as to be rid of them altogether.

Trade deficits and slavery would answer their purpose quite nicely. By the 1670s, New England Puritans were already rigging the wampum market at their trading posts in order to pressure the Wampanoag into ceding land, thus in part precipitating the Narragansett War (King Philip’s War), the ensuing genocide of some natives, and the mass enslavement of others to be sold abroad.

As chronicled by Alan Gallay in The Indian Slave Trade: The Rise of the English Empire in the American South, 1670-1717, Carolina colonists concurrently sold Indian slaves to the God-fearing Puritans and traded others for African slaves at a 2:1 exchange rate, all while wielding trading-post debt against local Indians, precipitating the Yamasee War, which proved to be a major disaster for natives and whites alike.

For half a century, the Carolina colonists’ export of tens of thousands of Indian slaves exceeded imports of black slaves. This was the origin of Southern plantation agriculture.

The institution of North American Indian slavery was necessarily based upon debt. English law forbade colonists from enslaving free persons, but it conceded that prisoners of war could be considered slaves. Because captives owed their lives to their captors, the latter could dispose of the debt as they saw fit, to include transferring the debt to a third party for goods and services.

 

The captive-to-slave pipeline was sanctioned by none other than John Locke, the renowned philosopher who directly inspired Jefferson’s composition of the Declaration of Independence, and who is often championed today by libertarians (and no wonder!) as an oracle of private property rights.

All along the westward frontier, American colonists continued to foreclose on natives’ land with debt machinations perhaps less overtly brutal, but far more devious than the Spanish encomienda: to remove the self-reliant 99% from their land, it was necessary first to remove their self-reliance.

Here is how President Thomas Jefferson explained the process to future president William Henry Harrison in 1803: “To promote this disposition to exchange lands […] we shall push our trading houses, and be glad to see the good and influential individuals among them run in debt, because we observe that when these debts get beyond what the individuals can pay, they become willing to lop them off by a cession of lands.

“As to their fear, we presume that our strength and their weakness is now so visible that they must see we have only to shut our hand to crush them, and that all our liberalities to them proceed from motives of pure humanity only. Should any tribe be foolhardy enough to take up the hatchet at any time, the seizing the whole country of that tribe, and driving them across the Mississippi, as the only condition of peace, would be an example to others, and a furtherance of our final consolidation.”

Debt, more so than firepower, firewater, or even disease, provided the economic weapon by which Anglo-Americans designed to privatize the Indians’ means of self-reliance. “How the West Was Won” in the “Land of the Free” was a saga of debt moving inexorably westward in what Jefferson called “our final consolidation.” He might well have said “final solution,” but he did not.

As debt expanded westward, desperate Anglo settlers believed the frontier land was “free for the taking” for those with the stamina to seize it. This belief ultimately proved illusory, as land “squatters” and homesteaders were evicted or forced into paid tenancy through debt or the legal maneuvering of wealthy land speculators.

George Washington secured the eviction of pioneer families from western Pennsylvania land he claimed to own in absentia, although those he forced from the land possessed a deed that pre-dated his own, as related in Joel Achenbach’s The Grand Idea: George Washington’s Potomac & the Race to the West.

On the other hand, Daniel Boone, famed for leading pioneers westward through the Cumberland Gap, died landless, all his land claims having been picked off by legal sharpshooters. Also landless, Davy Crockett died at the Alamo in an attempt to secure Texas acreage he never survived to claim.

The final illusion of free soil vaporized when in 1890 the United States census declared the American Frontier closed. Much of what was left had at any rate been monopolized by railroad, ranching, mining, and forestry corporations after the Dawes Act had privatized most of the natives’ “protected” reservation lands in 1887. For most white and black Americans, meanwhile, free tenancy homesteads had never materialized in the first place.

Debt vs. Self Reliance

Jefferson’s “final consolidation” was accomplished by a system he admitted offered debt with one hand but held a sword in the other. The estimated 3,000,000 families who lost their homes during the Great Recession that began in 2007 understand this principle intimately.

The debt system is in fact more powerful in the Twenty-first Century than ever before because the 99% are far less self-reliant now than ever before. To understand why this is so, we must first think seriously about the term “self-reliance.”

Although we may casually refer to someone as being self-reliant, such people do not actually exist. Human beings simply are not equipped to survive, much less prosper, strictly as self-reliant individuals. As infants and children we cannot survive without familial care, and as adults we cannot prosper without the cooperation and support of peers.

There has never been, and never will be, such a thing as a “self-made man.”


On the other hand, the 99% was self-reliant the day before Columbus arrived. They possessed the means of production of the energy and food resources needed for their group’s long-term survival and biological propagation, all without significant contact with others. Had the culture of Columbus been equally self-reliant, he would never have needed to set sail.

Judged by modern standards, American native groups were intensely cooperative, extremely egalitarian, and inherently (if informally) democratic. Government as a coercive force did not exist in these groups as we know it today, though leadership and traditional mores were vital to group survival.

Similarly, the concepts of money and monetary debt were unknown, as was the concept of an economic “class” that reserved economic privileges or property to itself at the expense of all others. Interpersonal behavior within native groups, by eyewitness accounts, was respectful and peaceful.

Behavior between native groups usually was not peaceful. Persistent low-level warfare was the norm. It could be brutal indeed, but rarely if ever rose to the scale of civilized “total war.”

Indeed, since a “warrior class” did not exist and could not be conscripted, native combatants were necessarily volunteers who otherwise were needed at home to help provide for their families. Consequently, the severity and duration of native warfare were limited as it is for all human groups everywhere to what society at large can economically afford.

Among the original 99%, all men mostly performed the same sort of occupational tasks. All women did the same. Both sexes had a common goal: food production and the reproduction and rearing of children. While all human groups must achieve these basic goals, civilized peoples do so within a complex hierarchical labor system, wherein some occupational tasks are considered more worthy than others and are compensated accordingly.

Civilized division of labor has inevitably metamorphosed into a hierarchy of economic classes, ultimately resulting in the private ownership of the means of production by the “haves” and the lack of private ownership by the “have-nots.” This was unknown in uncivilized native society.

Native land, for example, was not actually “owned” in the contemporary sense at all. Natives were acutely aware that they, themselves, were products of the land; for them, claiming ownership of the land would have made as much sense as children claiming ownership of parents.

This is not to say that natives were not territorial, for they were highly territorial. But their territoriality was not based upon legalistic titles of private property. Access to communally-held food resources, not ownership of real estate, was their sine qua non for sustainable survival.

What natives shared in common they defended in common. Having no economic hierarchy, no one in their society could control the food supply of others, simply because no individual could claim exclusive ownership of the collective means of food production. Abundant resources were therefore abundant for all; if scarce, they were scarce for all.

True enough, when the Europeans arrived, they found native societies everywhere in conflict with their neighbors, but nowhere did they find endemic poverty, famine, disease or social degeneracy. Indeed, it was the self-reliant natives who helped feed the first generation of starvation-prone English colonists both at Jamestown and at Plymouth.

Once private ownership clamped down upon the landscape, virtually nobody would control their own food supply without some form of indebtedness to another. But since the resource stock of self-reliant food production, the land itself, would remain in place, private monopolist-owners required an economic mechanism to keep what remained within their grasp forever out of reach of others.

The Hand That Gives

As self-reliant native societies were decimated by debt, disease, and sword, ownership of the previously un-owned land was usurped by the conquerors. But it was not to be usurped equally by all of them.

Economic-class domination was problematical in British North America, because the economic pyramid that supplanted the communal native system was composed largely of people from the same Anglo ethnic group. It is one thing to justify violent economic domination of those with a “foreign” language, culture, religious sensibility and physical appearance; it is quite another to justify overlordship of those virtually indistinguishable from oneself.

Nevertheless, class domination was a stark fact of life in Colonial America where the economic division between masters and servants was sharp and where all land titles originally flowed down from the Crown to a short list of royal favorites, sycophants and lackeys. After the Revolution, however, maintaining economic class domination proved especially tricky, eventually requiring the drafting of an “all-American” document for that specific purpose.

Yet the solution to the problem of elite domination of a supposed “republic” had been imported, disease-like, from the Old World. In the centuries preceding Columbus’s arrival, two critical economic developments had transpired in Europe: the rise of an economy based upon metallic currency and the de facto repeal of the Biblical prohibition on lending currency at interest.

The upshot of these quiet revolutions was the replacement of sovereign currency, owned by kings and emperors, with private currency owned by the new economic elite known as bankers. In Antiquity, currency was owed by subjects to the monarch as tax. This was the reason for Joseph and Mary’s celebrated journey to Bethlehem.

But by the time of Columbus (and continuing to the present day) currency was to be owed to private persons by the monarch, who borrowed from them at interest to finance wars to protect and defend monarchical control over the means of food production.

To be sure, government still levied taxes as in the days of Jesus, but the tax revenue was no longer spent directly to conduct war; it went instead to repay principal and interest to bankers only too obliging to finance wars for personal gain.

Indeed, war and bank-indebted sovereigns are inseparable. The very first European bank to lend at interest was established during the Crusades by the Knights Templar; Spanish king Charles I squandered the vast majority of his conquistadors’ New World gold on paying crushing interest charges incurred during his long war in the Netherlands; and King William’s War against France was made possible by the establishment in 1694 of the Bank of England, the world’s first central bank.

Wherever modern war exists, governments are indebted to bankers. This is what prompted a cash-strapped Napoleon to observe: “When a government is dependent upon bankers for money, they and not the leaders of the government control the situation, since the hand that gives is above the hand that takes. Money has no motherland; financiers are without patriotism and without decency; their sole object is gain.”

It is hardly coincidental that the first bank in North America was chartered to supply arms for the American Revolution, and that the first central bank of the United States was chartered specifically to fund Revolutionary War debt. Although no banks had existed in British North America during the 174 years preceding the Declaration of Independence, America’s first commercial bank sprang forth in 1781, literally before the smoke had cleared from the American Revolution and nearly a decade ahead of Constitutional government.

This fact alone suggests where real economic power had been vested, long before the words “We the People” ever went to press.

Unlike in the days of Antiquity, wealth taken by force was not to be held by those who wielded the sword, but by those who financed the supply of swords. The means of North American food production, first expropriated under the banner of European imperial power, would be owned thereafter, in Republican America as in monarchical Europe alike, by new conquistadors called creditors.

Scarcity, Debt, and “Necessitous Men”

The first step toward monopolizing the food supply of the 99% was to replace self-reliance with debt bondage. The second step was to reserve private land ownership to particular individuals within the dominant class. This would be accomplished by imposing upon the land a wholly arbitrary number called a “price.”

The prime function of a monetary price was to render land unaffordable for all except a few creditor-entrepreneurs. Under the precious-metal-based system then in place, currency had been endemically scarce in North America, making this an easy task.

If American land, so recently filched from its native inhabitants, then re-filched from its imperial British overlords, were to be had by freedom-loving American common folk, it would be had on credit, and on the creditor’s terms. The fate of the Ohio land of the Old Northwest Territory provides a case in point.

In 1749, King George II granted Ohio land to a private corporation, the Ohio Land Company, whose shareholders included George Washington’s paternal uncles. Tellingly, the grant was bestowed 15 years before Britain actually established sovereignty over that land by winning the French and Indian War.

That war, which became the first world war in all of human history, was ignited, not incidentally, by George Washington himself when he ordered the murder of Indians and a French nobleman upon Ohio land Washington likely considered private property, possibly his own.

It would yet require the American Revolution, three more Indian Wars, all under Washington’s presidential authority, plus a good deal of diplomatic treachery, finally to “open” the land for Anglo settlement.

But its first owners were not to be hardy frontiersmen and their growing families. In the meantime, select government committees, again not incidentally, had priced the land out of reach of those most in need, so ownership fell into the hands of well-heeled real estate speculators, including, of course, Washington himself.

With the Ohio land grab as background, let us consider the principle of scarcity as it relates to trade. All trade is predicated upon exchanging something one possesses for something one would prefer to possess instead. If there were no such thing as scarcity (that is, if all persons already possessed sufficient quantities of the stuff they need), nobody would have the inclination to trade anything at all.

What, then, about debt? No self-regarding person would voluntarily borrow currency at interest if they already possessed enough currency to exchange for the stuff they need to live. If this is so, then for creditors to profit by lending at interest, a single condition is always necessary: persons must not be allowed a sufficient quantity of currency to complete the desired exchange. When currency is scarcer than the goods for which it is to be exchanged, prospective buyers have only two legal options: do without, or borrow.

If the consequences of doing without (starvation or homelessness, for example) are sufficiently unacceptable, and if a creditor is available, then a debtor is certain to emerge. This emphatically does not imply that the debtor is always a willing party to a debt obligation.

Indeed, the entire logic of indebtedness implies the opposite. No rational entity (business, government, or individual person) puts itself into interest-bearing debt if it need not do so. Put simply: debtors are the needy who can no longer afford self-reliance.

Franklin Delano Roosevelt expressed this reality eloquently in 1944 when he declared to the nation, “Necessitous men are not free men.” He was mostly preaching to the choir. The American 99% had known this truth since indentured-servant pioneers had waded ashore at Jamestown in 1607 only to find upon their emancipation that all the valuable land had already been monopolized by the planter elite.

Having failed to learn the role debt played in the exploitation of the New World, modern Americans fail to perceive that they have inherited the same debt system foisted upon the natives by colonizing Europeans, and later by American elites upon their own people. Consequently, we think of debt only as a contract made voluntarily between two consenting adults.

But in America today every newborn child is a predestined debtor, no matter what he or she will consent to in the future. Once the free land tenure of communal native societies was destroyed, it was never to return, not even for the progeny of the destroyers.

Debt-bondage today takes a more subtle form. If Americans are no longer debt-peons, forced to slave away on the landlord’s estate or starve, it is only because the landlords no longer care where we slave away. Landlords extract their monthly payments regardless. Today about 25% or more of the earnings of workers flows as rent to landlords or as debt payments to mortgagees.

Indeed, fully half of the debts held by commercial banks alone are tied to real estate. Nor do Americans grasp the staggering price tag affixed to land: for example, economist Michael Hudson reports that the dollar valuation of real estate in New York City alone is higher than that of the industrial plant of the entire nation.

As 2016 arrived, American households owed about $13.8 trillion in home mortgage debt, which effectively is a rent payment to the mortgagee who holds the title to the property. All persons without a mortgage must pay rent to a landlord. All persons without mortgages or landlords must still pay property taxes until death, at which time any unpaid taxes (in an echo of the Potosi miners’ debt-peonage system) must be paid by their indebted heirs.

Thus, the entire landmass of the Western Hemisphere, which for over 20,000 years had been the source of self-reliance for all its human inhabitants, remains hostage to the same debt system that seized it and which now extracts ransom from the 99% who would claim the least corner of it as home.

Debt vs. Dollars 

In Early America, the scarcity of precious-metal coinage (specie) was a perennial problem. So scarce was gold that the first coinage act, passed by Congress in 1792, defined the U.S. dollar as “each to be the value of a Spanish milled dollar as the same is now current, and to contain four hundred and sixteen grains of standard silver.”

Thus the United States dollar was literally established on a “silver standard,” and it was Spanish silver at that! The now oft-fetishized American “gold standard” for currency would not officially materialize until the Twentieth Century, with the passage of the Gold Standard Act in 1900. It did not last long.

After only 33 years, the Gold Standard was abandoned by the Emergency Banking Act of 1933, which was passed to remedy (what else?) the scarcity of currency during the Great Depression. (All the world’s major currencies dropped the gold standard during this crisis.)

Subsequently, the gold standard was never re-employed domestically in the U.S. And even the dollar’s international convertibility to gold was finally declared extinct, courtesy of President Richard Nixon, in 1971.

It was appropriate for gold to tread the path of the dinosaur, for it had come to resemble one. Though it is evident that currency must be scarce to some degree in order to maintain its exchange-value purchasing power, all precious-metal-backed currencies suffer from their dinosaur-like tendency of becoming increasingly scarce.

As the volume of trade increases in a growing economy, any medium of exchange based upon a finite quantity of metal cannot keep pace. The result is a constant increase in the purchasing power of the metal-backed currency, which becomes ever-more scarce in proportion to the number of persons needing to possess it.

This, in turn, leads to “necessitous men” becoming debtors to those in possession of the metal: sometimes to governments, sometimes to businessmen, but always to bankers.

Historically, American currency has usually consisted of a mish-mash of specie, paper money, and debt, often accompanied by copious reserves of flimflammery. Thus, purchasing power has always been based partially upon some form of debt obligation, not just gold or silver coin.

During the Colonial Era, “bills of exchange” (essentially IOUs of transferable debt) served as currency for the wealthy and the merchant class. Later, “bills of credit,” un-backed fiat currency issued by the Continental Congress as paper money, financed the Revolution, along with some $60 million of private domestic lending.

Finally, under the National Banking Act of 1863, paper currency issued by national banks was backed by the amount of federal debt (IOUs) owned by the bank. This officially made the most popular medium of exchange broadly based upon national (war) debt. In addition, the government spent into existence Greenback fiat money, which was backed by nothing at all.

When in 1913 the Federal Reserve Bank (the Fed) began issuing Federal Reserve Notes, they were at first redeemable either in gold or “lawful money,” but their redemption was eventually cancelled by the Emergency Banking Act noted above. Since then, all “backed” currency has been withdrawn from circulation, leaving Federal Reserve Notes (dollar bills), exchangeable only for debt owned by the privately owned Fed, as the sole legal tender of the United States.

Although the gold standard is long gone, “necessitous men” remain. This is not happenstance.

This is the result of applying the mentality of gold-standard scarcity to the modern creation of financial debt, which is but the latest scheme by which the means of production of food (that is, the vast landscape of North America) is owned by the few and rented out to the rest at interest.

“Gold Bug” libertarians decry the absence of a federal statute defining the value of a U.S. dollar, but they miss the point. No doubt the “value” of a dollar matters a great deal to those who hoard them by the billions; but dollar “value” is less a concern for the 99%, who are allowed to earn so few dollars that they are forced to borrow them at interest from the hoarders.

For the 99%, wages are the source of the currency we all must possess in order to live. But none of us actually controls how much currency we earn (or, quite often, how much of it we must spend); as a result, we do not control how much we may be forced to borrow.

Finally, we certainly do not control the conditions that will be imposed upon us if we must borrow (or whether we will be allowed to borrow at all). Put bluntly, the economic lives of wage earners are not in their own hands, but rest in the “hand that gives.”

We have only to be thankful we live in a “free country,” whatever in the world that is supposed to mean.

How the West Was Owed: Redux

In the years following the Revolution, some 90% of the American population subsisted as farmers. American women in the coming century would, on average, give birth to eight children who lived into adulthood. Accordingly, not only did the population double every 30 years, but the demand for additional farms more than trebled every generation.

Moreover, uninformed agricultural practices rapidly destroyed topsoil, sending “dirt poor” farmers streaming westward in search of land more fertile than their wives. As the frontier chased the setting sun, speculators and corporate agents raced ahead like locusts, devouring the landscape on the cheap, renting or re-selling at the price dictated by the demographics of desperation.

When American school kids are taught about Conestoga wagon “prairie schooners” creaking along the Oregon Trail, or the Oklahoma “Sooners” land grab, or the California Gold Rush, they are encouraged to view these events as technicolor visions of a unique American Opportunity unavailable to lesser mortals.

In truth, the 99% headed west simply because there was no place left for them to go. These economic refugees no doubt saw opportunity before them, but a great many of them must have perceived it the same way the many Third Class passengers on the Titanic viewed the opportunity offered by a lifeboat with an empty seat.

In 1893, historian Frederick Jackson Turner enunciated his famous “Frontier Thesis.” He claimed that the American frontier experience had produced a unique form of democratic culture, increasingly hostile to social and economic hierarchy as it spread from the Atlantic to the Pacific. The Frontier Thesis is a powerful idea, especially in its view of the frontier as an evolutionary process, which it was.

But Turner’s thesis, perhaps unintentionally, reinforced both the preexisting theocratic Puritanical creed of American Exceptionalism and the ordained racism of Manifest Destiny, both of which were based upon a belief in the sanctity of Anglo-American economic domination of the New World.

For 40 years, Turner continued to proselytize his prototype of Hollywood Americanism as the rapidly industrializing United States rose to world-power status, and he became a celebrity doing so. Had Turner spent an hour or two of that time pondering the deeper implications of his own home mortgage, he might have discovered a more profound reality.

While the American frontier had indeed fundamentally and irrevocably revolutionized the economic relationship between human beings and their control of the North American landscape, the revolution was not achieved by abandoning European principles of social and economic hierarchy, but by transplanting them to the Western Hemisphere.

To view the conquest of North America as the triumphant flowering of democratic liberty and affluence over bestial savagery and abject poverty is so factually vacant, so morally and economically perverse, as to be considered hallucinatory.

Native people literally had no words to describe the cataclysm that had destroyed them. It was left to follow-on generations of “necessitous men” to learn the vocabulary of servitude needed to describe their economic lives.

Here is only a part of the terminology “necessitous men” needed to learn: poverty level, payday loans, food stamps, interest rate, surcharges, eviction, unemployment rates, lock-outs, foreclosure, bankruptcy, credit scores, down payment, damage deposits, credit limit, collection agency, mortgages, user fees, closing costs, title loans, bail outs, insolvency, title insurance, origination fee, installment plans, tax levy, deed restrictions, market crashes, illiquidity, non-sufficient funds, minimum payment due, late fees, lay-offs, property lien, pawn tickets, collateral, tax withholding, service fees, forfeiture, inflation, deflation, stagflation.

Is this the vocabulary of free people?

As of January, 2016, Americans (government, business and individuals) owed an estimated $65,000,000,000,000 ($65 trillion) in total debt, with the average citizen’s share about $200,000. Of course, these are aggregate figures: many individuals and businesses owe more, many owe less, but everybody owes.

Now consider: the annual median income (mid-point, not the average) of American citizens is $29,000; median family savings is below $9,000; and the median price of a new home is over $294,000. Personal debt (not including national and business debt) per citizen is $54,000, or about twice the median income.

It does not require Napoleonic genius to grasp the fact that the “hand that gives” can load more debt onto the 99% than they can ever possibly repay. This is not a mistake or a “conspiracy”; it is simply a business plan. For every citizen’s $200,000 share of debt, some other entity or citizen expects to collect $200,000 plus interest. Will such a business plan succeed? Ask a Taino or a Wampanoag if you can find one handy.
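The per-citizen figures above follow from simple division. Here is a quick back-of-envelope check; note that the 320 million population figure is an assumption of mine (roughly the 2016 U.S. population) and does not appear in the text:

```python
# Back-of-envelope check of the debt figures cited above.
total_debt = 65_000_000_000_000   # $65 trillion in total U.S. debt (cited figure)
population = 320_000_000          # assumed 2016 U.S. population (not from the text)

per_citizen = total_debt / population
print(f"Per-citizen share of total debt: ${per_citizen:,.0f}")

# Personal debt relative to median income (both figures cited above)
median_income = 29_000
personal_debt = 54_000
print(f"Personal debt is {personal_debt / median_income:.1f}x the median income")
```

Dividing $65 trillion by roughly 320 million people does indeed yield a per-citizen share on the order of $200,000, consistent with the author’s figure.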

There is an historical anecdote of a question posed long ago by a Cherokee. “White Brother,” he said, “when you first came to this land, there were no debts. There were no taxes. And our women did all the work. Do you expect me to believe you can make this situation better?”

The 99% should know the answer. And women still do most of the work.

Jada Thacker, Ed.D., is a Vietnam veteran and author of Dissecting American History. He teaches U.S. History at a private institution in Texas. Contact: jadathacker@sbcglobal.net




Taking Aim at the Israeli Boycott

Mainstream U.S. presidential candidates are lining up behind Israel’s demand that the next “leader of the free world” take aim at Americans who express their contempt for Israel’s persecution of Palestinians through a boycott, as Lawrence Davidson describes.

By Lawrence Davidson

Most readers will know that the United States has served as the patron of Israel for decades. Why has it done so?

The commonly given reasons are suspect. It is not because the two countries have overlapping interests. The U.S. seeks stability in the Middle East (mostly by supporting dictators) and Israel is constantly making things unstable (mostly by practicing ethnic cleansing against Palestinians, illegally colonizing conquered lands and launching massive assaults against its neighbors).

Nor, as is often claimed, is the alliance based on “shared Western values.” The U.S. long ago outlawed racial, ethnic and religious discrimination in the public sphere. In Israel, religious-based discrimination is the law. The Zionist state’s values in this regard are the opposite of those of the United States.

So why is it that a project that seeks to pressure Israel to be more cognizant in foreign affairs of regional stability, and more democratic and egalitarian in domestic affairs, is now under fire by almost every presidential candidate standing for the 2016 election?

That project in dispute is BDS, the Boycott, Divestment and Sanctions movement, promoted by civil society throughout the Western world. BDS is directed at Israel due to its illegal colonization of the Occupied Territories and its general apartheid-style discrimination against non-Jews in general and Palestinians in particular.

The Candidates and BDS

With but two exceptions, every presidential candidate in both parties is condemning the BDS Movement. Let’s start with the two exceptions.

The first exception is the Green Party candidate Jill Stein, who has taken the accurate position that “the United States has encouraged the worst tendencies of the Israeli government.” She has pledged to use both diplomatic and economic means to change Israeli behavior, behavior which she rightly believes is in contravention of international law and violates human rights.

The second exception is the Republican candidate Donald Trump, who recently told a meeting of Jewish Republicans that he didn’t think Israel is serious about peace and that they would have to make greater efforts to achieve it. When he was booed he just shrugged and told the crowd that he did not care if they supported him or not, “I don’t want your money.” Unfortunately, this appears to be the only policy area where Mr. Trump is reasonable.

Jill Stein gets absolutely no media coverage and Donald Trump gets too much. And neither is in the “mainstream” when it comes to American political reactions to BDS. However, the rest of the presidential candidates are. Here is what is coming out of the “mainstream”:

Jeb Bush (Republican), Dec. 4, 2015: “On day one I will work with the next attorney general to stop the BDS movement in the United States, to use whatever resources that exist” to do so.

Ted Cruz (Republican), May 28, 2015: “BDS is premised on a lie and it is anti-Semitism, plain and simple. And we need a president of the United States who will stand up and say if a university in this country boycotts the nation of Israel then that university will forfeit federal taxpayer dollars.”

Marco Rubio (Republican), Dec. 3, 2015: “This [BDS] coalition of the radical left thinks it has discovered a clever, politically correct way to advocate Israel’s destruction. As president, I will call on university presidents, administrators, religious leaders, and professors to speak out with clarity and force on this issue. I will make clear that calling for the destruction of Israel is the same as calling for the death of Jews.”

Hillary Clinton (Democrat), July 2, 2015: In a letter to Haim Saban, who is a staunch supporter of the Zionist state and also among the biggest donors to the Democratic Party, she said, “I know you agree that we need to make countering BDS a priority, I am seeking your advice on how we can work together – across party lines and with a diverse array of voices – to fight back against further attempts to isolate and delegitimize Israel.”

Bernie Sanders (Democrat), Oct. 20, 2015: “Sanders’ fraught encounter with BDS supporters who challenged his defense of Israel at a town hall meeting in Cabot [Vermont] last year was captured on YouTube.” Sanders told them to “shut up.”

The Legitimacy of Boycott

This hostility to the tactic of boycott runs counter to both U.S. legal tradition and the country’s broader historical tradition.

For instance, advocating and practicing BDS can be seen as a constitutionally protected right. It certainly is more obviously protected by the First Amendment’s guarantee of free speech than is the use of money to buy elections.

Thus, if Zionist lobbyists can use money to buy support for Israel, why can’t anti-Zionists use their free speech rights to challenge that support? It should be noted that, in this regard, many Americans of voting age think it is the Zionists, and not the anti-Zionists, who have gone too far.

According to a December 2015 Brookings Institution poll, 49 percent of Democratic voters and 25 percent of Republican voters think that Israel has too much influence with U.S. politicians. Those supporting BDS in the United States might give some thought as to how to use these numbers to uphold their cause.

Then there is the fact of well-established historical tradition. The war for American Independence was built upon a framework of boycott. In November 1767, England introduced the Townshend Acts, requiring the colonists to pay a tax on a large number of items. The reply was a boycott of British goods by many colonial consumers, eventually followed by a boycott on the importation of such goods on the part of colonial merchants.

Subsequently, Americans have used the tactic of boycott against:

- (1930s) Goods produced by Nazi Germany
- (1960s and 1970s) California-grown grapes, in support of the United Farm Workers
- (1970s and 1980s) All aspects of the economy and cultural output of South Africa
- (1980) The Moscow-hosted Olympics
- Myriad boycotts of various companies and products, ranging from Nestle (baby formula) to Coca-Cola. See the list given by the Ethical Consumer.

The reality is that the tactic of boycott has long been as American as the proverbial apple pie.

Apple pie notwithstanding, the legal and historical legitimacy of boycott no longer has much impact on the attitudes of presidential candidates or, for that matter, members of Congress. Nor does the fact that the changes the BDS movement seeks to make in Israeli behavior would be to the benefit of U.S. interests in the Middle East.

Instead what the positions of the candidates seem to indicate is that there will be an almost certain attack on the Boycott, Divestment and Sanctions movement, coming from the very highest levels of U.S. power, sometime soon after the 2016 elections.

How is it that such a contradiction between national interests and established tradition, on the one hand, and imminent government policy, on the other, can exist? The answer is not difficult to come by. It is just a matter of fact that constitutional rights, historical tradition, and indeed the very interests of the nation can be overridden by special interest demands.

These are the demands of what George Washington once called “combinations and associations” of “corrupted citizens” who would “betray or sacrifice the interests of their own country” in favor of those of some other “favorite nation.” It is exactly such demands that are now given priority by the politicians in Washington.

This form of corruption will go on as long as the general public does not seem to care that it is happening. And it is sadly clear that the BDS activists alone cannot overcome this indifference. Thus, the politicians can dismiss the Brookings poll numbers mentioned above. They can shrug and say, “So what?”

As long as that majority does not express their opinion by actively demanding a change in the situation, as long as they are not successfully organized to do so, their opinion cannot compete with the millions of special interest dollars flowing into political campaigns.

In many ways our greatest enemy is our own indifference to the quiet erosion of important aspects of the democratic process. Allowing the attack on BDS only contributes to this disintegration of rights.

A combination of parochialism and ignorance sets us up for this indifference. However, in the end, there can be no excuse for not paying attention. One morning you will wake up to find that valued rights and traditions are no longer there for you.

Lawrence Davidson is a history professor at West Chester University in Pennsylvania. He is the author of Foreign Policy Inc.: Privatizing America’s National Interest; America’s Palestine: Popular and Official Perceptions from Balfour to Israeli Statehood; and Islamic Fundamentalism.




Anger and Ugly Faces of Bigotry

Millions of Americans feel disenfranchised by the political establishment, numbers reflected in the populist candidacies of Donald Trump and Bernie Sanders, but this sense of betrayal often pours out in ugly expressions of bigotry, as Michael Winship observed.

By Michael Winship

Philadelphia, Cradle of Liberty and City of Brotherly Love, was anything but on New Year’s Day. Visiting with family, we’d all decided to meet up at the annual Mummers’ Parade.

Now, it is well established that I am an unabashed lover of parades, both as spectator and participant, having marched in protests, firemen’s carnivals, sugar beet festivals, and many other events. In high school, I was even in a freelance marching band, sort of a mobile garage band with a drum major and a couple of trombones.

So I’m a fan, and I remember watching the Mummers Parade on TV when I was a kid. Seeing it live and in person, the pageantry’s even more impressive. Reminiscent of New Orleans’ fabled Mardi Gras krewes, the various “brigades” in the line of march feature elaborate costumes, floats and superb musicianship. But unlike New Orleans, the marchers are overwhelmingly white. And their comedy skills leave a lot to be desired.

The 116-year-old Philadelphia event, perhaps the oldest of its kind in the United States, is troubled, “dogged by controversy,” as Angela Bronner Helm writes at The Root: “Minstrelsy has always [been] a part of the Mummer DNA, including blackface. But in 1964, under pressure from the NAACP, blackface was officially banned, yet it seems to find its way into the parade each year. This year, ‘brownface’ was the insult du jour, with Sammar Strutters’ ‘Siesta Fiesta’ revelers dressed as tacos with brown face paint, including children.”

But that wasn’t all, not by a long shot (for one thing, Helm forgot to mention the sombreros and serapes). In addition to the overall public drunkenness of the rowdy young crowd on the streets (I’m no prude but this was out of control), one of the other brigades in the parade crudely mocked Caitlyn Jenner, and one member was subsequently kicked out of the group for publicly spewing anti-gay bigotry.

Philadelphia’s new progressive mayor Jim Kenney tweeted that it was “hurtful. Our Trans Citizens do not deserve this type of satire/insult.” What’s more, a gay man walking his parents’ dog reportedly was assaulted by four mummers he came upon urinating in an alley.

In fairness, organizers have made some small steps toward increasing the parade’s diversity. But then there was this.

As we stood in bleachers near City Hall, demonstrators from the Black Lives Matter movement arrived on the corner, anywhere from fifty to a hundred people with signs, shouting for racial and economic justice. Police on bicycles kept them from moving into the parade. Two were arrested.

The protesters were expressing their right to speak out and to focus the rest of us on some real problems. But many of those around us exploded in rage, screaming epithets and hatred at the demonstrators. One of them, her face contorted in fury, reminded me of the infamous New Orleans “cheerleaders,” the forty or so white women who in 1960 stood outside William Frantz Elementary every morning hurling threats and invective at six-year-old Ruby Bridges as she became the school’s first black student.

It was ugly and frightening. New Year’s blood-alcohol levels certainly played a role, but sober or not, these were working class men and women in Philadelphia lashing out, just as the right would have them do, in an incoherent frenzy of indignation without reason or solution.

Racist phantoms distract from and overpower the self-interest that should have them out in the streets protesting their lack of justice, jobs and economic security just as vehemently as the Black Lives Matter movement demonstrates against police killings and so many other criminal violations of basic human rights.

Those furious Philadelphians aren’t alone. A new Esquire magazine/NBC News poll finds half of all Americans angrier than a year ago. Esquire’s editors write:

“From their views on the state of the American dream (dead) and America’s role in the world (not what it used to be) to how their life is working out for them (not quite what they’d had in mind), a plurality of whites tends to view life through a veil of disappointment.

“When we cross-tabulate these feelings with reports of daily anger (which are higher among whites than nonwhites), we see the anger of perceived disenfranchisement — a sense that the majority has become a persecuted minority, the bitterness of a promise that didn’t pan out, rather than actual hardship. (If anger were tied to hardship, we’d expect to see nonwhite Americans, who report having a harder time making ends meet than whites, reporting higher levels of anger. This is not the case.)”

As our presidential election year finally, officially begins, Democratic pollster Celinda Lake tells The Wall Street Journal that Americans “think the future is weak for themselves and the next generation, and they despair of politicians, especially in Washington, getting anything done.”

Then, pugnacious Republican pollster Frank Luntz says, “This election is about settling scores and getting even with everyone and everything” — a distressing commentary on how low our alleged democracy has sunk, and the very heart of the Trump phenomenon that appeals to the most resentful Americans. Other GOP presidential aspirants have fallen into step, hoping to catch some of that Trump “magic,” like flies to the rubbish heap.

Republicans and Democrats alike try to channel all this inchoate animosity into political support, even as they express bafflement at the widespread bitterness, but they reap what they sow, a whirlwind of their own creation, a betrayal of principle and nation crafted from years of kowtowing to big business and the very, very rich who bankroll their electoral desires. They have parlayed our ignorance, fear and basest instincts into power for themselves and their allies.

All those years neglecting the rest of us, all those jobs and factories shipped overseas, all the tax breaks and regulatory payoffs, financiers treated as indulged scofflaws, and the evisceration of housing, infrastructure and education are turning us into a nation of numbskulls. Now that’s what I call exceptionalism. Thanks.

Powerbrokers and political sycophants, from Philadelphia to Pasadena, you’re marching this American parade straight into the pit. You don’t have to look beyond our borders for the enemy that will bring us down. Just into a mirror.

Michael Winship is the Emmy Award-winning senior writer of Moyers & Company and BillMoyers.com, and a former senior writing fellow at the policy and advocacy group Demos. Follow him on Twitter at @MichaelWinship.





Not Taking Sides in Saudi-Iran Fights

Many U.S. pols and pundits fret that Saudi Arabia’s feelings are hurt by the Obama administration’s opening to Iran, but they conveniently forget Saudi support for terrorism and other acts harmful to the American people, as ex-CIA analyst Paul R. Pillar explains.

By Paul R. Pillar

The recent intensification of Saudi-Iranian tension also has intensified the all-too-habitual urge, in debate about U.S. foreign policy, to take sides in other nations’ conflicts even in the absence of any treaty obligations to do so or good U.S.-centered reasons to do so. That urge has multiple sources.

Some may be common to humankind in general, growing out of ancient life amid warring tribes and clans. Other sources are more specific to Americans and are related to an American tendency to view the world in Manichean good-vs.-evil terms. The latter sources are rooted in several aspects of the American national experience. Whatever the combination of underlying reasons, the side-taking tendency is usually not good for U.S. national interests. The Saudi-Iranian rivalry illustrates why.

Any balance sheet that carefully takes account of the attributes, interests and objectives of Iran and Saudi Arabia does not yield a sound case for the United States to favor either side of that rivalry, and specifically not for the dominant tendency to consider Saudis as the good guys and Iranians as the bad ones.

Consider, for example, the political structure of each state. Saudi Arabia is one of the most undemocratic and politically backward countries in the world. It is ruled as a family enterprise in which ordinary citizens have barely begun to be granted any political role.

The convoluted Iranian constitutional structure also has undemocratic elements, especially in the power of the Guardian Council to arbitrarily disqualify candidates for public office. But it still has significantly more democratic qualities than Saudi Arabia, with elections for a legislature and the presidency that really mean something. By Middle Eastern standards, which isn’t saying a lot, Iran is one of the most democratic countries in the region.

Both countries have substantial deficiencies regarding consistent application of the rule of law. The secretive and politically manipulated judiciary in Iran leads to such injustices as the incarceration of American journalist Jason Rezaian. But Saudi justice isn’t appreciably better. Longtime Saudi watcher Thomas W. Lippman writes of “Saudi Arabia’s record of mass arrests” and “secret rigged criminal trials.”

Personal liberties run into snags in both countries, but probably more so in Saudi Arabia, the country where women still are not even permitted to drive a car. In Iran, things have loosened up visibly since the early years after the Iranian revolution, with hijabs inching up to show more female hair and gatherings of people in public places looking somewhat more like scenes in the West.

In both countries the role of religion represents significantly different values from those of the United States. Saudi Arabia considers the Koran to be its constitution, and Iran calls itself an Islamic republic, with a disproportionate political role for Muslim clerics.

But of the two, religious restrictions are greater in Saudi Arabia, where legally there is zero freedom of religion. Any religious practice other than that of the approved version of Sunni Islam takes place only furtively and illegally behind closed doors in private residences.

In Iran there certainly is religious discrimination, most notably but not exclusively against people of the Baha’i faith. But the Iranian state officially recognizes religious minorities, including Christians, Jews, Zoroastrians and others, and permits them to practice their religion.

As for foreign policy, where hard American interests and not just American values are most involved: despite the habitual recitation of the familiar mantra about Iran “destabilizing” the region, the mantra simply does not reflect actual Iranian behavior. Destabilization is a term more accurately applied to Saudi actions in the region.

David Ignatius aptly writes that “Saudi Arabia’s insecurities have been a driver of conflict for 40 years. Fearful of domestic threats, the Saudis bankrolled PLO terrorism, jihadist madrassas, al-Qaeda’s founders and Syrian warlords.”

Looking specifically at international terrorism, Saudi policies and practices, including the intolerant Wahhabist ideology, the Saudi habit of foisting onto other countries the violent extremism that the ideology has incubated, and the actions that Ignatius mentions, have done much more to foster terrorism, and specifically the brand of terrorism that most threatens U.S. interests today, than anything Iran is doing. In Iraq and elsewhere, Iran is today on the opposite side of conflicts from that brand of terrorism.

The strong preference among many Americans to be on the opposite side from Iran of any conflict in which it is involved has multiple roots. Bad historical memories, especially of the 1979-1981 Tehran hostage crisis, have something to do with this. So does the political clout in the United States of certain Middle Eastern governments (not only, or even mainly, the Saudi one) that base their political and diplomatic strategy on eternally keeping Iran a bête noire.

Such reasons do not represent a rational pursuit of U.S. interests, and they do not take account of the considerations mentioned above when it comes to forming attitudes toward conflict between Iran and Saudi Arabia.

It would be just as much of a mistake for the United States to tilt in favor of Iran in this conflict as it is to tilt in favor of Saudi Arabia. Taking either side in this rivalry, as with many other international rivalries, entails several disadvantages for the United States.

The fundamental disadvantage is that taking sides commits the United States to objectives and interests that are someone else’s, not its own. An objective such as getting the upper hand in a local contest for influence may be a very rational objective for a local power to pursue, but that is not the same as what is in U.S. interests.

Some of the objectives and policies, as is true with Saudi Arabia, may not even be very rational for the local power itself. Internal political weaknesses and rigidity may lie behind some of the local power’s policies, as is true of the apparent Saudi inability to recognize the long-term threat that radical Salafism poses to Saudi Arabia itself and to shape policy accordingly.

Sheer emotion may underlie other policies, as with how the Saudi obsession with toppling Syrian President Bashar al-Assad is related to possible Syrian involvement in the assassination of Lebanese Prime Minister Rafik Hariri, who had close ties with Saudi Arabia.

Another disadvantage for the United States of taking sides in a conflict is that doing so immediately subjects the United States to resentment and disapproval because of whatever baggage has come to be associated with the conflict, in addition to whatever the immediate issues ostensibly are.

The current state of Saudi-Iranian relations is a function not just of last week’s execution of the Shia activist cleric but of several other things. One of the most prominent sore points in recent months, for example, has been the fatal stampede at last year’s hajj, in which hundreds of Iranian pilgrims died. Iranians have been understandably infuriated with Saudi Arabia for letting this incident happen. Anyone taking Saudi Arabia’s side on anything at issue with Iran right now may seem to be insensitive to this tragedy.

Related to the point about associated baggage is the strong sectarian flavor of the conflict. For the United States to be seen taking sides in a conflict between Sunni and Shia, amid the highly charged sectarian tensions along this fault line in the Middle East, can only be a lose-lose proposition for Washington. The United States is much more likely to be seen as an enemy of some part of Islam than as a friend of some other part of it.

A further disadvantage of taking sides is that it reduces the opportunities for U.S. diplomacy, which serves U.S. interests best when the United States can do business with anybody and everybody. Shrewd U.S. diplomacy exploits local rivalries to obtain leverage and to play different rivals against each other for the United States’ own advantage.

Stupid U.S. diplomacy would cut in half the number of other countries the United States can effectively deal with by declaring half of them to be on the “wrong” side of local conflicts. Diplomacy does not work well when one is using only carrots with some countries and only sticks with others.

Finally, one should always be wary of the danger of getting sucked into larger conflicts because of involvement with the spats of lesser states. The European crisis in the summer of 1914 is the classic case of this.

An equivalent of World War I is unlikely to break out in the Middle East, but this is just one of the costs and risks that constitute good reasons for the United States not to make as its own the quarrels of others, no matter how deeply ingrained is the habit of talking about certain states as allies and certain others as adversaries.

Paul R. Pillar, in his 28 years at the Central Intelligence Agency, rose to be one of the agency’s top analysts. He is now a visiting professor at Georgetown University for security studies. (This article first appeared as a blog post at The National Interest’s Web site. Reprinted with author’s permission.)




Saudi Arabia’s Dangerous Decline

Much of Official Washington still toes the Saudi line against Iran, in part because Israel shares that hostility, but that antagonism is putting the world at greater risk as Saudi Arabia demonstrates increasingly reckless and barbaric behavior, the sign of a declining power, says Trita Parsi.

By Trita Parsi

The escalating tension between Saudi Arabia and Iran is the story of a declining state desperately seeking to reverse the balance of power shifting in favor of its rising rival.

History teaches us that it is not rising states that tend to be reckless, but declining powers. Rising states have time on their side. They can afford to be patient: They know that they will be stronger tomorrow and, as a result, will be better off postponing any potential confrontation with rivals.

Declining states suffer from the opposite condition: Growing weaker over time, they know that time is not on their side; their power and influence are slipping out of their hands. So they have a double interest in an early crisis: first, because their prospects of success in any confrontation will diminish the longer they wait, and second, because of the illusion that a crisis may be their last chance to change the trajectory of their regional influence and their prospects vis-à-vis rivals.

When their rivals, who have the opposite relationship with time, seek to de-escalate and avoid any confrontation, declining states feel they are left with no choice but to instigate a crisis.

Saudi Arabia is exhibiting the psychology of a state that risks losing its dominant position and whose losing hand is growing weaker and weaker. This explains why an otherwise rational actor begins making seemingly panicky and incomprehensible moves.

From its decision to give up a seat on the United Nations Security Council, after having campaigned for it for over a year and celebrated its election to the UN body only a day earlier, to its reckless and failing attack on Yemen, to its push against the nuclear deal with Iran, to the deliberate provocation of executing Shia political dissident Nimr al-Nimr, its conduct is that of a sun-setting power.

Iran, on the other hand, is by all accounts a rising power. Ironically, much of Iran’s rise is not due to its own actions, but must be credited to the reckless mistakes of its adversaries.

The U.S. invasions of Afghanistan and Iraq eliminated Tehran’s primary nemeses to its east (the Taliban in Afghanistan) and its west (the Saddam Hussein regime in Iraq). In addition, Iran’s own Machiavellian maneuvering ensured that it, and not the U.S., has become the most influential outside actor in those two countries.

Even though the Syrian civil war has been very costly to Iran in terms of resources, soft power and standing in the Arab world, Tehran views the survival of its ally, the Bashar al-Assad regime, as reconfirmation of Iran’s power and deterrence.

Although Iran cannot be declared a winner of the Arab spring, it has probably lost the least compared to Saudi Arabia, Turkey and the U.S. Moreover, the nuclear deal has opened the door for Iran’s rehabilitation among the community of nations. Once a pariah in the eyes of many key states, Iran exercises power and influence in the region that is now increasingly accepted.

Furthermore, the European Union has made no secret that it views the nuclear deal as a first step towards a broader rapprochement with Iran and recognizes that the international community must work with Iran in order for it to be a force for stability.

In fact, the EU’s support for reengagement with Iran is partly driven by its assessment that the West’s current relationship with Saudi Arabia isn’t sustainable. As the New York Times has reported, in the current standoff between Saudi Arabia and Iran, EU sympathies tend to lean toward Tehran.

To make matters worse for the Saudis, the Chinese have shifted their position in the Persian Gulf to reduce their dependency on Saudi Arabia and strengthen their ties with Iran.

“China wants stability in the Persian Gulf,” an analyst close to the Chinese government recently told me, “and it sees Iran as the most stable country in the region, while it is very worried about Saudi conduct.”

Yet, despite all of these windfalls for Iran, it is not yet acting consistently as a rising power. The patience and prudence characteristic of rising states, whose path to greater influence has been paved by the international community’s approval, certainly were not on display when a crowd of angry protesters attacked the Saudi Embassy in Tehran and torched it while Iranian police largely stood by and watched.

There’s a duality in Iran’s conduct. There’s the more mature and prudent approach led by President Hassan Rouhani and Foreign Minister Javad Zarif. Their leadership gave much of the international community hope that Iran can act as a responsible rising power.

But there is also a reactionary and intransigent segment led by a powerful minority of hardliners who see their own power protected through Iran’s continued isolation and conflict with the outside world. Their conduct is more reminiscent of a declining, anti-status quo power.

This internal tension does not bode well for the region or for Iran. The international community’s willingness to bet that a more powerful Iran will be a more responsible and prudent Iran is contingent upon this contradictory behavior coming to an end.

The Rouhani government appears to recognize this. The Iranian president quickly condemned the attack on the embassy and called it “totally unjustified.” But perhaps more importantly, conservative voices have also come out and blasted the attack. Brigadier General Mohsen Kazemeini of the Islamic Revolutionary Guards Corps condemned the torching of the embassy as “totally wrong” and as an “ugly, unjustifiable act.”

It took almost a year before hardliners in Iran grudgingly admitted that the 2011 sacking of the British embassy was wrong. But for the first time now, hardliners are paying a price and facing resistance almost immediately after committing a transgression of international norms and law.

But for Iran to rise as geopolitical stars align in its favor, condemnations after a transgression are not enough. “Totally unjustified” acts must be prevented, not just denounced. The region simply cannot afford having both of its leading powers acting like declining states.

Trita Parsi is founder and president of the National Iranian American Council and an expert on US-Iranian relations, Iranian foreign politics, and the geopolitics of the Middle East. He is also author of Treacherous Alliance: The Secret Dealings of Iran, Israel and the United States. [This article first appeared as an op-ed on AlJazeera. http://america.aljazeera.com/opinions/2016/1/the-power-logic-behind-riyadhs-moves.html]




Reality Peeks Through in Ukraine

Exclusive: With corruption rampant and living standards falling, Ukraine may become the next failed state that “benefited” from a neoconservative-driven “regime change,” though the blame will always be placed elsewhere, in this case on the demonized Russian President Putin, writes Robert Parry.

By Robert Parry

Nearly two years since U.S. officials helped foment a coup in Ukraine, partly justified by corruption allegations, the country continues to wallow in graft and cronyism as the living standards of average Ukrainians plummet, according to economic data and polls of public attitudes.

Even the neocon-oriented Wall Street Journal took note of the worsening corruption in a Jan. 1, 2016 article observing that “most Ukrainians say the revolution’s promise to replace rule by thieves with the rule of law has fallen short and the government acknowledges that there is still much to be done.”

Actually, the numbers suggest something even worse. More and more Ukrainians rate corruption as a major problem facing the nation, including a majority of 53 percent last September, up from 48 percent last June and 28 percent in September 2014, according to polls by the International Foundation for Electoral Systems.

Meanwhile, Ukraine’s GDP has fallen in every quarter since the Feb. 22, 2014 putsch that overthrew elected President Viktor Yanukovych. Since then, the average Ukrainian also has faced economic “reforms” to slash pensions, energy subsidies and other social programs, as demanded by the International Monetary Fund.

In other words, the hard lives of most Ukrainians have gotten significantly harder while the elites continue to skim off whatever cream is left, including access to billions of dollars in the West’s foreign assistance that is keeping the economy afloat.

Part of the problem appears to be that people supposedly responsible for the corruption fight are themselves dogged by allegations of corruption. The Journal cited Ukrainian lawmaker Volodymyr Parasyuk who claimed to be so outraged by graft that he expressed his fury “by kicking in the face an official he says owns luxury properties worth much more than a state salary could provide.”

However, the Journal also noted that “parliament is the site of frequent mass brawls [and] it is hard to untangle all the overlapping corruption allegations and squabbling over who is to blame. Mr. Parasyuk himself was named this week as receiving money from an organized crime suspect, a claim he denies.”

Then, there is the case of Finance Minister Natalie Jaresko, who is regarded by top American columnists as the face of Ukraine’s reform. Indeed, a Wall Street Journal op-ed last month by Stephen Sestanovich, a senior fellow at the Council on Foreign Relations, hailed Jaresko as “a tough reformer” whose painful plans include imposing a 20 percent “flat tax” on Ukrainians (a favorite nostrum of the American Right which despises a progressive tax structure that charges the rich at a higher rate).

Sestanovich noted that hedge-fund billionaire George Soros, who has made a fortune by speculating in foreign currencies, has endorsed Jaresko’s plan but that it is opposed by some key parliamentarians who favor a “populist” alternative that Sestanovich says “will cut rates, explode the deficit, and kiss IMF money good-bye.”

Yet, Jaresko is hardly a paragon of reform. Prior to getting instant Ukrainian citizenship and becoming Finance Minister in December 2014, she was a former U.S. diplomat who had been entrusted to run a $150 million U.S.-taxpayer-funded program to help jump-start an investment economy in Ukraine and Moldova.

Jaresko’s compensation was capped at $150,000 a year, a salary that many Americans would envy, but it was not enough for her. So, she engaged in a variety of maneuvers to evade the cap and enrich herself by claiming millions of dollars in bonuses and fees.

Ultimately, Jaresko was collecting more than $2 million a year after she shifted management of the Western NIS Enterprise Fund (WNISEF) to her own private company, Horizon Capital, and arranged to get lucrative bonuses when selling off investments, even as the overall WNISEF fund was losing money, according to official records.

For instance, Jaresko collected $1.77 million in bonuses in 2013, according to WNISEF’s latest available filing with the Internal Revenue Service. In her financial disclosure forms with the Ukrainian government, she reported earning $2.66 million in 2013 and $2.05 million in 2014, thus amassing a sizeable personal fortune while investing U.S. taxpayers’ money supposedly to benefit the Ukrainian people.

It didn’t matter that WNISEF continued to hemorrhage money, shrinking from its original $150 million to $89.8 million in the 2013 tax year, according to the IRS filing. WNISEF reported that the bonuses to Jaresko and other corporate officers were based on “successful” exits from some investments even if the overall fund was losing money. [See Consortiumnews.com’s “How Ukraine’s Finance Minister Got Rich.”]

Though Jaresko’s enrichment schemes are documented by IRS and other official filings, the mainstream U.S. media has turned a blind eye to this history, all the better to pretend that Ukraine’s “reform” process is in good hands. (It also turns out that Jaresko did not comply with Ukrainian law that permits only single citizenship; she has kept her U.S. passport, exploiting a loophole that gives her two years to show that she has renounced her U.S. citizenship.)

Propaganda over Reality

Yet, as good as propaganda can be, especially when the U.S. government and mainstream media are moving in lockstep, reality is not always easily managed. Ukraine’s continuing, and some say worsening, corruption prompted last month’s trip to Ukraine by Vice President Joe Biden, who gave a combination lecture and pep talk to Ukraine’s parliament.

Of course, Biden has his own Ukraine cronyism problem because, three months after the U.S.-backed overthrow of the Yanukovych government, Ukraine’s largest private gas firm, Burisma Holdings, appointed his son, Hunter Biden, to its board of directors.

Burisma, a shadowy Cyprus-based company, also lined up well-connected lobbyists, some with ties to Secretary of State John Kerry, including Kerry’s former Senate chief of staff David Leiter, according to lobbying disclosures.

As Time magazine reported, “Leiter’s involvement in the firm rounds out a power-packed team of politically-connected Americans that also includes a second new board member, Devon Archer, a Democratic bundler and former adviser to John Kerry’s 2004 presidential campaign. Both Archer and Hunter Biden have worked as business partners with Kerry’s son-in-law, Christopher Heinz, the founding partner of Rosemont Capital, a private-equity company.”

According to investigative journalism inside Ukraine, the ownership of Burisma has been traced to Privat Bank, which is controlled by the thuggish billionaire oligarch Ihor Kolomoisky, who was appointed by the U.S.-backed “reform” regime to be governor of Dnipropetrovsk Oblast, a south-central province of Ukraine (though Kolomoisky was eventually ousted from that post in a power struggle over control of UkrTransNafta, Ukraine’s state-owned oil pipeline operator).

In his December speech, Biden lauded the sacrifice of the 100 or so protesters who died during the Maidan clashes in February 2014, referring to them by their laudatory name “The Heavenly Hundred.” But Biden made no heavenly references to the estimated 10,000 people, mostly ethnic Russians, who have been slaughtered in the U.S.-encouraged “Anti-Terror Operation” waged by the coup regime against eastern Ukrainians who objected to the violent ouster of President Yanukovych, who had won large majorities in those areas.

Apparently, heaven is not as eager to welcome ethnic Russian victims of U.S.-inspired political violence. Nor did Biden take note that some of the Heavenly Hundred were street fighters for neo-Nazi and other far-right nationalist organizations.

But after making his sugary references to The Heavenly Hundred, Biden delivered his bitter medicine, an appeal for the parliament to continue implementing IMF “reforms,” including demands that old people work longer into their old age.

Biden said, “For Ukraine to continue to make progress and to keep the support of the international community, you have to do more, as well. The big part of moving forward with your IMF program — it requires difficult reforms. And they are difficult.

“Let me say parenthetically here, all the experts from our State Department and all the think tanks, and they come and tell you, that you know what you should do is you should deal with pensions. You should deal with — as if it’s easy to do. Hell, we’re having trouble in America dealing with it. We’re having trouble. To vote to raise the pension age is to write your political obituary in many places.

“Don’t misunderstand that those of us who serve in other democratic institutions don’t understand how hard the conditions are, how difficult it is to cast some of the votes to meet the obligations committed to under the IMF. It requires sacrifices that might not be politically expedient or popular. But they’re critical to putting Ukraine on the path to a future that is economically secure. And I urge you to stay the course as hard as it is. Ukraine needs a budget that’s consistent with your IMF commitments.”

Eroding Support

But more and more Ukrainians appear to see through the charade in Kiev, as the poll numbers on the corruption crisis soar. Meanwhile, European officials seem to be growing impatient with the Ukraine crisis, which has added to the drag on the Continent’s economies because the Obama administration strong-armed the E.U. into painful economic sanctions against Russia, which had come to the defense of the embattled ethnic Russians in the east.

“Many E.U. officials are fed up with Ukraine,” said one Western official quoted by the Journal, which added that “accusations of graft by anticorruption activists, journalists and diplomats have followed the new government.”

The Journal said those implicated include some early U.S. favorites, such as Prime Minister Arseniy Yatsenyuk, “whose ratings have plummeted to single digits amid allegations in the media and among anticorruption activists of his associates’ corrupt dealings. Mr. Yatsenyuk has denied any involvement in corruption and his associates, one of whom resigned from parliament over the controversy this month, deny wrongdoing.”

The controversy over Yatsenyuk’s alleged cronyism led to an embarrassing moment in December 2015 when an anti-Yatsenyuk lawmaker approached the podium with a bouquet of roses, which the slightly built Yatsenyuk accepted only to have the lawmaker lift him up and try to carry him from the podium.

In many ways, the Ukraine crisis represents just another failure of neocon-driven “regime change,” which has also spread chaos across the Middle East and northern Africa. But the neocons appear to have an even bigger target in their sights, another “regime change” in Moscow, with Ukraine just a preliminary move. Of course, that scheme could put nuclear war in play.

Taking Aim

The Ukraine “regime change” took shape in 2013 after Russian President Putin and President Barack Obama collaborated to tamp down crises in Syria and Iran, two other prime targets for neocon “regime changes.” American neocons were furious that those hopes were dashed. Ukraine became the neocons’ payback against Putin.

In fall 2013, the neocons took aim at Ukraine, recognizing its extreme sensitivity to Russia which had seen previous invasions, including by the Nazis in World War II, pass through the plains of Ukraine and into Russia. Carl Gershman, neocon president of the U.S.-funded National Endowment for Democracy, cited Ukraine as the “biggest prize” and a key step toward unseating Putin in Moscow. [See Consortiumnews.com’s “What the Neocons Want from Ukraine Crisis.”]

Initially, the hope was that Yanukovych would lead Ukraine into an economic collaboration with Europe while cutting ties to Russia. But Yanukovych received a warning from top Ukrainian economists that a hasty split with neighboring Russia would cost the country a staggering $160 billion in lost income.

So, Yanukovych sought to slow down the process, prompting angry protests especially from western Ukrainians who descended on Maidan square. Though the protests were initially peaceful, neo-Nazi and other nationalist militias soon infiltrated them and began ratcheting up the violence, including burning police with Molotov cocktails.

Meanwhile, U.S.-funded non-governmental organizations, such as the Organized Crime and Corruption Reporting Project (which receives money from USAID and hedge-fund billionaire George Soros’s Open Society), hammered away at alleged corruption in the Yanukovych government.

In December 2013, Assistant Secretary of State Victoria Nuland reminded Ukrainian business leaders that the United States had invested $5 billion in their “European aspirations,” and in an intercepted phone call in early February 2014 she discussed with U.S. Ambassador Geoffrey Pyatt who Ukraine’s new leaders would be.

“Yats is the guy,” Nuland said of Arseniy Yatsenyuk, as she also disparaged a less aggressive approach by the European Union with the pithy phrase: “Fuck the E.U.” (Nuland, a former aide to ex-Vice President Dick Cheney, is the wife of arch-neoconservative ideologue Robert Kagan.)

Sen. John McCain also urged on the protests, telling one group of right-wing Ukrainian nationalists that they had America’s backing. And, the West’s mainstream media fell in love with the Maidan protesters as innocent white hats and thus blamed the worsening violence on Yanukovych. [See Consortiumnews.com’s “NYT Still Pretends No Coup in Ukraine.”]

Urging Restraint

In Biden’s December 2015 speech to the parliament, he confirmed that he personally pressed on President Yanukovych the need to avoid violence. “I was literally on the phone with your former President urging restraint,” Biden said.

However, on Feb. 20, 2014, mysterious snipers apparently from buildings controlled by the far right fired on and killed policemen as well as some protesters. The bloodshed sparked other violent clashes as armed rioters battled with retreating police.

Although the dead included about a dozen police officers, the violence was blamed on Yanukovych, who insisted that he had ordered the police not to use lethal force, in line with Biden’s appeal. But the State Department and the West’s mainstream media made Yanukovych the black-hatted villain.

The next day, Feb. 21, Yanukovych signed an accord negotiated and guaranteed by three European nations to accept reduced powers and early elections so he could be voted out of office if that was the public’s will. However, as police withdrew from the Maidan, the rioters, led by neo-Nazi militias called sotins, stormed government buildings on Feb. 22, forcing Yanukovych and other officials to flee for their lives.

In the West’s mainstream media, these developments were widely hailed as a noble “revolution” and with lumps in their throats many journalists averted their misty eyes from the key role played by unsavory neo-Nazis, so as not to dampen the happy narrative (although BBC was among the few MSM outlets that touched on this inconvenient reality).

Ever since, the major U.S. news media have stayed fully on board, ignoring evidence that what happened was a U.S.-sponsored coup. The MSM simply explains all the trouble as a case of naked “Russian aggression.”

There were kudos, too, when “reformer” Natalie Jaresko was made Finance Minister along with other foreign “technocrats.” There was no attention paid to evidence about the dark underside of the Ukrainian “revolution of dignity,” as Biden called it.

Though the neo-Nazis, who sometimes even teamed up with Islamic jihadists, were the tip of the spear slashing through eastern Ukraine, their existence was either buried deep inside stories or dismissed as “Russian propaganda.”

That was, in effect, American propaganda and, as clever as it was, it could only control reality for so long.

Even though the fuller truth about Ukraine has never reached the American people, there comes a point when even the best propagandists have to start modifying their rosy depictions. Ukraine appears to have reached that moment.

Investigative reporter Robert Parry broke many of the Iran-Contra stories for The Associated Press and Newsweek in the 1980s. You can buy his latest book, America’s Stolen Narrative, either in print here or as an e-book (from Amazon and barnesandnoble.com).




Saudi Game-Changing Head-Chopping

Exclusive: Saudi Arabia likes to distinguish itself from the head-choppers of the Islamic State but the recent mass executions, including decapitating a top Shiite dissident, reveals the Saudi royals to be just better-dressed jihadists, while creating an opening for a U.S. realignment in the Mideast, says Robert Parry.

By Robert Parry

For generations, U.S. officials have averted their eyes from Saudi Arabia’s grotesque monarchy, which oppresses women, spreads jihadism and slaughters dissidents, in a crude trade-off of Saudi oil for American weapons and U.S. security guarantees. It is a deal with the devil that may finally be coming due.

The increasingly undeniable reality is that the Saudis, along with other oil sheikhs, are the biggest backers of Al Qaeda and various terrorist groups, helping these killers as long as they spread their mayhem in other countries and don’t bother the spoiled playboys of the Persian Gulf.

President George W. Bush and then President Barack Obama may have suppressed the 28 pages of the congressional 9/11 report describing Saudi support for Al Qaeda and its hijackers, but the cat is thoroughly out of the bag. Mealy-mouthed comments from State Department spokesmen can no longer hide the grim truth that these U.S. “allies” are really civilization’s enemies.

The big question that remains, however, is: Will Official Washington’s dominant neocon/liberal-interventionist claque continue to protect the Saudis who have built a regional alliance of convenience with Israel over their shared hatred of Iran?

Inside Official Washington’s bubble, where the neocons and liberal hawks hold sway, there is a determination to make the Iranians, the Syrian government, Lebanon’s Hezbollah and the Russians the “designated villains.” This list of “villains” matches up quite well with Israeli and Saudi interests, and thus endless demonization of these “villains” remains the order of the day.

But the Saudis and indeed the Israelis are showing what they’re really made of. Israel has removed its humanistic mask as it ruthlessly suppresses Palestinians and mounts periodic “grass mowing” operations, using high-tech munitions to slaughter thousands of nearly defenseless people in Gaza and the West Bank while no longer even pretending to want a peaceful resolution of the long-simmering conflict. Israel’s choice now seems to be apartheid or genocide.

Meanwhile, the Saudis, though long hailed in Official Washington as “moderates,” are showing what a farcical description that has always been, as the royals now supply U.S.-made TOW missiles and other sophisticated weapons to Sunni jihadists in Syria fighting alongside Al Qaeda’s Nusra Front.

Using advanced U.S.-supplied warplanes, the Saudis also have been pulverizing poverty-stricken Yemen after exaggerating the level of Iranian support to the Houthis, who have been fighting both a Saudi-backed regime and Al Qaeda’s Yemeni affiliate. Amid the Saudi-inflicted humanitarian crisis, Al Qaeda’s forces have expanded their territory.

And, at the start of the New Year, the Saudi monarchy butchered 47 prisoners, including prominent Shiite cleric Nimr al-Nimr, for his offense of criticizing the royals, or, as the Saudis like to say without a touch of irony, supporting “terrorism.” By chopping off Nimr’s head, as well as shooting and decapitating the others, the Saudis demonstrated that there is very little qualitative difference between them and the head-choppers of the Islamic State.

The Usual Suspects

Yes, the usual suspects in Official Washington have sought to muddle the blood-soaked picture by condemning angry Iranian protesters for ransacking the Saudi embassy in Tehran before the government security forces intervened. And there will surely be an escalation of condemnations of anyone who suggests normalizing relations with Iran.

But the issue for the neocons and their liberal-interventionist sidekicks is whether they can continue to spin obviously false narratives about the nobility of these Middle East “allies,” including Israel. Is there a limit to what they can put over on the American people? At some point, will they risk losing whatever shreds of credibility that they still have? Or perhaps the calculation will be that public credibility is irrelevant, power and control are everything.

A similar choice must be made by politicians, including those running for the White House.

Some Republican candidates, most notably Sen. Marco Rubio, have gone all-in with the neocons, hoping to secure largesse from casino tycoon Sheldon Adelson and other staunch supporters of Israel’s right-wing Prime Minister Benjamin Netanyahu. On the other hand, real-estate magnate Donald Trump has distanced himself from neocon orthodoxy, even welcoming Russia’s entry into the Syrian conflict to fight the Islamic State, a heresy in Official Washington.

On the Democratic side, former Secretary of State Hillary Clinton is the most closely associated with the neocons and the liberal hawks and she has dug in on the issue of their beloved “regime change” strategy, which she insists must be applied to Syria.

She appears to have learned nothing from her misguided support for the Iraq War, nor from her participation in overthrowing Muammar Gaddafi’s secular regime in Libya, both of which created vacuums that the Islamic State and other extremists filled. (British special forces are being deployed to Libya as part of an offensive to reclaim Libyan oil fields from the Islamic State.)

A Sanders Opportunity

The Saudi decision to chop off Sheikh Nimr’s head and slaughter 46 other people in one mass execution also puts Sen. Bernie Sanders on the spot over his glib call for the Saudis “to get their hands dirty” and intervene militarily across the region.

That may have been a clever talking point, calling on the rich Saudis to put some skin in the game, but it missed the point that, even before the Nimr execution, the Saudis’ hands were very dirty indeed, covered in blood.

For Sanders to see the Saudis as part of the solution to the Mideast chaos ignores the reality that they are a big part of the problem. Not only has Saudi Arabia funded the extreme, fundamentalist Wahhabi version of Sunni Islam, building mosques and schools around the Muslim world, but Al Qaeda and many other jihadist groups are, in essence, Saudi paramilitary forces dispatched to undermine governments on Riyadh’s hit list.

That has been the case since the 1980s, when the Saudis, along with the Reagan administration, invested billions of dollars in support of the brutal mujahedeen in Afghanistan with the goal of overthrowing a secular, Soviet-backed government in Kabul.

Though the “regime change” worked (the secular leader Najibullah was castrated and his body hung from a light pole in Kabul), the eventual outcome was the emergence of the Taliban and Al Qaeda, led by a Saudi scion, Osama bin Laden.

Though Sanders has resisted articulating a detailed foreign policy, instead seeking to turn questions back to his preferred topic of income inequality, the latest Saudi barbarism gives him a new chance to distinguish himself from front-runner Clinton. He could show courage and call for a realignment based on reality, not propaganda.

President Obama, too, has a final chance to refashion the outdated and counter-productive U.S. alliances in the Middle East. At least he could rebalance them to allow a pragmatic relationship with Iran and Russia to stabilize Syria and neutralize the Saudi-backed jihadists.

Standing Up, Not Bowing Down

Instead of being supplicants to Saudi riches and oil, the West could apply stern measures against the Saudi royals to compel their acquiescence to a real anti-terrorist coalition. If they don’t comply immediately, their assets could be frozen and seized; they could be barred from foreign travel; they could be isolated until they agreed to behave in a civilized manner, including setting aside ancient animosities between Sunni and Shiite Islam.

It seems the European public is beginning to move in this direction, in part, because the Saudi-led destabilization of Syria has dumped millions of desperate refugees on the European Union’s doorstep. If a new course isn’t taken, the E.U. itself might split apart.

But the power of the neocon/liberal-hawk establishment in Official Washington remains strong and has prevented the American people from achieving anything close to a full understanding of what is going on in the Middle East.

The ultimate barrier to an informed U.S. public may also be the enormous power of the Israel Lobby, which operates what amounts to a blacklist against anyone who dares criticize Israeli behavior and harbors hopes of ever holding a confirmable government position, or, for that matter, a prominent job in the mainstream media.

It would be a test of true political courage and patriotism for some major politician or prominent pundit to finally take on these intimidating forces. That likely won’t happen, but Saudi Arabia’s latest head-choppings have created the possibility, finally, for a game-changing realignment.

Investigative reporter Robert Parry broke many of the Iran-Contra stories for The Associated Press and Newsweek in the 1980s. You can buy his latest book, America’s Stolen Narrative, either in print here or as an e-book (from Amazon and barnesandnoble.com).




Failed US Sanctions on Russia

The U.S. mainstream media excludes almost all reporting and analysis that challenges the neocon/liberal-interventionist “group think” about the supposed Russian threat, but once in a while a backhanded acknowledgement of reality slips through, as Gilbert Doctorow was surprised to find.

By Gilbert Doctorow

The newest issue of Foreign Affairs continues to show a significant drop-off of professionalism in the mostly Russia-phobic essays at the flagship American magazine on international relations. Yet as low-grade as these essays may be, one of them is highly damaging to the dominant Washington narrative against Vladimir Putin’s Russia.

Emma Ashford, a visiting research fellow at the neoliberal/libertarian Cato Institute, produced an essay that is a jumble of statistics and arguments, many of them contradictory, and all of them set out without prioritization. The author clearly lacks experience and judgment. But what makes this essay newsworthy is that, hit or miss, the author is going up against the U.S. establishment and directly calling for an end to U.S. sanctions against Russia.

If I may sequence her arguments properly, the sanctions a) have been totally useless in changing Russian foreign and military policy in the directions desired by the U.S., b) have caused very little damage to the Russian economy but much harm to immediate European and American economic interests, and c) have caused the Russians to join with other BRICS members in creating institutions and pursuing financial practices that ultimately will undermine U.S. global hegemony, thereby compromising America’s future.

Along the way, Ashford agrees with IMF predictions that “even with continued low oil prices growth will return to the Russian economy in 2016.” This means the sectoral prohibitions have not impaired the economy in the ways intended.

The author notes that Moscow circumvented the sanctions partly by turning to China, where it concluded a $400 billion gas deal, a 150 billion yuan currency swap and other major agreements. Moreover, the sanctions on individual targeted companies have been compensated by largess from the Kremlin so as to attenuate any losses.

And the travel bans and property arrests on targeted members of the elite have only been a minor nuisance, which never provoked them to turn against their president. Looking to the future, Ashford does not expect the sanctions to eventually work, calling that “wishful thinking.”

The essay goes off the rails when Ashford tries to explain the “costs of containment” to the U.S. and its allies in Europe, which she characterizes as “major.” Next we read that the European Commission estimates the sanctions cut European growth by 0.3 percent of GDP in 2015. Perhaps even she understands that is not much, so Ashford tries again by citing predictions from the Austrian Institute of Economic Research that continuing the sanctions on Russia may cost Europe “over 90 billion euros in export revenue and more than two million jobs over the next few years.” Predictions about the “next few years” are not the kind of hard data that normally moves politicians.

And she trots out the widely cited figure of 400,000 German jobs that are at risk over sanctions. Still more vaguely, she speaks of how major European banks like Société Générale in France and Raiffeisen in Austria may be destabilized and require state bailouts if their large loans to Russian concerns become uncollectible due to borrowers’ insolvency.

Turning to the U.S., Ashford directs attention to the administrative and legal costs that American banks have to bear as they enforce regulations calling for freezing and managing the assets of sanctioned individuals. They have had to hire additional legal and technical staff to ensure they are in conformity with the myriad sanctions and thus avoid crippling penalties from the federal authorities for the least error of execution. At what cost? Not a word, although that is obviously a difficult measure to quantify.

Meanwhile, U.S. energy companies are suffering unspecified foregone profits from being unable to pursue the large exploration and production contracts they had concluded with Russian counterparts. And they may possibly lose the multi-billion-dollar investments they made in such projects before the sanctions came into effect. Still, there is no reason to see any of this as crippling punishment for U.S. energy companies.

I think it is fairly obvious that all of the foregoing “costs” for the U.S. and its allies are not much more than mosquito bites. By presenting them as she does, the author shows lack of discernment in what constitutes proof to justify a dramatic change in direction of a fundamental foreign policy stand by the U.S.

But her lapse of professionalism does not end there: Ashford moves on, falling into glaring logical inconsistencies. We are told that the sanctions “may harm European energy security.” Specifically, Ashford cites a prediction from Cambridge Energy Research Associates that as a result of sanctions Russian oil production may drop from 10.5 million barrels a day today to 7.6 million barrels in 2025.

This does not jibe with her remarks earlier in the essay on how the Russians were circumventing sanctions: “Russia has been able to find loopholes … [and] in order to obtain access to Arctic drilling equipment and expertise, Rosneft acquired 30 percent of the North Atlantic drilling projects belonging to the Norwegian company Statoil.”

Nor does this jibe with her assertion at the end of her essay when setting out her recommendations on what punitive measures should replace sanctions if we accept that they have been a failure. There she urges the U.S. to export oil and liquefied natural gas to Europe so as “to provide Europe with an alternative source of energy” and “to starve the Russian state of revenue.” This would, she says, “allow European states to wean themselves off Russian oil and gas.”

One of these positions may be correct, but they cannot all be correct, and it should not be up to the reader to choose from this Chinese restaurant menu.

Given the unimpressive nature of Ashford’s arguments against sanctions coming from their past and present economic consequences, her real knock-out blow against sanctions comes in the completely different and unquantifiable area of argumentation that is political and geopolitical. She faults the sanctions for prompting a “rally round the flag” phenomenon in Russia that has, perversely, raised President Putin’s approval rating from 63 percent in March 2014 when Russia took possession of Crimea to 88 percent in October 2015. His power, which theoretically should have been shaken by the U.S. and E.U. sanctions, has instead consolidated.

The sanctions also encouraged Russia to take actions to protect its financial institutions that ultimately will threaten the global economic influence of the United States. These measures include the creation of an alternative international payment system to SWIFT, the creation of a domestic credit-card clearing house that challenges Visa and MasterCard, and the creation of a BRICS development bank that duplicates the World Bank and International Monetary Fund.

The net effect of these actions, once implemented, will be to cause the United States “to have a harder time employing economic statecraft,” by which she means imposing crippling financial sanctions on other states as they succeeded in doing to Iran. In the same vein, Ashford sees a threat in Russia’s shift away from trading in dollars.

Ashford’s recommendation, the true punch-line of the article, is that “the United States should cut its losses and unilaterally lift the majority of the sanctions on Russia.” This advice surely will set off alarms within the Beltway.

In that sense, Ashford’s essay may have dealt even a harder blow against Washington’s “sanction Russia” consensus than did John Mearsheimer’s iconoclastic Foreign Affairs article from 2014, “Why the Ukraine Crisis is the West’s Fault,” a top-drawer essay that caused dyspeptic fits and sparked a lively debate in the follow-on issue of the magazine.

Gilbert Doctorow is the European Coordinator, American Committee for East West Accord, Ltd. His latest book, Does Russia Have a Future? (August 2015), is available in paperback and e-book from Amazon.com and affiliated websites. For donations to support the European activities of ACEWA, write to eastwestaccord@gmail.com. © Gilbert Doctorow, 2015




The Clintons’ Paid-Speech Bonanza

Exclusive: With primary voting set to start next month, one of Hillary Clinton’s remaining hurdles is convincing Democratic voters that she is not beholden to Wall Street and other wealthy interests that have fattened her family’s bank account with tens of millions of dollars for paid speeches, writes Chelsea Gilmour.

By Chelsea Gilmour

Hillary Clinton is said to be buoyant over her prospects to become the next U.S. President, as Republicans feud over Donald Trump’s disruptive campaign and Sen. Bernie Sanders fails to articulate a clear foreign policy, but perhaps the biggest obstacle still confronting the ex-Secretary of State is her own record as a beneficiary of rich and powerful corporate interests.

“The truth is, you can’t change a corrupt system by taking its money,” says a Sanders television commercial. And Clinton has left herself open to that charge by profiting off her government experience, racking up $11.8 million in fees for 51 speeches in the 14-month period from January 2014 to March 2015, before she became an official candidate for President, according to disclosure records.

For speeches usually lasting between 30 minutes and one hour, Clinton was paid from $100,000 to $335,000, an average of around $230,000. Many of her paid speeches were delivered to Wall Street, Big Pharma, the tech sector and other industries with interests in influencing government policies.

Payments crossing the $300,000 mark came from Qualcomm Inc. ($335,000), the Biotechnology Industry Organization ($335,000), the National Automobile Dealers Association ($325,500), Cisco ($325,000), eBay ($315,000) and Nexenta Systems, Inc. ($300,000). Those amounts are each roughly equivalent to six times the typical American middle-class earnings in an entire year.

It’s true that the paid speaking circuit is a common stomping ground for former public officials. For instance, after leaving the Florida governorship and before running for President, Jeb Bush was compensated handsomely for his public speeches. During the same 14-month period between January 2014 and March 2015, Jeb made $1.8 million delivering 43 paid speeches in the U.S. and abroad (London, Prague, Toronto, and Punta del Este, Uruguay). His average compensation per speech was $42,500, less than 20 percent of Clinton’s average haul.

And former President Bill Clinton has amassed a fortune speaking to groups all over the world, as listed in Hillary Clinton’s disclosure forms for 2014-2015. According to those financial disclosures, Bill Clinton delivered 53 paid speeches from January 2014 to March 2015, totaling $13.3 million. His highest single payouts from speeches during that period came on March 6 and 7, 2014, when he received $500,000 each for speeches to Bank of America in London and Kessler Topaz Meltzer and Check LLP in Amsterdam.

As Larry Noble, senior counsel at the Campaign Legal Center, told CBS, “It’s not unusual for former elected officials to go out and give speeches and make a lot of money. What the problem now is that they’re coming back into government after going out and having been paid large sums by these various special interests.”

Two-for-One

In the Clintons’ case, there is also a potential for the speech-buyers to get what America’s most famous power couple in 1993 called “two for the price of one.” Since Clinton-42 would surely be a prominent member of a potential Clinton-45 administration, just as Hillary Clinton was a policy adviser during Bill Clinton’s presidency, lavishing money on former President Bill Clinton might be an indirect route to influence President Hillary Clinton.

The backdrop of this issue is the unprecedented nature of the Clintons’ double-teaming the international business world to leverage millions of dollars from their past experience and potential future power. This conflict of interest has even raised eyebrows in the mainstream news media. On Nov. 22, 2015, both The Washington Post and The New York Times published front-page articles examining Bill and Hillary Clinton’s unmatched accumulation of wealth from their careers as public servants.

The Post’s investigation found that since Bill Clinton’s 1974 congressional bid, the “grand total raised for all [Bill and Hillary’s] political campaigns and their family’s charitable foundation reaches at least $3 billion. They made historic inroads on Wall Street, pulling in at least $69 million in political contributions from the employees and PACs of banks, insurance companies, and securities and investment firms. Wealthy hedge fund managers S. Donald Sussman and David E. Shaw are among their top campaign supporters, having given more than $1 million each.”

This concern surfaced during the Nov. 14 Democratic presidential debate when Hillary Clinton was pressed about her acceptance of Wall Street largesse and she sought to justify her financial support from Wall Street as somehow related to her work as a New York senator following the 9/11 attacks on the Twin Towers of the World Trade Center.

“I represented New York on 9/11 when we were attacked,” Clinton said. “Where were we attacked? We were attacked in downtown Manhattan where Wall Street is. I did spend a whole lot of time and effort helping them rebuild. That was good for New York. It was good for the economy and it was a way to rebuke the terrorists who had attacked our country.”

Clinton’s reference to 9/11 as justification for her accepting large speaking fees and campaign donations from Wall Street quickly raised the hackles of some debate watchers on social media. Later in the debate, Clinton was confronted with one tweet that read: “I’ve never seen a candidate invoke 9/11 to justify millions of Wall Street donations until now.”

Clinton responded, “Well, I’m sorry that whoever tweeted that had that impression because I worked closely with New Yorkers after 9/11 for my entire first term to rebuild. So, yes, I did know people. I’ve had a lot of folks give me donations from all kinds of backgrounds say, I don’t agree with you on everything, but I like what you do. I like how you stand up. I’m going to support you, and I think that is absolutely appropriate.”

Image Problem

After the debate, The New York Times examined this image problem created by Hillary Clinton’s Wall Street donations, reporting: “John Wittneben simmered as he listened to Hillary Rodham Clinton defend her ties to Wall Street during last weekend’s Democratic debate. He lost 40 percent of his savings in individual retirement accounts during the Great Recession, while Mrs. Clinton has received millions of dollars from the kinds of executives he believes should be in jail.

“‘People knew what they were doing back then, because of greed, and it caused me harm,’ said Mr. Wittneben, the Democratic chairman in Emmet County, Iowa. ‘We were raised a certain way here. Fairness is a big deal.’ The next day he endorsed Senator Bernie Sanders in the presidential race.

“Mrs. Clinton’s windfalls from Wall Street banks and other financial services firms, $3 million in paid speeches and $17 million in campaign contributions over the years, have become a major vulnerability in states with early nomination contests. It is an image problem that she cannot seem to shake.”

Meanwhile, The Washington Post article provided an overview of the source of Clinton campaign contributions and how the Clintons have systematically cultivated their donor base of the super-rich on Wall Street since Bill Clinton first ran for president in 1992.

Both publications recounted the episode in which Goldman Sachs executive Robert E. Rubin raised funds and opened doors to other Wall Street execs for Bill Clinton’s presidential bid and was rewarded with an appointment as Treasury Secretary.

According to the Post article, “Like-minded Wall Streeters such as investment banker Roger Altman joined [Rubin] in the new administration, and early on they helped craft an economic policy, known as Rubinomics, that was applauded by Wall Street but viewed critically by many on the left. When then-first lady Hillary Clinton decided to run for the Senate in New York in 2000, she turned to Rubin and Altman to introduce her to key players on Wall Street.”

Many of Clinton-42’s economic policies followed the Wall Street interest in “neo-liberalism,” a combination of government deregulation of the financial industry and promotion of “free trade” agreements that led U.S. companies to shift manufacturing jobs overseas.

While those tactics were credited with producing a go-go economy in the 1990s, many of the unpleasant consequences came home to roost a decade later in the burst of the Internet bubble in 2000 and then the Wall Street crash in 2008 that cost millions of Americans their jobs, their savings and their homes.

As Hillary Clinton now cites the relative affluence of the 1990s and dispatches Bill Clinton to be a key campaign surrogate, the mixed history of that era is becoming another key issue in the 2016 campaign with the underlying question: How would Hillary Clinton deal with Wall Street and the Big Banks if elected? Publicly, she has taken a relatively tough line on Wall Street.

Reining in Wall Street

As the Post explained, “In her current campaign, Clinton has pledged to rein in Wall Street. She has proposed higher taxes on high-frequency traders and an end to special tax breaks for hedge fund managers, and recently called for more aggressive enforcement of criminal statutes that govern the finance industry.

“But her rhetoric has not alarmed her backers in the financial sector. So far, donors in the banking and insurance industries have given $6.4 million to her campaign and allied super PACs, behind only those in communications and technology, the Post found.”

But Wall Street is not the only business sector that has courted Hillary Clinton with lucrative payments for her speeches and welcomed her comments. Our analysis of available recordings and transcripts of Clinton’s speeches showed that her paid speeches ranged across a variety of topics, mostly favorable or flattering to the corporate interests being addressed.

For instance, Clinton noted the importance of women in the real estate field during a speech to the Commercial Real Estate Women Network in October 2014. In March 2015, Clinton lauded the work that women have done in technology during a 20-minute speech to eBay, where Meg Whitman was chief executive during its meteoric rise from 1998 to 2008.

As The Washington Post noted, shortly after that speech, Clinton was back before some of the same people seeking donations for her campaign.

“Less than two months later, Clinton was feted at the San Francisco Bay-area home of eBay chief executive John Donahoe and his wife, Eileen, for one of the first fundraisers supporting Clinton’s newly announced presidential campaign,” the Post reported last May.

All told, Clinton gave nine speeches to the tech sector which paid her at least $2.4 million. (The Washington Post calculates her payments from the tech sector as $3.2 million, the discrepancy apparently reflecting different definitions of what constitutes a “tech company.”)

During her speaking engagement with the Biotechnology Industry Organization (BIO) in June 2014, Clinton spoke of the need for the U.S. government to encourage “risky” but profitable investments in the pharmaceutical industry and to avoid companies outsourcing biotechnology and pharmaceutical businesses.

Discussing the risky business of producing new pharmaceuticals, she called for affordable “insurance against risk,” and said if Washington was not helpful in this effort then perhaps the states that host these companies could come together and make legislation.

“We’ve got to rationalize our tax system because I don’t want to see biotech companies or pharma companies moving out of our country simply because of some kind of perceived tax disadvantage and potential tax advantage somewhere else,” she said to a hearty round of applause.

Speaking before the pharmaceutical and health industries, she showed off expertise gleaned from her experience managing Bill Clinton’s ill-fated health reform bill in 1993-94. She gave 10 paid speeches to healthcare organizations, totaling $2.3 million in fees, including $335,000 from the Biotechnology Industry Organization and a quarter-million dollars or more each from Drug, Chemical and Associated Technologies; the Cardiovascular Research Foundation; and the Advanced Medical Technology Association.

Making Multi-Millions

That nearly 38 percent of Hillary Clinton’s current personal wealth of approximately $31.3 million was accumulated during the brief period between her departure from the State Department and her run for the presidency underscores the extent to which she is a beneficiary of big-business’ financial largesse.

The close proximity between Hillary Clinton’s last paid speaking engagement on March 19, 2015 to the New York section of the American Camping Association for $260,000 and her announcement as a candidate for president on April 12, 2015, adds further fuel to a suspicion of impropriety.

Since Clinton knew that she would be announcing her candidacy, squeezing in paid speeches almost to the last minute gives the appearance that she was profiting off her possible rise to arguably the most powerful political position on earth.

Indeed, Clinton’s inner circle had been dropping hints about her 2016 presidential run since she lost the Democratic nomination to Barack Obama in 2008 and, more intensely, after she left the State Department in early 2013.

And though she was not alone in delivering paid speeches until shortly before announcing (for instance, Jeb Bush also delivered paid speeches into March 2015), the staggering sums in Clinton’s case have sharpened criticism of her behavior.

Clinton’s campaign did not respond to questions regarding whether the lucrative compensation from speeches raised conflict-of-interest questions. Of the companies paying Clinton some of her highest rates (Qualcomm, Nexenta, eBay, the Biotechnology Industry Organization, and the National Automobile Dealers Association or NADA), Nexenta and NADA were the only companies to respond to inquiries, although indirectly.

Nexenta was prompt in replying, but said only, “We were very pleased with her participation and presentation at the Nexenta OpenSDx Summit 2014.”

A spokesperson from NADA responded to pointed inquiries with a general answer: “The National Automobile Dealers Association is a non-profit and non-partisan business trade group, and does not endorse presidential candidates or their views. Our role is to provide our dealer members with exposure to all facets of business and government that can affect their dealerships. NADA has a long history of inviting speakers from across the political spectrum. Many previous convention speakers, for example, presented views counter to Sen. Clinton. And her views will be counter-balanced by future speakers as well.”

Indeed, Republican strategist Karl Rove and former Vermont Gov. Howard Dean, who has endorsed Clinton, are scheduled to speak at the 2016 NADA convention in Las Vegas. Jeb Bush was a keynote speaker for the January 2015 NADA convention in San Francisco, although he was a relative bargain, earning $51,000, compared to Hillary Clinton’s $325,500 in 2014. (This disparity is likely explained by their relative “celebrity” and their “going rates” rather than any political favoritism by NADA.)

Yet, regardless of whether Hillary Clinton has violated the spirit of government ethics laws by accepting these speaking fees, she faces a challenge trying to convince voters that she will be a champion of middle- and working-class Americans rather than a defender of Wall Street and other corporate interests.

Editor’s note on tabulations: Cited amounts represent paid speeches that went to personal income and do not include speeches when compensation was given to a charity, nor do the tallies include money from book royalties.

Chelsea Gilmour is an assistant editor at Consortiumnews.com. She has previously published “The Mystery of the Civil War’s Camp Casey” and “Jeb Bush’s Tangled Past.”