Thursday, May 06, 2010
THE GOLDMAN SACHS CASE Part III: "Jokers to My Right" James S. Henry
Well, la gente americana may not know the difference between a synthetic CDO and a snow shovel, but the masses are clearly frothing for a taste of banquero a la brasa, fresh from the spit.
"Financial reform," whatever that means, is now far more popular than "health care reform." And it has only recently become even more so, in the wake of all the recent investigations and prosecutions -- Warren Buffett might say "persecutions" -- of the "demon bank" Goldman Sachs.
Evidently the masses' appetite for banker blood was only slightly sated by the SEC's April 16th civil charges against Goldman, Senator Levin's 11-hour show-trial of senior Goldman officials on April 27, and the "entirely coincidental" announcement on April 30th that the US Justice Department -- which is under strong political pressure to bring more fraud cases to trial, but also tends to screw them up -- has launched a criminal investigation into Goldman's mortgage trading.
INSIDE BASEBALL
In the wake of this populist uprising, Senate Republicans have suddenly adopted "financial reform" as their cause too, allowing the Senate to commence debate this week on Senator Dodd's 1600-page reform bill.
However, this promises to be a lengthy process. While reform proponents like US PIRG and Americans for Financial Reform were hoping for final action as early as this week, Senator Reid now expects to have a Senate bill by Memorial Day at the earliest, and Obama only expects to be able to sign a bill by September.
That's just two months ahead of the fall 2010 elections, so there's not much room for error. But the beleaguered Democrats may just be figuring that they'd rather bash banks than run on their rather mixed track record on health care reform, unemployment, climate change, and offshore drilling, let alone -- Wodin forbid -- immigration reform.
In any case, Senator Dodd's bill has now been through more permutations than a Greek budget forecast. The latest one discards the $50 billion bank restructuring fund as well as new reporting requirements that would have helped to spot abusive lending practices.
These concessions apparently were part of retiring Senator Chris Dodd's Grail-like quest for that elusive 60th (Republican) vote -- rumored to be hidden away and guarded by an ancient secret order known as "Maine Republicans."
A GOAT RODEO
Meanwhile, behind the scenes, leading Republicans, aided by several Democrats from big-bank states like New York, California, and Illinois, and countless lobbyists, have been trying to weaken other key provisions in the bill, which was already pretty tame to begin with.
The most important measures at issue pertain to derivatives and proprietary trading, the power of the new Consumer Financial Products Bureau (especially, according to Senator Shelby, the Federal Reserve's shameless power grab over orthodontists), the regulation of large "non-banks," and (interestingly, from a states' rights perspective) the power of states to preempt federal regulation.
On the other hand, the bill has also inspired dozens of amendments from a cross-section of Senators who appear to be genuinely concerned -- even apart from the opportunities for grandstanding -- that the Dodd bill isn't nearly hard-hitting enough.
Some of these amendments are purely populist anger-management devices that don't really have much to do with preventing future financial crises.
These include Senator Sanders' proposals to revive usury laws and audit the Federal Reserve, a proposal by Senators Barbara Boxer and Jim Webb for a one-time surtax on bank bonuses, Senator Mark Udall's proposal for free credit reports, and Senator Tom Harkin's proposal to cap ATM fees.
The very first amendment adopted was also in this performative utterance category: Senator Barbara Boxer's bold declaration that "no taxpayer funds shall be used" to prevent the liquidation of any financial company in "receivership."
Cynics were quick to point out that in any real banking crisis, this kind of broad promise would be unenforceable, since it would also be among the very first measures to be repealed.
STRUCTURAL REFORM?
Other proposed amendments sound like more serious attempts at structural reform.
These include the Brown-Kaufman amendment that tries to limit the number of "too big to fail" institutions by placing upper limits on the share of system-wide insured deposits and other liabilities held by any one bank holding company, and the Merkley-Levin amendment, which attempts to "ban" proprietary trading and hedge fund investments by US banks, and also defines tougher fiduciary standards for market-makers.
But so far neither of these measures has received the imprimatur of the Senate Banking Committee, let alone Senator Reid. This means that for all practical purposes they may amount to escape valves for venting popular steam, but little more.
This is especially true, given the delayed schedule that Reid, Dodd, and the Obama Administration seem to have accepted, which will relieve the pressure for such reforms.
Furthermore, upon closer inspection, both proposals leave much to be desired. Indeed, one gets the distinct impression that they were dreamed up by Hill staffers on the midnight shift to appease the latest cause célèbre.
For example, the Brown-Kaufman amendment, highly touted by chic liberal "banking experts" like Simon Johnson, doesn't mandate the seizure and breakup of any particular large-scale financial institutions directly. Nor does it empower the FTC to set tougher standards for competition in this industry, as it might have done, or even specify what kind of industry structure would be desirable from the standpoint of avoiding banking crises.
To a large extent that simply reflects the paucity of knowledge about the relationship between structure and behavior in financial services. As a bootstrap, the amendment specifies arbitrary caps on bank activities that may or may not be related to actual misbehavior -- for example, the share of "insured deposits" managed by any one bank holding company (≤ 10%), and the ratio of "non-deposit liabilities to US GDP" (≤ 2%).
This has arbitrary consequences. Under the limits in the amendment, for example, Wells Fargo and Citigroup, the #4 and #1 banks in the country by asset size, would largely avoid any breakup, while JPMorgan and BankAmerica would feel much more pressure.
Meanwhile, evil Goldman Sachs' minimal 0.3% shares under both limits would leave it plenty of room to grow -- perhaps even by acquiring the extra share that the "Big Four" would have to spin off.
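To make the cap mechanics concrete, here is a minimal sketch of how a holding company would be tested against the two Brown-Kaufman limits described above. The system-wide totals and the two sample balance sheets are illustrative assumptions of mine, not actual regulatory data.

```python
# Illustrative sketch of the two Brown-Kaufman caps discussed above.
# All dollar figures are hypothetical placeholders, not actual bank data.

US_GDP = 14.6e12                 # assumed US GDP, in dollars
TOTAL_INSURED_DEPOSITS = 5.4e12  # assumed system-wide insured deposits

DEPOSIT_CAP = 0.10       # max share of system-wide insured deposits
LIABILITY_CAP = 0.02     # max non-deposit liabilities as a share of GDP

def exceeds_caps(insured_deposits, non_deposit_liabilities):
    """Return which of the two caps a holding company would breach."""
    deposit_share = insured_deposits / TOTAL_INSURED_DEPOSITS
    liability_ratio = non_deposit_liabilities / US_GDP
    return {
        "deposit_share": deposit_share,
        "breaches_deposit_cap": deposit_share > DEPOSIT_CAP,
        "liability_ratio": liability_ratio,
        "breaches_liability_cap": liability_ratio > LIABILITY_CAP,
    }

# A hypothetical mega-bank vs. a hypothetical mid-sized investment bank.
print(exceeds_caps(insured_deposits=700e9, non_deposit_liabilities=1.2e12))
print(exceeds_caps(insured_deposits=20e9,  non_deposit_liabilities=250e9))
```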
Furthermore, even the largest US institutions might be able to avoid the caps by devoting more attention to large-scale private banking customers, whose deposits and other investments would avoid these regulations, or by conducting more of their risky business through offshore banking centers.
Indeed, this suggests a key problem with the Merkley-Levin amendment as well: it is a US solo act. It completely ignores the fact that even our largest banks, and the US financial system as a whole, are part of a competitive global financial market.
As this week's Greco-European financial crisis has underscored, to be effective, bank regulation and structural reform must be conducted on a coordinated international basis. Unilateral initiatives only drive bad behavior to the myriad of under-regulated offshore and onshore financial centers.
From this perspective, I'm surprised that Senator Levin, a long-time critic of offshore financial centers, has proceeded in such a ham-handed way with this. This was his year to finally round up global support to crack down on offshore centers -- a precondition for effective global bank regulation. Instead he decided to target Goldman and pursue this wayward, sloppy attempt at unilateral reform -- as if the Isle of Man, Guernsey, Jersey, Bermuda, and the Cayman Islands, let alone London and Zurich and Singapore and Hong Kong, are not waiting in the wings.
WHAT HAVE WE LEARNED?
If we step back from this political goat rodeo, what have we learned about the political economy of financial reform?
CONSOLIDATION (UNDER BOTH PARTIES)
First, as shown in the above chart, the US banking industry has indeed undergone a major structural transformation, especially since December 1992. The following 15 years became the era of Wild West banking, when all the lessons that should have been learned from the Third World debt crisis were forgotten. It became an era of rampant deregulation, rising US public and private debt levels, and asset speculation.
The impacts on financial structure were far reaching and rapid. Back in December 1992, there were more than 13,500 banks, and the top four US banks accounted for less than 10 percent of the sector's jobs.
Already by 1998, there was a decided increase in this concentration level, to more than 20 percent. Today there are fewer than 8000 banks. The top 4 alone -- Citigroup, JPMorganChase, Bank of America, and Wells Fargo -- now employ more than 800,000 people, over 40 percent of the US total. Indeed, together with the failed banks they acquired, the top four banks have accounted for almost all the sector's employment growth; the rest of the sector has shrunk.
Tiny Goldman has also been growing, but it still accounts for only about 18,900 employees, less than 10 percent of the headcount of any one of the top four.
This growing concentration is also reflected in most key US banking markets, especially the markets for deposits, overall bank loans, real estate loans in general, home mortgages, and credit derivatives. As indicated, in each of these markets, the market share commanded by the top four banks has increased from less than 10 percent in 1992 to 40-50 percent or more by 2010. In the case of the credit derivatives market, the share now approaches 90 percent.
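For readers who want the measure pinned down, the "share commanded by the top four" figures used throughout this piece are essentially four-firm concentration ratios (CR4). A toy sketch, with entirely made-up market shares chosen only to illustrate the calculation:

```python
# Four-firm concentration ratio (CR4): the combined market share of the
# four largest firms in a market. The shares below are illustrative only.

def cr4(shares):
    """shares: market shares (as fractions) of all firms in one market."""
    return sum(sorted(shares, reverse=True)[:4])

credit_derivatives_shares = [0.30, 0.25, 0.20, 0.13, 0.08, 0.04]  # hypothetical
print(f"CR4 = {cr4(credit_derivatives_shares):.0%}")  # -> 88% in this toy example
```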
Nor has this increasing concentration been accounted for by superior performance. Indeed, the "big four" also now account for more than 78 percent of all bad home mortgages -- behind in payments, or suspended entirely. While some of that is accounted for by the acquisition of failing institutions, most of it is not.
THE ECONOMICS OF GOLDMAN BASHING
Third, once again, for the sake of Goldman bashers in the audience, as indicated above, its share of each of these key market indicators is trivial. Even in credit derivatives, the segment for which Goldman has taken such a beating, its market share today is just 8 percent, compared to the "Big Four's" commanding 88 percent. And Goldman's shares of real estate loans, home loans, insured and uninsured bank deposits, and bad home mortgages are even lower.
Just to pick one example: today the "top 4" banks have more than $204 billion of bad home loans, compared with Goldman's $0.0 of such loans.
From this standpoint, the Levin hearings were a stellar example of completely ignoring industry economics. They singled out a smaller, more successful, widely-envied target for political scapegoating, while ignoring the much more economically important financial giants.
THE MORTGAGE-INDUSTRIAL COMPLEX
The key driver on the domestic side of all these developments is a political-economy complex that in the long run has had perhaps as profound an influence on our nation's political and economic system as the legendary "military industrial" complex. This is what we've called (in the first chart above) the "US mortgage-industrial complex," including financial institutions, real estate firms, and insurance companies. From 1992 to 2010, in constant 2010 dollars, this industry spent an average of $2,793 per day per US Senator and Congressman on federal campaign contributions and lobbying -- far more than the corresponding levels in the 1970s and 1980s.
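As a rough back-of-the-envelope check on what a per-day, per-legislator figure like that implies in aggregate, here is a quick sketch. The number of legislators and the length of the window are my own illustrative inputs, not the underlying dataset:

```python
# Back-of-the-envelope: what does $2,793 per day per legislator imply in total?
# Inputs below are illustrative assumptions, not the author's underlying data.

PER_DAY_PER_LEGISLATOR = 2793      # dollars, in constant 2010 dollars
LEGISLATORS = 535                  # 435 House members + 100 Senators
YEARS = 18                         # roughly 1992 through 2010

total = PER_DAY_PER_LEGISLATOR * LEGISLATORS * 365 * YEARS
print(f"Implied total spending: ${total/1e9:.1f} billion")   # ~ $9.8 billion
```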
Except for the insurance industry -- where health care reform efforts by Clinton and Obama tilted the giving -- Democrats and Republicans have more or less divided this kitty pretty evenly. It is also important to note that more than 71 percent of total federal spending by these industries from 1990 to 2010 was on lobbyists, not campaign contributions. Cases like the recent Citizens United decision may shift this balance in the future.
Furthermore, within the financial services industry, the top four US banks alone have accounted for at least 20 percent of all spending on federal lobbying and campaign contributions (in constant 2010 dollars) from 1992 to 2010. Investment banks as a group -- including Goldman, Lehman Brothers, Bear Stearns, Morgan Stanley, UBS, Credit Suisse, and their key predecessors, especially Paine Webber and Dean Witter -- added another 8 percent. But once again, by comparison, and contrary to its reputation as the premier political operator in Washington, Goldman Sachs' share of total "real" spending on lobbying and contributions was relatively small -- just 2.2 percent.
This was just 40 percent of what Citigroup spent, and less than 60 percent of what JPMorganChase spent during this same period.
C'mon guys -- is it really any wonder that Jamie Dimon gets invited to the Obama White House for dinner while Lloyd Blankfein gets served for dinner on a spit up on the Hill?
Ironically, if it were just a question of a given institution's loyalty to the Democratic Party, Goldman -- and indeed Lehman Brothers and Bear Stearns as well -- would have clearly had the inside edge. As shown below, these investment firms clearly preferred Democrats over the long haul.
Indeed, to paraphrase Senator Levin, in Goldman's case especially, the Democratic Party appears to have "put its own interests and profits" first, basically turning a blind eye -- at least so far -- to the substantially larger potential misbehavior of the "big four."
Meanwhile, when President Obama traveled to New York two weeks ago to give a speech on the urgent need for financial reform, the peripatetic Mr. Dimon could be found in Chicago. He was rumored to have met with CME and/or Board of Trade executives to prepare to invest in an exciting new "derivatives exchange," should JPMorgan need to transfer its substantial share of that business -- several times Goldman's market share, even in credit derivatives -- to an open exchange.
JOKERS TO MY RIGHT
So all this concentration of political and economic power in US financial markets would appear to make a strong prima facie case for serious structural reform, perhaps even along the lines of the Brown-Kaufman amendment, n'est-ce pas? Unfortunately, no.
As we argued earlier, that amendment sets very crude targets that bear little immediate relationship to bank misbehavior or even political influence. At worst, the caps might just force bad behavior like risky derivatives and hedge fund investing offshore. And the bill's current caps would, at best, just force banks like Citi, JPM, and BankAmerica to shed less than 10 percent of their market shares, setting them back to -- say -- 2005 levels.
In other words, they're not a substitute for effective regulation. But that puts us back in the chicken-egg problem with "regulatory capture."
My own particular solution to these dilemmas is suggested by the following chart -- although it also suggests that the most opportune time to implement it has already come and gone. In terms of the current banal American political discourse, it would probably be quickly dismissed as "socialist," although that term is such a catch-all that it has really become virtually useless, except as a device for red-baiting timid liberals.
THE CHILEAN MODEL
So don't take my word for it; let's ask the ghost of Chile's General Pinochet, whom I'm quite certain no one ever accused of being a "socialist," at least not to his face. For years he was best known among economists as one of the key political proponents of Milton Friedman's so-called "Chicago School" of ultra-free market economics. But in February 1983, during a severe crisis when all the banks in Chile failed, Pinochet showed that he could be quite pragmatic -- with a little arm-twisting from leading US banks, which threatened to cut off his trade lines if he didn't nationalize the banks' debts.
So, after swearing up and down that private debts and private banks would never be nationalized, Pinochet's government did so. Three to six years later, after restructuring the banks and cleaning them up, and privatizing their substantial investments in other companies, they were sold back to the Chilean people and the private sector -- for a nice profit. (Similar policies were also followed by "socialist" Sweden in the case of a 1990s banking crisis, but the Pinochet example provides a more instructive example for so-called conservatives. Much earlier, General Douglas MacArthur, a lifelong Republican, also employed similar pragmatic tactics in restructuring Japanese banks in the early 1950s.)
Now this is the plan that the US Treasury (under Paulson and then Geithner) might have adopted in the Fall 2008 - Spring 2010, if only it had not been so hide-bound -- and in the case of the Obama Administration, so wary of being termed a "socialist."
In hindsight, the economics of such a pragmatic temporary government takeover and reprivatization would have been compelling. At its market low in March 2009, the combined "market cap" of the "big four" banks was just $120 billion -- including $5 billion for Citi and $15 billion for Bank of America. This was a mere fraction of the capital and loans that were ultimately provided to them. (At that point Goldman's market cap had fallen to $37 billion from $80 billion a year earlier -- not as steep a decline as the giants, but clearly no picnic for its shareholders, either.)
Only a year later, while the "demon bank" Goldman has recovered to more or less where it was in June 2008, before the crisis, the market cap of the "top four" US banks is now nearly six times higher than its low in March 2009, and, indeed, at an all time high -- well above both previous peaks.
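A quick arithmetic sketch of the recovery just described, using the rounded trough figure quoted above and an approximate recovery multiple (the "nearly six times" is taken from the text; the exact multiple is my own illustrative assumption):

```python
# Rough arithmetic behind the "big four" recovery described above.
# The trough figure comes from the text; the recovery multiple is approximate.

trough_market_cap = 120e9          # combined "big four" market cap, March 2009
recovery_multiple = 5.5            # "nearly six times" the March 2009 low

current_market_cap = trough_market_cap * recovery_multiple
gain = current_market_cap - trough_market_cap
print(f"Implied gain: ${gain/1e9:.0f} billion")   # on the order of $500+ billion
```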
Too bad the US taxpayers have only captured a small fraction of that $500 billion industry gain.
Too bad the US Treasury hasn't exercised strong "socialist" control over these institutions, changing the way they behave directly, and restructuring them in the interests of the economy as a whole before selling them back to the private sector.
Too bad that "big four" lobbyists are now back in force on the ground in Washington DC, influencing the fine print of the "financial reform" bill in ways that we will probably only understand years hence. Despite its woes, undoubtedly this will be a bumper year for political spending by the financial services industry.
Of course, President Obama IS now being widely demonized as a "socialist" -- anyway.
***
(c)JSH, SubmergingMarkets, 2010
Thursday, April 01, 2010
ORDINARY INJUSTICE Even Beyond Guantanamo, Rendition, and Torture, the US Criminal (In)Justice System Is a National Disgrace James S. Henry
In 1840, Tocqueville, otherwise usually an astute observer of American society, proclaimed that "there is no country where criminal justice is administered with more kindness than in the US."
In the modern-day “Law and Order”/ Perry Mason made-for-TV version of this story, the US is still viewed by many as having, in author Amy Bach’s words, “the world’s finest criminal justice system.”
Certainly this is the preferred self-image when, as it is wont to do, the US criticizes the quality of criminal justice in other countries.
In this picture, juries take their independence seriously and fight tooth and claw for the truth; parole officers and prison wardens are all deeply committed to "correction."
Public defenders are not only thoroughly informed about the latest nuances of criminal law, but also work tirelessly to ensure that each and every defendant has his day in court.
Fortunately, Ms. Bach's new book, the product of seven years of first-hand research in the bowels of the state and local court systems of New York, Georgia, Mississippi, and Chicago, focuses on "ordinary injustice" -- the routine failure of judges, prosecutors, and defense attorneys as a community to deliver on the Constitution's basic promises.
EXCEPTIONS?
Tocqueville was not alone in his naivete'. Initially, the sheer amount of attention given to criminal justice in the US Constitution as well as state constitutions led many observers to expect that the US really might be distinctive.
Indeed, criminal rights are the subject of Article I's explicit reiteration of habeas corpus, plus four of the first ten amendments (known collectively as the "Bill of Rights"), and their extension to states and non-citizens by the XIVth Amendment.
Many of the exceptions have occurred in times of war or perceived security threats -- for example, the Sedition Acts of 1798 and 1918, the World War II internment of Japanese-Americans, the frequent persecution of labor unions, civil rights workers, and Left-wing dissidents from the 1880s right up through the 1970s, the 2001 Patriot Act, the NSA's illegal spying program, and the systematic mistreatment of "enemy combatants" at Guantanamo and elsewhere.
Other exceptions have involved the application of "Jim Crow justice” to native Americans, Afro-Americans, and other minorities.
Overall, however, most legal scholars have treated these episodes as abnormal deviations. In the long run, the system as a whole is supposedly always improving, always trying to do the right thing.
On this theory, the US Constitution and the courts that interpret it are a kind of homeostatic machine, with built-in stabilizers that eventually prevent any serious rights violations from becoming permanent.
THE REALITY: FAST-FOOD JUSTICE
It is also conceivable that "path dependency" and "feedback loops" in the legal system may be destabilizing. The erosion of rights in one period may increase the chance that rights continue to erode later on.
Critics of the conventional view have also argued that rich people and poor people -- including the indigent defendants who now account for about 70 to 90 percent of all felony cases -- essentially confront two very different US criminal justice systems, especially in state and local courts.
Meanwhile, 90 percent of criminal defendants soon learn the hard way that their nominal "rights" consist of one brief collect call from a jail cell, followed by a tango with an alliance of police, prosecutors, and public defenders whose shared objective is to talk them into pleading guilty.
As Clarence Darrow said in his 1902 address to the inmates at the Cook County Jail, "First and foremost, people are sent to jail because they are poor." And as the American Bar Association -- not usually aligned with wild-eyed radicals -- reiterated in 2004, "The indigent defense system in the US remains in a state of crisis."
DETAILS FROM THE FRONT
In doing so, she tackles one of the main challenges that confronts any investigator who seeks to understand how the criminal justice system really works. This is the fact that "ordinary injustice," while pervasive, is very hard to observe without detailed, painstaking field work.
✔ For example, in her book we meet a Troy, New York, city judge who routinely fails to inform defendants in his court of their rights to counsel, imposes $50,000 bail for $27 thefts and $25,000 bail for loitering, and enters guilty pleas for defendants without even bothering to tell them.
✔ We meet a Georgia public defender who runs a "meet 'em, greet 'em, and plead 'em" shop that delivers just 4 trials in 1,500 cases, with guilty pleas entered in more than half of these cases without any lawyer present or any witnesses interviewed.
✔ We meet Mississippi prosecutors who are so concerned about their win/loss records and reelections that they simply "disappear" all the harder-to-prosecute cases from their files.
✔ We meet a Chicago prosecutor who allows two innocent young people to sit in jail for 19 years before he finally works up the gumption to examine the relevant DNA evidence. This new evidence not only cleared them, but it also helped to disclose a much larger police conspiracy.
✔ Ms. Bach also reminds us of the unbelievable 2001 case before the Fifth Circuit Court of Appeals (Texas) where the court labored hard to overrule a lower court decision that would have permitted a defendant on trial for his life to receive the death sentence, despite the fact that his attorney had been fast asleep through much of the trial.
PATTERNS
Amy Bach's book is more than just a series of such horror stories, however. By doing painstaking legal anthropology in multiple locations, she's been able to go beyond the limits of the typical one-off journalistic exposé about the courts. (See, for example, A, B, and C.)
Bach's focus is on identifying recurrent patterns of misbehavior. These patterns were unfortunately not "exceptional" at all, but routine and widespread.
Most important, her research underscores the fact that ordinary injustice is not just due to isolated "bad apples." There is a system at work here. Indeed, injustice thrives on a culture of tolerance for illegal practices cultivated in whole communities of lawyers, judges, and police over many years. This culture, and the "fast food" plea bargaining that it facilitates, are at the root of all her cases.
Unfortunately Ms. Bach offers no real solutions to the problems that she has described so well. She ends up leaning rather heavily on a fond hope that "new metrics" will be developed to measure how well individual courts actually deliver "justice" -- sort of the legal equivalent of "No Child Left Behind."
There may be something to this. But in my experience, metrics, whether in education or judicial policy, are the last refuge of the policy wonk. They will undoubtedly be a long time coming. This is partly because of budget constraints. But it is also because if the metrics are really worth a damn, they will provoke stiff resistance from the very same bureaucratic interests that Ms. Bach had to overcome in her own research.
Pending the dawn of this brave new world of metrics, I suspect that we will just have to depend on a handful of dedicated lawyers, investigative journalists, and creative legal scholars like Ms. Bach to keep an eye on the courts, root out what’s really going on, and insist that all of the rights we have on paper and take for granted are still around when we really need them.
ROOT CAUSES
So where does "ordinary injustice" come from, and what can we do about it? Fundamentally, as noted, the kind of ordinary injustice described by Ms. Bach basically exists because of the "fast food" plea bargaining system. But as she also recognizes, it would be a waste of time to outlaw this directly. This is because the plea bargaining treadmill basically derives from the unsuccessful attempt to reconcile several deeply-inconsistent public demands.
First, 9/11, the war on terror and GWB notwithstanding, most Americans still fundamentally believe in freedom. Most of us still want to preserve the Bill of Rights -- at least on paper.
Second, we all want to save money -- especially in these times. Implementing full-blown adversarial trials in every case would be very costly. While taxpayers value human rights, they're not all frothing to pay a whole lot for them. This is partly just because at any given point in time their value is a little abstract -- like health insurance before you become ill.
Of course the truth is that the "fast food" system is anything but cheap. The entire system -- courts, prisons and police -- now costs US taxpayers over $250 billion a year. That figure has been growing like Topsy -- it is now at least three times the 1990 level.
Over 80 percent of these costs are borne by the hard-pressed state and local governments. Most of the funds are digested by police and prisons; courts only account for about one fifth. Even so, it is far from clear that ordinary taxpayers -- most of whom never expect to see the inside of a criminal court or jailhouse themselves -- would be willing to pay anything more to help defend the poor or curb ordinary injustice.
Third, what US taxpayers do care about, at least until now, is “fighting crime,” especially drug-related and lower-level street crime. Ever since the 1970s, these have been the fastest growing contributors to system-wide criminal justice costs.
For many taxpayers, under the influence of thirty years of campaign propaganda from the "war on drugs" industry and "tough-on-street-crime" politicians, this has usually been reduced to "lock 'em up and throw away the key, as fast as possible."
The result is that today, in the US, the number of inmates in our local jails and state and federal prisons is at an all-time high: over 2.3 million, 6.8 times the number in 1974.
This means the US has the highest per capita incarceration rate in the world. It is 754 per 100,000, higher than Russia (610), Cuba (531), Iran (223), and China (119), let alone developed countries like the UK (152), Canada (116), France (96), Germany (88), and Japan (63).
Indeed, southern states like Louisiana (1138), Georgia (1021), Texas (976), Mississippi (955), Oklahoma (919), Alabama (890), Florida (835), and South Carolina (830) have distinguished themselves with even higher rates -- by far the highest rates of incarceration in the world.
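These per capita figures are simply inmates per 100,000 residents. A quick sketch using the national total cited above (the population figure is my own rough assumption for illustration):

```python
# Incarceration rate = inmates per 100,000 residents.
# The population figure is an approximate assumption for illustration.

US_INMATES = 2_300_000        # jail + state + federal prisoners, per the text
US_POPULATION = 305_000_000   # approximate late-2000s US population

rate = US_INMATES / US_POPULATION * 100_000
print(f"{rate:.0f} per 100,000")   # ~ 754 per 100,000
```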
This alone helps to explain the fact that the annual cost of all US prisons now exceeds $80 billion a year. Indeed, the annual cost of warehousing prisoners in California and New York prisons is at least $50,000 per year per prisoner -- much more than the cost of providing them with full-time jobs outside! In addition, in the US, there are over 9 million former prisoners who are now outside prison. More than 5.1 million others remain under supervision, on parole or probation.
All told, the US now has more than 11.3 million past and present inmates. This is the world's largest domestic criminal population, an incredible 23.5 percent of all current prisoners in the world. No doubt the sheer scale of our "criminal industry experience curve" gives us at least one clear national competitive advantage -- in crime.
Indeed, because of our propensity to throw people in jail regardless of what becomes of them there, we now account for over a third of the entire world's living past and present prisoners. Not surprisingly, this also affords us by far the most costly judicial and corrections systems that the world has ever seen.
For all these costly incarcerations, and despite the vast sums and short-cuts associated with processing all of these millions through the pipeline as rapidly as possible, there is not one speck of evidence that this system has contributed one Greek drachma to falling crime or safer streets.
Indeed, the best evidence is just the opposite. Over two-thirds of US offenders who are released from prison are likely to be re-arrested within three years. Reactionary voices may argue that this just shows we should hold more of them longer, a sure recipe for system bankruptcy. What it really shows is the complete lack of any real "correction" or retraining in most US prisons. The system that the entire criminal justice machine works so hard to get people into as fast as possible has become the world's largest training ground for serial offenders.
In short, if we really want to understand the roots of "ordinary injustice," as well as the intense pressure that each and every player in the US criminal justice system feels to cut corners and slash costs each and every day, we need to look no further than this self-perpetuating failed prison state-within-a-state.
After all, this particular failed state already has a total population of current inmates and former inmates under supervision that is greater than Somalia's!
***
Friday, October 02, 2009
Pittsburgh's State of Siege
Suppressing Dissent With High-Priced Cop Toys
James S. Henry
You didn't hear much about it from any major US news organizations, but there was a very disturbing case of gratuitous police-led violence and intimidation at the G20 Summit in Pittsburgh on September 23rd-25th, 2009. Perhaps the only consolation is that it allowed those of us who were there to get a close look at some of the disturbing "brave new world" technologies for anti-democratic crowd control. These were initially developed by the US military to fight terrorists on the high seas and abroad, in places like Afghanistan, Somalia, and Iraq, but are now coming home to roost. Indeed, ironically enough, this is one of the few remaining global growth industries where the US is still the undisputed world leader, as we'll see below.
One local newspaper account described the events at the Pittsburgh G20 as a "clash" between the police, protesters, and college students.
Indeed, a handful of storefronts were reportedly broken on Thursday September 24 by a few unknown vandals.
However, based on our own visit to the summit, interviews with several students and other eye witnesses, and a careful review of the significant amount of video footage that is available online, the only real "clash" that occurred in Pittsburgh on September 23-25, 2009, was between lawless policing and the Bill of Rights.
The most aggressive large-scale policing abuses occurred from 9 pm to 11:30 pm on Friday September 25th near Schenley Park, in the middle of the University of Pittsburgh campus. This was miles away from the downtown area where the G20 had met, and, in any case, it was hours after the G20 had ended.
This particular case of aggressive policing -- "Hammer and Anvil," as the operation was described on police scanners -- was clearly not just a matter of a few "bad apples."
Rather, it appears to have been part of a willful, highly-organized, one-sided, rather high-tech experiment or training exercise in very aggressive crowd control by nothing less than a really scary uniformed mob.
New York police sometimes describe their firemen counterparts, tongue in cheek, as "robbers with boots." In this case we have no hesitation at all in describing this uniformed mob in Pittsburgh as "assailants with badges."
Their actions resulted in the unlawful suppression of the civil rights of hundreds of otherwise-peaceful students who were just "hanging out with their friends on a Friday night in Oakland," or attending a free jazz/blues concert in Schenley Park.
Essentially they got trapped in a cyclone of conflicting and inconsistent police directives to "leave the area." The result was nearly 200 arrests, gassings, beatings, and the deployment of dogs and rubber bullets against dozens of innocent people.
In addition to the students, this aggressive policing also assaulted the civil rights of a small number of relatively-peaceful protesters and quite a few ordinary Pittsburgh residents, most of whom were as innocent as bystanders can possibly be these days.
Why did this occur? In addition to whatever top-down "experiment" or training action was being conducted, there appears to have been an extraordinary amount of pent-up police frustration and anger. For example, one student overheard a policeman piling out of a rented Budget van near Schenley Park around 9:50 PM Friday.
The officer was heard to exclaim, "Time to kick some ass!"
This is disturbing, but perhaps not all that surprising. After all, thousands of police had basically stood around for days in riot gear, sweltering in the "Indian Summer" heat, dealing with the tensions associated with potential terrorist attacks as well as all the hassles of managing large-scale protest marches, even if peaceful. There were also the inevitable tensions of social class and culture among police, Guardsmen, and college students.
On the other hand, precisely because such tensions are so predictable, those in direct command or higher political office, and, indeed, University officials, should have acted forcefully to corral them.
JOIN THE CLUB
All this means that Pittsburgh has unfortunately now joined the growing list of cities around the world that have experienced such serious conflicts -- mainly in connection with economic summits or national political conventions.
The list of summit frays includes this summer's G-8 in Italy, last spring's G20 in London, the September '08 RNC in Minneapolis, the '04 RNC in New York City, Miami's Free Trade Area of the Americas Summit (11/03), Quebec (4/01), Naples (3/01), Montreal (10/00), Prague (9/00), Washington D.C. (4/00), the November '99 WTO "Battle in Seattle," the J18 in London (6/99), Madrid (10/94), and Berlin (9/88).
President Obama had originally selected Pittsburgh for the G20 because he hoped to showcase its recovery since the 1980s, especially in the last few years, under a Democratic Mayor, in a Democratic state that he barely carried in the 2008 Presidential contest.
In seeking to explain such events, therefore, it always helps to keep a firm eye on the question -- whose interests did this really serve?
In retrospect, the failure of these leaders to control the police at the G20 has created a serious blemish on the city's reputation for good government. It may have also to some extent undermined Obama's relations with college students and other activists who worked so hard for his election in this key state. And it certainly did not help the reputation of the Democratic Party in Pittsburgh or Pennsylvania at large.
TIANANMEN FLASHBACKS
To journalists like me who happened to have been in Beijing in May 1989, during the buildup to the June 4th massacre in Tiananmen Square, Pittsburgh also bears an interesting resemblance. The analogy may sound a little strained, but bear with me.
(1) As in Beijing, there was a very large deputized police force from all over the country. These included over 1000 police "volunteers" (out of 4000 total police and 2500 National Guardsmen) who were brought in just for the G20.
According to the conventional wisdom, not being from the same community is likely to reduce your inhibitions when it comes to macing and kicking the crap out of unarmed, defenseless young people.
The guest policemen also included several hundred police who were under the command of Miami Police Chief John F. Timoney, pioneer of the infamous "Miami model" for suppressing protest that was first deployed at the Miami Free Trade Area of the Americas Conference in November 2003. (Here's the Miami model checklist, most of which was repeated in Pittsburgh.)
As one writer has observed, Timoney, who also served as Police Chief in Philadelphia, "(L)iterally transformed the city into a police state war zone with tanks, blockades and 'non-lethal' (but severely damaging) artillery."
It is unclear to what extent he played a similar role behind-the-scenes in Pittsburgh this year, but there certainly is a strong sulfurous odor.
(2) As in Beijing, in Pittsburgh there were no identifying badges on officers' uniforms, and officers also refused to provide any identifying personal information in response to questions. Several photographers also complained about receiving threats and actual damage to their cameras.
(3) As in Beijing, there was simply no direct contest between the power of the security forces once they mobilized, and that of the unarmed students. The only kind of victory that the students could possibly have won in both cases was a moral one -- by essentially sacrificing their bodies and their rights to a tidal wave of repression.
Indeed, the "clash" theory of these events looks even odder once we take into account the fact that on Friday night in Pittsburgh, for example, unarmed students and protesters faced hundreds of police in full riot gear, armed for bear and equipped with muzzled attack dogs, gas, smoke canisters, rubber bullets, bean-bag shotguns, pepper pellets, long-range pepper spray, at least four UH-60 Black Hawk helicopters (courtesy of New York Governor Paterson and his National Guard's 3-142nd Assault Helicopter Battalion), plus several brand new "acoustic cannons" (see below). There were also probably dozens of undercover agents provocateurs -- at least three of whom were actually "outed" by the students.
The police were also actively monitoring student communications on web sites like Twitter.
From this angle, a key difference with Beijing in 1989 was that the Chinese authorities felt genuinely threatened by the growth of student power and the democracy movement, feared being ousted from power, and were therefore able to justify their brutality as part of a zero-sum game. In the case of Pittsburgh, whatever police violence occurred was entirely gratuitous.
"I hereby declare this to be an unlawful assembly. I order all those assembled to immediately disburse. You must leave the immediate vicinity. If you remain in this immediate vicinity, you will be in violation of the Pennsylvania crimes code, no matter what your purpose is. You must leave. If you do not disburse, you may be arrested and/or subject to other police action. Other police action may include actual physical removal, the use of riot control agents, and/or less lethal munitions, which could risk of injury to those who remain."
The fact is that this warning was itself completely unlawful. Putting on the NYCLU lawyer's hat for a moment, absent a "clear and present danger" to the public peace, these threats violated the First Amendment's explicit recognition of the right to "peacefully assemble."
In effect, the fact is that the police and National Guard in Pittsburgh temporarily seized control over public streets, parks, and other public spaces, and exercised it arbitrarily. By the time the victims of these outrageous civil rights infringements have their day in court, the damage will have been long since done.
(5) As in Beijing, the police and military decided to launch their biggest raid late at night, after the summit had ended, most major media had gone home, and the courts had closed for the weekend.
GLOBAL COP TOYS
Police behavior at all these global summits has evolved over time into a rather high-tech affair that would make Iranian crowd-control experts turn bright green with envy.
For example, last week's G20 featured one of the largest US deployments ever against civilian demonstrators of "LRADs," or acoustic cannons. Manufactured by San Diego's tiny American Technology Corporation (NASDAQ: ATCO), the $37,500 so-called "500X" version of the sound cannon that was used in Pittsburgh was developed at the behest of the US military, reportedly in response to the USS Cole incident in 2000, to help the Navy repel hostile forces at sea.
Until recently the most widely-publicized use of LRADs had been against Somali pirates. The devices have also been deployed against "insurgents" by the US military in Fallujah, by the increasingly-unpopular, anything-but-democratic regime of Mikhail Saakashvili in the Republic of Georgia, and by New York City at the RNC in 2004.
These sophisticated "phased array" devices emit a targeted 30-degree beam of 100+ decibel sound that is effective up to several hundred yards, and is potentially very harmful to the human ear. The Pittsburgh units were apparently purchased by local sheriffs' departments across the country with the help of recent grants from the US Department of Homeland Security. Officially the grants have been justified in the name of improving communications with the public, by permitting clearer voice channels (!), but that's a cover story -- the true purpose is crowd control. (Roll tape: LRAD-500X_SDCo_Sheriff1.) Other recent ATCO customers include the US Army (for "force protection" in Iraq and Afghanistan), and the US Navy and the navies of Japan and Singapore, for communicating with potentially-hostile vessels at sea. In 2008 ATCO flogged its wares at the biannual China Police Forum, Asia's largest mart for police security equipment. Obviously China would make a terrific reference customer, since it is one of the global front-runners in the brutal suppression of mass dissent. ATCO also has a 2007 contract with the US Marine Corps' "Joint Non-Lethal Weapons Program" to develop new, even more powerful weapons, euphemistically branded "acoustic hailing devices." Just two weeks before the Pittsburgh G20, they turned up in San Diego, where the Sheriff's Department provoked controversy by stationing them near a Congressional town hall forum -- just in case.
This growing use of LRADs for domestic crowd control in the US is worrisome, not only because it is a potent anti-civil liberties weapon, but also because -- just like tasers, rubber bullets, OC gas, and other so-called "non-lethal" (actually just "less lethal") weapons -- it can cause serious injuries to ears, and perhaps even provoke strokes.
For all the homeland security technology buffs in the audience, you may rest assured that LRADs are hardly the only potential "less-lethal" free speech-and-assembly killers in the pipeline. In the last decade the non-lethal weapons arena has exploded, and the US appears to be far ahead, assisted by ample R&D grants and purchase contracts from organizations like the Department of Justice's National Institute of Justice, DHS's multi-billion dollar Homeland Security Grant Program, the U.S. Coast Guard, the Homeland Security Advanced Research Projects Agency, and DOD's Joint Non-Lethal Weapons Directorate (JNLWD) program. The industry has also been aided by key contractors like ATCO, spearheaded by legendary engineer, inventor, and entrepreneur "Woody" Norris, and by Penn State's Advanced Research Lab -- home of the Institute for Emerging Defense Technologies. NIJ also works closely with police organizations like PERF, and with international organizations like the UK's Home Office Scientific Development Branch.
In the first instance, the development of such non-lethal technologies is usually justified by their potential for providing an alternative to heavier weaponry, thereby reducing civilian casualties in combat situations. The fact that the US military now has at least 750 bases around the world, and has also recently been playing an important "military policing" role in countries like Somalia, Haiti, Bosnia, Iraq, and Afghanistan, underscores DOD's rationale for these technologies.
The problem is that, just as in the case of the LRAD, once such technologies are developed, it is very difficult to wall them out of the US, or to restrict them to "pro-civilian/pro-democratic" uses, like providing clearer amplification for outdoor announcements.
Even aside from their technical merits, the competitive nature of the global law enforcement equipment industry virtually ensures that every tin-horn US sheriff, as well as every Chinese party boss in Urumqi, will soon have access to the very latest tools in the arsenal for suppressing dissent. The ultimate irony, of course, is that the first generation of all these powerful new free-speech suppressors has been developed not by authoritarian China, Iran, Burma, or North Korea, but by the US, ostensibly still the leader of the "Free World."
TOYS IN THE PIPELINE
So what's in store for those who are on the front lines of popular dissent? We assume that some of the juiciest details are classified. But even a cursory review of public sources reveals that the following new crowd-control technologies may soon be coming to an economic summit near you. (See this recent UK review for more details.)
▣ "Area Denial Systems." This is a powerful new "directed-energy" device that generates a precise, targeted beam of "millimeter waves," producing an "intolerable heating sensation on an adversary's skin." Under development by the US military since at least the late 1980s, this class of "non-lethal" weapons is now close to field deployment. Its key advantage over LRADs is that it has about ten times the range. Raytheon is already supplying its "Silent Guardian" version of the system to the US Army. The next step required to bring this product to the police market will be to make it smaller and more mobile. According to this week's New Scientist, a new highly-portable, battery-powered version of the system, called the "Thermal Laser," will soon become available -- though it has yet to demonstrate conclusively that it is within the bounds of the UN Binding Protocol on Laser Weapons.
▣ New Riot-Control Chemicals and Delivery Systems. Subject to the dicey question of whether these new "calmative," drug-like agents fall outside the boundaries of the 1993 Chemical Weapons Convention (to which the US and 187 other countries are signatories), these would not irritate their targets, unlike pepper spray or tear gas, but calm them down. Penn State's Advanced Research Lab, located 135 miles east of Pittsburgh, has been especially active in advocating the advantages of such new chemical weapons.
▣ Glue Guns. If all else fails, the UK's Home Office reports that another approach to "less-lethal" crowd control weaponry is also making progress -- a gigantic glue gun that sprays some 30 feet, commingling its target audience into one huge adhesive dissident-ball. Apparently still unsolved is the question of precisely what becomes of all those who are stuck together, or how the police avoid becoming entangled with them. But undoubtedly millions of pounds are being devoted to solving these issues even as we speak.
SUMMARY
I went to Pittsburgh last week on behalf of Tax Justice Network, a global NGO that is concerned about the harmful impacts that tax havens and dodgy behavior by First World banks, MNCs, lawyers, and accountants are having, especially on developing countries. I was under no illusion that the reforms we were rather politely advocating would quickly be adopted, but at least we'd say our piece, if anyone cared to listen.
I came away with the depressing sense that the G20 summit, like its many predecessors, was never intended to be a listening post for independent, outside opinions. Even worse, it has actually become, in practice, an excuse for the criminalization of dissent in capital cities all over the globe, even in those that are nominally the most free, by way of the vast new security measures that it requires and subsidizes, and the repressive tactics that it legitimizes.
In this day and age, of course, we are told that almost any amount of security is too little. And this heightened sense of insecurity is certainly not aided by having the world's top 20 leaders regularly shuffling from pitstop to pitstop, trying to conduct the world's business from a traveling roadshow. But I was struck by just how unnecessary, senseless, and counterproductive almost all of the repressive policing tactics deployed in Pittsburgh really were -- how they ran roughshod over many of our most precious freedoms, freedoms that we are supposedly trying to protect. And by the degree to which whatever "terrorists" are out there have already won, by succeeding in creating a society that really is often ruled by fear instead of justice, by force instead of discourse.
Rather than, say, simply allowing the overwhelmingly non-violent demonstrators and students at that peaceful Friday night blues concert to have their say, some 200 people were arrested and scores were gassed, clubbed, rubber-bulleted, and imprinted with galling memories that will last a lifetime. The City of Pittsburgh and its residents will certainly be fighting criminal cases and civil rights lawsuits for years to come. I suppose we are meant to be consoled by the fact that, as the New York Times chose to emphasize this week, things are much more repressive in Guinea.
So perhaps it is time to establish a permanent location for all these global summits. Perhaps one of the Caribbean tax havens, like Antigua or St. Kitts, would do -- journalists always like the sun, and after TJN gets done with them, these havens are going to need to find a new calling anyway!
***
Saturday, October 11, 2008
HANK PAULSON'S OCTOBER REVOLUTION Why This Republican Ex-Banker Has Decided to (Partially) Socialize Our Entire Banking System James S. Henry
"We have made changes, Sire. Yes, it is true, we have made changes. But we have made them at the right time. And the right time is, when there is no other choice."
-- Conservative adviser to King Edward VII, explaining his support for liberal reforms
We may have just reached a critical turning point in American political economy -- not only in our efforts to overcome the burgeoning global banking crisis, but also to overcome the pernicious influence of free-market fundamentalism, which has dominated US economic policy for the last 30 years.
Ironically enough, the person who deserves more credit than anyone else for helping us to reach these goals is our current Treasury Secretary, a lifelong die-hard Republican and former Wall Street king-pin.
Last night, a few hours after the US stock market closed, the Bush Administration, embodied in Henry M. Paulson, Jr., announced that in order to stem the continuing turmoil in capital markets, in conjunction with other G-7 countries, the US federal government will begin "as soon as we can" to use taxpayer money to buy preferred equity in private financial institutions, especially banks.
Depending on how it is implemented, this could very well amount to a partial nationalization of the entire US banking system by the US Government. Are you paying attention here, Hugo, Fidel, and Evo?
This marks a very sharp U turn in recent US policy. Indeed, just two weeks ago, in their September 23rd testimony before Congress, Paulson and Federal Reserve Chair Ben Bernanke dismissed such equity investments as a "losing strategy," compared to their preferred scheme, a government-run "reverse auction" to buy up to $700 billion of "toxic" bank assets.
HAIL MARY
This policy U-turn was not due to sudden new insights generated by careful academic analysis or some precise economic model.
It feels more like a Hail-Mary pass, coming at the end of one of the most disastrous weekly stock market performances in US and global history.
That, in turn, was preceded by ten exhausting days of political goat-rodeo and Congressional negotiations over the infamous "$700 billion bailout," on top of the preceding six exhausting months of more or less ad hoc, increasingly expensive but largely unsuccessful one-off interventions in money markets and the banking system by the US Treasury, the Federal Reserve, and a myriad other US bank regulators.
Meanwhile, there has been an even more quirky set of poorly-coordinated improvisational remedies administered by diverse regulators in the UK, Germany, France, Iceland, and Belgium.
At the end of all this, fear, uncertainty, and doubt (FUD) have continued to spread across global capital markets, as policymakers stumble into each other, national banking systems compete for deposits, the US Treasury becomes (ironically) a huge depository for safe-haven flight capital, and no one manages to get ahead of the crisis.
If the FUD continued to spread, and global credit remained on lock-down, the forthcoming global recession -- already likely to plunge real growth in Europe and the US to zero or less next year, China to 6 - 8 percent or less, and the rest of the developing world to 4-6 percent -- would become dire indeed.
So one answer to the riddle of Paulson's "sudden conversion" is that he simply had no alternative. Given that ad hoc bailouts, coordinated interest rate cuts, increased deposit insurance, and the extension of government insurance and liquidity to money market funds and the commercial paper market, on top of the takeovers of AIG and Fannie/Freddie, had not done the job, nationalization -- really internationalization, since global banks are involved, and other countries will presumably be asked to contribute -- is one of the few arrows left in his quiver.
This will hardly be the first US experience with quasi-nationalization. Indeed, on September 16, the Federal Reserve had effectively "nationalized" the giant insurance company AIG, acquiring 80 percent of its equity in exchange for an $85 billion loan. And on September 7, the US Government announced that it had formally taken over Fannie Mae and Freddie Mac, the world's largest players in the "secondary" mortgage market, with more than $1.6 trillion of assets. All told, these are probably the largest nationalizations anywhere in human history.
Way back in the 1930s and 1940s, the US Reconstruction Finance Corporation seized and recapitalized many banks. The FDIC has also done this many times since then.
European governments have even longer histories of direct intervention in banking markets. And several of them moved almost too quickly in the last year to nationalize particular banks -- for example, the UK's Northern Rock in February 2008 and Fortis in September 2008.
Of course, most of these recent cases have involved failing institutions, where the government was a "lender of last resort." As discussed below, Paulson's plan is rather different.
Even farther back, in the early 19th century, states like Virginia and Pennsylvania often invested directly in state-chartered banks to set them up and keep them going. Those were not especially happy experiences with government ownership.
But this is hardly a great time for champions of private capital markets to be quibbling about the efficiency costs of government intervention -- private markets in the US alone have just lost $7 trillion of market cap in the last year, including $3 trillion in the last 3 weeks. And the global "opportunity cost" of the crisis is probably at least twice this high.
A GLOBAL RECOVERY FUND?
If done right, Paulson's PIP (Public Investment Program) will be much broader, more proactive and more innovative than previous bank nationalizations.
For example, one idea would be to establish a "Global Recovery Fund," permitting fresh private capital, "sovereign wealth funds" like those in Norway and the UAE, and European, Latin and Asian countries that have a clear stake in restoring the world's financial sector to health to invest alongside the US Treasury.
Even, Heaven help us, the IMF and the World Bank's IFC might participate in such a fund. They have run out of developing country crises to solve, are looking for a new role, and have $billions in untapped credit lines.
Such an approach would help to share the heavy burden placed on US taxpayers, and make this program more politically palatable than the TARP bailout proved to be.
A global fund would also help to diversify investment risks across many more countries and banks.
Indeed, the USG and its new partners might even become lenders of far-from-last resort, clearing the way for threatened but essentially-healthy institutions to survive the financial contagion, raise much more private capital as well, and, most important, turn each and every new $1 of equity into $10 to $15 of new lending.
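To see where a multiplier like that comes from, here is a minimal back-of-the-envelope sketch in Python. The 7 and 10 percent capital ratios are illustrative assumptions, not figures from this article; the point is simply that lending capacity scales roughly as new equity divided by the required capital ratio.

```python
# A minimal sketch of the leverage arithmetic behind turning "each new $1 of
# equity into $10 to $15 of new lending". The capital ratios are illustrative
# assumptions, not figures from the article.

def new_lending_capacity(new_equity_bn: float, capital_ratio: float) -> float:
    """Rough upper bound on the lending supported by an equity injection."""
    return new_equity_bn / capital_ratio

for ratio in (0.10, 0.07):   # roughly 10% and 7% equity per dollar of assets
    capacity = new_lending_capacity(100.0, ratio)
    print(f"$100bn of new equity at a {ratio:.0%} capital ratio -> "
          f"roughly ${capacity:,.0f}bn of new lending capacity")
```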
If the fund is successful in reviving the world's financial system, and restoring banks to financial health, taxpayers and investors will no doubt all be paid back handsomely. But the most important benefits may be the "hidden" ones -- the catastrophic losses that would be avoided by preventing further chaos and market decline.
This is very different from Paulson's original TARP buy-back scheme, which promised to boost bank equity informally by way of overpaying for toxic assets with highly-uncertain values.
Ironically, that approach just rewards those companies with the very worst portfolios and lending practices, while enabling much less increased lending.
Indeed, TARP's only comparative advantage seems to have been that by avoiding direct government investment in the private sector, it did not violate any red lines of so-called free-market conservatives.
In hindsight, however, given TARP's birth pains, plus the fact that the market value of all US publicly-traded stocks fell from $12.9 trillion on September 19 to $9.2 trillion in the three weeks after Paulson Plan I was announced, respecting this neoliberal ideological taboo may well have cost US investors -- most of whom are taxpayers -- at least $1 to $2 trillion of market value that might have been saved with an immediate recapitalization plan.
With that much extra dough, we could almost afford to wage another Iraq-scale war somewhere.
The PIP program faces many challenges. It needs careful guidelines about how to value investments, which banks will be eligible, and how they will be incented to participate. There also need to be controls on the propensity of Treasury officials to have "revolving door" relationships with the companies they are investing in.
It is also vital to focus on the program's central objective -- a temporary investment to stabilize the financial system, returning the investment (hopefully with gains) to the Treasury as soon as possible.
The US Treasury also needs to decide what corporate rights we should get for our money.
For example, Mr. Warren Buffett, everyone's favorite wealthy investor these days, would probably demand anti-dilution protections, limits on excessive dividends to other shareholders, and perhaps voting rights as well, if he were the investor. If taxpayers are investing and taking all this risk, why is Warren's money any more deserving of such rights than ours?
None of these issues are insurmountable. Furthermore, purchasing equity in established, publicly-traded institutions will certainly be a whole lot easier than setting up brand new, complex "reverse auction" markets for previously untraded mortgage-backed securities, much less insurance on them.
In any case, as we'll examine in Part II of this article, given the incredibly shaky structure of the global banking system's balance sheet, especially in the US and Europe, at this point Hank Paulson's public equity investment plan is really the US Treasury's only option for putting our banking system back on its feet.
So viva Comrade Hank! Y Viva la Revolucion!
But investors, workers, home owners, students, beware: it still pays to be conservative here, despite Hank's Revolution.
Because even if the government invests heavily in all these banks, no one is quite sure what all those $trillions of asset-backed securities on and off their balance sheets are worth.
We may not find out until the housing market touches bottom and there are comprehensive audits of major financial institutions and their hedge fund buddies.
So keep at least some of your powder dry and hang on to your hats -- the ride will continue next week.
(c) SubmergingMarkets, 2008
October 11, 2008 at 04:25 AM
Wednesday, September 24, 2008
BUSH SPEAKS: "HOW DID WE EVER GET IN THIS MESS? WHERE HAVE I BEEN?" James S. Henry
Last night on national television, Comrade Bush presented his own miniature 14-minute "Cliff Notes" version of the roots of the current US financial crisis, and a heart-rending appeal for the most generous act to date of his Administration, the $700 billion blank-check Wall Street bailout.
By now the man has established a bit of a pattern -- customarily trying to scare us all into granting him unlimited powers, while arguing that there is simply no alternative to whatever bitter pill he happens to be pushing at the moment.
You are of course free to believe him if you like. Hundreds do.
In fact, as we argued yesterday, there are all sorts of improvements to be made to the proposed Whale of a Bailout package.
These include things like, at a minimum, (1) equity investment and warrants for taxpayers, to provide some upside returns in proportion to the risks we are taking on any purchases of bank assets; (2) stronger oversight; (3) more assistance for the millions of Americans who are experiencing home foreclosures; (4) compensation ceilings, clawbacks, and stiff progressive tax rates on incomes over $1 million and estates over $10 million, to offset the cost of all this; (5) a Financial Products Safety Commission; (6) a new Treasury-backed competitive insurance market for mortgage securities, available to banks and homeowners; (7) rather than buying up "toxic" bank securities now, an expanded FDIC reserve fund -- since all the "I-banks" are commercial banks now, anyway; and (8) a new installment of the 1932 Pecora Commission, complete with subpoena power, to investigate the origins of the crisis and hold people accountable.
(For more details, here is the testimony that Dr. Brent Blackwelder, Friends of the Earth, and I submitted to Rep. Frank's House Financial Services Committee yesterday: BAILOUT.pdf)
Even more important, the President's central claim that there is no alternative to this bitter pill is a triple whopper with cheese.
As the IMF -- not our favorite institution, but it does know a thing or two about recapitalizing broken banking sectors -- has suggested just this week, long-term "swaps" of mortgage-backed securities for government bonds could be used to clean up the banks' balance sheets while completely sparing taxpayers the risks of a huge loss on the $billions of toxic assets we'll soon be owning.
There are also numerous other approaches to broken-banking sector restructuring that have been employed by governments all over the planet in more than 124 banking sector crises since 1970 -- for example, in Chile, Korea, Germany, Mexico, and Japan.
Doesn't anyone else find it odd that none of this expertise is being put to use?
Or that, with losses on complex derivative and structured securities at the core of this debacle, and thousands of "quants" from MIT and Wharton on Wall Street, we cannot design some simple security vehicles to help taxpayers reduce their personal or collective exposure to its potential costs?
Perhaps Bush & Co. are not familiar with the IMF or the World Bank/ IFC's "Capital Markets Group." Perhaps Secretary Paulson never bothered to understand the first thing about derivatives and options during his 32 years at Goldman Sachs.
For their information, the World Bank/ IMF/ IFC are located at 700 19th St. NW, Washington D.C., three blocks from the White House.
Their staffs are not especially busy at the moment -- indeed, 15 percent of the IMF's professionals are being laid off, so they may have some time to help out.
We've been assured by the Bush Administration, however, that the IMF's assistance is not really needed at this point.
"What are we," Bush asks, "Some sort of two-bit corrupt, debt-ridden plutocracy that can't manage its own affairs?"
Indeed, the President, Secretary Paulson, and a weird new assortment of bottom-feeding Wall Street investors (Omaha's Warren Buffett, Tokyo's Nomura Holdings and Mitsubishi UFJ), and, of course, those who are still left on Wall Street itself are in a white heat to get this deal done, and are trying to create a stampede.
They are also clearly not interested in improving the bailout. They just want our money -- and a loosey-goosey, behind-closed-doors process for distributing it that hasn't even been designed yet.
Unlike US taxpayers, Buffett, who is reportedly investing $5 billion in Goldman Sachs, was savvy enough to get preferred shares and warrants for his money -- worth up to 8 percent of the premier bank's shares.
At that rate, just think how much Wall Street real estate our $700 billion would buy -- if only Paulson and Bernanke and the US Congress would follow in Buffett's footsteps and insist on some equity and warrants in exchange for a bailout.
Meanwhile, Bush has the temerity to intermeddle (once again) with the orderly conduct of a US Presidential election, by inviting the two leading Presidential candidates to the White House just to help him close the deal with Congress. (Ralph Nader and Bob Barr are reportedly already camped out in the White House basement.)
As if the candidates don't have anything else to do, like debate each other, 41 days before the election.
As if Secretary Paulson and Chairman Bernanke were not already the world's consummate sales team!
Of course both Obama and McCain will accept the President's hospitality -- they have no choice.
So both have now been roped into making this deal happen.
Alas, it probably will -- minus almost all of the possible improvements noted above. The largely symbolic CEO comp limit is probably the only exception.
To those to whom much has been given, even more will be given.
We do have one consolation, however, as we prepare to pay the check for this lousy meal. We've located a different version of the history of this crisis that is more accurate -- and more entertaining -- in this must-see video:
September 24, 2008 at 05:40 PM
Tuesday, September 23, 2008
SO, FORREST, WHAT DO WE DO NOW? Ten Steps to Fix the Paulson Plan and Solve the US Debt Crisis JS Henry and Brent Blackwelder
The US Congress is busy working hard on US Treasury Secretary Henry M. Paulson Jr.'s $700 billion TARP bailout plan -- at least everyone except Alabama's Rep. Spencer Bachus, the ranking Republican on the House Financial Services Committee, who has spent much of the day explaining why a senior official in his position has the time, much less the ethical license, to be making scores of options trades during office hours.
While we have every confidence that Rep. Bachus and his peers will provide masterful oversight of the Secretary's proposal, we appreciate that with less than six weeks left before the November election, and Congress set to adjourn on Sept. 29, they may have more important things to worry about than the greatest US financial crisis since the Great Depression.
So it is time to help them out. Given the widespread dissatisfaction -- indeed, revulsion -- at Paulson's initial request for a $700 billion blank check -- on top of the other $500 - $700 billion that the Treasury/ FDIC and the Federal Reserve have already committed to Fannie/ Freddie, AIG, Bear Stearns, and other banks this year -- it is clear that revisions are needed. But time is short -- not just because of election imperatives, but because global financial markets are on pins and needles, waiting for a clear solution.
Any time there is this kind of sea-changing economic event, it tends to surface every interest group's Christmas wish list of long-delayed "essential reforms."
In this situation, indeed, the crisis has brought forth everything from proposals for "nationalizing the banks" and new regulatory agencies to "clawbacks" in executive severance plans and income tax reform. There are also a substantial number of people who are concerned about the implications of the initial Paulson proposal for constitutional democracy -- some have called it nothing less than an "economic coup d'etat" by "Comandante Paulson," because of all the unreviewable authority it would have vested in the Secretary and his minions.
Given that Congress is moving at the speed of light, we need to "tier" these proposals according to their importance. There are also a few more innovative ones that deserve immediate attention. Here's our own "Top Ten Improvements" wish list.
WISH LIST
1. Equity “Upside” and Voting Power.
In return for the undeniable new risks that US taxpayers are taking on, and the poor management track record of leading Wall Street institutions, it is reasonable to insist that they receive an “upside” on the value of participating financial institutions (FIs) themselves as well as on the potential increased value of acquired mortgage-backed assets. This proposal commands widespread support in this panel.
Technically, this could be accomplished by demanding preferred shares (with anti-dilution provisions) from any financial institutions (FIs) that receive assistance, as was routinely done by Bank of Japan in exchange for financial assistance during the Japanese bank restructuring of the 1990s, and by the Chilean government during the February 1983 bank nationalization.
Warrants might also be used, as was done in the case of the 1979 $1.2 billion Treasury loan guarantee to Chrysler. (According to Sen. Bradley, the Federal Government eventually made money on those warrants.) We believe that while warrants are easier to implement, it is vital to insist on actual equity (including voting power). This will provide the Treasury with much more direct influence over management behavior, will be easier to value, and will also be easier to explain to the public than warrants. (A rough payoff sketch comparing the two appears after this list.)
2. Clawback Provisions for Executive Severance Pay.
The basic principle here is that for senior FI executives, there should be accountability for some time period even after they leave office – at a minimum, any future compensation or severance that they receive should be subject to stiff taxes or repossession in bankruptcy court. Insisting on compliance with this standard should be a condition for participation in the bailout.
3. Share the Pain.
A. Emergency Taxes.
Since this very costly bailout package may severely limit the ability of the Federal Government to afford vital programs like health insurance reform and alternative energy, it is important that we deal now with the substantial “tax justice” implications of the bailout.
One way to do this would be to start treating this as the national emergency that it really is, and help ordinary taxpayers pay for it by: (1) eliminating the carried-interest benefits for hedge fund managers; (2) cracking down on offshore havens – no FIs should be permitted to establish subs or place SPVs in them; (3) imposing at least a temporary increased income tax rate on all people with incomes above $1 million and on all estates above $10 million.
B. Compulsory Write-Down/ Debt Reduction of Residential Mortgages.
Given the failure of this summer’s relief packages for ordinary mortgage holders to have much impact, the fact that foreclosures are still increasing (to a record 100,000+ per month), and the fact that housing prices are still falling in a majority of key markets, this is another essential measure. The debt restructuring should be implemented quickly, affect large numbers of people, and be inversely proportional to mortgage size. It might also be means-tested.
Such a measure would not only provide equitable relief to millions of would-be homeowners; it would also help to kick-start a US economic recovery.
4. Financial Products Safety Commission.
This would review and certify the quality of all financial products offered to the general public. Products like zero-down payment mortgages would require special labeling, and might not qualify for government incentives like interest deductibility, access to the government insurance window, and so forth.
5. A New US Treasury-Created Market for MBS Insurance.
A novel idea suggested by our good friend Prof. Lawrence Kotlikoff of Boston University is that the US Treasury might be able to use current authority to offer ABX-like insurance at a fixed price per tranche to institutions that hold MBSs. According to Professors Kotlikoff and Mehrling, if such a government-backed insurance market were in place, backed by a significant reserve against losses, it might even obviate the need for the entire $700 billion, while creating a market-based workout alternative.
This could be combined with #1, if FIs were allowed to pay for the insurance with equity or warrants. This would also have the benefit of helping to recapitalize troubled FIs. (A back-of-the-envelope sketch of this pricing idea also appears after this list.)
6. New “Pecora Commission” (a la 1932): a congressional committee with subpoena power to investigate the root causes of this crisis and recommend further steps.
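Two quick illustrations of the mechanics above follow; both are our own sketches under stated assumptions, not formulas from the testimony.

First, item #1's distinction between warrants and actual equity, in payoff terms. The share counts, strike, and dividends below are hypothetical; the point is that warrants are worthless unless the share price ends above the strike, while actual shares retain value (and collect dividends) even in flat or down scenarios.

```python
# Hypothetical illustration of item #1: warrants vs. straight equity for taxpayers.
# All numbers are invented for the sketch; none come from the testimony.

def warrant_payoff(share_price: float, strike: float, n_warrants: float) -> float:
    """Warrants pay off only if the share price ends above the strike."""
    return max(share_price - strike, 0.0) * n_warrants

def equity_value(share_price: float, n_shares: float, dividends: float) -> float:
    """Actual shares are worth the market price, plus any dividends collected."""
    return share_price * n_shares + dividends

# Suppose the assistance is matched by either 500m warrants struck at $20, or
# 500m actual shares that also collect $0.5bn of dividends over the holding period.
for final_price in (10.0, 20.0, 35.0):
    w = warrant_payoff(final_price, strike=20.0, n_warrants=500e6)
    e = equity_value(final_price, n_shares=500e6, dividends=0.5e9)
    print(f"final share price ${final_price:>5.2f}: "
          f"warrants ${w/1e9:4.1f}bn vs. equity ${e/1e9:4.1f}bn")
```

Second, item #5's fixed-price insurance per MBS tranche, with the premium payable in equity. This is a back-of-the-envelope rendering of the idea, not Professor Kotlikoff's own model; the loss rate, reserve loading, and share price are assumptions.

```python
# Hypothetical illustration of item #5: Treasury-offered insurance per MBS tranche,
# with the premium paid in the insured bank's equity. All inputs are assumptions.

def premium_per_tranche(face_value: float, expected_loss_rate: float,
                        reserve_loading: float) -> float:
    """Premium = expected loss on the tranche, plus a loading that builds the reserve."""
    return face_value * expected_loss_rate * (1.0 + reserve_loading)

face = 1_000_000_000          # a $1bn tranche of mortgage-backed securities
prem = premium_per_tranche(face, expected_loss_rate=0.15, reserve_loading=0.30)

share_price = 20.0            # hypothetical share price of the insured bank
shares_to_treasury = prem / share_price
print(f"premium on a $1bn tranche: ${prem/1e6:.0f}m; "
      f"paid in equity, that is {shares_to_treasury/1e6:.2f}m shares to the Treasury")
```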
September 23, 2008 at 04:42 PM
Saturday, September 20, 2008
SOCIALISM FOR BANKERS, SAVAGE CAPITALISM FOR EVERYONE ELSE? Bailout Jeopardizes the Entire Progressive Agenda James S. Henry
"“There’s class warfare, all right, but it’s my class, the rich class, that’s making war, and we’re winning.” -- Warren Buffet, June 2008
Ladies and gentlemen: pardon my intemperance, but it is high time for some moral outrage -- and a little good old-fashioned class warfare as well, in the sense of a return to seriously-progressive taxation and equity returns for public financing.
After all, as this week's proposed record-setting Wall Street bailout with taxpayer money demonstrates once again, those in charge of running this country have no problem whatsoever waging "class warfare" against the rest of us -- the middle classes, workers and the poor -- whenever it suits their interests.
At a time when millions of Americans are facing bankruptcy and the risk of losing their homes without any help whatsoever from Washington DC, the CEOs and speculators who created this mess, and the top 1 percent of households that owns at least 34 percent of financial stocks, and the top 10 percent that owns 85 percent of them, have teamed up with their "bipartisan" cronies in Congress, the US Treasury and the White House to stick us with the bill, plus all of the risk, plus none of the upside.
Upon close inspection, the Treasury's proposal is nothing more than a bum's rush for unlimited power over hundreds of $billions, to be distributed at Secretary Paulson's discretion behind closed doors and without adequate Congressional oversight.
This time they have gone too far.
As discussed below, the cost of this bailout could easily jeopardize our ability to pay for the entire economic reform program that millions of ordinary citizens across both major parties have been demanding.
Some kind of bailout may indeed be needed from the standpoint of managing the so-called "systemic risk" to our financial system.
However, as discussed below, the Paulson plan does not really tackle the real problem head on. This is the fact that many financial institutions, including hundreds of banks, are undercapitalized, and need more equity per dollar of debt, not just fewer bad assets.
To provide that, we may well want to mandate debt restructurings and debt swaps, or provide more equity capital.
If private markets can't deliver and we need to inject public capital into financial services companies on a temporary basis, so be it. But it should only be in return for equity returns that compensate the public for the huge risks that it is taking.
Call that "socialism" if you wish -- I think we are already well beyond that point -- sort of like the Chilean economists did in 1983, when the entire private banking sector collapsed and was nationalized -- successfully -- by the heretofore free-market "Los Chicago Boys."
To me, public equity investment, in combination with increased progressive taxation, should be viewed as just one possible way to get these companies the equity they need, while providing fair compensation to the suppliers of capital and participation in any "upside," if there is one.
Absent such measures, progressives certainly have much less reason to support this plan. After all, the increased public debt burdens that it would impose are so large that they could easily jeopardize our ability to pay for the entire economic reform program that millions of ordinary citizens (across both major parties) have been demanding.
From this angle, the Paulson program, in effect, is a cleverly-designed program to "nationalize" hundreds of billions of risky, lousy assets of private financial institutions, without acquiring any public stake in the private institutions themselves, and without raising any tax revenue from the class of people who not only created this mess, but would now like to be bailed out.
Any mega-bailout should come at a high price for those who made it necessary.
In particular, we must make sure that the butcher's bill is paid by the tiny elite that was responsible for creating this mess in the first place.
This is not about retribution. It is about insuring taxpayers are truly rewarded for the risks that they are taking -- isn't that the capitalist way? And it is also about making sure that this kind of thing never happens again.
After all, the real tragedy of this bailout is its opportunity cost. Consider a well-managed $1 trillion "matching" investment in strategic growth sectors like energy and health....If we really wanted to insure our competitive health, we would not be investing $1 trillion in lousy bank portfolios generated by the chicanery-prone financial services sector.
CAPITALISTS AT THE TROUGH
In financial terms, this latest Wall Street bailout is likely to cost US taxpayers at least $100-$150 billion per year of new debt service costs -- just for starters.
This estimate is consistent with the $700 billion ("at any point in time") that President Bush and Treasury Secretary Hank Paulson are requesting from Congress this week to fund their virtually-unfettered ("unreviewable by any court") new "Troubled Asset Relief Program" (TARP).
The sheer scale of Paulson's proposal implies that federal authorities plan to acquire at least $3 trillion of mortgage-backed securities, derivatives, and other distressed assets from private firms -- on top of Fannie/ Freddie Mac's $5.3 trillion mortgage securities portfolio. How the Fed and the Treasury actually propose to determine the fair market value of all these untrade-able assets is anyone's guess. But since 40 percent derive from the exuberant, fraud-prone days of 2006-7, they will probably all be subject to steep (60-90 percent) discounts from book value.
That's consistent with the 78 percent "haircut" that Merrill Lynch took on the value of its entire mortgage-backed securities portfolio earlier this month -- actually, more like a 94.6% haircut, the portion that it received in cash.
This implies, by the way, that if the Federal Government were required to "mark to market" its $29 billion March 2008 investment in Bear Stearns' securities, it would now have a cash value of just $1.6 billion. Not a very hopeful sign from a taxpayer's standpoint.
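For what it's worth, the arithmetic behind that $1.6 billion figure is easy to reproduce: apply the 94.6 percent cash haircut cited above to the $29 billion Bear Stearns facility.

```python
# Reproducing the back-of-the-envelope above: the Merrill Lynch cash haircut
# applied to the Federal Government's $29bn Bear Stearns facility.
bear_stearns_bn = 29.0
cash_haircut = 0.946                       # the 94.6% cash haircut cited above
marked_value_bn = bear_stearns_bn * (1.0 - cash_haircut)
print(f"marked-to-market cash value: ${marked_value_bn:.2f} billion")   # roughly $1.6bn
```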
Paulson's latest proposal dictates another sharp increase in the federal debt limit, to $11.313 trillion. This limit stood at just $5.8 trillion when Bush took office in 2001. By October 2007 it stood at $9.8 trillion. Then it jumped again to $10.6 trillion in July 2008, during the Fannie/Freddie meltdown. As of March 2008, the actual amount of Federal debt outstanding was $9.82 trillion, just six months behind the limit and gaining.
All this new TARP debt will be on top of $200 billion of new debt that was issued to buy Fannie/Freddie's preferred stock, plus the assumed risk for their $1.7 trillion of debt and $3.1 trillion of agency mortgage-backed securities.
It is also in addition to the $85 billion 2-year credit line that the Federal Reserve just extended to AIG; the $29 billion "non-recourse" loan provided for the Bear Stearns deal noted above; $63 billion of similar Federal Reserve lending to banks this year; $180 billion of newly-available Federal Reserve "reciprocal currency swap lines"; $5 billion of other emergency Treasury buybacks of mortgage-backed securities; $12 billion of Treasury-funded FDIC losses on commercial bank failures this year (including IndyMac's record failure in July); perhaps another $455 billion of Federal Reserve loans already collateralized by very risky bank assets; and the FDIC's request for up to $400 billion of Treasury-backed borrowings to handle the many new bank failures yet to come.
There is also the record $486+ billion budget deficit (net of $180 billion borrowed this year from the Social Security trust fund) that the Bush Administration has compiled for 2008/09, driven in part by the continued $12-$15 billion per month cost of the Iraq and Afghan Wars and the impact of the deepening recession on tax revenues. Longer term, there is also the projected $1.7 trillion to $2.7 trillion "long run" cost of those wars (through 2017).
All told, then, we're talking about borrowing at least another $1-1.4 trillion of federal debt to finance a record level of lousy banking.
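One plausible reading of that range, counting only the items above that require new Treasury borrowing outright, and treating the Federal Reserve facilities and FDIC contingencies as additional exposure rather than new debt issuance, goes like this (the composition of the $1-1.4 trillion figure is an assumption for this sketch):

```python
# A rough tally of the Treasury-funded items cited above, in $ billions. Counting
# only these as new borrowing proper is an assumption made for this sketch.
treasury_borrowing_bn = {
    "TARP authorization": 700,
    "Fannie/Freddie preferred stock purchases": 200,
    "FY2008/09 budget deficit": 486,
}
total_bn = sum(treasury_borrowing_bn.values())
print(f"new federal borrowing: roughly ${total_bn/1000:.1f} trillion")   # ~$1.4 trillion
```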
COMPARED TO WHAT?
By comparison, Detroit's latest request for a mere $25 billion bailout looks miserly. And if we were in Vienna, we would say, "We wish we could play it on the piano!"
Compared to other bailouts, this is by far the largest ever.
For example, the total amount of debt relief provided to all Third World countries by the World Bank/IMF, export credit agencies, and foreign governments from 1970 to 2006 totaled just $334 billion ($2008), about 8 percent of all the loans. (Henry, 2007).
The savings and loan bailout in the late 1980s cost just $170 billion ($2008).
And the FDIC's 1984 bailout of Continental Illinois, the largest bank failure up to this year, was (in $2008) just $8 billion (eventually reduced to $1.6 billion by asset recoveries).
Meanwhile, compared with other countries that are well on their way to building forward-looking "sovereign wealth funds" to make strategic investments all over the world, the US seems to be on a drive to create this introverted "sovereign toxic debt dump."
CASH COST
No one has a very precise idea of how much all this will cost, not only because many of the securities are complex and thinly traded, but also because their value depends to a great extent on the future of the US housing market. Housing prices have already fallen by 20-32 percent in the top 20 markets since mid-2006, and they continue to fall in 11 out of 20 major markets, especially Florida, southern California, and Arizona, where the roller-coaster has been steepest.
At current T-bond rates (2-4 percent for 2-10 year bonds, the most likely maturities), the near-term cash cost of this year's bailout is likely to be an extra $40 to $60 billion a year in interest payments alone.
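The interest arithmetic is simple enough to lay out: new borrowing times the 2-4 percent Treasury rates just cited. The grid below uses the $1-1.4 trillion borrowing range from the tally above; the $40 to $60 billion figure corresponds to the top of this grid, or to a somewhat larger borrowing base once the contingent commitments are counted.

```python
# Annual interest cost = new borrowing x Treasury rate, for the ranges cited above.
for debt_tn in (1.0, 1.4):                  # $1-1.4 trillion of new borrowing
    for rate in (0.02, 0.03, 0.04):         # 2-4% T-bond rates
        print(f"${debt_tn:.1f}tn at {rate:.0%}: "
              f"${debt_tn * rate * 1000:.0f} billion/year in interest")
```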
Furthermore, since the borrowed funds will be invested in high-risk assets, the most important potential costs involve capital risk. There's a good chance that, as in the case of Bear Stearns, we'll ultimately get much less than $.50 for each $1 borrowed and invested. For example, Fannie and Freddie alone could easily be sitting on $500 billion of losses (roughly $2 trillion of their $5.3 trillion portfolio at risk, times a 50% default rate, times a 50% loss after asset recoveries).
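Spelling out that parenthetical under its stated assumptions:

```python
# The $500 billion figure above, under its stated assumptions: $2tn of the $5.3tn
# Fannie/Freddie book at risk, a 50% default rate, and a 50% loss after recoveries.
at_risk_tn = 2.0
default_rate = 0.50
loss_after_recovery = 0.50
expected_loss_bn = at_risk_tn * default_rate * loss_after_recovery * 1000
print(f"expected loss: ${expected_loss_bn:.0f} billion")
```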
This could easily make the long-run cost of this bailout to taxpayers at least $150 billion a year.
No wonder traders on the floor of the New York Stock Exchange reportedly broke out singing "the Internationale" when they heard about the bailout.
But the direct financial costs of the bailout are only the beginning....
HIJACKING THE FUTURE
Last week's events produced terabytes of erudite discussion by an army of Wall Street journalists, prophets and pundits about short-selling rules, "covered bonds," and the structure of the financial services industry.
This is absolutely par for the course, as far as modern financial crisis journalism is concerned -- the "story" is always told mainly from the standpoint of what's in it for the industry, the banks, the regulators, and the investors.
For the 90 percent of Americans who own no money-market funds, and less than 15 percent of all stocks and bonds, however, this bailout means just one thing.
All of the money has just been spent. And it has not been spent on you.
For example, unless we demand an increase in taxes on the rich, big banks, and big corporations, as well as some public equity in exchange for the use of all this money, we can expect that the long-term costs of this bailout will "crowd out" almost all of the $140 to $160 billion of new federal programs that Barack Obama proposed. It will certainly make it impossible for Obama to finance his programs without either borrowing even more heavily, or going well beyond the tax increases (on oil companies and the upper middle classes) that he has proposed.
Without such changes, there will be no federal money available for comprehensive health insurance, or the reform of the health care delivery system.
There will be no additional funding for pre-school education, child care, or college tuition.
There will be no additional funding for investments in energy conservation, wind, or solar power.
There will be no additional investments in national infrastructure (e.g., the reconstruction of our aging roads, highways, and bridges to "somewhere.")
Highway privatization and toll roads, here we come.
There will be no money to bail out the millions of Americans who are on the brink of losing their homes.
The supply of housing loans and other credit will remain tight, despite the bailout.
Indeed, if the economic elite has its way, the long-sought dream of "a home for every middle-class American family" may be abandoned as a goal of government policy.
Meanwhile, the government-sponsored consolidation of the financial services industry will make financial services more profitable than ever.
This is good news for the "owners of the means of finance." For the rest of us, it means steeper fees and rates. And if we fail to keep up with the new charges, we'll face the rough justice delivered by the latest bankruptcy "reform," which was rammed through the Congress in 2005 with support from many top Democrats.
There will be no money to shore up the long-run drain on Social Security or Medicare.
Indeed, ironically enough, this latest bank bailout may even increase the financial pressure to privatize these comparatively successful government programs.
There will be no extra money to house our thousands of new homeless people, relieve poverty, rebuild New Orleans, or support immigration reform.
There will be no additional funds for national parks.
Indeed, we might as well start by privatizing our national and state parks, and drilling for oil and gas in the Arctic National Wildlife Refuge, Yosemite, the Grand Canyon, and right off the Santa Barbara coast. We're going to need those federal lease royalties. (Perhaps the oil barons will lend us an advance.)
There will be no funds available for increased homeland security.
There will certainly be no "middle-class" tax cut. Absent a progressive tax reform, the only "cut" the middle class is going to receive is another sharp reduction in living standards.
TRUMPING REAGAN
All told, the Bush/Paulson "permissive banking/ massive bailout" model beats even the old 1980s vintage Reagan formula, which tried to force government down-sizing with huge tax cuts.
Contrary to the sales pitch, those cuts never produced any incremental tax revenues, let alone any significant down-sizing. It has simply proved too easy for the federal government to borrow. And "conservatives" can always find wars, farm subsidies, defense contractors, and "bridges to nowhere" to spend the money on, just as fast as liberals.
Lately, however, it appears that US debt levels may indeed be reaching the point where they could impose a limit on increased spending. Given the sheer size of the new federal debt obligations, foreign creditors, who have recently been supplying more than half of new Federal borrowing, have been muttering about taking their lending elsewhere. And outside the financial services industry, Main Street companies are concerned about being "crowded out" by record federal borrowing.
THE ALTERNATIVE -- THE "GET REAL" NEW DEAL
To make sure that real economic reform is still feasible, we need to demand a "Get Real/ New Deal" from Congress right now.
At a minimum, this Get Real/New Deal package should consider measures like:
(1) The restoration of stiff progressive income and estate taxes on the top 1 percent of the population (with net incomes over $500,000 a year and estates over $5 million) -- especially on excessive CEO and hedge fund manager compensation;
(2) Much more aggressive enforcement and tougher penalties against big-ticket corporate and individual tax dodgers;
(3) Tougher regulation of financial institutions -- possibly by a new agency that, unlike the US Federal Reserve, the SEC, and the US Treasury, is not "captive" to the industry;
(4) A crackdown on the offshore havens that have been used by leading banks, corporations, and hedge funds to circumvent our securities and tax laws;
(5) The immediate revision of the punitive bankruptcy law that Congress enacted in 2005 at the behest of this now-bankrupt elite; and
(6) While we are at it, stiff "pro-green" luxury taxes on mega-mansions, private jets, Land Rovers, yachts, and all other energy-inefficient upscale toys.
We also need (7) a National Commission to investigate the root causes of this financial crisis from top to bottom, and actually (unlike the hapless, ineffectual 9/11 Commission) hold people accountable.
Finally, if the public is going to provide so much of the risk capital for this restructuring, we should demand (8) public equity in the private financial institutions that receive so much of our help.
This will permit taxpayers to share in the upside of this restructuring, rather than just the downside risks.
Along the way, this will require that we explain to Secretary Paulson that this country is not Goldman Sachs. Even after 8 years of President Bush, this is still a democracy.
Secretary Paulson is not going to be given unfettered discretion to hand out closet "liquidity injections" to his buddies on the street -- no matter how worthy they are.
Over time, this progressive Real/ New Deal would help raise the hundreds of billions in new tax revenue needed to offset the costs of this bailout.
This will be essential, if the Federal Government is to be able to afford key reforms like health insurance, clean energy, and investments in education.
These may not matter very much to Wall Street executives, financial analysts, Treasury and Federal Reserve executives, or the more than 120-130 Members of Congress and 40-45 US Senators who earn more than $1 million a year -- and are already covered by a generous "national health care" package of their own design.
But these are the key "systemic risks" that ordinary Americans face.
These reforms may sound ambitious. So is the bailout. And the reforms that we are discussing are only fair.
After all, we the American people have recently been the very model of forgiveness and understanding.
We have tolerated and footed the bill for stolen elections, highly-preventable terrorist attacks, gross mismanagement of "natural" disasters, prolonged, poorly conceived, costly wars, rampant high-level corruption, pervasive violations of the US Constitution, and the systematic looting of the Treasury by politically-connected defense contractors, oil companies, oligopolistic cable TV and telecommunications firms, hedge fund operators, big-ticket tax evaders, and our top classes in general.
Does "class" still matter in America? You betcha -- perhaps more than ever. But enough is enough. Call your Congressperson now. Demand a "Get Real/New Deal" qualifier to the bailout package before it is too late. We deserve to get much more for our money. So do our kids.
(c) SubmergingMarkets, 2008
September 20, 2008 at 04:41 PM
Tuesday, July 01, 2008
THE EDUCATION OF DR. PHIL GRAMM UBS Role Raises Basic Questions About McCain's Key Economic Adviser James S. Henry
-- Peter Wuffli, ex-UBS CEO
John McCain has long since admitted that he has a great deal to learn when it comes to economics. But it turns out that his own chief economic advisor, former US Senator Dr. Phil Gramm, has also needed rather extensive retraining lately. Unfortunately this has been acquired mainly at the expense of millions of US home buyers, honest taxpayers, former Enron employees, and would-be enforcers of our (bank-driven, loophole-ridden) anti-money laundering laws.
GRAMM CRACKERS
Gramm, a somewhat goofy-looking, deceptively slow-talking business economist from Georgia, spent 12 years teaching economics at Texas A&M before getting elected to Congress as a conservative Democrat in 1978. By 1982 he'd switched sides, joining the Reagan Revolution to become one of the Republican Party's most outspoken champions of deregulation, tax cuts, and spending controls -- so long as this didn't affect his pet interest groups.
In the next two decades, Dr. Gramm was perhaps the Senate's leading proponent of financial services deregulation and of weakened restrictions on commodity trading, credit cards, consumer banking, and predatory lending practices, in addition to leading the fight against Hillary Clinton's health insurance reforms. As chairman of the Senate Banking Committee from 1996 to 2000, he was a key author of legislation that eliminated most of the legal barriers between US banks, brokerages, investment banks, and insurance companies that had been in place since the 1930s.
Phil was also a determined opponent of tougher IRS tax enforcement, and a principal author of a 2000 law that exempted companies like Enron from regulation for online energy trading activities. Of course this made sound economic sense. After all, Phil's wife Wendy was a member of Enron's board, and Enron was Phil's largest corporate contributor in the 1990s.
In 2000-2002, both before and after 9/11, Phil also became the key opponent of tougher anti-money laundering regulations, and -- not coincidentally-- one of the largest recipients of contributions from the powerful financial services lobby. Among independent journalists, all this helped to make him known by a variety of sobriquets, including "Foreclosure Phil," "Slick Philly," and "The Personal Representative of the Bank of Antigua."
U-BS-er
This track record stood Dr. Gramm in good stead when it came time to seek new employment in 2003, after the Republicans lost control of the Senate. Naturally enough, he gravitated toward his friends in the global private banking industry, whose noble calling it is to gather the assets of the world's wealthiest people and protect and conceal them from taxes, regulation, and expropriation, not to mention embittered family members, ex-lovers and business partners, and each other.
Since 2002, Dr. Gramm has served as Vice Chairman of UBS Investment Bank, which is owned by UBS AG, the largest Swiss bank, the world's 16th largest commercial bank, and the world's largest private asset manager, with more than 80,000 employees and offices in 50 countries.
Even after joining McCain's campaign during the summer of 2007, Dr. Gramm continued to serve as a registered Washington lobbyist for UBS from 2004 until April 2008, lobbying Congress to maintain weak restrictions on sub-prime lending and predatory lending.
BAD TIMING
In hindsight, Dr. Gramm's recent crusade for even more financial freedom turned out to be ill-timed, for several reasons.
First, this was hardly the moment for even more financial deregulation than the US had already digested in the 1990s. After 2002, on Dr. Gramm's watch, UBS became one of the world's most aggressive banks, helping to foment and finance the sub-prime lending crisis that has already cost nearly three million Americans their homes, generated more than $250 billion in bank losses, and driven a $7.7 trillion hole in global equity markets.
Since November 2007 UBS alone has written off $37 billion in mortgage-related assets, the largest write-off for any bank. In July 2007, UBS's McKinsey-trained CEO, Peter Wuffli, was forced to resign, and in April 2008 its $24 million-per-year Chairman, Marcel Ospel, was given the boot. Since then its stock price has plummeted more than 70 percent, to its lowest level since 2002.
Meanwhile, the bank also revealed itself to be curiously insensitive to US financial regulations. For example, in May 2004, it was fined $100 million by the US Federal Reserve for violating an embargo on funds transfers to countries like Iran and Cuba.
Finally, it now turns out that Dr. Gramm's colleagues at the bank have also been up to their eyeballs in yet another dubious business: helping up to 20,000 wealthy American tax cheats hide their wealth offshore and commit outright tax fraud, cheating the IRS out of tens of $billions in tax revenue.
SWISS CHEESE
Late last month, Bradley Birkenfeld, a senior private banker who'd worked with UBS from 2001 until 2006 out of Switzerland, and then continued to service their clients out of Miami, pleaded guilty to helping dozens of his wealthy American clients launder their money. His name had originally surfaced when a Southern California billionaire property developer, Igor M. Olenicoff, had been discovered by the IRS to be paying much less income tax than his status on the Forbes 400 list warranted.
With the help of Birkenfeld and other UBS private bankers, Olenicoff, who'd first established offshore accounts as early as 1992, succeeded in parking at least several hundred million dollars of unreported assets offshore. (Download bankers-indicment-in-florida.pdf)
Ultimately Olenicoff settled with the IRS for $52 million in back taxes, one of the largest tax evasion cases in Southern California history. He also agreed to repatriate $346 million that he had parked in Switzerland and Liechtenstein.
In theory he also faced up to 3 years of jail time, but in practice -- following the standard US practice of going easy on big-ticket tax evaders with no priors -- his maximum exposure was just six months under standard US sentencing guidelines. Indeed, ultimately Olenicoff only got two years probation and 3 weeks of "community service."
One also gets the sense that this case was a bit like the cat pulling on the sweater yarn. According to Forbes, Olenicoff reported that many of his other foreign accounts were controlled by Sovereign Bancorp Ltd., a Bahamian company that he claimed had been set up by former Russian Premier Boris Yeltsin.
In any case, in the process of making up for lost time with the IRS, Olenicoff also gave up his UBS private bankers, including Birkenfeld. According to Birkenfeld, he was just one of more than 50 UBS private bankers who visited the US out of Switzerland each quarter.
This case, the first US prosecution of a foreign private banker ever, signals that even the Bush Administration has become fed up with the estimated $100 billion per year in lost tax revenues that such practices are costing, and has decided to make an example of Dr. Gramm's employers.
UBS' sin was that it took "you be us" a step too far. Like other major global banks, UBS AG had signed a "qualified intermediary" agreement with the US Treasury in 200(x), giving its corporate word that it would either insure that its clients were not US citizens, or withhold appropriate taxes. But when UBS AG's American clients refused to go along with such arrangements, UBS just caved in and lied to the US Government.
As a result, despite his cooperation, Birkenfeld, the former UBS private banker, is likely to get serious jail time this August. Meanwhile, the DOJ has just issued a "John Doe" summons to UBS AG, requiring it to turn over the identities on its entire list of wealthy American clients. The head of UBS AG's Global Private Banking business unit has been arrested and detained in the US on "material witness" charges, pending resolution of this dispute. The private banker's wealthy clients are experiencing the tender mercies of the IRS's tax fraud department as we speak -- not only from this US case, but also from the recent scandal involving Liechtenstein's largest bank, where many UBS clients were also channeled. UBS's shareholders all over the globe must be quaking in their boots, fearing the bank could be subject to massive fines or even a corporate indictment that would prevent it from doing business in the US ever again.
QUESTIONS FOR DR. PHIL
The questions for Dr. Gramm arising out of these scandals are many.
- First, was Dr. Gramm completely unaware that UBS AG had organized this massive illicit global campaign to elicit capital flight from the US and other "honest-tax" jurisdictions, conceal it in low-tax havens like Liechtenstein, and completely shelter it from the taxes that ordinary taxpayers have little choice but to pay?
- Second, are any of these 20,000 wealthy tax cheats from Texas? Does Dr. Phil know any of them personally?
- Third, what kind of changes, if any, in laws pertaining to "qualified intermediaries," offshore havens, private banking, and international tax havens does Dr. Gramm believe are necessary? Would he, for example, support the reform bill on foreign havens and "qualified intermediary" rules that Senators Levin and Obama have co-authored? Precisely when will John McCain sign up to endorse that legislation?
- Fourth, what else has Dr. Phil learned from all these cases? Has he changed any of his views on the morality of tax dodging, money laundering, and predatory lending? Is all this just a matter of "sauve qui peut" -- of whatever we can all get away with, especially the rich? Does John McCain agree with him on such matters? What then remains, alas, of "patriotism" and "national sacrifice," two of McCain's favorite leitmotifs?
- Finally, given that John McCain really does need sound advice on economic issues like the mortgage crisis, taxation, and money laundering from a "qualified intermediary" of his own, does all this experience really qualify Dr. Phil Gramm to fill the bill?
(c) SubmergingMarkets 2008
July 1, 2008 at 08:41 PM
Friday, December 15, 2006
Blood Diamonds Part 1: The Empire Strikes Back! by James S. Henry
"...(O)ne of the great dramas of Africa: extremely rich areas are reduced to theaters of misery...."
-- Rafael Marques, Angolan journalist (July 2006)
"For each $9 of rough diamonds sold abroad, our customers, after cutting them, collect something like $56..."
-- Sandra Vasconcelos, Endiama (2005)
"We found the Kalahari clean. For years and years the Bushman have lived off the land....thousands of years...We did not buy the Kalahari. God gave it to us. He did not loan it to us. He gave it to us. Forever. I do not speak in anger, because I am not angry. But I want the freedom that we once had."
-- Bushman, Last Voice of an Ancient Tongue, Ulwazi Radio, 1997
The global diamond industry, led by giants like De Beers, RTZ, BHP Billiton, and Alrosa Co Ltd., Russia's state-owned diamond company, a handful of aggressive independents like Israel's Lev Leviev, Beny Steinmetz's BSG Group, and Daniel Gertler's DGI, a hundred other key "diamantaires" in New York, Ramat-Gan, Antwerp, Dubai, Mumbai, and Hong Kong, and leading "diamond industry banks" like ABN-AMRO, is not exactly renowned for its abiding concern about the welfare of the millions of diamond miners, cutters, polishers, and their families who live in developing countries.
But the industry -- whose top five corporate members still control more than 80 percent of the 160 million carats that are produced and sold each year into the $70 billion world-wide retail diamond jewelry market -- certainly does have an undeniable long-standing concern for its own product's image.
PERENNIAL FEARS
Indeed, for decades, observers of the diamond industry have warned that it was teetering on the brink of a price collapse, because the industry's prosperity has been based on a combination of artificial demand and equally-artificial -- but often more unstable -- control over supply.
Most of the doomsayers have always predicted that the inevitable downfall, when it came, would arrive from the supply side, in the form of some major new diamond find that produced a flood of raw diamonds onto the global market.
The precise culprits, in turn, were expected to be artificial diamonds (in the 1960s and 1970s), "an avalanche of Australian diamonds" (in the 1980s), and Russian diamonds (in the 1990s).
This supply-side pessimism has lately been muted, given the failure of the earlier predictions and the fact that raw diamond prices -- though not, buyers beware, retail diamond resale prices!! -- have recently increased at a hefty 10-12 percent per year. There is also some evidence that really big "kimberlite mines" are becoming harder and harder to find.
However, there are still an awful lot of raw diamonds out there waiting to be found, and one does still hear warnings about the long-overpredicted Malthusian glut, now from new sources like deep mines in Angola, Namibia's offshore fields, Gabon, Zambia, and the Canadian Northwest.
THE REAL THREAT?
Meanwhile, the other key threat to the industry's artificial price structure -- where retail prices are at least 7 to 10 times the cost of raw diamonds -- comes from the demand side. This is the concern that diamonds may lose the patina of glamour, rarity and respectability that the industry has carefully cultivated since the 1940s.
It is therefore not surprising that the industry has been deeply disturbed by the December 8, 2006, release of Blood Diamond, a block-buster Hollywood film that stars Leonardo DiCaprio, Jennifer Connelly, and Djimon Hounsou.
While extraordinarily violent and a bit too long, the film is entertaining, mildly informative, and far from "foolish" -- the sniff that it received from one snide NYT reviewer, who clearly knew nothing about the subject matter, other than, perhaps, the fact that the Times' own Fortunoff- and Tiffany-laden ad department didn't care for the film.
Indeed, this film does provide the most critical big-screen view to date of the diamond industry's sordid global track record, not only in Africa, but also in Brazil, India, Russia, and, indeed, Canada and Australia, where diamonds have often been used to finance civil wars, corruption, and environmental degradation, and indigenous peoples have often been pushed aside to make room for the industry's priorities.
Surely the film is a small offset to decades of the diamond cartel's shameless exploitation of Hollywood films, leading ladies like Marilyn Monroe, Elizabeth Taylor, and Lauren Bacall, and scores of supermodels, rock stars, and impresarios.
INDUSTRY WHITE WASH
Dismayed at the potential negative impact of the film ever since it first learned about Blood Diamond in late 2005, the industry is reportedly spending at least an extra $15 million on a PR campaign that responds to the film -- in addition to the $200 million per year that the World Diamond Council already spends on regular marketing.
If you Google "blood diamonds," for example, you'll see that the industry has purchased top billing for its own version of the "facts" regarding this film. Always eager for a new marketing angle, some diamond merchants have also seized the opportunity to pitch their own product lines as "conflict diamond-free."
DEF JAM'S BLACK WASH
This shameless PR campaign has also included a "black wash" effort by the multimillionaire hip hop impresario Russell Simmons, who launched his own diamond jewelry line by way of the Simmons Jewelry Co. in 2004, in partnership with long-time New York diamond dealer M. Fabrikant & Sons.
Simmons, who admits to "making a lot of money by selling diamonds," rushed back to New York on December 6 from a whirlwind nine-day private jet tour of diamond mines in South Africa and Botswana -- but, admittedly, not in conflict-ridden Sierra Leone, Angola, the Congo, the Ivory Coast or Chad.
Simmons was originally scheduled to travel with one of his latest flames, the 27-year old Czech supermodel and Fortunoff promoter, Petra Nemcova. But Petra reportedly preferred to stay home and accept a huge diamond engagement ring of her own from British singer/soldier James Blunt, whose 2005 pop hit "You're Beautiful" was recently nominated the "fourth most annoying thing in Britain," next to cold-callers, queue-jumpers, and caravans.
The timing of Simmons' trip, which he filmed for YouTube, just happened to coincide with the December 8 release of the Warner Brothers feature.
Upon his return, Simmons held a press conference, accompanied by his estranged wife Kimora Lee Simmons and Dr. Benjamin F. Chavis Muhammad, a former civil rights activist and fellow investor in the jewelry company who is perhaps best remembered for being fired as NAACP Director in 1994 after settling a costly sexual harassment suit, and for joining the Rev. Louis Farrakhan's Nation of Islam. Simmons' astounding conclusion from his wonder-tour: "Bling isn't so bad."
Whatever the credibility of Simmons and his fellow instant experts, it was evidently not enough to save M. Fabrikant & Sons, which filed for Chapter 11 in November.
THE GODS MUST (STILL) BE CRAZY
Simmons managed to tour a few major diamond mines on his African safari, but apparently he lacked time to examine the contentious land dispute between the Kalahari San Bushmen, the members of one of Africa's oldest indigenous groups, and the Botswana Government -- with the diamond industry's influence lurking right offstage.
In the 1990s, after diamond deposits were reportedly discovered on the Bushmen's traditional lands, the Botswana Government -- which owns 15 percent of De Beers, is a 50-50 partner with De Beers in the Debswana diamond venture, the largest diamond producer in Africa, and derives half its revenue from diamond mining -- began pressuring the Bushmen to leave their tribal lands.
The methods used were not subtle. To force the Bushmen into resettlement camps outside the Reserve, the Botswana Government closed schools and clinics, cut off water supplies, and subjected members of the group to threats, beatings, and other forms of intimidation for hunting on their own land -- all of it ordained by F.G. Mogae, Botswana's President, who declared in February 2005 that he "could not allow the Bushmen to return to the Kalahari." Those who have been resettled have been living in destitution, with no jobs and little to do except drink. (See a recent BBC video on the subject.)
Thankfully, on December 13, 2006, Botswana's High Court ruled that in 2002, more than 1000 Bushmen had been illegally evicted by the Botswana Government from the Central Kalahari Game Reserve, where they'd lived for 30,000 years.
The Botswana Attorney General has already attempted to attach strict conditions to the ruling, so this struggle is far from over. But at least the first prolonged legal battle has been won -- thanks to the determination of the Bushmen, public-spirited lawyers like Gordon Bennett, their legal counsel, courageous crusaders like Professor Kenneth Good, and NGOs like Survival International, which has supported the legal battle.
In the wake of this decision, as usual, the global diamond industry, led by De Beers, has denied any responsibility whatsoever for the displacement of the Bushmen.
However, the fact is that De Beers and other companies have been prospecting actively in the Kalahari Reserve, especially around the Bushman community of Gope (see this video), where De Beers has falsely claimed that no Bushmen were living when it started mining. It has actively opposed recognizing the rights of indigenous peoples in Africa. In 2002, at the time of the eviction, Debswana's Managing Director -- appointed by De Beers -- commented that "the government was justified in removing the Basarwa (Bushmen)."
De Beers' behavior in Botswana has so outraged activists that they
have joined together with prominent actors like Julie Christie and
several Nemcova-like supermodels who used to appear in De Beers ads, in
an appeal for people to boycott the now-UK-based giant -- which has
lately been trying to move downstream into retail diamonds.
However, De Beers is far from alone in this effort. Indeed, as has often been the case with "conflict diamonds," less well-known foreign companies have been permitted to do much of the nastier pioneering.
In Botswana's case, these have included Vancouver-based Motapa Diamonds and Isle of Jersey-based Petra Diamonds Ltd., both of which have obtained licenses to explore and develop millions of acres, including CKGR lands. Petra is no stranger to "conflict diamonds": it is perhaps best known for a failed 2000 attempt to invest in a $1 billion diamond project in the war-torn DR Congo, in which Zimbabwe's corrupt dictator, Robert Mugabe, reportedly held a 40 percent interest.
In September 2005, Petra acquired the country's largest single prospecting license -- covering 30,000 square miles, nearly the size of Austria -- by purchasing Kalahari Diamonds Ltd., a company that was 20 percent owned by BHP Billiton and 10 percent by the World Bank/IFC -- which apparently saw the sponsorship of CKGR mining as somehow consistent with its own financial imperatives, if not its developmental mission (!!!). Petra has also licensed proprietary exploration technology from BHP Billiton and offered it development rights, in effect acting as a front-runner for the Australian giant.
Meanwhile, at least 29 of the 239 Bushmen who filed the lawsuit have perished while living in settlement camps, waiting for the case to be decided, and many others are impoverished.
Perhaps the diamond industry's $15 million might be better spent simply helping these Bushmen return to their homes -- and also settling up with the Nama people in South Africa, the Inuit and Cree peoples in Canada, and the Aborigines in Australia.
FAR CRY
Meanwhile, as we'll examine in Part II, despite the "Kimberley Process" that was adopted by many -- but not all -- key diamond producers in 2003, the fact is that diamonds continue to pour out of conflict zones like the Congo, Ghana, and the Ivory Coast, providing the revenues that finance continuing bloodshed.
The industry's vaunted estimate that they account for just "1
percent" of total production is based on thin air -- there are so many loopholes
in the current transnational supply chain that there is just no way of
knowing. Of course, given the scale of the global industry, and the poverty of the countries involved, even a tiny percent of the global market can make a huge difference on the ground.
Furthermore, in cases like Angola, the Kimberley Process has provided an excuse for corrupt governments to team up with private security firms and diamond traders to crack down on independent alluvial miners.
Finally, the diamond industry still has much work to do on other fronts -- pollution, deforestation, and, most important, the task of creating a fairer division of the spoils, in an industry where the overwhelming share of value-added is still captured by just a handful of First World countries.
The objective here is not to kill the golden goose. In principle, the diamond industry should be able to reduce world inequality and poverty, since almost all retail buyers are relatively affluent people in rich countries, while more than 80 percent of all retail diamonds come from poor countries.
Beyond eliminating traffic in "blood diamonds," however, we should also demand that this industry start to redress its even more fundamental misbehaviors.
***
(c) SubmergingMarkets, 2006
Thursday, November 30, 2006
ASSASSINATION POLITICS Learning the Lessons from Decades of "Conspiracies" James S. Henry
Conspiracy buffs of the world, rejoice! High-stakes political assassinations and the
inscrutable tales of intrigue that inevitably accompany them are back in the headlines!
In the last few months we've had new evidence surfacing about old cases like RFK and JFK that have been unresolved for decades.
We also have many exciting new cases emerging from places like London, Beirut, Moscow, and Gaza -- cases that promise to be unresolved for decades to come.
It certainly won't be possible to resolve all these cases here, though a few winks and
nods toward our favorite theories will be hard to resist.
However, there are some very important implications to be drawn from examining these political assassination cases side-by-side -- especially for
the bloodless abstractions put forth by the tiny, vocal group of unabashed neoimperialists at the Council on Foreign Relations, the Harvard Law School, the National Review, and the American Enterprise Institute who have been trying to rehabilitate assassination as an acceptable tool of US foreign policy.
In recent weeks
we've been treated to a flurry of assassination news, including the dramatic polonium-210
poisoning of former KGB agent and Putin critic Alexander Litvinenko in London; the gangland-style slayings of investigative journalist Anna Politkovskaya and Andrei Kozlov, Deputy Governor of the Russian Central Bank, in Moscow; the fatal ambush of the Lebanese Christian Falangist leader Pierre Gemayel in Beirut; and UN approval for an international tribunal to pursue another Lebanese case, the February 2005 slaying of Prime Minister Rafik Hariri.
While players like the Russian mafia and other "private enemies" cannot be completely ruled out in these cases, it is suspected that most of them were "political assassinations," in the sense that the perpetrators were sponsored by hostile states or key factions within them, which were motivated by the desire to eliminate politically influential enemies -- often across international borders.
In principle such political assassinations are to be distinguished from purely terrorist attacks, as well as from attempts to eliminate "military" leaders -- for example, the June 2006 US Predator attack on Abu Musab Al-Zarqawi, the July 2006 explosion that killed the Chechen rebel Shamil Basayev, and Israel's innumerable targeted assassinations in the West Bank and Gaza.
In practice these distinctions often break down, given the fact that assassinations also terrorize, and that leaders like Basayev, Sheik Yassin, and Al-Zarqawi have also played important political roles in insurgent organizations.
But part of the price of being an insurgent from a state-less organization, rather than a conventional politician, journalist, agent of the state, or crusading bishop (Romero), is that one's enemies find it much more legally and socially acceptable, as well as more useful, to kill quite openly, and to take credit for the achievement.
This kind of official credit-taking rarely occurs for the type of cases cited earlier. Even if the targets happen to be corrupt politicians or blood-stained former KGB agents, they are deemed to be more "respectable" than the typical insurgent; indeed, conspiring to eliminate them is usually against the law. So responsibility must be hidden -- in many cases, for decades.
CASE REOPENED?
This brings us to the other recent events that have brought this subject back to the surface. These include the 43rd anniversary of the (by now faintly observed) assassination of JFK on November 22, 1963, in Dallas, and the recent release of "Bobby," a feature film about events at the Ambassador Hotel in Los Angeles on RFK's last day, June 5, 1968.
They also include a striking news report that aired on BBC 2 on November 20, highlighting the new findings of filmmaker Shane O'Sullivan about the RFK assassination. According to O'Sullivan, a careful reexamination of photos taken of the crowd that fateful night at the Ambassador has disclosed the presence, in proximity, of at least three long-time CIA covert operatives who had already become notorious among JFK "assassination buffs" (we wear that label proudly) for other reasons. The men in question were not random associates -- they had all held senior positions in 1962-64 at JM/WAVE, the huge Miami CIA station that was heavily involved in anti-Castro plots and the recruitment of allies among Cuban exiles, US veterans, and the Mafia.
According to O'Sullivan, these were Gordon Campbell, the former Deputy Director of JM/WAVE; George Joannides, the former Director of Psychological Warfare at JM/WAVE; and most interesting of all, David Sanchez Morales, a senior assassinations and sabotage expert who also worked for the CIA in Venezuela, Uruguay, Laos, and Vietnam, and who reportedly developed a close relationship with Chicago mob boss John Roselli. Roselli's body ended up in an oil drum off the coast of Miami, a week before he was supposed to testify before the House Select Committee on Assassinations, which was reinvestigating the JFK case.
Friday, March 03, 2006
"IRAQ'S MOMENTOUS ELECTION, ONE YEAR LATER" Iraq War Supporters Are Running For Cover James S. Henry
For those who have not been paying attention in class, the so-called "Iraq War" has recently been setting new records for violence, brutality, and terror -- with at least 379, and by some counts as many as 1,300, Iraqi fatalities in the last week alone, in the wake of the bombing of the 1,062-year-old Al-Askariya shrine at Samarra.
Nor did the apprentice Iraqi Army -- with its 20,000-man force, trained by the US military at the phenomenal cost of $15 billion to date, or $750,000 per soldier -- prove to be much help in quelling the violence. This is not really surprising -- after all, this Army shares the same divided loyalties as the population at large.
While a few senior US military officers have issued Westmoreland-like statements assuring us that "the crisis has passed," and that this is not -- I repeat -- not a "civil war," it is hard to know what else to call it.
A few journalists have
speculated that, ironically enough, all the increased violence and polarization may
undermine the Pentagon's "hopes" to reduce the number of US troops
in Iraq to 100,000 by year end.
Those "hopes," however, are vague. One suspects that they have always been mainly for public consumption, including the morale of US troops. We only began to hear about them last fall when opposition to the war really soared in the US.
The Pentagon's not-so-secret hope -- among senior planners, at least -- is different. This is to turn Iraq into a neutered or even pro-US -- better yet for cosmetic purposes, "democratic" -- regime right in the heart of the Middle East, complete with permanent basing rights, immunity for US personnel from war crimes prosecution by the International Criminal Court, and, naturally enough, the occasional juicy construction, security, arms, and oil contract for friendly US and UK enterprises -- at least so long as they are not owned by Dubai.
ALL AGAINST ALL?
It is this vision that is most threatened by the recent surge in Iraqi violence. Clearly this is no longer just a "foreign terrorist/ dead-ender-led insurgency" against the US and its apprentice army.
Nor has the US-guided constitutional process, and continuous interventions by our heady Ambassadors in Baghdad -- safe behind the walls of the world's largest US embassy -- succeeded in stabilizing the country.
Rather, Iraq is now engaged in a complex, multi-sided bloodbath, fought along age-old religious, ethnic, and clan lines by well-armed groups. While American battle deaths continue, almost all the casualties are now Iraqis felled by Iraqis.
Furthermore, this inter-Iraqi violence goes well beyond the suicide bombings that still garner most of the media's attention. It escalated sharply in the last year, long before the Samarra bombing, and even as the vaunted constitutional process was unfolding.
For example, as reported by the Guardian this week, the former director of the Baghdad Morgue recently fled the country, fearing for his life after reporting that more than 7000 Iraqis had been tortured and murdered by "death squads."
According to the former head of the UN's human rights office in Iraq, most of these victims had been tortured by the Badr Brigade, the military wing of SCIRI, the Supreme Council for the Islamic Revolution in Iraq.
As we reported over a year ago on this site, SCIRI is not just some fringe element. It is one of Iraq's two key Shiite-led political factions, and one of the principal victors in the December 2005 parliamentary elections. Unfortunately, our expectations have been fulfilled. Upon acquiring power, SCIRI has behaved exactly as anyone familiar with its history -- but apparently not the US military -- would have expected.
TOUGH LIBERALS?
Meanwhile, among America's befuddled liberal intelligentsia, hard-nosed realism has been sorely missing. The December election and its January 2005 predecessor were events that most neoliberal observers -- for example, the American Prospect -- could not praise highly enough:
Iraqis have concluded one of the most successful constitutional processes in history. Rarely, if ever, before has an important country moved from tyranny to pluralism so quickly, with so little bloodshed, and with such a quality and degree of popular participation.
This assessment was spectacularly wrong. Iraq's constitutional process has not led to "pluralism," much less staunched the bloodshed.
Rather -- no doubt with ample assistance from Iranian secret agents, "foreign fighters," and other officious intermeddlers -- the process has exacerbated social and religious divisions -- divisions that Iraq was always noted for mitigating.
The continued US presence has also helped to legitimize the extremists, letting them fly the "national liberation" flag. We have reached the point where the country's armed private militias are expanding faster than the US-trained police and army.
In this perilous Somalia-like situation, with US troops
viewed as part of the problem, and shot at by all sides, it is harder and harder to justify incremental American casualties.
Indeed, about the only thing that all Iraqi factions -- apart from some
Kurds and the country's dwindling minority of remaining secularists -- agree on now is the desire for the US military to leave. We should respect their wishes.
TOO FEW TROOPS?
By now, even arch-conservative pundits like William F. Buckley have agreed that the Iraq War was a costly mistake, and that a US withdrawal is called for.
Meanwhile, however, some die-hard US neoliberal defenders of the war -- including tough guys like the New York Times' Tom Friedman and Vanity Fair's Christopher Hitchens -- are still denying the existence of Iraq's deep-seated, historically specific obstacles to democratization and unified self-rule, as well as the overwhelming opposition in Iraq to the US presence.
Of course, admitting that local history actually matters might require one to study Middle Eastern history a little more closely, or perhaps even learn Arabic.
It might also interfere with certain pet theories, like the "inevitable triumph of technology and free markets over local markets, nations, peoples, customs and practices," or the "inevitable struggle to the death between Islamic extremism and Western democracy."
From the standpoint of these and other warhawks, our only mistake in Iraq was really quite simple -- the Bush Administration sent in too few troops.
On closer inspection, this claim spins itself into the ground faster than a Halliburton drill bit.
- One key reason why more troops were not available was the fact that the war's supporters -- not only the Bush Administration, but also leading Democrats like Hillary Clinton, John Kerry, and Joe Lieberman, and their pundit camp followers -- failed to persuade anyone other than Mad Tony Blair that a variety of cockamamie theories about "democratizing the Middle East," the "connection" between Saddam and al-Qaeda, and WMDs had any validity whatsoever.
- Second, while a handful of Pentagon skeptics did support larger troop commitments before the invasion, they were in the minority -- and not just because of Rumsfeld's desire to fight the war with a high-tech army. Most of the war planners and pro-war enthusiasts alike were swept away by Friedman-like naiveté about the enthusiasm of ordinary Iraqis for US-backed "liberation." They systematically underestimated the Iraqis' nationalism and their resentment of occupation -- especially by armies of "Christian" nationals from the US and the UK. In retrospect, it is easy to say that even more troops were needed to maintain order and suppress resistance. But a larger US presence would have provoked even more resistance.
- As most US commanders agreed, the "more troops" answer is also flawed from a technical perspective, given the nature of the insurgency. It would have provided more targets for suicide bombers, without delivering a remedy for their simple IED and sniper tactics. While more troops might have provided better border interdiction, Iraq has a larger land mass than Vietnam, and twice as many neighbors. For the "more troops" claim to work with any certainty, the number would have had to rival Vietnam proportions -- at least 500,000, probably for several years. The US military manpower system has already experienced great strains trying to sustain its 133,000-troop commitment to Iraq with a volunteer army -- to be effective, the "more troops" approach might well have required a military draft.
Apart from New York's Congressman Rangel, who may have just been tweaking the establishment's chin for his black constituents, not even the most aggressive neoliberal warhawk has ever proposed that.
Ever since WMDs failed to turn up and Saddam's connection to al-Qaeda turned out to be a canard, the neoliberal warhawks have been running for cover -- worried, quite rightly, that history will not take kindly to their dissembling, and their collaboration with the Bush Administration's neoimperialists.
For much of the last three years this cover story was provided by the expectation of "nation building," "democratization," and the "training of the Iraqi Army" -- achievements that always seemed to be, conveniently enough, just around the corner.
As last week's events have dramatized, these are all just more mirages in the desert. We've run out of time and excuses.
(c) SubmergingMarkets, 2006.
Saturday, November 05, 2005
BUSH HEADS SOUTH Receives Rousing Welcome In Argentina... Fox News Analysis
President Bush received an incredibly warm welcome at the 34-nation Summit of the Americas in Mar del Plata, as thousands of ordinary people from all over the Continent turned out to hail his presence.
The effervescent US President was clearly buoyed by polls that showed that he still commands the support of an incredible 80 percent of Republicans -- otherwise known as his "base."
True, "non-base" support is reportedly a little less certain. Overall, in this week's latest polls, 59 percent expressed "disapproval," while 42 percent expressed "strong" disapproval." A quarter of the US population surveyed reported "violent morning sickness...."
However, knowledgeable insiders have called this a "temporary setback" that will be easily corrected if and when Presidential advisor Karl Rove, recently distracted by the Plame investigation, starts covering the bases again.
The President, speaking through an interpreter, voiced optimism that "Free trade and liberal investment policies, plus a few billion dollars on defense, corn subsidies, and our brand new military base in Paraguay" would completely change the lifestyles of the estimated 100 million Latin Americans who remain below the $1 per day world poverty line.
Said Bush, "These policies have only been tried for a decade or two. They need to be given a chance. Right here in Argentina, you've seen how well they've worked, right?"
Bush's sentiments were echoed by Vicente Fox, Mexico's amazingly popular lame-duck President, and Paul Martin, the astonishing Canadian PM, whose own popularity ratings have recently been taken to record levels by the Gomery Report, which documented the disappearance of $250 million of government funds, mainly by way of Mr. Martin's own party.
Said Martin: "We are quite pleased to have become a wholly-owned subsidiary of US multinationals. We didn't think we'd like the sensation, but it has become an experience that we really look forward to every night. You will also learn to enjoy it. Now if only the US would pay us that $3.5 billion...."
Said Fox: "Yes, it is true, millions of Mexican small farmers have been wiped out by free trade. But this criticism is baseless. Just look at all the remittances they are sending back home from the US !"
Meanwhile, the US President had an especially warm greeting from Diego Maradona, the famous Argentine soccer star, now in recovery. Maradona used a colloquial Argentine expression to describe just how delighted he is to finally have this particular American President visit his country.
Elsewhere, Cuba's Fidel Castro, who was not permitted to attend the summit, was reported to have decided to remove all restrictions on US trade and investment with Cuba, after having listened to President Bush's persuasive arguments.
Said the aging inveterate leftist leader, "I knew we were doing something wrong. Now I finally know what it was. We were way off base!"
After a prolonged negotiating session on Saturday, in which Summit delegates basically agreed to continue to debate the merits of free trade for a long time to come, Bush departed for a Sunday meeting in Brasilia with yet another embattled President, Luiz Inácio da Silva ("Lula").
Brasilia is a pretty lonely, desolate, and distinctly un-Brazilian place on a Saturday night, because all the whores and politicians have flown back to Rio or Sao Paulo for the weekend, and one is just left with all these 1950s-vintage monuments to Brazil's cement industry. But perhaps President Bush will find a little solace taking a moonlit walk on the empty esplanades, wandering through the otherwise flat, lifeless landscape that Roberto Campos once called "the revenge of a Communist architect against capitalist society."
Friday, November 04, 2005
"LIBBYGATE" Why Scooter Will Skate... James S. Henry
Irving Lewis Libby, Jr. was finally arraigned this week, after Special Prosecutor Patrick "Bulldog" Fitzgerald's two-year investigation. It's always nice to see warmongers twisting in the wind, but what have we really learned from all of this?
Unfortunately, the five-count federal indictment of Vice President Dick Cheney's 55-year-old Chief of Staff did not actually reveal who outed CIA spookette Valerie Plame.
But at least we do now know "Scooter's" real first name and the origins of his cute little boys' school handle.
Before Big Media's attention was deflected back to bird flu and another contentious Supreme Court nomination, the indictment also produced much speculation about whether Libby would cop a plea; whether "Official A" -- Karl Rove -- or even the Veep himself might eventually be charged; and how long the judicial torments suffered by Libby, Tom Delay, Jack Abramoff, and other inner-circle Republicans will persist.
For a few moments, it also appeared that Patrick "Bulldog" Fitzgerald might finally get down to a few of the really important issues:
- (1) To what extent did the White House, the Pentagon, its operatives, and its allies in the media and foreign governments conspire to orchestrate the fraudulent case for the Iraq War -- as opposed to just being victims of "faulty intelligence?" (E.g., "Tenet made me do it.")
- (2) How often were "house journalists" like Judith Miller, Tim Russert, and Bob Novak -- whose principal skill is trading various kinds of favors with officials in high places -- used as distribution channels for the Administration's agitprop?
- (3) If they didn't learn Valerie Plame's identity from Libby or Rove, from whom did they learn it?
- (4) What special interests -- energy companies, defense contractors, and several Middle East countries, would-be countries, and religious/ethnic factions -- helped weave the cobweb of distortions and lies that got us into this War, and have kept us in it long after even Brent Scowcroft and William Odom agree that it is a monumental US strategic blunder?
- (5) What was the role of these same interests in ensuring that so many leading Democrats have been completely supine on the War? And what other wars do they have in store for our sons and daughters?
Alas, the case against Libby & Co. is unlikely to ever reach these issues. This is not because of Fitzgerald's investigation, which was ably led by FBI agent Jack Eckenrode, known and admired as a straight shooter by this author since 1987. Rather, it is because, as argued below, Scooter Libby will almost certainly escape scot-free... just like his oldest client, Marc Rich, who's recently been implicated in paying bribes to Saddam Hussein -- post-pardon. For the incredible story, read on......
THE LIBBY CASE
At first glance, Fitzgerald's 22-page indictment seems like a good start. While perjury and obstruction of justice charges can be tough to prove, Fitzgerald's case looks straightforward. It also has the added attraction of compelling this particular crop of journalists to bite a hand that has fed them handsomely.
Fitzgerald displayed a palpable sense of relief that he'd been spared having to prosecute violations of the complex 1982 Intelligence Identities Protection Act, the original basis for his investigation.
That statute would have required him to show not only that officials like Libby and Rove who had security clearances had willfully exposed the identity of a true "covert" agent, but also that these same officials had learned the agent's identity from official sources.
By turning the case into a perjury charge, Fitzgerald avoided having to convince a jury that Plame was still a covert agent when her identity was disclosed. That wasn't going to be a slam dunk, given that she'd been driving herself to Langley every day, and that she was at least partly responsible for the decision to send her own husband, former Ambassador Joe Wilson IV, on the uranium fact-finding mission to Niger in February 2002.
There also appears to have been an organized campaign to punish Plame and her husband, with several officials leaking her identity to multiple journalists at once, and folks like the curious friend of both Lt. Colonel Larry Franklin and Judith Miller, Israeli Embassy "political counselor" Naor Gilon, also in the loop. It will be far easier for Fitzgerald to prove how Libby learned Plame's identity than to prove that any particular journalist learned it only from him.
ROLL OVER?
Considering the strength of the case, Fitzgerald's unbroken track record of convictions, and the 30-year sentence that Libby might theoretically face if he doesn't cooperate, many pundits now expect him to "roll over" and testify against the Veep or Rove.
However, the poker-faced Libby has shown no signs of knuckling under. Indeed, he has expressed confidence that "(A)t the end of this process I will be completely and totally exonerated." His attorney has indicated that Libby wants a jury trial "to clear his name."
Is this just typical defendant braggadocio? Or does this savvy member of the Bush Administration's inner circle, who also held key posts under Reagan, Bush I, and Clinton, spent 16 years as a litigator and partner at leading DC and Philadelphia law firms, and personally represented big-time felons, know something that the pundits do not?
The fact is that those who are hoping for a plea bargain here, much less a trial of the Veep, are likely to be disappointed.
While Fitzgerald has a solid case, Libby -- like his client Marc David (Reich) Rich, the fugitive from 48 felony counts who was pardoned by President Clinton in January 2001, and the six senior officials and convicted felons who were pardoned by President George H.W. Bush in December 1992 -- has a trump card.
He already knows that he will never do a single minute of jail time.
The simple, if inelegant, reason is this: Scooter Libby knows far too much, and not just about "Pflameburn."
Given his background and experience, Libby might well be in a position to bring down the entire Bush Administration on any number of matters, from secret detention centers and CIA "wet jobs" to missing funds in Iraq to Halliburton's no-bid contracts to the hyping of the case for the war. He might also have a few interesting things to say about the shenanigans of the Clinton, Bush I, and Reagan Administrations.
Absent divine intervention, therefore, the fix is in. Libby's gameplan is already clear: he will insist on a jury trial, and will try to delay it as long as possible -- perhaps up to a year, as his counsel recently indicated. That trial will thus commence in the fall of 2006 at the earliest -- and not before the November 2006 Congressional elections, if Libby has his way. The trial itself will last at least 3-6 months, and there is always a chance that Libby will not be convicted. Even if he is, the appeals would take us well into 2008, Bush's last year in office. So even if Libby is convicted, he'll receive a Presidential pardon and serve minimal, if any, jail time.
RICH IRONIES
From this angle, it was indeed ironic to learn late last week, just as Scooter was about to be indicted, that his 20-year client Marc Rich had been named by Paul Volcker as a leading provider of bribes to Saddam Hussein in the UN Oil-For-Food (OFF) scandal -- for the most part AFTER his January 2001 pardon by President Clinton.
Furthermore, it also turned out that several other key OFF beneficiaries and Saddam bribers also had close ties to both Rich and to Halliburton, the Veep's old firm -- including Mikhail Fridman's Alfa Group, Switzerland's Glencore, and US-based oil companies like Bayoil and Coastal Petroleum.
The striking thing is how bi-partisan most of these corporate kleptocrats have been.
For example, while Halliburton is closely identified with the Republican Party, Coastal's Oscar Wyatt, Jr., now also under federal indictment, has been a heavy life-long contributor to the Democratic Party.
Rich's ex-wife Denise, operating out of her New York City condo and her high-hedged mansion in Southampton, greased the skids for her ex-husband's pardon by contributing over $1 million and becoming one of the largest fundraisers for Bill Clinton's new Presidential library.
Alfa Group's Ukrainian-born Mikhail Fridman maintains close ties not only with President Putin and certain leading Moscow mobsters, but also with the Council on Foreign Relations, where Alfa has recently become a leading contributor.
And when Marc Rich pursued his Presidential pardon, his main legal gun wasn't Scooter, but Jack Quinn, the Arnold & Porter senior partner who had served as Al Gore's Chief of Staff in the early 1990s.
So, from this angle, Dick Cheney's Chief of Staff has just been trying to keep up.
When we finally sweep clean these Augean stables, we will have to employ a very large, non-partisan broom indeed.
(c) SubmergingMarkets, 2005
Friday, July 08, 2005
"PICTURES OF FOOD?" The G-8's Incredible Deal James S. Henry
"Have we just been invited to dinner and served pictures of food?"
Aging poverty rockers Bob Geldof and Bono are apparently quite satisfied with today's G-8 announcement on aid, trade, debt relief, and global warming.
Sir Bob, 51, called it "a great day...Never before have so many people forced a change of policy onto a global agenda."
Paul Hewson (Bono), 45, butchering a quote from Winston Churchill, remarked that he "would not say this is the end of extreme poverty, but it is the beginning of the end."
On the other hand, the global NGOs that follow the subjects of debt, development, and trade reform most closely disagree vehemently. For example:
- Oxfam UK said that "the outcome here in Gleneagles has fallen short of the hopes of millions."
- Christian Aid said, "This will not make poverty history. It is a vastly disappointing result."
- The Jubilee Debt Campaign said that "the G-8 stands still on debt, when a giant leap is needed."
- Global Call to Action on Poverty said that "the people have roared, but the G-8 has whispered. The promise to deliver by 2010 is like waiting five years to respond to the tsunami."
- Friends of the Earth UK said that on the issue of climate change, the G-8 accord represented "more talk, no action...a very disappointing finale."
- ActionAid said of the deal, "It is still too little, too late, and much of it is not new money. Fifty million children will die before the aid is delivered in 2010."
So whom are we to believe?
Should we believe the NGOs that are full-time specialists in these issues, but also, in a sense, have a vested interest in the glass being perpetually half-full?
Should we believe the professional celebrities and politicians who also have a huge stake in the equally curious notion that the way to "end poverty" is to rely on their episodic cycles of concern and their undeniable ability to periodically whip us all into a guilt-ridden frenzy?
Or should we and the world's poor perhaps begin to do some thinking on our own about what "ending poverty" really means, and how to go about it?
MORE LATER
Saturday, July 02, 2005
TIME'S ARROW A Media Conglomerate's Fateful Decision to Undermine Investigative Journalism James S. Henry
It was disappointing, but not really surprising, that Norman Pearlstine, Time Inc.'s Editor-in-Chief since 1995, decided last week to comply with a federal subpoena and turn over documents to the federal government that will probably burn one of his own leading journalists' sources.
As discussed below, this spineless decision was a cardinal sin against investigative journalism and the First Amendment.
But it was entirely in keeping with Time Warner's long-standing passivity and obsequiousness toward the powers-that-be.
It also reflects this media conglomerate's increasingly entangled interests with governments and corporations around the world. Thank goodness that Matt Cooper wasn't working for Time's China subsidiary!
Our condolences go out to the dwindling crop of serious investigative journalists at Time, People, Sports Illustrated, and TWX's other 130 publications, as well as CNN and HBO.
There is, however, one ray of light in Pearlstine's otherwise cowardly, inexcusable decision. We may finally get to learn the identity of the White House felon who leaked Valerie Plame's CIA relationship to the press. Already there's been some very interesting speculation....!
THE CONTEXT
For the past year, some marketing genius at AOL Time Warner -- now just Time Warner Inc. ("Twinkie," as journos know it, or TWX, as it is known to its hapless stockholders) -- has been sending us free copies of its "weekly news magazine," Time Magazine.
Even though Twinkie's stock price has been dropping more or less continuously since 1999, I am delighted that it can still afford this kind of largess. It is, after all, "the world's largest media company."
Personally, I've always enjoyed the covers and that pretentious "Person of the Year" award -- which really should still be called "Man of the Year," since they've only honored 2 women, Queen Elizabeth and Corazon Aquino, (plus 1 black and 4 Asians) since 1927. (Peter Ueberroth and Jeff Bezos, but not Nelson Mandela? FDR three times, and Truman, Ike, LBJ, Nixon, Bill Clinton, Dubya, Stalin, Deng Xiaoping, Gorbachev, and Churchill twice each? Does anyone there enjoy sucking up to power, or what?
And who WILL it be this year? The deceased Pope? The new Pope? Justice O'Connor? Bono? Abu Musab al-Zarqawi?)
Anyway, apart from all that, I almost never actually read the damn thing -- at least not since 1973, when Skip (Henry Louis) Gates and I shared an apartment in London, and he was interning for Time. It seemed like a comradely thing to do for a brother.
CONVENTIONAL JUDICIOUSNESS
In general, Time Magazine, and TWINK's myriad other publications -- People, Sports Illustrated, Yachting, Wedding & Home, Popular Science, Marie Claire, Ski, Family Circle, and over 130 others -- may be helpful and even informative for people who have regular day jobs.
But they have simply never been noted for cutting-edge investigative journalism.
On the rare occasions when they try to do it, as in CNN/Time's infamous Tailwind investigation in 1998, they usually get caught in their own knickers.
To borrow from a few leading Time Inc. titles, they're really more comfortable keeping things Real Simple, providing Entertainment Weekly, and making a lot of Money, or perhaps even a Fortune -- unless you are a hapless TWX investor.
Of course TWINK's other media properties -- especially CNN and HBO -- have contributed some valuable reporting over the years. But "Chicken Noodle Network," in particular, has been under increasing pressure from rivals like Fox to do more info-tainment and be more gung-ho.
Recently they've also had big fish to fry with the Pentagon, given the increasing importance of embedded journalism and satellite feeds in all sorts of new Third World war zones.
None of this has been especially encouraging to crusading Edward R. Murrow-style journalism.
GETTING BURNED
In this case, the Time Magazine journalist whose sources were burned, Matthew Cooper, has been fighting this subpoena through the courts for 18 months. He and New York Times reporter Judith Miller had both refused to reveal their sources to a federal grand jury that is investigating the Valerie Plame case, and had appealed their subpoenas all the way to the US Supreme Court. After the court ruled against them in late June, Pearlstine reached his own verdict.
Pearlstine denied that he was influenced at all by the risk that TWX might be subjected to a $1,000 per day fine, much less the even more costly possibility that defiance of the subpoena might jeopardize last year's settlement with the US Department of Justice. Under that settlement, the DOJ filed a criminal complaint against AOL TW for certain misconduct under securities laws, but agreed not to prosecute, so long as the company cooperated fully with the terms of the settlement.
Instead, Pearlstine simply asserted that since the US Supreme Court has decided the matter, he has no option but to comply with the subpoena.
To their credit, The New York Times, Judith Miller, and Matt Cooper, as well as -- we hope -- most other journalists and publishers, all see it differently.
- They understand that investigative journalists simply cannot do their jobs -- and by extension, the First Amendment can't be effective -- if they are not able and willing to protect their sources.
- They understand that source confidentiality is a long-standing common law tradition which has already been recognized in statutes enacted by 45 states.
- They understand that the so-called trade-off between grand jury access to confidential sources and journalists' rights is no trade-off at all -- since without confidential sources, none of the information sought would even exist in the first place.
- They understand that in individual cases, journalists who have agreed to protect their sources may have a moral obligation that may indeed, depending on the facts of the case, transcend the legal duty to help prosecute a specific crime.
In short, Pearlstine's short-sighted stance, if emulated by other news-gathering organizations, could pose a real challenge to hard-hitting investigative journalism -- at least to any that is still being done inside media conglomerates like TWX, Disney, and Viacom.
This is just one more example of the kind of pusillanimous behavior that we have come to associate with publicly-owned media giants.
They have so many interests at stake in their dealings with the government and their fellow corporate giants that they simply lack the will to do vigorous investigative reporting.
The good news is that while these corporate giants atrophy, a whole new generation of spunky alternative sources for news and investigative reporting is springing up.
Unlike some media conglomerates, we protect our sources -- and there are no corporate hirelings who might "balance the interests at stake" and say otherwise.
We do feel sorry for all the many good reporters at organizations like Time, CNN, and HBO, as well as for any writers affiliated with Little, Brown or TWX's other publishing ventures.
Like Matt Cooper, they may be willing to keep promises to their sources themselves.
But how can they ever be sure now, if push comes to shove, that the commitments they've made to their sources will be respected by the myriad of senior editors and other corporate executives above them?
AND THE LEAKER IS....?
Meanwhile, speculation continues to build about who the White House source or sources might be who leaked Valerie Plame's status as a CIA operative to the press last year -- reportedly out of pique at her husband's anti-Bush stance. The latest rumor on the street is that it is Karl Rove.
If true, some might say -- well, source confidentiality be damned! What could be more satisfying than to see this bloated blow-hard take a few laps in the federal pen for perjuring himself before a grand jury?
After all, should a confidential sources rule that is primarily intended to protect whistle-blowers really be extended to a government leaker who's gone on the offense, trying to punish political enemies while hiding behind confidentiality? Some in the journalism community have indeed argued not.
However, from our standpoint, one person's "offensive leaker" is another's "whistleblower" -- that's not an easy line to draw.
Far better that we protect the right of journalists to honor their commitments to confidential sources of all kinds -- except perhaps in the "hard cases" where human lives are clearly at risk. Unless Dubya is a threat to Karl, presumably that's not the situation here.
(c) SubmergingMarkets, 2005
Wednesday, May 18, 2005
"I AM NOT NOW, NOR HAVE I EVER BEEN, AN OIL TRADER!" George Galloway Kicks Senate Butt
This week's developments in the so-called Iraq Oil-for-Food scandal ("OFF") have turned out to be nothing less than a fiasco for the US Senate's Permanent Investigations Subcommittee and its feckless freshman Republican Chairman, Minnesota's Norm Coleman.
In the first place, a newly-released minority staff report by Democrats on the Subcommittee shows that Bayoil USA, a Houston-based oil trading company headed by David B. Chalmers, Jr., now under indictment, was by far the most important single conduit for the illegal surcharges pocketed by Saddam Hussein under the program.
The report showed that more than half of Iraq's oil sales that generated surcharges for Saddam were made to US buyers during the period September 2000 to September 2002, most of them right under the nose of the Bush Administration and the US Treasury's rather lackadaisical Office of Foreign Assets Control.
Other US companies that have reportedly received subpoenas in the on-going surcharges investigation include ExxonMobil, ChevronTexaco, and Houston's El Paso Corp, as well as prominent Texas oilman Oscar S. Wyatt Jr., who was also deeply involved in supporting and profiting from oil-for-food.
Next, British MP George Galloway, appearing voluntarily before the Subcommittee, delivered a feisty denial of allegations that he had personally profited from the oil allocations, as well as a withering assault on the last twenty years of US policies toward Iraq.
Meeting little resistance from the badly-outgunned Senators, Galloway made the points that
- He met with Saddam no more times than Donald Rumsfeld, who had met with Saddam to sell arms and provide maps, while Galloway met him to seek peace and encourage arms inspections;
- He had actually opposed Saddam's policies way back in 1990, while the first Bush Administration was still making loans and selling arms to Saddam;
- He had always opposed the oil-for-food program as a poor substitute for lifting sanctions, which unfairly punished all Iraqis for the sins of their dictator -- especially Iraq's children, up to 1 million of whom may have died because of increased infant mortality;
- The Subcommittee's investigation was a "smokescreen" that distracted attention from far more serious issues -- such as the disappearance of more than $8.8 billion of Iraqi national funds during the first year after the US invasion.
The combative Scot's hard-hitting testimony makes compelling viewing.
Meanwhile, we recall that back in June 2003, J. Bryan Williams III -- ExxonMobil's former head of global crude procurements, and the US' hand-picked UN overseer on the Iraq Sanctions Committee, in charge of making sure that Saddam did not obtain any illicit income from the oil-for-food program -- pled guilty to evading taxes on $7 million, including a $2 million kickback to help Mobil win business in Kazakhstan's oil dictatorship.
So there is at least some good news here, Senators -- if you want to find big-time corruption in the international oil trade, you don't have to go looking for it in London, Moscow, or Paris.
These developments also help to put Senator Coleman's continual "head-hunting" of UN Secretary General Kofi Annan in perspective. While there's no evidence that Kofi profited personally from OFF, his minions were probably not squeaky-clean. But the enormous profits earned by Saddam's "fellow travelers" in Houston make them seem like pikers.
Furthermore, while Kofi is certainly not much of an effective manager, we now know from the Bolton hearings that administrative skill doesn't count for very much with the Bush Administration.
Indeed, it appears that Annan's key fault is that he had the temerity to oppose the Iraq invasion, and even to label the War "illegal" -- once the invasion had already occurred. With Paul Volcker's final report on the oil-for-food scandal due out soon, and US Ambassador to the UN John Bolton (!) likely to arrive as soon as he clears the Senate and adjusts his meds, the outlook for the summer is definitely for more fireworks.
***
(c) SubmergingMarkets.Com, 2005.
Saturday, January 01, 2005
SO-CALLED “NATURAL” DISASTERS Part I. Overview James S. Henry
For the second year in a row, December comes to a close with a dramatic reminder of the precariousness of daily life in the developing world -- and the continuing failure of the international community to provide adequate early warning systems, pre-crisis funding, and rapid, effective global relief for the victims of so-called “natural disasters” -- most of which are actually quite predictable, at least in the aggregate.
This year, on December 26, 2004, it was the magnitude 9.0 earthquake off the western coast of northern Sumatra, Indonesia's second largest island -- the fifth largest earthquake recorded since 1900.
One year ago to the day, on December 26, 2003, the disaster in question was the magnitude 6.6 earthquake that devastated the city of Bam in southeast Iran, at a cost of 26,500 lives, 25,000 injured, and 80,000 homeless.
The death toll from this year's Sumatra quake is likely to exceed 150,000, with thousands of people still missing, several hundred thousand who have been seriously injured, and more than five million -- most of whom were impoverished to begin with -- suffering from thirst, hunger, homelessness, lost employment, and the threat of mass epidemics.
Furthermore, as we were also reminded in Bam, among the worst consequences of such catastrophic events are the longer-term traumas associated with disease, losing friends, family, fellow citizens, livelihoods, communities, and whole ways of life.
As usual -- and as was true in the case of 9/11, for example -- much of the initial media coverage of this Sumatra tsunami has focused on body counts, other dire visible consequences, and the massive relief effort that has followed.
That is to be expected. But before our attention span drifts too far off in the direction of some other new Third World calamity, it may be helpful to step back and examine some of the systematic factors that contribute to the high costs of such mishaps over and over again, and the extraordinary costs of this "natural" tsunami disaster in particular.
Our overall theme is that there is really no such thing as a “natural disaster” per se. This is not to say that man-made forces were responsible for Saturday’s tsunami. But, as discussed below, the degree to which any such event results in a social and economic “disaster” is often to a great extent under our control.
In the case of this particular tsunami, its high costs:
- Were entirely foreseeable, at least in a “sometime soon” sense, based on both long-term and recent experience with tsunamis in the Indonesian arena;
- Were actually foreseen by several geological experts, some of whom have been advocating (unsuccessfully) an Indian Ocean tsunami early warning system for years;
- Could have been substantially mitigated if US, Japanese, and other scientists around the globe -- who monitor elaborate earthquake- and tsunami-warning systems and had ample warning of this event -- had simply shown a reasonable degree of human concern, imagination, and non-bureaucratic initiative;
- Might have been avoided entirely with a relatively modest investment in tsunami “early warning systems” for Indonesia and the Indian Ocean.
Furthermore, the global response to this horrific disaster has been long on the size of aid pledges, dignitary press conferences, and “oh – the horror” press coverage.
It has been conspicuously short on actual aid getting through to the front lines. Today, almost a week after the disaster, aid efforts are well-funded, but they remain sluggish, disorganized, and ineffective, with at least as many additional lives in jeopardy right now for want of aid as perished in the original waves.
This is partly explained by the sheer difficulty of getting aid through to remote regions like northern Sumatra. But, as explained below, it is also due to political factors, and the fact that the world community still runs its humanitarian relief efforts like a “pick-up” softball game.
Fortunately, this particular crisis seems to have captured the attention of the world's donor community. At this point, with more than $2 billion in aid pledged by governments, multilateral institutions, and more than 50 private relief organizations, the real problem is not money, but organization.
But we may want to demand that the UN, the US Government, the EU, and all these relief organizations get their acts together, and establish a permanent, well-run, well-funded global relief organization that can move more quickly the next time around. Along the way, they should also pay far more attention to preventive systems that can help save the future victims of such disasters, before all the relief becomes necessary.
© James S. Henry, Submerging Markets™, January 05
Friday, December 10, 2004
Global Growth, Poverty, and Inequality Part I. A Little Christmas Cheer? James S. Henry and Andrew Hellman
The Christmas season is a very special time of year, when Americans, in particular, engage in a veritable month-long orgy of holiday revels and festivities, including eggnog sipping, Santa sitting, package wrapping, neighborhood caroling, tree decorating, menorah lighting, turkey stuffing, and generally speaking, spending, getting, and giving as much as possible, at least with respect to their immediate friends and family.
We certainly don’t wish to question the legitimacy of all these festivities. After all, as this November’s Presidential election has reminded us, ours is surely one of the most powerful, vehement, unapologetic Judeo-Christian empires in world history. Like all other such empires, it has every right to celebrate its triumph while it lasts.
According to the latest opinion surveys, this is indeed an incredibly religious nation, at least if we take Americans at their word. More than 85% of American adults consider themselves "Christians," another 1.5% consider themselves "Jews," 84% pray every week, 81% believe in life after death, 60% believe the Bible is "totally accurate in all its teachings," 59% support teaching creationism in public schools, and fully 32% -- 70 million people, including 66% of all evangelicals -- would even support a Constitutional Amendment to make Christianity the official US national religion.
In light of all this apparent religious fervor, it is disturbing to read several recent analyses by OXFAM and the UN of certain persistent, grim social realities around the world – and our paltry efforts to redress them. Is the intensity of our religious rhetoric and this season's celebrations just a way of escaping these unpleasant realities?
CHRISTMAS CHEER?
· According to the UN’s International Labor Organization (December 2004), among those still waiting for economic justice are nearly three-quarters of the world’s population – 4.7 billion people -- who somehow manage to survive on less than $2.50 per day. These include 1.4 billion working poor, half of the 2.8 billion people on the planet who are employed.
· According to the UN’s Food and Agriculture Organization (December 2004), the world’s poor now include at least 852 million people who go to bed hungry each night – an increase of 20 million since 1997. The continuing problem of mass famine has many side-effects – including an estimated 20 million low-birth-weight babies born in developing countries each year, and another 5 million children who simply die of malnutrition each year. In some countries, like Bangladesh, half of all children under the age of six are malnourished.
· Overall, for the 5.1 billion residents of low- and middle-income countries, average life expectancy remains about 20-30 percent shorter than the 78-year average that those who live in First World countries now enjoy. By 2015, this will produce a shortfall of some 50 million poor children and several hundred million poor adults. But at least this will help us realize the perhaps otherwise-unachievable “Millennium Development Goals” for poverty reduction.
· According to UNICEF (December 2004), more than 1 billion children – half of all children in the world -- are now growing up hungry, in unhealthy places that are suffering from severe poverty, war, and diseases like HIV/AIDs.
· According to Oxfam (December 2004), First World countries have basically reneged on their 1970 promise to commit .7 percent of national income to aid to poor countries. Last year such aid amounted to just .24 percent of national income among OECD nations, half the 1960s average. And the US commitment level was just .14 percent, the lowest of any First World country, and less than a tenth of the Iraq War’s cost to date.
· This month’s 10th UN Conference on Climate Change (COP-10) in Buenos Aires reviewed a growing body of evidence that suggests that climate change is accelerating, and that the world’s poor will be among its worst victims. Among the effects that are already becoming evident are widespread droughts, rising sea levels, increasingly severe tropical storms, coastal flooding and wetlands damage, tropical diseases, the destruction of coral reefs and arctic ecosystems, and, God forbid, a reversal of the ocean’s “thermohaline” currents.
Overall, as the conference concluded, for the world’s poorest countries – and many island economies – such effects pose a far greater threat than “global terrorism.”
So far, however, the US – which, with less than one-twentieth of the world’s population, still produces over a fifth of the world’s greenhouse gases -- seems determined to do nothing but stand by and watch while energy-intensive “economic growth” continues. This year’s oil price increases have slowed the sales of gas-guzzling SUVs somewhat, yet more than 2.75 million Navigators, Hummers, Land Rovers, Suburbans, and Expeditions have already been sold. The US stock of passenger cars and light trucks, which accounts for more than 60 percent of all US oil consumption, is fast approaching 220 million -- almost 1 per person of driving age.
Meanwhile, leading neoconservative economists and their fellow-travelers in the Anglo-American media continue to tout conventional measures of “growth” and “poverty.” Indeed, according to the most corybantic analysts, a free-market-induced “end to poverty as we have defined it” has either already arrived, or will only require the poor to hold their breath just a little bit longer – until, say, 2015.
As we will see in Part II of this series, this claim turns out to be -- like so many other elements of modern neoconservative dogma – a preposterous falsehood. But it does help to shelter our favorite dogmas – religious and otherwise -- from a day of reckoning with the truth.
December 10, 2004 at 03:16 PM | Permalink | Comments (0) | TrackBack
Friday, December 03, 2004
“WHERE’S WARREN?” Bhopal’s 20th Anniversary
Today marks the twentieth anniversary of the deadly December 3, 1984, chemical gas leak at an Indian pesticide plant in the very center of Bhopal, a city of 90,000 – just a little larger than Danbury, Connecticut -- in the state of Madhya Pradesh, in central India. At the time the plant was owned by Union Carbide India, Ltd. (UCIL), an Indian company whose majority (50.9%) shareholder was Danbury-based Union Carbide Corporation (UCC), which was acquired by Dow Chemical in 2001.
This anniversary provides us with an opportunity to reflect on “lessons learned” from this disaster – including the need to make sure that the globalization of trade and investment is also accompanied by the globalization of justice for the victims of transnational corporate misbehavior.
Download WhereIsWarren12032004.pdf
THE COSTS
As a recent report by Amnesty International details, this industrial accident, perhaps the worst in history, killed some 7,000 to 10,000 people in the first few days, including many children.
There were also serious long-term injuries to up to 570,000 others who were exposed to the fumes.
At least 15,248 of these survivors have already died because of their injuries – in addition to the 7,000 to 10,000 initial victims.
Up to 570,000 others continue to suffer from a wide range of serious health problems, including birth defects, cancer, swollen joints, lung disease, eye ailments, neurological damage, and many other painful, long-term illnesses.
Thousands of animals also died, and many people lost their homes, jobs, income, and access to clean water.
WHO WAS TO BLAME?
Union Carbide’s ultimate responsibility, as UCIL’s parent, for this accident is very clear. In the middle of the night, a cloud of lethal gas caused by the leak of at least 27 tons of “methyl isocyanate” (MIC), a highly toxic, odorless poison, and another 13 tons of “reaction products” began wafting through the city center. The gas spread without warning throughout the town. The leaks continued for more than two hours before any alarms were sounded.
All six of the plant’s alarm systems failed. It was later shown that the company management had systematically tried to cut corners on safety and warning equipment – by, for example, failing to equip the plant with adequate safety equipment and trained personnel to handle bulk MIC storage; failing to apply the same safety standards that it used in the US; and failing to ensure that there was a comprehensive plan to warn residents of leaks.
In fact, company staff and many others were aware of the risks created by this situation. In June 1984, six months before the accident, an Indian journalist had written an article about them: “Bhopal – On the Brink of Disaster.” But nothing was done – partly, according to Amnesty, just to cut costs.
The result was that shortly after midnight on December 3, 1984, Bhopal’s families woke up screaming in the dark, unable to breathe, their eyes and lungs on fire from the poison, choking on their own vomit. By daybreak there were already hundreds of bodies on the ground, with scores of funeral pyres burning brightly.
In addition, long before the 1984 accident, there had been a series of leaks at the site that management was well aware of, and which caused serious pollution – contamination that continues to this day.
All told, as the Amnesty report makes clear, this amounts not only to a health and environmental disaster, but also to a serious infringement of the human rights of thousands of Indian citizens.
CONTINUING IMPUNITY
All this was bad enough. But the other key part of Bhopal’s injustice has to do with the fact that key actors like Dow Chemical/Union Carbide, the Indian Government, and the individual US and Indian senior executives and other officials who were responsible for the accident have managed to avoid liability for the full costs of the “accident,” as well as personal accountability.
This impunity was underscored this week when the BBC fell victim to a hoax perpetrated by someone who pretended to be a Dow Chemical executive. He concocted a false statement that the company was reversing its denial of all responsibility for Bhopal, and was establishing a $12 billion fund for 120,000 victims.
In fact,
· Union Carbide (UCC) and Dow Chemical, UCC's new owner since it purchased the company for $10.3 billion in 2001, have consistently denied any liability for the disaster. They have argued, for example, that UCC was a “domestic” US company, with no “operations” in India. Supposedly it was also not responsible for UCIL’s actions, because UCIL was just an “independent” Indian company.
· In fact, until UCC disposed of its interests in UCIL in 1994, it maintained its majority (50.9 percent) ownership of UCIL. Furthermore, according to the Amnesty report, UCC played an active role in UCIL’s management and board activities, and was responsible for the detailed design, senior staffing, and on-going operating procedures and safety at the Bhopal plant.
· Furthermore, as UCC’s CEO at the time, Warren Anderson, bragged before the US Congress in 1984, Union Carbide had 100,000 employees around the world. At the same time, another senior UCC executive, Jackson Browning, said that UCC’s “international operations represented 30 percent of sales,” and that “India was one of three dozen countries where the company has affiliates and business interests.”
· After the leak, according to the Amnesty report, UCC officials (1) tried to minimize MIC’s toxicity; (2) withheld vital information about its toxicity and the reaction products, which they treated as trade secrets; and (3) refused to pay interim relief to the victims.
· The Indian Government and the State Government of Madhya Pradesh also bear grave responsibility for the disaster itself, and then for striking an irresponsible private settlement with the perpetrators. As the Amnesty report makes clear, environmental regulations were very poorly enforced against UCIL. Then, having sued for $3 billion in damages in 1988, the Indian Government settled for just $470 million in 1989, without adequate participation from victims. The Indian Government has also discontinued medical research on the impact of the gas leak, and failed to publish its interim findings.
In October 2003, it was disclosed that by then, some 15,298 death claims and 554,895 claims for other injuries and disabilities had been awarded by the Madhya Pradesh Gas Relief and Rehabilitation Department – five times the number assumed in the settlement calculations by the Indian Supreme Court.
· UCC’s insurance paid that paltry amount in full. But then the Indian Government was very slow to pay out the money to victims. As of July 2004, $334.6 million had been paid out, while $327.5 million was still sitting in Indian government custody. At that point, 20 years after the disaster, the Indian Supreme Court finally ordered that the remaining money be paid out to some 570,000 registered victims – an average of $575 apiece. Even these payments won’t all get to the victims; a significant portion is reportedly consumed by India’s notorious bribe-ridden state bureaucracy.
· Local authorities in Bhopal filed criminal charges against both UCC and its former CEO Warren M. Anderson in 1991-92. Anderson was charged with “culpable homicide (manslaughter),” facing a prison term of at least 10 years. He failed to appear, and is still considered an “absconder” by the Bhopal District Court and the Supreme Court of India.
However, despite the existence of a US-India extradition treaty, the Indian Government has failed to pursue a request for Anderson’s extradition vigorously.
The 82-year-old Anderson, who is still subject to an Indian arrest warrant, has a very nice home with an unlisted number in Bridgehampton, New York, and another in Vero Beach, Florida.
Meanwhile, although the Indian Government has been willing to hold local Indian companies that operate hazardous businesses strictly liable for damages caused by them, it has been reluctant to apply this rule to transnational companies -- perhaps because it is more worried about attracting foreign investment than ensuring that foreign investors manage their activities responsibly.
SUMMARY – GLOBALIZING JUSTICE
Overall, twenty years after the original incident, Bhopal remains a striking example of transnational corporate misconduct, an incredible case of the negligent mishandling of a true “chemical weapon of mass destruction.”
This behavior may not have been as culpable, perhaps, as the willful use of toxic weapons against innocent civilians by former dictators like Saddam and Syria's Assad. But it was no less deadly.
As we saw above, Bhopal was also an example of the incredible loopholes that still apply to leading companies in globalized industries.
Especially in corruption-ridden developing countries like India, they have often been able to take advantage of lax law enforcement, weak safety regulations, clever holding company structures that limit liability, and the sheer expense of bringing them to justice.
Evidently the globalization of investment and trade is not sufficient. Economic globalization needs to be augmented by the globalization of justice. Among other things, that means it is high time for transnational corporations to be subject to an enforceable code of conduct, backed up by an International Court for Corporate Responsibility.
© James S. Henry, SubmergingMarkets™, 2004
December 3, 2004 at 09:26 PM | Permalink | Comments (1) | TrackBack
Saturday, October 30, 2004
IRAQI FUBAR: The Road to God Knows Where
Whatever Americans may choose to believe about whether they are really better off with Saddam Hussein gone, it is by now evident that, nineteen months after the US invasion of Iraq, most ordinary Iraqis feel much less secure, much less well-off, and more anti-American than ever before.
Indeed, the country appears to be spiraling out of control. Nor is it clear that even a sharp post-US election crackdown by Coalition Forces, with all the attendant Iraqi casualties that it is likely to produce, will be able to turn this trend around.
So far, neither leading US Presidential candidate appears to have fully come to grips with this deteriorating situation in Iraq, or the fundamental strategic blunders that underlie it. At least they are not saying so in public.
A rather undistinguished, pig-headed President continues to defend his faith-based initiative for “democratizing” the Middle East.
A rather undistinguished junior Senator from Massachusetts -– who has spent much of his life trying to be on all sides of recent US wars -- continues to argue that the key problems with the Iraq War have been tactical – too few troops and equipment, too little Allied support, too few trained Iraqis, careless handling of high explosives, and so forth.
Neither of these positions is realistic.
Indeed, as we will argue below, regardless of who is elected US President on November 2, the facts on the ground in Iraq are now pointing relentlessly toward one seemingly counter-intuitive conclusion:
The US will only be able to stabilize Iraq, preserve that country's national unity, win more support for the interim government, undermine the role of “foreign terrorists” in the country, and secure a modicum of domestic and international support for “democratization” if and when it links the calendar for Iraqi democratization and constitutional reform to a firm, near-term timetable for the withdrawal of all US forces (though not necessarily all Coalition forces) from the country.
October 30, 2004 at 06:29 PM | Permalink | Comments (1) | TrackBack
Wednesday, September 22, 2004
Democracy in America and Elsewhere: Part III: How the US Stacks Up: - A. Qualifying Voters
We certainly wish President Bush much greater success than President Woodrow Wilson, who saw his own favorite proposal to “make the world safe for democracy,” the Versailles Treaty, throttled by Republican Senators who opposed the League of Nations, and suffered a stroke in the ensuing battle.
Knowing President Bush, he will probably not be dissuaded from his mission by this unhappy history, or by the fact that many other world leaders, like France's Chirac and Brazil's Lula, are now much more concerned about fighting global poverty and taxing "global bads" like arms traffic, anonymous capital in offshore havens -- an idea we first proposed in the early 1990s -- and environmental pollution than they are about neo-Wilsonian evangelism.
But of course any suggestion by the US that democracy can actually be propagated by multilateral consensus rather than by unilateral military aggression is always to be welcomed.
Before proceeding any farther with this latest American crusade to sow democracy abroad, however, it may be helpful to examine how the US itself really stacks up as a “democracy," relative to "best democratic practices" around the world.
One approach to this subject would be to start off with a comparison with other leading First World democracies like the UK or France. After all, at the outset, one might think that only such countries have the well-educated, politically-engaged citizenry, political traditions, affluence, and technical know-how needed to implement truly state-of-the-art democratic processes.
However, following the lead of former President Jimmy Carter’s brief comparative analysis of Peru in 2001, we find it more interesting to see how the US compares with younger developing democracies that lack all these advantages – much less access to the yet-to-be-created UN Democracy Fund.
In our case, we’ve chosen Brazil, the world’s fifth most populous country, with 180 million inhabitants, two-thirds of South America’s economic activity, a federal system, and a long history of slavery (like the US).
As we’ll see, our overall finding is that while Brazil’s democracy has plenty of room for improvement, it already boasts a much more democratic electoral system than the United States of America.

While Brazil’s electoral institutions are by no means perfect, and its campaign finance laws and federal structure have many of the same drawbacks as the US, it has recently been working very hard to improve these institutions.
Indeed, it turns out that Brazil is making remarkable progress toward effective representative democracy, especially for a country with enormous social problems, a high degree of economic and social inequality, and a per capita income just one third of the US level.
Brazil’s new democracy provides a striking contrast along many dimensions – in particular, the processes and structures by which it (1) qualifies voters, (2) conducts campaigns, (3) administers voting, and (4) provides fair representation of voter preferences. The following essay focuses in on the first of these elements; the sequel will deal with the others.

1. Mandatory Voting and Voter Registration
Brazil. Brazil requires most adult citizens to vote. Actually “mandatory voting” is a misnomer – people are just required to show up at a polling station or consular office and submit a vote, which can be blank. There are fines for violators who lack valid excuses, like illness.
Brazil adopted mandatory voting in part to overcome the apathy induced by more than two decades of military rule. It is just one of many countries that have mandatory voting, including Australia, Belgium, Cyprus, Greece, Luxembourg, Liechtenstein, one Swiss canton, Egypt, Fiji, Singapore, Thailand, Argentina, Bolivia, Costa Rica, the Dominican Republic, Ecuador, Uruguay, and Venezuela.
Mandatory voting in Brazil is facilitated by the fact that, as in 82 other countries, all Brazilians age 18 or over are required to obtain a national identity card, with their photo, fingerprint, signature, place and date of birth, and parents’ names.
These cards, which are now becoming digital, are needed to qualify for government services and to conduct financial and legal transactions. They also enable cardholders to vote at polling booths anywhere in the country, eliminating the need for a separate, costly voter registration process.
To encourage voter turnout, Brazil also makes Election Day a national holiday, and often holds its elections on Sundays. Any eligible voter may be required to assist for free at the polls.
Mandatory voting, together with Brazil’s proportional representation system (see Part IIIB), has yielded voter turnouts in recent national elections that have routinely exceeded 75 percent of the voting-age population (VAP).
By comparison, US voter turnouts have recently averaged less than 45 percent of the VAP.
Brazil’s mandatory system has also had many other benefits. It has probably increased turnout the most among social groups that have much less access to education and income, thereby boosting their “voice” in the political system. It has also placed pressure on public authorities to implement efficient voting procedures, and shifted responsibility for registration and turnout away from Brazil’s political parties, allowing them to focus on campaigning.
As one might expect, mandatory voting does produce slightly more blank votes as a proportion of all votes than we see in US elections. But the system also seems to have made voting more habitual.
Some countries, like Austria and the Netherlands, have recently abandoned the practice, and Brazil is also considering this, now that the population has re-acquired the voting habit. As Brazil matures, especially given its use of proportional representation, it may well be able to follow in the footsteps of these other countries and eliminate mandatory voting without sacrificing high turnout.
The US. Voting is entirely voluntary in the US, and there are no national identity cards or centralized voter registration systems. Originally, many states viewed voter registration as undemocratic. But in the course of the 19th century, growing concerns over vote fraud, combined with the desire in some states to curb voting by blacks and the lower classes, led to the widespread adoption of stricter voter registration laws. By now, every state but North Dakota requires voters to “register” before they can “vote.” US elections are also never held on Sundays, nor is Election Day a national holiday.
As we’ll examine closer in Part IIIB, the US’ “winner-take-all” electoral system is also highly inefficient, with more than 95 percent of all Congressional incumbents now re-elected, and almost all US House and Senate races now a foregone conclusion. So US voters are naturally not eager to participate in such “Potemkin” elections, which are approaching Soviet-like party reelection rates (though the US does have TWO Soviet-like parties.)
None of this has helped to encourage voter turnout. Not surprisingly, therefore, for the entire period 1948-1998, US voter turnout averaged just 48.3 percent as a share of VAP, and ranked 114th in the world. This was the lowest level among all OECD countries -- forty percent lower than the average turnouts recorded in First World countries like Germany, Italy, Sweden, and New Zealand. Even if we omit the 17 countries like Brazil with mandatory voting, it is hard to make this track record look like an achievement.
One can argue that relatively low turnout is precisely the point. Indeed, participation by ordinary Americans in their political system has always been a trifle unwelcome. For example, just 6 percent of all American citizens – 20 percent of whom were slaves -- participated in George Washington’s election in 1789. This was mainly because most state legislatures at the time had decreed that voters had to be white, propertied, male, Protestant and at least 21 years old. Studies of 19th century voter turnout in the South also show that turnout, which once exceeded 60 percent in the 1880s, plummeted sharply in the next 30 years under the impact of tougher registration laws that targeted black voters. To this day, the Neo-Republican South still boasts the lowest turnout rates and highest black population shares in the country.
Some cynics argue that low US turnout rates are just a sign of how deeply “satisfied” American voters are with the way things are. However, these turnout rates have declined sharply over the last three decades, at a time when it is hard to believe that Americans have become more and more satisfied with their political system.
In 1968, for example, 73.2 million Americans voted, a 61 percent turnout level. Thirty years later, in 1998, the number of Americans who voted was still just 73 million -- despite the fact that US population had increased by 40 percent.
Beyond voting, as of 2002, one US citizen in three (33.6 percent) did not even bother to register to vote. And that proportion was higher than it was in 1993, when Congress passed the National Voter Registration Act, which was intended to facilitate voter registration.
Evidently a majority of American voters have now become so “satisfied” that they no longer choose to participate in the system at all. According to this bogus "apathy" theory of non-registration, the most “satisfied” groups of all must be blacks, other minorities, youth, the poor, and residents of Southern states, whose turnout rates are all miserably low.
In 2002, in four states (Texas, West Virginia, Indiana, and Virginia), less than 40 percent of all eligible citizens of voting age voted. Of 24 million Americans between the ages of 18 and 24, 38 percent registered, and 4.7 million, or 19.3 percent, voted. Just 27 percent of unemployed citizens, 30 percent of Hispanic citizens, 30 percent of Asian American citizens, 30 percent of the 35 million disabled Americans, 35 percent of all women ages 18 to 44, 37 percent of high school graduates, and 42 percent of all black citizens voted.
In fact, as we’ll examine later, there are very important structural reasons that help to explain why these groups fail to register or vote.
In the case of black males, for example, prisoner and ex-felon disenfranchisement may account for a substantial fraction of their relatively low participation rates. And 70 percent of those who registered and didn’t bother to vote in 2002 blamed logistical problems – transportation, schedule conflicts, absence from home, registration problems, homelessness (2.3-3.5 million adult Americans, depending on the year), the failure to get an absentee ballot on time, inconvenient polling places, or illness (including 44% of non-voting registrants age 65 or older).
All these obstacles affect poorer, less educated, older voters more than others. Most of them might easily be addressed with improved voting technology, if this country’s leaders, despite their putative concern for democratization around the world, were really serious about implementing democracy at home.
Meanwhile, in 1998, some 83 million Brazilians voted – 10 million more than in the entire US, which has about 100 million more citizens. Brazil’s voter turnout has increased dramatically since the 1960s, from 37 percent of VAP in 1962 to an average of more than 80 percent in 1994-2002. In 2002, while 88 million Americans proudly exercised their right to vote, so did 91 million Brazilians – for an 81 percent turnout. On the “satisfaction” theory, all these Brazilians must be nostalgic for the dictatorship.
After the 2002 Congressional elections, some US political pundits were impressed because voter turnout had increased slightly, from 41.2 percent in 1998 to 42.3 percent (46.1 percent of all citizens).
From an international perspective, however, that merely put the US on a par with Haiti and Pakistan –- just half of Brazil’s level.
Overall, the US trends described here are hardly indicative of “voter satisfaction.” Rather, they are a very disturbing sign that there are deep structural impediments to voting in America. Furthermore, the grass roots organizing power that has always been essential for getting out the vote in this country, much of it supplied by parties and unions, may have been waning.
From this angle, it will be very interesting to see whether this November’s contest, and the elaborate new organizing drives that have been mounted to increase US voter turnout and registration, will reverse these trends. No doubt turnout will be higher than it was in the dismal 2002 off-year election, but that's not saying very much. A more telling indicator will be to see whether turnout surpasses the (relatively modest) 59 percent median VAP turnout rate that the US recorded in nine Presidential elections over the whole period 1968-2000. We would love to see it happen, but since that would amount to a 10 percent improvement over the turnouts recorded in 1996 and 2000, we doubt it will happen.
2. Voting Rights for Prisoners and Ex-Felons
Brazil. Disenfranchising prisoners and ex-felons is unfortunately a longstanding, widespread departure from “one person, one vote” -- a legacy of the age-old practice of excommunicating social outcasts. Worldwide, there is a growing trend toward discarding this medieval practice, with 32 countries now allowing all prisoners to vote and 23 more that allow certain classes of them to do so.
Brazil is one of 54 countries that prohibit prisoners from voting while they are in jail, but it permits them to vote after they are released, or are on parole or probation.
The US. The American approach to prisoner voting is much more restrictive than Brazil's. All but 2 (Vermont and Maine) of the 50 states disenfranchise all incarcerated prisoners, including those awaiting trial. Thirty-four states disenfranchise all felons on parole, while thirty disenfranchise those on probation.
Furthermore, the US is one of only 8 countries where ex-felons are temporarily or permanently disenfranchised even after they have completed their sentences, unless they successfully petition the authorities to have their voting rights restored. In 7 US states, felons are disenfranchised for several years after serving their sentences – for example, 5 years in Delaware, or 3 years in Maryland. In 3 states – Arizona, Maryland, and Nevada -- recidivists are permanently disenfranchised. And in 7 other states – Alabama, Nebraska, Kentucky, Mississippi, and the “battleground states” of Iowa, Florida, and Virginia – all ex-felons are permanently disenfranchised.
Many of these rules date back to the post-Reconstruction period of the 1880s, when they were enacted by Southern and border states to maintain control over the newly-freed blacks -- contrary to the spirit of the 15th Amendment.
The impact of prisoner and ex-felon disenfranchisement on electoral outcomes is much greater in the US than Brazil, because of the electoral college system and the size, composition and location of the US convict population. Indeed, while Brazil's prison system is horribly overcrowded, its entire prison population is just 285,000 inmates -- .2% of Brazil’s voting-age population.
The US, in contrast, now has the world’s highest proportion of its population in prison or jail, on probation or parole, or otherwise under correctional supervision. As of August 2004, this “correctional population” totaled 7.2 million adults, 3.3% of the US VAP. Relative to population, as well as in absolute terms, this is the largest US prison population ever. It is also by far the largest prison population in the world, well ahead of the US’ closest competitors, China and Russia.
There are also another 3.2 million American citizens – 1.4% of the US VAP -- who have served time in state or federal prison for felonies and are no longer in correctional programs. Depending on their states of residence, they may be subject to the voting restrictions imposed on former felons in the US.
Both these totals have soared since 1980 because of stiffer drug and sentencing laws -- the “correctional” population as a share of VAP has almost tripled, from 1.17% to 3.3%. (See Figure 3A-1.)
Furthermore, compared with 1980, when a majority of state and federal prison inmates were serving time for violent crimes, a majority are now either awaiting trial because they cannot afford bail, or are serving time for non-violent offenses, more than a quarter of which were relatively minor drug-related offenses.
Drug Offenses and Disenfranchisement. As other analysts have recently noted, such drug offenses rarely involve “victims,” and there is a high degree of prosecutorial discretion. This makes them especially vulnerable to racially-discriminatory arrest practices. For example, recent studies of drug arrest rates show that black arrest and conviction rates for drug-related offenses are way out of proportion to drug use in the black community, and that the disparity between black and white arrest rates for drug use has been soaring because of policing practices, not because of greater underlying criminality.
The resulting steep rise in the US prison population since the 1980s provides a strong contrast with European countries and leading developing countries, where per capita prison populations have been stable or even declining. Not surprisingly, the disparity is also consistent with the fact that Europe’s drug laws are much less punitive.
Unemployment Impacts. The increase in the US correctional population as a share of the population since 1980 has not only reduced the ranks of poorer voters. It has also shrunk the “observed” civilian labor force and depressed the official US unemployment rate by 18-20 percent. In other words, the US unemployment rate in July 2004, for example, would have been 6.43 percent, not the official 5.43 percent reported by the Bureau of Labor Statistics. So without this swollen prison population, there would now be more than 10 million unemployed in the US – at least 2.2 million more than the official statistics show, and more than enough to swamp any alleged “job growth” in the last year.
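To make that adjustment concrete, here is a minimal sketch of the arithmetic – not the BLS’s or this author’s actual calculation. It simply counts an assumed incarcerated population as unemployed members of a correspondingly enlarged labor force; the figures used (8.0 million unemployed, a 147 million civilian labor force, 2.2 million incarcerated adults) are rough illustrative approximations, and the function name is purely hypothetical.

    # Illustrative sketch only: recompute the unemployment rate when an assumed
    # incarcerated population is added back to the labor force and counted as
    # unemployed. The inputs are rough approximations, not official BLS data.
    def adjusted_unemployment_rate(unemployed_m, labor_force_m, incarcerated_m):
        """Return (official_rate, adjusted_rate) as percentages."""
        official = 100.0 * unemployed_m / labor_force_m
        adjusted = 100.0 * (unemployed_m + incarcerated_m) / (labor_force_m + incarcerated_m)
        return official, adjusted

    official, adjusted = adjusted_unemployment_rate(
        unemployed_m=8.0,     # assumed officially counted unemployed, in millions
        labor_force_m=147.0,  # assumed civilian labor force, in millions
        incarcerated_m=2.2,   # assumed incarcerated adults excluded from the count
    )
    print(f"official: {official:.2f}%  adjusted: {adjusted:.2f}%")

With these assumed inputs the adjusted rate comes out near 6.8 percent; the exact figure depends on how much of the correctional population one adds back, which is why the 6.43 percent estimate in the text differs. The point of the sketch is the direction and rough size of the effect, not a precise reproduction.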
So US penal policies have not only removed a huge number of prisoners from the ranks of potential voters. They have also helped to disguise the seriousness of the US economy’s rather tepid recovery.
And some of us thought the point of the US’ punitive drug laws was to reduce drug trafficking! (Note to reader: US real retail cocaine prices have plummeted since the 1980s. See Figure 3A-2.)
While it is not easy to measure the impact that US prisoner disenfranchisement has had on recent elections, it may have been substantial, as several analysts have recently noted. For example, one recent study estimated that in 2000, more than 3.0 million prisoners, parolees, and probationers, plus 1.5-1.7 million ex-felons, were formally disenfranchised – 2.1% of the US voting age population. Another recent study of prisoner disenfranchisement in the state of Georgia found that 13% of adult black males were disenfranchised by this policy, and that it explained nearly half the voter registration gap between black males and non-black males.
There were also another 358,000 who had been jailed awaiting trial, and 218,000 more who had been jailed on misdemeanor charges. All these people were also effectively disenfranchised.
All told, during the 2000 Presidential race, the total number of potential American citizen/voters who were disenfranchised because of the US penal system and its archaic laws was about 5 million. Since the numbers have continued to grow since then, by now they have reached 5.5 – 5.8 million.
As other commentators have noted, this policy is also practically unique -- no other putative “democracy” comes anywhere close to this kind of systematic vote deprivation.
No doubt there are some determined ex-felons, parolees, and probationers who manage to slip through and vote even in states that prohibit them from doing so. Many others would not vote even if given the chance. However, even apart from the question of whether such harsh treatment encourages better behavior, this disenfranchisement policy is far from politically neutral:
Texas alone has at least 500,000 ex-felons and more than 200,000 prisoners and other inmates who have been disenfranchised, the overwhelming majority of whom are black or Hispanic.
Of Florida’s 13.4 million people of voting age, at least 600,000 to 850,000 prisoners, parolee/probationers, and ex-felons have been disenfranchised by such voter registration laws, including at least one-fifth of all adult black males who reside there. Other battleground states, including New Mexico, Virginia, Iowa and Washington, have also used such laws to disenfranchise 15-25 percent of their adult black male populations.
All told, the top 15 battleground states account for at least 1.4 to 1.6 million excluded potential prisoner/ex-felon votes this year. Combined with the US’ knife-edged “winner take all” electoral system, this is clearly a very important policy choice.
Furthermore, in states like Florida, Texas, Mississippi, and Virginia, the opportunity to purge thousands of minority voters from the rolls in the search for “ex-felons” has opened the doors to many other abuses.
For example, in 2000, there was the notorious purge by Florida’s Republican Secretary of State of 94,000 supposed “felons.” It later turned out that this number included more than 50,000 blacks and Hispanics, but just 3,000 actual ex-felons.
One might have hoped that one such flagrant anti-democratic maneuver would have been enough. But it was followed by attempts by Florida’s Republican state administration to do the very same thing again in 2002, and again this year, when Florida tried to use another “bogus felons” list with another 40,000 names.
From this angle, all of the many arguments over Nader’s candidacy, “hanging chads,” and the narrow 537 vote margin by which Bush carried that state in 2000, were side-shows.
We are reminded of the Reconstruction period from 1867 to 1877, when Florida and 9 other Southern states had to be put under military occupation by the US Government, to prevent the white elites’ systematic attempts to deprive freed slaves of their voting and other civil rights. By the late 1870s, Northern passions toward the South had cooled, the Union troops left, and white-supremacist governments reacquired power. Unfortunately, unlike the Radical Republicans of the 1860s, today’s Republicans in Congress side with the closet supremacists.
Counting Prisoners for Apportionment. The punitive US policy toward current and former prisoners appears even more bizarre once we take into account the fact that, for purposes of redistricting, the US Census – unlike Brazil’s – counts prison and jail inmates as residents of the counties where they are incarcerated, rather than of the inmates’ home towns.
In general, this approach to counting prisoners for districting purposes tilts strongly in favor of rural Southern and Western states – areas that also now happen to vote Republican. (See Figure 3A-3.) It has an important impact on the apportionment of Congressional seats and seats in state legislatures, the allocation of federal funds to Congressional districts, and the total number of electoral college votes that each state receives. It also creates a huge, influential coalition of interests -- construction companies, prison administrators and guards, and politicians -- that amounts to a “politician-prison-industrial complex,” with powerful selfish motives to support tough sentencing laws and the construction of new prisons and jails.
The resulting combination of disenfranchisement and malapportionment recalls the “three-fifths compromise” that was built into the US Constitution in 1787, to accommodate the original six Southern slave states, where slaves constituted more than forty percent of the population. Under this provision, even though slaves could not vote, they were counted as three-fifths of a person, for purposes of determining each state’s Congressmen and Presidential electors.
Given this provision, it was no accident that 4 of the first 5 US Presidents were Virginian slave owners. This exaggerated Southern political power, entrenched by the anti-democratic electoral college, had disastrous consequences – it made resolving the problem of slavery without a regional civil war almost impossible. (Contrast Brazil’s relatively peaceful abolition of slavery.) From this perspective, the electoral college and prisoner disenfranchisement are both just throwbacks to America’s “peculiar institution,” slavery. As John Adams wrote in 1775,
All our misfortune arise(s) from a single source, the reluctance of the Southern colonies to republican government….The difficulties lie in forming constitutions for particular colonies and a continental constitution for the whole…This can only be done on popular principles and maxims which are so abhorrent to the inclinations of the barons of the South and the proprietary interests of the middle colonies…..
In a sense, the modern analog is even worse: prisoners can’t vote either, but they count as one whole person in the districts where they are imprisoned, for purposes of redistricting.
Surprisingly, illegal immigrants are also included in the US Census count for redistricting purposes. Depending on where immigrants locate, this may reinforce the prisoner effect in some key states. The US illegal immigrant population has also been growing rapidly, with a Census-estimated 7.7 - 8.9 million illegals in the US by 2000, compared with about 3.5 million in 1990. According to the INS, two-thirds are concentrated in just five states – California, Texas, New York, Illinois, and Florida. However, unlike prisoners, illegal immigrants are much harder to locate precisely. So the US policy of including non-voting illegals in the Census for purposes of drawing voting districts is also very peculiar.
3. The Voting Age
Brazil. To encourage young people to get involved in politics, Brazil gives those who are 16 or 17 the right (but not the duty) to vote. This measure increases Brazil’s VAP by about 6 percent. Brazil argues that a relatively low voting age is consistent with the spirit of the UN’s Convention on the Rights of the Child. It also argues that this youth vote acknowledges the basic fact that a majority of 16-17 year olds (in both Brazil and the US) pay taxes and can marry, drive, and be tried as adults, so they ought to be able to vote. So far Brazil has only been joined in this experiment by a handful of other countries, including Indonesia (age 17), Cuba (16), Iran (15), and Nicaragua (16). But the UK is now also seriously considering teen voting.
The US. The minimum voting age in the US has been 18 since 1971, when the 26th Amendment was adopted. A few states (Maine, California) have recently considered reducing the voting age below 18, but so far voting rights for 16-17 year-olds, much less the more radical proposal to let children of all ages vote, have not taken off. Obviously this cause has not been strengthened by abysmal voter turnout levels among 18-24 year old Americans in recent elections.
September 22, 2004 at 02:50 PM | Permalink | Comments (1) | TrackBack
Thursday, September 16, 2004
Democracy in America and Elsewhere: Part II: Recent Global Trends Toward Democracy


Of course we are also very proud of our free markets, our relative affluence, and our occasional ambitions -- at the moment perhaps a bit muted -- to provide equal opportunities for all our citizens.
However, when we try to market our country’s best features to the rest of the world, or teach our children to be proud of their country, it is not the economy that we brag about.
Even self-styled “conservatives” usually lead, not with glowing descriptions of perfect markets and opportunities for unlimited private gain, but with our supposedly distinctive commitment to defending and expanding political democracy and human rights at home and abroad.
Indeed, one of the most important official justifications for recent US forays into the Middle East, as well as our many other foreign interventions, has been to help bring “democracy” to supposedly backward, undemocratic societies like Iraq and Afghanistan (…and before that, Haiti, Colombia, Panama, Nicaragua, Grenada, the Dominican Republic, Cuba, Guyana, Guatemala, Iran, Laos, Vietnam, the Philippines, etc. etc. etc.)
Even though, time and again, this noble commitment has turned out to be pure rhetoric, it provides such an elastic cover story for all our many transgressions that it keeps on being recycled, over and over and over again.
Whatever the truth about US motives for such interventions, it may come as a surprise to learn that in the last two decades, the United States itself has actually fallen behind the rest of the democratic world in terms of “best democratic practices” and the overall representativeness of our own domestic political institutions.
Meanwhile, many developing countries have recently been making very strong progress toward representative democracy, without much help from us.
Indeed, in some cases, like South Africa, this progress was made in the face of opposition from many of the very same neoimperialists who have lately voiced so much concern about transplanting democracy to the Middle East.
While we have been resting on our democratic laurels, or even slipping backwards, the fact is that emerging democracies like Brazil, India, and South Africa, as well as many of our First World peers, have been adopting procedures for electing governments that are much more democratic at almost every stage of the electoral process than those found in the US.
The institutions they have been developing include such bedrock elements of electoral democracy as the rules for qualifying voters, conducting campaigns, administering voting, and translating votes into fair representation.
Of course effective democracy has many other crucial elements beside electoral processes alone. These include (1) the relative influence of legislative, executive, and judicial branches; (2) the concrete opportunities that ordinary citizens have -- as compared with highly-organized special interests and professional lobbyists -- to influence government decisions between elections; (3) the respective influence of private interests, religious groups, and the state; (4) the degree to which the rule of law prevails over corruption and "insider" interests; and (5) the overall degree of political consciousness and know-how.
However, fair and open electoral processes are clearly a necessary, if not sufficient, condition for effective democracy -- all these other elements simply cannot make up for their absence.
We hope that increasing the recognition of this “electoral democracy gap” between the US and the rest of the democratic world will be helpful in several ways:
This used to be much easier than it is now. As of the early 1970s, there were only about 40 countries that qualified as “representative democracies,” and most were First World countries.

Since then, however, there has been a real flowering of democratic institutions in the developing world. This was partly due to the collapse of the Soviet Empire in the late 1980s. But many more people were in fact “liberated” by the Third World debt crisis, which undermined corrupt, dictatorial regimes all over the globe, from Argentina, Brazil, and Chile to Indonesia, the Philippines, South Africa, and Zaire.
Voting in the Philippines, 2004
Assessments of the degree of “freedom” of individual regimes by organizations like Freedom House or the UN Development Program’s Human Development Indicators are notoriously subjective. However, while there is plenty of room for disagreement about specific countries, there is little disagreement on the overall trend. (See Table 3.)
By 2004, about 60 percent, or 119, of the nearly 200 countries on the planet could be described as “electoral democracies,” compared with less than one-third in the early 1970s. Another 25-30 percent have made significant progress toward political freedom.
Voting in South Africa, 1994
Indeed, notwithstanding our present challenges in Iraq and Afghanistan, from the standpoint of global democracy, this has been a banner year. As of September 2004, 32 countries had already held nationwide elections or referenda, with 886 million people voting. (See Table 4.) By the end of 2004, another 33 countries will join the US in doing so – nearly three times as many national elections as were held each year, on average, in the 1970s.
All told, this year, more than 1.7 billion adults – 42 percent of the world’s voter-age population -- will be eligible to vote in national elections, and more than 1.1 billion will probably vote. That will make American voters less than 10 percent of the global electorate.
Of course, some of these elections will be held in countries where democratic institutions and civil liberties are still highly imperfect. And some developing countries like Russia and Venezuela have recently been struggling to find a balance between democracy and national leadership, partly to undo the effects of neoliberal policies in the 1990s, or in response to terrorist threats.
But the good news is that democracy is clearly not a “luxury good.” The demand for it is very strong even in low-income countries like Bolivia, Bangladesh, Mozambique, Guatemala, and Botswana. And while self-anointed dictators, military rulers, and one-party elites or theocracies are still clinging to power in 50-60 countries that have more than 2.4 billion residents, such regimes are more and more anachronistic. (See Table 5.)
Interestingly, Asian dictatorships, especially China and Vietnam, now account for more than three-fifths of the portion of the world’s population that still lives under authoritarian rule. While several Islamic countries appear on the list of authoritarian countries, they account for just one fifth of the total. Furthermore, by far the most important ones happen to be close US “allies” like Pakistan, Egypt, Morocco and Saudi Arabia.
Evidently the simple-minded neoconservative “clash of cultures” model, which pits supposedly democratic, pluralist societies against an imaginary Islamic bloc, doesn’t have much explanatory power.
Furthermore, the US also clearly faces some very tough choices, if it is really serious about promoting non-discriminatory, secular democratic states that honor the separation between church and state among its Islamic allies, as well as in Palestine, and, for that matter, Israel.
Voting in East Timor, 2001
A more encouraging point is that many developing countries are already providing useful lessons in democratization. Indeed, as we will see in Part III of this series, there is much to learn from the experiences of new democracies like Brazil and South Africa.
These countries are undertaking bold experiments with measures like free air time for candidates, “registration-free” voting, direct Presidential elections, electronic voting, proportional representation, and the public finance of campaigns. While not all these experiments have worked out perfectly, the fact that these countries have already demonstrated a capacity to innovate in “democratic design” is very encouraging.
Of course there is a long-standing tension between the US dedication to Third World democracy and its tolerance for the independence that democratic nationalism often brings. By renewing and deepening our own commitment to democracy at home, we will also protect it abroad -- even though (as in Venezuela, Russia, Iran, and perhaps eventually also Iraq) it does not always produce governments that we agree with.
September 16, 2004 at 09:08 PM | Permalink | Comments (2) | TrackBack
Wednesday, September 15, 2004
Democracy in America and Elsewhere: Part I: Does It Really Have to Be This Way?
Although many pundits and politicians have hailed this contest as the “most important election of our lives," and talked about the “striking difference” between the two top candidates, most opinion polls show that a majority of ordinary Americans are profoundly dissatisfied with the limited choices available on this year's Presidential menu, with a majority of undecided voters disapproving of both Bush and Kerry in this week's polls.
They correctly sense that the prolonged, torturous process by which we choose the Leader of the Free World is deeply flawed. Many are asking, “Am I the only one who is unhappy with having to choose between these two Yale-bred prima donnas?......the only one who feels that this year's campaigns and the mass media have systematically avoided most of the critical issues that confront our country??”
If so many Americans prefer a different type of politics, however, why can’t they have it?
It turns out that this year’s unsatisfying campaign isn’t just an aberration. Rather, many of its less attractive features are a direct byproduct of deep-seated structural flaws in our electoral system, most of which are decades- or even centuries-old.
Unless we undertake the fundamental reforms required to fix these problems, our version of "democracy" is likely to become less and less attractive as a role model.
These structural flaws come into sharp relief when we compare the US to other democracies, especially several younger ones that have proved to be much more innovative than we are when it comes to "designing democracy."
When we do so, we arrive at a disturbing conclusion: in many respects, American democracy is falling behind the rest of the democratic world.
As we will explore in this series, the fact is that many other countries – including several developing countries as well as our First World peers – have adopted electoral processes that are much more democratic than our own.
Rather than bemoan this year’s Presidential campaign, therefore, we propose to explore the root causes of our political malaise. We will tackle this problem with the help of a comparative approach, examining "best practices" in other leading democracies that are working hard to insure that national elections are more than just the costly, high-carb biennial beauty pageants that they have become in the US.


Another factor that has made the race close is the record level of campaign spending on both sides – more than $795 million for the Presidential race alone, plus at least $272 million of “527” money, including $20 million from the National Rifle Association and $2.6 million from the Swift Boat Veterans group. Contrary to expectations, both parties have stayed about even in the money rush, despite President Bush’s renowned fund-raising abilities.
Finally, there have also been an unusual number of wild-cards – putative terrorist threats, oil price shocks, North Korean nukes, job growth, Ralph Nader’s quixotic quest, and the continuing ups and downs of the Iraq and Afghan Wars, plus all the typically American quasi-religious disputes over gay marriage, the Vietnam War, assault rifles, stem cell research, and late-term abortions.
Along the way, we've had a floodtide of small-bore reporting, encouraged by instant polling, “rapid response,” and the army of several thousand reporters who have nothing better to do than cover the campaign 24/7 and 31/12.
Much of the resulting reportage reads like a kluge of the Daily Racing Form and the National Enquirer, with far less attention paid to hard policy issues than to campaign tactics, polls, and candidate “features" -- values, personal histories, eating habits, wives, children, appearances, misquotes, and mood swings. (Witness, for example, today's cover stories on "where's Edwards?" in both the San Francisco Chronicle and the New York Times.)
So far the candidates, their campaigns, and their advertising have reinforced this pattern, spending far more time on their own values and competence than on fundamental issues -- or on the real benefits that voters would derive from electing them. Even when they do get down to issues and actual benefits, the focus is on just a handful that won't offend undecided voters in key states. (See Box A.) Given the electoral college, they are also spending almost all their time and resources in the same 15-16 “battleground states,” where the polls show a gap between Kerry and Bush of 3 percent or less. (See Table 1.) Within these states, they are also focusing on the same 6-10 percent of voters who have somehow managed to remain “undecided” even at this late date.
As a result, this year's election will be decided by just a sliver of potential voters in a handful of states. The battleground states account for less than a third of all US “voter-age” residents or “potentially-eligible voters” -- after deducting non-citizens, convicts, and others not eligible to vote. If these states repeat the modest turnouts that prevailed in 2000, less than 57 percent of their voting age populations will vote. And since a winning candidate only needs a plurality, which can be less than 50%, this implies that only a small fraction of the country's potential voters will actually decide the outcome.
Given the way our particular version of democracy is structured, therefore, the rest of us face the fact that the chances that we will die in an accident on the way to the voting booth are infinitely greater than the chances that our votes will have any impact whatsoever on this election. As we will see below, the chances are also slim to nil that we will exert any influence on the vast majority of Senate or Congressional races.
All this might not matter so much if undecided voters in battleground states were good proxies for the rest of us. But they are not.
As indicated in Table 2, they differ from the rest of the country in many important respects. Most of these states are relatively backward, in the bottom half of all 50 states in terms of per capita incomes and education levels, with a much higher-than-average share of poor residents. Nearly a quarter of their populations live in rural areas, twice the average share for non-BG states.
Compared with non-BG states, the residents of these states are also more likely to own guns, and much less likely to have lost factory jobs in the recent recession. They are also more religious -- Catholic voters account for 40 percent and 35 percent, respectively, of all potential voters in New Mexico and New Hampshire, while conservative Christians account for more than a third of the population in 10 of the 15 BG states, and orthodox Jews are a key voting bloc in Florida. Hispanics account for more than 15 percent of potential voters in Florida, Arizona, Nevada, and New Mexico. Illegal immigrants constitute a critical part of the work force and Census headcount (for apportionment purposes) in half the BG states.
Furthermore, as shown in Table 1, almost half of the BG states have harsh “felon disenfranchisement” laws. Most of these permanently deprive anyone ever convicted of a felony within their states of the right to vote, even after their sentences have been served. These laws could play a decisive role in battleground states like Florida, Virginia, Arizona, Tennessee, and Iowa, just as they did in 2000.
All told, given the “winner take all” nature of the US electoral college system (see below) and the strategic role that so-called “undecided voters” will play in battleground states, it is not surprising that there is a long list of important issues on which Bush and Kerry have either remained completely silent or adopted straddling positions, many of which can only be distinguished from the President's under an electron microscope.
So far, at least, Kerry appears to have bet heavily that the American people will choose him mainly because he's brighter, more competent and more trustworthy than Bush, not because his foreign and domestic policy alternatives are wildly different and exciting. This is a bet that the far more well-liked, if quasi-competent and semi-literate Bush has been delighted to accept.
As a result, as discussed in Box A, even though a majority of US voters may well be open to far less timid new approaches to issues like the Iraq War, farm subsidies, illegal immigration, gun control, energy conservation, corporate crime, our relationships with “allies” like Israel and Saudi Arabia and “enemies” like Cuba, Iran, and Venezuela, not to mention the Patriot Act, balancing the budget, global warming, reforming Social Security, and revising drug laws, from the standpoint of capturing marginal voters in battleground states, many of these issues have been deemed “no win” and too-hot-to-handle.
Furthermore, when other candidates – independents and third party candidates – try to raise such issues, they are either ignored or tagged as “spoilers” whose presence only serves to help those voters’ least preferred candidates. Since such candidates are marginalized by the mainstream media and excluded from US Presidential debates, the major party candidates can easily skip over any special issues that they seek to raise.
Despite all the hoopla, therefore, many American voters justifiably feel like they've been invited to dinner and served pictures of food. In effect, discourse on fundamental issues has been stifled by the rules of the American electoral game.
In other words, it is not only felons who have been disenfranchised by this US Presidential election process.
Does it really have to be this way? To get a handle on this question, it will be helpful for us to step back and take stock of how the US stacks up against other democratic countries, especially those in the developing world -- not in terms of economic performance, but in terms of effective electoral democracy. Interestingly, it turns out that we actually have quite a few things to learn from them.
Sunday, June 27, 2004
"Letters from the New World:" Fighting Corruption at Eye Level in Nigeria
Whenever we talk Nigeria, we talk corruption. The two go together. Finland, ice and cellphones. Israel, strife. Australia, complacency and kangaroos. When you talk Nigeria, corruption is the first thing that comes up.
Try it. Tell people “I’m going to Nigeria.” Four out of five responses will give you detail, usually second-hand, about the necessity of spreading your dollars through different pockets in different denominations. From there they’ll go into horror-stories about corruption.
I found my first trip to Nigeria challenging. Scary, too, but that was a different thing. When the captain said: “We’re commencing the descent” I clutched my wallet. The scare starts fading once reality replaces rumour, but the challenge doesn’t fade at all.
Corruption is one of the main reasons why Africa spent its first half-century of liberation heading, on statistical averages, backwards. Another major one is Indigenisation, Transformation, or whatever the name is for shoving wrong people into wrong roles. Indigenisation is tricky to deal with, and the real answer still awaits.
Corruption is not tricky at all. The simple answer is: don’t do it. To me that’s the sole answer. It’s ingrained. I can’t pay a bribe, any more than I can kill an animal or fling a bottle. The neurons are just programmed another way. Which poses a certain fluttering of the pulse when one is landing in Lagos, stocked high with dire warnings. But I remind myself that visitors to Johannesburg, too, are warned direly about our corruption, whilst I who live in Johannesburg remain a virgin after all these years – apart from sandwiches and cold-drinks.
Initially a little teeth-gritting is required, to take the same approach to Lagos, but it settles. By the time the first guy outright demanded a bribe (“because I am the one who is in charge of your baggage, heh, heh, heh…”) I was emboldened. All he got from me was two short traditional words.
I recognise that it’s hollow to sound holy when all you’re risking is a suitcase of used clothes, or the R500 fine for phoning while you drive. I admit I have no bosses to sack me if I fail to secure the contract, no shareholders to re-deploy me as Deputy Manager of the Jammerdrif depot.
Still, the principle is not much different: Most times that you give the bribe-seeker short words he backs down. The times he doesn’t, it’s better to suffer the consequences than to hammer another nail into the coffin of your continent’s aspirations. In which light, the way that the going-into-Africa discussion usually plays out can be depressing.
It starts well, invariably. “We’re all Africans now, isn’t it wonderful. And you should see how much good we’re doing!”
They are, too. And it is spectacular, often. On Monday the housewives of Ndola are buying scrawny ox-shank, little more than bone with hair, chowed on for a week by 10,000 flies. They’re buying baked beans in rusted tins, a year after sell-by. They’re paying three times the price that their privileged southern sisters pay for first-class merchandise at our fancy Cresta or Cavendish supermarkets.
On Tuesday the new South African supermarket opens its doors. By Friday, Ndola is dancing to a new tune. The city is galvanised. Everybody has to give service, give value, wake up. On a March visit to a provincial capital, the dinner options are a suspicious unnameable stew on a sweltering dusty roadside, or a two-star menu at a five-star hotel with the seven-star tariff designed for the expense accounts of the aid brigade (who keep the aircon on frigid to remind them of home). In April a South African chain introduces middle-class eating at middle class prices. By July the city is holding its head higher.
Similar processes occur in every industry from brick machines to water meters. Good is being done, no two ways. But then the question comes up: “and, er, ahem, how do you handle the matter of adaptation to local mores?” There is a common answer to that question: “Oh, no, no, no. No corruption for us. We know that corruption causes ruin and destruction. We have no part in it, except of course when absolutely necessary.”
There is also the increasingly fashionable answer: “You know, we do have to grow into African ways. The time for arrogance is over now. We must mature into an acceptance that our sectional traditions are not universal.”
Then there is the answer that nobody gives unless he’s absolutely certain you are never going to quote him: “Why should I worry? That country is a total stuff-up anyway. If I can give some guy a million rand and make ten million in return, what do you think?”
Finally there is the still small voice that says: “No, on no account do we do it.” You hear that voice not often, and believe it less often. When you do believe it, perhaps because you know the people concerned especially well and repose in them a special faith, it is jolting indeed to find that the rumour factory is thick with alleged inside tales that place your faith under constant question.
The net result is disappointing, especially at the times that I am revelling in the magnificent welcome that tropical Africa addresses to Seffricans of the paler kind. Tropical Africa addresses magnificent welcomes to most people in most circumstances, but they have an added knack of making the whiteys from “South” as they call it, feel like a long lost brother.
You’re taken as a member of the family – a fairly pushy member, perhaps, rich in annoying habits, but in some way one of us, something more than solely a buccaneer on the profit trail. You’re a curiosity factor as well, and you’re assumed to be – potentially, at least – a handily systematic sort of character, the long lost brother who maintained the household inventory and made sure the insurance premiums were paid.
It’s a delightful combination. Not for nothing does every second SA expat go on and on about being wanted, being needed, being befriended, being loved (in the intervals between going on and on about not being robbed).
The prospects are wonderful, moving, emotional. A continent actually moving upward, after fifty years of empty talk about moving upward; moving upward and forward and with us, us, the once untouchable white South Africans, in there and part of it, in the engine room, the galley, the bridge, the lot.
Unfortunately that vista gets harder to glimpse as time goes by. Reality intrudes. I look at the heavy hands that RSA brings into the rescue of this or that failing African mine or plant or factory. I look at the hubris: “stand aside, mere locals; we’re very friendly, as you see, calling us Jack and Joe and not ‘Bwana’ any more, but we’re in charge again so keep out of our way.” I look at the sickening crass insensitivity; the pulling of rank, sometimes unwitting; the lack of interest in learning the barest syllable of even French or Portuguese, let alone Swahili or Hausa; the fervour to adopt every management fad emanating from New York or LA.
The pioneers carrying business to the tropics could and should be our heroes, our champions. Too often they become embarrassments. The many lesser embarrassments could usefully be discussed.
The one big embarrassment is not susceptible to much discussion. A culture of corruption means a pathetic nation; that is no more arguable than that the sun comes up in the east, and a critical mass of pathetic nations means a continued pathetic continent.
There’s sabotage in there.
Some foreign companies that have recently tried to enter Nigeria, like Shell and Halliburton, have apparently been following this well-trodden road to perdition. But there are signs of hope. A few others, like Vodafone, have recently been told by their shareholders to refuse to play at all unless they can play it straight.
Ironically, behind the closed doors of our cynical business community, it is Vodafone that gets the most ridicule. Indeed, almost wheresoever two or three businesspersons gather together in South Africa these days, one hears: “Look at these wussies, getting their asses whupped in Nigeria; buncha sissies calling themselves an African company, squealing ‘good governance’ because they’ve come short. What business did they have leaving just because they couldn’t play it straight?”
Somebody’s got this upside-down. The real question we should be asking is of those who stayed: “Precisely how did you manage to stay on and keep playing it straight?”
Tuesday, June 22, 2004
"Farmingville" A New Film About Agro-Business, Globalization, and Poor Mexican Farmers
This week marks the television premiere of Farmingville, an outstanding documentary on the devastating impact that the lethal combination of globalization and First World farm subsidies is having on developing countries like Mexico.
Produced and directed by fellow Long Islanders Carlos Sandoval (Amagansett, NY) and Catherine Tambini (Hampton Bays, NY), Farmingville won this year’s “Special Award for Documentary” at the Sundance Festival, and it has also received many other prestigious awards. (For those of you in Long Island, it will also be shown on Thursday June 24 on Ch. 21, accompanied by a discussion with Sandoval and several of the film’s participants, moderated by OLA’s outstanding local leader, Isabel Spevedula de Scanlon.)
The social crisis described by Farmingville is a striking example of one of neoliberalism’s more disturbing patterns – the combination of “socialism for the rich” with “free trade for the poor.” Each year the US government provides more than $10 billion in subsidies to American corn farmers in politically-influential states like Iowa, Minnesota, Nebraska, and Kansas. From a political standpoint, these subsidies are usually justified in the name of preserving the “American family farm.” In fact the vast bulk of the subsidies goes to a handful of incredibly rich US agro-conglomerates, such as Cargill and Archer Daniels Midland (“ADM”). Together, these corporate giants now account for more than 70 percent of domestic US corn production.
These subsidies have not saved America’s family farmers, who continue to disappear at a rapid rate. But the $10 billion a year in subsidies has encouraged the giants to overproduce, resulting in surpluses that have been dumped onto world markets at artificially-low prices.
As documented in Farmingville, combined with the “free trade” policies adopted by the US and Mexico in the last decade, these surpluses have devastated family farmers throughout Mexico.
Of course Mexican farmers were the original source of “corn” – they’ve been growing it for at least 10,000 years. Until recently, corn accounted for at least half of the acreage they planted. In fact corn is not just a product in Mexico; it is also at the core of a whole cuisine and culture.
Since the adoption of the North American Free Trade Agreement (NAFTA) in 1993, however, the real price of corn in Mexico has dropped more than 70 percent, even as domestic non-labor production costs have risen dramatically.
Most of the price declines are due to escalating US corn imports. Recent estimates from an Oxfam study of “The Mexican Corn Crisis,” for example, show that US corn is dumped in Mexico at prices totaling between $105 million and $145 million a year below the cost of US production.
As a result, many campesinos are being forced out of business -- the country has lost the majority of its corn farmers in just the last 10 years. This has caused havoc in the entire rural economy, producing mass unemployment and forcing a mass migration to Mexico’s already overstuffed cities. And that, in turn, has accelerated emigration, with thousands of desperate, hungry people trying to leave Mexico every day, and dozens of them literally dying in the desert wastelands along the border, trying to get to “El Norte.”
Indeed, according to the latest statistics from the US Bureau of Immigration and Naturalization, illegal immigration along the Mexican border is now at an all-time high.
Meanwhile, US agricultural conglomerates like ADM and Cargill have become more profitable than ever. They are using their fat profits to extend their dominance abroad. For example, Cargill now owns 30 percent of Maseca, the giant Mexican food distributor that dominates the Mexican tortilla market.
As Oxfam’s recent report on this neoliberal debacle concludes,
"The Mexican corn crisis is yet another example of world trade rules that are rigged to help the rich and powerful, while destroying the livelihood of millions of poor people.”
Indeed, the story that Farmingville relates is an especially graphic example of the perverse consequences that neoliberal policies can have once powerful interests get hold of them -- when US corporate giants are able to have their way with free trade, wide-open capital markets, lavish government subsidies, political leaders on both sides of the border, and poor farmers all at once.
Obviously this is a tough time for leading US politicians to take on the powerful farm lobby, much less propose policies that might trim US exports at a time of massive trade deficits. But are there no US or Mexican political leaders with longer-term vision, willing to tackle this grossly-inequitable, morally-reprehensible situation?
Thursday, June 17, 2004
The "Reagan Revolution," Part Two: The View from Developing Countries


"Man wants to forget the bad stuff and believe in the made-up good stuff. Its easier that way."
"He (Reagan) may have forgotten us. But we have not forgotten him."
"Folly is a more dangerous enemy to the good than evil. One can protest against evil; it can be unmasked and, if need be, prevented by force....Against folly we have no defense. Neither protests nor force can touch it; reasoning is no use; acts that contradict personal prejudices can simply be disbelieved. Indeed, the fool can counter by criticizing them, and if they are undeniable, they can just be pushed aside.... So the fool, as distinct from the scoundrel, is completely self-satisfied. In fact, he can easily become dangerous, as it does not take much to make him aggressive...."
Following last week's prolonged national memorial to President Reagan, the most elaborate in US history, most Americans have turned their attention back to the troubled present. But we cannot resist continuing down the revisionist path that we started on in Part One of this series.
Contrary to Henry Ford, history is not "bunk," nor is it "just one damn thing after another." In fact, it is one of our most valuable possessions. But unless we take the time to learn from it, it can easily come back to haunt us -- as it is doing right now. At the very least this exercise will prepare us to evaluate President Clinton's new autobiography, which is due out next week.
As noted in Part One, most recent discussions of Ronald Reagan's foreign policy legacy have focused almost entirely on the Cold War. Even there, as we argued, his legacy is decidedly mixed. While he may have helped to pressure the Soviets to reform, he also took incredible risks with the balance of nuclear forces, including some risks that we are still living with to this day.
When we turn from superpower relations to Reagan's impact on developing countries, the legacy is even starker. In The Blood Bankers, we've detailed how the Reagan Administration's lax policies toward country lending and bank regulation exacerbated the 1982-83 Third World debt crisis. And then the administration did very little to help developing countries fundamentally restructure their debt burdens and recover. By the end of the 1980s, most country debt burdens were higher than ever.
Here we will focus on another long-term legacy of Reagan's relations with the developing world -- the consequences of his support for a plethora of reactionary dictatorships and contra armies all over the globe.
Most Americans are probably not aware of it, but this bloody-minded policy fostered several nasty wars in developing countries that have cost literally millions of lives -- and are still producing fatalities every day, by way of wounds, continuing conflicts, unexploded ordnance, and landmines.
Furthermore, as described below, the Reagan Administration was also responsible for several of the clearest examples in history of state-sponsored terrorism.
Unfortunately, it turns out that very little of this was really necessary, whether from the standpoint of defeating the Soviets, pushing the world toward democracy and free markets, or enhancing US security.
Indeed, in the long run, Reagan's policies basically destabilized a long list of developing countries and increased their antagonism towards the US. Combined with the policies of "benign neglect," stop-go intervention, and ineffective neoliberal reforms that characterized the Clinton Administration's policies toward developing countries, and the neoconservative policies pursued by both Bushes, it is no accident that America's reputation in the developing world is now at a record low.
Unfortunately, like some of the risks that Reagan's policies introduced into the nuclear balance, these effects may have a very long half-life. Surely they will be with us long after Ronald Reagan has met his Maker. We just hope for the Gipper's sake that his Maker does not read this article before pronouncing judgment upon him.
THE INDICTMENT
There is an abundance of examples of the Reagan Administration's strong negative impacts on developing countries. To cite just a few:
In the case of the Philippines, the Reagan Administration was a staunch ally of Ferdinand and Imelda Marcos right up to their last helicopter ride to Hawaii in February 1986. Vice President George H.W. Bush visited Manila in March 1981, soon after Reagan was elected, to thank Marcos for his generous support. He toasted the dictator in glowing terms: "We love your adherence to democratic principles and democratic process....." The thousands of political opponents who were tortured, imprisoned, or died fighting this corrupt conjugal dictatorship, and the millions of Filipinos who have spent the last twenty-five years servicing the couple's unproductive foreign and domestic debts, would probably disagree.
In the case of Iran and Iraq, Reagan helped arm and finance Saddam Hussein throughout the 1980s, encouraged the Saudis and Kuwaitis to finance his invasion of Iran when it bogged down, helped to equip him with chemical and biological weapons, sent Donald Rumsfeld to Baghdad to assure close relations and propose a new pipeline to Saddam to help him export his oil, and even provided a team of 60 Pentagon analysts who sat in Baghdad, using US satellite imagery to target Saddam's chemical weapons against the Iranians.
At the same time, as the Iran-Contra arms scandal later disclosed, Reagan also helped Iran buy spare parts and advanced weapons for use against Iraq. He also looked the other way when Saddam decided to turn his US-supplied Bell Helicopters and French-supplied Mirage jets and chemical weapons on the defenseless Kurds at Halabja. Of course, the fact that the UN, under strong US pressure, did nothing at the time to condemn Saddam for this behavior did not exactly discourage further aggression.
This bipolar policy contributed to prolonging the 1980-88 Iran-Iraq War, one of the largest and bloodiest land wars since World War II. It cost 500,000 to 1 million lives and 1-2 million wounded, and created more than 2.5 million refugees. It also caused a huge amount of damage to both countries' economies, and left Iraq, in particular, broke and heavily indebted. As we've argued in The Blood Bankers, that destabilization, in turn, contributed significantly to Saddam's 1990 decision to invade Kuwait -- and ultimately, our current Iraq fiasco.
In the case of South Africa, the Reagan Administration steadfastly opposed any US or UN sanctions on international trade and investment. Indeed, it continued to work closely with the apartheid regime on many different fronts, including the civil wars in Angola (see below), Namibia, and Mozambique.
It also now appears that both Carter and Reagan turned a blind eye to South Africa's development of nuclear weapons and ballistic missiles, in collaboration with Israel, which purchased its uranium from the Pretoria regime. Fortunately, no thanks to Reagan, Bush I, or for that matter, Bill Clinton, apartheid came to an end in the early 1990s, and South Africa became the first nuclear power ever to dismantle its nuclear weapons.
In the case of Guatemala, Reagan gave a warm embrace to the brutal dictatorship of General Efrain Rios Montt in the early 1980s. Rios Montt, a graduate of Fort Benning's School of the Americas, was also an ordained "born-again" minister in California-based Gospel Outreach's Guatemala Verbo evangelical church. Evidently that combination endeared him to the Reagan Administration -- US Assistant Secretary of State Thomas Enders praised him for his "effective counter-insurgency," and President Reagan called him "a man of great personal integrity," "totally dedicated to democracy," someone who Amnesty International had given "a bum rap."
This cleared the way for hundreds of millions of dollars in World Bank loans and US aid that helped to make Rios Montt and his generals rich. Meanwhile, the junta implemented a genocide that a UN-backed Truth Commission later found was responsible for the deaths of 200,000 Guatemalan peasants, mainly Mayan Indians.
In the case of Argentina, Reagan turned a blind eye to the "dirty war" waged by the military junta against its opponents, at a cost of 30,000 lives and many more destroyed families.
When this junta launched the April 1982 invasion of the Falkland Islands to deflect public attention from its political and economic woes, Reagan and Secretary of State Al Haig ultimately decided to side with the UK's Margaret Thatcher, a fellow neoconservative. However, key Reagan aides Jeane Kirkpatrick and Michael Deaver worked behind the scenes to support the fascist junta, encouraging it to believe that the US might stay neutral. The very evening that the invasion was launched, Kirkpatrick was the guest of honor at an elaborate Washington D.C. banquet that was sponsored by the junta.
In the case of Panama, Reagan's CIA subsidized and promoted the rise of General Manuel Noriega, another graduate of the notorious US School of the Americas. The US made extensive use of Noriega's intelligence-gathering capabilities during the contra war with Nicaragua (see below).
This encouraged Noriega to believe that he could get away with anything. For a while he did: in the early 1980s, he became one of the most important cocaine wholesalers in the region, shipping a ton of coke per month to Miami on INAIR, a Panama airline that he co-owned, literally under the US Customs' nose. By 1989, even George H.W. Bush was embarrassed, and he had the dictator forcibly removed -- at a cost of the lives of 23 US troops, 314 members of the Panamanian Defense Forces, and several hundred Panamanian civilians.
In the case of tiny Honduras, the poorest country in Central America, the Reagan administration turned another of its many blind eyes to the rise of death squads in the early 1980s. John Negroponte, the former US Ambassador to the UN and our new "proconsul" in Iraq, served as Ambassador to Honduras from 1981 to 1985. As this author knows from first-hand experience, reports of human rights abuses in Honduras were rampant during this period. It is hard to believe that Negroponte, who cultivated close relations with the Honduran military, was simply unaware of all these reports.
One of the key offenders was Battalion 3-16, the CIA-trained and funded Honduran military unit that was responsible for hundreds of disappearances and torture cases, including several that involved Americans.
One US embassy official later reported that in 1982, Negroponte had ordered any mention of such abuses removed from his annual Human Rights reports to Congress. Negroponte has denied any knowledge of this, and has skated through several confirmation hearings to arrive at the very top of the US diplomatic corps, where he will soon be running the world's largest US embassy.
In the case of El Salvador, the Reagan Administration also sharply increased economic and military support to a brutal oligarchical regime that was also deeply involved in death squads. President Carter had also provided military aid to the regime -- indeed, Archbishop Oscar Romero's condemnation of that aid was one key factor in his assassination in March 1980. After Reagan's November 1980 election, the Salvadoran military felt it had a "green light" to become even more aggressive with its opponents in the Church and unions, as well as the FMLN rebels.
One immediate byproduct of the "green light" was the murder of four US Maryknoll nuns in December 1980. Reagan's first Secretary of State, Al Haig, later suggested that the nuns might have been killed in a "crossfire" when they "ran a roadblock." But their murders were later attributed to five Salvadoran National Guard members, who, in turn, appear to have acted on orders from senior members of the Salvadoran military.
A lawsuit was eventually brought on behalf of the nuns against the commanders to whom these guardsmen ultimately reported -- Jose Guillermo Garcia, El Salvador's Minister of Defense from 1979 to 1983, and Carlos Eugenio Vides Casanova, the former head of the National Guard. These were the Reagan Administration's key Salvadoran allies in the early 1980s, and they'd been rewarded with retirement in Florida.
In 2000 a jury ruled that even though they had given the orders, they did not have "effective control" over their subordinates, given the instability in the country. However, in July 2002, another jury in West Palm Beach found the duo liable for torture and other human rights abuses against three other victims, and ordered them to pay $54.6 million in damages.
Meanwhile, their paymasters and other collaborators in the Reagan Administration have gotten off scot free. Reagan's insistence on a military solution to the conflict in El Salvador helped to perpetuate the civil war throughout the 1980s, at a cost of more than 75,000 lives. Ultimately, under Bush I and Clinton, the long-delayed negotiated solution was achieved.
As for Archbishop Romero's assassin, he has never been found. There are credible reports, however, that the actual triggerman now lives -- naturally enough -- in Honduras.
In the case of Lebanon, Reagan was responsible for a broken promise to the Palestinians that ultimately contributed to the 1982 massacres at the Sabra and Shatila refugee camps. To get the PLO to withdraw from Beirut, Reagan promised to protect Palestinian non-combatant refugees in those camps. The PLO fighters left on August 24, 1982, and US Marines landed on August 25. But the Marines were withdrawn just three weeks later, on September 10. Ariel Sharon, Israel's Defense Minister at the time, promptly ordered the Israeli Defense Forces to surround the camps. They refused to let anyone leave, and then permitted his Lebanese allies, the rightist Christian Phalangists, to move in.
The result was the slaughter of at least 900 to 3,000 unarmed Palestinians, including many women and children, on September 16-18, 1982. As former Secretary of State George Shultz later commented, "The brutal fact is, we are partially responsible." Israel's own Kahan Commission later found Sharon "indirectly responsible" for the massacre, but imposed no penalties, other than forcing him to resign as Defense Minister.
In the case of Angola, Reagan, in cooperation with South Africa's apartheid regime and Zaire's dictator Mobutu, helped to sponsor UNITA, Jonas Savimbi's rebel band, against the left-leaning MPLA, which also happened to have far stronger support from the Angolan people. Reagan hailed the power-hungry Savimbi as a "freedom fighter," and enlisted wealthy arch-conservatives like beer merchant Joseph Coors and Rite-Aid owner Lewis E. Lehrman to organize assistance and lobby Congress for millions in aid.
In fact Savimbi turned out to be one of the world's most lethal terrorists. Even after UNITA lost UN-supervised elections in September 1992, he continued the war, financing his operations by trafficking in "blood diamonds."
The resulting guerilla war cost the Angolan people up to 1 million dead, turned a quarter of Angola's 12 million people into refugees, and devastated health and education programs and the domestic economy. It also left an estimated 6 to 20 million landmines scattered across the country, making Angola one of the world's most heavily mined countries, with more than 80,000 amputees as a byproduct. Only when Savimbi was finally killed in February 2002 was the country restored to peace.
In the case of Afghanistan, Reagan considerably expanded aid to the Afghan rebels in the early 1980s, providing them more than $1 billion in arms and sophisticated weapons like Stinger missiles to fight the Soviets. The resulting battle ultimately cost the Soviets 15,000 lives. But the price to Afghanistan was much higher -- the Afghan people suffered more than 1 million dead and wounded, plus millions of refugees. Furthermore, after the Soviets finally left in 1989, the country became a stomping ground for opium-dealing warlords, religious fanatics like the Taliban, and al-Qaeda's global terrorists.
Furthermore, we now know that Gorbachev had offered to pull Soviet troops out of Afghanistan in 1987, in exchange for reduced US arms shipments to the rebels. However, he was rebuffed by the Reagan Administration, which wanted to prolong the Soviets' agony. This not only cost a great many more Afghan (and Soviet) lives, but also helped turn Osama Bin Laden from a nobody into a folk hero. All this helped to pave the way to 9/11, the continuing war in Afghanistan, and the even more dangerous global terrorist war.
All told, then, the Reagan Administration clearly has a lot to answer for with respect to the developing world. And this is even apart from one of the most perfidious examples of Reagan's brutilitarian policies, that of Nicaragua -- as the following excerpt from The Blood Bankers makes clear.
NICARAGUA'S COUNTERREVOLUTION
By the end of 1980, with Nicaragua's civil war over, General Anastasio Somoza Debayle dead in Paraguay, and the country's debt settlement with its foreign banks concluded, many Nicaraguans were looking forward to rebuilding their economy and finally achieving a more peaceful society. Alas, it was not to be.
Undoubtedly the Sandinistas deserve some of the blame for the way things turned out, though, as we will see, the odds were clearly stacked against them. As the strongest faction in the winning coalition, and “the boys with the guns,” at first they commanded overwhelming popular support for having rid the country of the world’s oldest family dictatorship outside of Saudi Arabia and Paraguay. However, like Venezuela’s Hugo Chavez in the 1990s, they were torn between leading a social revolution and building a multi-party democracy.
Their hero, Augusto “Cesar” Sandino, “the general of free men,” had fought the US military and the Nicaraguan army for six years to a standstill, before he was betrayed and murdered by General Anastasio Somoza Garcia in 1934. After a decade of insurgency in the 1970s, the Sandinistas’ most important experiences to prepare them for the job of running the country were limited to armed struggle, clandestine organizing, and some very rough times in Somoza’s jails. Unhappily, one of their most accomplished political leaders, Carlos Fonseca, had been murdered by the National Guard in 1976.
On the other hand, as South Africa demonstrates, it is not impossible for committed revolutionaries to lead a fairly peaceful transition to a multi-party democracy. After all, the ANC had waged just as long a struggle against a state that was no less repressive than Somoza’s. Many of the ANC’s supporters were also just as radical as the Sandinistas, and it also sourced most of its weapons and advisors from radical watering holes like the Soviet Union, East Germany and Libya.
However, ironically, South Africa was not as easy for the US to push around as Nicaragua. South Africa accounted for two-thirds of sub-Saharan Africa’s economy and most of the world’s gold, diamonds, platinum, and vanadium. By 1982, with some help from the UK and Israel, it had acquired nuclear weapons. Compared with Nicaragua, South Africa’s economy was actually in pretty good shape when the ANC came to power. While there had been a protracted low-intensity war against apartheid, South Africa managed to avoid the full-blown civil war that Nicaragua was forced to undertake in the 1970s to rid itself of the Somoza dictatorship.
Nicaragua was also objectively a far less strategically important target. To Washington’s national security planners, however, that made it an ideal opportunity for a relatively low-cost “demonstration." Its population was the same as Iowa’s. Its entire economy was smaller than Des Moines’s. It had few distinctive natural resources. Its only “weapons of mass destruction” were volcanoes, earthquakes, and hurricanes. It was surrounded by other countries that were also of modest strategic value – except for whatever symbolic value was associated with repeatedly crushing the aspirations of impoverished peasants into the dirt.
During the late 19th century, Nicaragua had been selected several times over by US Canal Commissions for a canal across Central America, until Teddy Roosevelt finally opted to create Panama and build a canal across it in 1902, for reasons that had more to do with Wall Street than engineering. After that, Nicaragua’s canal plans went nowhere, especially after the US Marines landed in 1910 to collect debts owed to British and US banks and to depose a nationalist leader who, among other things, made the fatal mistake of seeking European funding for an alternative to the Panama canal.
The ANC also had one other weapon that the Sandinistas clearly lacked. This was the extraordinary wisdom and good fortune of 72-year old Nelson Mandela, who had earned everyone’s respect during his 27 years in prison. He had also learned survival skills like patience, diplomacy, and the capacity for making adroit compromises with bitter enemies. Under his influence, the ANC set out to build a mass party. It agreed to hold new elections within two years of his release. It went out of its way to commit itself publicly to multi-party democracy, a market economy, civil liberties, and peaceful reconciliation.
Most of the Sandinistas’ top leaders – the so-called cúpula – were not really interested in building a mass party, much less a multi-party democracy, at least not initially. They saw themselves as a vanguard party, leading the masses toward a social revolution. As Sergio Ramirez, a leading FSLN member who served as Nicaragua’s Vice President under Daniel Ortega from 1984 to 1990, wrote in his 1999 book, Adios Muchachos,
The FSLN was not prepared...to assume its role of party of opposition inside a democratic system, because it had never been designed for this. Its vertical structure was the inspiration of Leninist manuals, of the impositions of the war and of caudillismo, our oldest cultural heritage.
To be fair, the FSLN leadership also believed that the first priority was to attack the country’s dire health, literacy, land ownership, and education problems, and to build “direct democracy” through civic organizations, not through party politics and national elections. Given the country’s emergency and the need to recover from the civil war, this was entirely understandable. But it did provide cheap shots for the FSLN’s opponents and the mainstream US media, which basically wrote Nicaragua off very early as a reprise of Castro’s Cuba.
The Sandinistas were also widely criticized for lacking the soft touch when it came to domestic politics. Among their many ham-handed moves were their May 1980 decision to expand the Council of State to include “mass organizations,” the August 1980 decision to postpone elections until 1984, the rough way they dealt with the Miskito Indians, the 1986 decision to shut down the (by then, CIA-subsidized) La Prensa, and Daniel Ortega’s various high-visibility trips to Havana, Moscow, Libya, and Gucci’s eyeglass counter in New York. They were also criticized for implementing a compulsory draft, detaining alleged contra sympathizers without trial after the contra war heated up, permitting the FSLN’s National Directorate (Daniel Ortega, Tomas Borge, Victor Tirado, Henry Ruiz, and Bayardo Arce) to remain an unelected (all-male) body until 1991, and seizing a huge amount of property from ex-Somocistas, even middle-class ones, for their own use during the “pinata” period after Ortega lost the 1990 election -- including more than a few beach houses.
At the same time, they were not given much credit for preserving a mixed economy, reforming the health and education systems, pursuing aid from numerous non-Communist countries in Latin America and Europe, implementing a badly-needed land reform, and tolerating the virulent La Prensa, which supported the contras and called for the government's overthrow, until they finally reached the limit and shut it down in 1986. Nor were they given much credit for ultimately holding free elections in November 1984 and February 1990, and for respecting the outcome of those elections even when, as in 1990 (...and 1996, and 2001..) they lost.
The basic reality is that from at least 1981 on, Nicaragua’s new government was operating in an increasingly hostile international environment, where the Western media and the USG, as well as the Miami-based Somocistas, were predisposed to seize upon the slightest departures from Robert’s Rules of Order to consign them to hell – and if no such departures were readily at hand, to invent them out of whole cloth. These hostile attitudes had much less to do with the FSLN’s behavior than with the USG’s new aggressive stance with respect to the Soviet Union – actually dating back at least to President Carter’s initiation of a contra-like war against the Soviet-backed government in Afghanistan in July 1979.
STATE-FUNDED TERRORISM - REAGAN STYLE
So, despite all the FSLN’s undeniable missteps, it would probably have taken divine intervention to save Nicaragua from the wrath of Ronald Reagan, who decided almost immediately upon taking office to single tiny Nicaragua out for a replay of the Carter/ Brzezinski strategy in Afghanistan.
As former CIA analyst David MacMichael testified in the hearings at the International Court at The Hague on the lawsuit brought by Nicaragua against the US in 1986, from early 1981 on the US Government set out to create a “proxy army” that would “provoke cross-border attacks by Nicaraguan forces and demonstrate Nicaragua’s aggressive nature,” forcing the Sandinistas to “clamp down on civil liberties.....arresting its opposition, (and) demonstrate its allegedly inherent totalitarian nature.”
In other words, if they were not totalitarian enough to begin with, we would see to it that they became totalitarian – and then blame them for making the switch.
President Reagan offered several different justifications for this ultimately rather bloody-minded policy. In March 1983, in a speech to Congress, he presented his subversion theory, warning that the Sandinistas had already “imposed a new dictatorship…supported by weapons and military resources provided by the Communist bloc, (that) represses its own people, refuses to make peace, and sponsors a guerrilla war against El Salvador” (emphasis added).
At other times, he emphasized the beachhead theory, according to which the Sandinistas provided a “Soviet beachhead… only two hours flying time away from our borders…with thousands of Cuban advisors…camped on our own doorstep…close to vital sea-lanes.” He offered similar characterizations of the threat posed by left-wing guerillas in El Salvador, Honduras, and Guatemala. In 1982, Jeane Kirkpatrick, Reagan's hawkish UN Ambassador, also promoted this beachhead theory with her own profound geographical analysis:
I believe this area is colossally important to the US national interest. I think we are dealing here not...with some sort of remote problem in some far-flung part of the world. We are dealing with our own border when we talk about the Caribbean and Central America and we are dealing with our own vital national interest.
Other elements were also sometimes thrown into the mix. On November 6, 1984, just two days after the Sandinistas won a decisive 67-percent victory in the country’s freest elections in history, there was a huge media flap in the US press over their alleged attempt – later proved false – to buy Soviet MiGs for air defense. This story later turned out to be a wholesale concoction of the State Department’s “Office of Public Diplomacy,” and of Oliver North, Otto Reich, and Robert McFarlane in particular, just one of many US propaganda efforts that were designed to distract attention from the FSLN’s victory in those elections.
Together, the subversion theory and the beachhead theory added up to a revival of the time-worn domino theory, transposed from Southeast Asia to Central America. Apparently, the notion was that since Nicaragua bordered on Honduras and El Salvador, which bordered on Guatemala and Belize, which bordered on Mexico, the Red Army might soon be drinking margaritas on the banks of the Rio Grande. Or the Reds might just jet in to El Paso in their MiGs from Managua, “only two hours away.” The fact that “they” were already 90 miles away in Havana, armed with brand new MiG 23 Flogger bombers and MiG 29s, did not get much mention from the Gipper. After all, Cuba had already demonstrated that it could stand up to a US invasion, and the Bay of Pigs was not a happy memory.
This rather strained analysis of Nicaragua’s purported threat to US national security was later endorsed, with only slight variations, by the January 1984 Bipartisan National Commission on Central America chaired by Dr. Henry Kissinger. One might have expected Kissinger to reach a different conclusion, given his long personal experience with Vietnam, Laos, Cambodia, and China, whose leftist regimes spent most of the 1970s fighting with each other, demonstrating conclusively the power of nationalism over solidarity. But he was performing the assignment to ingratiate himself with the Republican Party’s conservative wing. And unlike the National Commission on Terrorist Attacks, which he resigned from in December 2002, it did not require him to identify his consulting firms’ private clients.
In any case, well into the 1990s, long after there were peace settlements in Nicaragua, El Salvador, and Guatemala, and long after the Sandinistas had handed over political power to their opponents, hawkish Republicans like Senators John McCain and Jesse Helms were still seeing ghosts in Nicaragua, trying to make hay out of the Sandinistas’ potential subversive threat. Indeed, as we’ll see, these charges even played a role in Daniel Ortega’s defeat in Nicaragua’s Presidential elections in 2001, even when his running mate was Violeta Chamorro’s son-in-law!
Eventually, in fact, all the stockpiles of AK-47s, landmines, rocket launchers, and surface-to-air missiles acquired by the Sandinistas to defend Nicaragua against the contras did end up posing a security threat to the US. But it was not precisely the one that the Sandinistas' right-wing critics had predicted. In November 2001, Colombia’s 11,000-strong nasty, right-wing, drug-dealing paramilitary group, the AUC, procured 3,500 AK-47s from Nicaragua’s military stockpiles, by way of Israeli arms merchants based in Panama and Guatemala. The arms were part of a five-shipment package that included 13,000 assault rifles, millions of bullets, grenade and rocket launchers, machine guns, and explosives. The AUC, which was on the G.W. Bush administration’s official list of terrorist groups, was supported by landlords who wanted to combat Colombia’s leftist guerillas, the ELN and the FARC. The AUC was also supposedly fighting Colombia’s Army. From 2000 to 2003, Colombia received $2.5 billion of US military aid, plus more than 400 Special Forces troops, making it the world’s third largest recipient of US aid. The AUC also reportedly purchased arms from army stockpiles in El Salvador and Guatemala. In 2002, an OAS study also revealed that a Lebanese arms broker with al Qaeda links had tried to purchase 20 SA-7 missiles from Nicaragua’s stockpiles. The US started pressuring Nicaragua’s President Bolaños, a neoliberal businessman, to reduce these stockpiles – but hopefully not by selling more of them to the AUC.
In the long run, therefore, by forcing the comparatively-harmless Sandinistas to stockpile all these weapons to defend themselves, and by also arming the right-wing militaries of El Salvador and Guatemala to the teeth, the US had set a trap for itself.
In reality, of course, Nicaragua’s leftists, even if they had been so inclined, were neither necessary nor sufficient to “subvert” their neighbors. Those neighbors with the most serious liberation movements, like El Salvador, Guatemala, and Colombia, had long since done a perfectly good job of subverting themselves. Their rebel movements developed over many decades from within, on the basis of incredibly-unbalanced social structures. For example, El Salvador’s catorce, its top 14 families, controlled 90-95 percent of that country’s land and finance capital, while in Guatemala, just 2 percent of the population controlled more than 70 percent of arable land. These situations were only a slightly more anonymous version of Nicaragua, where the Somoza family alone had laid claim to a quarter of the country’s arable land. And the resulting social conflicts were similar -- in the 1980s, El Salvador’s class war claimed more than 80,000 lives, while Guatemala’s claimed 200,000, with the vast majority due to their own brutal armed forces and paramilitaries.
On the other hand, Costa Rica, Nicaragua’s good neighbor to the south, had long since inoculated itself against revolution by developing an old-fashioned middle-class democracy, with lots of small farms and more teachers than police, having completely abolished its military in 1948.
Furthermore, while the Reagan Administration asserted over and over again in the early 1980s that the Sandinistas had shipped arms to leftist guerillas in El Salvador, two decades later these allegations have been shown to be as spurious as the MiG purchases. In fact, the Sandinistas’ aid to El Salvador’s rebels, the FMLN, was minuscule, and it was terminated in 1981, as the World Court concluded in 1986. The claim that El Salvador’s FMLN had acquired several hundred tons of weapons from the East Bloc, Arafat, and Libya (!) had also been pulled out of thin air. In fact, the rebel armies in El Salvador and Guatemala were poorly armed, except for Galil rifles and rocket launchers they managed to steal or purchase from corrupt army officers. Leading Sandinistas like Tomas Borge also explicitly rejected the notion of “exporting revolution,” except by way of the FSLN’s own example. After all, the FSLN had not needed Soviet or Cuban backing for their own revolution. They also had their hands full rebuilding Nicaragua. The last thing they needed was another war with El Salvador or Guatemala, in addition to the contra war.
Finally, while the Sandinistas were not liberal democrats, and, as noted, committed many political blunders, they were scarcely in a position to run a “dictatorship,” even within Managua’s city limits. To their credit, they had greatly increased the amount of popular involvement in the country’s governance. In November 1984, they held national elections that most international observers, including Latin American scholars and Western European parliaments, agreed were reasonably clean, despite the Reagan Administration’s provision of $17 million to opposition candidates, its systematic efforts to discredit the elections, and the fact that by then Nicaragua was already under steady assault from US-backed contras. Certainly by comparison with the Somozas’ rigged elections, other countries in post-war situations, and El Salvador and Guatemala in particular, Nicaragua’s degree of political freedom was tolerable, if not beyond reproach.
Yet when 75 percent of registered voters turned out for the November 1984 elections, and the FSLN received a commanding 67 percent of the vote, capturing the Presidency and 61 of 96 seats in the new National Assembly, Nicaragua was again accused by the Reaganites of being a “dictatorship.” As former New York Times Editor John Oakes remarked at the time, “The most fraudulent thing about the Nicaraguan election was the part the Reagan Administration played in it.”
The other troubling fact for Reagan’s Nicaraguan policy was that, objectively, the Soviet Union really did not have much interest in acquiring yet another dependent, state-socialist backwater like Vietnam, Afghanistan, or Cuba -- which by the early 1980s was already costing the USSR about $3 billion a year in aid. In hindsight, we now know that, far from being an expansionist Evil Empire, at this point, the USSR was really just hanging on for dear life -- a wounded giant, obsessed with its own serious economic problems, which were even forcing it to import grain from Argentina’s fascist junta! Internationally, it had its hands full just trying to stave off an embarrassing defeat in Afghanistan on its own southern border. It was also pressing existing client states in Eastern Europe and Southeast Asia hard to practice self-reliance.
Finally, in 1980-81, before the US made it absolutely clear that it was seeking “regime change” in Nicaragua, the Sandinistas tried to restore good economic relations, plus access to World Bank and IDB loans. But for the US intervention, this access would have been maintained. And that, in turn, would have significantly reduced Nicaragua’s dependence on East-Bloc aid. After all, as a senior World Bank official noted in 1982, “Project implementation has been extraordinarily successful in Nicaragua, perhaps better than anywhere else in the world.”
About that time, Nicaragua also sought aid from many non-Soviet countries, including Venezuela, Mexico, and France. It was most successful with Mexico, which resisted US pressure and became Nicaragua’s largest aid provider until 1985. Nor did Nicaragua turn immediately to the Soviet Bloc for aid. When it tried to buy $16 million of arms from France in early 1982, however, President Reagan got the French President, François Mitterrand, to delay the sale “indefinitely.” Only then – under increasing attack from the contras – did Nicaragua turn to the Soviet Union and Cuba for significant quantities of arms and advisors.
Of course, as noted, many Sandinistas were undoubtedly committed radicals, dedicated to policies like land reform, free health and education, and the seizure of Somocista-owned properties. But these policies were entirely defensible, given Nicaragua’s economic conditions and its need to play catch-up with basic social justice. These are, after all, policies that the US has itself supported, or at least tolerated, in other times and places, when they happened to serve its interests.
The Sandinistas may have been mulish and full of radical bravado, but they were far from anyone’s pawns. These characterizations were 1950-vintage hobgoblins, left over from the days when Ronnie ran the Commies out of the Actors Guild in LA. At best, they reflected a desire to show the Evil Empire who was boss, by making an example of some weak little pinko regime.
On this view, then, in the early 1980s the USG basically succeeded in pushing tiny Nicaragua into relying heavily on Soviet and Cuban arms and economic aid for its own survival – as, indeed, the USG may have also done with Fidel’s Cuba back in 1959-60. The USG then used that reliance as an excuse to expand its own provocations into a full-scale war that ultimately claimed 30,000 lives. In the historical record books, this is surely one of the clearest examples of state-funded terrorism ever.
SAYING "UNCLE"?
All these inconvenient little details were brushed aside by the Reaganites when they took office in January 1981, raring, in President Reagan’s words, to make the Sandinistas “say uncle.” Say uncle they never did -- in fact, by 1988, they’d “whupped” Olly North’s contras pretty good. But that was not for want of US efforts.
In March 1981, President Reagan signed an Executive Order that mandated the CIA to undertake covert operations in Central America, to interdict arms shipments “by Marxist guerillas.” By November 1981, the US focus had shifted from arms interdiction to regime change. That month, the Administration provided an initial $19 million to mount a pretty transparent “covert” effort to destabilize Nicaragua. The strategy, implemented by the now-famous gang of Presidential pardonees, was the classic scissors tactic that had been employed by the US and its allies in many other 20th century counterrevolutionary interventions, notably Russia (1918), Guatemala (1954), Cuba (1959-60), and Chile (1973).
On the one hand, the USG tried to cut off Nicaragua’s cash flow, reducing access to new loans from the IMF, the World Bank, and the IDB, as well as all EXIM Bank funding and OPIC risk insurance. In September 1983, the US slashed Nicaragua’s sugar quota. In November 1985, it added a total embargo on all trade with the US, Nicaragua’s main trading partner and foreign investor up to then. Given the country’s dire economic straits, this had the practical effect of cutting off all US private investment and bank lending.
At the same time, the Reagan Administration was stubbornly opposing all efforts to embargo trade or investment with respect to South Africa’s racist apartheid regime. In September 1983, for example, the State Department approved a Westinghouse application to bid on a $50 million ten-year contract to maintain and supply South Africa's two nuclear power stations. The US also continued to support World Bank and IDB loans to the right-wing regimes in Guatemala and El Salvador throughout the 1980s.
The other half of the scissors strategy was the USG’s effort to create, finance, arm, and determine strategy and tactics for an 18,000-person contra army, financed with $300 million of taxpayer money, in-kind military assistance, another $100-$200 million raised from private donors like the Sultan of Brunei, and an untold amount of cocaine proceeds. The main faction, the Frente Democrático Nacional (FDN), consisted of 3,000 ex-Somocista National Guard members and another 12-13,000 assorted mercenaries, anti-Castro Cubans, Israeli trainers, Argentine interrogators, and cocaine traffickers of several different nationalities. The Reaganites knew they were not dealing with angels here. As the CIA’s Inspector General later admitted in 1998, the agency made sure to get a statement from the US Department of Justice in 1982, waiving the CIA’s duty to report drug trafficking by any contra contractors.
From 1982 to 1989, this murderous scalawag army stoked a war that ultimately took about 30,000 lives, including those of 3,346 children and more than 250 public school teachers. Another 30,000 people were wounded, and 11,000 were kidnapped, according to the National Commission for the Protection and Promotion of Human Rights. Another half million fled the country to avoid the chaos. With the help of Harvard Law School Professor Abram Chayes, Nicaragua later successfully sued the US for launching these and other terrorist attacks and causing all this damage. In November 1986, the International Court at the Hague found the US liable for several clear violations of international law – notably, for launching an unprovoked war that was not justified by any “right of self defense.” The Court suggested that appropriate damages for the resulting property damage were on the order of $17 billion. But the Reagan Administration declined to appear in court, and refused to recognize the judgment.
THE WORLD'S HEAVIEST DEBT BURDEN
The detailed history of Nicaragua’s contra war has been told elsewhere, at least those parts of it that are not still classified, like much of the record of US knowledge about the contras’ extensive cocaine trafficking activities, and President Reagan’s confidential discussions with his aides, kept off limits for an indefinite period by an Executive Order signed in 2001 by President G.W. Bush.
Our main interest here is in the war’s devastating impact on Nicaragua’s economy and its crushing foreign debt burden. Ultimately, the FSLN soundly defeated the contras with a combination of adroit military tactics – for example, heavily-mined “free-fire” zones along its northern border with Honduras – and a large standing army, raised by draft. To pay for all this, however, the FSLN had to boost military spending, from 5 percent of national income in 1980 to 18 percent in 1988, when the first in a series of armistices was finally signed. By then, more than half of Nicaragua’s government budget was devoted to paying for an army that numbered 119,000 regular soldiers and militia – 7 percent of all Nicaraguans between the ages of 18 and 65.
Early on, the Sandinistas had made a strong commitment to building new health clinics and schools in the country. These social programs, plus land reform, were among their most important accomplishments. Even in the midst of the war, with the help of 2,500 Cuban doctors, they managed to increase spending on health and education, open hundreds of new medical clinics, and sharply reduce infant mortality, malnutrition, disease, and illiteracy. They also implemented a land reform that redistributed more than 49 percent of Nicaragua’s arable land to small farmers.
But the war made it very hard to sustain these undeniable social accomplishments. Despite the FSLN’s military “victory,” Nicaragua’s regular economy took a direct hit. Trade and investment plummeted, unemployment soared to 25 percent, and inflation reached more than 36,000 percent by 1988-89. From 1980 to 1990, Nicaragua’s average real per capita income fell 35 percent, and the incidence of poverty rose to 44 percent. To deal with shortages in the face of soaring inflation, the FSLN had to implement a rationing system for food and other basic commodities. As the Nixon Administration had done to the Allende regime in Chile a decade earlier, so the Reaganites did to Nicaragua – they made the economy “scream.”
All told, by 1990, Nicaragua had displaced Honduras as the poorest country in Central America. It had also become the world’s most heavily indebted country. To fund the defense budget and their other commitments in the face of declining tax revenues, trade, investment, and multilateral funding, the FSLN partly relied on inflationary finance, by having the Central Bank just print more cordobas. But for vital foreign purchases, including oil and weapons, it required dollar loans from sympathetic countries, mainly the Soviet Union ($3.3 billion), Mexico ($1.1 billion), Costa Rica, Germany, Spain, Venezuela, Brazil, and Guatemala (!), plus more than $500 million from the Central American Bank for Economic Integration, one multilateral institution that the US did not control.
When the newly-elected government of Violeta Barrios de Chamorro took office in April 1990, the debt stood at $10.74 billion – more than 10 times its level in 1980, and nearly 11 times Nicaragua’s national income.
This was by far the highest foreign debt burden in the world, thirty times the average debt-income ratio for all developing countries. And it was not derived from “technical policy errors,” “economic accidents,” or “geographic misfortune.” Part of it was the $1.5 billion of dirty debt left over from the Somoza years. The rest derived from the ruthless persecution by the world’s most powerful country of a tiny, stubborn Central American nation that was determined to finally make its own history.
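(A rough back-of-the-envelope check of these ratios, for readers who like to see the arithmetic. It uses only the figures quoted above; the roughly $1 billion national-income figure is an inference for illustration, not a number reported in this piece.)

\[ \text{implied national income (1990)} \approx \frac{\$10.74\ \text{billion}}{11} \approx \$1\ \text{billion} \]

\[ \frac{\text{debt}}{\text{income}} \approx \frac{10.74}{1.0} \approx 10.7, \qquad \text{implied developing-country average} \approx \frac{10.7}{30} \approx 0.36\ \text{(about 36 percent of income)} \]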
CONCLUSION - REAGAN'S IMPACT ON NICARAGUA
In the 1980s, against all odds, and woefully ignorant of economics, politics, business, and diplomacy, a handful of rather foolhardy Nicaraguans dared to challenge the Reagan Administration's attempt to prevent them from controlling their own destiny.
They made many mistakes, and they required much on-the-job training. But at least they tried to stand up.
When they did so, they were attacked, and when they defended themselves, they were portrayed as the aggressors. Ultimately they won a victory of sorts, but it left their country a shambles.
Then their successors, worshipers of the latest fashions in neoliberal economic theology, came to power promising reform and freedom, and ended up turning the country into a bantustan.
Perhaps Nicaragua will need another revolution.
(c) James S. Henry, SubmergingMarkets.com(tm) 2004. Not for reproduction or other use without express consent from the author. All rights reserved.
June 17, 2004 at 08:05 PM | Permalink | Comments (2) | TrackBack
Monday, June 14, 2004
The "Reagan Revolution," Part One: Did He Really Win the Cold War?

INTRODUCTION
Former President Reagan’s $10 million taxpayer-funded bicoastal funeral extravaganza is finally over, so we may now be able to regain a little objectivity about the man’s true accomplishments. This really was an extraordinary Hollywood-scale production – one of Reagan’s best performances ever. Apparently the actor/President started planning it himself way back in 1981, shortly after he took office, at the age of 69. Evidently he never expected to live to be 93.
For over a week we have been inundated with neoconservative hagiography from adoring Reagan fans -- one is reminded of Chairman Leonid Brezhnev's funeral in 1982. Even the final event’s non-partisan appeal was slightly undercut by the fact that only die-hard conservatives like President George W. Bush, former President George H. W. Bush, Margaret Thatcher, and Canada’s Brian Mulroney were invited to give eulogies. But at least this saved Democrats the embarrassment of having to say nice things about their fiercest, most popular, and most regressive antagonist of the 20th Century.
Some Bush campaign staff members reportedly recommended shipping the casket home from DC to California by train. Cynics suggested that this was intended to prolong the event and further distract voters from Bush’s serious political difficulties.
Thankfully Nancy Reagan spared us this agonizing spectacle. She probably recognized that it would only invite unfavorable comparisons with FDR, whose family eschewed the state funeral in favor of the more humble train ride. Furthermore, AMTRAK no longer serves most of the towns along the way, due in part to service cutbacks that really got started under President Reagan.
There has already been quite a bit of dissent from the many one-sided tributes to Reagan. Most of it has focused on domestic policy -- especially Reagan's very mixed track record on civil rights and the HIV/AIDS epidemic, his strong anti-union bias, the huge deficits created by his “supply-side” legerdemain, his deep cutbacks in welfare and education spending, and his weak leadership on conservation, the environment, consumer protection, and energy policy. Reagan’s hard-right bias on domestic policy certainly was underscored by the almost complete absence of blacks and other minorities among the ranks of ordinary Americans who lined up to mourn his passing.
With respect to foreign policy, the “Iran-Contra” arms scandal and Reagan’s support for apartheid were recalled by some observers. But most of the attention was directed to Reagan's supposedly uniformly positive contributions to the demise of the Soviet Union and the end of the Cold War.
In this article, the first of two in this series, we'll examine Reagan's foreign policy contributions more closely. The analysis has important implications not only for our assessment of Reagan, but also for the White House's current incumbent.
DID RONNIE REALLY “WIN THE COLD WAR?”
We can debate this alleged role endlessly. Of course, like John Kennedy (“Ich bin ein Berliner”), Reagan made one very forceful speech in Berlin (“Mr. Gorbachev, tear down this wall!”). Especially during his first term, he also supported policies that tried to roll back the Soviet Empire’s frontiers in distant places like Afghanistan, Angola, Nicaragua, and Grenada. He also expanded the US defense budget, accelerated the deployment of theater nuclear missiles in Europe that had already been set in motion under President Carter, and financed the (largely nonproductive) first round of the “Star Wars” anti-missile program. All these moves no doubt increased pressure on the Soviets, and probably encouraged them to negotiate and reform.
However, Reagan was hardly responsible for the fact that the “Soviet Empire” had been more or less successfully “contained” almost everywhere except Cuba, Vietnam, and Afghanistan from the 1950s to the 1980s, and that even these client states had become more of a burden to the USSR than a blessing.
Nor was he responsible for the fact that President Carter had initiated anti-Soviet aid to the Poles and the Afghan rebels in the late 1970s; secured NATO’s December 1979 decision to deploy long-range cruise and Pershing II missiles in Europe, partly in response to the Soviets’ deployment of SS-20 missiles aimed at Western Europe; suspended Senate consideration of the SALT II Treaty in January 1980; and issued Presidential Directive 59 in August 1980, adopting a new, much more aggressive “countervailing force” strategy for nuclear war.
Nor did Reagan have much to do with the fact that a whole new generation of Soviet leaders, including Mikhail Gorbachev, took power in 1984-85, or the fact that these new leaders chose the “glasnost/ big bang” route to reform rather than the more gradual and successful one that has kept the Chinese Communist Party in power to this day. This was also a matter largely of the Soviets' own choosing.
Nor was Reagan responsible for the fact that Gorbachev, who actually sought to preserve a stronger, reformed version of the Soviet Union rather than disband it, proved to be much less adept at Russian politics than Boris Yeltsin.
Even if we acknowledge that Reagan’s policies contributed to ending the Cold War, therefore, the historical record is very far from giving him “but for” credit for this happy ending.
In fact, even if "Cold War liberals" like Jimmy Carter and Fritz Mondale had presided over the US throughout the 1980s, the odds are that the very same key systemic and generational factors that helped to produce fundamental change in the Soviet system would have still applied – with very similar outcomes.
WHAT RISKS DID RON RUN?
In the literature on the economics of investment, it is well established that (at least in equilibrium, with competitive markets) there are no increased rewards without increased risk. When it comes to evaluating historical leaders, however, apparently this basic principle is often overlooked.
Reagan’s confrontational approach to the “Evil Empire” clearly was very distinctive. But this was hardly an unmixed blessing. Indeed, we now know that he took incredible risks in the early 1980s, and, as discussed below, that we are all extraordinarily lucky to have survived this period intact.
Furthermore, we are all still living with serious systemic risks that are a direct byproduct of Reagan’s high-risk strategies—even apart from the long-term legacy of his Afghan “freedom fighters” and latter-day terrorists.
For example, only in the mid-1990s, after the USSR’s collapse, did we learn that the Soviet Politburo and top Soviet military planners really had become convinced in the early 1980s that Reagan had adopted a new pro-nuclear war-fighting strategy, changing from “mutually assured destruction” to the pursuit of an all-out victory.
Soviet leaders came to this conclusion partly because of several key developments in military technology and strategy.
- By the early 1980s the US had acquired a growing advantage in submarine-based nuclear weapons (D-5 Trident missiles, with greater accuracy and short flight times) and anti-submarine warfare techniques, as well as space-based communications, surveillance, and hunter-killer satellite capabilities.
- As noted, Carter and Reagan both started to deploy cruise and Pershing II missiles in Europe and on submarines. These were just 4-6 minutes from the Soviets’ command-and-control centers and many of their ICBM silos, which they counted on for up to two-thirds of their deterrent capability.
- In the early 1980s the US also took several steps that were apparently intended to increase its chances of surviving a nuclear war. These not only included “Star Wars,” but also hardened telecommunications, new command-and-control systems and some “civil defense” measures, and revised policies for “continuity in government.”
On top of these structural changes, the Reagan Administration’s aggressive rhetoric and behavior also contributed to this new Soviet view of US intentions.
In early 1981, for example, Reagan ordered the military to mount a still-highly-classified series of “psyops” that probed USSR airspace and naval boundaries with US and NATO jet fighters and bombers, submarines, and surface ships. The US and NATO also conducted several large-scale exercises in 1982-84. The US also sharply increased its assistance to “freedom fighters” like the Nicaraguan contras, the Afghan rebels, and Jonas Savimbi’s bloodthirsty South-Africa-assisted renegades in Angola.
As we now know, all this belligerent US activity scared the living daylights out of old-line Soviet leaders like Yury Andropov. It reminded them of Hitler’s sudden blitzkrieg attack on the Soviet Union in 1941, a searing experience for which Stalin had been surprisingly unprepared. They came to believe that the US was actually planning the nuclear equivalent of this blitzkrieg -- a first strike that would decapitate Soviet command-and-control while minimizing the effects of retaliation on the US. Of course Europe would probably be destroyed in such a confrontation. But the Soviets assumed, perhaps correctly, that the US saw “Old Europe” as dispensable.
In response to this perceived US threat, the Soviets did not roll over and play dead. Rather, drawing on their 1941 experience, their first response was to assume the worst and try to prepare for it.
- From May 1981 on, they ordered a worldwide intelligence alert, code-named “RYAN”, aimed at keeping the Politburo informed on a daily basis of US preparations for a first strike.
- The Soviets shifted their nuclear posture decisively to “launch-on-warning.” For the first time they also provided the Politburo with the ability to sidestep the Soviet General Staff and launch all strategic missiles with a central command. To support this shift, they also deployed new ground-based radar and space-based early-warning systems.
- Most striking of all, in the early 1980s the Soviets also implemented a full-scale nuclear “doomsday” system, code-named “Perimeter.” This system, first tested in November 1984, placed the power to unleash a devastating retaliatory strike against the US essentially on autopilot, whenever the system “sensed” that a nuclear strike against Moscow had either occurred, or was about to occur.
Together, all these shifts in Soviet defensive strategy cut the decision time available to their leaders, when deciding how to respond to a perceived US/ NATO attack, to as little as 3-4 minutes.
As Gorbachev later put it: “Never, perhaps, in the postwar decades was the situation in the world as explosive and hence, more difficult and unfavorable, as in the first half of the 1980s.”
THE LEGACY
To our great distress, despite the mutual de-targeting that was announced with so much fanfare by Presidents Clinton and Yeltsin in 1994, both these Cold War “hair trigger” responses to Reagan’s initiatives are still in place today, responsible for controlling at least the 5000+ strategic nuclear warheads that Russia still maintains.
These systems have already experienced several close calls. Among the incidents that we know about were those in September 1983, August 1984, and January 1995. In this last incident, President Yeltsin -- who was not always a picture of mental health and stability -- came within minutes of unleashing a full-scale nuclear retaliation in response to a false alarm set off by a Norwegian research missile that was sent aloft to study the Northern Lights. Apparently it bore a striking resemblance to an incoming Trident missile on Russian radar until it crashed harmlessly in the sea.

The September 1983 incident, at the height of Soviet tensions with the Reagan Administration, and just weeks before the huge anti-Soviet NATO exercise “Able Archer” in Western Europe, was even scarier. In 2000, Lt. Colonel Stanislav Petrov, the duty officer who had been in charge of an early-warning bunker south of Moscow at the time, told Western journalists what happened when the new early-warning computers at his facility suddenly reported a full-scale US attack:
"I felt as if I'd been punched in my nervous system. There was a huge map of the States with a US base lit up, showing that the missiles had been launched. I didn't want to make a mistake…..I made a decision and that was it. In principle, a nuclear war could have broken out. The whole world could have been destroyed. After it was over, I drank half a liter of vodka as if it were only a glass and slept for 28 hours."
Ultimately it turned out that the new Soviet early-warning system had malfunctioned. Lt. Colonel Petrov had been forced to make a profound decision about world civilization in a matter of minutes, with alarms and red lights going off all around him.
Fortunately for all of us, he decided not to believe his own computers.
Unfortunately for all of us, a modified version of that same hair-trigger early warning system is still in place in both Russia and the US to this day, since neither side has ever reverted to the pre-Reagan “MAD” strategy -- and Lt. Colonel Petrov has long since retired to a humble Moscow flat.
SUMMARY
From this vantage point, President Reagan’s long-term legacy is a little more difficult to evaluate, even with respect to his impact on the Cold War.
- Clearly he had a great deal of help from others, as well as from sheer fortuity.
- We are still living with the heightened risks in the world system that were partly created by the aggressive nuclear strategy adopted by President Reagan, and to some extent by President Carter before him. If Russia’s early warning systems and doomsday systems – both of which are now reportedly starved for maintenance funds -- should ever fail, history may not be so kind to Ronald Reagan, assuming that there is anyone left to write it.
- Much of Reagan’s vaunted “strength” was really based on a blithe combination of sheer ignorance, blind faith, and risk taking. Compared with President Nixon (who, like FDR, also eschewed a state funeral), Reagan knew almost nothing about world affairs other than what he read in Reader’s Digest and (perhaps) National Review.
- On the other hand, compared with the insecure Nixon, who was constantly seeking reassurance from his advisors, Reagan certainly did have much more faith in his own convictions. With respect to the Soviet Union’s nuclear strategy, like a determined child, he may have never fully appreciated the fact that he was playing with…well, er.., much more than dynamite.
After the fact, of course, like any high-stakes gambler who bets it all on “black,” spins the wheel, and wins, Reagan looks like a hero, at least to many Americans.
However, whether or not ordinary citizens of the world should look back on this track record and cheer, much less encourage our present and future leaders to adopt similar blind-faith strategies, is very doubtful.
Indeed, today, most of the rest of the world seems to regard President Reagan -- rather more accurately than many Americans -- as the friendly, fearless, perhaps well-meaning, but really quite reckless “cowboy” that he truly was.
(c) James S. Henry, SubmergingMarkets.com, 2003. Not for reproduction or other use without express consent from the author. All rights reserved.
June 14, 2004 at 03:00 AM | Permalink | Comments (1) | TrackBack
Friday, June 11, 2004
A Truly Great American Died This Week

Like Ellington, Basie, Armstrong, and Miles Davis, Ray made his art his first and last love. From age 3 right up to the end, he continued to perform and tour even when his body, racked by a plethora of ailments, wanted to quit. Without any apparent effort, he was an authentic optimist, an embodiment of hope and courage for all young people who begin life with seemingly insurmountable difficulties. He was not just a hero to Americans, but was revered in many other countries where jazz is popular, from Russia and France to Japan and Brazil.
From one standpoint Ray was “blind.” But he taught us the difference between being able to look and being able to see. Ray was the real thing. On this national day of mourning, we mourn Ray's loss.
(c) James S. Henry, SubmergingMarkets.com, 2003. Not for reproduction or other use without express consent from the author. All rights reserved.
June 11, 2004 at 04:44 PM | Permalink | Comments (0) | TrackBack
Sunday, June 06, 2004
The Forgotten Members of the "Greatest Generation"


This weekend President Bush was in Europe, celebrating the 60th anniversary of D-Day. He was joined by thousands of American, British, Canadian, and French veterans of World War II, members of the so-called “Greatest Generation,” as well as the Queen, the UK’s Tony Blair, France’s Jacques Chirac, Russia’s Vladimir Putin, and Germany’s Gerhard Schroeder, all of whom converged on Normandy for commemoration ceremonies. As Schroeder duly noted, the fact that all the leaders of these former allies and enemies could finally come together to celebrate D-Day for the first time means that “the post-war period is finally over.”
Many other US leaders, from Donald Rumsfeld and Dick Cheney to John McCain and John Kerry, have also recently tried to associate themselves with the valor and sacrifices of American veterans in our increasingly long list of foreign wars. Their tributes have been similar, whether the veterans in question fought in wars that were short or long, one-sided or evenly matched, just or unjust -- and whether or not the politicians in question have ever spent even a single minute on an actual battlefield.
This year, such martial rhetoric is flying thicker than usual because of the coincidence of several events. In late April, the long-awaited $190 million memorial to America’s World War II veterans was finally unveiled in Washington DC. Its architectural reviews have been decidedly mixed, especially by comparison with the beautifully-understated Vietnam War memorial. But this certainly is a long-overdue tribute to the 16.4 million Americans who served in that objectively “good” war and the 405,000 who lost their lives in it.
We are also in the middle of an unusual US Presidential election race, which is proceeding while the country fights two wars at once – the war in Iraq and the less visible, potentially much more dangerous “war on terror.” Both key Presidential candidates are vying hard to be viewed as stalwart defenders of national security and close friends of the veterans community.
Indeed, the whole period from Memorial Day to July 4th has become a high season for veteran commemorations and martial romanticism. Those who happen to be part of the majority of Americans who are neither veterans nor members of the “Greatest Generation” may feel a bit uncomfortable – sort of like non-Christians at Christmas.
I don’t happen to share this discomfort. To begin with, my family has done more than its share of fighting for the nation since it arrived in Virginia in the 1620s. We’ve volunteered for almost every single “good” American war in history, from the Revolutionary War and the War of 1812 to the Civil War, World War I, World War II, and the Korean War. As for the “Greatest Generation,” we also supplied several authentic members, including my father, a World War II veteran who served four years with the Navy in the South Pacific, and my uncle, one of General Patton’s tank commanders who helped to liberate the Buchenwald concentration camp.
It is also not the fault of subsequent generations that almost all the wars that the US has chosen to fight since the Korean War in the early 1950s have been one-sided affairs, undertaken against more or less defenseless Third World countries like Vietnam, Cuba, Nicaragua, Grenada, Panama, Afghanistan, and Iraq. Except for Afghanistan, where the Taliban allowed al-Qaeda to build training camps, none of these countries ever attacked us or our allies, posed a serious direct threat to our national security, or even had air forces or navies, much less nuclear weapons.
These wars were basically neo-imperialist adventures in gunboat diplomacy. Not surprisingly, most of them proved to be vastly more lethal to the hapless natives than to US troops. For example, while the US lost 58,226 killed and 2,300 missing in Vietnam, the Vietnamese lost an estimated 1 million combatants killed, roughly 4 million more killed or wounded, and 200,000 missing in action, most of them to our relentless bombing campaigns.
Furthermore, while those American veterans who have served in genuinely defensive wars certainly deserve to be honored, our political leaders do no service by oversimplifying their contributions. As the history of Germany indicates, excessive militarism and the idealization of martial values like “honor, duty, and blind obedience to one’s superiors” may help to encourage still more aggressive wars, which creates more veterans, which creates more glorification, which encourages still more wars…..
So, in the interests of reversing this venomous cycle, we offer the following critique of “Greatest Generation” mythology -- and also pay homage to other members of the Greatest Generation whose contributions have largely been forgotten.
REALITY CHECKS
The conventional image that most Americans seem to have of the US role in World War II is that we – or, at most, the US and the UK – basically won the war.
In this view, fed by sixty years of Hollywood films, political rhetoric, jingoistic reportage in the mass media, and bad history courses, the vast majority of the “Greatest Generation” supposedly volunteered courageously to fight against the Axis Powers. The US military supposedly played a decisive role in defeating not only Japan but also Nazi Germany and Italy, and the Normandy invasion was supposedly critical to the German defeat.
Unfortunately, it was not quite that simple.
>Draftee Predominance. To begin with, of the 16.4 million US veterans who served in World War II, only about a third were volunteers. The rest were drafted. Even in this “best of all possible wars,” therefore, where the lines between good and evil could not possibly have been any clearer, compulsion, not volunteerism, was the main motor force. In fact, volunteerism was even less in evidence during World War II than it was during the Vietnam War. Just a quarter of the 2.6 million Americans who served in the Vietnam War from 1965 to 1973 were draftees – notwithstanding the role that the draft played in stimulating opposition to that war.
>Casualties. On D-Day, June 6, 1944, the US suffered a grand total of 6,603 casualties, including about 2,000 killed in action or missing. Our other non-Soviet Allies added another 3,646 casualties. All told, that first day, there were about 3,000 killed among all the Allies. For the entire Battle of Normandy, the US suffered 126,847 casualties, including about 30,000 killed, and the UK and other non-Soviet allies added 83,045 casualties.
For World War II as a whole, as noted, the US suffered 405,000 American deaths. About 290,000 of these were due to combat, the rest to accidents and disease.
These were heavy losses by comparison with other American wars. Only the Civil War recorded a larger number of total fatalities, but those included a huge number who died from disease. World War II's combat death toll was the largest for any US war.
Despite these records, the fact is that all these US and non-Soviet Allied casualty statistics pale by comparison with those suffered by our key Ally on the Eastern Front, the Soviet Union.
All told, the USSR lost 8.7 million to 11 million troops killed in combat against Germany, Italy, Rumania, and the other Axis coalition members from 1941 to 1945. This included 500,000 troops killed at the Battle of Stalingrad alone from September 1942 – January 1943. There were also 440,000 Soviet troops killed at the Siege of Leningrad from 1941 to 1944, 250,000 at the decisive Battle of Kursk in July 1943, and 450,000 on the march to Berlin in 1945.
In addition, there were also another 12-18 million civilian casualties in the Soviet Union during World War II, compared with the 60,000 civilian casualties that the British lost to German air raids, and negligible US civilian casualties.
>Strategic Role – Germany. Most important, far from playing a decisive role in defeating Hitler, the fact is that the D-Day invasion came so late in the war that even if it had been turned back by Hitler, the chances are that this would have only delayed the Soviet advances into Berlin by six months to a year at most, without fundamentally affecting the outcome of the war.
The Soviets had been lobbying hard for a D-Day invasion from 1942 on, but were resisted by Churchill, in particular, who favored a "southern" strategy through Italy and the Balkans -- and had been a lifelong hardline anti-Communist.
During the 1944 Normandy invasion and the Battle of France, the key battles involved, at most, about 15 Allied and 15 German divisions.
On the Eastern Front, by comparison, from 1941-44 more than 400 German and Soviet divisions battled each other along a 1000-mile front, and the Soviets succeeded in destroying more than 600 German divisions. (Overy, Why the Allies Won, 321).
Even as the Normandy Invasion was proceeding, the much larger Soviet Army was driving towards Berlin, destroying Germany’s main army group and costing the Axis powers nearly 4 million casualties.
Without this Soviet effort on the Eastern Front, therefore, the Normandy invasion could not have succeeded, and Hitler would probably have prevailed. By the time the invasion finally came, its main effect was to shorten the war in Europe somewhat.
Nor did the “Lend-Lease” aid provided to the Soviet Union by the US and the UK during the war prove decisive. Much more important was the fact that Soviet industry, relocated to the east, was able to out-produce Germany several times over in aircraft, tanks, and artillery pieces throughout the war.
Despite all this, the commemoration speeches given this weekend by Western leaders failed to even mention the Soviet contribution to the war effort.
>Strategic Performance – Japan. As for the war with Japan, it has long been recognized by military historians that it was distinctly less important than the war with Germany. Without the victory over Germany, the victory over Japan would have been impossible; with it, given Japan’s relative weakness, V-J Day was basically just a matter of time. Consistent with this, the war with Japan only consumed about 15 percent of the total US war effort.
After Germany’s demise in May 1945, the US, with some belated help from the Soviet Union, turned its attention to Japan, and quickly swept it out of China and the Pacific. By August 1945, when the US dropped its two atomic bombs, the Japanese were already on the verge of surrender. As Gar Alperovitz, the leading historian of Truman’s decision to drop the bomb, has argued, that decision was largely undertaken to impress Stalin, not because of military necessity or to save American lives.
IMPLICATIONS
All this is not to say that American World War II veterans do not richly deserve the honors that have been bestowed on them. Millions of them fought valiantly, at D-Day and elsewhere.
However, especially in these times, when the US has given in to the temptation to launch an unprovoked war largely on its own, it is important to remember how much assistance the US needed from allies in its most important victory ever -- and how it achieved its best results when it was fighting a clearly justified defensive war.
This viewpoint offers a helpful perspective on several other popular myths about World War II. These include (1) the myth that British intelligence breakthroughs like “Ultra” – a program that broke German encryption codes -- were critical to the war’s outcome; (2) the myth that US economic capacity provided the decisive edge; and (3) the myth that the atomic bomb had to be used to force Japan’s surrender.
This analysis also provides an interesting perspective on the many critics who have deplored the “tragedy” of the Russian Revolution and the brutality of Stalin’s forced industrialization campaign during the 1930s.
The unpleasant reality is that Tsarist Russia had barely held its own against Germany during World War I. It is very unlikely that a Tsarist regime, or even a Kerensky-style liberal capitalist regime, could have achieved anything like the rapid industrial development that Stalin accomplished during the 1920s and 1930s. In retrospect, there is little question that Stalin's crash industrialization program permitted the Soviet Union to acquire the industrial base that proved to be essential for the defeat of Nazi Germany.
So it is all very well for First World liberal democracies like the US and the UK to look down their long noses at Russia's developmental misfortunes. But perhaps we in the West should at least acknowledge our debt to the Russian Revolution, Stalin’s brutal industrialization program, his millions of victims, and especially the long-suffering Russian people. If they accomplished nothing else, at least they saved our liberal democracies from fascism. This is hardly an apology for Stalin. But if anyone deserves to be called “the Greatest Generation,” it was the generation of Russians that had to face down both Stalin and Hitler during the 1930s and 1940s.
This does not imply that the Normandy invasion was worthless. What it really may have accomplished was not so much the defeat of Hitler, per se, but a more balanced post-war political division of Europe. With the Soviet Army in control of Germany and perhaps even much of France and Italy after World War II, the post-war history of Europe might have been very different. In that sense, the value of D-Day was less a matter of defeating Hitler than of affecting the division of Europe with Stalin.
THE FORGOTTEN GREATEST GENERATION
In retrospect, there is one group of American veterans that unquestionably deserves to be included in the “greatest generation” -- although it is never mentioned in veteran tributes, and none of its members have ever qualified for US veteran benefits.
This is the all-volunteer group of 2800 Americans that journeyed to Spain at their own risk and expense in 1936-39 to serve as members of the Abraham Lincoln Battalion in the Spanish Civil War.
A ragtag army, mainly consisting of leftists, union members, and many American Jews, they joined forces with some 56,000 other international volunteers from more than 50 countries, and fought against overwhelming odds to defend Spain’s democratically-elected Republic against General Franco’s army, which was openly supported by Hitler’s Germany and Mussolini’s Italy.
Most of these volunteers were amateur fighters, without any military training. The arms embargo that was enforced by the future Allies -- the UK, France, and the US -- against the Spanish Republic, but not against Germany or Italy, prevented the Lincolns and their comrades from having adequate arms and munitions. As a result, combat fatalities were very high. More than a third of the Lincolns died in battle, a rate even higher than the 20-30 percent fatality rates recorded by US troops on D-Day.
Meanwhile, the US, the UK, France, and most other European countries (except the USSR), heavily influenced by conservative business elements and the Catholic Church, concocted the “non-intervention pact” that prevented arms and other aid from reaching the Spanish Republic. These Western countries also stood by and watched while Germany, Italy, and their allies aided Franco, seized Ethiopia, butchered China, and occupied Czechoslovakia. Throughout the 1930s, major US firms like GM, Texaco, Exxon, DuPont, Alcoa, and IBM, as well as Wall Street firms like JPMorgan, Brown Brothers Harriman, and Citibank, also continued to trade and invest, not only with Franco, but also with Hitler’s Germany and Mussolini’s Italy.
In the aftermath of the civil war, the Lincoln Battalion members continued to prove their heroism and commitment. After Franco won the civil war in 1939 and the Lincolns returned to the US, they were labeled as “premature anti-fascists” by the US government, prevented from holding government jobs or joining the military. To continue to fight the fascists, they had to enlist in foreign military services. In the 1950s, during the McCarthy era, many were blacklisted and otherwise persecuted. Despite such pressures, they continued to play a leading role in progressive causes throughout the last fifty years, right down to leading recent protests against the invasion of Iraq.
Ultimately, in the late 1970s, Franco’s dictatorship – which was supported by the US Government after World War II for nearly twenty years – gave way to the return of democracy in Spain.
Finally, in 1996, as a tribute to the Lincolns’ sacrifices, the “Second Spanish Republic” celebrated the 60th anniversary of the Spanish Civil War by welcoming those American veterans who had managed to survive to a commemoration ceremony in Madrid -- a modest version of this weekend's ceremonies at Normandy. Spain, at least, recognized that these American veterans had been the forgotten members of the Greatest Generation, whose courageous efforts were never properly honored by their own country.
SUMMARY
American veterans like my father and uncle certainly displayed extraordinary courage, sacrifice, and heroism during World War II. But, as they would have been the first to admit, in some ways they actually had a relatively easy time of it. This was not only because they had a great deal of help from allies like the Russians, the British, and the Chinese (against Japan). It was also because, as noted earlier, their war was perhaps the most clear-cut struggle ever fought between good and evil.
The veterans of all too many other American wars have had to face the fact that their wars were much less virtuous. Bearing that kind of understanding honestly certainly requires a special kind of courage and sacrifice. Perhaps that is why there is so much enthusiasm for the World War II festivities -- perhaps there is a hope that some of that war's glory will rub off on these other efforts. Those of us who have an opportunity to prevent our overly-adventurous leaders from launching such misconceived efforts, or to bring them to a halt as soon as possible, have every obligation to do so.
Finally, as we saw in the case of the Lincoln Battalion, still other American veterans have sometimes had to defy their own country’s policies of the day in order to fight for justice, and then pay a very heavy price for being “prematurely” right.
This year, as we honor those who helped to defeat global fascism, we should also honor those who were among the very first to take up arms against it.
(c) James S. Henry, SubmergingMarkets.com, 2003. Not for reproduction or other use without express consent from the author. All rights reserved.
June 6, 2004 at 09:47 PM | Permalink | Comments (0) | TrackBack
Wednesday, June 02, 2004
"Letters from the New World" (Ukraine): #1.""Schwartzennation" - Microwave Democracy"
"SCHWARTZZENATION" - MICROWAVE DEMOCRACY
From where I sit, here in Kiev, it seems that the United States of America has become a nation of super-people. At the cost of a very few lives it has defeated an army of hundreds of thousands in Iraq, and occupied a country of 25 million. Like the Spanish conquistadors facing the Incas, America appears to be an era ahead of the rest of the world. And just like the Incas facing the conquistadors, the world is ambivalent towards America, fascinated yet fearful, trying democracy and Wrigley's Spearmint Gum for the first time.
If an American soldier dies in Iraq, every inhabitant of our planet learns about his death almost instantly: a giant falls with a thud. When a crowd of Iraqis carried the helmet of a dead American soldier it seemed like it took fifty of them to carry it. Yet like every giant from a fairy tale, the American giant has a vulnerability that may prove its undoing.
Historical eras are often distinguished from one another by technology, both industrial (how things are made) and social (how people interact). A key secret of America's economic and political advantages lies in its use of pioneering social technology, especially the concept of “win/win.”
There are countries where for every ten people who enable there are eight, ten, or twenty of those who destroy or impede. Per capita productivity in Russia is one tenth that of the US. Does this mean that a Russian can’t lift a five-pound sack of potatoes? No, it means that if a Russian wants to open a hot dog stand, a bandit and a tax collector immediately visit him. In America, one of your neighbors works to feed you and another to educate you. In Iraq, one neighbor spies on you and another teaches you hatred instead of arithmetic.
It is “win/win” social cooperation, supported by social values and a legal system that Americans often take for granted, that opens the way for the introduction of new technology, not the reverse: industrial technology can be used only if your neighbors realize that your personal success will in turn help them advance their goals. America has long since accepted the basic premises of “win/win,” and this is what helps to make the American soldier grow a hundred feet tall.
But technology is a human attribute, not the essence of what a human being is all about. Technology, both social and scientific, has helped to make America successful, but America is in danger of neglecting human character, proposing solutions that are purely technical, and thus may well be inadequate.
This danger is nothing new. Paganism was a fascination with the technologies of nature: to be strong, people wore wolf's teeth or feather head-dresses. The Industrial Age worshiped the Machine Tool, a new God that produced everything, and people wanted to be like the Machine Tool's products: unanimous, marching in step, and wearing steel helmets. The Information Age proclaims: you are what you appear to be; ultimately it is all “bits and bytes.” If the celluloid Terminator can save the world, it follows that a human Arnold Schwarzenegger can save California. But is it really just a matter of technique and force?
If we compare a McDonald's to a French restaurant, we are likely to conclude that the McDonald's is cheaper, cleaner, faster, and friendlier. It is a triumph of technology, research, and training. The French restaurant has only two things going for it: you will not remember a McDonald's meal for the rest of your life, and you cannot propose at McDonald's. McDonald's stands for a satisfying technologically-assured result, but the French restaurant stands for life, whatever it is. McDonald's has a very useful role to play, but when it proposes itself as a substitute for a sit-down meal, there is a problem.
Too often, America says to the world, "Accept our technology because it really works." And indeed it does usually “work”, but the world does not want to accept it - it prefers to keep its old ways of life. People want to be, not just to appear. America wants the world to wear a mask of "nice" and “new,” but the world wants to keep its tastes and traditions, its blemishes, its uncertainties, and even its vices. It is not that the world wants to remain "bad": the world simply resists the notion that every problem has a technological solution. The world may not be ready for such “solutions,” or it may believe that there are problems that await spiritual rather than technological solutions. The technocratic side of America seems to be saying, "If your marriage is unhappy it could only mean that your marriage contract was not elaborate enough," but the world sees this as technocratic madness, the worship of a new false pagan god, even in the midst of America’s purported “spiritual revival.”
Of course democracy can be a reasonable goal, be it for Canadians or for Afghans or Iraqis. But when democracy is presented as a ready-made technological solution – three minutes in the microwave, with a pickle and a smile - then people will refuse to swallow this prepackaged sandwich. The world wants to slaughter the lamb, skin it, and eat it with their hands.
The world resists the American idea that politics (and art) are no longer about people, but about the application of various technologies – a democratic system of government being one of them. The Terminator saves the world not because he has the largest heart, but because, at the right moment, his guns make the greatest holes. The world sees this exclusion of people, with their hot beating hearts and their imperfect histories, as a serious threat.
India invented quiet contemplation and has congested, noisy streets; Britain invented good manners and reads the stolen letters of royalty; Russia stood for the soul elevated by beautiful literature, and so Russian prostitutes are the best-read in the world. The world abandons its values, and American culture pours in and rules. But the world understands that the American version of good is not good, and the stronger America becomes, the more it tries to impose its will, the more it will be resisted.
America should be extending its "win/win" spirit, which has been so successful at home, to its (belated) efforts to spread democracy abroad. It should not be turning itself into a fearsome giant that pretends that technology has made love, identity, and history obsolete. Just like in every fairy tale, at the end a single human child will defeat it.
June 2, 2004 at 07:30 PM | Permalink | Comments (0) | TrackBack
Monday, May 31, 2004
"Letters from the New World (South Africa)."Denis Beckett #4:""Kill Bill" Comes to Joburg"
KILL BILL COMES TO JOBURG
You’ve seen Kill Bill on the bus shelters – the blonde star in a grand prix tracksuit with a samurai sword. It’s all over the press, in phrases like “TARANTINO’S TRIUMPH -- PAGES 2, 3, 6 – 7, 10 & 12.” The word “brilliant” appears repeatedly, with superlatives, as in absolutely brilliant, amazingly brilliant, astoundingly brilliant.
The impression I got was of a violent movie, brilliantly handled so as to make the violence a light-hearted, merry kind of violence. Additional brilliance apparently lay in the “referencing” by which the film borrows techniques from prior films so filmgoers can have detective fun whispering “Look, Mabel! That’s from Slicing Off Noses, 1989.”
This was intriguing. If we in the print world “borrow”, we are called plagiarists and sent to the back doors of restaurants. The notion of cute violence left me a bit at sea. And I felt a pique on behalf of local films, to whom local ink is the difference between life and bankruptcy and who beg for attention. Was this Kill Bill that much better? I bought a movie ticket.
I lasted half an hour. I might have managed more if they provided sick-bags like on an airplane, but I’m not sure: I also had a pressing urge to shower, with lots of soap.
There were brilliant bits, I suppose. The gentle way a female voice sings the violent opening song is creepily brilliant. And the blood-splattering seemed sort of brilliantly done, the first dozen or so times. It splattered vividly, anyhow.
But I couldn’t see a basic, special brilliance, or what is brilliant in a mother being blood-splatteringly killed in front of her 4-year-old, or another mother being killed (with splatter) on the bed under which her daughter is hiding. Or in the daughter getting a turn to splat after “luckily”, I quote, discovering that her mother’s murderer is a pederast.
One scene imprinted itself: a male nurse hiring out the body of a comatose female patient, Vaseline thrown in leeringly. It’s effective, I grant, wedging like a gallstone in my memory lobe, but “brilliant”? I see sordid. I see sickening. I can’t see brilliant.
The world takes all kinds. Some people need brutality; it speeds their adrenalin. Fine, they’re welcome; rather have it on the screen than in the street (where if they saw a fraction of Kill Bill they’d be in trauma counselling for ten years.)
But these aren’t the only people. I might be a drip, a sissy and chicken, but I’m not unique. If I feel dirty watching this, so do other people, some of whom have been led to see this film as compulsory viewing, advancing mankind’s frontiers. Some are revolted but embarrassed to be revolted, and scared to admit they’re revolted. I think it’s they who laugh at the foulest scenes. Their heads say that what they’re watching is disgusting, but society says that what they’re watching is brilliant. They feel they must be inferior people, gutless or dull. So they don disguises, outdoing each other’s enthusiasm.
Why does society give them so one-way a message? Could some critics be caught in the same syndrome: “if I don’t enthuse over this film I’ll be derided as an inferior person, gutless or dull”? Even the few who denounce, denounce with escape hatches: it’s racist, because the white chick chops non-white heads off; or it’s boring, because its plot plods. Is there a taboo against saying what many viewers surely want to say, that the film is demeaning and psychopathic, and to wish to puke in your shoe is a healthy response?
I think something pathetic is on the go. Once, reviews had to uphold motherhood and apple pie. You couldn’t break out. When breaking out first became permissible, it was called freedom. Now the breaking out has become its own new prison. You don’t dare be seen to not push limits. People will think you’re off the pace.
It’s okay that weird tastes are in the world. There always have been. But that the taste for beauty, for honesty, for integrity, for humanity, for the things we inwardly want… that that taste is now a furtive thing and hidden, is wrong.
May 31, 2004 at 07:46 PM | Permalink | Comments (0) | TrackBack
Tuesday, May 11, 2004
Iraq: From a Strategic Standpoint, We've Already Lost -- Strategic Anomalies, Denial, Redoubling, and the Case for Withdrawing Now

In the wake of the recent upsurge in popular resistance in Iraq, and the evidence of soaring hostility among ordinary Iraqis and Iraqi elites of all persuasions toward the US-led occupation, a growing number of seasoned US military professionals are concluding that, from a strategic viewpoint, the US is either just now losing the "Iraq Peace," or may have already lost it, and should now focus its attention on accelerating the withdrawal of Coalition Forces.
This viewpoint, combined with harsh criticism of the "Bush/Cheney/Rumsfeld/Wolfowitz/neocon clique," is becoming more and more widespread among senior Pentagon officers and planners, though most are still reluctant to go on the record. Of course the "usual suspects" on the American Left, like Nader, Chomsky, Jonathan Schell, and Howard Zinn, are ahead of the pack on this issue. But the interest in "withdrawal now" is also gaining ground among some conservative intellectuals, like the New York Times' David Brooks, who argued forcefully in an editorial just this week that the US needs "to lose in order to win" in Iraq. Support for withdrawal is also gaining ground with the American public at large, as noted below. But as in the case of the Vietnam War, the masses are way ahead of their "leaders."
Already, we've heard this revisionist view expressed in public by such Pentagon strategy heavyweights as former Reagan National Security Agency boss General William C. Odom, Army Major General Charles H. Swannack Jr., Commander of the 82nd Airborne Division, and Army Colonel Paul Hughes, who headed strategic planning for the Coalition Authority in Baghdad.
Even before the War, some senior officers, like former Army Chief of Staff Eric Shinseki, had grave concerns about the feasibility of Rumsfeld's war plan. But those had to do mainly with resources and tactics -- whether the plan provided for enough troops and heavy armor, and so forth.
In the last few months, however, the deepening concerns have shifted from tactics to strategy -- as in, do we really have one?
By this we mean:
- Does the US have a clear "definition of victory" and a long-term strategy for accomplishing it? Are these the same goals that it has announced publicly, or are there others? Is it now just floundering reactively from crisis to crisis, wishfully hoping that things will somehow work out, while getting locked in to a vicious cycle of anti-Americanism and violence? Even worse, as in the case of Vietnam, are US leaders just staying the course and sacrificing lives mainly for domestic political reasons, or because the US fears appearing to have been "defeated?"
- Are the initial or revised goals realistic, not only in terms of military might, but also political, economic, diplomatic, and moral capital? Has the US reached the point where -- as in Vietnam in 1967-68 -- these goals of the war are no longer feasible, either because, as in the case of WMDs, they were based on misinformation, or, as in the case of "democratization," they may be inconsistent with continued US occupation, or have an unacceptable price tag?
- What are the real long-term costs of the current strategy likely to be, in terms of both "direct" and "opportunity" costs, and costs to credibility, image, and international relationships, as well as human and cash costs? How badly were these costs underestimated by the war's planners? Do we have any reason to believe that cost prediction has improved?
- What impact is this war having on other fronts in the war on terrorism? Has it become a costly distraction? Is it actually helping the terrorist cause, by providing a rallying point, an enticing opportunity to strike at US troops and foster internal divisions in Iraq, and a new source of armaments? If the US withdraws now, would that strengthen or weaken global terrorism? Would it clear the way for other countries or the UN to become more involved?
- Is a continued substantial US military presence in Iraq an obstacle to peace and security, and a source of increased religious and ethnic polarization in Iraq and the Middle East in general?
- Rather than announce increased, prolonged US troop commitments, isn't this the time for the US to announce a specific schedule for a US troop withdrawal, perhaps contingent on a ceasefire?


As we'll see below, the overall answer is that a fundamental Iraq strategy rethink is overdue. We need to take into account the many "anomalies" that we've discovered in the last year, which have fundamentally undermined the original strategy behind the War.
When policymakers find their pet strategies challenged by such anomalies, however, their first response is usually to dig in.
In the case of the Vietnam War, for example, most top Democrats and many Republican leaders had already agreed by the end of 1968 -- in private, at least -- that no "strategic victory" was feasible at an acceptable price, and that a US withdrawal was indicated.
Shamefully, largely for reasons of cosmetics, the war was continued and even expanded during the next four years. At the time, both President Johnson and President Nixon were terribly concerned about "peace with honor" -- the country's appearing "weak" before the supposedly unified global Communist menace.
That unity soon proved to be chimerical, with Vietnam actually fighting China and Cambodia in the mid-1970s. But it took the US from 1968 until March 1973 to remove its last combat troops. And two more years of fighting were needed before the inevitable reunification of the artificial, largely US-inspired creations, "North" and "South" Vietnam. Today, of course, Vietnam remains united and "Communist," but it is known to us mainly as a worthy supplier of shrimp and coffee, and a hefty World Bank client.
As Senator John Kerry of all people must remember, those four extra years cost an additional 21,000 American lives, plus over 1.5 million extra Vietnamese, Laotian, and Cambodian lives. For what? Indeed, as Henry Kissinger himself admitted in a 2001 interview with documentary filmmaker Stephen Talbot, when asked about what difference it would have made if Vietnam had gone Communist right after World War II,
"Wouldn't have mattered very much. If the Vietnam domino had fallen then, no great loss."
So, according to Kissinger, the architect of that war's misguided strategy for withdrawal (and numerous other policy blunders), 58,000 Americans and 3 million Vietnamese, Cambodians, and Laotians basically died for nothing. This precedent is worth thinking about, as we each decide individually whether to continue to sit and watch while yet another cockamamie "national security" strategy chews up thousands of innocent lives.
DENY AND REDOUBLE
Unfortunately, the first response to strategic anomalies is usually a denial (or reinterpretation) of the new evidence, followed by a redoubling of efforts to make the tired old strategies work.
Where, as here, senior politicians who are also running for office are in charge, these tendencies are reinforced, since they fear being branded by their opponents as "inconsistent." As in Vietnam, the result could easily be over-commitment to a pipedream, ending in an eventual forced withdrawal that is much more costly than it needed to be -- and yet another young generation of Americans that never quite views their country in the same way.
Despite all this, you wouldn't guess from our President or his Democratic challenger that the US is facing any strategic crisis whatsoever in Iraq.
Both major parties' senior politicos and bosses are stuck like deer in the headlights, committed to the same pro-war strategies they all supported last year, as if nothing has been learned since then. Locked in a tight contest, both candidates are running toward the center of the field at full speed, and not paying much attention to the new hard facts on the ground.
Indeed, Senator Kerry, President Bush, Senator Clinton, Senator Lieberman, Senator Biden, Senator Lugar, and almost all leading Democrats and Republicans appear to be in violent agreement about the Iraq War, except on picayune tactical details. Consistent with the "denial/redouble" pattern, all of these "leaders" are also now calling for a significant expansion of US troop commitments in Iraq. By Bush's estimates, this would require at least 135,000 to 160,000 US troops in Iraq through 2006. But there is no reason to expect that two more years would be enough, if the current strategy is continued. Indeed, as discussed below, the Pentagon is already proceeding with a little-noticed plan to build 14 permanent US military bases in Iraq.
All these leaders, including Bush, also now give lip service to some kind of increased role for the UN in Iraq, without defining precisely what that would be. They are all a bit too glib about how they expect the UN to reenter. Presumably this would be by way of a new UN resolution, but it is not at all clear how much "control" the US is willing to yield to its old nemeses on the Security Council, France, Germany, and Russia. Most important, if there is no fundamental improvement in the security situation, the UN is unlikely to want such a thankless job. But improving the security situation requires, as we'll argue, a new view of where the insecurity is coming from, and a US/UK withdrawal timetable.
All this implies a substantial increase over the $175 billion that the US has already spent in Iraq -- Bush's latest $25 billion supplemental budget request is consistent with an annual "run rate" of at least $50-$60 billion a year. If recent casualty rates are any indication, this also implies at least another 1500-2000 US war dead through 2006, and probably 5-6 times that many US wounded, not to mention thousands more Iraqi dead and wounded, including many civilians.
In Senator Kerry's case, this lack of leadership is especially disappointing. As he said recently,
"Americans differ about whether and how we should have gone to war, but it would be unthinkable now for us to retreat in disarray and leave behind a society deep in strife and dominated by radicals."
One might have hoped that Senator Kerry would have learned something about the high costs and radicalizing effects of dead-end wars from his years with Vietnam Veterans Against the War, if not from his four months on a gunboat in Vietnam.
PLUMMETING SUPPORT FOR THE WAR
According to the latest USA Today national poll, taken May 10, 2004, in the wake of the disgusting Abu Ghraib prison abuse scandal, for the first time since the initiation of the war in March 2003, over half of Americans -- 54 percent -- now believe that it was a "mistake" to send US troops to Iraq. This is a dramatic turnaround in less than six months -- and, interestingly, a much faster erosion of support for this war than occurred during the Vietnam period.
Fully 29 percent now believe that all US troops should be withdrawn now, up from just 16 percent in January, and another 18 percent believe that some troops should be withdrawn. At least 75 percent oppose expanding the number of US troops in Iraq.
Despite this, as noted earlier, expanding the troop commitment is still the official position of both President Bush and John Kerry, and of almost all other leading Republican and Democratic politicos. As in the case of the Vietnam War, the unwashed masses are far ahead of the "leaders."
At the moment, indeed, the only candidate who supports an immediate (six-month) withdrawal from Iraq is the stubborn 70-year-old independent candidate, Ralph Nader. Many Democrats still vehemently (and incorrectly) believe that he was the "spoiler" for a Democratic victory in the 2000 Presidential election. Whatever we may think about his candidacy, the fact is that at the moment, he is our only Robert Kennedy. To his great credit, Ralph has opposed this "preemptive war on a defenseless country" since the start, and is now advocating a withdrawal of all US troops from Iraq within six months.
Nader's opposition to the Iraq War has given him a burst to 5 percent in the May 10th polls. Perhaps Kerry could take a few lessons from Ralph with respect to his position on the war. The same poll that showed American support for the war plummeting, and Bush's approval dropping from 52 percent in mid-April and 49 percent in early May to just 46 percent this week, also found Kerry dropping two full points and losing to President Bush, 47-45, among likely voters. Meanwhile, Ralph gained two points, to 5 percent among likely voters.
Of course this is just one national poll, but if Kerry really wants to win in November, he'd better think about the implications of his me-too position on the war -- the main source of public dissatisfaction with Bush. As one analyst has suggested, he might even consider cutting a deal with Nader, getting Ralph to appoint the same list of Presidential electors as Kerry, in exchange for helping him get on the ballot in all 50 states. He might also consider adopting Ralph's more popular position on the war, as well as that of the National Council of Churches, which has just recommended withdrawing and turning control of Iraq over to the UN.
Oddly enough, this situation also gives President Bush an incredible opportunity. If he really wanted to ensure his victory over Kerry, the smooth move might be to reverse course and announce plans for a definite US withdrawal. If the polls are any indication, this "win by losing" approach to fixing Iraq would be very popular with the American people -- especially those who are wavering in the center. It would also be popular with many Bush supporters on the right, who fear that aggressive nation-building in Iraq may jeopardize other vital concerns on their agenda -- like gay marriage, abortion, more tax cuts, and new Supreme Court justices.
However, Bush, like Kerry, also seems determined to dig himself in even deeper on Iraq. This is what he did just this week by unnecessarily backing Secretary Rumsfeld in the Abu Ghraib scandal, despite the many calls from right and left alike for the Secretary's resignation.
STRATEGIC ANOMALIES - "REALITY CHECK, PLEASE!"
Meanwhile, as we've just learned in the towns of Falluja and Najaf, an aggressive US military presence may just lead to increased hostility. This is only one of many "strategy anomalies" that the war's architects -- Democrats, Republicans, and Blair's Laborites in the UK as well -- have encountered. There are many others. Most are already familiar to those who have followed recent events. But it is worth restating them, just to put the case in order.
1. Iraq's WMD Threat. Of course the first basic assumption, declared innumerable times in the fall of 2002 and early 2003 by US and UK officials during the run-up to the war, was that Saddam's Iraq posed a grave threat to the US and its allies. Either it already possessed WMDs and the means to deliver them, or was actively attempting to acquire them.
A key related assumption was that this threat could only be removed by an immediate US invasion, and the complete removal of Saddam from power. The UN weapons inspection program, according to the war's supporters, had been a failure.
In this regard, it is also important to note that the removal of Saddam's regime from power was never a goal of the invasion per se -- apart from the reduction of the WMD threat, reduced terrorism, and democratization. After all, the world is filled with lousy governments, and just replacing Saddam's nasty regime with another nasty regime could never have justified the invasion. So while the supporters of the war have often trumpeted Saddam's removal from power as a sign that we have already triumphed, in fact this depends on whether or not these other goals are achieved. And this is very much in doubt for all of them.
Reality Check, Please: Of course no WMD stockpiles or serious WMD programs have been found, after months of searching by thousands of highly-trained US and UK personnel.
It also now appears that the UN weapons inspection program was in fact very successful at identifying whatever WMD programs Saddam had, and at getting him to curtail them. For all its imperfections, the UN approach worked.
Indeed, if, as France, Germany, and Russia proposed, weapons inspection had been permitted to continue, the war might have been avoided completely, or, at worst, eventually proceeded with better preparations and much broader multilateral support, as in the 1991 Gulf War.
That, in turn, would have meant less US influence over post-war Iraq (as in military bases and oil). But the costs to the US and Iraq would have been much lower, and the transition to peace and a new representative government much smoother and less violent.
2. Iraq's Role in Supporting Terrorism (Pre-War). The second key strategic premise for the war was that Saddam's Iraq was aiding al-Qaeda and other global "terrorist" groups.
Reality Check, Please: In fact one of the few bona fide pre-war "terrorists" who was living in Saddam's Iraq turned out to be the aging Abu Nidal, who had been inactive since the mid-1980s. Abu Nidal was reportedly suffering from leukemia, but he died of multiple gunshot wounds in Baghdad in August 2002, long before the invasion -- perhaps the victim of an attempt by Saddam to head off the coming invasion.
The only other "terrorist group" operating in Iraq before the war was Ansar al-Islam, which was located in northern Iraq in the "no-fly" zone, outside Saddam's control. Its headquarters could easily have been bombed at any time. But the US chose to wait until after the war started, so that it could say that it had actually destroyed some terrorists.
Beyond this, no definitive pre-war links between Saddam and al-Qaeda have ever been established. As former Bush Administration counter-terrorism czar Richard Clarke and many other experts have long argued, Saddam and al-Qaeda were, if anything, antagonists, and even if he had had WMDs, Saddam was not about to share control over WMDs with a radical like Bin Laden.
The US also made much of the alleged medical refuge that Saddam allowed a Jordanian sympathizer, Abu Musab al-Zarqawi. The US claimed that Zarqawi had links to al-Qaeda. But in fact it seems that his organization, al-Tawhid, was actually a rival to al-Qaeda before the war, focused on overthrowing Jordan's King Abdullah. Of course Zarqawi's influence in Iraq does appear to have been strengthened by the US invasion.
3. Iraq's Role in Supporting "Terrorism" (Post-War). Whatever the details of Saddam's links to global terrorism were before the War, it was assumed by the war's supporters that the invasion would reduce Iraq's role in global terrorism.
Reality Check, Please: In fact just the opposite has occurred. Since the invasion, Iraq has actually become a terrorist Mecca, with anti-US fighters from all over the Muslim world pouring into the country across its now-wide-open borders, eager to kill Americans. They have no need to bring automatic weapons, grenade launchers, mines, or explosives. Saddam's huge stockpiles of these ordinary weapons have been very poorly secured by the under-manned Coalition Army. Arms have also reportedly been offered for sale by members of the new Iraqi police force.
So, as Bin Laden's most recent recorded messages have made clear, far from being a "defeat for terrorism," the Iraq War has actually been something of a boon -- rather like the December 1979 Soviet invasion of Afghanistan ultimately proved to be. The Soviets, we recall, lasted nearly 10 years in Afghanistan, at a cost of more than 15,000 Soviet lives and hundreds of thousands of Afghan lives. The US is of course vastly more powerful than the Soviet Union. And, unlike that situation, there is no foreign aid available to the Iraqi resistance. Still, as noted earlier, such aid may not be necessary here. And the US is already well on its way to keeping pace with the Soviet casualty count.
Not only has the Iraq War provided opportunities for young radicals to secure weapons and attack Americans close at hand. It may have also distracted some resources from the hunt for other terrorist organizations. Most important, it has antagonized the whole Muslim world, providing the radical factions a wonderful opportunity to recruit new supporters.
4. Iraq's Warm Welcome for US "Liberators" The US also assumed that we would get a warm reception from Iraqis, as "liberators" of Saddam's Iraq.
The US war planners also assumed that Iraqi nationalism was weak, and that resistance would disappear after Saddam and his "dead-ender" henchmen were gone. They defined "victory" as the removal of Saddam and his Ba'athist regime. They also assumed that the Iraqi Shiite and Sunni communities were fundamentally at odds, and that there would be little opposition to the Coalition Forces outside the so-called "Sunni Triangle."
Reality Check, Please: The US invasion created a wave of genuine nationalism that spans Sunni and Shiite community lines and has helped to unite radical factions in each against the "occupiers," just as they united against the British in the 1920 revolt. The sharpest fighting in Iraq has taken place in just the last month, long after Saddam and all but ten of the other 55 "most wanted" Ba'athist leaders were killed or captured.
Furthermore, just 17% of all Sunnis live in the so-called "Sunni Triangle" north of Baghdad -- Iraq's Sunni and Shiite communities have historically been much closer than in other Muslim countries. Indeed, one key challenge faced by "foreign fighters" like Zarqawi has been to try to divide them. In the wake of the continued US occupation, the US has experienced growing armed resistance from Shiites and Sunnis alike, as exemplified by the May 10 US strikes against the Shiite leader al-Sadr's headquarters in Baghdad. While the vast majority of Iraqis are still watching the battle from the sidelines, a majority now supports an end to the US occupation, believes that Coalition Forces have conducted themselves badly, and believes that the Coalition will not withdraw until forced to do so.
In this situation, destroying the Ba'athist Party turns out to have been insufficient for a return to peace and security. Nor, given the importance of "ordinary" former Ba'ath Party members in the educational system, the civil service, and the police, was it a necessary condition. It was, in fact, just a dumb move taken by Paul Bremer in the early days of the occupation, under pressure from Chalabi's INC, and recently reversed.
5. "Liberators" Vs. "Occupiers" The pro-war strategists also assumed that thousands of US and UK troops could be counted on to conduct themselves in Iraq as proper "liberators."
Reality Check, Please: The grim reality is that the US and UK forces have hardly distinguished themselves as "liberators." Rather, they were rushed into duty without adequate training or acculturation, with few skills in Arabic. They had also been encouraged from the top of the Bush Administration on down to believe that Iraqis had something to do with 9/11. As a result, many of our troops have behaved very badly toward ordinary Iraqis -- like crude, rude barbarians. As the events at Abu Ghraib prison have dramatized, there have been widespread human rights violations, ethnic slurs, religious slights, and indignities to Iraqi women. The result has been yet another public relations debacle for the Coalition Forces.
6. Iraq's Support for "Acceptable" Democracy. Yet another key strategic assumption was that in a relatively short time, the Coalition and its Iraqi allies would be able to lay the foundations for an "acceptable" democratic system.
In the US vision, this would be one that would be (a) reasonably representative, (b) able to avoid the kind of popular theocracy that has characterized Iran, (c) pro-US and at least neutral towards Israel, (d) able to maintain a unified federal system, including the Kurds, and, of course, (e) willing to go along with other key "imperial" requirements, like permission to build 14 long-term US military bases in the country, the use of oil revenue to defray the invasion's costs, and the opening of Iraq's oil resources to foreign investors like ExxonMobil and BP.
Saddam's Removal Alone Not a Victory. In this regard, it is important to note that simply removing Saddam and his associates was never a goal of the invasion for its own sake -- apart from the goals of removing the WMD threat, cutting his alleged support for terrorism, and ultimately installing democracy. In other words, the world is filled with nasty dictators -- there had to be some special reason for singling him out. And no one argued that simply replacing him with yet another nasty dictator would justify the war -- apart from achieving these other goals. So the fact that he and his regime have been removed only "justifies" the war if, in fact, we are able to achieve these other objectives. So far, as we've seen, the WMDs, reduced terrorism, and democratization have all proved elusive.
Reality Check, Please: In fact it has been impossible to square all these various requirements with each other. As of April 2004, after a year of occupation, while 82 percent of Iraqis still support "democracy" in the abstract, outside Kurdistan most Sunnis as well as Shiites are also opposed to a rigid separation between church and state. There is, at this late date, still no complete draft constitution that the various key interest groups in the Coalition-appointed Iraqi Governing Council have been able to agree on.
Ordinary Iraqis, it turns out, are also highly critical of the US, the UK and Israel. They are also highly critical of the Iraqi Governing Council created by the Coalition Forces, which is widely viewed as a puppet government. As the Shiite leader Sistani has said, "We want elections as soon as possible."
Finally, Kurdistan, the one part of the country that is now stable and enjoying economic recovery, also strongly favors complete independence, not federation. Since Kurdistan is one of Iraq's richest provinces -- the original source of much of its oil -- the rest of Iraq is determined to prevent this. As if we needed one, this issue provides another potential source of conflict. The whole country is a bubbling cauldron of such regional, ethnic, tribal, religious, and anti-foreigner feelings, and we have turned up the fire.
7. Reestablishing Security/ Avoiding a Military Draft. Yet another key strategic assumption was that it would be relatively easy to reestablish security with a "politically acceptable" commitment of less than 150,000 Coalition troops. This force level was supposed to diminish over the year, complemented by a large number of private security contractors.
This was supposed to be feasible, despite Bremer's disbanding of the entire pre-existing Ba'athist police and military structure, the fact that Saddam had emptied his prisons of all criminals prior to the invasion, and the fact that Coalition troops had little training in policing, crowd control, or non-lethal weaponry.
One implication was that the US and UK troops expected to rotate home on regular schedules, without undue burdens on their families and morale.
Another was that a new Iraqi national police force would be able to provide an adequate substitute for the Coalition's policing activities.
A third was that military forces from other countries, or the UN, might also become available to back-stop US and UK troop commitments, as the security situation stabilized.
Reality Check, Please: All these assumptions about security have proved false. Almost incredibly, the US military repeated the same exact mistake that was made in Haiti during the 1990s. The complete disbanding of Iraq's military, with no adequate substitute, played a key role in the initial looting that occurred in Baghdad in April, 2003, and the general crime wave and insecurity, especially in Baghdad, that has continued ever since.
It also turned out to be much harder than expected to "train up" an Iraqi police force willing to stand and fight (for what? the unelected IGC?) As the Iraqi resistance became more violent, this problem escalated, to the point where, during the recent turmoil in Falluja, more than 50% of the new police force graduates defected or disappeared into the crowds.
The continuing security problem, in turn, scared many private contractors out of the country, and jeopardized the whole schedule for Iraqi reconstruction, which has basically ground to a halt. This exposed yet another dubious assumption by the war planners -- the decision to rely so heavily on private contractors for security services and reconstruction.
The security crisis has also prolonged service terms for US troops, and led many of them to be given assignments to "policing functions" for which they were never trained. That, in turn, has led to even greater frustration among US troops -- more than 40,000 of whom are members of the US Army Reserves or National Guard. It has also bred increased hostility between Americans and Iraqis, many of whom the troops now view as "criminals who hate us."
The resulting morale problems have caused US Army Reserve and National Guard enlistment and reenlistment rates, as well as regular military enlistment rates, to plummet to 30 year lows. Another byproduct of the continuing security nightmare is that it has been very difficult to get other countries, or the UN, to maintain, much less expand, their troop commitments.
If Iraq's security situation continues to demand increased Coalition troop commitments, and reenlistment/ enlistment rates don't improve, some observers have even speculated that the US might be forced to reintroduce a military draft. For the moment this appears unlikely, unless some new "front" opens up in Syria, Iran, or North Korea. But those of us with college-age children take no comfort from the fact that several members of Congress have already introduced the necessary legislation.
8. Modest, or At Least "Acceptable," Costs. The last key assumption was that this whole effort could be mounted at a relatively modest, or at least politically-acceptable cost, not only in terms of direct financial costs, but also human lives, and "opportunity costs" as well.
Evidently it was assumed by some planners -- Deputy Defense Secretary Wolfowitz, for example -- that Iraq's oil production would resume quickly enough to let it make a substantial contribution to funding the costs of the War. It was also assumed, as indicated earlier, that security would improve rapidly after Saddam's demise, and that the new Iraqi police force would be able to substitute for US troops at a fraction of their cost. Finally, the super-optimists in the pro-war camp may even have assumed that part of the War's freight would be paid by other UN members, even though the Security Council was never permitted to rule on the final decision to go to war.
Reality Check, Please: The reality is that, just one year in, the Iraq War is a budget-buster, both in terms of cash and lives.
The Pentagon's cost accounting for this effort is inscrutable -- perhaps intentionally so, although it is never easy to know precisely what fraction of, say, a hospital in Frankfurt or "wear and tear" on a particular aircraft is properly assignable to a specific front. However, most estimates put the "sunk cost" to date of the Iraq venture at about $175-$180 billion, including the current interest on this spending, since all of it has to be deficit financed.
Going forward, there is now a continuing "run rate" of about $5 billion per month. In terms of real dollars, this is close to the peak $5.1 billion per month run rate for the Vietnam War. These numbers omit the costs incurred by the UK and other Coalition members, which have supplied about twenty percent of the troops.
Nor is Iraqi oil production anywhere close to covering these financial costs. Indeed, production has still not recovered to its pre-War levels, and the cost of securing Iraqi oil exports against increasing sabotage attempts is eating up almost all the profits.
Coalition casualties have also been much greater than expected. As of May 10, after 13 months of combat, the Coalition has sustained a total of 881 combat fatalities and approximately 4716 wounded, assuming that US "dead to wounded" ratios also apply to non-US Coalition Forces.
While these totals are well below those sustained during the peak years of the Vietnam War -- 1967-69 -- they are far greater than those sustained during the early years of that war, 1961-64, and comparable to the losses sustained by the US in 1965, the first big year of that war, allowing for improved survival rates due to better body armor and "just-in-time" medicine.
As for the Iraqis, officially, our "new Pentagon" no longer keeps track of "enemy body counts" much less civilians -- one major way in which Vietnam was indeed different, at least for PR purposes.
However, efforts have been made by some observers to keep track of Iraqi civilian fatalities reported in the press. While these statistics are probably an understatement, they indicate at least 9,016 to 10,918 Iraqi civilian deaths through April 24, 2004.
In addition, of course, there have also been at least 4-5 times this number of civilians wounded. In a country with a population of 25 million, this is quite a blow. For the 80 percent that is Arab, and has suffered almost all the casualties, this would be comparable, in US terms, to a loss of 100,000 dead and 600,000 wounded. Assuming that the average Arab family in Iraq has four members, that each family member knows 10 people, and each of these 10 people also knows 10 people, the entire Arab population of Iraq is within "two degrees of separation" of experiencing these losses personally.
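A rough back-of-envelope check of that arithmetic is below -- a minimal sketch in Python, using the article's own assumed figures for population shares, casualty counts, family size, and acquaintance circles. The 2004 US population of roughly 290 million is an added assumption, and the article's round numbers are somewhat more conservative than the raw scaling; the "two degrees of separation" count comes out close to the Arab population of Iraq.

```python
# Rough check of the per-capita comparison and the "two degrees of
# separation" claim above, using the article's own assumptions.
# The 2004 US population (~290 million) is an added assumption.

iraq_pop = 25_000_000
arab_share = 0.80                        # share of Iraq's population that is Arab
us_pop = 290_000_000                     # approximate 2004 US population (assumption)

civilian_dead = 10_000                   # midpoint of the 9,016-10,918 range cited
civilian_wounded = 4.5 * civilian_dead   # "4-5 times" the number of dead

# Scale Arab-Iraqi losses to a US-sized population.
scale = us_pop / (iraq_pop * arab_share)
print(f"US-equivalent dead:    {civilian_dead * scale:,.0f}")      # ~145,000
print(f"US-equivalent wounded: {civilian_wounded * scale:,.0f}")   # ~650,000

# "Two degrees of separation": a family of 4, each member knows 10 people,
# each of whom knows 10 more -- roughly 400 people per casualty.
people_reached = (civilian_dead + civilian_wounded) * 4 * 10 * 10
print(f"People within two degrees: {people_reached:,.0f}")         # ~22 million
print(f"Arab population of Iraq:   {iraq_pop * arab_share:,.0f}")  # 20 million
```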
Is it any wonder that we have already lost the peace?
LONGER-TERM COSTS
But the real long-term costs of this war are even higher.
Far from striking a decisive blow for "democracy and liberation" in the Middle East, and setting an example for other Arab countries to follow, this war has become a lightning rod for anti-Americanism, and a text-book example of hegemony run amok.
Far from teaching the world to respect and admire America's newfound power, our global reputation has plummeted to an all-time low.
The citizens of other countries that have practiced imperialism against their neighbors or their own peoples know what it feels like to be despised and hated whenever they travel. Americans are not used to this treatment. As a direct result of this war, as well as our other policies in the Middle East and elsewhere, we are going to have to get used to it.
Far from providing the world an inspiring example of our truthfulness and honor, the way in which the case was made for this war, and the way it has been conducted, have severely damaged our nation's credibility.
THE CASE FOR WITHDRAWING NOW
Few Americans doubt that we will someday withdraw all US troops from Iraq, as we did from Vietnam.
Probably most of them are not aware that, unlike in Vietnam, the Pentagon's military engineers are already hard at work designing and constructing 14 "enduring" military bases all over the country, in Baghdad, Mosul, Taji, Balad, Kirkuk, and near Nasiriyah, Tikrit, Fallujah, Irbil, and elsewhere.
Apparently the objective is to provide a substitute for our bases in Saudi Arabia, both in terms of oil and military presence. Presumably this is being done in part to keep the Saudis happy, because they are afraid that the presence of US troops on their soil excites domestic resistance. The influential Saudi royal family no doubt prefers to have US troops and Iraqi leaders facing down resistance in Baghdad than in Riyadh.
But of course we are still telling the Iraqis that they will "eventually vote and govern" their own country, and that we have no intention to "occupy" it.
In any case, for those who oppose this war, as well as for the vast majority of Americans who have swallowed the brave new lies that our only long-term interest in Iraq was to "remove Saddam's tyranny," "rebuild Iraq's economy and democracy" and "withdraw," the key issue is -- when should withdrawal start?
In light of the recent crisis, the war's architects have not been able to get on with their agenda quite so easily as they once hoped. Faced with the acute security crisis noted above, they now tell us that if we only increase the number of troops in Iraq for two more years, and provide the extra $120 billion that this will require, we can all return to the original smooth transition plan.
On the other hand, they also claim that, unless the Coalition stays the course, Iraq will disintegrate into civil war, instability, and chaos -- even worse conditions, somehow, than already exist.
To this hokum we say, first, as noted above, that the credibility of the war's architects is not exactly unsullied.
And now they seem to be proposing yet another episode of "Who Do You Believe -- Me, Or Your Lying Eyes?" As Peter Gutmann once remarked,
We're standing there pounding a dead parrot on the counter, and the management response is to frantically swap in new counters to see if that fixes the problem.
The good news is that the war's architects and pamphleteers are actually very few in number. As the New York Times' Thomas Friedman said in an interview with Ha'aretz, the leading Israeli newspaper, in 2002,
This is a war the neoconservatives wanted.....(and) marketed. Those people had an idea to sell when September 11th came, and they sold it. Oh boy, how they sold it. This is not a war that the masses demanded. This is the war of an elite. I could give you the names of 25 people (all of whom sit within a 5-block radius of my Washington D.C. office) who, if you had exiled them to a desert island a year and a half ago, the Iraq War would not have happened.
So if this tiny band was able to wield so much influence in our erstwhile democracy, and a much larger number now realize that their assumptions were wrong, shouldn't it be possible to reverse course?
Or is it the case that we are now locked into sitting through this whole dreary play?
We say that it is actually the continued occupation that is the greatest threat to stabilization, democratization, the restoration of the Iraqi economy and oil exports, and the preservation of a unified Iraq.
Indeed, Iraqi hostility to the US/UK occupation has now reached the point where securing these goals is much more likely if the Coalition forces withdraw as soon as possible.
This is true for several reasons:
- The announcement of a definite schedule for a US withdrawal will almost instantly cool the resistance, reduce the leverage of the radicals and the "foreign fighters," permit Iraq's beleaguered police force to focus on fighting crime, and allow the reconstruction of the economy to proceed.
- Mainstream Iraqis know full well how to restore order in their own neighborhoods, once the US provocation is gone. Absent the occupation, the vast majority of Iraqis will have little patience for foreign subversives like Zarqawi, and will throw the bastards out.
Indeed, an astute US withdrawal from Iraq would surely be as sad a day for "terrorists" as the US invasion of Iraq was a happy one.
- A clear withdrawal timetable would signal to the Iraqis that the US has no imperial intentions with respect to their oil wealth. It would permit them, indeed, to recover some of the pride that they must have lost by having to rely on a foreign power to get rid of their dictator -- even if they did not oust Saddam themselves, they could at least feel that they were able to oust the most powerful hegemon on the planet. That alone could provide the ideological foundations for a vast rebirth of national pride.
- Such a withdrawal would also permit us to remove the provocative presence of largely-Christian US troops in this overwhelmingly Muslim country. It would also clear the way for UN assistance and multilateral development aid, as well as debt relief, to flow more freely into the country.
- Few Iraqis want a return to dictatorship -- the demand for democracy is overwhelming. It is indeed odd and un-American for the United States of America, in particular, to insist that Iraqi democracy can only be established at the point of a gun, under the guiding hand of a foreigner. I don't recall the Founding Fathers at Independence Hall in Philadelphia requiring much help from Tony Blair's predecessors or the British monarchy.
- As General Odom has argued, a clear timetable for a US withdrawal would actually help the UN, or perhaps other Muslim states, to send in peace-keeping forces of their own.
- So long as the war in central and southern Iraq continues, the Kurds have every incentive to continue to move toward complete independence. Only the restoration of Iraq's central government, and the prospect of "win win" gains from interregional trade and development, can break down these regional ethnic barriers, and keep Kurdistan within Iraq.
- A US/UK withdrawal, according to a pre-announced timetable that is brisk, yet responsible, and conditioned on the preservation of security, will give Iraqis an incentive to observe and enforce their own general ceasefire, so long as they clearly see progress being made. Unlike the situation in Israel, where the occupiers have been stalling for more than 30 years on "security" grounds, the US has no settler minority that is trying to hold on to Iraqi resources, unless it is Chalabi's band of thieves, thirsting after an oil privatization. But we don't have to be hostage to his demands; if he becomes a problem, we can simply approve Iraq's new extradition treaty with Jordan, cut off his $300,000 per month allowance, and dump him over the border in Jordan, so that he can finally stand trial for bank fraud.
- If it did turn out that this little experiment with a pre-scheduled withdrawal failed, and the Iraqis themselves, perhaps with UN assistance, were unable to develop a peaceful government, there'd be nothing to prevent the UN or even the US from returning with a more rested, better trained peace-keeping force. After all, we're now pretty sure that the Iraqi Army is not about to fight us with WMDs!
- In the event that a civil war erupted in Iraq, and the situation somehow managed to deteriorate from the abysmal state where it is right now -- which 58 percent of Iraqis believe is "the same or worse" than before Saddam's removal -- peacekeepers would probably be welcomed by the majority of Iraqis. This is unlike the current situation, where only a third of Iraqis believe the Coalition forces are doing more good than harm.

All told, the case for a unilateral US withdrawal from Iraq seems very compelling. If the case for it is made to the American people by a leading political figure, it could also be politically very successful. But who, other than a marginal, if courageous and thoughtful, candidate like Ralph Nader, is willing to pick up this torch?
Where, indeed, is our Robert Kennedy? Where is the major US political figure who will stand up to this war?
Of course there are some self-styled "conservatives" in the audience -- people who have otherwise somehow found it possible to give a hearty "Sieg Heil" to one of the most radical, un-Constitutional, internationally illegal, risky, costly and irresponsible exertions of military power in US history -- who will no doubt argue, as they did in the case of Vietnam, that this abrupt policy reversal "might be risky."
After all, it might undermine US credibility! It might encourage the world's terrorists! It might make our own allies distrust us! It might jeopardize national security!
These are the same folks whose tidy little war plan has just sullied America's image and credibility almost beyond repair. It has poured hundreds of billions of dollars and thousands of liters of human blood into the sands of the Iraqi desert. It has helped make Iraq more of a sanctuary for the world's worst terrorists than ever before. It has alienated the entire world, and succeeded in making many Iraqis actually long for the old regime.
As Bertrand Russell once remarked, "The whole problem with the world is that fools and fanatics are always so certain of themselves, but wiser people so full of doubts."
Or, as Peter Gutmann might have said, the war's supporters are still trying to swap in new counters and pound them with the same old dead parrots.
May 11, 2004 at 05:15 PM
Sunday, May 09, 2004
04509. US Brutilitarianism Comes to Iraq - Part II: The Roots of Brutality


In the midst of all the hoopla and finger-pointing over Secretary Rumsfeld’s apology for the Iraqi prisoner abuses at Abu Ghraib prison, we seem to have avoided getting to the bottom of the fundamental question raised by all those ugly photos: why did it happen?
In other words, how could young American soldiers, raised in a nominally democratic, civilized “Judeo-Christian” society, and members of the world's most advanced military, which has no business being in Iraq if not to “liberate” it from precisely this kind of oppression, come to act in this way?
From this angle, whether or not Rumsfeld or a few military commanders resign is beside the point – a juicy chance for Senator Kerry and his supporters to make political hay, perhaps, but largely irrelevant to our understanding of these disturbing events and the prevention of their recurrence.
This is especially true if, as we will argue here, they may have been part and parcel of the very nature of this ethnically-divisive dirty little urban guerilla war.
ALTERNATIVE EXPLANATIONS
At this point, the official US investigation of the recent abuses at Abu Ghraib prison, as well as the press accounts, remains incomplete. Already, however, there are several conflicting explanations.
“Exceptional Evil-Doers.” As noted in Part I of this series, the prevailing view of US officials is the “bad apple” theory -- in President Bush's words, "the wrongdoing of a few." This explanation -- which has deep roots in American culture, dating at least as far back as the Salem Witch trials, and is also at the heart of our conventional view of "terrorists" -- attributes the problem to brutal, distinctly “un-American” misbehavior by a handful of “bad” people. In this view, this tiny group is clearly distinct from the vast majority of decent, Geneva Convention-abiding US military personnel. This explanation has been adopted by a wide variety of political and military leaders, from President Bush, Secretary Rumsfeld, and General Myers to Senators McCain, Kerry, and Clinton. It also appears to be the predominant view in the mainstream press, perhaps because it lends itself to the kind of lengthy profiles of soldiers that, for example, the New York Times and the Washington Post have both front-paged several times this week. It is also naturally more comforting to supporters of the Iraq War -- including all the leaders and newspapers just mentioned -- who view this scandal as an embarrassing, unhelpful distraction from the immediate task at hand, which is to get on with "stabilizing" the security situation in Iraq (e.g., crushing the resistance).
This kind of explanation is a standard one for individual criminal conduct in general. Typically it locates the roots of abusive behavior in the supposed predispositions of particular abusers to commit such acts. The contributing dispositional factors may vary -- pathological or "authoritarian" personalities, genetic defects, retribution for perceived injustices, inadequate schooling, too much TV, weak role models, or Salem witchery, for all we know. Whatever the underlying factors, the indicated prescription focuses on identifying and handling these “bad seeds,” and in this case, any individual commanders who may have also “failed” to supervise them.
(to be continued....)
May 9, 2004 at 02:55 PM
Thursday, May 06, 2004
0505. US Brutilitarianism Comes to Iraq - Part I: Rogue Behavior, Sheer Stupidity, or Something More?
Since all the claims about Saddam’s WMDs and other threats to global security are now in tatters, the goal of replacing his regime with a more humane one is one of the few justifications for the Iraq War that still has any credibility.
It is therefore deeply disturbing to learn that serious human rights violations, including several cases of torture and outright murder, may have been committed against scores of Iraqi prisoners by leading elements of the Coalition Forces, including the US Army Reserve Military Police, US Military Intelligence (INSCOM), the CIA, and private contractors like CACI International that have been providing so-called “intelligence collection” services to the Pentagon. Similar allegations have also surfaced about the British Army, although those charges may have been exaggerated by the Daily Mirror. This is of course in addition to the thousands of civilian deaths that our precision-guided military has produced all over the country with its "fire and forget" tactics.
A preliminary investigation of these charges by US Major General Antonio M. Taguba, disclosed last week by CBS News and the New Yorker, concluded in February that the alleged abuses included:
Breaking chemical lights and pouring the phosphoric liquid on detainees; pouring cold water on naked detainees; beating detainees with a broom handle and a chair; threatening male detainees with rape; allowing a military police guard to stitch the wound of a detainee who was injured after being slammed against the wall in his cell; sodomizing a detainee with a chemical light and perhaps a broom stick, and using military working dogs to frighten and intimidate detainees with threats of attack, and in one instance actually biting a detainee.
While this behavior pales by comparison with the sadism that was routinely practiced by Saddam’s minions on a massive scale, it has no place in a post-Saddam Iraq, and the US and UK certainly have no business encouraging it. Nor, despite the usual defences often heard in right-wing quarters for "selective torture" to deal with terrorism, is there any evidence that the interrogation methods employed here produced anything more than a public relations debacle for the US and its allies.
These charges may be new to American audiences, but this is far from the first time that they have been made. According to Iraq’s former Human Rights Minister, Abdel Basset al-Turki, who resigned on May 4th in protest over these allegations, US Pro-Consul Paul Bremer was put on notice about this widespread mistreatment as early as November 2003, but did nothing. The International Red Cross also reports that it has been complaining for months to the Coalition about methods "far worse" than those depicted in the photos released by CBS, but to no avail.
This is consistent with the brush-off that Paul Bremer and Condi Rice reportedly both gave to the concerns that Amnesty International first raised about conditions in Iraqi prisons way back in July 2003.
The Pentagon also now says that it ordered a “high-level review” of the issue last fall, but this must have had little impact on the ground, since the abuses noted above took place in December 2003.
Now that President Bush himself has finally pronounced these abuses “abhorrent” on two Arab television stations, the military may find more time to focus on the results of the "30 investigations” of these and related charges that it claims to have conducted over the past 16 months. In the wake of Donald Rumsfeld's passive "know-nothing" response to the escalating scandal, calls for his resignation are mounting, and the heads of more than just a few mid-tier officers may also have to roll before justice is done here. Certainly Rumsfeld has a lot of questions to answer, including why he said nothing to Congress about the investigation; why so many cases of brutality have been recorded; why nothing was done to change conditions at Abu Ghraib and elsewhere, despite numerous complaints over many months from Amnesty International and the International Red Cross; why private contractors were permitted to play such a large role in military intelligence; and, most important, why the rules of the Geneva Convention are not being enforced by US military and intelligence personnel.
All told, this scandal is shaping up to be a touchstone for the whole Iraq enterprise. It is worth trying to understand how this happened – and why, in particular, American soldiers who were supposedly raised in a democratic, more or less civilized country and trained by our sophisticated, extraordinarily-expensive modern military, should have participated rather gleefully in such bizarre behavior -- and even photographed it, too!
THE RHETORIC - “TORTURE IS ILLEGAL”
To begin with, lest all the civil and military servants in the audience need to be reminded, this kind of behavior, if substantiated, constitutes a clear violation of one of the most fundamental, widely-shared principles of international and US law – the absolute prohibition against torture.
This prohibition, which is as universal as the ones against slavery or piracy, extends to all prisoners of war, civilians, and all other war-time detainees. Indeed, while Iraqi insurgents may not be deemed to be part of the Iraqi armed forces, and therefore are not technically “prisoners of war” for purposes of the “combatant’s privilege” – the right to fire on enemy troops without fear of prosecution – they are still entitled to the same basic rights so far as interrogation is concerned.
It is interesting to see just how many times this prohibition has recently been repeated in international law -- especially since so many countries still routinely engage in the practice.
- The 1948 Universal Declaration of Human Rights, Article 5: “No one shall be subjected to torture or to cruel, inhuman or degrading treatment or punishment.”
- The 1949 Geneva Convention (4th Convention, Article 31): “No physical or moral coercion shall be exercised against protected persons, in particular to obtain information from them or from third parties.”
- The 1966 International Covenant on Civil and Political Rights, Article 7: “No one shall be subjected to torture or to cruel, inhuman or degrading treatment or punishment.”
- The 1975 Declaration on the Protection of all Persons from Being Subjected to Torture and Other Cruel, Inhuman or Degrading Treatment or Punishment, unanimously adopted by the UN General Assembly: “No state may permit or tolerate torture or other cruel, inhuman or degrading treatment or punishment. Exceptional circumstances such as a state of war or a threat of war, internal political instability or any other public emergency may not be invoked as a justification of torture or other cruel, inhuman or degrading treatment or punishment.”
- The 1987 UN Convention Against Torture, which focuses on official conduct, repeats the prohibitions noted above, and also provides that (Article 2): "An order from a superior officer or a public authority may not be invoked as a justification of torture."
All of these conventions have been signed and ratified by both the US and the UK -- although the Bush Administration has actually cooperated with countries like Syria, Libya, and Cuba to oppose efforts to give international inspection “teeth” to these anti-torture conventions.
Of course such prohibitions against torture, cruel and degrading treatment of prisoners prior to conviction, arbitrary detention, “cruel and unusual punishment,” and self-incrimination are also cornerstones of the American and British legal systems.
In the case of the US, they have at least a 200-year history, and are deeply embedded in the US Constitution, especially the Fifth, Eighth, Thirteenth, and Fourteenth Amendments. They have also been recognized by numerous state and federal statutes, including the Uniform Code of Military Justice, the War Crimes Act (18 USC 2441), and the 1991 Torture Victim Protection Act (28 USC 1350 App.).
More recently, the US also played a leading role in prosecuting war crimes at Nuremberg and Tokyo after World War II, and helped finance and organize the prosecution of the war crimes committed in the Balkans and Rwanda in the 1990s. Of course the Bush Administration has also recently taken pride in distinguishing itself from “Axis of Evil” regimes like Saddam’s, North Korea and Iran in this respect.
RELIEF?
So what relief will all this weighty legal doctrine actually provide for the individual Iraqis who were victimized in this case? The short answer is – not much, at least in the case of the US.
There appears to be probable cause to prosecute the officers and enlisted personnel involved, as well as any senior officials who knew or should have known about these activities, for war crimes in US or UK military courts, or, in the case of the UK, by the newly-created International Criminal Court (see below.) But while that might satisfy the victims’ needs for retribution, it won’t provide them with any compensation.
The Iraq Special Tribunal that we established in December 2003 to prosecute war crimes and crimes against humanity – mainly those committed by the Ba’athist regime – was carefully limited just to jurisdiction over Iraqi citizens and residents.
In the case of the private US contractors involved in this case, the Iraqi torture victims might at least be able to sue in a US federal district court for the damages inflicted against them under “color of law,” as provided by the Torture Prevention and Alien Tort Claims statutes. But this could be a hard case to prove, especially if the contractors involved were careful to let the military police do all their dirty work.
If the allegations of British misbehavior hold up, the Iraqis might also be able to bring war crimes charges against UK soldiers and their bosses at the new International Criminal Court -- since, unlike the US, Britain ratified the treaty establishing this court last year. This might make for an interesting complement to Saddam’s war crimes tribunal, scheduled to begin later this year. However, even apart from the practical obstacles to such prosecutions, Iraq’s “new government,” hand-picked by the Coalition, would probably readily grant the UK a waiver against such complaints. But British troops might still be subject to the UK's Human Rights Act, which might provide compensation to Iraqi victims for violations of the European Convention on Human Rights. Indeed, 14 Iraqi families have already commenced an action in the UK's High Court for what they consider unlawful killings of their civilian relatives in the last year by British troops.
DAMAGE CONTROL?
When all these revelations first appeared – reportedly after CBS News held the story for two weeks at the request of the US military, which feared the impact on Arab opinion ("duh") – senior military officers in the US and the UK, as well as political leaders like Prime Minister Blair, President Bush, and even the rather cautious Senator Kerry, were all quick to condemn the obviously indefensible misbehavior. But they were also quick to claim, prior to any investigation, that this behavior must have been exceptional, engaged in by at most a handful of “rogue elements” in the military and the CIA, who will now all be sternly dealt with.
However, as Amnesty International has noted, these were not just isolated incidents. Indeed, as we’ll argue below, they appear to be part of a disturbing trend toward the increasing use of “hard-core” interrogation techniques on Arab detainees by the US and its new allies in the “war on terror,” both abroad and at home.
Moreover, the US and UK civilian and military intelligence services and their contractors and surrogates have a long history of intimate involvement with such interrogation methods.
In fact what’s really most unusual about these recent scandals is not the revelation that all these services routinely use such methods, but that in this case they got their hands dirty.
Usually they are smarter than that, outsourcing the "wet work" to Third Worlders in countries with fewer reporters, human rights observers, or young Army reservists.
Ironically, in the case of occupied Iraq, these intelligence services had no other country to "outsource" the work to. With thousands of Iraqi prisoners to deal with, and a growing insurgency, they also had little choice but to rely on at-hand Army Reservists, at least one of whom decided to turn in his comrades. If the military and the intelligence services had not gotten caught, one wonders how many of those "30 investigations" and 25+ deaths in detention we'd have ever heard about.
There is also evidence that, especially in the wake of 9/11, similar tactics may be spreading to domestic law enforcement back in the USA – reflecting a growing militarization of police work. Interestingly, such tough tactics may not actually produce any more solved crimes. But they do provide nice opportunities for frustrated investigators to blow off steam.
Tying all this together, the patterns revealed here really belie the conventional notion that the hard-core interrogation tactics recently seen in Iraq were simply rogue actions by a group of unprincipled individuals.
Nor were they, in Donald Rumsfeld’s words (after four days of silence on the subject), simply “unacceptable and un-American.”
It is more accurate to say that, under the license of our new post-9/11 crypto-culture, many military and civilian intelligence and law enforcement officials apparently feel entitled to violate fundamental civil rights – especially those of Arabs and other suspect minorities – in the interests of pursuing “bad guys.” This is a little taste of what it was like to be a white cop in Macon, Georgia, or Jackson, Mississippi, in 1956.
So this kind of behavior has become, if anything, all too acceptable, all too American. To make sure that it diminishes, rather than continues to grow, we need to get to the bottom of the institutional failures, not just the individual errors in judgment, that foster it. Just as we (almost) once got to the bottom of the systemic problems in Macon and Jackson.
THE REALITY
The latest disclosures from the Pentagon, when added to other reports in the last two years, add up to some disturbing patterns:
Iraq. As the Pentagon only disclosed on May 4, since December 2002 it has launched investigations of at least 25 suspicious deaths of prisoners in custody in Iraq and Afghanistan, and 10 cases of assaults. In addition, in Iraq, three Army reservists – Army Reserve military police, like those accused in the most notorious incidents – were discharged in January 2004 for abusing prisoners at Camp Bucca, south of Baghdad near Umm Qasr. In December 2003, two British soldiers were arrested but released after an Iraqi prisoner died in their custody. In November 2003, Major-General Abed Hamed Mowhoush, of Saddam’s Republican Guard, fell ill and died during “an interview with US forces." In August 2003, a US Army lieutenant colonel received a fine, but no court-martial, for firing a shot near a detainee’s head during an interrogation. There have been reports of similar abuses at “Camp Cropper” and “Camp Bucca,” near the Baghdad International Airport. Furthermore, the US has been very slow to provide information on the whereabouts and conditions of up to 10,000 civilians who have been detained in Iraq, leaving many family members completely in the dark about them. And, according to Amnesty International, these detainees have been “routinely subjected to cruel, inhuman or degrading treatment during arrest and detention.”
Afghanistan. According to Amnesty International, detainees interrogated by the CIA at Bagram Air Base have allegedly been subjected to “stress and duress” techniques that include “prolonged standing or kneeling, hooding, blindfolding with spray-painted goggles, being kept in painful or awkward positions, sleep deprivation, and 24-hour lighting.” In December 2002, two Bagram detainees died under suspicious circumstances. A 13-year-old Afghan boy who was detained in Bagram for two months described it as
“...a very bad place. Whenever I started to fall asleep, they would kick on my door and yell at me to wake up. When they were trying to get me to confess, they made me stand partway, with my knees bent, for one or two hours. Sometimes I couldn't bear it anymore and I fell down, but they made me stand that way some more."
A December 2002 press report on standard practices at Bagram sounds like it is not all that different from what recently was discovered to be going on at Abu Ghraib prison in Iraq:
Captives are often "softened up" by MPs and U.S. Army Special Forces troops who beat them up and confine them in tiny rooms. The alleged terrorists are commonly blindfolded and thrown into walls, bound in painful positions, subjected to loud noises and deprived of sleep. The tone of intimidation and fear is the beginning, they said, of a process of piercing a prisoner's resistance. The take-down teams often "package" prisoners for transport, fitting them with hoods and gags, and binding them to stretchers with duct tape…..
The US military has also been accused of standing by and watching while its allies in the Northern Alliance slaughtered up to 4000 captured Taliban prisoners of war.
Other Secret Detention Centers. There have also been reports of serious psychological and physical abuse at Guantanamo’s Camp X-Ray and Camp Delta, and other off-limits detention centers, such as Diego Garcia in the Indian Ocean. Up to 3000 detainees are being held in such facilities, but US military officials have refused to disclose their precise names and numbers, and have only allowed intermittent visits from the International Red Cross. In Guantanamo alone, after two years, more than 660 inmates are currently in detention, including some children. All these detainees have been designated as “unlawful combatants” by the US military; we do not know their names or the charges against them, and none of them have received any judicial review, access to lawyers, or even contact with relatives. Indeed, even US citizens are being held in this indefinite “right-less” limbo status, the legality of which is now being challenged in the US Supreme Court. It seems likely that the prisoners’ right-less status has helped to encourage abuses against them.
“Refoulement.” The US has also apparently subjected hundreds of suspects – including, according to the CIA’s George Tenet, at least 70 before September 11th -- to “extraordinary renditions” to countries like Jordan, Egypt, Uzbekistan, Morocco, Saudi Arabia, Israel, and Syria, where edgy interrogation methods are routine. According to one report, in the mid-1990s the CIA significantly expanded its efforts to snatch suspected Arab terrorists for purposes of such renditions, from which it shared in any resulting information, by way of a new secret Presidential “finding” that purportedly “authorizes” it -- in violation of all the international treaties noted earlier. According to the US State Department’s own country reports, the interrogation methods employed by these US allies include:
- Egypt: Suspension from a ceiling or doorframe; beatings with fists, whips, metal rods, and other objects; administration of electric shocks; being doused with cold water; sexual assault or threat with sexual assault
- Israel: Violent shaking; smelly head-bag; painful positions; "truth serums;" torture of teenagers.
- Jordan: Beatings on the soles of the feet; prolonged suspension in contorted positions; beatings
- Morocco: Severe beatings
- Pakistan: Beatings; burning with cigarettes; sexual assault; administration of electric shocks; being hung upside down; forced spreading of the legs with bar fetters
- Saudi Arabia: Beatings; whippings; suspension from bars by handcuffs; drugging
- Syria: Administration of electric shocks; pulling out fingernails; forcing objects into the rectum; beatings; bending detainees into the frame of a wheel and whipping exposed body parts.
Deeper Roots. While the “war on terrorism” has given hard-core interrogation techniques a new lease on life, in fact the Afghan and Iraqi situations are only the most recent examples of their development by both the US military and the CIA. They sponsored a great deal of primary research on the subject, drafted “how-to” manuals for use in torture/interrogation training, and provided a great deal of instruction and assistance to the global hard-core interrogation industry. Among the many recipients of this development assistance were the Shah’s Iran, Brazil and Uruguay in the 1960s (by way of Dan Mitrione and others), Vietnam (by way of the Phoenix Program’s “Provincial Interrogation Centers”), Guatemala and El Salvador in the 1980s, and Honduras’ infamous Battalion 316 in the early 1980s -- where our new US Ambassador to Iraq, John Negroponte, also served in 1981-85.
Dubious Methods Back Home. There is also evidence that the same rough-trade interrogation tactics that US soldiers have recently employed offshore are also showing up more frequently in the US. For example, last December, a Department of Justice investigation disclosed the widespread use of mistreatment and abuse during the interrogation of dozens of Muslim detainees at the Metropolitan District Detention Center in Brooklyn, in the aftermath of September 2001.
More generally, there have also been a growing number of instances of gross brutality inflicted on prisoners in the US' own increasingly over-crowded prison system -- which currently houses more than 2.1 million inmates, the world's largest prison population. Some state systems -- like California's, with more than 160,000 inmates -- are on the verge of breakdown, with crowded conditions, "guard gangs" as well as prisoner gangs, budget shortages, and rampant violations of prisoner rights. Just this year, for example, an inmate on a dialysis machine at California's Corcoran State Prison bled to death while guards ignored his screaming, as they watched the Super Bowl. This is just one of many horror stories that occur on a daily basis in the US prison system. So it is perhaps no accident that at least two of the six US soldiers facing criminal charges in connection with the Abu Ghraib scandal in Iraq were prison guards back home, and one of them was employed at a Pennsylvania prison that is notorious for prisoner abuse.
In an episode that is in some ways even spookier, in February 2004, agents of the US Army’s Intelligence and Security Command (INSCOM) -- which also oversees the same military intelligence units that have been misbehaving in Iraq -- showed up uninvited and in civilian garb at a University of Texas Law School conference in Austin on “The Law of Islam,” and later demanded a list of all the attendees and questioned several students in an aggressive manner. The Army later apologized, and promised to institute new refresher courses on the proper limits of its domestic authority.
Indeed, the US Army might start by reviewing the fundamental federal law that has been on the books since 1878 – the "Posse Comitatus Act" (PCA) (18 USC 1385), which provides for fines and imprisonment for anyone who uses the US military for domestic law enforcement or surveillance, except in times of national emergency or under certain other limited exceptions, none of which applied here.
This growing militarization of US law enforcement -- complete with Predator drones, SWAT teams, US Marines shooting 18-year old goat-herders in the back on the Mexican border, and now the very latest refinements in interrogation techniques from Guantanamo and Abu Ghraib -- may be an inevitable byproduct of our brand new global wars on terrorism, drugs, anti-imperialism, and Islamic radicals whose faces we don't happen to like. But those of us who are back home, supposedly the beneficiaries of all this "national security," had better wake up and pay attention to the impact that this emerging "state of siege" mentality is having on our rights -- and those of the Iraqi people whom we are supposed to be "liberating."
NEXT: PART II: THE ROOTS OF BRUTALITY.
_________
May 6, 2004 at 01:12 AM
Sunday, April 11, 2004
The Coffee Connection: Globalization's Long Reach, From Vietnam To Nicaragua To Starbucks


INTRODUCTION – THE "G" WORD
I remember the first time I heard the “G” word - “globalization.” It was 1985, and I was interviewing a new McKinsey recruit, a former assistant Harvard Business School professor who had decided to exchange the classroom lectern for a larger bank balance. He was about as excited as business intellectuals ever get about the latest HBS paradigm. This was the notion that, in the wake of the 1980s debt crisis, countries would soon be forced to “globalize.” According to him, this meant that they would soon dramatically reduce all barriers to trade, investment, and labor migration, so that, over time, the world would become one great big happy marketplace.
I reacted with the economist’s usual disdain for business school paradigms. While “globalization” might be a new term, surely the basic concept was not new. For example, the world had also experienced a dramatic rise in trade and investment in the late 19th and early 20th centuries. Nor was it nirvana. As I recalled, this earlier period of free trade had been marked by numerous speculative bubbles, debt crises, and even some devastating famines in India, China, and Ireland. Then, during the 1930s, many countries had retreated from the winds of global competition behind tariff barriers, import controls, exchange controls, and fixed exchange rates. At the time, these “beggar thy neighbor” policies were damaging to the world recovery. But after the world economy was revived by World War II, it did pretty well during the period from 1950 to 1973. Indeed, many economic historians now refer to that distinctly “un-global” period as the 20th century’s “Golden Era,” the only prolonged period in the 20th century when global growth and income equality both improved dramatically at the same time.
So I could not help but tweak the young professor's nose a bit about the fact that “globalization,” as he called it, was not new, and was evidently neither necessary nor sufficient for strong performance by the world economy.
TWENTY YEARS LATER…
It is now twenty years later, and many neoliberal pundits are still discussing “globalization” as if it were something strange and new – and as if it did not already have a very long and really quite problematic track record, including its very mixed record since the early 1990s.
What should by now be clear to any careful student of the subject is that in fact there really is no such thing as “globalization” per se. Its effects cannot be assessed or even measured apart from specific historical contexts. In other words, the liberalization of trade and investment is never implemented across all markets or trading partners at once. Its impact depends crucially on the precise sequence of deregulation, initial conditions, and on complex interactions with all the other market and regulatory imperfections that remain after specific barriers have been removed.
EXAMPLE - MEXICO Vs. CHINA
Just to take one specific example – in 1993, Mexico signed NAFTA, giving its export sector much more access to the US market. However, the gains reaped by Mexican exports have been somewhat disappointing, because Mexico discovered that just as NAFTA was being implemented, China was also dramatically expanding its exports to the US. This was partly just a reflection of China’s lower labor costs. However, for capital-intensive sectors, it also reflected China’s artificially lower cost of capital. Unlike Mexico, where the banking sector had been privatized, China’s banking sector remained entirely in state hands, and it provided billions of dollars in subsidized credit to the export companies that Mexico had to compete with. Having liberalized its capital market at the same time that it liberalized trade, Mexico had essentially given up one of its main weapons in its competitive battle with China.
The following case study of the global coffee market provides another example of “globalization’s” complex side effects. In the early 1990s, the World Bank and the IMF, which have been two of the most fanatical sponsors and promoters of “globalization” around the world, decided to encourage the Socialist Republic of Vietnam to boost its exports and growth rate by aggressively entering the world coffee market. Millions of poor coffee farmers around the world are still suffering from the effects of this grand strategy.
THE COFFEE BARONS
If one is looking for a good example of the unintended impacts of “globalization,” a good place to start is with the world’s second most globalized commodity -- coffee, which is consumed almost everywhere, produced in 70 countries by more than 25 million farmers, and second only to oil as a share of world trade.
Coffee has certainly had its ups and downs in the health literature, although the latest scientific evidence is apparently that, at least in moderation, it can do some good, at least if one is prone to gallstones, asthma attacks, cirrhosis of the liver, headaches, or heart trouble. But it has been undeniably beneficial to the shareholders of leading First-World companies like Nestle, Kraft, Sara Lee, P&G, and Germany’s Tchibo, the giant conglomerates that dominate the international business of roasting, processing, wholesaling, and at least in Starbucks’ case, retailing gourmet coffee to millions of First World customers.
In the last decade, all these coffee conglomerates have prospered, and Starbucks, in particular, has struck a veritable gold-mine. Founded in 1985, Starbucks has seen its share price rise at an astounding average rate of 28 percent a year since its 1992 IPO, with four 2-for-1 stock splits along the way. In February 2004 its market value reached a 12-year high of almost $15 billion, and its revenues now exceed $4.4 billion, growing at 32 percent a year. Entering the vaunted ranks of truly global brands like IBM or Coca-Cola, Starbucks now has more than 74,000 “partners” (actually, employees) and more than 6000 stores in over 30 countries, and it expects to add another 1300 stores this year. Indeed, it is even opening stores in markets that one might have thought would be difficult to crack, like Paris, Saudi Arabia, Mexico, the Philippines, Indonesia, and Lima, Peru, where a cup of Starbucks java reportedly costs two-thirds of the minimum wage.
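For the numerically inclined, here is a minimal back-of-envelope sketch of how those growth figures hang together. It uses only the numbers quoted above (28 percent average annual share-price growth since the 1992 IPO, four 2-for-1 splits, and revenues growing at 32 percent a year); it is an illustration, not independent data.

```python
import math

# Back-of-envelope check of the growth figures cited above.
# All inputs are the article's own numbers, not independent data.

ipo_year, ref_year = 1992, 2004
annual_price_growth = 0.28            # ~28% a year, as quoted

years = ref_year - ipo_year
cumulative = (1 + annual_price_growth) ** years
print(f"28% a year for {years} years is roughly a {cumulative:.0f}x cumulative gain")
# ~19x, which squares with four 2-for-1 splits (16x more shares per original
# share) plus some additional price appreciation.

revenue_growth = 0.32                 # revenues growing ~32% a year, as quoted
doubling_years = math.log(2) / math.log(1 + revenue_growth)
print(f"Revenue growing at 32% a year doubles roughly every {doubling_years:.1f} years")
```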
One might have hoped that the folks at the other end of the pipeline who actually grow all the coffee might have benefited a bit from all this downstream prosperity. But in fact that would come as something of a surprise to the world’s 25 million coffee farmers around the globe, most of whom exist at the very bottom of the income distribution in developing countries like Vietnam, Brazil, Nicaragua, Kenya, and Ghana. Even as the processors, roasters, and retailers were cashing in, conditions for these farmers became more fiercely competitive than ever before. Indeed, from 1997 to 2001, composite world coffee prices fell by two-thirds, reaching the lowest levels in 30 years. Since then, average prices have recovered slightly, but they still remain at just half their (real) 1997 levels.
Since the demand for coffee beans is price-inelastic, the collapse in prices meant that coffee export revenues and the incomes of coffee farmers all over the world collapsed with them. It was one of the most socially devastating commodity market crashes in modern history, with millions of poor coffee growers from Mexico’s Chiapas region, Guatemala, and Nicaragua to Kenya and Ghana to Indonesia and Vietnam all suffering the effects.
"FAIR TRADE" - MORE OR LESS
As one of the younger, less diversified companies in the industry with a retail brand to protect, Starbucks was perhaps more sensitive to the growing contrast between its own prosperity and the farmers’ desperate situation. In the late 1990s it responded with a new emphasis on “corporate responsibility.” This included support for “fair trade-certified” and “organic” farming, the implementation of sourcing guidelines that emphasized “sustainable” farming practices, paying premium “fair trade-like” prices above market averages, providing a certain amount of credit to coffee farmers and financial aid to poor farming communities, and other measures. In 2003, for example, Starbucks paid an average of $1.20 per pound for its Arabica beans, at a time when the open market price was less than half that much.
True, this amounted to just 4-5 cents per cup at most for the farmers, compared with a “venti” coffee-based drink that might go for $2.50 to $4.50, depending on what’s in it. True, much of the $1.20 per pound did not get through to the farmers, but was digested by middlemen – even in 2003, at least half of Starbucks’ coffee was purchased through brokers and short-term contracts.
True, the 2.1 million pounds of “fair-trade certified” coffee that Starbucks purchased in 2003 amounted to less than 1% of its bean purchases. And true, the social programs and credit that Starbucks distributed to poor coffee farming communities in 2003 amounted to just $1 million and $2.5 million, respectively, scattered across nine countries – less than 1 percent of its operating income that year. But at least Starbucks deserves credit for making an effort, which the other giants in the industry have failed to do.
However, if Starbucks had really wanted to assist poor coffee farmers around the world, it would not have wasted time with all the “fair trade” and “green farming” activity, as valuable as these symbolic gestures might be in the abstract.
As the following tale explains, Starbucks and the fair traders would have had far more social impact if they had simply persuaded the World Bank to keep its mitts off coffee production.
GLOBALIZING COFFEE PRODUCTION
In 2000-2002, an acute coffee market crisis hit poor countries like Nicaragua, Guatemala, and Kenya broadside. The tale of this fiasco is worth telling just because of its dire impact on such countries, which depend on coffee for 25 to 30 percent of their exports. But it is also a striking example of the unintended side-effects of globalization, and of neoliberal development banking at its worst. After all, as noted, coffee is grown by more than 25 million small farmers in more than 50 developing countries, including several of the world’s most heavily-indebted nations. Indeed, it is second only to crude oil as a developing country export. So if you wanted to pick one global commodity market not to screw up, this would be it. But that did not stop the World Bank, the IMF, and the Asian Development Bank from doing so.
As this story also demonstrates, even in the 21st century, countries like Nicaragua not only remain at the mercy of intransigent rightists, corrupt elites, and egomaniacal leftists. They are also at the mercy of massive screw-ups by half-baked neoliberal experiments located half-way around the globe – and by fellow “former socialist countries!”
In 1986, the Socialist Republic of Vietnam’s Communist Party leadership decided to switch from central planning to a liberalization policy called “doi moi” – “change and newness.” This was partly just because, like Cuba, Vietnam could no longer depend on the (crumbling) USSR for huge subsidies. It was also because senior economists at the IMF, UNDP, World Bank, and ADB were preaching the glories of free markets, and holding out the prospect of billions in aid.
The resulting program, designed with extensive assistance from the world’s leading development banks, was a controlled version of a standard orthodox adjustment program. It set out a 10-year plan – oops, “strategy” - for export-led growth, based on opening up Vietnam’s heretofore-closed economy to trade and investment, allowing state-owned banks freedom to lend to individual borrowers, decollectivizing the farm sector, and – in particular -- encouraging small farmers and state-owned companies to develop new cash crops for export.
At the same time, political power was to be kept firmly in the hands of the Communist Party’s Politburo. Despite that slightly-illiberal grace note, from 1993 on, this doi moi economic liberalization package was generously supported with plenty of advice and more than $2 billion a year of foreign loans and grants from the Asian Development Bank, the UNDP, Japan’s JBIC, France’s development agency (the AFD), the World Bank, the IMF, and the aid agencies of the US, Sweden, France, and several other Western governments.
Nicaragua’s 44,000 small coffee farmers, the 6 million small farmers in 49 other countries who collectively produced more than 80 percent of the world’s coffee beans, and the more than 100 million people whose jobs and livelihoods depended on coffee beans, had probably never heard of doi moi. But they became one of its first targets. Right from the start, evidently without much thought about collateral damage, Vietnam and its neoliberal wizards decided to embark on a brave new coffee export business.
While coffee had been grown in Vietnam ever since the 1850s, production and exports had been limited. The domestic market was small, and there were few facilities to process the raw beans. As of 1990, green bean exports were a mere 1.2 million 60-kilo bags per year. But Vietnam’s central highlands did have rich soils, hilly terrain, lots of rainfall, and low labor costs, which were ideal conditions for achieving high yields and low prices.
This was especially true for low-grade, easy-to-grow robusta beans. From a consumer’s standpoint, this species was inferior to the Arabica beans grown by Nicaragua and most other Central American producers, as well as big producers like Brazil and Colombia. Arabica had traditionally accounted for more than three-fourths of the world’s coffee production. But robusta had twice the caffeine content of Arabica at half the price, and it could also be used as a cheap filler and blending ingredient.
By the 1990s, bean quality was no longer an absolute barrier to entry in coffee farming. The global market was increasingly dominated by a handful of giant First World coffee processors, roasters, and grinders, including Nestle, Kraft, Sara Lee, P&G, and the German company Tchibo, as well as retail store owners like Starbucks, which generated their own blends. Increasingly, these companies sourced coffee beans from all over the planet, mixing and matching them to produce blends that not only satisfied customer tastes, but also minimized costs. These global buyers had been working overtime on new technologies that took the edge off the cheaper robusta beans and allowed them to be used for extra punch and fill. With the help of commodity exchanges, the giants had also defined standardized forward and futures contracts that allowed them to hedge against price fluctuations – making for a much more “perfect” global coffee market.
From the standpoint of small farmers, most of whom did not have easy access to such hedging devices, “market perfection” was in the eyes of the beholder. The changes introduced by the giant buyers amounted to a radical commoditization of the market that they depended upon for their livelihoods, and a sharp increase in direct competition. Accordingly, even as downstream market power became more and more concentrated in the hands of the First World giants, the farmers’ share of value-added plummeted. In 1984, for example, raw coffee beans accounted for more than 64 percent of value-added in the US retail coffee market. By 2000, this share had dropped to 18 percent. From 1990 to 2000, while global retail coffee revenues increased from $30 billion to $60 billion, the revenues earned by bean-growing countries dropped from $10 billion to $6 billion. By then, for every $3.50 café latte sold by Starbucks, the farmers earned just 3.5 cents.
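To make the shrinking farmer share concrete, here is a minimal worked restatement of the figures just cited (the 64 and 18 percent value-added shares, the $30 billion and $60 billion retail revenues against $10 billion and $6 billion grower revenues, and the 3.5 cents on a $3.50 latte). The snippet simply recomputes the implied shares from the article's own numbers.

```python
# Recompute the farmers' implied shares from the figures cited above.

# Grower-country share of global retail coffee revenue (US$ billions)
retail_1990, retail_2000 = 30, 60
growers_1990, growers_2000 = 10, 6
print(f"Grower share of retail revenue, 1990: {growers_1990 / retail_1990:.0%}")  # ~33%
print(f"Grower share of retail revenue, 2000: {growers_2000 / retail_2000:.0%}")  # ~10%

# Raw beans' share of US retail value-added, as quoted above
print("Beans' share of US retail value-added: 64% (1984) -> 18% (2000)")

# The Starbucks latte example
latte_price, farmer_take = 3.50, 0.035
print(f"Farmer share of a $3.50 latte: {farmer_take / latte_price:.0%}")          # ~1%
```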
The farmers’ shrinking role was due in part to the basic structure of the global coffee industry. On the supply side, as noted, by the 1990s, raw beans were being exported by more than fifty countries, who were competing head-to-head. But while a few growers like Brazil and Colombia had tried to break into foreign markets with their own processed brands, a handful of global First World buyers still dominated processing and marketing. Indeed, many of the world’s leading exporters of processed coffee, like Germany and Italy, grew no coffee at all.
This long-standing First World control over global coffee processing is partly due to technical factors. There are economies of scale in processing, but not in coffee farming. Unlike petroleum or natural gas, which can be warehoused for free in the ground, coffee beans are costly to store. Unlike wine, aged beans also have no incremental value. Furthermore, most small coffee farmers depend on coffee sales for their current incomes. Global coffee demand is actually not very price-sensitive, and it is only growing at a modest 1 percent per year. All this means that prices tend to fluctuate wildly with current production, so there is an incentive for processors to stay out of farming, shifting market risks to millions of poorly-diversified producers. The fact that coffee beans can be stored for 1-2 years, while roasted or ground products have a much shorter shelf-life, also favors locating processing facilities close to the final consumer markets. And anyone who has been to France, Italy, or Brazil knows that tastes for particular kinds of coffee vary significantly across countries.
But the coffee industry’s international division of labor is not only based on such technical factors, many of which are actually declining in importance. It is also based on long-standing trading patterns and colonial relations – for example, the 17th century role of the Dutch in smuggling coffee plants out of Yemen to their colony in Java, which fostered Indonesia’s entire coffee industry; the role of French, British, Portuguese, and Japanese trading companies in Africa, Jamaica, Guyana, Brazil, and Asia; and the role of American companies in Colombia, Central America, and Southeast Asia. The First World’s dominance has been reinforced by trade barriers that favor the importation of raw beans over processed coffee.
The net result of all this is as if France, Italy and California were compelled to export all their grapes to Managua, Nairobi, and Jakarta, in order to have them processed into wine.
Along these lines, given the importance of small coffee farmers to debtor countries, and the World Bank’s supposed commitment to “poverty alleviation,” it may seem surprising that the World Bank, the IMF, and other development lenders devoted zero energy in the 1990s to designing a monopsony-breaking strategy for coffee growing countries, to help them break down this division of labor and its supporting trade barriers.
Instead, the development bankers did just the opposite, helping Vietnam implement an anti-producer-cartel strategy that ultimately helped to drive the coffee-producing countries’ association, a rather pale imitation of OPEC, completely out of business in 2001. Could it be that these First World development banks were not influenced by the fact that the world’s leading coffee conglomerates also happen to be based in countries like the US, Japan, France, Switzerland, and Germany, not far from the development banks’ headquarters?
COFFEE CONTRAS
Vietnam’s decision to push coffee bean exports as a cash generator in the 1990s was not just based on rational economics. Like most critical decisions in economic development, it also had a crucial political motive. Vietnam’s best region for growing coffee turns out to be the Central Highlands, along the border with Cambodia and Laos. This region is inhabited by about 4 million people, including 500,000 to 1 million members of non-Buddhist ethnic minorities who are known collectively as the Montagnard/Dega hill tribes. These fiercely independent peoples have battled the Communist Party, and, in fact, most other central authorities, for as long as anyone can remember. In the 1960s, 18,000 of them joined the CIA’s Village Defense Units and fought hard against the NLF. They had many run-ins with South Vietnam’s various dictators. After the war ended in 1975, some Montagnard tribes continued armed resistance at least until the late 1980s.
To shore up control over this volatile region, in the early 1980s Vietnam’s government embarked on its own version of ethnic cleansing – or at least dilution. It actively encouraged millions of ethnic Kinh – Vietnam’s largest ethnic group – plus some other non-Montagnard minorities, to migrate from the more crowded lowlands to the Central Highlands. At first, these migrations were organized directly by the government. But by the 1990s, they were being driven by a combination of market forces and government subsidies. On the one hand, the migrants sought to escape the poverty and resource exhaustion of the lowlands. On the other, they were attracted by the prospect of obtaining cheap land and credit to grow coffee, the exciting new cash crop, which became known to the peasants as “the dollar tree.”
The result was an influx of up to 3 million people to the Central Highlands provinces in less than two decades. In 1990-94 alone, some 300,000 new migrants arrived in the provinces of Dak Lak, Lam Dong, Gia Lai, and Kontum, looking for land. By 2000, these four provinces alone accounted for 85 percent of Vietnam’s coffee production. This reduced the Montagnard tribes to the status of a minority group in their own homelands. They watched in anguish as their ancestral lands were reassigned to outsiders, including state-owned companies, controlled by influential Party members in Hanoi who had close ties with leading Japanese, American, and Singaporean coffee trading companies. Many Montagnards were forced to resettle on smaller plots, without compensation. Over time, as the local economy became more vulnerable to fluctuations in world coffee prices, this contributed to explosive social conflicts.
From the standpoint of Nicaragua’s campesinos, the key impact of all this was on world coffee prices. In Vietnam, the migrants and Montagnards alike turned to coffee for support on increasingly-crowded plots. At the time, in the early 1990s, coffee still offered greater revenue per unit of land, compared with other cash crops like rice or peppers, and it was also being actively promoted as a cash crop by state banks, trading companies, and the government.
It took three to four years for a new coffee bush to mature, so the real surge in exports did not occur until 1996-2000. Then, in just a four-year period, Vietnamese exports flooded the market. From 1990 to 2002, they increased more than ten-fold, from 1.2 million 60-kilo bags to more than 13.5 million bags. By 2000, Vietnam had become the world’s second largest coffee producer, second only to Brazil and ahead of Colombia. In the crucial market segment of cut-rate green robusta beans, the blenders’ choice, Vietnam had become the world leader. While other producers like Brazil also increased their robusta exports during this period, Vietnam alone accounted for more than half of all the increased exports. This helped to boost robusta’s share of all coffee exports to 40 percent.
In pursuing this strategy, Vietnam did not bother to join coffee’s OPEC, the Association of Coffee Producing Countries. Indeed, it acted rather like a scab, providing an incremental 800,000 metric tons of low-priced coffee by 2000, roughly equal to the world market’s overall surplus. The giant coffee buyers were quite happy to buy up all this low-priced coffee and swap it into blended products like “Maxwell House” and “Tasters’ Choice,” using it to discipline other leading supplier-countries. At the same time, foreign debt-ridden countries like Indonesia, Brazil, Uganda, Peru and Guatemala also boosted their coffee sales, in order to generate more exports. In September 2001, partly because of this beggar-thy-neighbor strategy, the ACPC completely collapsed and was disbanded.
The resulting export glut caused world coffee prices to tumble to a 33-year low by 2002. According to the World Bank’s own estimates, this caused the loss of at least 600,000 jobs in Central America alone, and left more than 700,000 people in the region near starvation.
Worldwide, the effects of the coffee glut were even more catastrophic, because the world’s fifty-odd coffee producing countries included many of the world’s poorest, most debt-ridden nations. Ironically, just as they were supporting Vietnam’s rapid expansion into exports like coffee, in 1996 the World Bank and the IMF had launched a new program to provide debt relief to the world’s most “heavily-indebted poor countries” -- the so-called HIPC program. By 2001, indeed, the HIPC program had made some progress in debt reduction, cutting the “present value” of the foreign debts for those countries that completed the program by a median of thirty percent. However, of the 28 heavily-indebted poor countries that had signed up for the World Bank’s HIPC program by 2003, no less than 18 of them were coffee growing countries – including not only Nicaragua, but also desperately poor places like Bolivia, Honduras, Uganda, the Congo, Cameroon, Rwanda, the Ivory Coast, and Tanzania.
Indeed, for the larger coffee exporters in this group, even when they managed to wend their way through HIPC’s complex program and qualify for debt relief, they found that most of its benefits had been offset by the coffee crisis! For example, Uganda, the very first country to qualify for HIPC relief, discovered that by 2001, just one year after qualifying for HIPC, its foreign debt was higher than ever -- mainly because it had to borrow abroad to offset the impact of the coffee crisis on exports!
Furthermore, many other “not-quite-so-heavily indebted” developing countries that produced coffee, like India, Indonesia, Peru, Guatemala, Kenya, Mexico, and El Salvador, were also hurt badly. Overall, if one had set out to create destitution and suffering in as many of the world’s developing countries as possible at one fell swoop, one could hardly have devised a better strategy than to encourage Vietnam to thoughtlessly expand its commodity exports in general, and coffee in particular – free markets be blessed, all other developing countries be damned.
In Nicaragua’s case, the average wholesale price for its Arabica beans fell from $1.44 a pound in 1999 to $0.51 a pound in 2001, and to less than $0.40 a year later, compared with typical production costs of $0.83 a pound.
Among the hardest hit were Nicaragua’s 44,000 small producers, who accounted for two-thirds of Nicaragua’s production and provided jobs that supported another 400,000 Nicaraguans, most of them landless campesinos in the rural northwest around Matagalpa, north of Managua. They depended upon Nicaragua’s annual coffee harvests for most of their employment and income. The resulting crisis in the countryside set off a migration to Managua and other cities, with thousands of hungry, landless people crowding into makeshift shacks on the edge of town.
Obviously all these developments raised many questions about the role of the World Bank and Vietnam’s other international lenders and advisors. After all, Vietnam was just a very poor state-socialist country that was undertaking all these free-market reforms for the first time – after fighting and winning a thirty-year war of its own with the US. The World Bank, IMF, and the ADB, on the other hand, were supposed to be the experts – they had implemented such reforms all over the world, backed by billions in loans and boatloads of Ivy-League economists. And Vietnam was intended to be one of their poster stories for de-socialization, and for the claim that growth, free markets, and “poverty alleviation” could go hand-in-hand.
In April 2002, sensitive to NGO charges that the World Bank and the other development lenders might actually bear some responsibility for this fiasco, the World Bank went out of its way to issue a press release denying any responsibility for the crisis whatsoever. Or more precisely, it denied having directly provided any financing to expand coffee production in Vietnam. It also maintained that its $1.1 billion of lending to Vietnam since 1996 had tried – though evidently without much success – to diversify farmers away from cyclical crops like coffee. It also argued that, after all, its lending to Vietnam’s rural sector had only started up after 1996, while coffee production had increased since 1994, and that none of its investments had been “designed to promote coffee production” (emphasis added). It did identify two World Bank projects that “could be linked” to coffee production – a 1996 Rural Finance Project that helped Vietnamese banks lend money to farmers, and an Agricultural Diversification Project. But for these projects, the Bank simply observed that it didn’t dictate how Vietnamese banks re-loaned the funds that it had loaned to them.
Overall, then, the World Bank basically washed its hands of the coffee crisis -- one of the worst disasters to strike small farmers, their dependents, and debtor countries in modern times. The World Bank did assure the public that it was extremely concerned about the plight of these farmers, and promised to address their woes.
On closer inspection, this defense had more than a few holes. First, whether or not the Bank financed any new coffee farms, clearly the World Bank and its cousins at the IMF, the UNDP, and the ADB were up to their elbows in designing, managing, and financing Vietnam’s economic liberalization program. They played a key role in pushing Vietnam to liberalize trade, exchange rates, and banking quickly. To set targets for Vietnam’s macroeconomic plans, they had to have known which export markets the government planned to go after. After all, coffee was not just another export. After the removal of Vietnam’s quotas on coffee and other exports in 1990, partly at the request of the IMF, coffee quickly became the country’s number-two export, second only to oil. It continued to be one of the top ten exports even after prices cratered. The ADB and the World Bank also worked closely with Vietnam’s Rural Development Bank, the country’s largest rural lender, to improve management and structure new lending programs. They also advised Vietnam on how to set up a Land Registry, so that rival land claims could be settled and farmers – at least the non-Montagnard claimants, who found it easier to get titles – could borrow to finance their new crops more easily.
At the same time, far from encouraging Vietnam to work with other coffee producers to stabilize the market, or design an overall long-term strategy to break up the buy-side power in the market, the development banks bitterly opposed any such interference with “free markets” – no matter how concentrated the buyers were, or how many artificial restrictions had been placed by First World countries on the importation of processed coffee. As one senior World Bank economist remarked in 2001, at the very depths of the coffee glut:
Vietnam has become a successful (coffee) producer. In general, we consider it to be a huge success...It is a continuous process. It occurs in all countries - the more efficient, lower cost producers expand their production, and the higher cost, less efficient producers decide that it is no longer what they want to do.
So, despite its 2002 press release, the World Bank’s true attitude about this whole fiasco appears to have been a combination of “not my problem,” sauve qui peut, and Social Darwinism.
Meanwhile, back in Vietnam, the small farmers in the Central Highlands learned the hard way about the glories of global capitalism – thousands of them had decided that it was “no longer what they wanted to do,” but were finding few easy ways out. After the 1999-2002 plunge in coffee prices, Vietnam’s export earnings from coffee fell by 75 percent from their level in 1998-99, to just $260 million in 2001-02. In 2002-03, they fell another 30 percent. In the Central Highlands, thousands of the small farmers – lowlanders and Montagnards alike – had gone deeply into debt to finance their new plantings, and were struggling to feed their families and send their children to school, because market prices now covered just 60 percent of their production costs.
In short, ten thousand miles from Managua, on the opposite side of the globe, these highland farmers were facing the same bitter truths that Nicaraguan campesinos were facing -- that they had more in common with each other than with the stone-hearted elites who governed their respective societies, and designed futures that did not necessarily include them.
In Vietnam, the resulting economic crisis severely aggravated social and political conflicts in the Central Highlands. In February 2001, several thousand Montagnards held mass demonstrations in Dak Lak, demanding the return of their ancestral lands, an end to evictions for indebtedness, a homeland of their own, and religious freedom (since many Degas are evangelical Christians). Vietnam responded with a harsh crackdown, sending thousands of elite military troops and riot police to break up their protests. They arrested several hundred of them, and then used torture to elicit confessions and statements of remorse. They also destroyed several local churches where the protestors had been meeting. Those protest leaders who did not manage to escape to Cambodia were given prison sentences of up to 12 years.
From one angle, this repressive response was the typical handiwork of a Communist dictatorship. From another angle, however, it was just another example of the kind of repressive tactics that non-Communist regimes have also relied on to implement free-market “reforms,” in countries like Venezuela, Ecuador, Bolivia, Egypt, Indonesia, the Philippines, Argentina, and post-FSLN Nicaragua.
In Vietnam’s case, the Politburo discovered that, far from helping to solve its political problems in the Central Highlands, its neoliberal reforms had inadvertently helped to revive the Dega separatist movement. Evidently, economic and political liberty did not always go hand in hand.
At least the Politburo and their foreign advisors did have something to show for the coffee strategy, however. In 2000-2002, the profit margins earned by the five giant companies that dominated the global coffee market were higher than ever. Furthermore, cocaine producers in the Andean region no longer had to worry about small farmers substituting coffee for coca. In Colombia’s traditional coffee-growing regions, just the opposite started to happen in the late 1990s, as many farmers converted coffee fields to coca, in the wake of the coffee glut.
Indeed, from 1995 to 2001, coca cultivation more than tripled in Colombia, including a 20 percent increase in 2000-01 alone. This occurred despite hundreds of millions of dollars spent by the USG on coca eradication efforts, the so-called “centerpiece” of its “Plan Colombia.” In 2000-01, coca production started to increase again in Peru, Bolivia, Ecuador and Venezuela. There were also reports that farmers were even turning away from coffee and towards coca in areas that had never before seen coca, like the slopes of Mount Kilimanjaro. Cocaine production from the Andean countries also rose sharply from 1998 to 2002.
After all, unlike coffee, at least coca and cocaine were products for which both the farming and the processing could be done at home.
EXAMPLE - THE IMPACT ON NICARAGUA
Overall, by 2003, Nicaragua’s real per capita income had fallen to $400 (in real $1995), roughly its 1951 level. With population growth averaging 2.4 percent a year in this overwhelmingly Catholic country, the economy would have to grow at 5 percent a year for 30 years just to recover the 1977 per capita income level – compared with the actual average growth rate of 1.3 percent during the 1990s. By now, the country’s entire national income is just $11.2 billion, less than three times Starbucks’ annual revenues.
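Since that recovery claim rests on a bit of compound-growth arithmetic, here is a minimal sketch of it, using only the figures quoted above (5 percent GDP growth against 2.4 percent population growth over 30 years, versus the actual 1.3 percent growth of the 1990s); it is illustrative only.

```python
# Compound-growth arithmetic behind the recovery claim above.
# Per capita income grows at roughly the gap between GDP growth and population growth.

years = 30
pop_growth = 0.024          # population growth, as quoted
needed_gdp_growth = 0.05    # GDP growth needed, as quoted
actual_gdp_growth = 0.013   # actual average growth in the 1990s, as quoted

def per_capita_multiple(gdp_growth: float) -> float:
    """Cumulative change in per capita income over `years`."""
    return ((1 + gdp_growth) / (1 + pop_growth)) ** years

print(f"At 5% growth for 30 years: {per_capita_multiple(needed_gdp_growth):.1f}x today's per capita income")
# ~2.1x, i.e. enough to climb back to the much higher 1977 level.
print(f"At the actual 1.3% pace:   {per_capita_multiple(actual_gdp_growth):.2f}x (an outright decline)")
```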
By 2003, underemployment levels exceeded 60-70 percent in many parts of the country, and the overall proportion of people living in poverty was 67 percent, second only to Honduras in Latin America. This means that there were some 1.6 million more Nicaraguans living on the borderline of existence than in 1990, at the end of the contra war.
Earlier, in the 1980s, the Sandinistas had been justifiably proud of their health, education, and literacy programs. Even in the depths of the contra war, rates of infant and maternal mortality, malnutrition, and illiteracy had declined. Infant mortality fell sharply from its 1979 level of 120 per 1,000 live births, immunization coverage rose, and the share of the population with access to health care increased from 43 percent to 80 percent.
In the 1990s, however, there were sharp increases in all these maladies, aided by a 75 percent cut in public health and education spending by 1994. By 2000, Nicaragua was spending four times as much on debt service as on education, and a third more than on public health. The infant mortality rate was still 37 per 1000, and the under-age-five mortality rate was 45 per thousand, among the highest in Latin America. (Cuba’s equivalent rates, for comparison, were 7 and 9 per thousand.) As of 2000, 12 percent of Nicaraguan children were underweight, and 25 percent were under height. More than 22 percent of children under the age of 9 – 300,000 children -- were malnourished. By 2000, 37 percent of school-age children were not enrolled in classes, and illiteracy, which an intense campaign by the Sandinistas in 1980-81 had reduced to 15 percent, had climbed back up to 34 percent, and was even higher in rural areas. Women’s rights also suffered, as the Church conspired with the new conservative governments to drive abortion underground, even at the cost of higher maternal mortality rates because of botched illegal abortions.
Coincidentally, Nicaragua’s $400 per capita income was almost exactly the same as that of the Socialist Republic of Vietnam, its new direct competitor on the other side of the planet. Indeed, in one of history’s many ironies, these two formerly “leftist” countries were now passing each other on the globalization escalator, heading in opposite directions. By 1998, if we believe the statistics published by the UNDP, Vietnam’s poverty rate had dropped to 37 percent, below Nicaragua’s, while adult literacy had reached 94 percent, above Nicaragua’s (declining) rate of 63 percent. Vietnam’s average life expectancy had also matched Nicaragua’s 68.3 years. And far from having a chronic foreign debt crisis, which Nicaragua has had since 1979, Vietnam became one of the development banks’ darlings, as we saw earlier, drawing down $2 billion a year in concessional finance throughout the decade, plus more than $30 billion in foreign investment. Yet Vietnam’s ratio of debt to national income was just 35 percent – not exactly low, but only one-tenth that of Nicaragua’s.
Furthermore, with all the outside help, on top of its entry into the coffee export market, Vietnam’s growth rate averaged more than 9 percent a year in the 1990s, even as Nicaragua’s growth stagnated. In 2001, when Vietnam’s Ninth Communist Party Congress adopted its Ten-Year Strategy for the period 2001-10, the World Bank and the IMF were both on hand in Hanoi to celebrate with yet another generous structural adjustment loan program – carefully shielded, of course, from any angry Montagnards who might wish to complain.
All told, out of 173 nations ranked by the UNDP according to their “human development” metrics, Nicaragua had dropped from 68th place in 1980 to 118th by 2000. On the way down it passed Vietnam, which was in 101st place and rising. The responsibility for Nicaragua’s decline appears to have been almost evenly divided between the contra war of the 1980s and the neoliberal war of the 1990s. Relative to more prosperous (haven) neighbors like Panama and Costa Rica, as well as to the pro-US, military-dominated abattoirs to the north, Guatemala and El Salvador, Nicaragua’s decline has been even more striking.
So evidently it wasn’t enough to pull off a revolution and defeat a US-backed puppet army, as both Vietnam and Nicaragua had succeeded in doing. Daniel Ortega and his comrades must have occasionally wondered a little wistfully, “If only we had managed to install a full-fledged, centrally-planned Communist dictatorship, as we were accused of trying to do! Maybe the world would have been as generous to us as it has been to the Socialist Republic of Vietnam!”
April 11, 2004 at 08:46 PM
Tuesday, March 23, 2004
One Year Later: A Balance Sheet for the Iraq War


However, this by no means relieves the instigators of this precipitous venture of responsibility for what, in retrospect, appears to have been an illegal, costly, poorly-managed, distracting, divisive, and entirely unnecessary engagement with the wrong enemy at the wrong time.
While it may be decades before the consequences of the Iraq Invasion are completely clear, enough time has already passed for us to begin to take stock.
The following is the first year's balance sheet.
CONSEQUENCES
1. We have now almost satisfied everyone's curiosity about Saddam’s WMDs. They have not existed for quite some time. The US-led coalition’s weapons inspectors have searched high and low. Ellos no encontraron nada (they found nothing). Of course the hawks still assure us that they will turn up eventually, and that we should now search for them in Syria....
2. For purposes of the next time around, what we do now know for sure is that the CIA, MI-6, and the Mossad, as well as many of the hawkish senior Bush and Blair Administration officials and media pundits who "sold" us the war, are unreliable, careless with the truth, and trigger-happy.
On the other hand, the UN weapons inspectors, as well as “dovish” France, Russia, China, and Germany, were basically right all along.
This is an invaluable lesson. We should now proceed to apply it in the upcoming elections in the US and the UK, as Spain has already done.
3. We now know much more about where other countries got their WMDs. In the last year, while the Iraq War was proceeding, we identified the true sources of the last three decades’ WMD proliferation, thanks in part to Libya’s decision to turn in its WMDs.
One key source turns out to have been our ally Pakistan, in the case of North Korea, Iran, and Libya. In addition, as we've known for some time, the other key sources were our ally Israel, plus the US and West Germany, in the case of nuclear weapons for South Africa and India; and the US itself (plus Security Council members France, the UK, Germany, Russia, and China), in the case of Iraq’s chemical and biological weapons capabilities. Iran and Libya have reportedly agreed to cooperate with UN nuclear weapons inspectors. Israel and Pakistan, our close “non-Nato” allies, have refused to do so.
Libya’s concessions probably had little to do with the Iraq invasion per se, so we might well have learned all this without it. But at least we now know for sure that Saddam had nothing to do with WMD proliferation. The real culprit was our own behavior and that of our self-seeking “allies.”
Of course the other great irony here is that while we've been chasing phantom WMDs in Iraq, it appears to be Iran that is, next to Pakistan, the Islamic country with the most advanced nuclear weapons program. It is not obvious to many observers that the Iraq War has strengthened the US hand with respect to Iran -- indeed, with our hands so obviously full, and with Iraq's Shiites in such a strategic position, it may well have reduced our leverage on Iran.
4. We now know all about Saddam’s links with al-Qaeda and 9/11. They were non-existent. A majority of Americans apparently still believe that there were ties between Saddam and Al-Qaeda, and that Saddam was involved in 9/11, perhaps in part because the US President and Vice President persist in encouraging this poppycock. However, the best evidence – including the testimony just this week of President Bush’s own former top counterterrorism expert, Richard Clarke - suggests that these beliefs are completely without foundation. Apparently the warmongers in the Bush Administration decided to punish Iraq, on top of Afghanistan, because, as Secretary Rumsfeld reportedly put it, Iraq "had more targets to bomb."
Of course the warmongers have also argued that al-Qaeda might have links to many other countries, notably Iran, Syria, and Saudi Arabia. As we evaluate these claims, we should remember the expensive lesson that we’ve just learned about al-Qaeda’s purported links to Saddam.
5. Saddam’s ruthless Ba’athists have finally been removed from power.
In this respect, at least, the war helps to make up for the fact that the US played a major role in bringing the Ba’athists to power back in the 1960s, the assistance that the US, Saudi Arabia, and Kuwait (as well as Russia, the UK, Germany, etc.) provided to Saddam throughout the 1980s, and the Allied coalition’s failure to remove him from power in 1991, at a cost of several hundred thousand Shi’ite lives.
In addition to Saddam's removal, some of the war's supporters have concluded that the Iraqi people have already been “liberated.” This is a huge overstatement -- as if, simply by Saddam’s demise, Iraq had already become a peace-loving constitutional democracy with Swiss-like cantons. One can almost smell the Alpine air.
Those Iraqis who have survived the war do now enjoy many new freedoms. Almost as many Iraqis are now employed, with access to electricity, running water, and health care, as before the invasion. They are free to look for work, to start companies, to open bank accounts, to read about their troubles in the media and the mail, and to shop. Except for the fact that more than a quarter of them are still unemployed, are finding it very hard to make ends meet from one day to the next, and fear for their lives because of the security situation, all this is wonderful.
By June 30, they will also presumably be allowed to vote for the “local councils” that the US is setting up all over the country, although these appear to have carefully-vetted candidates. Actual elections for national leaders are a long way off, and the fundamental question of the balance of power among this somewhat artificial “nation’s” Shiite, Sunni, and Kurdish communities remains unresolved.
Nor is it helpful that the US has permitted the return and empowerment of émigré parasites like “Iraqi National Congress” leader Ahmed Chalabi, a well-spoken con artist who was convicted of bank fraud in Jordan in 1992 and sentenced to 22 years at hard labor, who is reportedly wanted on similar charges in Lebanon, and whose well-armed minions now hold forth at the Baghdad Hunt Club.
Looking forward to the day when the 100,000-man US-led coalition army leaves, it is also a concern that, just as in Haiti, the US saw fit to completely abolish Iraq’s army. Let’s all hope that Iraq’s new US-trained Police Force does a better job defending democracy than the US-trained Police Force did in Haiti. Former Haitian President Aristide, now in Jamaica, may be available for consultations on this point, having just been overthrown by a tiny force of only two hundred armed irregulars while his "Police" took to the hills.
All this leaves only about a dozen or so other brutal regimes in the Middle East still to go. Unfortunately, their ranks include such leading US allies as Saudi Arabia, Egypt, Morocco, Kuwait, Dubai, Djibouti, and Pakistan, as well as Syria. They have all been watching the progress of our experiment with “democracy” closely, and they are reportedly not that impressed.
6. Whether or not Iraq was a haven for terrorists before the war, it has clearly become one now. The destruction of Iraq’s Army, its wide-open new borders, and the opportunity that it presents to hunt US and UK troops has attracted new “terrorists” by the hundreds.
Some argue, painting lipstick on the pig, that this is actually a good thing, since we now have all these enemies in one place. This view is rooted in the assumption that there are only a finite number of terrorists in the world, and that if they were not in Iraq, they would just be operating somewhere else. In fact, it is more likely that many of those who have come to Iraq are new recruits, appalled by the US occupation of an Arab country, and persuaded that we are there to fight Islam and seize Iraqi’s oil.
Of course some of the war’s proponents don’t really care which of these views of the "terrorists" is correct. They were primarily interested in seeing the US sucked in….errr, persuaded…to establish its own long-term military base in the center of the Middle East, take sides in the interminable battle against the evil Muslim/Arab horde…..err, “terrorists,” sorry, and contribute most of the fighting, dying, and financing.
7. The Iraq War has not done much to halt terrorism outside Iraq, either. While the US has been fortunate enough to avoid another 9/11, the last year has seen a significant increase in global terrorism elsewhere, with countries like Spain, Turkey, Morocco, Indonesia, France, the UK, Germany, Pakistan, and Russia getting used to a whole new level of permanent insecurity – somewhere between “code amber” and “code red.”
Overall, the Iraq War appears to have indeed distracted attention and resources from the war on terrorism, inflamed world sentiment, and created armies of new recruits for terrorist organizations. This is having serious economic as well as political consequences, with the “terrorism risk premium” built into world stock markets now at its highest point since September 2001.
In short, from the standpoint of fighting terrorism, launching the Iraq War was really a pretty dumb thing to do.
8. Afghanistan and Pakistan are increasingly bogged down in the struggle with terrorism, and the Iraq War has certainly not helped. Al-Qaeda, the Taliban, the Karzai Government, and Pakistan’s President Musharraf appear to be locked in a stalemate. Of course, Bin Laden and al-Qaeda’s other top leaders are now “surrounded,” but this just means that they are somewhere within a tiny, narrow 20,000-square-mile region in Northwest Pakistan. Meanwhile, Pakistan’s General Musharraf and Afghanistan’s Hamid Karzai really are confined to about 100 square miles around Islamabad and Kabul, where each has narrowly survived two assassination attempts in the last six months.
Just this week, Karzai lost his second Minister of Aviation in a year to assassination, not by “terrorists,” but by a local warlord. His government will try to hold elections this summer, but it is only likely to include about 10-15 percent of the country’s potential registered voters. Elsewhere the warlords are too busy growing opium and producing heroin base to pay much attention to Karzai’s central government, which -- compared with Iraq -- has received relatively limited funding from its foreign allies and the UN. Afghanistan has reverted to its pre-Taliban levels of opium production, accounting for more than 90 percent of the world’s supply, with Karzai warning that the country is becoming a “narco-state.”
Meanwhile, after two years in hiding, the Taliban is also staging a comeback in several parts of the country. This is partly because the Bush Administration decided to use so few US troops in the first attacks on Afghanistan that many of the Taliban and al-Qaeda were able to escape over the border to Pakistan. It is also because the locals are tiring of the warlords.
9. The “Middle East peace process” has, quite literally, run into a “Wall.” The Bush Administration, preoccupied with the War and an election, has paid scant attention to implementing its “road map,” and the Israeli-Palestinian relationship has basically defaulted to a perpetual state of war. Whether or not the World Court declares Sharon’s “Apartheid Wall” illegal, without US involvement, this situation is not likely to improve. Indeed, the tendency will be for the extremists on both sides to escalate the violence – as evidenced by today’s Israeli assassination of Hamas’ top leader. Without the Iraq War to distract US leaders from this root cause of global terrorism, this escalation might have been avoided.
10. Our friends and allies around the world have developed a much more hostile attitude toward the US. This bears a striking resemblance to the world’s attitude toward Israel, South Africa in the 1980s, and the UK and France during the 1956 Suez Crisis. Of course we are used to some disdain from the French. But just last week, the new Prime Minister of Spain, a close NATO ally, was strikingly critical of the US, as were the South Koreans, another long-time US ally, and the Poles. We may be down to Britain’s Tony Blair, at least until summer’s UK elections. The next time around, in, say, Iran or Syria, the “coalition of the willing” would probably not have any NATO members in it. Of course we can always count on the Moroccans to lend us the 2,000 mine-detecting monkeys that they provided for the Iraq invasion.
11. The Iraq War may at least have enhanced the UN’s credibility. Even though the UN proved powerless to prevent the US from launching this grossly illegal “preventive war,” in hindsight, the Security Council’s methodical approach to the elimination of Iraq’s WMDs has been vindicated. Indeed, the UN is now being begged by the US to come back to Iraq, pick up the pieces, and lend the US the credibility that it needs to get others to share the bill.
Beyond that, people are beginning to suggest that if only the US allies Pakistan and Israel would open their doors to UN weapons inspectors, the whole region might be made WMD-free some day soon.
12. Iraqi oil production has been restored to more than 2 million barrels per day. Of course “it was not about the oil.” But Iraq’s oil exports are now slightly higher than they were before the invasion, with another 1 million bpd not far off. This is important, given all the turbulence in Venezuela and recent terrorist events in Europe. Global oil prices are now much higher than they were before the war, and without Iraq’s additional expected production, they might go much higher.
Whether Saddam’s Iraq might have achieved these export levels without the war is doubtful, given the likelihood that sanctions against him would have remained in place. So at least we have one more definite entry in the war's "plus" column. Of course the war was also very lucrative for US oil service companies like Halliburton. But once again (“repeat after me”): it was not about the oil.
13. The price tag for all this has been high. In human terms, at least 5,000-6,000 Iraqi soldiers and 9,000-10,000 Iraqi civilians have been killed so far, with hundreds more joining them every month, plus thousands more who have been severely wounded. Many of the country’s antiquities have also been liberated. As of March 20, 2004, there have also been 679 (687 by 3/23) combat fatalities and 3,300 (3,343 by 3/23) combat wounded for the US-led coalition, plus 7,000-8,000 “non-combat injuries or sickness that required evacuation from Iraq.” Before we are able to withdraw, there will probably also be at least another thousand Allied war dead, 2,000-3,000 more Allied casualties, and many thousands more Iraqi casualties, as “Iraqi-ization” shifts the body count to our local allies. (Of course we “no longer do body counts.”) While this is relatively small compared with, say, the Vietnam War, it is almost certainly many more than the US military expected.
In financial terms, the Iraq War has cost about $165 billion so far. The expectation is that the price tag will probably be another $40-50 billion per year for as long as 100,000 US troops are required – i.e., for at least another two years. At this rate, the War’s total cost -- not counting the cost of addressing any terrorism outside Iraq that it might ignite -- will easily be at least ten times the entire First World foreign aid budget for all low-income developing countries. For those who might prefer to spend the money at home, it is also many times the sum that the US Government is now devoting to "homeland security."
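(For the arithmetically inclined, here is a back-of-the-envelope sketch of how these figures stack up. The foreign-aid number below is an illustrative assumption, not a figure taken from this post, so the resulting multiple is only indicative.)

```python
# Rough sketch of the projected Iraq War price tag (figures in billions of USD).
# The aid_budget figure is an illustrative assumption, not a sourced number.
spent_to_date = 165              # spent through early 2004, per the text
annual_cost = (40, 50)           # projected cost per year while ~100,000 troops remain
years_remaining = 2              # "at least another two years"

total_low = spent_to_date + annual_cost[0] * years_remaining
total_high = spent_to_date + annual_cost[1] * years_remaining
print(f"Projected total: ${total_low}-{total_high} billion")   # roughly $245-265 billion

aid_budget = 25                  # hypothetical annual First World aid to low-income countries
print(f"Roughly {total_low / aid_budget:.0f}-{total_high / aid_budget:.0f} times that aid budget")
```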
Of course these are only rough estimates. By 2007 or so, when the US finally transfers responsibility to the New Iraqi Police Force/Army, and brings its troops home from Saigon….err, Baghdad, all these costs will be much clearer, especially to the thousands of families on all sides that have had to pay the ultimate price for the privilege of being on the front lines of this brave new experiment with preemption.
March 23, 2004 at 01:00 AM | Permalink | Comments (1) | TrackBack
Friday, March 19, 2004
On the Trail of "Oil-Rush Development" in the Gulf of Guinea


This region now has the world’s fastest growth rate for new oil reserves, with $5-$10 billion a year being invested to develop its offshore resources. Oil industry experts speculate that by 2010, the seven top oil-producing “New Gulf States” – Equatorial Guinea (E.G.), Sao Tome & Principe, Gabon, Cameroon, Angola, and Congo, plus landlocked Chad -- may account for at least 10-15 percent of the world’s conventional oil and gas reserves, and an even larger share of US energy imports.
This means that, collectively, these tiny African countries may soon play a much greater role in world energy supply than Nigeria, Mexico, Venezuela, or Iraq.
In addition to raw economics, this shift is also being driven by political factors. The US is eager to free itself from OPEC, especially from dependence on politically-sensitive countries like Venezuela and Saudi Arabia. And alternative sources of supply like the Caspian pipeline, Central Asian producers, and Siberian exports have been slow to materialize. So it is not surprising that the development of Gulf of Guinea energy has recently received high priority, not only in Houston and Dallas, but also in Washington, D.C. As the recent attempted coups against Equatorial Guinea's dictator and Sao Tome's President indicate, the region is also receiving increased attention from the world's "oil mafia" and their attendant mercenaries.
In principle, the region's new oil discoveries should also provide a gigantic windfall to the 41 million long-suffering inhabitants of these otherwise-impoverished West African countries. Their life expectancy now averages just 46 years, and more than half of them still survive on less than $1 per day. There is a unique opportunity for the Great Powers that are most active in the region – the US, France, the UK, China, and Spain, as well as multilateral institutions like the World Bank and the IMF -- to learn from the many previous negative experiences with the impact of oil wealth on development, and establish some “rules of the game” to ensure that it really goes to support democratic development.
Unfortunately, rather than seize this opportunity, these Great Powers appear to be defaulting to age-old imperial practices. They are permitting their corporations to define investment and development strategy for them. With few exceptions, the result is a “winner-take-all” race for the riches, with the region’s corrupt local dictators and tiny private elites dividing the spoils with transnational corporate allies, private bankers, private armies, and other intermediaries.
There are already many “per-country” accounts of these developments in the Gulf of Guinea. But it is helpful for us to consider them collectively. Among the most important patterns:
All told, these patterns do not bode well for the future of democratic development for the “New Gulf States” -- even as the US invests so heavily to bring representative and stable government to 25 million Iraqis.
Eighty-five years ago, at the end of World War I, when a similar approach was taken by that period’s Great Powers to the division of oil wealth in the Middle East, they could at least plead that they had little experience with the negative consequences of an elitist, laissez-faire approach to oil-rush based development. Today's Great Powers have no such excuse.
March 19, 2004 at 07:25 PM | Permalink | Comments (3) | TrackBack
Monday, March 01, 2004
Pentagon Strategy Crisis? New "Secret" (Actually, Not!) Report: "Global Warming a Greater Threat Than Terrorism!!"
As if the world did not already have enough problems, the last few months have raised the ugly specter of global warming once again, perhaps more forcefully than ever. As we'll see below, there are indeed many recent indications that this problem is -- beg your pardon -- now "heating up." Moreover, one of the more interesting developments comes from the belly of the beast itself, the Pentagon's Office of the Secretary of Defense (OSD), by way of a so-called "secret report" (according to The Guardian/Observer) that the Pentagon reportedly solicited from two prominent California "futurists" and part-time Hollywood war/disaster-film consultants.
In fact, it turns out that The Guardian/Observer reporters didn't do their homework. While their February 22 story claimed that this Pentagon report on global warming by California futurists Peter Schwartz and Doug Randall was "secret," Fortune Magazine had obtained and released a copy from the Pentagon on January 26, and SubmergingMarkets has obtained a copy of the so-called "secret" report's Executive Summary, which may be downloaded above or below.
The report, entitled "Imagining the Unthinkable: An Abrupt Change Scenario and Its Implications for US National Security," does make for interesting reading. The authors, private consultants who work for Monitor Group/GBN in California and specialize in "long-run scenario planning," have generated a provocative scenario for the effects of an abrupt, discontinuous change in the world's climate. It involves a hefty diet of chaos, famine, drought, and war, as well as a whole new ice age. The melodrama is perhaps not surprising -- after all, one of the two consultants, Peter Schwartz, has also advised on plot development for films like Minority Report, War Games, and Deep Impact. And, of course, scenario designers, like script writers, don't get paid very well for imagining minor variations on the status quo.
As the Pentagon report itself acknowledges, it is not a "forecast," but a "what if?" exercise in "thinking about the unthinkable," in the great tradition of Dr. Edward Teller and DOD's "wintry doom" scenarios of the late 1950s and the 1980s. The aim was to construct a "plausible," if not necessarily probable, scenario, in order (in the authors' words) to "dramatize" the possible consequences of "an abrupt slowing" of the ocean's "thermohaline circulation" (TC), the deep ocean currents that have a profound influence on subsidiary ones like the Gulf Stream and the Humboldt Current.
The possible link between global warming and TC is not a new idea. Most of us probably imagine, and certainly hope, that the effects of global warming will be gradual, leaving us -- and our trusted technologists -- plenty of time to react. But in fact there is a growing body of evidence that global climate change can occur quite fast and be very destabilizing. The notion of "abrupt change" has been gaining ground in the world's scientific community since at least the 1980s. And many scientists have expressed concern about the potential impacts of global warming on TC.
As the authors of the Pentagon report acknowledge, at this point most leading scientists probably believe that the impacts of a TC shift would be "considerably smaller" and more localized than their report assumes. However, what is perhaps most frightening is just how limited our understanding of the potential for "abrupt change" apparently is. Just this month, the US's National Science Foundation and the UK's Natural Environment Research Council launched a new four-year project aimed precisely at understanding the TC-global warming relationship.
By dramatizing the importance of this relationship, this "pseudo-secret" report has served a useful purpose. Of course its release is unlikely to please the Bush Administration, which has so far adopted a head-in-the-sand attitude toward global warming, including its refusal (together with Russia and Australia) to ratify the Kyoto Treaty.
The hapless Guardian/Observer also erred in its claim that the report was "suppressed by US defense chiefs." It also claimed that the report's release "will prove humiliating to the Bush Administration..." So far the report has only clearly proved "humiliating" to The Guardian/Observer.
However, SubmergingMarkets' review of the global warming issue suggests that -- well, my goodness, as Donald Rumsfeld might say, someone in the Bush Administration really should take these matters more seriously! Evidently we have someone in the Pentagon OSD, or at least Monitor/GBN, to thank for underscoring this fundamental point.
We just hope that the task is not just left up to folks like Andrew Marshall, the 82-year-old waspish Dr. Strangelove who has been in charge of the Pentagon's "long-range strategic planning" since 1973. He may have contracted for this doomsday report, but he also recently raved about giving our troops "bio-engineering" drugs to make them fight harder. Clearly he has too much time and money on his hands. (See below.) Nor should it be left in the hands of his two hip California futurists, neither of whom has any scientific credentials, and one of whom (see below) predicted in 1999 that the US was on the verge of 25 years of uninterrupted economic growth (just a year before the 2000-2003 global recession)! As the Secretary might say, My golly! Can't we do better than this? Is this why we're spending $401 billion this year alone on non-Iraqi "defense"?
BACKDROP - SINKING ISLANDS, TARDY ICE, MISSING BEARS
Before we turn to the Pentagon report, let's examine the context -- a growing body of evidence that we may indeed have to pay a very high price for our inactions on global warming. Among the recent indicators:
- In December 2003, Russia followed in the footsteps of the US and Australia, and refused to ratify the Kyoto Treaty. This was probably more of a short-run bargaining tactic than a Bush-like idée fixe. Absent the US, Russia's ratification is needed to bring the Treaty into force, which requires ratification by industrialized countries responsible for at least 55 percent of that group's 1990 carbon emissions (see the short arithmetic sketch after this list). Without US participation, Russia lost a huge market for the "pollution credits" that it hoped to sell to over-polluting American companies. It also has its own oil and gas industry to protect, and is bargaining with the EU for more favorable terms as it enters the WTO.
- In late December, the prestigious American Geophysical Union reported that carbon dioxide emissions are now growing faster than ever, and concluded that "It is virtually certain that increasing atmospheric concentrations of carbon dioxide and other greenhouse gases will cause the global surface climate to become warmer."
- Meanwhile, in December, representatives of Alaska's 155,000 Inuit tribespeople filed a human rights complaint against the Bush Administration with the Inter-American Commission on Human Rights in Washington, D.C., on the grounds that they face virtual extinction because of global warming. According to them, the oceans that surround them are now warmer than ever, the permafrost that supports their homes and roads is melting, the ice arrives later and leaves earlier every year, and polar bears and seals are disappearing.
- In early January 2004, Nature, the influential peer-reviewed science journal, published a study that predicted that by 2050, 15% to 37% -- up to 1 million in all -- of all animal and plant species on the planet may be made extinct by climate changes.
- Also in January, in a vivid demonstration of just how concerned many scientists are about this issue, a conference of leading experts from the UK and the US met at Cambridge University, and considered a variety of rather extreme technical solutions to global warming, including the deployment of "tens of billions of wafer-thin metal plates... into the Earth's low orbit," the growth of huge algae beds in the oceans, and the construction of massive cloud-generating machines that would shield the earth from the sun.
- Just this month, in a warning that captured the attention of everyone who enjoys scuba diving, a study by scientists at the University of Queensland concluded that Australia's Great Barrier Reef will completely disappear by the year 2050, if ocean temperatures continue to rise at current rates. This is significant because, as noted, Australia, like the US and Russia, has refused to ratify the Kyoto Treaty.
- In January, Scottish fishing experts also reported a decline in wild salmon stocks, just as the fishing season was opening. They attributed the decline to global warming.
In any case, further consideration of the Treaty will now be deferred until another conference later in 2004. The EU and Russia had wanted to hold off until after the US elections, but a coalition of 40 small island countries blocked the delay -- several of them, including the Marshall Islands, Kiribati, and Tuvalu in the South Pacific, are already sinking into the sea, literally becoming "submerging markets."
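(As promised above, a quick arithmetic sketch of why Russia's ratification became pivotal once the US dropped out. The two percentage shares are approximate 1990 baseline figures, recalled for illustration rather than taken from this post.)

```python
# Why Russia's ratification became pivotal for the Kyoto Treaty.
# Entry into force required ratifying industrialized (Annex I) countries to account
# for at least 55% of that group's 1990 CO2 emissions.
# Shares below are approximate 1990 figures, cited from memory for illustration only.
threshold = 55.0
us_share = 36.1
russia_share = 17.4

max_without_us = 100.0 - us_share   # ceiling if every other industrialized country ratifies
print(f"Ceiling without the US: {max_without_us:.1f}%")                            # ~63.9%
print(f"Ceiling without the US and Russia: {max_without_us - russia_share:.1f}%")  # ~46.5%
print("Entry into force without the US requires Russia:",
      max_without_us - russia_share < threshold <= max_without_us)                 # True
```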
BACKGROUND - THE PENTAGON'S GLOBAL WARMING STUDY
On top of all this, we now have this week's dramatic leak of a new Pentagon analysis of the national security implications of global warming. According to this report, which The Guardian described as "secret," the implications would be nothing short of catastrophic. Indeed, according to The Guardian, "The few experts privy to its contents ... (say) (t)he threat to global stability vastly eclipses that of terrorism."
Since this report collides head on with the White House's antipathy toward the whole concept of global warming, and also undermines its case for the primacy of fighting terrorism, the Pentagon has a few strategic challenges to sort out. It will be helpful for us to understand the origins of the report.
INVISIBLE MAN
According to The Guardian, the Pentagon global warming study was undertaken at the instigation of its 82-year-old in-house futurist, Andrew W. Marshall. Marshall is a life-long military strategist, one of the few who worked with such legendary war-hawks as Dr. Edward Teller and Albert J. Wohlstetter. Throughout the 1950s, Marshall worked at The Rand Corporation in Santa Monica as a cold war gamer. In 1969, he succeeded Dr. James Schlesinger as Rand's Director of Strategic Studies, when Schlesinger joined the first Nixon Administration.
During the next four years, Marshall authored what turned out to be one of the seminal works on US-Soviet strategy -- "Long-Term Competition with the Soviets: A Framework for Strategic Analysis," published in 1972. This report basically ported the whole concept of competitive strategy to the world of military planning. At the Pentagon, which had heretofore evaluated programs and budgets in terms of narrow, technical criteria rather than their contributions to strategic value, this approach was considered revolutionary. In May 1973, Schlesinger, who had just become Nixon's Secretary of Defense, appointed Marshall to be the Director of the Office of Net Assessment, a new post in the Office of the Secretary of Defense that assumed responsibility for long-run military strategy. Marshall has held this post more or less continuously ever since.
In this capacity, Marshall has reportedly exerted enormous influence, as a kind of éminence grise -- the equivalent of George F. Kennan, the State Department's resident intellectual and policy planner during the 1950s -- only with twice the tenure. Marshall based his longevity not only on strategic insights, but also on political skills -- he was content to stay in the shadows, bringing others along and helping them to succeed. Over time, he cultivated a loyal group of increasingly influential Pentagon officials, many of whom later converged on the second Bush Administration.
This group is often referred to collectively as "neoconservatives." This is really a misnomer -- there is nothing at all about their policies that is "conservative." A more accurate term is "ultra-imperialists," or simply, ultras. Among the best known are Rumsfeld, Paul Wolfowitz, Richard Perle, Eliot Cohen, and James Roche, the Secretary of the Air Force.
Partly through Marshall's influence, this group came to share several strong beliefs about national security.
- They have all regarded 'long-term competitive military strategy" as a serious, high-minded intellectual enterprise -- a rational endeavor that one could count on for useful results.
- They have all generally believed that the demise of the Soviet Union, their old long-term "enemy," owed a great deal precisely to this kind of rational military and economic competitive strategy -- as implemented by Ronald Reagan during the 1980s. It was no longer just a theory; its value had been proven in combat.
- Most of the ultras have also shared Marshall's boundless technological optimism -- his confidence in the capacity of the US economy and technology to provide a continuing, even growing competitive advantage over potential rivals.
- They saw the US as a largely innocent "democracy," with clean hands and high principles. In their view, the US has never had a desire to possess or occupy other countries -- well, at least not since 1946, that is, when the US occupation of the Philippines formally ended. It only wished for other countries to develop "free market" economies, which it saw as a guarantor of peace, development, and prosperous trade for all concerned. As such, they believed that the US had every moral right to leverage its superior powers to its advantage, regardless of what the rest of the world might think. It had, first of all, the absolute right to act in its own (perceived) defensive interests. It had, moreover, the right to act on behalf of other important interests that it might deem necessary, even unilaterally.
As the novelist Graham Greene once said, "No country has had better motives for all the damage that it does."
- The ultras also were traditionally quite proud of the fact that, unlike many of its enemies (especially the Soviets, the Chinese Communists, the Cubans, and so forth), the US has always maintained a relatively open society, with relatively free and open borders, a long history of welcoming immigrants regardless of financial means, or (with notable exceptions) even national and ethnic origins, and relatively modest police controls on ordinary citizens that were in any case subject to a very strong bill of rights.
These shared values are important for us to understand, because every single one of them is now being called into question, not through abstract disputations, but by the new harsh realities that the US faces on the ground. This is evident, not only in the Pentagon's recent experiences in Iraq, Afghanistan, and the "global war on terrorism." It is also evident in the recent immigration crisis, occasioned by the growing tide of immigrants, mainly from Mexico and Central America, that has recently crossed our borders. And it is also evident in the challenges noted in the Pentagon's recent global warming study, which has profound implications for all these other problems.
OTHER PROPENSITIES
Along the way, there were also many other Marshall sympathizers whose motives were perhaps a little less high-minded than those who had been his intellectual comrades and proteges. These included many leading US defense contractors and their Congressional allies. Over time, Marshall's ONA developed strong, mutually beneficial ties to such key constituencies, and provided on-the-job training to a steady flow of future top industry executives and Congressional staffers.
In his procurement recommendations, Marshall also tended to err on the side of perceiving huge threats that -- quite coincidentally, of course -- almost always required extremely costly, technology-intensive weapons systems, from anti-ballistic missile systems, precision-guided missiles, remote sensing, and meteorological manipulation to unmanned combat vehicles, holographic projectors, sea-bed robotics, and particle beam weapons. As the Batman's Joker once said, "Where do they get all those FABULOUS TOYS?"
Of course, most of Marshall's activities took place behind closed doors. (See the recent Submerging Markets white paper on Intelligence Failures.) So we only have a few snippets in the public record to help us assess his performance, like his reported exaggeration of the continued Soviet threat in the early 1990s, and his agonizing search for a worthy successor to the Soviet Empire, for which he ranged from China to North Korea, and finally, with the help of fellow ultras like Bernard Lewis and Samuel P. Huntington, ended up with (the somewhat confusing blend of) the "Islamic fundamentalist horde" and the "Axis of Evil."
But there is at least one good publicly accessible example of Marshall's appetite for expensive, hare-brained technologies -- one of his most recent fetishes, "bio-engineered soldiers." This involves the use of behavior-modifying drugs to achieve specific battlefield conduct. (I am not making this up.) As he observed in a rare public appearance at the University of Kentucky in August 2002,
“The drugs would affect specific receptors and would act just like the internal chemistry (of the brain). We could create fearless soldiers, soldiers that would stay awake longer or be quicker and more alert... These new types of drugs or biochemical agents could create a new model of man."
For Mr. Marshall, apparently "the war on drugs" meant "(DOING) the war on drugs"! One would of course suppose that he must have discussed this loony idea -- which would open the door to all sorts of misbehavior -- with Rumsfeld, his immediate boss, who, after all, had in the late 1970s served as the CEO of GD Searle, one of the nation's largest drug companies. Evidently the boss did not discourage him from these meditations. The mind boggles at the prospect of thousands of young men and women, no longer consciously serving their country as proud citizens, with honor and dignity, but "doped up," marching fearlessly and slavishly into battle, doing whatever they're told......
Of course, if Marshall was willing to ponder this kind of policy in public, just imagine the other flights of fantasy that might be available to those with the security clearances to see them! (Admiral Poindexter, where is thy sting!) Given what we do know, it is not really surprising that Marshall was almost ousted in the late 1990s by President Clinton's Defense Secretary, William Cohen -- a sober Maine Republican. The dismissal was reportedly avoided at the last minute by way of Marshall's many friends in Congress and the defense industry, plus the neocon press, which portrayed him as lying awake nights, worrying about defending our freedoms, not about how to induce killing sprees by the infantry with pharmaceuticals.
After President George W. Bush took office in 2001, Rumsfeld became Marshall's boss again, and soon placed him in charge of a strategic panel that was one key part of a fundamental rethink that Rumsfeld described, with typical modesty, as a "Revolution in Military Affairs" (RMA).
One might have thought that, even apart from his age and eccentricities, Andrew Marshall might not have been the best choice to lead such a strategy validation, especially after 9/11. After all, he'd spent thirty years designing strategies for a very different kind of adversary. The contrasts between the Soviets and the new global threat environment were many:
- "Competitive strategy" was much easier to define when the conflict was among more or less symmetrical "hegemons" like the US, China, and the Soviet Union. When opponents are not battling for nations, but for the vindication of ideas, movements or deeply-felt antipathies, and are disbursed across the globe rather than concentrated in a few countries, notions like "competition," "wartime," "combatant," "preemption," "deterrence," "victory," and "power" are no longer well-defined.
- Arsenals of conventional "anti-state" weapons, like jets, aircraft carriers, missiles, and tanks, are designed to destroy fixed positions, attack large groups of mobile forces, or wipe out concentrations of troops and seize territory. These may no longer be decisive against the latest post 9/11 generation of adversaries. At the same time, they can easily become resource sinkholes, because of their "semi-custom" production economics and very high maintenance and logistics costs.
- "High technology," can easily become a narcotic, while "low technology" can be surprisingly effective -- partly just because relying on it "enforces" creativity. The limiting case here is of course the box cutter and the hijacked plane. But once an enemy has defined "victory"as simply being able to disrupt civilian society, the list of potential "weapons of massive-enough destruction" becomes endless. Yhe cost of defending against all the endless possibilities also becomes prohibitive, so that even "successful" defense is bittersweet.
- In this context, Marshall's conventional "competitive strategy/scenario planning" apparatus of the Cold War period had become a clear disability, probably as early as the mid-1990s, and certainly by the end of the 1990s. Similarly, "strategic planning" in the private sector also went the way of all flesh in the 1990s, for most large companies. In the private sector, when such practices ceased to be productive, there were at least some natural forces that encouraged them to disappear -- though even there, many companies failed to move quickly enough. (Viz. AT&T, Polaroid, Xerox, etc.) In the context of the massive Pentagon bureaucracy, with its hundreds of thousands of staff, government regulations, security procedures, restrictions on hiring, limited performance bonuses, restrictions on firings and transfers, and endless red tape, casting such entrenched practices aside in favor of greater focus on creativity, rapid adaptation, and innovation is almost impossible.
In effect, these bureaucratic "diseconomies of scale" go a long way toward evening the odds between the "1-bullet guerilla" and the entire US military. One imagines poor Marshall, sitting in his Pentagon bastion, ruing the day that the enemy stopped being the mighty Red Army. Like Pogo, he had met the enemy, and he recognized the face.
Despite all these disabilities, Rumsfeld decided to rely on Marshall for the strategic panel of his RMA assessment. Marshall, in turn, must have realized that when it came to analyzing non-conventional threats like state-less terrorism or global warming, he needed to pull in some outside resources who were perhaps not so captive of traditional approaches. That set the stage for the production of the confrontational global warming analysis that has just now reached the light of day.
BACK TO THE FUTURISTS
To get a handle on such non-traditional issues, Marshall reached out to Peter Schwartz, a well-known "futurist," and the co-founder and Chairman of California's Global Business Network, now part of Cambridge-based Monitor Group. GBN's other co-founder and fellow futurist, Stewart Brand, was the author of the "Whole Earth Catalogue," and founder of the "Long Now Foundation," an organization devoted to extremely long-term thinking, including the construction of a 10,000-year clock. Schwartz, the elder of the Pentagon report's two authors, is not trained in environmental science, but he does have a B.S. in Aeronautical Engineering from Troy's Rensselaer Polytechnic. He also served as director of the Stanford Research Institute's "Strategic Environment Center," and a "Scenario Planner" for Royal Dutch Shell from 1982 to 1986, during the heyday of corporate planning, before GBN's creation in 1987. In addition to the Pentagon, Schwartz has also consulted to the CIA, DARPA, and many Fortune 500 companies. He's also advised Hollywood film-makers on the plots of several successful war/action films, including Deep Impact, War Games, Sneakers, and Tom Cruise's Minority Report.
Schwartz has also authored several books, including a 1991 best seller on "scenario planning," "The Art of the Long View." In 1999 he published a less fortunate book, "The Long Boom," co-authored with Peter Leyden and Joel Hyatt, in which they predicted "25 years of uninterrupted economic growth and prosperity." Of course, as we now know, this prediction was undermined by the global recession that started just one year later.
However, this did not deter Schwartz from continuing to pursue long-range planning and analysis. In an interview associated with the publication of his latest book, Inevitable Surprises (June 2003), he still maintains that his "long term boom scenario" will hold up, at least over the next half century. And while there will always be shocks and surprises, he still sees great value in scenario planning -- according to him, "September 11 was the most predicted event in history."
For purposes of the Pentagon report on global warming, Schwartz teamed up with Doug Randall, a Wharton graduate and a GBN "senior practitioner," who also had no environmental science training. This was not their first collaboration. In an April 2003 article in Wired Magazine, they argued that the US Government should undertake a massive 10-year, $100 billion program to develop hydrogen power as a substitute for imported oil.
That timetable is much more aggressive than the "several decades" that many other experts regard as necessary to develop the economical fuel-cell technology and hydrogen distribution systems required for basing mass transportation on hydrogen. But this difference of opinion may really just derive from the fact that, unlike Peter Schwartz, most of the other experts have not invested "in two companies that are now developing hydrogen power." Apparently in this instance, President Bush agrees with Schwartz, because he has also recently advocated the development of fuel-cell-based "Freedom Cars" as an alternative to requiring any better fuel efficiency from car manufacturers now.
DIRE STRAITS
The "secret" Pentagon report produced by the two GBN futurists is nothing if not dramatic. According to them, the world may now be headed for a period of profound, sudden, discontinuous changes in climate, with a possible reversal of the gradual recent trends toward warming, followed by rapid cooling and perhaps even a new ice age in much of the world. Among the many side-effects that all this might have:
- Flooding of the Dutch seacoast and the Hague as early as 2007;
- By 2010, the US experiences a third more days per year with peak temperatures above 90F.
- The imminent prospect of historically low mean temperatures in Western Europe, including "Siberia-like" conditions in the UK by 2020;
- Large-scale famines in southern Africa, India, and China;
- Acute water shortages in the Middle East, the Amazon Basin, and the Nile Delta;
- The likelihood that the US and Europe may become "virtual fortresses," to prevent inundation by millions of destitute immigrants from the increasingly-uninhabitable Third World, where the lives of more than 400 million people become at risk.
- Low-lying countries like Bangladesh become virtually uninhabitable.
- As international tensions over food and water increase, there are much greater incentives for countries like Japan, Germany, and South Korea to acquire nuclear weapons, and to use them.
Not surprisingly, this scenario lines up almost exactly with the pro-hydrogen logic that Schwartz has recently been propounding around the country and in his recent book. But it does appear to be a bit too choppy to reconcile with his other favorite scenario, the vintage 1999/03 "long-growth boom."
In any case, the disturbing portrait provided by Schwartz and Randall of the possible downsides of global warming is not likely to curry much favor with the Bush White House, or with other persistent critics of global warming theory. After all, the "secret" Pentagon report on global warming has appeared just five months after the Environmental Protection Agency, at the instruction of the White House, deleted the entire chapter on global climate change from its annual report on air pollution, and for the first time in six years made no reference at all to the problem in that report. Perhaps the Administration's insouciance explains why the Pentagon report was leaked in the first place -- certainly it would have done little good, locked up forever in some classified vault. The leak probably would also not have harmed the stock prices of certain hydrogen-related investments -- assuming there are any.
All told, the report does offer a pretty nightmarish set of scenarios. Less polite commentators might also apply words like "pseudo-scientific." Evidently there was no real effort here to build a complex forecasting model, and no way to validate the scenarios that were constructed, other than to double-check their internal consistency. Even if there had been an effort to construct a full simultaneous-equation system, our actual knowledge of underlying natural and economic relationships is often so weak that the game is not worth the candle. One is reminded of the disparity in forecasting performance between the huge, complicated, multi-equation econometric models that try to specify detailed relationships about what is really going on, and simple one-line autoregressive models -- the latter routinely outperform the former. So "theory" is neither necessary nor sufficient for prediction. And the Pentagon report, as Schwartz is wont to say, is happy just to provide "scenarios," not forecasts.
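(For readers who want to see what a "simple one-line autoregressive model" looks like in practice, here is a minimal sketch -- purely illustrative, fit to synthetic data rather than to any real climate or economic series.)

```python
import numpy as np

# Minimal AR(1) forecaster, fit by ordinary least squares.
# Illustrative only: the series below is synthetic, not real data.
rng = np.random.default_rng(0)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.8 * y[t - 1] + rng.normal()        # true process: y_t = 0.8 * y_{t-1} + noise

# Fit y_t = a + b * y_{t-1} on the first 150 observations.
X = np.column_stack([np.ones(149), y[:149]])
a, b = np.linalg.lstsq(X, y[1:150], rcond=None)[0]

# One-step-ahead forecasts for the remaining hold-out points.
forecasts = a + b * y[150:-1]
rmse = np.sqrt(np.mean((y[151:] - forecasts) ** 2))
print(f"estimated lag coefficient: {b:.2f}, hold-out RMSE: {rmse:.2f}")
```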
Despite this limitation, a good hard-hitting, logical scenario can be very useful as a way of galvanizing public attention. At this point, pending the declassification and release of the full study, it is impossible to judge its real quality. Still, perhaps Andrew Marshall really just wanted enough "meat on the bone" to make his underlings think, call attention to the wide range of potential outcomes, or -- who knows -- perhaps even to toss a bone to the President's opponents, for reasons of their own. I suspect that what the Pentagon planners really got for their money was not much more than a wild-eyed Hollywood script and a few days of media attention for their long-run thinking. Beyond that, they almost certainly did obtain a release from the straitjacket of "competitive strategy" and their really quite restrictive ultra assumptions.
CONCLUSION
So what do we conclude from all this? Stepping back from the Pentagon report's apocalyptics, it does concur, in broad strokes, with the growing sense of urgency among many professional scientists about global warming, and our own sense that the case for taking action is now stronger than ever.
For example, the UK's chief science advisor, Professor Sir David King, also stated just last month that he now sees global warming as a much larger threat than terrorism, and he condemned the Bush Administration for "failing to take up the challenge of global warming."
Whether we really needed the "graphic arts" of Schwartz and Randall's detailed scenarios to drive this home is not clear. The point is that the time for preventive action is here.
Unfortunately, this being a US election year, with many people still preoccupied with jobs, health insurance, Social Security, and the costs of education, let alone Iraq and terrorism, we are unlikely to find many politicians who are willing to give this issue top billing. After all, they'd have to start with the basic fact that, with just 4 percent of the world's population, the US still generates at least 20-25 percent of the world's greenhouse gas emissions. And then they'd have to move on to discuss the purgative diet of tax increases, emissions controls, other regulations, and new investments that would be required to cut this fraction significantly. Having failed to tackle this issue for so long, through Democratic and Republican Administrations alike, by the time we get around to it, the solution will no doubt be very costly. The only consolation is that if there is anything to the Pentagon scenarios, the alternatives could be even worse.
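(The disproportion is easy to make concrete. Using only the two shares quoted above, and nothing else, the implied per-capita comparison works out roughly as follows.)

```python
# Implied US per-capita emissions relative to the world average,
# using only the shares quoted in the paragraph above.
population_share = 0.04            # US share of world population
emissions_share = (0.20, 0.25)     # US share of world greenhouse gas emissions

low = emissions_share[0] / population_share
high = emissions_share[1] / population_share
print(f"US per-capita emissions are roughly {low:.0f}-{high:.0f} times the world average")
# -> roughly 5-6 times
```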
March 1, 2004 at 07:00 AM | Permalink | Comments (2) | TrackBack
Monday, February 09, 2004
Intelligence Failures -- A Proud Tradition?

Similar concerns are also being muttered in France, Germany, Israel, and Russia, whose agencies all reportedly reached similar conclusions about Saddam’s WMD stockpiles.
Before this latest flurry, there was also the flap over NSA/MI6 spying on UN Security Council members, the bogus Niger uranium documents, the failure to track down Bin Laden, and, of course, the mother of all intelligence failures, 9/11.
One might have hoped for slightly more accuracy from all these countries, at least with respect to Iraq. After all, Saddam did not acquire his WMDs from Pakistan. Except for Israel, it was these same countries, plus the US and the UK, that were largely responsible for providing him WMD technology in the first place.
In any case, under acute pressure, President Bush has now courageously decided to appoint yet another Presidential Commission, one of a half dozen that he has created to shuttle fundamental policy issues to one side. And the nine-lived, unabashed George Tenet has even resorted to defending the CIA in public – a daunting task, given his track record. Evidently he must have indispensable knowledge of something, even if it is not WMDs or terrorism.
What can we conclude from this fiasco, other than empty placebos like “try harder,” “get better sources,” or Tenet’s prosaic summary – “we were not completely right, but we were not completely wrong”?
If such errors were randomly distributed, one would expect that these agencies would occasionally drop the ball. But their long-run track record actually reveals that such monumental intelligence failures are nothing new. Indeed, there seems to be a systematic bias toward producing them.
THE DISMAL TRACK RECORD
On matters of signals intelligence, where it just comes down to, say, monitoring international wire transfers or Chinese conversations with Pakistani proliferators, presumably the errors have been less frequent – though even there, a variety of new communications technologies are making the task much more difficult. And on the operations side, they may be good at the occasional "dirty trick," though there, of course, the track record is also filled with screwups.
But the track record on what we might call “strategic insight” has been downright dreadful. As the following list shows, especially where the signals intelligence is weak and real political or economic insight is called for, our “intelligence” agencies seem to have missed almost every critical strategic turning point in recent history. Like the proverbial whiz kids, these folks are “very smart and (almost) always wrong.”
Given this sorry track record, which the UK’s MI6 appears to have duplicated, we just may be tempted to agree with the UK’s Prime Minister Harold Macmillan, who once snapped, “Why don’t we just exchange secrets every week with (our enemies), and skip all the fucking guesswork?”
In the case of US agencies, in addition to the recent failures already cited, there were also:
- 1. The 1998 bombing of the pharmaceutical plant in the Sudan and the 1999 bombing of the Chinese Embassy in Belgrade;
- 2. The failure to predict the acquisition of nuclear weapons by India and Pakistan in the 1990s;
- 3. The failure to predict Iraq’s 1990 invasion of Kuwait;
- 4. The failure to anticipate the rapid demise of Portuguese colonialism and South African apartheid, and the Soviet Union in the 1980s;
- 5. The spurious “second missile gap” – the overestimation of Soviet nuclear weapons strength in the early 1980s;
- 6. The famous October 1978 estimate by the Defense Intelligence Agency, three months before the Shah of Iran’s fall, that “the Shah is expected to remain actively in power over the next ten years”;
- 7. Innumerable mispredictions of the prospects for Communist victory in Vietnam;
- 8. The failure to anticipate that the Soviets would deploy nuclear weapons in Cuba in 1962;
- 9. The notorious expectation that the Cuban masses would rise up to support the 1961 Bay of Pigs invasion;
- 10. The spurious “first missile gap” in the late 1950s;
- 11. The original WMD underestimate – the failure to predict the Soviet Union’s acquisition of nuclear weapons in the late 1940s.
- December 1941’s Pearl Harbor doesn’t count, since there were no intelligence agencies around yet – one of the reasons for establishing them was to avoid such blunders.
At the risk of short-circuiting President Bush’s new Commission, let me suggest that much of the systematic bias toward such strategic blunders derives from deep-seated institutional problems. Among the key culprits:
- The pool of talent that is attracted to careers in intelligence is pretty thin to begin with – Exhibit A being Sr. Tenet himself. My hunch is that if this fine fellow were forced to compete in the private sector, he would wind up as the Director of Competitive Strategy for a very small casino in Nevada. Over time, as the agencies have become larger and more bureaucratic, this problem has no doubt increased -- “intelligence analysis” has been reduced to a lifeless, formulaic process, and those who rise to the top and survive are likely to be politically-astute bureaucrats, not creative analysts.
- That diminishing talent pool has been scattered across more than a dozen warring bureaucracies, including CIA, DIA, NSA, NIMC, and the various service intelligence units. This makes it even more difficult for any individual agency’s boss to stand up and resist political pressures.
- The permeation of the community with “closed source” mythology. "Community" does indeed appear to be the wrong word here -- "den of back-biting snakes," "cat house," or "Dodge City" may be a bit closer to what we're dealing with. As a representative of the (wonderfully harmonious and cooperative) journalistic ("open source") approach to understanding the world, I’d love to see a footrace on any given issue between the agencies, with all their secret sources, and a handful of top-flight investigative reporters.
Unfortunately, these problems are unlikely to be solved by one or two commissions or Congressional investigations, or in just a few years. Even a very public firing of Tenet, while deeply gratifying to many people, would only be a superficial response. For the foreseeable future, the only real antidote may be to elect a new President with some real "carrying capacity" -- some real depth to his own understanding of foreign policy -- who is also astute enough to realize that the country is ill-served by having the intelligence agencies become lap-dogs for his preconceived ideas.
(For a helpful i