Al Qaeda in Iraq: Bush’s Creation

by Bill Gallagher

Global Research, July 7, 2007

President George W. Bush’s political capital is about as low as it can go, with only dead-end Bushists clinging to his failed regime. The erosion of support, however, can actually make the madman even more isolated from reality, arrogant and impetuous.

The final 18 months of his presidency will be an increasingly dangerous time for the world. Bush is wrapping himself in his messianic blanket, still bound to convince the infidels at home and abroad that he is a gifted visionary who can reshape the Middle East.

Vice President Dick Cheney makes Dr. Strangelove seem like Gandhi. Cheney operates above the Congress, the Constitution, the law and human decency — at times, above the presidency. He does as he pleases and is answerable to no one.

Bush is not nearly clever enough to sort through or keep up with Cheney’s Machiavellian machinations. The president is so lazy and incurious, he’s more than willing to let Cheney do his dirty work. Whether it is approving torture, illegal wiretapping, concentration camps and kidnappings, or coddling corporate polluters, Cheney is ready to nod OK.

The poll numbers are encouraging, as Americans see through the lies and conclude — tragically, too late — what a mess we are in. The percentage of Americans who believe the war in Iraq was a mistake is at an all-time high, as is the percentage of those who say continued U.S. military action there is not morally justified.

A CBS/New York Times poll shows 23 percent of Americans believe the war is going well, while 76 percent say the war is going “badly.” The critical question politicians carefully watch: “Do you believe the country is heading in the right direction?” The nation is on the wrong track, according to 72 percent of respondents. That’s the highest number since the poll started asking that question in 1983.

The numbers show people are generally gaining a better understanding of the disastrous course our nation is on and the serial failures of the Bush administration. There are exceptions, which seem baffling at first but find explanation in the administration’s talking points — invariably lies and distortions — dutifully echoed in the mainstream media.

A “Newsweek” magazine poll conducted by Princeton Survey Research Associates International posed the question, “Do you think Saddam Hussein’s regime in Iraq was directly involved in planning, financing, or carrying out the terrorist attacks of September 11, 2001?”

A staggering 41 percent answered yes. That’s actually a 5 percentage point increase over the same question asked in 2004. It’s hard to imagine that 4 out of 10 Americans could be that uninformed or flat-out stupid. Another poll shows 3 out of 10 Americans still approve of Bush’s job performance and his handling of the war in Iraq.

The growth in the number of sorry souls buying the Saddam-Sept. 11 lie may be the result of the word games the White House and Pentagon use to sell the failed surge and the futile occupation of Iraq.

When people hear “al-Qaeda,” it’s natural that they think of Osama bin Laden and the Sept. 11 attacks. The insurgency, sectarian violence and opposition to the U.S. occupation in Iraq are not about fighting al-Qaeda, but that’s how Bush’s fiasco there is being branded.

McClatchy Newspapers’ Baghdad correspondent Mike Drummond exposed the sinister rhetorical shift, noting in a recent report, “U.S. forces continue to battle Shiite militia in the south, as well as Shiite militia and Sunni insurgents in Baghdad. Yet America’s most wanted enemy at the moment is Sunni al-Qaeda in Iraq. The Bush administration’s recent shift toward calling the enemy in Iraq ‘al-Qaeda’ rather than an insurgency may reflect the difficulty in maintaining support for the war at home more than it does the nature of the enemy in Iraq.”

In a major speech at the National War College last week, Bush mentioned al-Qaeda 27 times. McClatchy’s Jonathan Landay reports, “Bush called al-Qaeda in Iraq the perpetrator of the worst violence racking that country and said it was the same group that carried out the 9/11 attacks.”

Pure crap. Al-Qaeda has become a generic name, in many cases a self-proclaimed label for foes of the U.S. invasion. Since some of the Iraqi Sunnis started calling themselves “al-Qaeda in Iraq” or “al-Qaeda in Mesopotamia,” the Bush propaganda ministry saw an opportunity to conflate the invasion of Iraq with the Sept. 11 attacks.

Bush went on to claim, “Al-Qaeda is the main enemy for Shia, Sunni and Kurds alike. Al-Qaeda is responsible for the most sensational killings in Iraq. They’re responsible for the sensational killings on U.S. soil.” Those are demonstrable lies. Bush hardly mentioned sectarian violence. U.S. military intelligence even disputes Bush’s wild claims. All of this is intentional and meant to deceive and distort.

The Iraq Study Group and intelligence estimates placed the number of Iraqi insurgent forces at approximately 20,000 combatants. At most, 5 percent of that number are foreigners fighting in Iraq, nearly all going there in response to the U.S. invasion. Some are Iranians supporting Shiites; others Sunnis from Syria and Saudi Arabia supporting Iraqi Sunnis. Nearly all come from nations with strategic interests in the region and want to help their ethnic or religious comrades.

The lightning rod for these expeditionary forces is the presence of U.S. forces in the heart of Islam. Bin Laden, hiding in the mountains of northern Pakistan, just sits back and relaxes, enjoying the bloody spectacle and the gift to radical Islam Bush brought him.

The true ties involving Iraq, deadly chemical weapons and the United States rarely get a mention. Saddam’s allies in killing tens of thousands of Kurds included Ronald Reagan, George H.W. Bush, Donald Rumsfeld, Robert Gates and others. Reagan as president, Bush as vice president, Rumsfeld as special envoy to the Middle East and Gates as a senior CIA officer all provided help and support for Saddam’s murderous assaults on his own people.

The Iraqi Special Tribunal has sentenced Saddam’s cousin Ali Hassan al-Majid to death. Known as “Chemical Ali,” he was the point-man in using chemical weapons first on Iranian troops during the Iraq-Iran war, and then later on Iraqi Kurds supporting the Iranians.

On the Smirking Chimp Web site, Barry Lando described how, when Ali’s sentence was read on June 24, “all the key players in the media were there to capture the dramatic courtroom scene. What none of the reporters mentioned, however, was that when Saddam and Chemical Ali and the rest of the Saddam killers were doing their worst, the U.S. governments of Ronald Reagan and later George Bush Senior were their de facto allies, providing them with vital satellite intelligence, weapons and financing, while shielding them from U.N. investigations or efforts by the U.S. Congress to impose trade sanctions for their depredations.”

Robert Parry, who broke many of the Iran-Contra stories for AP, wrote, “Hussein’s silence was golden for the international arms dealers who supplied his regime and the foreign officials who facilitated the shipments.” So when the “brutal dictator” — as George W. liked to call him — was sent to the gallows, Bush the Elder, Rumsfeld and Gates “were among those who could breathe a little easier after the hangman’s noose had choked the life out of Hussein,” Parry concluded.

Bin Laden never had any kind of relationship with Saddam, but many intimates of our president did. So far, they have been able to choke the life out of that truth.

Bill Gallagher, a Peabody Award winner, is a former Niagara Falls city councilman who now covers Detroit for Fox2 News. Niagara Falls Reporter


© Copyright Bill Gallagher, 2007


The Crashing U.S. Economy Held Hostage

Our Economy is on an Artificial Life-support System

by Richard C. Cook
Global Research, July 7, 2007

Remember when the U.S. was the world’s greatest industrial democracy? Barely thirty years ago the output of our producing economy and the skills of our workforce led the world.

What happened? It’s hard to believe that in the space of a generation our character and capabilities just collapsed as, for example, did our steel and automobile industries and our family farming. What then are the causes of the decline?

Here’s how I would put it today: our economy is on an artificial life-support system, a barely-breathing hostage in a lunatic asylum. That asylum is the U.S. and world financial systems which are on the verge of collapse.

The inmates are the world’s central bankers, along with most of the financial magnates big and small. The fact is that the economy of much of the world is in a decisive downward slide which the financiers cannot stop because the systems they operate are the primary cause. As often happens, the inmates rule the asylum.

The problems aren’t confined to the U.S. Unemployment worldwide is increasing, debt is rampant, infrastructures are crumbling, and commodity prices are rising.

In such an environment, crime, warfare, terrorism, and other forms of violence are endemic. Only the most naïve, self-centered, and deluded jingoist could describe such a scenario in terms of the freedom-loving Western democracies being besieged by the “bad guys.”

Rather, what is happening highlights the growing failures of Western globalist finance, whose impact on political stability has been so corrosive. As many responsible commentators are warning, we are likely to see major financial shocks within the next few months. The warnings are even coming from high-flying institutional players like the Bank for International Settlements and the International Monetary Fund.

We may even be seeing the end of an era when the financiers ruled the world. At a certain point, governments or their military and bureaucratic establishments are likely to stop being passive spectators to the onrushing disorder. It is already happening in Russia and elsewhere.

The countries that will be least able to master their own destiny are those like the U.S. where governments have been most passive to economic decomposition from actions of their financial sectors. The financiers are the ones who for the last generation have benefited most from economies marked by privatization, deregulation, and speculation, but that may be about to change. Whether the change will be constructive or catastrophic is yet to be seen.


Within the U.S., foreign investors, above all Communist China, have been propping up our massive trade and fiscal deficits with their capital. To keep them happy, interest rates—after six years of “cheap credit”—must now be kept relatively high. Otherwise the Chinese might bail out, leaving us to fend for ourselves with our hollowed-out shell of an economy.

Even so, these investors are increasingly uneasy with their dollar holdings and are bailing out anyway. Foreign purchase of U.S. securities has plummeted. And our debt-laden economy, where our manufacturing base has been largely outsourced, is no longer capable of providing our own population with a living by utilizing our own productive resources.

For a while we were floating on the housing bubble, but those days are now history. According to a Merrill Lynch study, the artificially pumped-up housing industry accounted, as late as 2005, for fifty percent of U.S. economic growth.

As everyone knows, the Federal Reserve under Chairman Alan Greenspan used the housing bubble, like a steroid drug, to pump liquidity into the economy. This worked, at least for a while, because consumers could borrow huge amounts of money at relatively low interest rates for the purchase of homes or for taking out home equity loans to pay off their credit cards, finance college education for their children, buy new cars, etc.

When the final history of the housing bubble is written, its beginnings will be dated as early as 1989-90, when credit restrictions on the purchase of real estate first began to be eased. According to mortgage industry insiders interviewed for this article, they began to be taught the methods for getting around consumers’ weak credit reports and selling them homes anyway in the mid to late 1990s.

The Fed started inflating the housing bubble in earnest around 2001, after the collapse of the dot-com bubble in the stock market decline of 2000-2002. Then, over a trillion dollars of wealth, including working people’s retirement savings, simply vanished.

Also according to mortgage specialists, it was in March 2001, two months after George W. Bush became president, that a “wave of intoxicated fraud” started. Mortgage companies began to be instructed, by the creditors/lenders, on how to package loan applications as “master strokes of forgery,” so that completely unqualified buyers could purchase homes.

There could not have been a sudden onset of industry-wide illegal activity without direction from higher-up in the money chain. It could not have continued without reports being filed by whistleblowers with regulatory agencies. Today the government is prosecuting mortgage fraud, but they certainly had to know about it while it was actually going on.

The bubble was coordinated from Wall Street, where brokerages “bundled” the “creatively-financed” mortgages and sold them as bonds to retirement and mutual funds and to overseas investors. Portfolio managers were directed to buy subprime bonds as other bonds matured. It’s the subprime segment of the industry that has now collapsed, triggering, for instance, the recent highly-publicized demise of two Bear Stearns hedge funds.

And it’s not just lower-income home purchasers who are affected. The Washington Post has reported that for the first time in living memory foreclosures are happening in Washington’s affluent suburban neighborhoods in places like Fairfax, Loudoun, and Montgomery Counties.

The subprime bonds were known to be suspect. One reason was that they were based on adjustable rate mortgages that were actually time bombs, scheduled to detonate a couple of years later with monthly payments hundreds of dollars higher than when they were written. Many of these mortgages will reset to higher payments this October.
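The size of those payment hikes follows from the standard fixed-rate amortization formula. The numbers below are purely illustrative — a hypothetical $300,000 loan whose 4.5 percent teaser rate resets to 7.5 percent — but they show how a reset of a few percentage points adds hundreds of dollars to the monthly bill:

```python
def monthly_payment(principal, annual_rate, months):
    """Standard amortization formula: P * r / (1 - (1 + r)**-n), r = monthly rate."""
    r = annual_rate / 12
    return principal * r / (1 - (1 + r) ** -months)

loan = 300_000                              # hypothetical loan amount
teaser = monthly_payment(loan, 0.045, 360)  # introductory "teaser" rate
reset = monthly_payment(loan, 0.075, 360)   # rate after the ARM resets
print(f"teaser ${teaser:,.0f}/mo, reset ${reset:,.0f}/mo, "
      f"jump ${reset - teaser:,.0f}/mo")
```

On these assumed terms the payment climbs from roughly $1,500 to roughly $2,100 a month, consistent with the “hundreds of dollars” the article describes.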

Purchasers were lied to when they were told they could re-sell their homes in time to escape the payment hikes. Now the collapse of the market has made further resale at prices high enough to escape without losses impossible.

One way the system worked was for mortgage lenders to maximize the “points” buyers were required to finance, making the mortgages more attractive to Wall Street. Of course bundling and selling the mortgages relieved the banks which originated the loans from exposure, pushing a considerable amount of the risk onto millions of small investors. This was in addition to the normal sale of mortgages to quasi-public agencies like Freddie Mac and Fannie Mae.

Was it a scam? Of course. Did the Federal Reserve know about it? They had to. Did Congress exercise any oversight? No.

What did the White House know?

Amy Gluckman, an editor of Dollars and Sense, reported in the November/December 2006 issue: “During the Clinton administration, Greenspan was relatively ‘unembedded’—averaging only one meeting per month at the White House….

“But when George W. Bush moved into 1600 Pennsylvania Ave., Greenspan’s behavior changed. During 2001, he averaged 3.3 White House visits a month, more than triple his rate under Clinton and much more often with high-level officials like Vice President Cheney. His visits rose to 4.6 a month in 2002 and 5.7 in 2003.

“Whatever White House officials were whispering in Greenspan’s ear, it worked: Greenspan abruptly changed his tune on tax cuts, lending critical support to Bush’s massive 2001 and 2003 tax giveaways, and he loosened the reins by cutting Fed-controlled interest rates repeatedly beginning in January 2001, a gift to the Republicans in power.”

Along the way, the bubble caused housing prices to inflate drastically, which officialdom touted as economic “growth.” Even today, periodicals like Barron’s naively boast that this inflation boosted Americans’ “wealth.”

But this source of liquidity for everyday people has been maxed out, like our credit cards, and there is nothing to replace it. There is no cash cushion anymore, because years ago people stopped earning enough money for personal or household savings.

As purchasers lose their homes to foreclosure, the real estate is being grabbed at bankruptcy prices by the banks and by any other investors with ready money. Whole neighborhoods of cities like Cleveland or Atlanta are turning into boarded-up ghost towns.

What we are seeing are the results of an economic crime on a fantastic scale that implicates the highest levels of our financial and governmental establishments. It spanned three presidential administrations—Bush I, Clinton, and Bush II—though the worst of it came with the surge of outright lending fraud after 2001.

As usual when hypocrisy is rampant only the small fry are being called to account. Commentators, including a sleepwalking Congress, have self-righteously railed at consumers who got in over their heads. The Mortgage Bankers Association is even lobbying Congress to allocate $7 million more to the FBI to go after the supposedly rogue brokers within their own industry who are being scapegoated.


But there’s much more to it than that. These bubbles are symptoms. They are created because our wage and salary earners lack purchasing power due to stagnant incomes and various structural causes. These causes include the outsourcing of our manufacturing industries to China and other cheap labor markets and the super-efficiency of the remaining U.S. industry which is able to manufacture products with ever-fewer workers.

Also, our farming, mining, and other resource-based industries are in a long-term slide. This and the decline of hard manufacturing have been going on since our oil production peaked in the 1970s, followed by the Federal Reserve-induced recession of 1979-83. Next came the deregulation of the financial industry. It was all part of the economic disintegration that led to today’s “service economy.”

Now, for the first time in modern U.S. history, there are no new economic engines at all. The last real engine was the internet which has now reached maturity with marginal players being weeded out.

Our biggest sources of new private-sector jobs today are food service, processing of financial paperwork, health care for the growing numbers of retirees, and menial low-paying jobs, like landscaping and building maintenance. These are increasingly being performed by immigrants who are also underpricing U.S. citizens in many service jobs like childcare and auto repair.

Today the rank-and-file of our population must increasingly turn to borrowing in order to survive. Only the banks and the credit card companies are the beneficiaries. The total societal debt for individuals, businesses, and government is over $45 trillion and climbing. This is happening even while the real value of wages and salaries is decreasing.

What I have just been saying is bad enough, but here’s where the real lunacy enters in.

A major factor connected to the decline in the value of employee earnings is dollar devaluation in the overarching financial economy due to the proliferation of huge quantities of bank credit being used to keep the stock market afloat and to fuel the speculative games of equity, hedge, and derivative funds.

In other words, while our factories continue to shut down, the Wall Street gambling casino—like its Las Vegas counterpart—is running full-bore, 24/7. This, along with financing of the massive federal deficit, is what critics are talking about when they speak of the Federal Reserve “printing money.”

The main growth factors for federal spending are Middle East war expenditures and interest on the national debt. But within the private sector it’s leveraged loans to businesses which The Economist recently said “mirror … interest-only and negative-amortization mortgages” in the subprime market. But here’s the big difference: in the leveraged business economy, the amount of assets at stake is even greater than with the housing bubble.

The financial world, which Dr. Michael Hudson calls the FIRE economy—Finance, Insurance, and Real Estate—has been producing millionaires and billionaires among those who know how to play the game.

The Wall Street hedge funds stand out as the most irresponsible financial scams in history. Unregulated and secretive, they account for a third of all stock trades, own $2 trillion in assets, and pay their individual managers over $1 billion a year. Think about this the next time someone you know has their job outsourced to China or their adjustable rate mortgage resets and drives their monthly house payment past the level of affordability.

The hedge funds borrow huge sums from the banks which generate loans under their Federal Reserve-sanctioned fractional reserve privileges. Often this money is used by the hedge funds to “short the market,” thereby earning profits when stock prices decline.

In other words, the hedge funds and their banking enablers use banking leverage to bet against the producing economy. In doing so, they may actually drive stock prices down, causing ordinary investors to lose a portion of their own wealth. Can this be called anything other than a crime?

The livelihood of much of the U.S. workforce and perhaps half of the rest of the world’s population—maybe three billion people—is being threatened by such financial lawlessness. The justification that was first used for financial deregulation and tax cuts for the rich was that the trickle-down effect of wealthy peoples’ earnings would spill over to the rank-and-file.

The Reagan administration ushered in these policies in the 1980s under the heading of “supply-side economics.” But the opposite has happened. The system has institutionalized an increasingly stratified worldwide culture of haves and have-nots.


How did today’s looming tragedy come to pass?

Looking for causes is like peeling an onion. What we are really seeing are the terminal throes of a failed financial system almost a century old. It’s happening because, since the creation of the Federal Reserve System in 1913—even during the period of the New Deal with its Keynesian economics aimed at full employment—our economy has been based almost entirely on fractional reserve banking.

This means that under the regime of the world’s all-powerful central banking systems, money is brought into existence only as debt-bearing loans. Interest on this lending tends to grow exponentially unless overtaken by real economic growth.
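That debt-based money creation can be sketched in a few lines. The 10 percent reserve ratio and $100 initial deposit below are hypothetical textbook figures, not claims about actual Federal Reserve requirements, but the loop shows how repeated re-lending multiplies a deposit toward ten times its original size — with every dollar of the new money owed back to a bank as a loan:

```python
reserve_ratio = 0.10   # hypothetical 10 percent reserve requirement
deposit = 100.0        # an initial $100 deposit
total_deposits = 0.0
for _ in range(200):   # each loan comes back to the system as a new, smaller deposit
    total_deposits += deposit
    deposit *= 1 - reserve_ratio  # the bank keeps 10 percent, lends the rest
print(f"bank money created from $100: ${total_deposits:,.2f}")
```

The geometric series converges toward $1,000, i.e. the initial deposit divided by the reserve ratio — the classic money-multiplier result.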

Remember that every instance of bank lending, from purchase of Treasury Bonds, to credit cards, to home mortgages, to billion-dollar loans to hedge funds for leveraged buyouts or sheer speculation, must eventually be paid back somewhere, somehow, sometime, by somebody, with interest. In the end, it all comes back to people who work for a living, whether in the U.S. or elsewhere, because that is the only way the world community ever creates real wealth.

In an anemic economy like that of the U.S., growth cannot catch up with interest in a deregulated financial marketplace where interest rates are high. Rates may not seem high compared with, say, the twenty percent-plus rates of the early 1980s, but they are high in an economy with, at best, a two percent GDP growth rate.
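The mismatch is easy to illustrate. Assuming, purely for the sake of the sketch, debt compounding at an average 6 percent against an economy growing at 2 percent, the debt-to-GDP ratio roughly triples within a generation:

```python
debt, gdp = 1.0, 1.0           # start with debt equal to one year's GDP
interest, growth = 0.06, 0.02  # hypothetical average annual rates
for _ in range(30):
    debt *= 1 + interest       # unpaid interest compounds
    gdp *= 1 + growth          # the producing economy grows more slowly
print(f"debt/GDP after 30 years: {debt / gdp:.2f}")
```

Any interest rate above the growth rate produces the same qualitative result; only the timescale changes.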

And they have been high on average since the 1960s, as the banking industry became increasingly deregulated. Interestingly, since 1965, the U.S. dollar has lost eighty percent of its value, which tends to validate the contention by some observers that higher interest rates not only do not reduce inflation, as the Federal Reserve contends, but actually cause it.
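As a quick sanity check on that figure: an eighty percent loss spread over the forty-two years from 1965 to 2007 corresponds to the dollar losing a little under 4 percent of its purchasing power per year, compounded:

```python
remaining_value = 0.20  # eighty percent of the 1965 dollar's value is gone
years = 2007 - 1965     # the period the article covers
annual_loss = 1 - remaining_value ** (1 / years)
print(f"average annual loss of purchasing power: {annual_loss:.1%}")
```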

The situation today is worse in many respects than 1929, because the debt “overhang” vs. real economic value is much higher now than it was then. The U.S. economy was in far better shape in the 1920s, because so much of our population was gainfully employed in factories or on farms.

The question is not when will the system start to come down, because this has already begun. It’s shown most clearly by the fact that according to Federal Reserve data, M1, the part of the money supply most readily available for consumer purchases, is not only lagging behind inflation but has actually decreased in eleven of the last twelve months. This means that the producing economy is already in a recession.

The federal government is trying to figure out what to do. Their biggest concern is that foreign investors have started to pull out of dollar-denominated markets.

The government’s “plunge protection team”—known officially as the President’s Working Group on Financial Markets—is trying to engineer what they call a “soft landing.” It’s been likened to the process by which you cook a frog in a pot where you raise the temperature one degree a day. The frog doesn’t hop out because the heat goes up gradually, but before long it’s too late. The frog has been cooked.

Even if the plunge protection team succeeds, and the frog cooks slowly, there will be a massive de facto default on dollar-denominated debt and a long-term degradation of the U.S. standard of living. The inside word is that we are likely to see major monetary shocks and a possible stock market crash as early as December 2007.

The worst off will be people locked into retirement funds which have a heavy load of mortgage-related securities. Entire investment portfolios are likely to disappear overnight.

The banks, along with the bank-leveraged equity and hedge funds, are preparing for the biggest fire sale in at least a generation. Insiders are going liquid to get ready. If you think Enron was “the bomb,” you won’t want to miss this one.


There are so many flaws in the system that it’s time for real change.

As I have been pointing out in articles over the last several months, the key to a rational solution would be immediate monetary reform leading to a fundamental shift in how the world conducts its financial business. It would mean taking control of the world’s economy out of the hands of the private bankers and giving it back to democratically elected governments.

I spent twenty-one years working for the U.S. Treasury Department and studying U.S. monetary history. For much of our history we were a laboratory for diverse monetary systems.

During and after the Civil War (1861-5) we had five different sources of money that fueled our economy. One was the Greenbacks, an extremely successful currency which the government spent directly into circulation. Contrary to financiers’ propaganda, the Greenbacks were not inflationary.

Another was gold and silver coinage and specie-backed Treasury paper currency. The third was notes lent into circulation by the national banks. The fourth was retained earnings—individual savings and business reinvestment of profits—which was the primary source of capital for industry. The fifth was the stock and bond markets.

After the Federal Reserve Act was passed by Congress in 1913, the banks and the government inflated the currency through war debt and destroyed most of the value of the Greenbacks and coinage. The banks never entirely displaced the capital markets but eventually took them over during the present-day era of leveraged mergers, acquisitions, and buyouts, while the Federal Reserve created and deflated asset bubbles.

The banking system which rules the economy through the Federal Reserve System has produced the crushing debt pyramid of today. The system is a travesty. Banks, which can be useful in facilitating commerce, should never have this much power. Many intelligent people have called for the Federal Reserve to be abolished, including former chairmen of the House banking committee Wright Patman and Henry Gonzalez and current Republican presidential candidate Ron Paul.

Some might call such a program a revolution. I prefer to call it a restoration—of national sovereignty. Central to the program would be the elimination of the Federal Reserve as a bank of issue and restoration of money-creation to the people’s representatives in Congress. This is what our Constitution says too. It’s the system we had before 1913.


The fundamental objectives of monetary policy should be to secure a healthy producing economy and provide for sufficient individual income. The objectives should not be to produce massive profits for the banks, fodder for Wall Street swindles, and a blank check for out-of-control government expenditures.

Note I referred to income. I did not say “create jobs.” That is the Keynesian answer, because Keynes was a collectivist, and the main thing collectivists like to come up with is to give everyone more work to do, even if it’s just grabbing a shovel and digging ditches like they did with the WPA during the Depression.

It’s what President Clinton did with his welfare-to-work program that threw hundreds of thousands of mothers off the welfare rolls and into a job market where sufficient work at a living wage did not exist. It’s another reason the government is constantly borrowing more money to fuel the military-industrial complex by creating more military, bureaucratic, and contractor jobs.

Back to income. The idea of “income,” as opposed to “jobs,” is a civilized and humane idea. When are we going to realize that everyone doesn’t need a paying job in order for an industrial economy to provide all with a decent living? When are we going to realize that the productivity of the modern economy is part of the heritage of all of us, part of the social commons?

Why can’t mothers have the choice of staying home with the kids like they could a generation ago? Why can’t some people choose to do eldercare? Why can’t others comfortably go into lower-paying occupations like teaching or the arts? Why can’t some just opt to study or travel for a while or learn new skills or start a business without facing financial ruin as they often must today? Why can’t retirees enjoy their retirement instead of having to stay in the job market or worrying about Social Security going broke?

The U.S. and world economies are on the brink of collapse due to the lunacy of the financial system, not because we can’t produce enough.

Contrary to so many doomsayers, the mature world economy is capable of providing a decent living for everyone on the planet. It cannot because the monetary equivalent of its bounty is skimmed by interest-bearing debt.

These are things that monetary reformers have known about for decades. The first steps within the U.S. would be:

1) a large-scale cancellation of debt;
2) a guaranteed income for all at about $10,000 a year, not connected to whether a person has a job;
3) an additional National Dividend, fluctuating with national productivity, that would provide every citizen with their rightful share in the benefits of our incredible producing economy;
4) direct spending of money by the government for infrastructure and other necessary costs without resort to taxation or borrowing;
5) creation of a new system of private lending to businesses and consumers at non-usurious rates of interest;
6) re-regulation of the financial industry, including the banning of bank-created credit for speculation, such as purchase of securities on margin and for leveraging buyouts, acquisitions, mergers, hedge funds, and derivatives; and
7) abolition of the Federal Reserve as a bank of issue, with retention of its functions as a national financial transaction clearinghouse.

While these proposals are basically simple, the overall program is so different from what we have today with our financier-controlled system that it takes careful reading and a great deal of thought to understand exactly how it would work. One way to approach it is to look at the likely effects.

These measures would immediately shift the basis of our economy from borrowing from the banks to a mixed system that would include the direct creation of credit at the public and grassroots level. The size of government would shrink, our producing economy would be reborn, debt would come down, economic democracy would become a reality, and the financial industry could be right-sized. Finally, the international situation could be stabilized because we would no longer be driven to a constant state of warfare to seize other nations’ resources as with Iraq and to prop up the dollar as a reserve currency abroad.

Such a system would work by creating indigenous sources of credit needed to mobilize the natural wealth and productivity of the nation. There are people who could implement this program. Systems to do so could be installed within the U.S. Treasury and the Federal Reserve within a matter of months.

Fundamental monetary reform implemented to restore economic democracy is what America’s real task should be for the twenty-first century. One thing is for certain. The out-of-control financial system that has wrecked the U.S. and world economies over the last generation cannot be allowed to continue.

How this crisis plays out may well depend on whether there is a Jefferson, Lincoln, or Roosevelt waiting in the wings. The success of each of these great leaders was due to one critical factor: his ability to implement monetary reform at a time of national emergency.

Richard C. Cook is the author of “We Hold These Truths: The Hope of Monetary Reform,” scheduled to appear by September 1, 2007. A retired federal analyst, he served with the U.S. Civil Service Commission, the Food and Drug Administration, the Carter White House, and NASA, followed by twenty-one years with the U.S. Treasury Department. His articles on monetary reform, economics, and space policy have appeared on Global Research, Economy in Crisis, Dissident Voice, Arizona Free Press, Atlantic Free Press, and elsewhere. He is also the author of “Challenger Revealed: An Insider’s Account of How the Reagan Administration Caused the Greatest Tragedy of the Space Age.” He appears frequently on internet radio on Saturday mornings at 11 a.m. Eastern.

Richard C. Cook is a frequent contributor to Global Research.



© Copyright Richard C. Cook, Global Research, 2007


George Bush was behind September 11 attacks, says French politician

From correspondents in Paris

July 07, 2007 09:43pm

A SENIOR French politician, now a minister in President Nicolas Sarkozy’s government, suggested last year that US President George W. Bush might have been behind the September 11, 2001 attacks, according to a website.
The website, ReOpen911, which promotes September 11 conspiracy theories, has posted a video clip of French Housing Minister Christine Boutin appearing to question whether Osama bin Laden’s al-Qaeda group orchestrated the attacks.

Asked in an interview last November, before she became a minister, whether she thought Mr Bush might be behind the attacks, Ms Boutin says: “I think it is possible. I think it is possible.”

Ms Boutin backs her assertion by pointing to the large number of people who visit websites that challenge the official line over the September 11 strikes against US cities.

“I know that the websites that speak of this problem are websites that have the highest number of visits … And I tell myself that this expression of the masses and of the people cannot be without any truth.”

Ms Boutin’s office sought to play down the remarks, saying that later in the same interview she says: “I’m not telling you that I adhere to that position.”

This comment does not appear on the video clip on ReOpen911.

Numerous other websites have also posted the clip in recent days and the story has started to seep into the mainstream media.

“Christine Boutin snared by her controversial suggestions about September 11,” Le Monde newspaper said in a headline.

Liberation newspaper quoted Ms Boutin’s spokesman Christian Dupont as saying that she had not wanted to appear pro or anti-Bush at a time when Mr Sarkozy was being branded a “US poodle” after meeting the President in Washington.

“And then she is not the foreign minister,” Mr Dupont added.

France appears to be particularly fertile ground for conspiracy theories.

In 2002, a book that claimed that no airliner hit the US Pentagon in the September 11 attacks topped the French bestseller lists.

However, the French are not alone in their scepticism.

According to a Scripps Howard/Ohio University poll carried out last July, more than one-third of Americans suspect US officials helped in the September 11 attacks or took no action to stop them so the US could later go to war.

The US State Department has rejected these accusations.

Almost 3,000 people died when hijacked planes crashed in New York, Washington, D.C., and Pennsylvania.

h/t: ICH

FAIR USE NOTICE: This blog may contain copyrighted material. Such material is made available for educational purposes, to advance understanding of human rights, democracy, scientific, moral, ethical, and social justice issues, etc. This constitutes a ‘fair use’ of any such copyrighted material as provided for in Title 17 U.S.C. section 107 of the US Copyright Law. This material is distributed without profit.

Uprooted – A documentary film by Donia Mili

Dandelion Salad



“I should of, I could of made a shorter film, but I was angry, very angry. I felt that we are so bombarded by this propaganda about terrorists and what terrorism is, that I wanted to take as much space as I could about RESISTANCE, and the right of the Palestinians to RESIST.” — Donia Mili, Director and Producer of Uprooted.


Dr. Dahlia Wasfi – Life in Iraq Under U.S. Occupation (video) + A Short History of the Republic of Iraq By Dr. Sadiq H. Wasfi and Dahlia Wasfi

Shortages; lack of electricity and potable water; tanks rolling through the streets night and day; gunfire and explosions. Iraqi health care is in shambles. Two hundred bodies turn up daily in the Baghdad morgue. For Iraqis, it’s 9/11 every day.

h/t: ICH

A Short History of the Republic of Iraq

Originally posted 2/22/07 at Information Clearing House Blog

By Dr. Sadiq H. Wasfi and Dahlia Wasfi

“For the poor throughout history who have suffered violence, death, hunger, sickness, and indignity at the hands of powerful oppressors who would not respect their humanity, and especially for the Iraqi, Arab, and other victims of the fire this time—with a call for action to end the scourge of war, economic exploitation, and poverty.”
—Ramsey Clark

Dedication for The Fire This Time: U.S. War Crimes in the Gulf. Thunder’s Mouth Press, New York 1992.

The region of Mesopotamia—modern-day Iraq—has been a magnet for greedy conquerors for thousands of years. Time and again, thieving invaders coveting her rich natural resources and advanced society have pillaged “the land between two rivers.” Time and again, the invaders were defeated and expelled. The Americans, led by “leaders” composed of rich oil barons and neo-conservative Zionists, are the latest in a long line of imperialists. If they had only read history, however, they could have predicted that they, too, would suffer great losses from an unwavering resistance. They could have predicted that they, too, will have to leave. In the words of Yogi Berra, they “made the wrong mistake.” But while imperialism in Western Asia is nothing new, our weaponry and war crimes may put Americans in the history books as Iraq’s most barbaric invaders ever.

Throughout the last hundred years, Iraqis have experienced much social, political, and economic turmoil. The British occupied the region in the early 20th century and installed a puppet monarchy to serve the interests of the Empire. King Faisal, with pro-British agents Nuri Al-Said and Saleh Jabr, traded Iraq’s wealth and their own dignity for power and greed.[i] Many Iraqis identified them as traitors, as evidenced by demonstration slogans: “Nuri Al-Said, Al-Qundara; Saleh Jabr, Al-Qeetanheh” (“Al-Said is the shoe, Jabr is its shoelace”). On July 14th, 1958 (marked as Iraq’s first Independence Day), a coup led by General Abdul Karim Qasim—Iraq’s self-proclaimed “only leader”—resulted in the dissolution of the “Kingdom” of Iraq. The royal family was assassinated, and the “Republic of Iraq” was born—a development that took Western powers by surprise.

Iraqis had suffered greatly under the exploitation of the British and their Indian and Nepalese “Gurkha” militias. With the end of the Kingdom, they hoped conditions would improve, and for a brief time, they did. General Qasim, influenced by his allies in the Iraqi Communist Party, introduced sweeping socio-economic reforms to distribute the nation’s wealth more equitably. Also, in 1959, he withdrew Iraq from the U.S.-orchestrated Baghdad Pact, established to confront the Soviet Union. However, pan-Arab nationalists (including members of the Ba’ath Socialist Party) led by Qasim’s second-in-command, Abdul Salam Arif, wanted Iraq to move in a different direction. At the time, they sought to join Iraq with Egypt and Syria in the United Arab Republic (UAR). In October 1959, a group of militant Ba’athists—including one young Iraqi from the town of Tikrit named Saddam Hussein—made a failed attempt on Qasim’s life.[ii]

Qasim’s rule lasted only until 1963, when he was assassinated by his former deputy, Abdul Salam Arif. According to the late King Hussein of Jordan, the American CIA backed this coup and fed the rebels information on leftist intellectuals and communists, who were summarily executed by the rising Ba’athists.[iii] Arif’s rule, however, was also short-lived; he died in a plane crash in 1966 while visiting Basra. His brother, Abdul Rahman Arif, succeeded him, but yet another coup was simmering.

In 1968, former General Ahmed Hassan al-Bakr and Saddam Hussein—leaders of Iraq’s Ba’ath Party—toppled Arif to become President and Vice-President, respectively. The Ba’athists had gained popularity among Iraqis in the 1950s for their strong opposition to the British imperialists and their agents. In the early hours of July 17—Iraq’s second Independence Day—General Al-Bakr drove up to the barracks of the Republican Guard outside Abdul Rahman Arif’s palace. He was followed by a convoy of armed Ba’athists, including Saddam Hussein and his half-brother Barzan. Promised his life, Arif quickly surrendered, and the “White” Revolution was complete. Bloodshed would soon follow, however, as Al-Bakr and Hussein consolidated their power over the Republic. Any threat to their rule—from collaborators with imperial Europe to members of opposing political parties to challengers within their own party—was quickly eliminated.

While al-Bakr was the nominal leader, Hussein ran the Party, the National Guard, and the state’s security apparatus. He played major roles in establishing treaties with Kurdish parties in northern Iraq and in nationalizing Iraq’s oil (1972-1975),[iv] so that the profits went to Iraq instead of foreign companies. Though the regime remained politically oppressive until its fall in 2003, Iraqi society and the economy flourished once autonomy over oil sales was reclaimed from foreign hands. By the late 1970s, the value of the Iraqi dinar was equivalent to over three American dollars. Education and literacy rates were on the rise, and the healthcare system was considered the “jewel of the Arab World.” (Thirty years later, however, gas-guzzling Americans and British shocked and awed their way back into the driver’s seat, destroying Iraq’s infrastructure and civil society.)

Saddam Hussein remained the behind-the-scenes de facto ruler until July 16, 1979, when al-Bakr retired and gave the presidential seat to his second-in-command. Soon thereafter, the lives of Iraqis would return to suffering. In 1980, tensions were rising between Iraq and Iran. Although the 1975 Algiers Accord between the two nations established a diplomatic resolution to their long-standing territorial disputes, neither country abided by its terms. In addition, Iran’s new leader, Ayatollah Ruhollah Khomeini, sought to extend theocratic rule to secular Iraq and the Gulf monarchies. As tensions rose, each side accused the other of encouraging internal uprisings. In September 1980, with the tacit approval of the United States, Saudi Arabia, and the rest of the Gulf States, whose leaders feared Khomeini’s aspirations of regional hegemony, Hussein launched a massive military assault. The Iran-Iraq War officially began.

The United States appeared to side with Iraq, supplying the Iraqi army with conventional, biological, and chemical weapons as well as intelligence. In secret, however, the Reagan Administration also sold arms to Iran without Congressional approval and used the funds to support the U.S.-backed Contra insurgency against the Sandinista government in Nicaragua. This criminal activity is remembered as the Iran-Contra scandal, for which no one has been brought to justice. In fact, many of its culprits remain top U.S. government officials, including members of the recent Iraq Study Group, such as the new Secretary of Defense, Robert Gates. Before a cease-fire was declared between Iraq and Iran on August 20, 1988, over a million lives were lost.

In 1988, Saddam Hussein’s regime was developing a $40 billion plan for post-war reconstruction. Kuwait, however, was selling oil in quantities exceeding those mandated by OPEC (Organization of Petroleum Exporting Countries) agreements. This inexplicable violation of OPEC-established limits dropped the price of oil per barrel, which drastically reduced Iraq’s export earnings. Kuwait’s Al-Sabah monarchy was also demanding repayment of the $10 billion in financial support given to Iraq to defend against the Ayatollah’s regime that would have deposed them. Furthermore, the Kuwaitis were stealing oil from the Rumaila field in southern Iraq by “slant-drilling” under the border. As diplomatic efforts to resolve these and the long-standing border disputes between the two countries were dismissed by the Al-Sabah sheikhs, Saddam Hussein threatened military action to end Kuwait’s intransigence. On August 2, 1990—once again with tacit approval from the United States—Iraqi troops crossed the border of one of Iraq’s neighbors.

Since then, punishment for the Iraqi government’s actions has been carried out against the country’s civilian population, in violation of the basic standards of international law. U.N. sanctions imposed on Iraq’s economy on August 6, 1990, lasted until the fall of Saddam Hussein’s regime after the American invasion of 2003. It is estimated that between 1.2 and 1.8 million Iraqis died during those years due to starvation, the destruction of electrical and sewage treatment plants during the Gulf War, daily bombing by US and British warplanes, and the denial of basic medical supplies and equipment. But today, Iraqis see those as their “better days”—a testimonial to the grave human suffering under the brutal, illegal American-British occupation.

Every Iraqi knows of Al-Hajaj bin Yusef Al-Thaqafi, a ruler from over a thousand years ago reputed to have led the most brutal and repressive regime in Iraq’s long history. The name “Al-Hajaj” is invoked in casual conversation to describe times when conditions have hit rock bottom. Ironically, Saddam Hussein admired the rule of that brutal leader. Some Iraqis thought the now executed president would replace Al-Hajaj in the colloquial lexicon—that is, until the 2003 arrival of “Al-Dijaj” (in Arabic, “the chicken”): George W. Bush and his chicken-hawk administration. Those rich politicians in Washington, who send poor Americans to kill and be killed in the name of stealing control of Iraq’s oil, evaded their own opportunities for military service. And just as the British used King Faisal as a puppet ruler in the 1920s, the Americans have their own stooges today, including Ahmed Chalabi, Iyad Allawi, Ibrahim Jafaari, and Nuri Al-Maliki. Al-Dijaj and the rest of the chicken coop know nothing of combat, poverty, or human suffering, either at home or abroad.

Since 1958, Iraqis have hoped that each regime change might bring a better life. Have they hit rock bottom now?


[i] Ali, Tariq. Bush in Babylon: The Recolonisation of Iraq. Chapter 3. Verso, London, 2003.
[ii] Ali, Tariq. Bush in Babylon: The Recolonisation of Iraq. Chapter 4. Verso, London, 2003.
[iii] Ibid.




Parasitic Imperialism By Ismael Hossein-zadeh

How recent U.S. wars of choice, driven largely by war profiteering, are plundering not only defenseless peoples and their resources abroad, but also the overwhelming majority of U.S. citizens and their resources at home.

By Ismael Hossein-zadeh

07/07/07 “ICH”

Although immoral, the external military operations of past empires often proved profitable, and therefore justifiable on economic grounds. Military actions abroad usually brought economic benefits not only to the imperial ruling classes, but also (through “trickle-down” effects) to their citizens. Thus, for example, imperialism paid significant dividends to Britain, France, the Netherlands, and other European powers of the seventeenth, eighteenth, nineteenth, and early twentieth centuries. As imperial economic gains helped develop their economies, they also helped improve the living conditions of their working people and raise the standard of living of their citizens.

This pattern of economic gains flowing from imperial military operations, however, seems to have changed in the context of the recent U.S. imperial wars of choice, especially in the post-Cold War period. Morality aside, recent U.S. military expeditions and operations are not justifiable even on economic grounds. Indeed, escalating U.S. military expansion and aggression have become ever more wasteful, cost-inefficient, and burdensome to the overwhelming majority of U.S. citizens.

Recent imperial policies of the United States can therefore be called parasitic imperialism, because such policies of aggression are prompted not so much by a desire to expand the empire’s wealth beyond existing levels, as was the case with the imperial powers of the past, but by a desire to appropriate the lion’s share of the existing wealth and treasure for the military establishment, especially for the war-profiteering Pentagon contractors. It can also be called dual imperialism, because it exploits not only the conquered and the occupied abroad but also the overwhelming majority of U.S. citizens and their resources at home.

Since imperial policies abroad are widely discussed by others, I will focus here on parasitic military imperialism at home, that is, on what might be called domestic or internal imperialism. Specifically, I will argue that parasitic imperialism (1) redistributes national income or resources in favor of the wealthy; (2) undermines the formation of public capital (both physical and human); (3) weakens national defenses against natural disasters; (4) accumulates national debt and threatens economic/financial stability; (5) spoils external or foreign markets for non-military U.S. transnational capital; (6) undermines civil liberties and democratic values; and (7) fosters a dependence on, or addiction to, military spending and therefore leads to a spiraling vicious circle of war and militarism. (The terms domestic imperialism, internal imperialism, parasitic imperialism, and military imperialism are used interchangeably in this article.)

1. Parasitic Imperialism Redistributes National Income from the Bottom to the Top

Even without the costs of the wars in Iraq and Afghanistan, which are fast surpassing half a trillion dollars, U.S. military spending is now the largest item in the Federal budget. President Bush’s proposed increase of 10% for next year will raise the Pentagon budget to over half a trillion dollars for fiscal year 2008. A proposed supplemental appropriation to pay for the wars in Afghanistan and Iraq “brings proposed military spending for FY 2008 to $647.2 billion, the highest level of military spending since the end of World War II—higher than Vietnam, higher than Korea, higher than the peak of the Reagan buildup.”[1]

The skyrocketing Pentagon budget has been a boon for its contractors. This is clearly reflected in the continuing rise of the value of the contractors’ shares in the stock market: “Shares of U.S. defense companies, which have nearly trebled since the beginning of the occupation of Iraq, show no signs of slowing down. . . . The feeling that makers of ships, planes and weapons are just getting into their stride has driven shares of leading Pentagon contractors Lockheed Martin Corp., Northrop Grumman Corp., and General Dynamics Corp. to all-time highs.”[2]

But while the Pentagon contractors and other beneficiaries of war dividends are showered with public money, low- and middle-income Americans are squeezed out of economic or subsistence resources in order to make up for the resulting budgetary shortfalls. For example, as the official Pentagon budget for 2008 fiscal year is projected to rise by more than 10 percent, or nearly $50 billion, “a total of 141 government programs will be eliminated or sharply reduced” to pay for the increase. These would include cuts in housing assistance for low-income seniors by 25 percent, home heating/energy assistance to low-income people by 18 percent, funding for community development grants by 12.7 percent, and grants for education and employment training by 8 percent.[3]

Combined with redistributive militarism and generous tax cuts for the wealthy, these cuts have further exacerbated the ominously growing income inequality that started under President Reagan. Ever since Reagan arrived in the White House in 1981, opponents of non-military public spending have been using an insidious strategy to cut social spending, to reverse the New Deal and other social safety net programs, and to redistribute national/public resources in favor of the wealthy. That cynical strategy consists of drastic increases in military spending coupled with equally drastic tax cuts for the wealthy. As this combination creates large budget deficits, it then forces cuts in non-military public spending (along with borrowing) to fill the gaps thus created.

For example, at the same time that President Bush is planning to raise military spending by $50 billion for the next fiscal year, he is also proposing to make his affluent-targeted tax cuts permanent at a cost of $1.6 trillion over 10 years, or an average yearly cut of $160 billion. Simultaneously, “funding for domestic discretionary programs would be cut a total of $114 billion” in order to pay for these handouts to the rich. The projected cuts include over 140 programs that provide support for the basic needs of low- and middle-income families such as elementary and secondary education, job training, environmental protection, veterans’ health care, medical research, Meals on Wheels, child care and HeadStart, low-income home energy assistance, and many more.[4]

According to the Urban Institute–Brookings Institution Tax Policy Center, “if the President’s tax cuts are made permanent, households in the top 1 percent of the population (currently those with incomes over $400,000) will receive tax cuts averaging $67,000 a year by 2012. . . . The tax cuts for those with incomes of over $1 million a year would average $162,000 a year by 2012.”[5]

Official macroeconomic figures show that, over the past five decades or so, government spending (at the federal, state and local levels) as a percentage of gross national product (GNP) has remained fairly steady—at about 20 percent. Given this nearly constant share of the public sector of national output/income, it is not surprising that increases in military spending have almost always been accompanied or followed by compensating decreases in non-military public spending, and vice versa.

For example, when, by virtue of FDR’s New Deal reforms and LBJ’s metaphorical War on Poverty, the share of non-military government spending rose significantly, the share of military spending declined accordingly. From the mid-1950s to the mid-1970s, non-military government spending as a share of GNP rose from 9.2 to 14.3 percent, an increase of 5.1 percentage points. Over the same period, military spending as a share of GNP declined from 10.1 to 5.8 percent, a decline of 4.3 percentage points.[6]

That trend was reversed when President Reagan took office in 1981. In the early 1980s, as President Reagan drastically increased military spending, he just as drastically lowered tax rates on higher incomes. The resulting large budget deficits were then paid for by more than a decade of steady cuts in non-military spending.

Likewise, the administration of President George W. Bush has been pursuing a similarly sinister fiscal policy of cutting non-military public spending in order to pay for the skyrocketing military spending and the generous tax cuts for the affluent.

Interestingly (though not surprisingly), changes in income inequality have mirrored changes in government spending priorities, as reflected in the fiscal policies of different administrations. Thus, for example, when from the mid-1950s to the mid-1970s the share of non-military public spending rose relative to that of military spending, income inequality declined accordingly.

But as President Reagan reversed that fiscal policy by raising the share of military spending relative to non-military public spending and cutting taxes for the wealthy, income inequality rose considerably. As Reagan’s twin policies of drastic increases in military spending and equally sweeping tax cuts for the rich were somewhat tempered in the 1990s, growth in income inequality slowed accordingly. In the 2000s, however, the ominous trends set in motion by President Reagan have been picked up by President George W. Bush: increasing military spending, decreasing taxes for the rich, and thereby exacerbating income inequality.

The following statistics show how redistributive militarism and supply-side fiscal policies have exacerbated income inequality since the late 1970s and early 1980s, making after-tax income gaps wider than pre-tax ones. According to recently released data from the Congressional Budget Office (CBO), since 1979 income gains among high-income households have dwarfed those of middle- and low-income households. Specifically:

  • The average after-tax income of the top one percent of the population nearly tripled, rising from $314,000 to nearly $868,000—for a total increase of $554,000, or 176 percent. (Figures are adjusted by CBO for inflation.)

  • By contrast, the average after-tax income of the middle fifth of the population rose a relatively modest 21 percent, or $8,500, reaching $48,400 in 2004.

  • The average after-tax income of the poorest fifth of the population rose just 6 percent, or $800, during this period, reaching $14,700 in 2004.[7]


Legislation enacted since 2001 has provided taxpayers with about $1 trillion in tax cuts over the past six years. These large tax reductions have made the distribution of after-tax income more unequal by further concentrating income at the top of the income range. According to the Urban Institute–Brookings Institution Tax Policy Center, as a result of the tax cuts enacted since 2001, in 2006 households in the bottom fifth of the income spectrum received tax cuts averaging only $20; households in the middle fifth of the income range received tax cuts averaging $740; households in the top one percent received tax cuts averaging $44,200; and households with incomes exceeding $1 million received an average tax cut of $118,000.[8]

2. Parasitic Imperialism Undermines Public Capital—both Physical and Human

Beyond the issue of class and inequality, the allocation of a disproportionately large share of public resources to the beneficiaries of war and militarism is also steadily undermining the critical national objective of building and maintaining public capital. This includes both physical capital or infrastructure (such as roads, bridges, mass transit, dams, levees, and the like) and human capital such as health, education, nutrition, and so on. If not reversed, this ominous trend is bound to stunt long-term productivity growth and socio-economic development. A top-heavy military establishment is unviable in the long run, as it tends to undermine the economic base it is supposed to nurture.

In March 2001, the American Society of Civil Engineers (ASCE) issued a “Report Card for America’s Infrastructure,” grading 12 infrastructure categories at a disappointing D+ overall, and estimating the need for a $1.3 trillion investment to bring conditions to acceptable levels. In September 2003, ASCE released a Progress Report that examined trends and assessed the progress and decline of the nation’s infrastructure. The Progress Report, prepared by a panel of 20 eminent civil engineers with expertise in a range of practice specialties, examined 12 major categories of infrastructure. The report concluded: “The condition of our nation’s roads, bridges, drinking water systems and other public works have shown little improvement since they were graded an overall D+ in 2001, with some areas sliding toward failing grade.”[9]

Neoliberal proponents of laissez-faire economics tend to view government spending on public capital as a burden on the economy. Instead of viewing public-sector spending on infrastructure as a long-term investment that helps sustain and promote economic vitality, they treat it as overhead. By focusing on short-term balance sheets, they lose sight of the indirect, long-term returns on the tax dollars invested in the public capital stock. Yet evidence shows that neglect of public capital formation can undermine the long-term health of an economy in terms of productivity enhancement and sustained growth.

Continued increase in military spending at the expense of non-military public spending has undermined more than physical infrastructure. Perhaps more importantly, it has also undercut public investment in human capital or social infrastructure such as health care, education, nutrition, housing, and the like—investment that would help improve quality of life, human creativity and labor productivity, thereby also helping to bring about long-term socioeconomic vitality. Investment in human capital—anything that improves human capacity and/or labor productivity—is a major source of social health and economic vitality over time.

Sadly, however, public investment in such vitally important areas has been gradually curtailed in favor of steadily rising military spending ever since Ronald Reagan’s election to the White House in 1980. Evidence of this regrettable trend is overwhelming. To cite merely a few examples: “The war priorities have depleted medical and education staffs. . . . Shortages of housing have caused a swelling of the homeless population in every major city. State and city governments across the country have become trained to bend to the needs of the military—giving automatic approvals to its spending without limit. The same officials cannot find money for affordable housing.”[10]

The New York Times columnist Bob Herbert recently reported that some 5.5 million young Americans, age 16 to 24, were undereducated, disconnected from society’s mainstream, jobless, restless, unhappy, frustrated, angry and sad. Commenting on this report, Professor Seymour Melman of Columbia University wrote: “This population, 5.5 million and growing, is the product of America’s national politics that has stripped away as too costly the very things that might rescue this abandoned generation and train it for productive work. But that sort of thing is now treated as too costly. So this abandoned generation is now left to perform as fodder for well-budgeted police SWAT teams.”[11]

3. Parasitic Imperialism Undermines National Defense Capabilities against Natural Disasters—the Case of Hurricane Katrina

Neglect of public physical capital, or infrastructure, can prove very costly in terms of vulnerability to natural disasters. This was tragically demonstrated, among many other instances, by the destruction wrought by Hurricane Katrina. In light of the steady cuts in infrastructure funding for the city of New Orleans, the catastrophic consequences of a hurricane of Katrina’s magnitude were both predictable and, indeed, predicted.

Engineering and meteorological experts had frequently warned of impending disasters such as Katrina. Government policy makers in charge of maintaining public infrastructure, however, remained indifferent to those warnings. They seem to have had other priorities and responsibilities: cutting funds from public works projects and social spending and giving them away to the wealthy supporters who had paid for their elections. It is not surprising, then, that many observers and experts have argued that Katrina was as much a policy disaster as it was a natural disaster.

The New Orleans project manager for the Army Corps of Engineers, Alfred Naomi, had warned for years of the need to shore up the levees, but corporate representatives in the White House and the Congress kept cutting back on the funding. Naomi wasn’t the only one who had warned of the impending disaster.

In 2001, the Federal Emergency Management Agency (FEMA) “ranked the potential damage to New Orleans as among the three likeliest, most catastrophic disasters facing the country,” wrote Eric Berger in a prescient article in the Houston Chronicle of December 1, 2001. In that piece, Berger warned: “The city’s less-than-adequate evacuation routes would strand 250,000 people or more, and probably kill one of ten left behind as the city drowned under twenty feet of water. Thousands of refugees could land in Houston.”[12]

In June 2003, Civil Engineering Magazine ran a long story by Greg Brouwer entitled “The Creeping Storm.” It noted that the levees “were designed to withstand only forces associated with a fast-moving” Category 3 hurricane. “If a lingering Category 3 storm—or a stronger storm, say, Category 4 or 5—were to hit the city, much of New Orleans could find itself under more than twenty feet of water.”[13]

On October 11, 2004, The Philadelphia Inquirer ran a story by Paul Nussbaum, entitled “Direct Hurricane Hit Could Drown City of New Orleans, Experts Say.” It warned that “more than 25,000 people could die, emergency officials predict. That would make it the deadliest disaster in U.S. history.” The story quoted Terry C. Tuller, city director of emergency preparedness: “It’s only a matter of time. The thing that keeps me awake at night is the 100,000 people who couldn’t leave.”

But government representatives of big business in the White House and the Congress were not moved by these alarm bells; the warnings did not deter them from further cutting non-military public spending in order to pay for the escalating military spending and the generous tax cuts for the wealthy.

Some disasters cannot be prevented from occurring. But, with proper defenses, they can be contained and their disastrous consequences minimized. Katrina’s were not, “because of a laissez-faire government that failed to bother to take warnings seriously,” and because of a skewed government fiscal policy “that is stingy when it comes to spending on public goods but lavish on armaments and war.”[14]

4. Parasitic Militarism Costs External Markets to Non-military Transnational Capital

U.S. military buildup and its unilateral transgressions abroad have increasingly become economic burdens not only because they devour a disproportionately large share of national resources, but also because such adventurous operations tend to create instability in international markets, subvert long-term global investment, and increase energy or fuel costs. Furthermore, the resentment and hostilities that unprovoked aggressions generate in foreign lands are bound to create backlash at the consumer level.

For example, a Business Week report pointed out in the immediate aftermath of the U.S. invasion of Iraq that in the Muslim world, Europe, and elsewhere “there have been calls for boycotts of American brands as well as demonstrations at symbols of U.S. business, such as McDonald’s corporation” (Business Week, 14 April 2003, p. 32).

A leading Middle East business journal, AME Info, reported in its April 8, 2004 issue that “In 2002, a cluster of Arab organizations asked Muslims to shun goods from America, seen as an enemy of Islam and a supporter of Israel. In Bahrain, the Al-Montazah supermarket chain, for example, boosted sales by pulling about 1,000 US products off its shelves, and other grocers followed suit.” The report further pointed out that “Coca-Cola and Pepsi, sometimes considered unflattering shorthand for the United States, took the brunt of the blow. Coca-Cola admitted that the boycott trimmed some $40 million off profits in the [Persian] Gulf in 2002.”[15]

The report also indicated that in recent years a number of “Muslim colas” have appeared in the Middle Eastern/Muslim markets. “Don’t Drink Stupid, Drink Committed, read the labels of Mecca Cola, from France. . . . Iran’s Zam Zam Cola, originally concocted for Arab markets, has spread to countries including France and the United States.” In addition, the report noted that “US exports to the Middle East dropped $31 billion from 1998-2002. Branded, value-added goods—all the stuff easily recognized as American—were hit the hardest.” Quoting Grant Smith, director of IRmep, a leading Washington-based think tank on Middle Eastern affairs, the report concluded: “Our piece of the pie is shrinking, and it’s because of our degraded image.”[16]

Evidence shows that foreign policy-induced losses of U.S. market share in global markets go beyond the Middle East and the Muslim world. According to a December 2004 survey of 8,000 international consumers carried out by Global Market Insite (GMI) Inc., one-third of all consumers in Canada, China, France, Germany, Japan, Russia, and the United Kingdom “said that U.S. foreign policy, particularly the ‘war on terror’ and the occupation of Iraq, constituted their strongest impression of the United States. Brands closely identified with the U.S., such as Marlboro cigarettes, America Online (AOL), McDonald’s, American Airlines, and Exxon-Mobil, are particularly at risk.” Twenty percent of respondents in Europe and Canada “said they consciously avoided buying U.S. products as a protest against those policies.” Commenting on the results of the survey, Dr. Mitchell Eggers, GMI’s chief operating officer and chief pollster, pointed out, “Unfortunately, current American foreign policy is viewed by international consumers as a significant negative, when it used to be a positive.”[17]

Kevin Roberts, chief executive of advertising giant Saatchi & Saatchi, likewise expressed concern about global consumer backlash against militaristic U.S. foreign policy when he told the Financial Times that he believed consumers in Europe and Asia are becoming increasingly resistant to having “brand America rammed down their throats.” Similarly, Simon Anholt, author of Brand America, told the British trade magazine Marketing Week that “four more years of Bush’s foreign policy could have grave consequences for U.S. companies’ international market share.”[18]

Writing in the October 27, 2003 issue of the Star Tribune, Ron Bosrock of the Global Institute of St. John’s University likewise expressed anxiety over negative economic consequences that might follow from the Bush administration’s policies of unilateral military operations and economic sanctions.

Concerns of this nature have prompted a broad spectrum of non-military business interests to form coalitions of trade associations designed to lobby foreign policy makers against unilateral U.S. military aggressions abroad. One such anti-militarist alliance of American businesses is USA*ENGAGE, a coalition of nearly 700 small and large businesses, agriculture groups and trade associations working to seek alternatives to the proliferation of unilateral U.S. foreign policy actions and to promote the benefits of U.S. engagement abroad. The coalition’s statement of principles points out that “American values are best advanced by engagement of American business and agriculture in the world, not by ceding markets to foreign competition” through unilateral foreign policies and military aggressions.

Non-military business interests’ anxiety over the Bush administration’s unilateral foreign policy measures is, of course, rooted in the damage those actions have done to their financial balance sheets: “Hundreds of companies blame the Iraq war for poor financial results in 2003, many warning that continued U.S. military involvement there could harm this year’s performance,” pointed out James Cox of USA Today.

In a relatively comprehensive survey of the economic impact of the war, published in the July 14, 2004 issue of the paper, Cox further wrote: “In recent regulatory filings at the Securities and Exchange Commission, airlines, home builders, broadcasters, mortgage providers, mutual funds and others say the war was directly to blame for lower revenue and profits last year.” Many businesses blamed the war and international political turbulence as a ‘risk factor’ that threatened their sales: “The war led to sharp decreases in business and leisure travel, say air carriers, travel services, casino operators, restaurant chains and hotel owners.” The survey covered a number of airlines including Delta Airlines, JetBlue, Northwest Airlines and Alaska Airlines, all of which blamed the war for a drop in air travel. Related industries such as travel agencies, hotels, restaurants, and resort and casino operations all suffered losses accordingly.[19]

Even technology giants such as Cisco, PeopleSoft and Hewlett-Packard that tend to benefit from military spending expressed concerns that “hostilities in Iraq hurt results or could harm performance.” For example, managers at Hewlett-Packard complained that “potential for future attacks, the national and international responses to attacks or perceived threats to national security, and other actual or potential conflicts or wars, including the ongoing military operations in Iraq, have created many economic and political uncertainties that could adversely affect our business, results of operations and stock price in ways that we cannot presently predict.” Other companies that were specifically mentioned in the survey as having complained about the “whiplash from the Iraq conflict” included home builders Hovnanian and Cavalier homes, casino company Mandalay Resort Group, retailer Restoration Hardware, cosmetics giant Estée Lauder, eyewear retailer Cole, Longs Drug Stores, golf club maker Callaway, and H&Q Life Sciences Investors.[20]

5. Parasitic Imperialism Accumulates National Debt, Weakens National Currency, and Undermines Long-Term National Financial/Economic Health

A major source of the financing of the out-of-control military spending has been borrowing—the other source has been cutting non-military public spending. This represents a cynically clever strategy on the part of the powerful interests that benefit from war and militarism: instead of financing their wars of choice by paying taxes proportionate to their income, they give themselves tax cuts, finance their wars through borrowing, and then turn around and lend money (unpaid taxes) to the government and earn interest.

Viewed in this light, the staggering national debt of nearly $9 trillion, which is more than two-thirds of gross national product (GNP), represents a subtle redistribution of national resources from the bottom to the top: it represents unpaid taxes by the wealthy, which have to be financed by cutting non-military public spending—both now and in the future. This means that the wealthy have successfully converted their tax obligations into credit claims, that is, lending instead of paying taxes—which is in essence a disguised form of theft or robbery.

This cynical policy of increasing military spending, cutting taxes for the wealthy and thereby accumulating national debt cannot continue forever, as it might eventually lead to national or Federal insolvency, collapse of the dollar, and paralysis of financial markets—not only in the United States but perhaps also in broader global markets.

Prospects of such developments have led a number of observers to argue that profit-driven military expansion might prove to be the nemesis of U.S. imperialism: escalating, out-of-control militarization tends gradually to drive the once-prosperous U.S. superpower in the direction of a mismanaged and destructive imperial military force whose capricious military adventures will eventually become costly both politically and economically. While the top-heavy imperial military colossus undermines its economic base, it is also bound to create many enemies abroad and much discontent and hostility to the established order at home. Unchecked, a combination of these adverse developments, especially a drained economy and a bankrupt treasury, might eventually lead to the demise of the empire, just as happened to the post-Rubicon Roman Empire.[21]

6. Parasitic Imperialism Undermines Democratic Control and Corrupts the System of Checks and Balances

As noted earlier, powerful beneficiaries of war dividends (the military-industrial complex and affiliated businesses of war) have successfully used war and military spending as a roundabout way to reallocate national resources in their own favor. Appropriation of public finance by these war profiteers has reached a point where more than half of the discretionary Federal budget, or more than one-third of the entire Federal budget, is now earmarked for “national security.”

This perverse allocation of national resources in the name of national security has meant that while the escalation of war and militarism has hollowed out the national treasury (and brought unnecessary death, destruction, and disaster to millions), it has also brought tremendous riches and resources to war profiteers. Concealing this subtle robbery of the national treasury from the American people requires restriction of information, obstruction of transparency, and obfuscation or misrepresentation of national priorities—that is, curtailment of democracy.

Curtailment of democracy, however, is best achieved under conditions of war, which in turn, requires invention of enemies or manufacturing of threats to national security. Therefore, it is not fortuitous that, in the post-Cold War world, U.S. architects of wars of choice have become very resourceful in invoking all kinds of bogeymen (rogue states, global terrorism, axis of evil, radical Islam, and more) that are allegedly threatening “our national interests” in order to justify their plans of increased militarization of U.S. foreign policy. (Under the bipolar world of the Cold War era, “threat of communism” served the purpose of continued increases of the Pentagon budget.)

This means that U.S. wars of choice abroad are prompted largely by metaphorical domestic wars over the allocation of public resources, or tax dollars. From the standpoint of war profiteers, the instigation or engineering of capricious wars for profit achieves two closely linked purposes: on the one hand, such wars help justify the escalation of military spending, which means escalation of the profiteers’ share of the U.S. Treasury; on the other, they help camouflage this cynical robbery of public money by restricting information under the cover of wartime circumstances.

For example, only under conditions of war could the Bush administration display an attitude of cavalier contempt for lawful norms, undermine constitutional balances, corrupt national institutions with nefarious special interests, smear dissent as unpatriotic, suspend traditional legal rights for certain citizens, obstruct the free flow of information, sanction domestic spying without legal warrant, institute military tribunals, and promote torture in defiance of American and international law.

Likewise, only under conditions of war (and the self-fulfilling threats of imminent “terrorist attacks” on the U.S.) could the administration establish and manage a prison system outside the rule of law where torture can be used. With this system of prison camps in Afghanistan, Iraq, Cuba (Guantánamo), and a number of other undisclosed overseas places, where detainees are abused and kept indefinitely without trial and without access to the due process of the law, the United States now has its own gulags. President Bush and his allies in Congress recently announced they would issue no information about the secret CIA “black site” prisons throughout the world, which are used to incarcerate people who have often been seized off the street.[22]

From the vantage point of war profiteering militarists, such prison camps are an essential ingredient for the justification of war: they are portrayed as evidence of the existence of terrorists, of the “enemies of the people,” or of “enemy combatant” without, at the same time, having to show what the alleged evidence really is, or who the alleged “enemy combatants” really are—as would be required in an open court of law. Combined with warrantless wiretapping, electronic surveillance, and various types of illegal searches, this prison system serves yet another objective of the beneficiaries of war dividends: inspiration of fear and cultivation of silence and obedience among citizens, which means subversion of democracy and promotion of authoritarianism.

James Madison warned against such an ominous symbiosis of war and authoritarianism long ago: “Of all the enemies of public liberty, war is perhaps the most to be dreaded, because it comprises and develops the germ of every other.” The Congress of the United States had earlier (in 1784) issued a similar warning against the authoritarian consequences of maintaining a large military establishment in times of peace: “standing armies in time of peace are inconsistent with the principles of republican governments, dangerous to the liberties of a free people, and generally converted into destructive engines for establishing despotism.”[23]

But perhaps the strongest and most well-known warning against the baleful consequences of a large peace-time military establishment came from President Dwight Eisenhower: “The conjunction of an immense military establishment and a huge arms industry is new in the American experience. The total influence—economic, political, and even spiritual—is felt in every city, every state house, and every office of the federal government. . . . In the councils of government, we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex” (Farewell Address, January 17, 1961).

Eisenhower’s warning that “we must guard against the acquisition of unwarranted influence” of the military-industrial complex is more relevant today than when it was issued nearly half a century ago. The steadily rising—and now perhaps monopolizing and overwhelming—power and influence of the Complex over both domestic and foreign policies of the United States is testament to the unfortunate realization of Eisenhower’s nightmare. As Howard Swint, Democratic candidate for Congress in West Virginia, put it: “The seat of power for formulating foreign policy and defense strategy is not in the White House but rather in the Pentagon. While a civilian Commander-in-Chief may tweak policy in four-year increments, it’s obvious that military careerists together with major defense contractors effectively control the Congressional budget process and drive defense appropriations.”[24]

7. Parasitic Imperialism Leads to Dependence on, or Addiction to, War and Militarism

The fact that the Pentagon appropriates and controls more than one-third of the entire Federal budget has allowed it to forge the largest constituency of dependents nationwide. Tens of thousands of businesses, millions of jobs, and thousands of cities and communities have become dependent on military spending. While a handful of major contractors take the lion’s share of military spending, millions more depend on it as the source of their livelihood.

It is not surprising then that not many people are willing to oppose the continuing rise in the Pentagon budget—even if they might philosophically be opposed to militarism and large military spending. Because of the widespread presence of military installations and production sites nationwide, few politicians can afford not to support a continued rise in military spending lest that should hurt their communities or constituencies economically.

This helps explain the vicious and spiraling circle of war, international political convulsions, and military spending: Major Pentagon contractors and other powerful beneficiaries of war dividends are dependent on continued war and militarism in order to maintain and expand hefty profits. This dependence has, in turn, created a secondary (or derived) dependence; it is the dependence of millions of Americans on military spending as the source of their livelihood, which then plays into the hands of war profiteers in their perennial quest for ever newer enemies, newer wars, and bigger appropriations for the Pentagon—hence the addiction to and the vicious circle of war profiteering, international political tension, war, and military spending.

Concluding Remarks—Parasitic Imperialism: A Most Dangerous Type of Imperialism

Dependence on, or addiction to, war and militarism for profitability makes U.S. military imperialism (that is, imperialism driven by military capital, or arms conglomerates, as distinct from non-military transnational capital) a most dangerous kind of imperialism. Under the rule of past imperial powers, conquered and subjugated peoples or nations could live in peace—an imposed peace, to be sure—provided they respected the interests and needs of those imperial powers and simply resigned themselves to their political and economic ambitions.

Not so in the case of the U.S. military-industrial empire: the interests of this empire are nurtured through “war dividends.” Peace, imposed or otherwise, is viewed by the beneficiaries of war dividends as inimical to their interests, since it would make it difficult to justify continued increases in their share of national resources (in the form of Pentagon appropriations).

Of course, tendencies to build bureaucratic empires have always existed in the ranks of military hierarchies. By itself, this is not what makes the U.S. military-industrial complex more dangerous than the military powers of the past. What makes it more dangerous is the “industrial,” or business, part of the Complex. In contrast to the United States’ war industries, the arms industries of past empires were not subject to capitalist market imperatives. Furthermore, those industries were often owned and operated by imperial governments, not by market-driven giant corporations. Consequently, as a rule, their arms production was dictated by war requirements, not by the market and profit imperatives that drive today’s U.S. armaments industry.

Ismael Hossein-zadeh is an economics professor at Drake University, Des Moines, Iowa. This article draws upon his recently published book, The Political Economy of U.S. Militarism (Palgrave-Macmillan Publishers).


[1] William D. Hartung, “Bush Military Budget Highest Since WW II,” Common Dreams (10 February 2007).

[2] Bill Rigby, “Defense stocks may jump higher with big profits,” Reuters (12 April 2006).

[3] Shakir F. et al., Center for American Progress Action Fund, “The Progress Report” (6 February 2007).

[4] Robert Greenstein, “Despite the Rhetoric, Budget Would Make Nation’s Fiscal Problems Worse and Further Widen Inequality,” Center on Budget and Policy Priorities (6 February 2007).

[5] Ibid.

[6] Richard Du Boff, “What Military Spending Really Costs,” Challenge 32 (September/October 1989), pp. 4–10.

[7] Congressional Budget Office, Historical Effective Federal Tax Rates: 1979 to 2004, as reported by the Center on Budget and Policy Priorities.

[8] Tax Policy Center, Tables T06-0279 and T06-0273.

[9] American Society of Civil Engineers, “What can happen if America fails to invest in its infrastructure? Anything,” news release (4 September 2003).

[10] Seymour Melman, “They Are All Implicated: In the Grip of Permanent War Economy” (15 March 2003).

[11] Ibid.

[12] M. Rothschild, “Katrina Compounded,” The Progressive (1 September 2005).

[13] Ibid.

[14] Ibid.

[15] AME Info, “Coke and Pepsi battle it out” (8 April 2004).

[16] Ibid.

[17] Jim Lobe, “Poll: War Bad for Business” (30 December 2004).

[18] Ibid.

[19] James Cox, “Financially ailing companies point to Iraq war,” USA Today (14 July 2004).

[20] Ibid.

[21] Paul Kennedy, The Rise and Fall of the Great Powers (New York, NY: Vintage Books, 1989); Chalmers Johnson, The Sorrows of Empire (New York, NY: Metropolitan Books, 2004); Ismael Hossein-zadeh, The Political Economy of U.S. Militarism (Palgrave-Macmillan, 2006).

[22] Naomi Wolf, “Fascist America, in 10 Easy Steps” (28 April 2007).

[23] Sidney Lens, The Military-Industrial Complex (Kansas City, Missouri: Pilgrim Press & the National Catholic Reporter, 1979).

[24] Howard Swint, “The Pentagon Ruled by Special Interests.”

FAIR USE NOTICE: This blog may contain copyrighted material. Such material is made available for educational purposes, to advance understanding of human rights, democracy, scientific, moral, ethical, and social justice issues, etc. This constitutes a ‘fair use’ of any such copyrighted material as provided for in Title 17 U.S.C. section 107 of the US Copyright Law. This material is distributed without profit.

The United States of Israel by Margaret Kimberley

by Margaret Kimberley
July 6th, 2007

Americans celebrate their nation’s independence on the Fourth of July. On that day in 1776 a group of propertied, nearly all slave holding, white men declared that Britain’s American colonies no longer existed as such. America was an independent nation and would fight to retain that status. How ironic that in July 2007, America is anything but independent from foreign influence.


The President Fails American History, Again: Bush, the Revolution and the Iraq War by Bart Gruzalski

Weekend Edition
July 7 / 8, 2007

On July 4th, 2007, President Bush compared the U.S. struggle in Iraq with the American Revolution. He reminded his audience of West Virginia Air National Guard personnel and their families that the first July 4th celebration in 1777 took place in the midst of “a bloody and difficult struggle that would not end for six more years before America finally secured her freedom.” Although “it is hard [today] to imagine the Revolutionary War coming out any other way,” he said, “at the time, America’s victory was far from certain.” He then turned to the war in Iraq. “You’re the successors of those brave men,” he told the crowd. “Like those early patriots, you’re fighting a new and unprecedented war.” He commended them for “showing that the courage which won our independence more than two centuries ago is alive and well here in West Virginia.”

What makes the American Revolution an inspirational example to people everywhere is that the first Americans were, like the Iraqis today, trying to end an occupation. Bush never got it, even though he pointed out that in 1777 “we were a small band of freedom-loving patriots taking on the most powerful empire in the world.” Today it is the Iraqis who are “taking on the most powerful empire in the world” and our own military that constitutes the occupation. This is the stark and obvious analogy between the American Revolution and the Iraq occupation and it does not bode well for U.S. military success.

The American Revolution jettisoned foreign rule and subjection to the English throne. It should not be surprising to Americans that other peoples want to free themselves from occupation and foreign rule. Being a colony and suffering foreign troops on our soil infuriated our ancestors and would infuriate most of us today. The U.S. currently has over 160,000 military personnel occupying Iraq, yet Iraq has less than 10 percent of the population of the U.S. If a proportionally equal number of foreign troops were occupying our nation, more than 1.6 million foreign troops would be in our cities, on our streets, and in our neighborhoods. There is no question about what would happen: we would mount an insurgency of unprecedented ferocity to drive the foreigners out. This is precisely what the Iraqi insurgents are trying to do today.

It is difficult to believe that some of the strategists in Washington, D.C. are not fully aware that the occupation itself creates the Iraqi insurgency. Even President Bush, in his April 2004 press conference, acknowledged that the Iraqis are “not happy they’re occupied” and added, “I wouldn’t be happy if I were occupied either.” What is shameless is that Bush used the example of our own American Revolution against a foreign occupation to pump up an audience of National Guard members and their families as he told them to be ready for even “more sacrifice” (the euphemism for casualties) in our war of occupation.

Bart Gruzalski is Professor Emeritus at Northeastern University. His most recent book is “On Gandhi.”

US ran nuclear weapons exercises the week before Bush-Putin summit by Michael Roston

Published: Friday July 6, 2007

Shortly before the so-called ‘Lobster Summit’ between President George W. Bush and his Russian counterpart Vladimir Putin in Kennebunkport, Maine, the United States appears to have carried out a significant nuclear weapons exercise, according to a report in Friday’s Washington Times.

“International radio operators picked up large numbers of coded Air Force communications being sent around the world on June 26 that indicated some type of military activity was about to take place,” writes Bill Gertz in his weekly “Inside The Ring” column.

Gertz suggests that the transmissions, which observers regarded as ‘extraordinary,’ were related to US nuclear forces.

“A U.S. military official said the radio traffic was monitored from the Air Force Global High Frequency System (GHFS) that some observers regarded as ‘extraordinary’ because of the unprecedented length of messages,” he writes. “The messages appeared to be emergency action messages, coded communications sent by the Joint Chiefs of Staff to U.S. Air Force strategic nuclear forces.”

An Air Force spokesman said the operation was ‘routine.’

The Bush-Putin summit in Maine started a week later, on Monday, July 2. US-Russian strategic nuclear relations were a subject of discussion, particularly the development of a replacement for verification rules established by the Strategic Arms Reduction Treaty (START), which is set to lapse in 2009.

Stephen Hadley, Bush’s National Security Adviser, said on July 2 that the White House was expecting the START issue to be handled by Secretary of State Condoleezza Rice and Russian Foreign Minister Sergei Lavrov in a Tuesday, July 3 meeting.

“There will be a document that will come out tomorrow, and the easiest thing to do is let that document come out,” he said in a Monday press briefing. “They’ve got a document that the two — the Foreign Minister and the Secretary of State will sign.”

But there appeared to be little result from the Rice-Lavrov discussions in the area of strategic nuclear relations, with decisions deferred to the future.

“[The] Ministers discussed development of a post-START arrangement to provide continuity and predictability regarding strategic offensive forces. Upon instructions of the Presidents, the sides will continue these discussions with a view toward early results,” both sides said in a carefully worded joint statement released on July 3.

Robert Joseph, the US Special Envoy on Nuclear Nonproliferation, acknowledged that agreement had been elusive on the aftermath of the START treaty.

“Now, as we haven’t come to agreement on what will replace START, but we are in the process of talking about that,” he said in a July 3 briefing. “We both want transparency. We both want confidence-building measures. We have talked about measures that would involve data exchanges and site visits. We have, I think, you know, a way to go in terms of our discussion, but we are actively working now that.”

Russia’s Deputy Foreign Minister, Sergei Kislyak, made clear in the same briefing that Russia is eager to make progress on the question of verifying the reduction of strategic nuclear forces on both sides.

“We need to think what will come up after the treaty does expire, because we do not want this very important process to get lost or just to be discontinued,” he said on Tuesday. “So the idea is that we will look into the treaty and the other ideas that are being fermented around the treaty and try to sort out what positive elements of this treaty should continue after it expires, and then we will build on that. That is the Russian concept.”

US-Russian nuclear tensions have increased in recent months. Putin has made vocal statements about the ability of Russian nuclear forces to overcome any missile defense system devised by the United States. Russia has been particularly upset with the potential siting of interceptor missiles in Eastern European countries such as the Czech Republic or Poland.

h/t: ICH


America The Beautiful’s Germ Warfare Rash By Sherwood Ross

Dandelion Salad

By Sherwood Ross
July 06, 2007

Speaking Truth to Power

Reprinted from THE HUMANIST

In his bellicose Cincinnati, Ohio, speech of October 7, 2002, President George W. Bush warned that Saddam Hussein must not be allowed to threaten America with “horrible poisons and diseases and gases and atomic weapons.” While the claims of Iraq’s possession of these weapons later proved unfounded, the president’s charges did point to a certain germ of truth: they neatly described his own operations.

Since the September 11, 2001, terrorist attacks, the Bush administration has spent at least $44 billion on biological “defense” without ever making a true needs assessment. In the early 1990s the Kremlin shut down its huge, Soviet-era germ warfare operation and, while Israel, Iran, and North Korea are known to have biological weapons research facilities and India, China, and Cuba are said to be building high-security labs to study lethal bacteria and viruses, these nascent or potential programs are dwarfed by the massive efforts underway in the United States. In the words of Edward Hammond, director of the Sunshine Project, an Austin, Texas-based group that tracks research involving biological agents: “Our biowarfare research is defending ourselves from ourselves. It’s a dog chasing its tail.”

Milton Leitenberg is an arms control authority and a member of the University of Maryland’s School of Public Policy and UM’s Center for International and Security Studies. In his 2005 book, Assessing the Biological Weapons and Bioterrorism Threat, Leitenberg writes that the risk of terrorists and nonstate actors using biological agents “has been systematically and deliberately exaggerated,” particularly after the 2001 anthrax attacks on Congress and media outlets. He contends that U.S. officials undertook a concerted effort to promote their view on the international stage and that an “edifice of institutes, programs, conferences, and publicists” continues to spread what he calls exaggeration and scare-mongering.

What’s more, while floating extravagant tales of terrorists planning to launch deadly germ attacks on the United States, the Bush administration has been diverting dollars from urgent medical research against real threats, such as avian influenza, to the creation of new strains of extinct killer diseases like Spanish flu. Upon his retirement in December 2004, Department of Health and Human Services Secretary Tommy Thompson cited pandemic flu as the greatest threat to the nation. Yet according to Leitenberg, Washington policymakers instead have focused on bioterrorism and biodefense.

Leitenberg posits it might take just such a pandemic to demonstrate to the public that Washington “has been using the overwhelming proportion of its relevant resources to prepare for the wrong contingency.” From 1977 to 1999, he notes, flu killed 788,000 people in the United States, about 36,000 a year. Even if there is no outbreak of pandemic flu, one could project 360,000 American deaths from flu over the next decade. When these figures are contrasted with the five deaths from the 2001 anthrax attacks, it is little short of amazing that in fiscal year 2006 the National Institutes of Health (NIH) received $1.76 billion for biodefense but only $120 million to fight influenza.
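Leitenberg's figures can be verified with back-of-envelope arithmetic. A sketch, assuming the 1977–1999 span is counted as 22 years, which reproduces the "about 36,000 a year" figure:

```python
# Check the flu-versus-biodefense figures cited from Leitenberg.
flu_deaths_1977_1999 = 788_000
per_year = flu_deaths_1977_1999 / 22       # roughly 35,800 deaths a year
decade_projection = 36_000 * 10            # the article's 360,000 over a decade

# FY2006 NIH funding split, per the article.
biodefense_budget = 1_760_000_000          # $1.76 billion for biodefense
influenza_budget = 120_000_000             # $120 million for influenza
funding_ratio = biodefense_budget / influenza_budget

print(f"Flu deaths per year: {per_year:,.0f}")
print(f"Biodefense receives {funding_ratio:.1f}x the influenza budget")
```

So biodefense received nearly fifteen times the influenza budget in a period when flu was killing tens of thousands of Americans a year and anthrax had killed five.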

If taxpayers are slow to recognize that billions of their tax dollars are being poured into hundreds of biological cesspools, some scientific bodies are not. According to the nonprofit Center For Arms Control and Non-Proliferation (CAC) in Washington, DC, in 2001 the U.S. government spent $1.6 billion to address the threat of biological weapons. By 2006 total spending had reached $36 billion, with a record $8 billion more earmarked for FY 2007. As noted, one of the leading agencies allocating such funds is the NIH, billed as “the steward of medical and behavioral research for the Nation.” Two years ago, the growing slice of the NIH budget being shifted to biodefense research–money that has traditionally gone to fighting diseases such as cancer–prompted 750 of the 1,143 NIH-funded scientists studying bacterial diseases to write an open letter to NIH Director Elias Zerhouni charging that the research center’s emphasis on biodefense had diminished their efforts to achieve basic research breakthroughs.

The public has good reason for concern. In the introduction to Francis Boyle’s 2005 book, Biowarfare and Terrorism, MIT molecular biology professor Jonathan King writes: “the Bush administration launched a major program which threatens to put the health of our people at far greater risk than the hazard to which they claimed to have been responding.” Bush’s policies, he continues, “do not increase the security of the American people” but “bring new risk to our population of the most appalling kind.”

From Washington State to Florida and from Massachusetts to California, the United States has broken out in a rash of federally funded biological warfare operations with as many as four hundred labs involved in research related to pathogens that could be used as bioweapons agents. According to the Sunshine Project, most states have a facility of some sort, ranging from an open-air testing location to aerosol test chambers to Biosafety Level 3 or Level 4 operations, the latter being one in which the pathogens being tampered with are deadly, easily transmissible, and have no known cure. Additionally, there are laboratories in a number of states whose activity is classified as secret. Beyond the NIH, the Department of Homeland Security (DHS) and the military are heavily engaged in such work, some of it conducted abroad, such as the Navy’s labs in Egypt, Peru, Indonesia, and Germany.

Biological warfare involves the use of living organisms for military purposes. Such weapons can be viral, bacterial, and fungal, among other forms, and can be spread over a large geographic terrain by wind, water, insect, animal, or human transmission. Among the most dangerous pathogens under study are anthrax, tularemia, plague, and the ebola virus, as well as toxins (poisons produced by living organisms such as fungi). Using genetic engineering, U.S. government scientists are purportedly concocting new strains of lethal microbes for which there are no cures. Bacteria, for example, can be made resistant to vaccines. Indeed, they can be made more virulent, easier to disseminate, and harder to eradicate. Some pathogens are even being injected with genes to make them resistant to antibiotic drugs. Words fail to describe this “achievement,” coming from the same country that gave the world the Jonas Salk and Albert Sabin polio vaccines.

As part of its buildup, in January 2005 the Army authorized construction of a new facility at the already sprawling U.S. Army Medical Research Institute of Infectious Diseases at Fort Detrick, Maryland. According to a July 31, 2006, report in London’s Guardian, Fort Detrick’s National Biodefense Analysis and Countermeasures Center (NBACC), due to be completed in 2008, “will house heavily guarded and hermetically sealed chambers in which scientists simulate potential terrorist attacks.” The scientists will dress in full-body spacesuits and use aerosol-test chambers to expose animals to deadly pathogens. To do so, the Guardian reported, the world’s most lethal bacteria and viruses would have to be produced and stockpiled. Questions of international law violations and the hastening of a biological arms race persist.

In December 2006 Battelle National Biodefense Institute hooked the $250-million, five-year DHS contract to run the NBACC. According to the Washington Post, much of what transpires at that center may never be known as the government intends to operate the facility largely in secret. In its July 30, 2006, article, the Post reported:

    The heart of the lab is a cluster of sealed chambers built to contain the world’s deadliest bacteria and viruses. There, scientists will spend their days simulating the unthinkable: bioterrorism attacks in the form of lethal anthrax spores rendered as wispy powders that can drift for miles on a summer breeze, or common viruses turned into deadly superbugs that ordinary drugs and vaccines cannot stop.

University of Illinois law professor Francis Boyle charges that the Bush administration is spending more money in inflation-adjusted dollars to develop illegal, offensive germ warfare than the $2 billion the United States spent on the Manhattan Project to make the atomic bomb. That weapon’s development was, at least, driven by the realistic fears that Nazi Germany might develop it first. Today, no comparable enemy exists.

Peculiarly, the only significant deadly germ warfare attack on the United States appeared to have come from the government’s own Fort Detrick site. A month after 9/11, the mysterious anthrax attacks killed five, sickened seventeen, and alarmed the nation. The perpetrator was never found (a poor showing for a country that spends $40 billion a year on intelligence), but the anthrax-laced letters to Democratic Senate Majority Leader Tom Daschle (D-SD) and Senator Patrick Leahy (D-VT) prodded Congress to rubberstamp an expansion of spending for biological defense through the Patriot and Project BioShield acts.

Project BioShield is a $5.6-billion plan under which the Department of Homeland Security is stockpiling vaccines and drugs to fight anthrax, smallpox, and other germ warfare agents. There is considerable dispute as to whether the plan’s activities open the door to aggressive use of such agents. According to Boyle, pursuant to two national strategy directives adopted by Bush in 2002, the Pentagon “is now gearing up to fight and ‘win’ biological warfare without prior public knowledge and review.” The Pentagon’s Chemical and Biological Defense Program was revised in 2003 to implement those directives, endorsing “first-use” strike of chemical and biological weapons in war. Boyle calls the directives the proverbial smoking gun and further points to President Bush’s Homeland Security Presidential Directive, HSPD-10, of April 28, 2004, which states:

    We are continuing to develop more forward-looking analyses, to include Red Teaming efforts, to understand new scientific trends that may be exploited by our adversaries to develop biological weapons and to help position intelligence collectors ahead of the problem.

“‘Red Teaming’ means that we actually have people out there on a Red Team plotting, planning, and scheming how to use biowarfare,” says Boyle. The Army has stated its work is, and will continue to be, solely defensive in nature. But when it comes to biowarfare and the agents involved, how do you prepare to defend against such threats without developing them?

According to Jeremy Rifkin, author of The Biotech Century, “it is widely acknowledged that it is virtually impossible to distinguish between defensive and offensive research in the field.” Still, as a signatory to the 1972 Biological Weapons Convention (entered into force in 1975), the United States is officially bound to the treaty, the scope of which is defined in Article 1:

    Each State Party to this Convention undertakes never in any circumstances to develop, produce, stockpile or otherwise acquire or retain: (1) Microbial or other biological agents, or toxins whatever their origin or method of production, of types and in quantities that have no justification for prophylactic, protective or other peaceful purposes; (2) Weapons, equipment or means of delivery designed to use such agents or toxins for hostile purposes or in armed conflict.

University of Maryland’s Leitenberg, a long-time authority in the arms control field, contends the government is not developing germ warfare weapons. However, in a monograph published by his organization, he does question whether ongoing research crosses the line:

    There is no such thing as ‘defensive’ biological weapons. Whatever military doctrine may say regarding distinctions between offensive and defensive conventional weapons, this does not apply to biological weapons. Article 1.1 of the BWC allows the growth of laboratory quantities of pathogens (agents) for defensive purposes, that is, in order to develop vaccines and pharmaceuticals, test rapid detection systems, masks, decontamination systems and so on. However, even the ‘development’ of the pathogen is explicitly forbidden–“never in any circumstances”–as is production and stockpiling.

Boyle, who drafted the 1989 federal law enacted by Congress that criminalized BWC violations, sees it more definitively; he contends the government is creating a killing machine. Fort Detrick’s activities betray aggressive intent, he says, and should be shut down and those responsible jailed. Others also apparently believe the line has been crossed. Commenting on Fort Detrick, Mark Wheelis, a microbiology professor at the University of California at Davis, told the Global Security Newswire in June 2004 there was no question that the activities underway there mirrored how offensive biological weapons capability would be developed. “We’re going to develop new pathogens for various purposes. We’re going to develop new ways of packaging them, new ways of disseminating them,” Wheelis outlined. “We’re going to harden them to environmental degradation. We’ll be prepared to go offensive at the drop of a hat if we so desire.” And on July 30, 2006, Leitenberg told the Washington Post, “If we saw others doing this kind of research, we would view it as an infringement of the bioweapons treaty. You can’t go around the world yelling about Iranian and Korean programs, about which we know very little, when we’ve got all this going on.”

As for whether there truly is an aggressive intent behind the government’s biological warfare research, one clue is that in February 2003 the United States granted itself a patent on an illegal, long-range bioweapons grenade in clear violation of the BWC mandate that prohibits such delivery devices. U.S. officials equivocated, claiming they never intended to use the biogrenade as described in their patent. Ironically, as Hammond pointed out in a news statement of November 30, 2004, “The United States invaded Iraq in pursuit of phantom bioweapons yet, here at home, it brazenly develops them.”

The Council for Responsible Genetics (CRG) has warned that the United States has progressively undermined international efforts to abolish biological weapons, noting that at the November 2001 Fifth Review Conference in Geneva, the United States rejected a “verification protocol for legally binding international investigations and inspections of all parties.” CRG pointed out that to create vaccines or antiviral agents against many of the most dangerous pathogens and toxins, researchers must first produce such agents in sizable quantities, and that, in the name of vaccine development, as many as twenty laboratories in the United States handle, manipulate, and in some cases weaponize, one of the most lethal strains of anthrax. Prominent among these facilities, CRG identified Dugway Proving Ground in Salt Lake City, Utah; the U.S. Army Medical Research Unit in Fort Detrick, Maryland; the Armed Forces Institute of Pathology in Washington, DC; the Battelle Memorial Institute in Columbus, Ohio; the University of New Mexico Health Sciences Center in Albuquerque, New Mexico; the Centers for Disease Control in Atlanta, Georgia; and the Aberdeen Proving Ground in Aberdeen, Maryland.

Critics of the biodefense building boom contend that increasing the number of high-containment labs around the country that will handle potential germ warfare agents only increases the likelihood of accidental (or intentional) releases that ultimately could threaten public safety. Richard H. Ebright, a Rutgers University chemist who tracks arms control issues, told the Baltimore Sun that the government’s tenfold expansion of Biosafety Level-4 laboratories raises the risk of spreading dangerous organisms. “If a worker in one of these facilities removes a single viral particle or a single cell, which cannot be detected or prevented,” he cautioned, “that single particle or cell can form the basis of an outbreak.”

Just recently, the Sunshine Project learned that for fourteen months, Texas A&M University concealed an incident in which a student researcher fell seriously ill from undulant fever. In March 2007 hundreds of people were evacuated from Boston University’s ten-story biomedical research building after white smoke wafted through a laboratory containing tularemia bacteria. Exposure to the bacteria can produce sudden chills, fever, pneumonia, and can even prove to be fatal. Meanwhile, the metal frame of another large BU biolab is rising nearby at a cost of $178 million where, the Boston Globe reported in March, “researchers would work with the world’s deadliest germs, including Ebola, plague, and anthrax.” Anthrax, the nation learned in 2001, subjects its victims to breathing difficulties and wracking coughs as it starves the body of oxygen, often leading to death.

Despite the hue and cry of the individuals interviewed or otherwise quoted here, the biowarfare buildup is getting an enthusiastic response from academia, which sees new funds flowing from Washington’s horn of plenty. “American universities have a long history of willingly permitting their research agenda, researchers, institutes, and laboratories to be co-opted, corrupted, and perverted by the Pentagon and the CIA,” Boyle writes.

More than a dozen universities and private consortia are currently vying to win the DHS contract for its own new biodefense research center tagged at roughly $450 million. The proposed 520,000 square-foot main building of the National Bio and Agro-Defense Facility (NBAF) will be the nucleus of a complex liable to exceed one hundred acres. Although advertised to replace an aging facility at Plum Island, New York, the DHS recently announced plans to spend $30 million to expand that lab. NBAF bidders include state universities of Alabama, California, Florida, Georgia, Iowa, Kansas, Kentucky, Maryland, Mississippi, Missouri, North Carolina, Oklahoma, Tennessee, Texas, and Wisconsin.

According to DHS literature, the NBAF complex will fill a critical void in responding to “high consequence biological threats involving human, zoonotic and foreign animal diseases.” While DHS advertises it as a response to “threats,” the Council for Responsible Genetics notes that because efforts to diagnose and treat exposure to biological weapons necessarily involve their production and dispersal, transparency measures must be enforced to verify the defensive intent of such efforts. CRG laments that the U.S. rejection of the BWC inspection and verification protocol undermines that obligation.

Meanwhile, many university NBAF project bidders have a history of operating in secret, and this has in no way barred them from bidding for new operations. Institutional Biosafety Committee (IBC) disclosure is vital, Sunshine’s Hammond says, for protecting against the human health and environmental risks of biotechnology research. Instead of making its IBC records public as required by NIH guidelines, the University of Maryland, for example, has refused to provide any significant information to the Sunshine Project. “It has lost requests for records, refused them, delayed its response, and when it has replied, provided useless paperwork from which it has redacted all meaningful information,” Hammond contends. Scores of other universities are no more forthcoming.

Many big pharmaceutical houses and biotech firms that have received NIH dollars also conceal their operations from the public. Among those, Sunshine identified: Abbott Laboratories, BASF Plant Science, Bristol-Myers Squibb, DuPont Central Research and Development, Eli Lilly Corp., Embrex, GlaxoSmithKline, Hoffman-La Roche, Merck & Co., Monsanto, Pfizer Inc., Schering-Plough Research Institute, and Syngenta Corp. of Switzerland. Of the top twenty biotech firms, only Genzyme and Millennium Pharmaceuticals, both headquartered in Cambridge, Massachusetts, complied with NIH guidelines–likely because reporting is mandated by local law. This illustrates the massive failure of voluntary compliance. Only 8,500, or 16 percent, of the 52,000 workers employed at the top twenty U.S. biotech firms work at an NIH guidelines-compliant company, Sunshine estimated.

Here and there, concerned citizens are speaking out. Private and government groups around the nation are protesting the bid for the National Bio and Agro-Defense Facility: In Dunn, Wisconsin, the Dane County Board is opposing the University of Wisconsin’s idea of building the NBAF complex on their turf; in Tracy, California, the city council voted against allowing an NBAF facility at Lawrence Livermore, while more than 3,000 people paid to wire DHS Secretary Michael Chertoff opposing the bid and 2,000 more sent e-mail messages in opposition; in Seattle the city council forced the University of Washington to withdraw its bid; at the proposed NBAF site in Leavenworth, Kansas, residents voiced concern over lab safety, the impact on property values, and the potential to make the area a terrorism risk; in Mississippi, opponents posted “No Bio-Lab” signs; and Kentucky residents greeted federal officials making a visit to a proposed site with posters reading “Hal! No! We won’t go!” in a reference to Representative Hal Rogers (R-KY).

In Boston, neighbors of the new BU lab under construction have convinced the Massachusetts Supreme Court to hear their objections. In Maryland, area residents are objecting to the enlargement of Fort Detrick. And when the Army announced plans this past March to reopen the Baker Laboratory on its Dugway Proving Grounds (eighty miles southwest of Salt Lake City) for the purpose of testing anthrax, the Salt Lake Tribune recalled years ago when 6,000 sheep grazing near Dugway were killed, likely by nerve gas. “The Army is working on the deadly pathogens for classified defense purposes. That’s scary,” the Tribune editorialized. “It’s no wonder we’re concerned.”

An editorial of this sort is a rarity. Apart from coverage in the Washington Post and New York Times, major media has done little investigation into the underlying reasons for Bush’s biodefense buildup. For that matter, Leitenberg says, “not a single member of the House or Senate has questioned that expenditure or called for its reduction or basic redirection.”

More questions must be asked. In its news release of August 16, 2001, the Department of Health and Human Services–laying out its plan to combat a possible bioterrorism threat–said it was increasing its support for research related to “likely” agents. If so, what is the point of regenerating an extinct 1918 killer flu virus? The same release warned that “large numbers of people might be directly exposed to an agent released in a dense urban environment.” If so, why entertain bids for new facilities in such areas? Deadly pathogens are the last thing the world needs. And yet what we have taking shape in the United States today is the costliest, most grandiose germ warfare research program ever attempted. It involves developmental work with the deadliest and most loathsome pathogens capable of triggering plagues and epidemics. It is being conducted in good part in secret without adequate oversight and in violation of the NIH’s own rules and treaty requirements for transparency. It is being lavishly funded while urgent biological research to combat imminent health threats is delayed or denied. It’s not only a staggering waste of taxpayer treasure and a perversion of scientific ingenuity but it needlessly puts Americans, and all humanity, at risk.

Sherwood Ross has worked in an executive capacity in the civil rights movement, as a host of a Washington talk radio show, and as a reporter for major dailies and wire services. He can be reached at



Plague of bioweapons accidents afflicts the US by Debora MacKenzie

Troubled Soldier Gets Demoted, Not Treated By Aaron Glantz

Dandelion Salad

By Aaron Glantz

Cody Miranda joined the U.S. Marine Corps when he was 17 years old. He loved the military and hoped to spend his entire career in the service.

Miranda has served more than 16 years in the Marine Corps. Over the years, he’s been deployed to the Middle East six times, including stints in the 1991 Persian Gulf War and the 2003 invasion of Iraq.

But when he returned from a tour in Iraq in 2003, his stepmother Jodie Stewart says, he was a changed man.

“He always used to be over focused on time as the military trains you to be,” she said as an example. “He’s never on time for anything anymore. I don’t know how to explain it to you. How do you explain it when a man who used to behave one way has gone abstractly and profoundly different?”

After returning from Iraq, Cody Miranda divorced his wife and pulled away from his son. He started drinking too much and was found in possession of cocaine.

“He never received any of the post-deployment questionnaires that now are mandatory for all troops,” said Amanda Newman, a licensed family therapist who’s been seeing Miranda on a pro-bono basis for the past few weeks. “He couldn’t understand why all of a sudden his life was falling apart.”

In 2005, Miranda went Absent Without Leave from Camp Pendleton in California for nearly a year and lived homeless on the street.

When he returned to the Marine Corps, military doctors diagnosed him with severe post-traumatic stress disorder, an anxiety illness that can develop after exposure to a terrifying event or ordeal in which grave physical harm occurred or was threatened, according to the National Institute of Mental Health. A person having a flashback may lose touch with reality and believe that the traumatic incident is happening all over again.

Military doctors also diagnosed Miranda with bipolar disorder, insomnia and sleep apnea.

But rather than give him treatment for his illness, the Marine Corps lowered his rank from staff sergeant to private, threw him in the brig multiple times (most recently for being five minutes late for a hearing), and began court martial proceedings that could have led to a dishonourable discharge — which would have denied him the medical benefits Miranda needs to get his life back on track.

Newman said Miranda needs inpatient psychiatric care, which he is not receiving, and complained that her attempts to see him while in the brig were delayed as a result of military orders.

“I asked immediately to see him in the brig and was told that it was not possible,” Newman wrote to Miranda’s military lawyer on Jun. 29. “This is absolutely unacceptable: if a Marine was experiencing a medical emergency and had cut an artery and was bleeding profusely, he surely would not be denied treatment simply because he was in the brig.”

“In fact I would assume and hope that he would be transferred to the hospital for appropriate treatment. There is no difference regarding the severity and crisis nature of Pvt Miranda’s psychiatric condition and that of a medical condition: both are life threatening,” she wrote.

Officials at Camp Pendleton did not respond to multiple telephone and e-mail inquiries by deadline. Thirty-six hours after receiving a written request for information, a public affairs representative of the base told IPS: “I still don’t have anything for you.”

Public attention did appear to have an effect, however.

On Tuesday, after veterans’ groups helped Miranda file formal complaints with California Congressman Ken Calvert and Senator Barbara Boxer, Camp Pendleton’s commander, Col. James B. Seaton, abandoned plans for a court martial.

According to military defence lawyer Captain Bart Slabbekorn, Miranda was brought before the base commander Jul. 3 and given “non-judicial punishment.”

“As a result of today’s proceedings, Pvt Miranda may be retained in the Marine Corps or he may ultimately leave active duty,” Slabbekorn wrote in a letter to supporters. “Either way, at this point, he will be looking at a discharge making him eligible for VA (Veterans Affairs) treatment down the road.”

If Miranda does remain in the military, it’s likely he will be assigned to the Wounded Warrior Battalion, where he would work with other soldiers facing similar issues.

“The future is up to Miranda,” Slabbekorn said.

But Cody Miranda is not alone.

The Department of Defence’s most recent mental health survey found about 20 percent of soldiers met screening criteria for a mental health problem and that there was a “linear relationship” between combat exposure and subsequent mental health problems. Nearly one-third of troops who had seen “high combat” met criteria for a mental health problem.

Slabbekorn told San Diego’s KSUI television that between 10 and 20 percent of soldiers imprisoned in Camp Pendleton’s brig suffer from some kind of combat-related mental illness.

In the first four years of the Iraq war, 1,019 Marines were dismissed with less-than-honourable discharges for misconduct committed after overseas deployments. Navy Capt. William Nash, who coordinates the Marines’ combat stress programme, told USA Today this week that at least 326 of the discharged Marines showed evidence of mental health problems, possibly from combat stress.

Nash told the paper he hoped that “any Marine or sailor who commits particularly uncharacteristic misconduct following deployment…be aggressively screened for stress disorders and treated.”

“If a Marine who was previously a good, solid Marine — never got in trouble — commits misconduct after deployment and turns out they have PTSD, and because of justice they lose their benefits, that may not be justice,” Nash said.

The Marine Corps has yet to follow up on Nash’s recommendations.


Before You Enlist

Prisoner 345: What happened to Al Jazeera’s Sami al-Haj By Rachel Morris

On December 15, 2001, early in the morning on the last day of Ramadan, a reporter and a cameraman from Al Jazeera arrived at the Pakistani town of Chaman on the Afghanistan border, on their way to cover the American military operation. The reporter, Abdelhaq Sadah, was replacing a colleague, but the cameraman, a Sudanese national named Sami al-Haj, had been on such an assignment before, and had crossed the border without incident. This time, however, an immigration official stopped him. He seemed angry. The official told Sadah that he could go, but “your friend is a wanted man and will stay here.”

In Sadah’s recollection, the official produced a letter from Pakistani intelligence—written, curiously, in English. It said that al-Haj had Al Qaeda ties and should be apprehended. Al-Haj noticed that the passport number in the letter didn’t correspond to the one in his current passport, but instead to an old passport he had lost several years ago in Sudan and had reported missing. Despite his protests, the official insisted on detaining him overnight. The next morning, Sadah returned to the border post just in time to see a Pakistani military officer lead al-Haj to a car and drive him away.

Al-Haj is a tall, slender man whose round face and glasses give him a boyish demeanor. In photographs, he looks much younger than his thirty-eight years. People who have met him invariably describe him as polite; in conversation he is said to smile almost constantly.

After Sadah informed Al Jazeera management what had happened, the network made contact with the Pakistani authorities and was told that al-Haj’s background was being investigated. On January 4, al-Haj called his wife, Asma, who was then living in Azerbaijan. He sounded confident, almost cheerful, saying that he expected to be back in Doha, Qatar, Al Jazeera’s headquarters, in two or three days.

Instead, al-Haj was taken to an underground prison in Kabul. There, he was transferred to American custody. On January 7, he was brought by helicopter to Bagram Air Base. Al-Haj later described his disorienting arrival to his lawyer. After a fifteen-minute flight, he said, he was pitched from the helicopter into the icy night, hitting the tarmac so hard that he briefly lost consciousness. He claimed that he was then kicked and beaten by military police, who removed the black bag from his head and cut off his clothes. After performing what al-Haj called an “intimate body search,” they dressed him in a blue uniform, and said, “You record videos of Osama bin Laden for Al Jazeera.”

For the next six months, al-Haj was held in Afghanistan. In early June 2002, he was put on a military plane. In another letter to his lawyer, he explains that his hands were gloved and cuffed and linked to his leg shackles; his mouth was gagged. Every so often, American soldiers removed the gag to feed him peanut butter crackers. The plane landed many hours later. On June 14, al-Haj was given an orange jumpsuit and the ID number 345. He was in Cuba. For the past five years, al-Haj has been the only journalist known to be held in Guantánamo Bay.

Many questions surround Sami al-Haj. After talking with his colleagues, friends, family members, and lawyer, I could piece together only a partial picture of his life. He grew up in the Sudan, where an uncle, who was better off than al-Haj’s family, helped him attend college in India, where he studied computers and English. In the late 1990s, he took a job as an administrative assistant for a company called Union Beverages, and later worked in a similar role for an import-export company in the United Arab Emirates. In 1997, a former university classmate introduced him to Asma, and they married the following year. Asma told me that her husband was “a very kind-hearted person, [but] we didn’t have deep conversations about our future or experiences.” She added that he liked to sleep a lot, to watch television (usually Al Jazeera and Egyptian movies), and to read “every newspaper he could find.” In 2000, the couple had a son, Mohammed. Soon after, al-Haj answered a newspaper advertisement for a trainee position at Al Jazeera, and the family moved to Qatar. He started work on a trial basis in April 2000.



A Farewell to Arms Control By Scott Ritter

Dandelion Salad

By Scott Ritter
Jul 5, 2007

The organization that was at the center of the maelstrom of the Iraqi weapons-of-mass-destruction fiasco, responsible for bringing the world to the brink of war on no fewer than a half-dozen occasions during the 1990s, and then unable to prevent a war in March 2003, has departed the global scene. It left not with a dramatic flair befitting its former status, but rather with barely a whimper, reduced to nothing more than a historical footnote in the grand tragedy that has become Iraq.

Antiwar Radio: Scott Horton Interviews Chris Floyd (audio)

Chris Floyd (whose website is apparently being hacked), author, columnist, blogger and activist, discusses the phony war on terrorism, the politics of fear, racism and imperialism, America’s murderous regime change in Somalia and belligerent attitude toward Russia.

MP3 here.

Chris Floyd is an award-winning American journalist and author of the book Empire Burlesque: High Crimes and Low Comedy in the Bush Regime. For more than 11 years he wrote the featured political column, Global Eye, for The Moscow Times and the St. Petersburg Times in Russia. He also served as a UK correspondent and was an editorial writer for three years for The Bergen Record. His work appears regularly in CounterPunch and The Baltimore Chronicle, and in translation in the Italian paper Il Manifesto, and has also been published in such venues as The Nation, the Christian Science Monitor, Columbia Journalism Review, The Ecologist and many others. His articles are also featured regularly on such websites as Information Clearing House, Buzzflash, Bushwatch and many others. His work has been cited in The New York Times, USA Today, the Guardian, the Independent and other major newspapers. Floyd co-founded the blog Empire Burlesque with webmaster Richard Kastelein, who created the site using open-source software. Floyd is also chief editor of Atlantic Free Press, which was founded and designed by Kastelein. Floyd has been a writer and editor for more than 25 years, working in the United States, Great Britain and Russia for various newspapers, magazines, the U.S. government and Oxford University.

FAIR USE NOTICE: This blog may contain copyrighted material. Such material is made available for educational purposes, to advance understanding of human rights, democracy, scientific, moral, ethical, and social justice issues, etc. This constitutes a ‘fair use’ of any such copyrighted material as provided for in Title 17 U.S.C. section 107 of the US Copyright Law. This material is distributed without profit.

Finding Ian Shook Me to the Core (a bent memoir) by Mescalito

On Myspace:


Finding Ian Shook Me to the Core (a bent memoir)

Current mood: Between a crock and a lard place

by Mescalito
July 5, 2007

I was sure Ian had killed himself somehow or had had his personality destroyed beyond all serviceable humanity by his illness and that legendary acid overdose. A morning spent taking chemical courage nurtured the notion to drive to his parents’ house, and my evil amygdala boosted my fear that he’d died into total certainty that he had. The house was quiet. That couldn’t be a good sign. It was the 4th of July. But I rang the doorbell, rubbing my scalp and prepping my nerves for a terrible blow, when I saw his big, football-shaped head in the window. He opened the door and stood there – huge, egg-shaped, looking healthier than I’ve ever been – and I felt as strange and gamey as I always had been back in high school when he and I were inseparable. He invited me in without saying much of anything. After twelve years apart we still knew each other so well that our mutual shock could go unspoken. We sat down on the couch. He put on the Sopranos – something cool and seedy for two cool and seedy guys, if only in our hearts. After smoking in strangely comfortable silence he said, “What are you on?” He could tell! What nerve! “I didn’t want to hide it from you,” I said. “I’m not like that. I just, well with everything you’ve gone through, I didn’t want to get you into anything that might, ah-“

“Send me into psychotic shock.”

“Yeah.” I laid down a few lines. He served me soda and we played pool in the garage. We talked about the days of our teenage wildness. His recall was twice as good as mine, and I’d never even smoked grass until the age of 21. I felt like a silly kid, totally exposed, but safe. The story of his acid overdose was blown completely out of proportion by a bunch of cracked-out waiters who we’d both lost touch with. But he had gone schizophrenic. It runs in his family. And it made his twenties a hell. But he stood before me now, with terrible lucidity, telling me things about myself I’d forgotten and giving me advice about drugs that I wouldn’t have expected from anyone less than the late W.S. Burroughs.

“Honestly man,” I said. “I had given up on you. I was sure you were dead or had lost your humanity. Now I find you here, like this, well and calm as a monk. It’s like meeting a bear in the forest who it turns out can talk and what’s more knows all about you. You’ll have to forgive me if I seem nervous. Plus the coke is coming down. I may get the Fear. But I can see it coming. I’ll take off before it takes hold.”

He laughed at me. “You need something else to bring you down,” he said. I went white and had to steady myself on the table. He was right. I’d had whiskey on hand almost every time, but not in the last few weeks. I thought it was just a side effect of regular cocaine use.

((This is a true story. And it happened only just yesterday. At the moment I find myself unable to accurately recreate Ian’s impressive presence and its contrast with the crazed and incoherent person I remembered him as. So I’m not going to finish it here, as I am still processing the memory. I went home, still coming down, getting nervous, depressed and unsure of myself, while using all my hidden discipline to be mindful of the fact that this sudden crushing sadness and self-doubt was only chemical. I went to bed scared as a chicken in a thunderstorm, but not of danger. It was a fear of life, of that which must be done, and of the future that this down-phase, for which I was not prepared, put into me. It took my whole will not to call my EX and beg for a cuddle. For the first time in years I felt embarrassed, because Ian had just seen me being very foolish and not showing the usual self-awareness of which I usually boast. He became like a kind older brother while I turned into a head-shy dog. I intend to visit him again tomorrow before I leave town, give him a copy of my novel and try to show him some sense in a relatively sober state.

I will feel compelled to paint a clearer picture of Ian in the future as I put my memories of yesterday’s encounter in order. Until then, a toast to old friends!))