The Official Blog Of Edward Cline

Month: June 2006

Billions for boondoggles, but not a cent for reason

“Don’t bother to examine a folly — ask only what it accomplishes.”

So said Ellsworth Toohey to Peter Keating near the end of Ayn Rand’s The Fountainhead in a speech in which he explains the method of his plan to rule the world.

I have commented in the past on the folly of Bill Gates spending his billions on fighting various “ills” without bothering to examine their causes. Now Warren Buffett will donate $37 billion to the Bill and Melinda Gates Foundation, or about 85 percent of his estimated worth of $44 billion. Bill Gates himself is worth an estimated $50 billion.

“Now worth $30 billion, the Gates foundation is one of the world’s richest philanthropic organizations,” says a Reuters article. “It has committed millions to fighting diseases such as malaria and tuberculosis in developing countries, and to education and library technology in the United States.”

It is not just the wasted billions that cause the mind to reel. It is also, among other things, the utter futility of the gestures. “Developing countries”? Read “Third World” backwaters that will never develop beyond what they are now: incubators of poverty, starvation, disease, death and tyranny. Their various inhabitants and rulers are as clueless about the political and economic causes of malaria, tuberculosis and poverty as, apparently, are Gates and Buffett, who have far less excuse for their ignorance.

What is astonishing is that neither Gates nor Buffett grasps the origin of his wealth, nor apparently has either ever bothered to ask himself why there is a difference between the American standard of living and wealth and the degrees of destitution that are responsible for the diseases they wish to combat and cure. Or, if they have sensed or identified the difference (and, given their public statements, there is no evidence they place any importance on it), their altruistic premises trump any such distinction.

The folly must be examined in order to understand what will and will not be accomplished by pouring billions of dollars into the bottomless pits of the needy around the globe, and by perpetuating the ever-deepening sinkholes of American public education.

Let us first note that charities produce nothing. They are eminently non-productive. The United States is rich because it is productive, because so much created wealth was invested in other productive enterprises, and only a fraction of the produced wealth was ever donated to charities. (And we will leave aside for the moment the incalculable wealth confiscated by the U.S., state and other governments, also non-productive entities, dedicated to such boondoggles as Social Security and rebuilding New Orleans.) Perhaps the fortunes Gates will invest in research to cure malaria (God forbid he advocate the application of DDT) and tuberculosis will actually produce the hoped-for cures. Fine. That will leave the cured to endure poverty, starvation and other diseases, not to mention the turmoil and anarchy and tyrannical brutality of the countries in which malaria and tuberculosis might be checked.

Let us also note that, with regard to the wealth Gates will donate to the public education system, the students who will be the immediate or direct beneficiaries of that money, for as long as they are hostages of that system, will not emerge as brighter students or super-achievers. By all the direct evidence of plummeting test scores and the inability of increasing numbers of students and young adults to think, read, do simple math, and write, learning how to use technology or some souped-up library or data system will not turn them into independent individuals capable of emulating Gates’s business success.

What Gates overlooks or is oblivious to is that the education system is committed to turning young people into selfless individuals who defer to arbitrary authority and regard themselves as mere cogs in society, some more adept or skillful than others, tolerated as long as they remain obedient ciphers.

And for as long as Gates and Buffett are lauded as role models of “responsible citizens” and exemplars of sacrifice and “giving back,” any given student will be discouraged from developing a personal, selfish ambition, and never encouraged to ask the question: Give back what, and to whom? This is a more potent consequence of their actions, more potent than any amount of money they may donate. Perhaps the most perilous thing Gates’s folly will accomplish is the further “legitimization” of selflessness as a “noble” virtue.

These students will end up as shortsighted or blind as Gates and Buffett must be in any realm beyond their businesses, and the realm in which they are most blind is the moral foundation of capitalism and freedom. By announcing their intention to squander their wealth in a prolonged orgy of altruism, they betray the last vestiges of the morality that allowed them to succeed in business. Evidently, throughout their careers, Gates and Buffett accepted the idea that greed and personal ambition were either evil or irrelevant. It was “practical” to make a profit, but not moral. Ideas? Principles? Free minds? Championing capitalism? No. Apparently, bridge games are the limit of their intellectual efforts.

Their altruist campaign to “do good” by “giving back” is evidence of what could be called moral autism. Webster’s New Collegiate Dictionary defines autism as an “absorption in phantasy to the exclusion of reality.” The Oxford Concise Dictionary adds a term directly related to altruism in its definition: “Morbid absorption in fantasy,” the word morbid being medically indicative of disease, combined, in Gates’s case, with an obsession to cure the ills of the world. And altruism is the progenitor of myriad fantasies. It requires leaving reason behind and focusing on ridding the world of an “ill” with no reference to reality.

I am not certain that experts have determined whether clinically defined autism is a consequence of physiological disorders or psychological ones, or a combination of them. But I am certain that moral autism is a consequence of a profound philosophical disorder: the automatic suspension of reason where moral values are in question and a departure from reality, a condition required by altruism.

A medically certified autistic person may not have any control over his condition, but the moral autism of Bill Gates, Warren Buffett, and countless other individuals is a matter of choice. If the mark of autism is a “disconnect” from reality in favor of a fantasy in which reason is neither applicable nor welcome, then Gates and Buffett are morally autistic. For them, there is no rational causal relationship between reality and morality. They can be brilliant in business, yet become idiots in the realm of morality.

Another observation is that all those who are praising Gates and Buffett are gloating in self-righteous vindication of the altruism they have been promoting all their lives and careers, happy that such enormous wealth will be consumed in altruist programs. What is obscene about this event is the glib sanctimony of Gates and Buffett, and the smug sanctimony of those who approve of the give-away.

It would be interesting to hear the reception these same altruists — in the news media, in universities, by politicians, in churches — would give Gates and Buffett if these men announced instead that they planned to devote their billions to educating Americans on the values of reason, capitalism, and liberty, and to rediscovering the America that the Founders intended this country to be — a land of the free, not a home of the selfless.

Our nation was young when Charles Cotesworth Pinckney, an American minister to Republican France, replied to an official French request for graft, “Millions for defense, but not a cent for tribute.” Altruism has so warped the character of our nation that now the reply is: “Billions for boondoggles, but not a cent for reason.”

Moral autism is a disease that can be combated and cured only by advocating reason and capitalism. A nation that treats it as a normal, even exemplary, condition will not know why it is perishing.

Enshrining sacrifice: the American Film Institute’s ‘inspiring’ film list

It was inevitable, almost predestined, that Frank Capra’s cinematic paean to selflessness and self-sacrifice, “It’s a Wonderful Life” (1946), would be voted the most inspiring American film out of one hundred candidates by the American Film Institute. In a culture that values altruism as a primary, uncontroversial, not-to-be-questioned virtue, it is almost an instance of determinism.

On its official website, the AFI’s director and CEO, Jean Picker Firstenberg, explained the purpose of the program that aired the choices on national television on June 14:

“The past few years have not been easy in America — from September 11th to the devastation of hurricanes Katrina, Rita and Wilma. AFI’s 100 Years…100 Cheers will celebrate the films that inspire us, encourage us to make a difference and send us from the theatre with a greater sense of possibility and hope for the future.”

The website notes: “AFI distributed a ballot in November 2005 with 300 nominated inspiring movies to a jury of over 1,500 leaders from the creative community, including film artists (directors, screenwriters, actors, editors, cinematographers), critics and historians.”

“To make a difference,” in the context of the Capra film, is a euphemism for selfless efforts on behalf of others, for “giving back” to society, to the “community,” to the world.

The AFI program, broadcast under the title “Cheers,” elaborates on its moral criteria for the “most inspiring”:

“Movies that inspire with characters of vision and conviction who face adversity and often make a personal sacrifice for the greater good. Whether these movies end happily or not, they are ultimately triumphant — both filling audiences with hope and empowering them with the spirit of human potential.”

And therein is the clincher: “sacrifice for the greater good.”

In previous commentary, I cited Bill Gates’s decision to “give back” his billions as a conspicuous instance of craven selflessness in a commitment to “make a difference for the greater good.” It is his money, and he has a right to dispose of it as he wishes. One can think of a number of “worthier” things he could spend the money on than on the insatiable demands of the needy, such as the endowment of a university fully staffed by advocates of reason and freedom.

However, one would like to ask him: “On the premise that you are giving back to society what you took from it, what exactly is it that you took? Ideas for software? Programming innovations? If you concede that you originated those things, and not society, why are you branding yourself as a thief or a repentant debtor? If you concede that you took your customers’ money in trade, why do you believe that you don’t deserve every penny of it? Haven’t your products revolutionized men’s lives and made an incalculable difference? If you concede that you gave the public a priceless value, why are you willing to believe that it was immoral, immaterial, or irrelevant, and that you must make amends?”

But it is nearly futile to argue with a convert to altruism. One’s only weapon is reason, and altruism is reason-proof. It derogates the self and selfishness. It is a corrosive that eats away at a mind and renders it progressively impervious to rational persuasion. This is why I rarely attempt to persuade an otherwise rational person of the folly and impracticality of his altruist beliefs. The transition from an altruist morality to one of rational selfishness is too great a mental task for a person who at least senses the rightness of a refutation of altruism; he would see that he would need to repudiate nearly everything on which he has based his life. It is too frightening or traumatic a prospect, and the person will choose instead to “blank out” without pursuing the subject privately or in conversation.

This is not so much a digression as it is an elucidation. To the AFI, the term “inspiration,” in a literary or artistic context, refers almost exclusively to the motivation to practice altruism and self-sacrifice. It has nothing to do with what Ayn Rand called “spiritual fuel” to pursue or fight for one’s values. In her essay, “What is Romanticism?” in The Romantic Manifesto, she writes:

“The archenemy and destroyer of Romanticism was the altruist morality. Since Romanticism’s essential characteristic is the projection of values, particularly moral values, altruism introduced an insolvable conflict into Romantic literature from the start. The altruist morality cannot be practiced (except in the form of self-destruction) and, therefore, cannot be projected or dramatized convincingly in terms of man’s life on earth….”

In that same essay, she notes:

“Romanticism is a category of art based on the recognition of the principle that man possesses the faculty of volition…. If man possesses volition, then the crucial aspect of his life is his choice of values — if he chooses values, then he must act to gain and/or keep them — if so, then he must set his goals and engage in purposeful action to achieve them.”

Some of the films that made the top 100 list are “inspiring” for the right reasons; that is, they do not inspire one to devote one’s life to others’ needs or to sacrifice anything, but dramatize the pursuit of personal values. The values whose pursuit they dramatize are as varied as the subjects and themes of the films. And some of them dramatize apparent sacrifices that are actually actions taken at risk to preserve values.

To cite an example from the AFI list, “Gunga Din” is about a water-carrier for the British army in India. He wants to be a regular soldier in that army, but is scoffed at for his ambition. He risks his life to warn the army of a trap, and is killed. This is not so much a “sacrifice” as the achievement of his goal of being a soldier (knowing the risks of being one). The same could be said about “Glory,” in which the principal characters die as soldiers risking their lives to fight for their values. About these and a few other films that feature the risks of warfare, the last thing one would want to hear is President Bush pontificating on the virtue of sacrifice in relation to collectivist or altruist goals. Bush and Hollywood, ostensibly enemies, have more in common than either would be willing to acknowledge.

I personally find these inspiring stories. On the other hand, as a teenager I found the deterministic, Shakespearean “Lawrence of Arabia” inspiring not only for its numerous production values (such as direction, cinematography, casting, and dialogue), but chiefly because it suggested what would be possible if those same production values were applied to Romantic stories.

The majority of the films on the AFI list, however, fall somewhere in between value-pursuit and value-sacrifice, or have little or nothing to do with either end, such as “2001: A Space Odyssey.” The list is as mixed as an altruist’s premises. One revolts against the presence of some films on the same list as others. “Shane” and “High Noon” should not be in the same company with “Harold and Maude” and “Dances with Wolves.” It is also worth noting that “The Fountainhead” did not make it to the list.

There is no room here to discuss all one hundred films on the AFI list of the “most inspiring.” That would require a book. But an Associated Press article on the AFI list is instructive about the moral esteem in which “It’s a Wonderful Life” is held in modern culture. It is the story of George Bailey, who surrenders his personal ambition to the needs of his “community” and is about to commit suicide when, as the A.P. article describes it, he “got a chance to see how ugly the world would be without him,” that is, had he never been born, and thus never been conned into relinquishing that ambition. At movie’s end, George’s brother, referring to all the people in Bedford Falls George has “helped,” proclaims him the richest man in town.

“We all connect to that story,” said Bob Gazzale, producer of the AFI TV special. “We may not all connect to the story of a fighter from Philadelphia or a singing family in the Austrian Alps. But there’s no way to get away from the inspiring story of George Bailey. It relates to us all.”

No, it does not, if by “relate” he means that we all have the potential for selflessness or self-sacrifice, or the capacity to tolerate it for the sake of others’ needs, as George Bailey chose to tolerate it. The first time I saw the Frank Capra film as a child, I was repelled by it, and for a long time I wondered why it was so revered. As a novelist, I have always wanted to rewrite that story. But Ayn Rand beat me to it in Atlas Shrugged, the story of heroes who refuse to be George Baileys.

It would be interesting to speculate on whether or not Bill Gates, now the richest man in the world, found “It’s a Wonderful Life” the most inspiring movie he ever saw, and whether or not he ever privately wondered, at the peak of his career, when he was being sued by rivals and hounded by the U.S. government and the European Union, what the world would be like had he not pursued his own selfish ambition to create Microsoft, or if he now withdrew the products of his mind.

That, however, would necessitate the self-esteem of a man proud of his achievements, together with a knowledge of the injustices perpetrated against him. Bill Gates lacks both that self-esteem and a sense of justice; he is motivated by humility and mercy, the twin enemies of justice. He meets the criteria of a sacrificer for the “greater good.”

Bill Gates, alias George Bailey, alias Mitchell Layton

“Mitchell Layton had inherited a quarter of a billion dollars and had spent the thirty-three years of his life trying to make amends for it.” (The Fountainhead, p. 579, Centennial edition)

The estimated personal worth of Bill Gates, age 50, chairman of Microsoft, the fourth-largest company in the world, is $50 billion, all of it earned, not inherited, and he has proposed devoting the balance of his life to making amends for it. Bill Gates announced yesterday that he will be spending less time running Microsoft and more time on his charity work, “giving back” to the world.

Let us say that Gates somehow manages to conquer malaria without the benefit of using DDT. Fifty million children are saved. Then what? What are they going to do with their lives and health on a continent plagued by dictatorships, poverty, and corruption?

Gates believes in “giving back” his billions. His Bill and Melinda Gates Foundation is endowed with $30 billion. In another quarter, President Bush is motivated by the same altruist morality: that Americans should be willing to help Iraq achieve “democracy” (fallaciously equated with freedom) and to sacrifice their lives in the effort. His program is tentatively projected to cost $500 billion, and rising. Making Iraq “safe for democracy” will somehow ensure America’s security.

Pouring wealth down the bottomless pit of Africa or Iraq will not accomplish “good.” Doing “good” without the least thought of in what context and circumstance the “good” might actually have some tangible, beneficial results will have no results, or results that are inimical to all concerned parties. (The youth of Saudi Arabia, for example, benefit physically from billions in oil revenues; having no purpose in life, many turn into murdering jihadists.) Apparently Gates has devoted little or no thought to the necessary conditions that would ensure that children did not starve, contract AIDS, or succumb to malaria, just as Bush has devoted little or no thought to the conditions necessary to ensure any country’s freedom, prosperity, and well-being.

One would think that such an elemental fact would occur to someone as bright as Bill Gates. But one of the pernicious effects of an altruist morality on an otherwise rational and productive mind is that it necessarily, fundamentally, and incrementally dissolves the causal connections that lead to rational conclusions. Gates would not pour a fortune into the development of a software program that not only did not work, but was proven to damage or destroy an operating system or computer hardware. Yet he will spend his fortune to “do good” without the least consideration of what causes the “bad.”

Altruism divorces the real world from the moral world, which is believed to exist on a “higher plane” yet somehow able to influence the real world. To Gates, there is no connection between freedom, property rights, and the sanctity of the individual, on the one hand, and the prosperity and well-being these things make possible, on the other. He sees misery, starvation, and disease in Africa and other “undeveloped” regions of the world, and believes that money, not freedom, will eradicate them.

Gates’s father, William Gates, recently appeared in the news to argue strenuously against the temporary repeal of the blatantly confiscatory estate tax by Congress, righteously claiming that wealth is a “privilege,” that the wealthy actually have little right to arrange for the disposal of their fortunes upon their deaths, and that the government has a responsibility to tax that wealth away for society’s sake as a form of “giving back.” One can imagine that Bill, his son, has been moral putty in the father’s hands. One is tempted to cast the father as an Ellsworth Toohey, and almost tempted to pity Bill Gates.

Asked in an interview which he would like to be remembered for, Microsoft or his charity work, Bill Gates replied that he didn’t think it was important what he was remembered for, just as long as “good” was done. He answered the question, which startled him, almost immediately, with no evidence of offense or pride in his manner.

The news media stressed that Gates wishes to give his money away to programs that produce “results.” The media also lauded Gates as a marvelous example to American children and young adults who are attracted to “volunteerism.” I do not know the details of his purpose in pouring money into our ravenously wasteful and destructive education system (other than to introduce children to technology), but if that program is motivated by the same altruist spirit, the only lasting result will be the inculcation of more “selfless” ciphers, the Brown Shirts and self-sacrificers (and sacrificers of others) of tomorrow.

It is no cultural coincidence that the American Film Institute recently voted, out of one hundred candidates, that the most inspiring is “It’s a Wonderful Life,” the Frank Capra “classic” about a man, George Bailey, who surrenders his ambition to the needs of his “community.” Bill Gates is another George Bailey. Reality emulates art again.

It is the daunting task of reason to destroy once and for all the myth that the selfless man is an exemplar of morality, beginning with Robin Hood and ending, to date, with Bill Gates. Ayn Rand was right that Immanuel Kant and his numerous yea-sayers over the centuries are man’s most evil nemeses.

Saudi suicides and a deadly double standard

All right. Two Saudis and one Yemeni committed suicide at the Guantanamo Bay prison. And? As one correspondent of mine remarked: “Since Muslims are committing suicide on a daily basis all over the world — and killing as many others as is possible with themselves — what is so hard to believe about three suicides in a jail?” Remember that every one of the 460 detainees at Gitmo was either taken in combat against U.S. forces in Afghanistan or Iraq or elsewhere, or taken as a suspect with terrorist or Taliban connections, and scheduled to be tried by a tribunal.

It is hard to believe if reality does not conform to one’s wishes.

Remember that these are not “rockin’” fans of the Dixie Chicks or gentle Bono groupies or twittering sycophants of Muslim-patronizing Prince Charles of Britain, spirited away from Pennsylvania Avenue or the Strand and unlawfully incarcerated without charge. These are men who would just as soon cut the throats of American civilians with box-cutters, or hijack another planeload of them and smash it into the U.S. Capitol in an act of suicidal jihad. Or at least stockpile bags of ammonium nitrate fertilizer to grow more piles of Western bodies and rubble.

No, if you listen to the news media, you are not to remember that. You are to buy the story that these three succumbed to the “stench of despair,” as Mark Denbeaux, a law professor at Seton Hall University, described the “plight” of prisoners at Gitmo. Denbeaux and his son represent two Tunisian prisoners there. It did not occur to the writer of the Associated Press report that quoted Denbeaux to wonder: Who is paying Denbeaux’s retainer? CAIR? Or some other Islamic front organization funded by Saudi Arabia? Attorneys cost money. So does the judicial system, even for pro bono lawyers. But don’t expect any investigative Pulitzer Prize-winning stories to result from that tidbit.

You are not to remember either the extent to which the American military has gone to accommodate Islamic customs at Gitmo in terms of prayer times and prayer rugs, food, free copies of the Koran, not to mention all the medical services, cleaner clothing than most of them ever wore, and other perks that no American prisoners of war ever enjoyed in any war of the 20th century. The U.S. has gone more than the whole nine yards to fend off accusations by “human rights” organizations that it is mistreating prisoners, even to the extent of calling them “detainees” and not “prisoners of war.”

No, you are to empathize with their suffering, not your own or that of Americans whom these “detainees” have killed or would have killed if not captured. You are to forget that every one of them acted in the name of a totalitarian ideology that regards due process, individual rights, and freedom as the corrupt practices of men to be either killed or enslaved.

A measure of the news media’s virulent hatred of the U.S. is how quickly and eagerly it will jump on any rumor of American misbehavior. Its malevolent glee at a chance to knock the Marines — the proudest and least politically correct American military service — over Haditha must sit in the craw of anyone who has ever been in combat against Islamic “insurgents” or lost a friend or relative to these “freedom fighters.” You want to put your fist through the TV screen and wipe the sanctimony from the faces of Charles Gibson, Matt Lauer, Diane Sawyer and their patronizingly skeptical brothers and sisters elsewhere in the media.

The crime this time is that the American news media especially is willing to grant credence to our enemies first — an enemy that knows how to work the West’s multicultural and relativist premises to his full advantage — before examining facts, or even recognizing that there are such things as facts. Observe, for example, how the media dwells on the Israeli mortar shell dropped on a Gaza beach, killing some “innocent” Palestinian civilians. Watch the footage, and if you have half a brain you must ask yourself: Why does this look so staged? What was a cameraman doing there with a camcorder and audio? Why does the little girl behave as if she is following directions?

You can almost hear the Hamas’s verbal cues. “Now, run along the beach looking for your father. Don’t look at the camera! Okay. Now you see him. I’ll put the camera on him, and then you see him and flop into the sand and roll back and forth hysterically, screaming anguish and bloody murder. Try not to look up at the camera, or it’ll look phony. Hey, great work, little one! Now we have an excuse to fire more rockets into Israel. To hell with their apologies. We want to kill Jews. Say, little one, how would you like to wear a pretty new vest?”

But, back to the Gitmo suicides. The Associated Press reports General John Craddock, commander of the U.S. Southern Command, saying that the “suicides were part of Islamic militants’ holy war against the United States and its allies.” “They’re determined, intelligent, committed elements,” said Craddock, “and they continue to do everything they can…to become martyrs in the jihad.” “Militants”? Not prisoners of war?

Fine. Let more of them commit suicide and martyr themselves. Give them the bed sheets and maybe some nylon rope. It will mean fewer hostile mouths for U.S. taxpayers to feed. It’s a thought, but all 130 Saudis at Gitmo could be freed by herding them onto Air Force transports for “release” over Riyadh, together with about a thousand 500-pound bombs targeted on various palaces of the sheiks, the mourning tents, and mosques.

The Associated Press reports of June 12th on the suicides read like an Islamic agony column. (A correspondent of mine queried whether or not the Associated Press and Reuters might be sub-cells of al-Qaeda, which is not so wild a hypothesis, since rich Saudis are stealthily buying interests in Western news organizations.) Ample space was given to the likes of Denbeaux and his ilk in the European Union and Saudi Arabia, all of whom commiserate over the “detainees” and who call for the closing of Gitmo and release of the “detainees.” Very little space was devoted to the American position. Most Saudis don’t believe the deaths were the result of suicides, or if they believe they were suicides, they were brought on by “torture.”

“A crime was committed here,” Kateb al Shimri told the Associated Press, “and the U.S. authorities are responsible.” The Associated Press went on to say that Shimri echoed “the general sentiment heard in the Saudi capital.” Shimri is a Saudi lawyer representing relatives of Saudis held at Gitmo. He plans to sue the U.S. government for compensation on behalf of the relatives of the suicides. “Many Saudis denounced the suicide claims as a fabrication, and some accused the U.S. authorities of complicity in the inmates’ deaths.”

“They were killed; they were murdered,” one mother of a Saudi prisoner of war wailed. “This was no suicide.” This from a Muslim woman who would have celebrated her son’s death had he wandered into an Israeli pizza parlor and blown himself and twenty people up. That action, presumably, would not qualify as killing and murder. Does anyone out there see the deadly double standard that is eroding the separation of Western and Muslim cultures, the moral chasm that divides the life-giving values of the West and the death-worshipping cult of the East?

Finally, I quote another Saudi whose veracity is demonstrably impeachable, Mufleh al-Qahtani, deputy director of the kingdom’s Saudi Human Rights Group. “There are no independent monitors at the detention camp,” he said to the Associated Press, “so it is easy to pin the crime on the prisoners, given that it’s possible they were tortured.” Also, the A.P. article reported that “The kingdom’s semiofficial human rights organization called for an independent investigation into the deaths of the two Saudis.”

I submit that a Saudi “human rights” organization is as much an oxymoron as a Mafia-run squad that offers crime victims trauma and bereavement counseling. It would be the stuff of satire were it not actually happening.

I submit also that it is beyond bizarre. Doubtless Shimri and al-Qahtani speak with the approval of the Saudi government. Saudi Arabia last week complained that the State Department included it in a list of twelve countries that deal in “human trafficking” (also known as slavery), and that this was “unfair” in light of American guilt. Prince Turki al-Faisal, Saudi ambassador to the U.S., speaking to a group of Nashville businessmen, said that “We read in American media and the press about the mistreatment of illegals who come to the U.S. seeking work and end up in brothels and gangs and unacceptable servitude, whether in factories or at farms, and yet that is not mentioned in the State Department report.”

More sanctimony that invites a punch in the face. It cannot even be called “hypocrisy.” You see how slyly and effectively the double standard of fatal altruist/pragmatist/multicultural Western premises can be used against us. You can see it; President Bush and Condoleezza Rice cannot. Or will not. Al-Faisal and his ilk know how to work a crowd of dupes and apologists and exploit our own double standard of good and bad premises. “You are altruists, but not perfect. Do not presume to throw stones at us. What you call crimes and abuses, you are guilty of committing.” And the dupes and apologists and aging hippies in three-piece suits nod in sad concession.

Saudi Arabia is a medieval dinosaur that respects honor killings, the castration of boys, the subjugation of women, and tribal vendettas; supports kindergartens for killers called “madrasas”; funds the jihad against the West through various “charities,” foundations, and oil revenues; and regularly practices extortion, especially against the U.S. Saudi and other Islamic mouthpieces in the U.S. call for Sharia law to replace the Constitution. (Since that is an assertion of a “religious belief,” it cannot be defined as advocating treason, even though Islam makes no distinction between “church” and state.)

It would be interesting to listen to a debate on the subject of which Muslim country is our deadlier enemy: Saudi Arabia or Iran. If I were a judge of such a debate, I would be obliged to call a tie and give both sides an equal number of marks.

Sunday Cafe: Why the Music Died*

Today, one often hears the question asked — sometimes despairingly, sometimes jeeringly — why, if classical music is so wonderful, uplifting, and timeless, it is no longer being composed. The stock answers are numerous, but unconvincing.

One is that classical music is peculiar to a period of European history dating approximately from the Renaissance through the nineteenth century, and thus is not the “voice” of our age. But that classical music remains valued by so many people in this age belies this assertion.

Another argument claims that classical composition has “evolved” beyond harmony, tonality, and melody to a “new plateau” of atonality. A variant of this argument charges that the public “ear,” so habituated to the traditional forms of musicality, suffers from a sort of evolutionary, tonal lag because it has not kept pace with the ever-evolving musical avant-garde, purportedly representative of an advanced species of humanity. Thus, the ear must be trained or “conditioned” to plumb the reputed depths of jumbles of random sounds, or, in some cases, no sounds at all.

This is the complaint of the modern artist who sneers that the public cannot appreciate his abstract rendering of, say, Perseus and Andromeda, as a canvas of blots, drippings, and sprinkled-on metal shavings. The public, with the notable exception of an aesthetically superior minority, is philistine, perhaps even artistically “reactionary”; it is confined to a reificatory, bourgeois aesthetic prison, and insists that art be — Gads! Can you credit it? — intelligible and that music be compatible with its inchoate psychology.

Modern “formal” music, like modern art, is devoted to addressing a “higher” consciousness, using a “logic” that transcends syllogisms, proportion, time, space dimension, sense perception, and other Euro- and/or logo-centric “constructs.” In short, reality. It requires that listeners revise their expectations, discard the “prejudice” of the various centrisms, and passively receive logically ineffable droplets of pure essence, or pure being — or deliberately unintegrated sense data.

Among the many demerits of the politically correct Webster’s II New Riverside University Dictionary (1994) is its definition of music: “The art of arranging tones in an orderly sequence so as to produce a unified and continuous composition.” This definition is a step backward from “The science or art of incorporating intelligible combinations of tones into a composition having structure and continuity,” which is the definition found in Webster’s Seventh New Collegiate Dictionary (1969). The Riverside definition replaces the key term intelligible with orderly, which can mean virtually anything, and the term structure with unified, which can also mean virtually anything. One can imagine that the next edition of the Riverside will shed the self-conscious air of its ambiguous qualifiers and offer an au courant, fashionably “deconstructed” definition: “The art of arranging tones in a sequence to produce a composition” — which, of course, could be applied equally to Beethoven’s “Symphony No. 5” or to the gruntings and squeals of a pig sty.

A musical composition is an identifiable sum of its parts. A composition that has no structure, that seems to fly apart, or worse, seems to be notes and rhythms randomly flung into the air to fall where they may on a blank music sheet, has no sum, no identity, and no theme but chaos and madness. A composition of jumbled sounds “represents” merely the modernist fixation with pseudo-aesthetics and artistic fraud.

In her explanation of the purpose and demands of music, novelist-philosopher Ayn Rand wrote:

“It is in terms of his fundamental emotions — i.e., the emotions produced by his own metaphysical value judgments — that man responds to music….The theme of a composition entitled ‘Spring Song’ is not spring, but the emotions which spring evoked in the composer….Liszt’s ‘St. Francis Walking on the Water’ was inspired by a specific legend but what it conveys is a passionately dedicated struggle and triumph — by whom and in the name of what, is for each individual to supply.” 1

It was fashionable among early twentieth century composers to write melodic music punctuated by stretches of dissonance. Ralph Vaughan Williams, Aaron Copland, Charles Ives, and Virgil Thomson all interspersed orchestrated “folk” melodies with dissonance. Even Edward Elgar, in his later work, resorted to the practice. They all helped to make madness and the irrational respectable. Copland’s “Symphony No. 3,” for example, uses his well-known “Fanfare for the Common Man” as a melody around which he weaves screeches, drum rolls that herald nothing, and other chaotic noise. And none but the musicians who must play it can remember the full score of Samuel Barber’s “Adagio.”

“Don’t set out to raze all shrines — you’ll frighten men,” says Ellsworth Toohey, the critic and arch-villain in Rand’s novel, The Fountainhead. “Enshrine mediocrity — and the shrines are razed.”2 Toohey offers that advice in the course of explicating, for one of his willingly duped victims, his method of inculcating and promulgating collectivism in men’s souls. He could have added: Elevate incompetence, and competence is irrelevant; sanctify the irrational, and the rational is emasculated; praise noise, and music is silenced. The principle behind Thomas Gresham’s law, that bad money will drive out the good, is equally applicable to art and music, especially in a culture that is in a state of philosophical disintegration, and in which the destroyers are blithely sustained by the destroyed. Indeed, the idea that our culture, in its present state of anarchy, could generate classical music, seems almost oxymoronic.

“Doctors have this theory that if you play classical music for infants, they’ll understand complex relationships, like math. They don’t know what effect rock-and-roll would have. Well, we figure the world could do with one fewer accountant.”

This message was spoken by a post-adolescent male voice in a smarmy drawl in an ad for a popular radio station, accompanied by a series of jerky, time-lapse close-ups of a smiling infant rolling its head back and forth on a pillow in seeming enjoyment of the dissonant “rock” being played in the background. The commercial’s message is clear: It is not necessary for anyone to understand “complex relationships like math,” or to develop much skill in any field of mental labor. It is okay to raise a child to be a cognitive troglodyte, unable to raise his consciousness beyond the immediately perceptible, impatient with music that demands conceptual integration or that addresses a soul he may never recognize he possesses, or could have possessed, indifferent or hostile to anything that “makes sense.”

Whether or not there is any scientific truth to the theory that a particular genre of music can aid (or arrest) the development of a child’s mental faculties, the ad implicitly endorses the stunting of children’s minds. Accountant doubtless is used as a generic pejorative for all professionals who deal in facts, which includes the universe of Western science and technology that allows the intellectually slothful to exist in relative opulence and without having to exert much mental effort. The ad is distinctly anti-mind.

Anyone who regularly attends classical music concerts must be familiar with the practice of conductors or music directors of inserting “new” (or even old) atonal compositions between “traditional” ones in a program. An orchestra might begin with, say, Mozart’s “Impresario Overture,” end with Prokofiev’s “Classical Symphony,” and sandwich in between them something like Peter Warlock’s “Capriol Suite.” The practice ensures that concertgoers hear something of the “new plateau” genre whether they want to or not. And they will hear it, chiefly because most concertgoers believe it would be rude to rise en masse, leave the hall, and return when the noise has subsided. Modern “formal” music is played to audiences held hostage by their own civility.

If an orchestra were to advertise an all-Warlock, or an all-John Cage, or an all-Schoenberg concert, attendance would be embarrassingly thin. Why conductors or music directors continue the practice of subjecting their audiences to aural torture is a matter of conjecture. Perhaps they feel duty-bound to be “fair” to the newer composers; perhaps they feel obligated to play the compositions of government- or foundation-subsidized artists.

The last possibility has some interesting implications. How many orchestras remain wholly supported by private donations and receipts, free of the pressures exerted by the byzantine mazes of public arts funding bureaucracies? Very few. That they must resort to this brand of extortion underscores the bankruptcy of what they foist upon their audiences.

Surely conductors know the difference between Camille Saint-Saëns’ “Phaeton” and Fritz Kreisler’s “String Quartet.” They must suspect that people attend live performances for many reasons, but that voluntary submission to what amounts to an enervating, auditory Rorschach test is not one of them. Whatever rationalizations have been offered by defenders of the practice, it is as purposeful as art galleries exhibiting kitsch or non-art together with genuine art. The unstated purpose of these exercises is to “enshrine mediocrity,” to subvert and destroy values, to undercut man’s capacity to formulate or sustain values, and to introduce doubt in his mind about the values he does hold.

One regularly exposed to this practice, if he does not maintain the conviction that what is being committed is a fraud, will begin to think: “Perhaps there is something here, something important about these lead pipes welded together to make a stick man. It’s right there next to Canova’s ‘Cupid and Psyche.’ Perhaps I’ve missed the boat, and shouldn’t be so smug (or certain) about these things.”

This individual will not stop seeing the stick man as a bunch of pipes welded together, nor will he begin doubting the artistic value of the Canova, but he may begin to doubt the evidence of his senses, the certainty of his mind. Some part of his implicit certitude concerning right and wrong, good and bad, beautiful and ugly, reality and fantasy, will turn to mush, the certitude progressively softened by the miasma of a subjectivist, value-negating artistic nihilism.

This is an instance of retrogression, of the flaunting of primitivism as merely a “cultural difference.” Among this country’s black youth the results of this value negation have been especially sad. The enormity of the evil perpetrated on them by their parents and teachers defies description. “Cultural separatism” shares the same corrupting end as atonal “formal” composition: to be both A and non-A; that is, to live in a country whose high standard of living is made possible by Western values, but to hold conscious values that are hostile or inimical to the West and civilized living.

William Grimes, reporting on a highly publicized debate between August Wilson, the Pulitzer-winning black playwright, and Robert Brustein, drama critic for The New Republic, wrote: “Mr. Wilson tried to explain that his insistence on a black theater was not limiting.”3

“Why is white experience assumed to be universal, he asked, and black experience somehow particular? Why are black artists expected to become universal by transcending race and moving beyond black themes?”4

Grimes added:

“Black Americans, Mr. Wilson said, want to enter the American mainstream, but not at the price of shedding their African identity. Black artists have a duty to preserve and promote the thoughts and values of their ancestors, including their African ancestors. ‘If we choose not to assimilate…this does not mean we oppose the values of the dominant culture, but rather we wish to champion our own causes, our own celebrations, our own values.'”5

Mr. Grimes did not broach such questions as: What is a “black theme”? What is it that Mr. Wilson wishes to perpetuate? Is it only black “angst”? Is it merely “white” experiences that the playwright wants segregated from the mainstream, or is it Western values in general? Are the concepts of individual rights and independent minds too universal or too peculiarly “white” to apply to blacks? How can one support individual freedoms, yet uphold a tribal (i.e., collectivist) consciousness at the same time?

“Separatism” may be achieved, but an “ethno-culture,” burdened with such phenomena as “Ebonics” in language, will not send probes to Mars, invent open-heart surgery, or grow corn. The great black musicians who contributed to American culture, e.g., Scott Joplin, Duke Ellington, Lionel Hampton, and Louis Armstrong, have apparently been disowned in favor of the malevolent “dissing” and droning of “rap.” Armstrong and company are now no more revered among Afro-centrists than are Thomas Sowell, J.C. Watts, Walter Williams, or Ward Connerly among thinkers, economists or educators, black or white.

Composers of film scores inherited the mantle of classical music composers. There is little distinction between what moved the latter and what can inspire the best creators of film scores: a story, a legend, an image, a tableau, a play, a need to express some inner conviction or truth. Once, much film music approached the symphonic or classical level. Many scores by composers such as William Walton, Arthur Bliss, John Barry, and Miklos Rozsa are as evocative and memorable as any opus from the nineteenth century, and can stand alone apart from their original inspiration. Walton’s score for Henry V, Maurice Jarre’s for Lawrence of Arabia, and James Horner’s for Glory come to mind as instances of what is possible.

The best film scores were those written for grand-scale, larger-than-life epics. But such epics are no longer being produced. Great music cannot be written to dramatize triteness, or about psychotics, functional illiterates, criminals, perverts, predatory aliens, whales or dinosaurs. And great music cannot be indefinitely appropriated to accompany and elevate the depiction of the superficial, the witless, the stupid, or the banal, such as in Woody Allen’s Manhattan.

The preferred and broadening cesspool of subject matter of most filmmakers today cannot serve as the genesis of magnificent, or even pleasant, music. Popular films have become little more than vehicles for “special effects”; their stories are superfluous appendages, flimsy excuses to exhibit the technological repertoire of their computer graphics artists and incendiary experts. “Serious” films today, such as Love! Valour! Compassion! and Female Perversions (dealing, respectively, with homosexual relationships and feminist existentialism), are not rich material for great music, either. Film scores are written now to be heard and promptly forgotten.

A word about bass in contemporary popular music. Were this a separate article, its title could well be “Technology in the Hands of Barbarians.” The stress on “mega” bass (of 120 decibels or more, crowding the 180 decibel range of a NASA rocket launch) is especially revealing, for it confesses an attempt to compensate for vapidity of content in what passes for contemporary popular music. Bass, once merely one musical element among many, has come to dominate “pop” music because this type of music requires the least amount of thought or imagination by either its composers or listeners. Its continual “thumping” — in popular music and even in television commercials — is used to arrest one’s attention, deaden thought, and metaphorically beat listeners to a stupefied pulp. On dance floors and in bars, it imposes a nihilistic gestalt on everyone and everything it touches. It is not joy or happiness or even sorrow that this kind of bass seeks to evoke, but a temporary state of annihilation.

Bass is also employed now as a weapon against civilized existence by those who install expensive “mega bass” amplifiers, “woofers,” and speakers in their vehicles. It is easy to name the motive of the owners of these throbbing machines: pure, unadulterated malice. The blasts that emanate from these vehicles are distracting not merely because of their volume; their peculiar, offensive, intrusive nature penetrates one’s consciousness as a disruptive, often painful force. It is not joy that the perpetrators of the “mega bass” phenomenon wish to share with random passersby or residents, but hatred and the chance to torture without physically touching anyone. What such creatures are saying is: We’re a revolting nuisance, but we’re here, we’re pumping up the volume, and there’s nothing you can do about it.

“Rap,” of course, cannot even be considered as music. Its belligerent tone, its monotonous, metronomic beat, its obscene and homicidal “lyrics,” and its confrontational delivery, taken together, mark it as simply a species of malevolence.

Students attending the best music schools are no longer taught how to compose “classical” music. These schools, such as the Peabody in Baltimore, the Curtis in Philadelphia, and the Juilliard in New York, are turning out talented soloists, but their philosophy of composition is governed — if modern “formal” music is any kind of gauge — by the likes of Arnold Schoenberg, or worse. Consider the spirit of the nineteenth century, and one will understand the reasons why so much great music was written in that era. Consider the spirit of our time, and one will grasp the significance of music as a litmus test of general cultural well-being or decay.

A culture takes its cues from the top — from the universities, from the intelligentsia, from the trendsetters of ideas. And if the message from the top is that anything goes, then all that is good will go. The rubbish, bile, and nihilism that pass for music today cannot be legislated out of existence. Conservatives such as William Bennett, the former Secretary of Education, have proposed silencing the barbarians and frauds and nuisances, but even if they could be repressed or muffled, the appearance of a new Verdi, Brahms or Chopin will not be the consequence.

What is true of politics is true of aesthetics. Just as a free nation will collapse into statism when the most rational elements of the political philosophy on which it was founded and sustained are subverted or negated by their antipodes, so the best in aesthetics will vanish when the irrational, the atonal, and the unintelligible are given equal time and equal approbation.

The sad truth is that we should not expect greatness in music to emerge from a decaying, rudderless culture.

* Revised. Originally published in The Social Critic, Summer 1997

1 Ayn Rand, “Art and Cognition,” in The Romantic Manifesto (New York: New American Library).
2 Ayn Rand, The Fountainhead (Indianapolis, IN: Bobbs-Merrill Co., 1943), p. 691.
3 William Grimes, “Face-to-Face Encounter on Race in the Theater,” New York Times, January 29, 1997, Sec. C, p. 9.
4 Ibid.
5 Ibid.

Neville Chamberlain Redux

“The world must be made safe for democracy,” said President Woodrow Wilson to Congress on April 2nd, 1917, some months after he proposed “peace without victory.” Four days later Congress approved a declaration of war against Germany. Wilson could have asked for a declaration much earlier. German submarines were sinking neutral American shipping in a policy of unrestricted submarine warfare; five merchant ships were sunk in February and March of that year alone.

Wilson had been waiting for a more “overt” act of belligerence against the U.S. than the loss of American lives at sea at German hands. But the most recent sinkings, together with the Zimmermann note to the German minister in Mexico, forced him to face reality. The Zimmermann note pledged Germany to support Mexico in an invasion of the U.S. southwest to deter certain American entry into the European conflict until after Germany had beaten Britain and France to exhaustion. If the U.S. declared war, German foreign minister Arthur Zimmermann instructed his minister in Mexico to assure Mexico that “we shall make war together and together make peace. We shall give generous financial support and it is understood that Mexico is to reconquer the lost territory in New Mexico, Texas and Arizona.”

A declaration of war was not what Wilson had in mind as an altruist “tonic of a moral adventure,” as editor and fellow Progressive Herbert Croly had prescribed for America years before. Rather, it was the role of mediator and “peace maker” in the conflicts and international disputes of the early 20th century.

Shuttle ahead ninety years to Georgetown University, where British Prime Minister Tony Blair, in remarks about the “new” global politics, proclaimed, “Idealism becomes the realpolitik.” An essential part of that “idealism” is the introduction of “democracy” in regions of the world that have seen no legitimate governments in over a century, chiefly because their inhabitants did not know what to do with democracy, except to vote themselves new tyrants or tolerate old ones. Democracy, however, means mob rule, no matter how legitimate it sounds. It recognizes no individual rights that a majority cannot abridge or abrogate.

Even Wilson’s contemporary, Vladimir Lenin, understood that. “Democracy is not identical with majority rule.” Off by one adverb in that statement, he elucidates the point in contradiction of himself. “Democracy is a State which recognizes the subjection of the minority to the majority, that is, an organization for the systematic use of force by one class against the other, by one part of the population against another.” (Chapter 4, State and Revolution, 1919) Which is why democracy was as much his enemy as “capitalistic” republicanism, to be ruthlessly crushed. After all, in terms of a nation’s population, a totalitarian party’s members are always in the minority.

The point here is that President Bush’s and Mr. Blair’s “idealism” does not fundamentally differ from Wilson’s. Its moral core consists of blind duty and the sacrifice of wealth and of lives to accomplish the spread of democracy. Integral to the concept is that the U.S. should eschew its selfish isolationism and adopt a proactive, Kantian “moral” role to correct wrongs wherever it might see them. Our political leaders are ruled by the little Prussian’s categorical imperative to “do the right thing” regardless of cost, self-interest, or even of consequence.

“Democracy” thus complements such “idealist realpolitik,” and not merely because it holds populist appeal or because it is easier for politicians to pronounce than “constitutional republic” (which is what the U.S. is becoming less and less). That is the true character of Mr. Blair’s “realpolitik.” It is the “idealism” of humility, retreat, and ultimate self-destruction.

In the conflict with Iran and its neo-Hitlerian President Mahmoud Ahmadinejad, Bush contends that the issue of Iran’s nuclear weapons development can be resolved with “robust diplomacy.” That was Wilson’s premise behind his proposal for an international peace conference to end the fighting between the European powers, and the basis of Neville Chamberlain’s negotiations with Nazi Germany.

Wilson also said, in April 1915, that “No nation is fit to sit in judgment upon any other nation.” Both Bush and Blair have refined that idea, alleging that no religion is fit to sit in judgment of any other creed. Their altruist, Christian premises forbid them to condemn Islam, and allow them to claim that Islam is not the motivating force behind terrorism. It has been “hijacked,” or “perverted.”

Anyone who has read the Koran knows this is an absurd notion, as absurd as the notion that Hitler “hijacked” Nazism or that Stalin “perverted” communism. But, then, Bush and Blair believe in democracy, as well.

Some commentators may suspect that the May 31st news that the U.S. is willing to negotiate directly with Iran is a ruse to assure world opinion that it is not trying to bully Iran into giving up its nuclear enrichment program, and that it does not intend to employ force against Iran.

Given recent developments, we can believe that it is not a ruse. President Bush and Secretary of State Condoleezza Rice are willing to take both of Ahmadinejad’s hands and personally lead him to the higher plateau of international amity, global peace, and pure “democracy,” with Prime Minister Blair, Europe, Russia, China and others flinging confetti and flowers at them. Ahmadinejad can snarl and missile-rattle all he wishes; Bush and Rice are willing to forget dignity and take the abuse in the name of a higher cause.

Ahmadinejad is a beast, they agree. But he is there, a metaphysical given, and must be dealt with without igniting more conflict or exacerbating existing animosity. Rice acknowledged at a news conference that Iran is a supporter of terrorism “in Lebanon and Palestinian territories,” according to an Associated Press report on May 31st. But, “Iran can and should be a responsible state.” No mention by her or Bush of its support of terrorism in Iraq, where Iran’s “insurgent” proxies and planners are picking off Americans and Iraqis by the busload. Apparently, that is not an “overt” enough act of war.

Nazi Germany and Imperial Japan were also metaphysical givens. Our policy more than half a century ago was to erase those givens, and that was the end of that. There were no negotiating “tables” to lure dictators to in the name of peace, just the burnt-out shell of the Reichstag and the wind-blown ashes of Hiroshima.

As John Lewis remarked in correspondence elsewhere in response to the AP report, “Note that Rice’s admission that Iran has a right to nuclear energy is the same error the British made prior to WW II, when they accepted that Germany had the same ‘right to self-determination’ as other nations.”

Iran’s “self-determination,” in light of its record and especially in view of Ahmadinejad’s bellicose rantings, includes the “destiny” of ruling the Mideast by force or subversion, the annihilation of Israel, and setting the terms of peace with the rest of the world in a quest for a Pax Persia via nuclear payload.

In the staring contest between Ahmadinejad, Bush and Rice, the pragmatists blinked. So they must always blink when facing bellicosity. Their concept of ensuring national security is to offer the aggressor bribes, such as the U.S. and Britain did in Vienna on June 1st, and to rule out military force.

The “realpolitik” of U.S. policy to date has been one of uncompromising pragmatism. Pragmatism as an “ideal” and as a policy must by its nature sacrifice the good to evil; otherwise it would not be pragmatism. Evil derives its strength from compromised and ultimately vanquished principles. Pragmatism discounts principles as a guide to moral conduct; they are forgotten in a rush to keep a nemesis at bay.

The principle left behind here is the right of the U.S. to its self-defense against a threatening rogue state. Reason and reality have no role in a policy of pragmatism. Yet, despite pragmatism’s sorry and costly role in history, especially in the 20th century, current leaders are convinced that pragmatism is the only “moral” path to follow. They are determined to make it “work.” But it works only to the benefit of the enemies of civilization.

The New York Times, under the chortling headline on June 1st, “Bush’s Realization on Iran: No Good Choice Left Except Talk,” reported that the president told Rice several months ago that he needed “a third option,” a way to get beyond either a nuclear Iran or an American military action. The term “beyond” is eloquently appropriate; it suggests an excursion into fantasyland in search of a Star Trekian “Prime Directive.” Bush has explicitly rejected an “either/or” in favor of an evasive, non-confrontational middle course.

One must wonder about the psychology of men who are so afraid of absolutes that they are willing to acknowledge a threat but never the rational course of action to take to remove one. According to the AP report, when Rice was asked about the possibility of the U.S. reestablishing diplomatic relations with Iran, Rice “ruled out a ‘grand bargain.’ However, she said a negotiated solution to the nuclear dispute could ‘begin to change the relationship.'”

“Nobody is confused about the nature of this regime,” said Rice at a news conference held to announce the alleged shift in policy. “We are not negotiating the terms of terrorism.”

Were she and Bush genuinely confused about the nature of Iran’s regime, it might be forgivable. But she names what she and Bush both know, and that makes the action an unforgivable betrayal. In effect, their willingness to “come to the table” to talk is a willingness to negotiate the terms of terrorism.

Is it any wonder that Ahmadinejad is so contemptuously confident that Islam will triumph? Even psychopaths like him can sense cowardice and smell blood. Ahmadinejad has mastered Hitler’s playbook of the 1930s.

The overture to the U.S.’s creeping, inevitable capitulation on Iran was reported in the Los Angeles Times of May 26th under the appropriate headline, “The Tyranny Doctrine.”

“Last week, Secretary of State…Rice announced resumption of full U.S. diplomatic relations with Libya, citing Tripoli’s renunciation of terrorism and intelligence cooperation.” The article asserts that this move “marks an effective end to the Bush doctrine.”

Rather, it highlights a continuation of the Bush doctrine of non-judgmental pragmatism, which has been to take the path of least resistance and greatest expediency, to avoid confronting major threats and to expend lives and treasure on incidentals, such as Iraq and Afghanistan. Not to mention, in this instance, a forgiving of Libyan dictator Qadhaffi for the murder of hundreds of Westerners by his own army of jihadists. There is “realpolitik” for you.

The Los Angeles Times article goes on to list Bush’s record of non-achievements in his pursuit of global “democracy”:

“The Bush administration has watched Egypt abrogate elections, ignored the collapse of the so-called Cedar Revolution in Lebanon and abandoned Chinese dissidents; now Washington is mulling a peace treaty with Stalinist North Korea.”

The mare’s nest of pragmatism and its consequences grows nastier, thicker and more perilous. When will Bush have his own “reality check” and grasp the true nature of our enemies? When we experience another September 11th?

Bush, at his second inauguration, stated: “The survival of liberty in our land increasingly depends on the success of liberty in other lands. The best hope for peace in our world is the expansion of freedom in all the world.” The first half of this statement is not strictly true; liberty in America can succeed without it succeeding elsewhere in the world. But what if the rest of the world rejects the peace that freedom can bring, and chooses the “peace” of submission, tyranny or conquest?

The world, indeed, is being made safe, but not for freedom.
