The Sounds of Silencing
PEGGY NOONAN
Why do Americans on the left think only they have the right to dissent?
Friday, October 13, 2006 12:01 a.m.
Four moments in the recent annals of free speech in America. Actually annals is too fancy a word. This all happened in the past 10 days:
At Columbia University, members of the Minutemen, the group that patrols the U.S. border with Mexico and reports illegal crossings, were asked to address a forum on immigration policy. As Jim Gilchrist, the founder, spoke, angry students stormed the stage, shouting and knocking over chairs and tables. "Having wreaked havoc," said the New York Sun, they unfurled a banner in Arabic and English that said, "No one is ever illegal." The auditorium was cleared, the Minutemen silenced. Afterward a student protester told the Columbia Spectator, "I don't feel we need to apologize or anything. It was fundamentally a part of free speech. . . . The Minutemen are not a legitimate part of the debate on immigration."
On Oct. 2, on Katie Couric's "CBS Evening News," in the segment called "Free Speech," the father of a boy killed at Columbine shared his views on the deeper causes of the recent shootings in Amish country. Brian Rohrbough said violence entered our schools when we threw God out of them. "This country is in a moral freefall. For over two generations the public school system has taught in a moral vacuum. . . . We teach there are no moral absolutes, no right or wrong, and I assure you the murder of innocent children is always wrong, including abortion. Abortion has diminished the value of children." This was not exactly the usual mush.
Mr. Rohrbough was quickly informed he was not part of the legitimate debate, either. Howard Kurtz in the Washington Post: "The decision . . . to air his views prompted a storm of criticism, some of it within the ranks of CBS News." A blog critic: Grief makes people say "stupid" things, but "what made them put this man on television?" Good question. How did they neglect to silence him?
Soon after, at Madison Square Garden, Barbra Streisand began her latest farewell tour with what friends who were there tell me was a moving, beautiful concert. She was in great form and brought the audience together in appreciation of her great ballads, which are part of the aural tapestry of our lives. And then . . . the moment. Suddenly she decided to bang away on politics. Fine, she's a Democrat, Bush is bad. But midway through the bangaway a man in the audience called out. Most could not hear him, but everyone seems to agree he at least said, "What is this, a fund-raiser?"
At this, Ms. Streisand became enraged, stormed the stage and pummeled herself. Wait, that was Columbia. Actually she became enraged and cursed the man. A friend who was there, a liberal Democrat, said what was most interesting was Ms. Streisand made a physical movement with her arms and hands--"those talon hands"--as if to say, See what I have to put up with when I attempt to educate the masses? She soon apologized, to her credit. Though apparently in the manner of a teacher who'd just kind of lost it with an unruly and ignorant student.
On "The View" a few days earlier it was Rosie O'Donnell. She was banging away on gun control. Guns are bad and should be banned. Elizabeth Hasselbeck, who plays the role of the young, attractive mom, tentatively responded. "I want to be fair," she said. Obviously there should be "restrictions," but women have a right to defend themselves, and there's "the right to bear arms" in the Constitution. Rosie accused Elizabeth of yelling. The panel, surprised, agreed that Elizabeth was not yelling. Rosie then went blank-faced with what someone must have told her along the way is legitimately felt rage. Elizabeth was not bowing to Rosie's views. Elizabeth needed to be educated. The education commenced, Rosie gesturing broadly and Elizabeth constricting herself as if she knew physical assault were a possibility. When Rosie gets going on the Second Amendment I always think, Oh I hope she's not armed! Actually I wonder what Freud would have made of an enraged woman obsessed with gun control. Ach, classic projection. Eef she had a gun she would kill. Therefore no one must haf guns.
There's a pattern here, isn't there?
It is not only about rage and resentment, and how some have come to see them as virtues, as an emblem of rightness. I feel so much, therefore my views are correct and must prevail. It is about something so obvious it is almost embarrassing to state. Free speech means hearing things you like and agree with, and it means allowing others to speak whose views you do not like or agree with. This--listening to the other person with respect and forbearance, and with an acceptance of human diversity--is the price we pay for living in a great democracy. And it is a really low price for such a great thing.
We all know this, at least in the abstract. Why are so many forgetting it in the particular?
Let us be more pointed. Students, stars, media movers, academics: They are always saying they want debate, but they don't. They want their vision imposed. They want to win. And if the win doesn't come quickly, they'll rush the stage, curse you out, attempt to intimidate.
And they don't always recognize themselves to be bullying. So full of their righteousness are they that they have lost the ability to judge themselves and their manner.
And all this continues to come more from the left than the right in America.
Which is, at least in terms of timing, strange. The left in America--Democrats, liberals, Bush haters, skeptics of many sorts--seems to be poised for a significant electoral victory. Do they understand that if it comes it will be not because of Columbia, Streisand, O'Donnell, et al., but in spite of them?
What is most missing from the left in America is an element of grace--of civic grace, democratic grace, the kind that assumes disagreements are part of the fabric, but we can make the fabric hold together. The Democratic Party hasn't had enough of this kind of thing since Bobby Kennedy died. What also seems missing is the courage to ask a question. Conservatives these days are asking themselves very many questions, but I wonder if the left could tolerate asking itself even a few. Such as: Why are we producing so many adherents who defy the old liberal virtues of free and open inquiry, free and open speech? Why are we producing so many bullies? And dim dullard ones, at that.

Ms. Noonan is a contributing editor of The Wall Street Journal and author of "John Paul the Great: Remembering a Spiritual Father" (Penguin, 2005). Her column appears Fridays on OpinionJournal.com.
“Freedom and Justice in Islam”
I found this to be an excellent piece on the evolution of contemporary radical Islam from the September 2006 Imprimis,...
By Bernard Lewis, Cleveland E. Dodge Professor Emeritus of Near Eastern Studies, Princeton University
Bernard Lewis, born and raised in London, studied at the University of London's School of Oriental and African Studies, where he earned a Ph.D. in the history of Islam. After military and other war service in World War II, he taught at the University of London until 1974 and at Princeton University until 1986. He is currently Princeton's Cleveland E. Dodge Professor Emeritus of Near Eastern Studies. For many years he was one of the very few European scholars permitted access to the archives of the Ottoman Empire in Istanbul. In addition to his historical studies, he has published translations of classical Arabic, Turkish, Persian and Hebrew poetry. Professor Lewis has drawn on primary sources to produce more than two dozen books, including The Arabs in History, What Went Wrong? and The Crisis of Islam: Holy War and Unholy Terror.
The following is adapted from a lecture delivered on July 16, 2006, on board the Crystal Serenity, during a Hillsdale College cruise in the British Isles.
By common consent among historians, the modern history of the Middle East begins in the year 1798, when the French Revolution arrived in Egypt in the form of a small expeditionary force led by a young general called Napoleon Bonaparte—who conquered and then ruled it for a while with appalling ease. General Bonaparte—he wasn't yet Emperor—proclaimed to the Egyptians that he had come to them on behalf of a French Republic built on the principles of liberty and equality. We know something about the reactions to this proclamation from the extensive literature of the Middle Eastern Arab world. The idea of equality posed no great problem. Equality is very basic in Islamic belief: All true believers are equal. Of course, that still leaves three “inferior” categories of people—slaves, unbelievers and women. But in general, the concept of equality was understood. Islam never developed anything like the caste system of India to the east or the privileged aristocracies of Christian Europe to the west. Equality was something they knew, respected, and in large measure practiced. But liberty was something else.
As used in Arabic at that time, liberty was not a political but a legal term: You were free if you were not a slave. The word liberty was not used as we use it in the Western world, as a metaphor for good government. So the idea of a republic founded on principles of freedom caused some puzzlement. Some years later an Egyptian sheikh—Sheikh Rifa'a Rafi' al-Tahtawi, who went to Paris as chaplain to the first group of Egyptian students sent to Europe—wrote a book about his adventures and explained his discovery of the meaning of freedom. He wrote that when the French talk about freedom they mean what Muslims mean when they talk about justice. By equating freedom with justice, he opened a whole new phase in the political and public discourse of the Arab world, and then, more broadly, the Islamic world.
Is Western-Style Freedom Transferable?
What is the possibility of freedom in the Islamic world, in the Western sense of the word? If you look at the current literature, you will find two views common in the United States and Europe. One of them holds that Islamic peoples are incapable of decent, civilized government. Whatever the West does, Muslims will be ruled by corrupt tyrants. Therefore the aim of our foreign policy should be to insure that they are our tyrants rather than someone else's—friendly rather than hostile tyrants. This point of view is very much favored in departments of state and foreign offices and is generally known, rather surprisingly, as the “pro-Arab” view. It is, of course, in no sense pro-Arab. It shows ignorance of the Arab past, contempt for the Arab present, and unconcern for the Arab future. The second common view is that Arab ways are different from our ways. They must be allowed to develop in accordance with their cultural principles, but it is possible for them—as for anyone else, anywhere in the world, with discreet help from outside and most specifically from the United States—to develop democratic institutions of a kind. This view is known as the “imperialist” view and has been vigorously denounced and condemned as such.
In thinking about these two views, it is helpful to step back and consider what Arab and Islamic society was like once and how it has been transformed in the modern age. The idea that how that society is now is how it has always been is totally false. The dictatorship of Saddam Hussein in Iraq or the Assad family in Syria or the more friendly dictatorship of Mubarak in Egypt—all of these have no roots whatsoever in the Arab or in the Islamic past. Let me quote to you from a letter written in 1786—three years before the French Revolution—by the Comte de Choiseul-Gouffier, the French ambassador in Istanbul, in which he is trying to explain why he is making rather slow progress with the tasks entrusted to him by his government in dealing with the Ottoman government. "Here," he says, "things are not as in France where the king is sole master and does as he pleases." "Here," he says, "the sultan has to consult." He has to consult with the former holders of high offices, with the leaders of various groups and so on. And this is a slow process. This scenario is something radically different from the common image of Middle Eastern government today. And it is a description that ceased to be true because of a number of changes that occurred.
Modernization and Nazi and Soviet Influence
The first of these changes is what one might call modernization. This was undertaken not by imperialists, for the most part, but by Middle Eastern rulers who had become painfully aware that their societies were undeveloped compared with the advanced Western world. These rulers decided that what they had to do was to modernize or Westernize. Their intentions were good, but the consequences were often disastrous. What they did was to increase the power of the state and the ruler enormously by placing at his disposal the whole modern apparatus of control, repression and indoctrination. At the same time, which was even worse, they limited or destroyed those forces in the traditional society that had previously limited the autocracy of the ruler. In the traditional society there were established orders—the bazaar merchants, the scribes, the guilds, the country gentry, the military establishment, the religious establishment, and so on. These were powerful groups in society, whose heads were not appointed by the ruler but arose from within the groups. And no sultan, however powerful, could do much without maintaining some relationship with these different orders in society. This is not democracy as we currently use that word, but it is certainly limited, responsible government. And the system worked. Modernization ended that. A new ruling class emerged, ruling from the center and using the apparatus of the state for its purposes.
That was the first stage in the destruction of the old order. The second stage we can date with precision. In the year 1940, the government of France surrendered to the Axis and formed a collaborationist government in a place called Vichy. The French colonial empire was, for the most part, beyond the reach of the Nazis, which meant that the governors of the French colonies had a free choice: to stay with Vichy or to join Charles de Gaulle, who had set up a Free French Committee in London. The overwhelming majority chose Vichy, which meant that Syria-Lebanon—a French-mandated territory in the heart of the Arab East—was now wide open to the Nazis. The governor and his high officials in the administration in Syria-Lebanon took their orders from Vichy, which in turn took orders from Berlin. The Nazis moved in, made a tremendous propaganda effort, and were even able to move from Syria eastwards into Iraq and for a while set up a pro-Nazi, fascist regime. It was in this period that political parties were formed that were the nucleus of what later became the Baath Party. The Western Allies eventually drove the Nazis out of the Middle East and suppressed these organizations. But the war ended in 1945, and the Allies left. A few years later the Soviets moved in, established an immensely powerful presence in Egypt, Syria, Iraq and various other countries, and introduced Soviet-style political practice. The adaptation from the Nazi model to the communist model was very simple and easy, requiring only a few minor adjustments, and it proceeded pretty well. That is the origin of the Baath Party and of the kind of governments that we have been confronting in the Middle East in recent years. That, as I would again repeat and emphasize, has nothing whatever to do with the traditional Arab or Islamic past.
Wahhabism and Oil
That there has been a break with the past is a fact of which Arabs and Muslims themselves are keenly and painfully aware, and they have tried to do something about it. It is in this context that we observe a series of movements that could be described as an Islamic revival or reawakening. The first of these—founded by a theologian called Ibn Abd al-Wahhab, who lived in a remote area of Najd in desert Arabia—is known as Wahhabi. Its argument is that the root of Arab-Islamic troubles lies in following the ways of the infidel. The Islamic world, it holds, has abandoned the true faith that God gave it through His prophet and His holy book, and the remedy is a return to pure, original Islam. This pure, original Islam is, of course—as is usual in such situations—a new invention with little connection to Islam as it existed in its earlier stages.
Wahhabism was dealt with fairly easily in its early years, but it acquired a new importance in the mid-1920s when two things happened: The local tribal chiefs of the House of Saud—who had been converted since the 18th century to the Wahhabi version of Islam—conquered the holy cities of Mecca and Medina. This was of immense importance, giving them huge prestige and influence in the whole Islamic world. It also gave them control of the pilgrimage, which brings millions of Muslims from the Islamic world together to the same place at the same time every year.
The other important thing that happened—also in the mid-20s—was the discovery of oil. With that, this extremist sect found itself not only in possession of Mecca and Medina, but also of wealth beyond the dreams of avarice. As a result, what would otherwise have been a lunatic fringe in a marginal country became a major force in the world of Islam. And it has continued as a major force to the present day, operating through the Saudi government and through a whole series of non-governmental organizations. What is worse, its influence spreads far beyond the region. When Muslims living in Chicago or Los Angeles or Birmingham or Hamburg want to give their children some grounding in their faith and culture—a very natural, very normal thing—they turn to the traditional resources for such purposes: evening classes, weekend schools, holiday camps and the like. The problem is that these are now overwhelmingly funded and therefore controlled by the Wahhabis, and the version of Islam that they teach is the Wahhabi version, which has thus become a major force in Muslim immigrant communities.
Let me illustrate the significance of this with one example: Germany has constitutional separation of church and state, but in the German school system they provide time for religious instruction. The state, however, does not provide teachers or textbooks. They allow time in the school curriculum for the various churches and other religious communities—if they wish—to provide religious instruction to their children, which is entirely optional. The Muslims in Germany are mostly Turks. When they reached sufficient numbers, they applied to the German government for permission to teach Islam in German schools. The German authorities agreed, but said they—the Muslims—had to provide the teachers and the textbooks. The Turks said that they had excellent textbooks, which are used in Turkey and Turkish schools, but the German authorities said no, those are government-produced textbooks; under the principle of separation of church and state, these Muslims had to produce their own. As a result, whereas in Turkish schools in Turkey, students get a modern, moderate version of Islam, in German schools, in general, they get the full Wahhabi blast. The last time I looked, twelve Turks had been arrested as members of Al-Qaeda—all twelve of them born and educated in Germany.
The Iranian Revolution and Al-Qaeda
In addition to the rising spread of Wahhabism, I would draw your attention to the Iranian Revolution of 1979. The word "revolution" is much misused in the Middle East; it is used for virtually every change of government. But the Iranian Revolution was a real revolution, in the sense that the French and Russian revolutions were real revolutions. It was a massive change in the country, a massive shift of power—socially, economically, and ideologically. And like the French and Russian revolutions in their prime, it also had a tremendous impact in the world with which the Iranians shared a common universe of discourse—the world of Islam. I remember not long after the Iranian Revolution I was visiting Indonesia and for some mysterious reason I had been invited to lecture in religious universities. I noticed in the student dorms they had pictures of Khomeini all over the place, although Khomeini—like the Iranians in general—was a Shiite, and the Indonesians are Sunnis. Indonesians generally showed little interest in what was happening in the Middle East. But this was something important. And the Iranian Revolution has gone through various familiar phases—familiar from the French and Russian revolutions—such as the conflicts between the moderates and the extremists. I would say that the Iranian Revolution is now entering the Stalinist phase, and its impact all over the Islamic world has been enormous.
The third and most recent phase of the Islamic revival is that associated with the name Al-Qaeda—the organization headed by Osama bin Laden. Here I would remind you of the events toward the end of the 20th century: the defeat of the Russians in Afghanistan, the withdrawal of the defeated armies into Russia, the collapse and breakdown of the Soviet Union. We are accustomed to regard that as a Western, or more specifically, an American, victory in the Cold War. In the Islamic world, it was nothing of the kind. It was a Muslim victory in a Jihad. And, if we are fair about it, we must admit that this interpretation of what happened does not lack plausibility. In the mountains of Afghanistan, which the Soviets had conquered and had been trying to rule, the Afghan mujahideen were able to inflict one defeat after another on the Soviet forces, eventually driving the Red Army out of the country to defeat and collapse.
Thanks to modern communications and the modern media, we are quite well informed about how Al-Qaeda perceives things. Osama bin Laden is very articulate, very lucid, and I think on the whole very honest in the way he explains things. As he sees it, and as his followers see it, there has been an ongoing struggle between the two world religions—Christianity and Islam—which began with the advent of Islam in the 7th century and has been going on ever since. The Crusades were one aspect, but there were many others. It is an ongoing struggle of attack and counter-attack, conquest and reconquest, Jihad and Crusade, ending so it seems in a final victory of the West with the defeat of the Ottoman Empire—the last of the great Muslim states—and the partition of most of the Muslim world between the Western powers. As Osama bin Laden puts it: “In this final phase of the ongoing struggle, the world of the infidels was divided between two superpowers—the United States and the Soviet Union. Now we have defeated and destroyed the more difficult and the more dangerous of the two. Dealing with the pampered and effeminate Americans will be easy.” And then followed what has become the familiar description of the Americans and the usual litany and recitation of American defeats and retreats: Vietnam, Beirut, Somalia, one after another. The general theme was: They can't take it. Hit them and they'll run. All you have to do is hit harder. This seemed to receive final confirmation during the 1990s when one attack after another on embassies, warships, and barracks brought no response beyond angry words and expensive missiles misdirected to remote and uninhabited places, and in some places—as in Beirut and Somalia—prompt retreats.
What happened on 9/11 was seen by its perpetrators and sponsors as the culmination of the previous phase and the inauguration of the next phase—taking the war into the enemy camp to achieve final victory. The response to 9/11 came as a nasty surprise. They were expecting more of the same—bleating and apologies—instead of which they got a vigorous reaction, first in Afghanistan and then in Iraq. And as they used to say in Moscow: It is no accident, comrades, that there has been no successful attack in the United States since then. But if one follows the discourse, one can see that the debate in this country since then has caused many of the perpetrators and sponsors to return to their previous diagnosis. Because remember, they have no experience, and therefore no understanding, of the free debate of an open society. What we see as free debate, they see as weakness, fear and division. Thus they prepare for the final victory, the final triumph and the final Jihad.
Let's spend a moment or two defining what we mean by freedom and democracy. There is a view sometimes expressed that “democracy” means the system of government evolved by the English-speaking peoples. Any departure from that is either a crime to be punished or a disease to be cured. I beg to differ from that point of view. Different societies develop different ways of conducting their affairs, and they do not need to resemble ours. And let us remember, after all, that American democracy after the War of Independence was compatible with slavery for three-quarters of a century and with the disenfranchisement of women for longer than that. Democracy is not born like the Phoenix. It comes in stages, and the stages and processes of development will differ from country to country, from society to society. The French cherish the curious illusion that they invented democracy, but since the great revolution of 1789, they have had two monarchies, two empires, two dictatorships, and at the last count, five republics. And I'm not sure that they've got it right yet.
There are, as I've tried to point out, elements in Islamic society which could well be conducive to democracy. And there are encouraging signs at the present moment—what happened in Iraq, for example, with millions of Iraqis willing to stand in line to vote, knowing that they were risking their lives, is a quite extraordinary achievement. It shows great courage, great resolution. Don't be misled by what you read in the media about Iraq. The situation is certainly not good, but there are redeeming features in it. The battle isn't over. It's still very difficult. There are still many major problems to overcome. There is a bitter anti-Western feeling which derives partly and increasingly from our support for what they see as tyrannies ruling over them. It's interesting that pro-American feeling is strongest in countries with anti-American governments. I've been told repeatedly by Iranians that there is no country in the world where pro-American feeling is stronger, deeper and more widespread than Iran. I've heard this from so many different Iranians—including some still living in Iran—that I believe it. When the American planes were flying over Afghanistan, the story was that many Iranians put signs on their roofs in English reading, “This way, please.”
So there is a good deal of pro-Western and even specifically pro-American feeling. But the anti-American feeling is strongest in those countries that are ruled by what we are pleased to call “friendly governments.” And it is those, of course, that are the most tyrannical and the most resented by their own people. The outlook at the moment is, I would say, very mixed. I think that the cause of developing free institutions—along their lines, not ours—is possible. One can see signs of its beginning in some countries. At the same time, the forces working against it are very powerful and well entrenched. And one of the greatest dangers is that on their side, they are firm and convinced and resolute. Whereas on our side, we are weak and undecided and irresolute. And in such a combat, it is not difficult to see which side will prevail.
I think that the effort is difficult and the outcome uncertain, but I think the effort must be made. Either we bring them freedom, or they destroy us.
Stephen Laffey tries to knock off the most liberal Republican senator
JOHN J. MILLER
Halfway up the steep incline of Parker Street, Stephen Laffey jokes, “We should have started at the top and worked our way down.” It’s a hot July morning here in Lincoln. A hundred feet ahead, the driver of a silver pickup truck announces through a bullhorn that Laffey is “the only U.S. Senate candidate who’s campaigning on Parker Street.” Several other volunteers follow on foot, ringing doorbells and searching for homeowners who want to meet Laffey, a Republican who is challenging Sen. Lincoln Chafee in the state’s GOP primary. Quite a few of them come out to shake hands. “Someone should have checked a contour map,” says Laffey with a smile.
Then again, Laffey didn’t bother to check a contour map when he decided to take on a sitting senator in a primary — an uphill battle if there ever was one. Perhaps he didn’t need to. Recent polls suggest that Laffey, who is the mayor of Cranston, just might win the Republican nomination on September 12. In May, a survey of likely GOP primary voters found him to have a slight lead over Chafee, 46 to 44 percent. “We’re going to crush him,” boasts Laffey. But whatever the margin, a Laffey victory would surely be the year’s biggest political surprise for Republicans.
For conservatives, the surprise wouldn’t be merely pleasant, but positively exhilarating — almost the equivalent of what they would have felt two years ago if former congressman Pat Toomey had defeated Sen. Arlen Specter in Pennsylvania’s GOP primary. That’s because Chafee might be the most irritating Republican in the Senate. The problem isn’t simply that he opposes tax cuts, supports partial-birth abortion, and believes that enemy combatants should enjoy habeas corpus rights. After all, somebody has to be the most liberal Republican senator, and chances are it will be a person who hails from a true-blue state like Rhode Island.
What makes Chafee stand out even among figures such as senators Susan Collins and Olympia Snowe — liberal Republicans from Maine — is his sheer flamboyance. In 2004, he announced that he wasn’t voting for President Bush’s reelection. In what he called a “symbolic protest,” he wrote in the name of Bush’s father on his ballot. He also threatened to switch parties, something he may very well do if Democrats can welcome him into a Senate majority. Last January, Chafee was the only Republican to oppose the confirmation of Supreme Court justice Samuel Alito. And in March, he described Sen. Russ Feingold’s censure resolution to condemn Bush as “positive,” because it helped put the issue of wiretapping “into the public awareness.” (When these comments achieved their own public awareness — or, more accurately, their own notoriety — he added that he opposed censure.)
A close look at Chafee’s congressional record suggests that the senator would fit comfortably within the Democratic fold: The American Conservative Union gives Chafee a lifetime rating of 37 out of a possible 100. This is not only the worst performance in the GOP, but it actually places Chafee to the left of Democratic senator Ben Nelson of Nebraska. Chafee’s rating for 2005 is a dismal 12, which is precisely the score of Sen. Hillary Clinton. Only 24 senators received a lower score. Twenty-three of them are liberal Democrats, and the other is Jim Jeffords, the “independent” who caucuses with the Democrats. Even Feingold, who is actively courting his party’s left wing in anticipation of a presidential run, was rated a point better than Chafee. So it’s no wonder that a lot of rank-and-file Republicans have run out of patience with Rhode Island’s junior senator.
A DOG NAMED MILTON
In the 44-year-old Laffey, the GOP has a credible alternative to Chafee. He’s the twice-elected mayor of Cranston, a small city thick with Reagan Democrats. Laffey grew up in this working-class town and learned about Milton Friedman from a high-school teacher. (Today, that teacher is a campaign volunteer and Laffey owns a dog named Milton.) Laffey became the first member of his family to go to college, at Bowdoin and then Harvard Business School. He spent time on Wall Street before taking a job with Morgan Keegan, an investment bank in Memphis, where he became president at the age of 38. When the company was sold in 2001, he decided to move back to his hometown and open his own firm.
With his family in Tennessee, Laffey went to Rhode Island to look for a house. Before he found one, he read a newspaper story about Cranston’s financial mess. So he went over to City Hall and asked to see the audits. “They said they didn’t have any,” says Laffey. “I called my wife in Memphis and told her I was running for mayor.” He hadn’t run for anything since his student-council days in high school and college. A year later, after spending $270,000 of his own money — a lot of cash for a Cranston mayoral race — he was elected.
When Laffey took the oath of office, Cranston was on the brink of bankruptcy. The new mayor cut fat from the budget and tangled with the local unions. Priorities were so skewed that Cranston’s school-crossing guards — a unionized bunch — were paid the equivalent of $129 per hour. The ensuing fight made news throughout the state. “Ronald Reagan had air-traffic controllers,” says Laffey. “I had crossing guards.” Today, Cranston’s crossing guards are still unionized, but they make only $16 per hour. This well-publicized triumph, which according to Laffey saves Cranston about half a million dollars annually, has served as one of the mayor’s calling cards as he campaigns around Rhode Island.
But Laffey’s best medicine for Cranston could also be bitter: He hiked property taxes three times. “I had no choice,” he says. “I inherited a faulty budget full of unfunded pensions, and if I hadn’t raised taxes the city would have fallen into receivership and a judge or the state would have done it.” (Cranston’s current budget, which took effect in June, actually includes a small tax cut.)
Even though Chafee himself is no tax-cutter — he has voted against extending the life of federal tax reductions, which is effectively a tax increase — he has run campaign ads that try to paint Laffey as a tax-happy pol. Yet Laffey has still managed to win a thumbs-up from anti-tax groups. “Normally, we won’t support anybody who has a history of raising taxes,” says Toomey, who now leads the Club for Growth. “But we studied the circumstances in Cranston, and Laffey’s hands really were tied.” Arguably more important than what Laffey has done in the past is the commitment he has made for the future: He has signed the Americans for Tax Reform pledge, promising not to support any tax increases in the Senate. Chafee hasn’t followed this lead. Moreover, Steve Forbes — no friend of higher taxes — will host a fundraiser for Laffey on July 24 in Manhattan.
On other issues, Laffey is a mainstream conservative: He rails against runaway federal spending, backs the president on Iraq, supports free trade, believes Congress should pass an enforcement-only immigration bill, and thinks the flag deserves the protection of a constitutional amendment. He’s also pro-life, which is hardly a kiss of death in Rhode Island: Gov. Donald Carcieri, a Republican, and Rep. James Langevin, a Democrat, are both anti-abortion.
On energy policy, Laffey departs from conservative orthodoxy. His first campaign commercial, which aired last fall, was a piece of populism filmed at a gas station: The mayor slammed “big oil companies” and “special interests” for high prices. He wants to increase federal fuel-efficiency standards for cars, and use the tax code to promote alternative sources of energy. “We should have solar panels on every roof,” he says. He’s against drilling in the Arctic and favors offshore wind farms.
The National Republican Senatorial Committee, under the direction of Sen. Elizabeth Dole, has criticized Laffey for this minor apostasy — as if it somehow excuses Chafee’s more complete version. Last fall, as Laffey was going on the air with his first TV commercials, the NRSC struck with at least $150,000 in ads against Laffey. Only in Montana, where Republican senator Conrad Burns faces a difficult reelection, has the NRSC spent more — and there, at least, it was attacking a Democrat. “Rather than propping up Chafee against a credible conservative, wouldn’t this anti-Laffey cash have been better spent on helping other vulnerable Republican senators, such as Jim Talent of Missouri?” asks Jon Lerner, a consultant with Laffey’s campaign. On July 10, the NRSC filed a complaint against Laffey with the Federal Election Commission, alleging that he had used Cranston city resources to promote his U.S. Senate candidacy. No one can think of a precedent for a party committee using the FEC to harass one of its own.
The Republican establishment’s hostility to Laffey stands in stark contrast to the behavior of Democrats in Connecticut’s primary, where Sen. Joe Lieberman faces an unexpectedly strong challenge from antiwar candidate Ned Lamont. Party heavyweights such as Hillary Clinton and John Kerry have offered tepid support for Lieberman, indicating that they’ll switch to Lamont if he wins the nomination and Lieberman runs in the general as an independent. In Rhode Island, Laffey has promised to support Chafee if the senator gets the Republican nod, but Chafee has refused to say he’ll support Laffey.
COFFEE SHOPS AND WIENER JOINTS
Chafee is in fact counting on the support of Democrats to help him win the GOP nomination. This spring, his campaign mounted an effort to convince them to disaffiliate from their party, at least temporarily, in order to become eligible to vote in the Republican primary. More than 14,000 did so — a huge number in Rhode Island, where turnout probably won’t top 50,000 voters. If Chafee prevails on September 12, his victory over a conservative will have been underwritten by national Republican dollars and made possible by local Democratic ballots.
And if Laffey somehow manages to pull off an upset, it will come from having targeted Republican voters for an entire year and engaging in the retail politics of a tiny state whose total population is smaller than that of Dallas or San Diego. Last year, when I first interviewed Laffey, he said that he planned to visit “every coffee shop and wiener joint in Rhode Island.” By all appearances, he has done his best to make good on that promise. “Nobody will outwork me,” he said to a gathering of Republicans in Burrillville on July 7. The next day, as he walked around Lincoln in his trademark yellow shirt, Laffey bumped into an old schoolmate who had moved out of Cranston. “Only in Rhode Island,” he said, a few minutes later.
With Providence as its single media market, Rhode Island is amenable to a Laffey-style insurgency the way larger states are not. But are its voters simply too liberal for a conservative who wants to be a senator? Although Republicans can win statewide office — there hasn’t been a Democratic governor in more than a decade — Laffey faces long odds. Even if he accomplishes the difficult task of unseating an incumbent in the primary, he must go on to win a general election in a year that doesn’t favor Republicans. Indeed, a Brown University poll in June showed Sheldon Whitehouse, a former attorney general and the presumptive Democratic nominee, running far in front of Laffey, 55 to 25 percent. Against Chafee, Whitehouse led by just a point, 38 to 37 percent. And that’s what the GOP case for Chafee essentially boils down to: the belief that a spectacularly lousy Republican who enjoys a reasonable chance of winning is preferable to a promising young conservative who would have to pull off a political miracle.
But is that the right belief to hold? A Chafee victory would send a powerful message to other liberal Republicans: The party will be there for you when you really need it, even if you’re not there when it needs you. Laffey may face an uphill battle over the next few months, but one thing’s for sure about the GOP’s future with Chafee: It’s all downhill from here.
Evolution and Me - George Gilder - NR July 17, 2006
‘The Darwinian theory has become an all-purpose obstacle to thought rather than an enabler of scientific advance’
I first became conscious that something was awry in Darwinian science some 40 years ago as I was writing my early critique of sexual liberation, Sexual Suicide (revised and republished as Men and Marriage). At the time, the publishing world was awash with such titles as Desmond Morris’s The Naked Ape and The Human Zoo and Robert Ardrey’s African Genesis, which touted or pruriently probed the animality of human beings. Particularly impressive to me was The Imperial Animal, a Darwinian scholarly work by two anthropologists aptly named Lionel Tiger and Robin Fox that gave my theory of sex roles a panoply of primatological support, largely based on the behavior of patriarchal hamadryas baboons.
Darwinism seemed to offer me and its other male devotees a long-sought tool — resembling the x-ray glasses lamentably found elsewhere only in cartoons — for stripping away the distracting décor of clothing and the political underwear of ideology worn by feminists and other young women of the day. Using this swashbuckling scheme of fitness and survival, nature “red in tooth and claw,” we could reveal our ideological nemeses as naked mammals on the savannah to be ruled and protected by hunting parties of macho males, rather like us.
In actually writing and researching Sexual Suicide, however, I was alarmed to discover that both sides could play the game of telling just-so stories. In The Descent of Woman, Elaine Morgan showed humans undulating from the tides as amphibious apes mostly led by females. Jane Goodall croodled about the friendliness of “our closest relatives,” the chimpanzees, and movement feminists flogged research citing the bonobo and other apes as chiefly matriarchal and frequently homosexual.
These evolutionary sex wars were mostly unresolvable because, at its root, Darwinian theory is tautological. What survives is fit; what is fit survives. While such tautologies ensure the consistency of any arguments based on them, they could contribute little to an analysis of what patterns of behavior and what ideals and aspirations were conducive to a good and productive society. Almost by definition, Darwinism is a materialist theory that banishes aspirations and ideals from the picture. As an all-purpose tool of reductionism that said that whatever survives is, in some way, normative, Darwinism could inspire almost any modern movement, from the eugenic furies of Nazism to the feminist crusades of Margaret Sanger and Planned Parenthood.
So in the end, for better or for worse, my book dealt chiefly with sociological and anthropological arguments and left out Darwin.
Turning to economics in researching my 1981 book Wealth & Poverty, I incurred new disappointments in Darwin and materialism. Forget God — economic science largely denies intelligent design or creation even by human beings. Depicting the entrepreneur as a mere opportunity scout, arbitrageur, or assembler of available chemical elements, economic theory left no room for the invention of radically new goods and services, and little room for economic expansion except by material “capital accumulation” or population growth. Accepted widely were Darwinian visions of capitalism as a dog-eat-dog zero-sum struggle impelled by greed, where the winners consume the losers and the best that can be expected for the poor is some trickle down of crumbs from the jaws (or tax tables) of the rich.
In my view, the zero-sum caricature applied much more accurately to socialism, which stifles the creation of new wealth and thus fosters a dog-eat-dog struggle over existing material resources. (For examples, look anywhere in the socialist Third World.) I preferred Michael Novak’s vision of capitalism as the “mind-centered system,” with the word itself derived from the Latin caput, meaning head. Expressing the infinite realm of ideas and information, it is a domain of abundance rather than of scarcity. Flouting zero-sum ideas, supply-side economics sprang from this insight. By tapping the abundance of human creativity, lower tax rates can yield more revenues than higher rates do and low-tax countries can raise their government spending faster than the high-tax countries do. Thus free nations can afford to win wars without first seizing resources from others. Ultimately capitalism can transcend war by creating rather than capturing wealth — a concept entirely alien to the Darwinian model.
After Wealth & Poverty, my work focused on the subject of human creativity as epitomized by science and technology and embodied in computers and communications. At the forefront of this field is a discipline called information theory. Largely invented in 1948 by Claude Shannon of MIT, it rigorously explained digital computation and transmission by zero-one, or off-on, codes called “bits.” Shannon defined information as unexpected bits, or “news,” and calculated its passage over a “channel” by elaborate logarithmic rules. That channel could be a wire or any other path across a distance of space, or it could be a transfer of information across a span of time, as in evolution.
Crucial in information theory was the separation of content from conduit — information from the vehicle that transports it. It takes a low-entropy (predictable) carrier to bear high-entropy (unpredictable) messages. A blank sheet of paper is a better vessel for a new message than one already covered with writing. In my book Telecosm (2000), I showed that the most predictable available information carriers were the regular waves of the electromagnetic spectrum and prophesied that all digital information would ultimately flow over it in some way. Whether across time (evolution) or across space (communication), information could not be borne by chemical processes alone, because these processes merged or blended the medium and the message, leaving the data illegible at the other end.
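Shannon's logarithmic measure, mentioned above, can be stated in a few lines of code. The sketch below is not from the essay; it is a minimal illustration of the standard entropy formula, showing why a perfectly predictable carrier (the blank sheet of paper) conveys no news while an unpredictable message carries maximal surprise.

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits: sum of p * log2(1/p)
    over the observed symbol frequencies (Shannon, 1948)."""
    counts = Counter(message)
    n = len(message)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

# A perfectly predictable "carrier" conveys no news at all ...
print(shannon_entropy("aaaaaaaa"))  # 0.0 bits per symbol
# ... while a maximally unpredictable message is all news.
print(shannon_entropy("abcdefgh"))  # 3.0 bits per symbol
```

The low-entropy string is the better carrier precisely because any deviation from it stands out as signal.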
While studying computer science, I learned of the concept of a universal computing machine, an idealized computer envisioned by the tormented genius Alan Turing. (After contributing significantly to the Enigma project for decrypting German communications during World War II, Turing committed suicide following shock therapy — “treatment” for his homosexuality.) A so-called “Turing machine” is an idealized computer that can be created using any available material, from beach sand to Buckyballs, from microchips to matchsticks. Turing made clear that the essence of a computer is not its material substance but its architecture of ideas.
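Turing's point, that a computer is its rule table rather than its material substance, can be made concrete. The toy simulator below and its bit-inverting rule table are my own illustration, not anything from the essay: the same table of rules defines the same machine whether it is realized in silicon, matchsticks, or beach sand.

```python
def run_turing_machine(tape: str, rules: dict) -> str:
    """Execute a table of (state, symbol) -> (write, move, next_state)
    rules on a one-way tape. The machine is pure structure; nothing
    here depends on what physical stuff holds the symbols."""
    cells = list(tape)
    state, pos = "start", 0
    while state != "halt":
        symbol = cells[pos] if pos < len(cells) else "_"  # "_" is blank
        write, move, state = rules[(state, symbol)]
        if pos < len(cells):
            cells[pos] = write
        else:
            cells.append(write)
        pos += 1 if move == "R" else -1
    return "".join(cells)

# A machine that inverts every bit, then halts at the first blank.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine("10110", invert))  # "01001_"
```

The rule table, not the tape's chemistry, is the whole of the machine's identity, which is the sense in which its essence is an architecture of ideas.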
Based as it is on ideas, a computer is intrinsically an object of intelligent design. Every silicon chip holds as many as 700 layers of implanted chemicals in patterns defined with nanometer precision and then is integrated with scores of other chips by an elaborately patterned architecture of wires and switches all governed by layers of software programming written by human beings. Equally planned and programmed are all the computers running the models of evolution and “artificial life” that are central to neo-Darwinian research. Everywhere on the apparatus and in the “genetic algorithms” appear the scientist’s fingerprints: the “fitness functions” and “target sequences.” These algorithms prove what they aim to refute: the need for intelligence and teleology (targets) in any creative process.
I came to see that the computer offers an insuperable obstacle to Darwinian materialism. In a computer, as information theory shows, the content is manifestly independent of its material substrate. No possible knowledge of the computer’s materials can yield any information whatsoever about the actual content of its computations. In the usual hierarchy of causation, they reflect the software or “source code” used to program the device; and, like the design of the computer itself, the software is contrived by human intelligence.
The failure of purely physical theories to describe or explain information reflects Shannon’s concept of entropy and his measure of “news.” Information is defined by its independence from physical determination: If it is determined, it is predictable and thus by definition not information. Yet Darwinian science seemed to be reducing all nature to material causes.
As I pondered this materialist superstition, it became increasingly clear to me that in all the sciences I studied, information comes first, and regulates the flesh and the world, not the other way around. The pattern seemed to echo some familiar wisdom. Could it be, I asked myself one day in astonishment, that the opening of St. John’s Gospel, In the beginning was the Word, is a central dogma of modern science?
In raising this question I was not affirming a religious stance. At the time it first occurred to me, I was still a mostly secular intellectual. But after some 35 years of writing and study in science and technology, I can now affirm the principle empirically. Salient in virtually every technical field — from quantum theory and molecular biology to computer science and economics — is an increasing concern with the word. It passes by many names: logos, logic, bits, bytes, mathematics, software, knowledge, syntax, semantics, code, plan, program, design, algorithm, as well as the ubiquitous “information.” In every case, the information is independent of its physical embodiment or carrier.
Biologists commonly blur the information into the slippery synecdoche of DNA, a material molecule, and imply that life is biochemistry rather than information processing. But even here, the deoxyribonucleic acid that bears the word is not itself the word. Like a sheet of paper or a computer memory chip, DNA bears messages but its chemistry is irrelevant to its content. The alphabet’s nucleotide “bases” form “words” without help from their bonds with the helical sugar-phosphate backbone that frames them. The genetic words are no more dictated by the chemistry of their frame than the words in Scrabble are determined by the chemistry of their wooden racks or by the force of gravity that holds them.
This reality expresses a key insight of Francis Crick, the Nobel laureate co-author of the discovery of the double-helix structure of DNA. Crick expounded and enshrined what he called the “Central Dogma” of molecular biology. The Central Dogma shows that influence can flow from the arrangement of the nucleotides on the DNA molecule to the arrangement of amino acids in proteins, but not from proteins to DNA. Like a sheet of paper or a series of magnetic points on a computer’s hard disk or the electrical domains in a random-access memory — or indeed all the undulations of the electromagnetic spectrum that bear information through air or wires in telecommunications — DNA is a neutral carrier of information, independent of its chemistry and physics. By asserting that the DNA message precedes and regulates the form of the proteins, and that proteins cannot specify a DNA program, Crick’s Central Dogma unintentionally recapitulates St. John’s assertion of the primacy of the word over the flesh.
By assuming that inheritance is a chemical process, Darwin ran afoul of the Central Dogma. He believed that the process of inheritance “blended” together the chemical inputs of the parents. Seven years after Darwin published The Origin of Species, though, Gregor Mendel showed that genes do not blend together like chemicals mixing. As the Central Dogma ordains and information theory dictates, the DNA program is discrete and digital, and its information is transferred through chemical carriers — but it is not specified by chemical forces. Each unit of biological information is passed on according to a digital program — a biological code — that is transcribed and translated into amino acids.
THE MEDIUM NOT THE MESSAGE
Throughout the 20th century and on into the 21st, many scientists and politicians have followed Darwin in missing the significance of the “Central Dogma.” They have assumed that life is dominated by local chemistry rather than by abstract informative codes. Upholding the inheritability of acquired characteristics, Jean-Baptiste Lamarck, Trofim Lysenko, Aleksandr Oparin, Friedrich Engels, and Josef Stalin all espoused the primacy of proteins and thus of the environment over the genetic endowment. By controlling the existing material of human beings through their environment, the Lamarckians believed that Communism could blend and breed a new Soviet man through chemistry. Dissenters were murdered or exiled. (The grim story is vividly told in Hubert Yockey’s definitive 2005 book, Information Theory, Evolution, and the Origin of Life.)
For some 45 years, Barry Commoner, the American Marxist biologist, refused to relinquish the Soviet mistake. He repeated it in an article in Harper’s in 2002, declaring that proteins must have come first because DNA cannot be created without protein-based enzymes. In fact, protein-based enzymes cannot be created without a DNA (or RNA) program; proteins have no structure without the information that defines them. As Yockey explains, “It is mathematically impossible, not just unlikely, for information to be transferred from the protein alphabet to the [DNA] alphabet. That is because no codes exist to transfer information from the 20-letter protein alphabet to the 64-letter [codon] alphabet of [DNA].” Twenty letters simply cannot directly specify the content of patterns of 64 codons.
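Yockey's claim about the two alphabets can be checked against the standard genetic code itself: translation maps 64 codons many-to-one onto 20 amino acids (plus stop signals), so no inverse code can recover the original codon from the protein. The fragment below uses real codon assignments from the standard table; the `backtranslate` helper is merely an illustration of why the mapping cannot be run backward.

```python
# A fragment of the standard genetic code (RNA codons -> amino acids).
# Leucine alone is encoded by six distinct codons.
CODON_TABLE = {
    "UUA": "Leu", "UUG": "Leu", "CUU": "Leu",
    "CUC": "Leu", "CUA": "Leu", "CUG": "Leu",
    "AUG": "Met",                              # methionine: one codon
    "UCU": "Ser", "UCC": "Ser", "AGU": "Ser",  # three of serine's six
}

def backtranslate(amino_acid: str) -> list:
    """All codons that could have encoded a given amino acid."""
    return sorted(c for c, aa in CODON_TABLE.items() if aa == amino_acid)

# Translation (codon -> amino acid) is a function; backtranslation is not:
print(backtranslate("Leu"))  # six candidates -- the original codon is lost
print(backtranslate("Met"))  # only AUG, the lone unambiguous case
```

Because most amino acids have several codons, the protein sequence underdetermines the DNA sequence, which is the mathematical impossibility Yockey describes.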
But the beat goes on. By defrocking Lawrence Summers for implying the possible primacy of the genetic word over environmental conditions in the emergence of scientific aptitudes, the esteemed professoriat at Harvard expressed its continued faith in Lamarckian and Marxian biology.
Over at NASA, U.S. government scientists make an analogous mistake in constantly searching for traces of protein as evidence of life on distant planets. Without a hierarchy of informative programming, proteins are mere matter, impotent to produce life. The Central Dogma dooms the NASA pursuit of proteins on the planets to be what we might call a “wild goo chase.” As St. John implies, life is defined by the presence and precedence of the word: informative codes.
I began my 1989 book on microchips, Microcosm: The Quantum Era in Economics and Technology, by quoting physicist Max Planck, the discoverer of the quantum, on the resistance to his theory among the scientific establishment — the public scientists of any period whom I have dubbed the Panel of Peers. By any name they define the “consensus” of respectable science. At the beginning of the 20th century, said Planck, they balked at taking the “enormous step from the visible and directly controllable to the invisible sphere, from the macrocosm to the microcosm.”
But with the entry into the “microcosm” of the once-invisible world of atoms, all physical science was transformed. When it turned out early in the 20th century that the atom was not a “massy unbreakable particle,” as Isaac Newton had imagined, but a complex arena of quantum information, the classical physics of Newton began inexorably to break down. We are now at a similar point in the history of the sciences of life. The counterpoint to the atom in physics is the cell in biology. At the beginning of the 21st century it turns out that the biological cell is not a “simple lump of protoplasm” as long believed but a microcosmic processor of information and synthesizer of proteins at supercomputer speeds. As a result, breaking down as well is the established biology of Darwinian materialism.
No evolutionary theory can succeed without confronting the cell and the word. In each of the some 300 trillion cells in every human body, the words of life churn almost flawlessly through our flesh and nervous system at a speed that utterly dwarfs the data rates of all the world’s supercomputers. For example, just to assemble some 500 amino-acid units into each of the trillions of complex hemoglobin molecules that transfer oxygen from the lungs to bodily tissues takes a total of some 250 peta operations per second. (The word “peta” refers to the number ten to the 15th power — so this tiny process requires 250 × 10^15 operations per second.)
Interpreting a DNA program and translating it through a code into a physical molecule, the cells collectively function at almost a thousand times the processing speed of IBM’s new Blue Gene/L state-of-the-art supercomputer. This information processing in one human body for just one function exceeds by some 25 percent the total computing power of all the world’s 200 million personal computers produced every year.
Yet, confined as they are to informational functions, computer models stop after performing the initial steps of decoding the DNA and doing a digital-to-analog conversion of the information. The models do not begin to accomplish the other feats of the cell, beginning with the synthesis of protein molecules from a code, and then the exquisitely accurate folding of the proteins into the precise shape needed to fit them together in functional systems. This process of protein synthesis and “plectics” cannot even in principle be modeled on a computer. Yet it is essential to the translation of information into life.
WORRYING THE WORD
Within the Panel of Peers, the emergence of the cell as supercomputer precipitated a mostly unreported wave of consternation. Crick himself ultimately arrived at the theory of “panspermia” — in which he speculated that life was delivered to the earth from other galaxies, thus relegating the problems of creation to a realm beyond our reach. Sensing a crisis in his then exclusively materialist philosophy, neo-Darwinian Richard Dawkins of Oxford coined the word “meme” to incorporate information in biology, describing ideas as undergoing a Darwinian process of survival of the fittest. But in the end Dawkins’s memes are mere froth on the surface of a purely chemical tempest, fictive reflections of material reality rather than a governing level of information. The tongue still wags the mind.
These stratagems can be summed up as an effort to subdue the word by shrinking it into a physical function, whimsically reducing it to a contortion of the pharynx reflecting a firing of synapses following a memetic emanation of matter from a random flux of quanta shaking physical atoms. Like the whirling tigers of the children’s fable, the recursive loops of names for the word chase their tails around the tree of life, until there is left at the bottom only a muddled pool of what C. S. Lewis called “nothing buttery.”
“Nothing buttery” was Lewis’s way of summing up the stance of public scientists who declared that “life” or the brain or the universe is “nothing but” matter in motion. As MIT’s Marvin Minsky famously asserted, “The brain is nothing but a ‘meat machine.’” In DNA (2003), Crick’s collaborator James Watson doggedly insisted that the discovery of DNA “proved” that life is nothing but or “merely chemistry and physics.” It is a flat-universe epistemology, restricted to what technologists call the “physical layer,” which is the lowest of seven layers of abstraction in information technology between silicon chips and silica fiber on the bottom and the programs and content at the top.
After 100 years or so of attempted philosophical leveling, however, it turns out that the universe is stubbornly hierarchical. It is a top-down “nested hierarchy,” in which the higher levels command more degrees of freedom than the levels below them, which they use and constrain. Thus, the higher levels can neither eclipse the lower levels nor be reduced to them. Resisted at every step across the range of reductive sciences, this realization is now inexorable. We know now that no accumulation of knowledge about chemistry and physics will yield the slightest insight into the origins of life or the processes of computation or the sources of consciousness or the nature of intelligence or the causes of economic growth. As the famed chemist Michael Polanyi pointed out in 1961, all these fields depend on chemical and physical processes, but are not defined by them. Operating farther up the hierarchy, biological macro-systems such as brains, minds, human beings, businesses, societies, and economies consist of intelligent agents that harness chemical and physical laws to higher purposes but are not reducible to lower entities or explicable by them.
Materialism generally and Darwinian reductionism, specifically, comprise thoughts that deny thought, and contradict themselves. As British biologist J. B. S. Haldane wrote in 1927, “If my mental processes are determined wholly by the motions of atoms in my brain, I have no reason to suppose my beliefs are true . . . and hence I have no reason for supposing my brain to be composed of atoms.” Nobel-laureate biologist Max Delbrück (who was trained as a physicist) described the contradiction in an amusing epigram when he said that the neuroscientist’s effort to explain the brain as mere meat or matter “reminds me of nothing so much as Baron Munchausen’s attempt to extract himself from a swamp by pulling on his own hair.”
Analogous to such canonical self-denying sayings as The Cretan says all Cretans are liars, the paradox of the self-denying mind tends to stultify every field of knowledge and art that it touches and threatens to diminish this golden age of technology into a dark age of scientistic reductionism and, following in its trail, artistic and philosophical nihilism.
All right, have a tantrum. Hurl the magazine aside. Say that I am some insidious charlatan of “creation-lite,” or, God forfend, “intelligent design.” “In the beginning was the Word” is from a mystical passage in a verboten book, the Bible, which is not a scientific text. On your side in rebuffing such arguments is John E. Jones III of central Pennsylvania, the gullible federal judge who earlier this year made an obsequious play to the Panel of Peers with an attempted refutation of what has been termed “intelligent design.”
But intelligent design is merely a way of asserting a hierarchical cosmos. The writings of the leading exponents of the concept, such as the formidably learned Stephen Meyer and William Dembski (both of the Discovery Institute), steer clear of any assumption that the intelligence manifestly present in the universe is necessarily supernatural. The intelligence of human beings offers an “existence proof” of the possibility of intelligence and creativity fully within nature. The idea that there is no other intelligence in the universe in any other form is certainly less plausible than the idea that intelligence is part of the natural world and arises in many different ways. MIT physicist and quantum-computing pioneer Seth Lloyd has just published a scintillating book called Programming the Universe that sees intelligence everywhere emerging from quantum processes themselves — the universe as a quantum computer. Lloyd would vehemently shun any notion of intelligent design, but he posits the universe as pullulating with computed functions. It is not unfair to describe this ubiquitous intelligence as something of a Godlike force pervading the cosmos. God becomes psi, the “quantum wave function” of the universe.
All explorers on the frontiers of nature ultimately must confront the futility of banishing faith from science. From physics and neural science to psychology and sociology, from mathematics to economics, every scientific belief combines faith and facts in an inextricable weave. Climbing the epistemic hierarchy, all pursuers of truth necessarily reach a point where they cannot prove their most crucial assumptions.
The hierarchical hypothesis itself, however, can be proven. Kurt Gödel, perhaps the preeminent mathematician of the 20th century and Einstein’s close colleague, accomplished the proof in 1931. He demonstrated in essence that every logical system, including mathematics, is dependent on premises that it cannot prove and that cannot be demonstrated within the system itself, or be reduced to it. Refuting the confident claims of Bertrand Russell, Alfred North Whitehead, and David Hilbert that it would be possible to subdue all mathematics to a mechanical unfolding of the rules of symbolic logic, Gödel’s proof was a climactic moment in modern thought.
This saga of mathematical discovery has been beautifully expounded in a series of magisterial books and articles by David Berlinski, notably his intellectual autobiography Black Mischief (1986), The Advent of the Algorithm (2000), and Infinite Ascent: A Short History of Mathematics (2005). After contemplating the aporias of number theory in Black Mischief, he concluded, “It is the noble assumption of our own scientific culture that sooner or later everything might be explained: AIDS and the problems of astrophysics, the life cycle of the snail and the origins of the universe, the coming to be and the passing away. . . . Yet it is possible, too, that vast sections of our experience might be so very rich in information that they stay forever outside the scope of theory and remain simply what they are: unique, ineffable, insubsumable, irreducible.” And the irreducibility of mathematical axioms translates directly into a similar irreducibility of physics. As Caltech physicist and engineer Carver Mead, a guiding force in three generations of Silicon Valley technology, put it: “The simplest model of the galaxy is the galaxy.”
The irreducibility takes many forms and generates much confusion. Michael Behe, author of the classic Darwin’s Black Box (1996), shows that myriad phenomena in biology, such as the bacterial flagellum and the blood-clotting cascade, are “irreducibly complex” in the sense that they do not function unless all their components are present. It’s an all-or-nothing system incompatible with an evolutionary theory of slow, step-by-step incremental change. Behe’s claim of “irreducible complexity” is manifestly true, but it thrusts the debate into a morass of empirical biology, searching for transitional forms in the same way that paleontologists search for transitional fossils. Nothing definitive is found, but there are always enough molecules of smoke, or intriguing lumps of petrified stool or suggestive shards of bones or capsules of interesting gas, to persuade the gullible judge or professor that somewhere there was a flock of flying dragons or a whirling cellular rotaxane that fit the bill.
Mathematician Gregory Chaitin, however, has shown that biology is irreducibly complex in a more fundamental way: Physical and chemical laws contain hugely less information than biological phenomena. Chaitin’s algorithmic information theory demonstrates not that particular biological devices are irreducibly complex but that all biology as a field is irreducibly complex. It is above physics and chemistry on the epistemological ladder and cannot be subsumed under chemical and physical rules. It harnesses chemistry and physics to its own purposes. As chemist Arthur Robinson, for 15 years a Linus Pauling collaborator, puts it: “Using physics and chemistry to model biology is like using Lego blocks to model the World Trade Center.” The instrument is simply too crude.
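Chaitin’s measure — the length of the shortest program that reproduces a string — is uncomputable in general, but compressed size gives a rough, practical upper bound on it. A minimal Python sketch (the strings and sizes here are illustrative, not drawn from Chaitin’s work) contrasts a highly ordered sequence with a random one:

```python
import os
import zlib

def approx_complexity(data: bytes) -> int:
    """Crude upper bound on algorithmic (Kolmogorov/Chaitin) complexity:
    the length in bytes of the zlib-compressed representation."""
    return len(zlib.compress(data, 9))

ordered = b"ab" * 5000          # 10,000 bytes of pure repetition
random_ = os.urandom(10000)     # 10,000 bytes of noise

# The ordered string collapses to a few dozen bytes; the random one
# barely shrinks at all -- nearly every bit of it is irreducible.
print(approx_complexity(ordered), approx_complexity(random_))
```

The point of the analogy: a system whose description cannot be compressed into a short set of rules carries information that those rules alone do not supply.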
Science gained its authority from the successes of technology. When Daniel Dennett of Tufts wants to offer unanswerable proof of the supremacy of science, he writes, “I have yet to meet a postmodern science critic who is afraid to fly in an airplane because he doesn’t trust the calculations of the thousands of aeronautical engineers and physicists that have demonstrated and exploited the principles of flight.” Dennett is right: Real science is practical and demonstrable, following the inspiration of Michael Faraday, Heinrich Hertz, Thomas Edison, William Shockley, Robert Noyce, Charles Townes, and Charles Kao — the people who built the machines of the modern age. If you can build something, you can understand it.
The Panel of Peers, however, is drifting away from these technological foundations, where you have to demonstrate what you invent — and now seeks to usurp the role of philosophers and theologians. When Oxford physicist David Deutsch, or Scientific American in a cover story, asserts the reality of infinite multiple parallel universes, it is a trespass far beyond the bounds of science into the realm of wildly speculative philosophy. The effort to explain the miracles of our incumbent universe by postulating an infinite array of other universes is perhaps the silliest stratagem in the history of science.
Darwin’s critics are sometimes accused of confusing methodological materialism with philosophical materialism, but this is in fact a characteristic error of Darwin’s advocates. Multiverse theory itself is based on a methodological device invented by Richard Feynman, one that “reifies” math and sees it as a physical reality. (It’s an instance of what Whitehead called “the fallacy of misplaced concreteness.”) Feynman proposed the mapping of electron paths by assuming the electron took all possible routes, and then calculating the interference patterns that result among their wave functions. This method was a great success. But despite some dabbling as a youth in many-worlds theory, Feynman in his prime was too shrewd to suggest that the electron actually took all the possible paths, let alone to accept the theory that these paths compounded into entire separate universes.
Under the pressure of nothing buttery, though, scientists attempt to explain the exquisite hierarchies of life and knowledge through the flat workings of physics and chemistry alone. Information theory says this isn’t possible if there’s just one universe, and an earth that existed for only 400 million years before the emergence of cells. But if there are infinite numbers of universes all randomly tossing the dice, absolutely anything is possible. The Peers perform a prestidigitory shuffle of the cosmoses and place themselves, by the “anthropic principle,” in a privileged universe where life prevails on Darwinian terms. The Peers save the random mutations of nothing buttery by rendering all science arbitrary and stochastic.
Science still falls far short of developing satisfactory explanations of many crucial phenomena, such as human consciousness, the Big Bang, the superluminal quantum entanglement of photons across huge distances, even the bioenergetics of the brain of a fly in eluding the swatter. The more we learn about the universe, the more wide-open the horizons of mystery become. The pretense that Darwinian evolution is a complete theory of life is a huge distraction from the limits and language, the rigor and grandeur, of real scientific discovery. Observes Nobel laureate physicist Robert Laughlin of Stanford: “The Darwinian theory has become an all-purpose obstacle to thought rather than an enabler of scientific advance.”
In the 21st century, the word — by any name — is primary. Just as in Crick’s Central Dogma ordaining the precedence of DNA over proteins, however, the word itself is not the summit of the hierarchy. Everywhere we encounter information, it does not bubble up from a random flux or prebiotic soup. It comes from mind. Taking the hierarchy beyond the word, the central dogma of intelligent design ordains that word is subordinate to mind. Mind can generate and lend meaning to words, but words in themselves cannot generate mind or intelligence.
Retorts the molecular biologist: Surely the information in DNA generates mind all the time, when it gives the instructions to map the amino acids into the cells of the brain? Here, however, intercedes the central dogma of the theory of intelligent design, which bars all “magical” proteins that morph into data, all “uppity” atoms transfigured as bits, all “miracles” of upstream influence. DNA can inform the creation of a brain, but a brain as an aggregation of proteins cannot generate the information in DNA. Wherever there is information, there is a preceding intelligence.
At the dawn of information theory in 1948, MIT cybernetician and Shannon rival Norbert Wiener defined the new crisis of materialism: “The mechanical brain does not secrete thought ‘as the liver does bile,’ as the earlier materialists claimed, nor does it put it out in the form of energy as the muscle puts out its activity. Information is information, not matter or energy. No materialism that does not admit this can survive at the present day.”
This constraint on the Munchausen men of the materialist superstition is a hard truth, but it is a truth nonetheless. The hierarchies of life do not stop at the word, or at the brain. The universe of knowledge does not close down to a molecular point. It opens up infinitely in all directions. Superior even to the word are the mind and the meaning, the will and the way. Intelligent people bow their heads before this higher power, which still remains inexorably beyond the reach of science.
Throughout the history of human thought, it has been convenient and inspirational to designate the summit of the hierarchy as God. While it is not necessary for science to use this term, it is important for scientists to grasp the hierarchical reality it signifies. Transcending its materialist trap, science must look up from the ever dimmer reaches of its Darwinian pit and cast its imagination toward the word and its sources: idea and meaning, mind and mystery, the will and the way. It must eschew reductionism — except as a methodological tool — and adopt an aspirational imagination. Though this new aim may seem blinding at first, it is ultimately redemptive because it is the only way that science can ever hope to solve the grand challenge problems before it, such as gravity, entanglement, quantum computing, time, space, mass, and mind. Accepting hierarchy, the explorer embarks on an adventure that leads to an ever deeper understanding of life and consciousness, cosmos and creation.
Mr. Gilder is editor-in-chief of Gilder Technology Report and co-founder of the Discovery Institute. His most recent book, The Silicon Eye, was a finalist for the Royal Society’s Aventis Prize for science.
Elway Poll Results from Sound Politics
The Rossi campaign has just released the results of an Elway poll which shows that the state's voters believe:
1) If Rossi prevails in the manual recount, he should be accepted as the legitimate Governor.
2) If Gregoire increases her margin in King County as a result of changing the rules in the middle of the game, her "win" will not be accepted as legitimate and a run-off election should be held.
This in spite of the fact that the survey sample included significantly more Democrats than Republicans and an equal number who said they voted for each candidate.
The complete survey report is in the extended entry.
This summary presents response frequency distributions for the survey of Washington state registered voters on behalf of the Republican Governors Association.
Telephone interviews were completed with 405 registered voters on Dec. 16-18, 2004. The overall margin of sampling error is ±5%. That means, in theory, there is a 95% probability that the results of this survey are within ±5% of the results that would have been obtained by interviewing all registered voters in the state.
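That ±5-point figure follows from the textbook formula for a 95 percent confidence interval on a proportion, z·sqrt(p(1-p)/n), evaluated at the worst case p = 0.5. A quick Python check (n = 405 is the survey's own sample size; the rest is the standard formula, not anything stated in the report):

```python
import math

n = 405    # completed interviews
p = 0.5    # worst-case proportion (maximizes the error)
z = 1.96   # z-score for 95% confidence

margin = z * math.sqrt(p * (1 - p) / n)
print(f"±{margin * 100:.1f} points")   # ≈ ±4.9, reported as ±5
```

This is why sample sizes around 400 are so common in state polls: they deliver the conventional ±5-point margin at 95 percent confidence.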
The raw data have been statistically weighted to reflect the proportion of vote in the General Election.
The data are presented here in a replica of the questionnaire used in the interviews.
The figures in bold type are percentages of respondents who gave each answer.
Percentages may not add to 100 due to rounding.
FROM SAMPLE: VOTE HISTORY
SEX: MALE...50 FEMALE…50
**As you know, the recent election here in Washington state resulted in the closest election in history for Governor. In that election did you vote for…
Republican Dino Rossi…42
Democrat Christine Gregoire…42
Or Did You Not Vote in That Race…2
**As you know, because of the closeness of the election, the state conducted a re-count of all 2-point-8 million ballots. That recount resulted in Mr. Rossi being declared the winner by 42 votes.
The state Democratic Party requested a third count of the ballots – to be done by hand, instead of by machine. This recount is almost completed.
If Rossi gets the most votes this time, he will have won all three counts of the vote and will be declared the winner. In your opinion, will Rossi be the legitimate winner of the election? Or are there too many unresolved issues to say?
TOO MANY UNRESOLVED ISSUES…28
**It is possible that the results of this second recount will also be in dispute. If there are unresolvable disputes resulting from this second recount, how do you think the issue should be resolved:
By the Courts…24
By the Legislature…11
By the Voters in a Special Run-Off Election Between Rossi and Gregoire…50
**State law says that the result of the hand-count is final – whoever wins this count will be declared the Governor. Therefore, if Gregoire ends up with the most votes in this third count, she will be declared the winner. If King County were to change some of their counting rules part way through this third count and Gregoire ends up winning, will Gregoire be the legitimate winner of the election, in your opinion? Or would Rossi then be justified in asking for a new election to vote again?
ROSSI JUSTIFIED IN ASKING FOR A NEW ELECTION…51
**If King County were to change some of their counting rules part way through this third count and the final tally from this third count results in a margin of victory for Gregoire that is more than 100 votes, will Gregoire be the legitimate winner of the election, in your opinion? Or would Rossi then be justified in asking for a new election to vote again?
ROSSI JUSTIFIED IN ASKING FOR A NEW ELECTION…52
**If there is a continuing dispute, it will be over the legitimacy of the ballots and the counting process used in each county. There has never been a contested state-wide election in Washington. In the past, the courts have resolved contested local elections. However, since this would be the first time, the legislature could be called upon to resolve this election.
If it goes to the Legislature, it could work much like an impeachment process, where Senators and Representatives hear testimony from both sides – then vote to determine the winner.
Given this, how do you think the new governor should be determined if there are still unresolved disputes after this second recount:
By the Courts…26
By the Legislature…12
By the Voters in a Special Run-Off Election Between Rossi and Gregoire…52
**What is your overall opinion of the two candidates today?
Would you say your opinion of [READ & ROTATE] is Very Favorable, Somewhat Favorable, Somewhat Unfavorable, or Very Unfavorable?
[REPEAT QUESTION WITH SECOND NAME]
VERY FAV / SMWT FAV / SMWT UNFAV / VERY UNFAV / DK
1. Christine Gregoire 22 29 23 20 5
2. Dino Rossi 31 33 17 8 12
**I have just a few last questions for our statistical analysis.
First, how old are you?
**What is the last year of schooling you completed?
**Which of the following best describes you at this time? Are you. . .
Employed in the public sector, like a governmental agency or educational institution...15
Employed in private business...36
Not working right now...9
**If you had to register by party in order to vote, would you register as a Democrat, a Republican, or Independent?
Thank you very much. You have been very helpful.