The German biochemist Bruno Müller-Hill tells a memorable story to illustrate his thesis that "self-deception plays an astonishing role in science in spite of all the scientists' worship of truth":
When I was a student in a German gymnasium and thirteen years old, I learned a lesson that I have not forgotten.... One early morning our physics teacher placed a telescope in the school yard to show us a certain planet and its moons. So we stood in a long line, about forty of us. I was standing at the end of the line, since I was one of the smallest students. The teacher asked the first student whether he could see the planet. No, he had difficulties, because he was nearsighted. The teacher showed him how to adjust the focus, and that student could finally see the planet, and the moons. Others had no difficulty; they saw them right away. The students saw, after a while, what they were supposed to see. Then the student standing just before me -- his name was Harter -- announced that he could not see anything. "You idiot," shouted the teacher, "you have to adjust the lenses." The student did that and said after a while "I do not see anything, it is all black." The teacher then looked through the telescope himself. After some seconds he looked up with a strange expression on his face. And then my comrades and I also saw that the telescope was nonfunctioning; it was closed by a cover over the lens. Indeed, no one could see anything through it. [From "Science, Truth, and Other Values," by Bruno Müller-Hill, Quarterly Review of Biology, Sept 1993 (Vol. 68, No. 3), pp. 399-407.]
Müller-Hill reports that one of the docile students became a professor of philosophy and director of a German TV station. "This might be expected," he wickedly comments. But another became a professor of physics, and a third professor of botany. The honest Harter had to leave school and go to work in a factory. If in later life he was ever tempted to question any of the pronouncements of his more illustrious classmates, I am sure he was firmly told not to meddle in matters beyond his understanding.
One might derive from this story a satirical "Harter's Precept," to put alongside Parkinson's Law (bureaucracy expands to the limit of the available resources) and the Peter Principle (everyone rises in a hierarchy up to his level of incompetence). Harter's Precept says that the way to advance in academic life is to learn to see what you are supposed to see, whether it is there or not. As Sam Rayburn used to explain to new members of Congress, you've got to go along to get along.
Richard Hamilton's The Social Misconstruction of Reality indicates that many social scientists seem to have guided their careers by the light of Harter's Precept. Hamilton gives three major examples of erroneous theses that gained the status of fact in social science despite the absence of evidentiary support: (1) Max Weber's thesis that the Protestant Ethic spurred the advance of capitalism; (2) the widely accepted thesis that Hitler's main electoral support came from the lower middle classes (the despised petit bourgeoisie of Marxism); and (3) Michel Foucault's thesis that the modern prison evolved not as a more humane alternative to the cruel physical punishments of earlier centuries, but as part of a wide-ranging scheme by sinister forces to enforce a pervasive social conformity.
Persons who want to follow the fact-heavy specifics of the three cases should read Hamilton. Very briefly, Weber relied on slender evidence and based a major point on a typographical error. He thus offered an elaborate explanation for a phenomenon -- the supposed domination of the business economy by Protestants -- which the data did not support. The claim that Hitler obtained his electoral support disproportionately from the lower middle class arose from ideology rather than evidence, and became generally accepted because influential scholars uncritically repeated it. Actual analysis of voting records published in the 1980s showed that the lower middle class did not provide exceptional support for Hitler, his support being greatest among the more affluent groups.
The easiest of Hamilton's three case studies to summarize, and the one with the greatest current impact, is that of Michel Foucault. Foucault's book Discipline and Punish argued that the modern prison replaced the barbaric punishments of earlier ages because a malevolent "they" decided that brutality was less effective than a more superficially humane punishment "that acts in depth on the heart, the thoughts, the will, the inclinations." To accomplish this pervasive control, "they" adopted Jeremy Bentham's "panopticon" design for the prison, which permits guards to watch every solitary prisoner in every cell at all times. Such controls were then extended "throughout the social body," to create a "disciplinary society" of domination.
The factual basis of Foucault's thesis consisted mainly of offhand assertions and grotesque errors, supported with references to obscure sources that didn't really support the text. For example, Bentham's panopticon was an eccentricity that was rarely copied and never successful; most modern prisons are built on the entirely different "Auburn" plan. Nineteenth-century governments preferred to transport felons to distant penal colonies rather than to keep them constantly under surveillance. Foucault's method was a freewheeling interpretation of selected texts, and he made little effort to situate the texts in context or to distinguish eccentric proposals and exceptional events from regular practice.
These methodological blunders escaped the notice of the eminent reviewers who praised Foucault's history and helped make him fashionable in the academy. The reviewers also failed to notice the transparently paranoid nature of Foucault's attribution of imaginary evils to unspecified malevolent forces. Just as Weber's thesis was revered by sociologists but ignored by historians who knew the facts, Foucault is a cult hero in the humanities but not taken seriously by penologists.
The cases of Weber and Foucault are otherwise very different, however. Müller-Hill warns us that even scientists who worship truth are prone to deceive themselves, but at least they are embarrassed when their errors come to light -- as Weber would have been. Disciples of Foucault, on the other hand, are typically indifferent to criticisms of their master's accuracy. This is not surprising, because Foucault himself was a nihilist obsessed with power, who believed that "truth" was something imposed by powerholders, not something found by impartial investigation.
Foucault was also personally a sociopath, who sought the limits of experience in drug use and in the sado-masochistic depths of San Francisco's gay bathhouses. (He died of AIDS in 1984.) His use of deliberate or reckless errors in his scholarship may have been another attempt to "test the limits," just to see what he could get away with. Foucault's writing is not about objective reality, which to him was a form of tyranny, but about his own obsessions and his determination to defy every restraint. That such an enemy of reason became one of the most influential and honored of late-twentieth-century philosophers dramatically refutes his own thesis that we live in a "disciplinary society." (Why did "they" tolerate Foucault?) It tends rather to suggest that parts of our universities are going through what Plato described as the last stages of democracy, when all restraints break down and nihilism is both preached and practiced.
Hamilton concludes with an analysis of how major scholarly errors get made and perpetuated: theorists fall in love with their theories, the appearance of scholarship can often pass for the reality, and clever charlatans like Foucault can evade the scrutiny of specialists and appeal directly to those who are all too willing to deceive themselves. Above all, there is the wisdom of Harter's Precept. Revealing that the lens cap is still on the telescope isn't necessarily good for one's career.
Copyright © 1997 Phillip E. Johnson. All rights reserved. International copyright secured.
File Date: 5.15.97
This data file may be reproduced in its entirety for non-commercial use.
A return link to the Access Research Network web site would be appreciated.
Documents on this site which have been reproduced from a previous publication are copyrighted through the individual publication. See the body of the above document for specific copyright information.