The stream of articles that chip away at the classical view of reason continues. Cordelia Fine, writing in today’s NYTimes Sunday Review, suggests that scientific “truth” is established by being pigheaded. In the article “[Biased but Brilliant](http://www.nytimes.com/2011/07/31/opinion/sunday/biased-but-brilliant-science-embraces-pigheadedness.html?_r=1&ref=opinion),” she writes
> HOW’S this for a cynical view of science? “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”
>
> Scientific truth, according to this view, is established less by the noble use of reason than by the stubborn exertion of will. One hopes that the Nobel Prize-winning physicist Max Planck, the author of the quotation above, was writing in an unusually dark moment.
The metaphor about dying that Planck used could be interpreted as the ultimate surrender of the opposition and their withdrawal from the battle. Kuhn’s theory of scientific revolutions is similar. New paradigms capable of explaining currently imponderable observations emerge and, as the established views and those who hold them recede, slowly come to dominate the scientific debate. The explanatory power of the new theories is central. Pigheadedness may be an appropriate descriptor at the early stages of a scientific argument, but the resistance is probably due to the fact that the new paradigms float in space for a while, detached from the beliefs and norms that have become institutionalized.
Our beliefs and the norms they spawn are created through experience. We don’t arbitrarily move from one set to another unless the present structure no longer works, that is, when actions based on these beliefs fail to produce the results we seek, whether satisfactorily explaining an observation or producing actions that fulfill our intentions. Science is related to the first of these; it is a way of finding explanations of observations that are robust and accurate. All the pigheadedness in the world will not change the underlying phenomena on which the observations are based. The Earth is not flat, no matter how many would deny it. If the observations are flawed through the inadequacy of the technology used to produce them, then it would be possible to pigheadedly argue for some “truth” that will be refuted at a later time. After the first space flights, flat Earth proponents would have found few adherents.
More is at play than pigheadedness. We tend, for sure, to hold on to our beliefs even as we encounter “facts” that begin to erode them. Fine says that this tendency is a “worry.”
> Doesn’t the ideal of scientific reasoning call for pure, dispassionate curiosity? Doesn’t it positively shun the ego-driven desire to prevail over our critics and the prejudicial urge to support our social values (like opposition to the death penalty)? . . . Perhaps not. Some academics have recently suggested that a scientist’s pigheadedness and social prejudices can peacefully coexist with — and may even facilitate — the pursuit of scientific knowledge.
Here, I believe, she makes an error found in many similar critiques of scientific “reasoning.” “Scientific” research is an activity designed to produce the truth about a set of observed phenomena. Truth in these cases is a story, in language, that explains the findings in such a way that others can reproduce them. Scientific investigations largely seek to confirm previous theories while pushing their regions of validity ever more widely. Only rarely do scientific studies produce wholly new truths, that is, stories that explain previously unknown or mysterious observations. The sociological processes are different in these two cases. Peer review is used in the first to assure that the work meets established criteria. Biases and prejudices are always in play, and errors are made in both accepting and rejecting the new findings. But the process, if well designed and executed, will generally converge on the “truth.” The biases and prejudices ultimately are of little consequence.
The situation is completely different when scientific findings and theories are used to establish some derivative position. Fine says, “Clearly, social values should never count as evidence for or against a particular hypothesis — abhorrence of the death penalty does not count as data against its crime-deterrent effects.” She uses a couple of examples based on determining the risk of accepting findings as true and acting on the consequent conclusions when there is some possibility that the criteria used to establish the “safe” threshold may be incorrect. The decision to use the results or not depends on the implicit or explicit risk aversion profile of the decision maker; no science is involved at all. Fine and the author she cites, Heather Douglas, confuse scientific methodology with carefully structured reasoning.
Scientific data are just one piece in complicated reasoning chains invoked to argue for and against action, particularly in public debates such as those connected with climate change. These arguments are not science merely because scientific claims are being employed. The confusion arises from at least two sources. The first is that we erroneously and misleadingly label arguments that include scientific claims in their reasoning chains as “scientific arguments.”
> Science often makes important contributions to debates that involve clashes of social values, like the protection of public health versus the protection of private industry from overregulation. Yet Ms. Douglas suggests that, with social values denied any legitimate role in scientific reasoning, “debates often dance around these issues, attempting to hide them behind debates about the interpretation of data.” Professors A and B are left with no other option but to conclude that the other is a stubborn, pigheaded excuse for a scientist.
The second is that we confuse the roles of “scientists” in these debates. We tend to fuzz the boundaries, not surprisingly, between the statements of scientists that arise from their scientific knowledge and the arguments they make that lie outside of this domain. The authority of scientists, the persons, rests on their reputations in the first domain, that is, scientific knowledge, but they may, intentionally or not, fail to distinguish between these two classes in their statements. Almost all serious public debates involve a mix of scientific “facts,” which can, in theory, be grounded through a methodological consensus, and social values, which, by their fundamental nature, cannot.
I really don’t see where pigheadedness can be beneficial in science, per se. Disbelief and counterargument, for sure, but good scientists are generally committed to the finding of truths. The institutions in which they work have this as a goal. The denial of these truths by disputants outside of the scientific field is, on the other hand, pigheadedness when consensus reigns among scientists. The data may be wrong; science is all about setting up hypotheses and seeking to disprove them, thereby moving truth along. It is reasonable, in the sense of trying to win an argument, to claim that particular scientific facts should not be used because the consequences of being wrong (always a possibility) would be so great that preserving the status quo is a better decision. But this is all about regret, not about truth. Unfortunately, the sloppiness of public debate on almost all issues today has made these distinctions disappear. We spend too much time focused on the structure of the arguments instead of the substance.
