Kathryn Schulz talks about being open to the possibility of being wrong:
Schulz has some really good insights here, I think, particularly about a couple of things. First, our fear of being wrong is closely tied to our fear that something is intrinsically wrong with us — that is, we identify so closely with certain cherished beliefs that having those beliefs shaken amounts to a crisis of identity and self-worth. And second, an unshakeable faith in our rightness causes us to assume “ignorance, idiocy, and evil” in those who disagree with us. But in this particular talk she doesn’t answer an important question: While it’s important to acknowledge our own human fallibility, how do we know when we are, in fact, right? She does a better job addressing that issue here (at 14:38):
“If we can’t look inside to figure out whether or not we might be wrong, then we need to look outside. We need to look to each other.” In other words we can’t simply trust the feeling of rightness that we get from closely-held beliefs and personal perspectives; we need to look outwards, to evidence and to criticism.
This is why subjective experience cannot tell us anything about God. Knowing what kind of god someone believes in tells us a great deal about that person — but nothing whatsoever about the truth or otherwise of the existence of any god at all.
The subjective feeling of rightness, in other words, is not to be trusted when we’re trying to get a handle on reality. We have to look to the facts.
Of course scientists, being human, aren’t immune from mistakes either. Kenan Malik brings up the fascinating case of Stephen Jay Gould, whose classic The Mismeasure of Man tore apart race scientist Samuel Morton’s argument for empirically measurable racial differences. But it turns out, according to the authors of a new scientific paper, that Gould himself was apparently influenced by anti-racist ideology to skew the existing data in favor of his argument. The point isn’t that Morton was right — his ideas about race remain unfounded, outdated, and discredited — but that Gould, in attacking Morton’s data as skewed by ideological bias, was prevented by his own bias from assessing the data in an impartial light. The problem, Malik suggests, isn’t with science itself but with the marriage of science and ideology:
We need more, therefore, than simply an affirmation of faith in the scientific method. We need also constant policing of those areas in which science meets ideology. We need, too, a commitment to skepticism and a willingness constantly to question, particularly in those areas in which science seems unblinkingly to back the predominant social or cultural views.
Paradoxically, Malik argues, that commitment to skepticism and questioning arises out of the very “social embeddedness of science” that led Morton and Gould to make their culturally biased mistakes to begin with. Because science is a social activity, scientists can be influenced by the prevailing social and cultural attitudes of the day. But because science is a collective pursuit, any scientist’s claims can also be criticized and corrected by others in their turn: Morton by Gould, and now Gould by his new questioners. Science may never be completely objective, but the pursuit of objectivity is what makes the scientific endeavor worthwhile:
Scientists live in particular societies, and are shaped by particular cultures. The questions they ask about the world and the interpretations they place on their data are inevitably formed by cultural attitudes, needs and possibilities. Because scientific practice is socially bound, it is open to ideological corruption. But it is also the social embeddedness of science that provides the means to combat such corruption. The weapons we need to defend scientific objectivity are themselves social practices: an open society, the encouragement of free debate, a skepticism of accepting truth on authority, a willingness to question received wisdom, an acknowledgement of the political independence of scientific research. Ironically, it is precisely because science is a social endeavour that it is able to ‘escape the bounds and blinders of cultural contexts’.
Science, in other words, seems to deal with the inevitability of being wrong in exactly the way that Kathryn Schulz recommends: by looking outwards, to evidence and critics and adversaries, and by inviting people in to figure out exactly where we have things wrong — and where we have things right.
This is something that I think she unfortunately glosses over in the second video above (at around 17:00). It isn’t merely the case, as Schulz seems to suggest, that scientific paradigms are overturned time and again (and therefore we can expect that our current ideas will someday be considered wrong too). Rather, science is in the business of being increasingly less wrong as time goes by. (She does hint at this with Richard Rorty’s quote about “the permanent possibility of someone having a better idea” — though I think she could have expanded on Rorty’s notion that ideas not only are possibly wrong but actually get better, or in other words more right.) As Isaac Asimov explains in his classic essay “The Relativity of Wrong”: “Theories are not so much wrong as incomplete.”
(This applies not only to science but to cultural paradigms at large. We don’t arbitrarily consider something “right” one day, and something completely different “right” the next. Our morality improves with time — so even though our notions of human dignity and human rights are still imperfect and incompletely realized, that doesn’t mean we’ll ever turn around and go back to thinking, for instance, that it’s okay to treat women and blacks as property.)
Schulz’s point is well-taken: We should never smugly assume that we’re right merely on the basis of instinct, desire, belief, tradition, or personal feeling; and we should always be open to evidence and the perspectives of others. But if we are to avoid being stuck in paralyzing self-doubt, we should also have the wisdom to determine when we are right; the confidence to decide when certain questions are settled and certain perspectives no longer have credibility; and the courage — after carefully considering the evidence and arguments at hand — to speak and act on our convictions.