Why do smart people disagree about facts?

Because climate change is a concept developed by climate scientists, there is a widespread belief that once the facts are known, there should be nothing to stop a proper course of action from being implemented. The problem is that sometimes the facts are not as clear as they seem to be, or at least that is what some people claim. So do we get a pseudo-controversy where there is no real reason to disagree? Are the media presenting a false symmetry of positions where one side has no standing (see Seumas Milne's comment today)?


Two eminent social scientists challenge this line of reasoning: David Victor (UC San Diego) and Dan Kahan (Yale).

'Why do smart people disagree about facts? Some Perspectives on Climate Denialism' -- this is the title of a talk given by David Victor at the end of January at the Scripps Institution of Oceanography. Andy Revkin has covered the talk and put up a link to the paper here. Dan Kahan spoke at the STS event last week at my university, where I had the pleasure of talking to him before and after his lecture. The title of his talk was 'Culture, rationality, and the tragedy of the science communications commons.'

Victor examines the role of climate consensus and the role of contrarians. With regard to the latter, he distinguishes three categories of 'denialists': shills, skeptics, and hobbyists. The shills are the professional policy delayers, the skeptics are people like Freeman Dyson, and the hobbyists populate the blogosphere. Victor thinks that their influence is vastly overestimated:
[T]he whole climate science and policy community is spending too much time thinking about the denialists and imagining that if we could just muzzle or convince these outliers that policy would be different. That’s not right—in part because the denialists aren’t such a hearty band and in part because the real barriers to policy are cost and strategy.

His conclusion is reprinted below, but read the whole paper; it is illuminating.

First, we in the scientific community need to acknowledge that the science is softer than we like to portray. The science is not “in” on climate change because we are dealing with a complex system whose full properties are, with current methods, unknowable. The science is “in” on the first steps in the analysis—historical emissions, concentrations, and brute force radiative balance—but not for the steps that actually matter for policy. Those include impacts, ease of adaptation, mitigation of emissions and such—are surrounded by error and uncertainty. I can understand why a politician says the science is settled—as Barack Obama did…in the State of the Union Address, where he said the “debate is over”—because if your mission is to create a political momentum then it helps to brand the other side as a “Flat Earth Society” (as he did last June). But in the scientific community we can’t pretend that things are more certain than they are.
Second, under pressure from denialists we in the scientific community have spent too much time talking about consensus. That approach leads us down a path that, at the end, is fundamentally unscientific and might even make us more vulnerable to attack, including attack from our own. The most interesting advances in climate science concern areas where there is no consensus but the consequences for humanity are grave, such as the possibility of extreme catastrophic impacts. We should talk less about consensus and more about the consequences of being wrong—about the lower probability (or low consensus) but high consequence outcomes. Across a large number of climate impacts the tails on the distributions seem to be getting longer, and for policy makers that should be a call for more action, not less. But people don’t really understand that, and we in the scientific community haven’t helped much because we are focused on the consensus-prone medians rather than the tails.  
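
Victor's point about the tails can be made concrete with a toy calculation -- my illustration, not something from his paper. If hypothetical climate 'damages' are drawn from a lognormal distribution with made-up parameters, fattening the tail leaves the median outcome essentially unchanged while the expected damage rises sharply. A minimal Python sketch, assuming these invented numbers:

import math
import random
import statistics

random.seed(42)

def simulate(mu, sigma, n=200_000):
    """Draw n lognormal 'damage' outcomes; return (median, mean) of the draws."""
    draws = [random.lognormvariate(mu, sigma) for _ in range(n)]
    return statistics.median(draws), statistics.fmean(draws)

mu = 1.0  # same central tendency in both scenarios: median = e**mu, about 2.7
for label, sigma in [("thin tail", 0.5), ("fat tail", 1.5)]:
    med, mean = simulate(mu, sigma)
    # Analytic lognormal values: median = e**mu, mean = e**(mu + sigma**2 / 2)
    analytic_mean = math.exp(mu + sigma ** 2 / 2)
    print(f"{label:9}  median ~ {med:5.2f}   expected damage ~ {mean:5.2f}"
          f"  (analytic mean {analytic_mean:5.2f})")

With the same median of roughly 2.7 (in arbitrary units), the expected damage roughly triples when the tail is fattened. That is the sense in which longer tails are a call for more action, not less, even though the consensus-prone median barely moves.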

In a similar vein, Dan Kahan argues that people's tendency to engage in motivated reasoning while trying to maintain social bonds is the root cause of the highly polarized science communication around climate change (the slides and notes from his talk are on his blog; see also his Nature paper 'Why we are poles apart on climate change').

During his talk at Nottingham, Dan presented research examining two approaches from psychology, one called the 'public irrationality thesis' (PIT), the other the 'cultural cognition thesis' (CCT). According to PIT, public controversy over climate change and other societal risks can be attributed to the public’s excessive reliance on unconscious, affect-driven heuristics (“system 1” in Kahneman's terminology) and its inability to engage in the conscious, effortful, analytic form of reasoning (“system 2”) that characterizes expert risk analysis. But this is not borne out by the facts, as Kahan explains: 'Those members of the public who display the greatest degree of “system 2” reasoning ability are no more likely to hold views consistent with scientific consensus. Indeed, they are even more likely to be culturally and ideologically polarized than members of the public who are most disposed to use “system 1” heuristic forms of reasoning.'

CCT, on the other hand, posits that the social bonds individuals maintain interact with their risk perceptions. Studies show that individuals are much more ready to perceive 'scientists to be “experts” worthy of deference on disputed societal risks when those scientists support the position that is predominant in individuals’ cultural group.'

This selectivity can be expected to generate diverging perceptions of what expert consensus is on disputed risks.  And, indeed, empirical evidence confirms this prediction.  No cultural group believes that the position that is dominant in its group is contrary to scientific consensus—and across the run of disputed societal risks, all of the groups can be shown to be poorly informed on the state of expert opinion.

The source of the science communication problem is not too little rationality on the part of the public but rather too much.  The behavior of an ordinary individual as a consumer, a voter, or an advocate, etc., can have no material impact on the level of risk that person or anyone else faces from climate change. But if he or she forms a position on that issue that is out of keeping with the one that predominates in that person's group, he or she faces a considerable risk of estrangement from communities vital to his or her psychic and material well-being.  Under these conditions, a rational actor can be expected to attend to information in a manner that is geared more reliably to forming group-congruent than science-congruent risk perceptions.  And those who are highest in critical reasoning dispositions will do an even better job than those whose “bounded rationality” leave them unable to recognize the evidence that supports their groups’ position or to resist the evidence that  undermines it.

Both Victor and Kahan point to an important issue: the role of scientific expertise in public affairs, and the social dynamics that ensue when risk issues are debated among scientists and the public at large is invited to comment (if only through opinion polls). The knee-jerk assumptions of non-specialists in the field are not borne out by the facts: progress on climate policy is not stalled primarily by contrarians, and more science education or information will do nothing to convince the public. Victor, the political scientist, shows how the different forms of climate denial are overestimated, while Kahan, the psychologist, shows that people actively seek information that fits the cultural group they belong to. No amount of 'neutral' information will change their views, and campaigns to educate the public (whether through science or alarm) are futile. It looks as if those of us who want to see progress in climate policy need to focus our energy on different issues.
