Tuesday, February 9, 2010

"Believe what you want to believe. Truth schmuth."

"People endorse whichever position
reinforces their connection to others with
whom they share important commitments."

Is it true that people are generally unwilling to change their minds, even in light of weighty evidence against a stance they've taken? Do people's beliefs really color their perceptions, shattering any semblance of a claim to objectivity (http://en.wikipedia.org/wiki/Confirmation_bias)? Do normal folks systematically reject evidence that disagrees with their opinions? Umm, yes.

As Dan Kahan points out in Nature, Vol. 463, 21 January 2010, "Fixing the Communications Failure": "public debate about science is strikingly polarized. The same groups who disagree on ‘cultural issues’ — abortion, same-sex marriage and school prayer — also disagree on whether climate change is real and on whether underground disposal of nuclear waste is safe." This seems to demonstrate an attitude of "if a study's conclusions, its data, or the scientific consensus conflicts with my opinion, simply discredit the offending evidence." This strategy is effective at maintaining an opinion; it is not effective at forming a correct one.

Why do people who subscribe to different moral views about environmental risk, public health, and crime control react differently to scientific data? Probably because it's offensive to accept some of the implications of scientific evidence. Say, hypothetically, that climate change research suggests that carbon emissions should be reduced (hang with me, I know this is a fantastic scenario; just use your imagination). Imagine further that some primary emitters of the pollution in question are commercial and industrial corporations. If you believe that business activity is a good thing and regulation of commerce is a bad thing, then you have an incentive to discredit the evidence. (*see excerpt below) As Kahan notes, "Cultural cognition also causes people to interpret new evidence in a biased way that reinforces their predispositions. As a result, groups with opposing values often become more polarized, not less, when exposed to scientifically sound information."

Case example: four years ago, the Centers for Disease Control and Prevention recommended vaccinating schoolgirls against the human papillomavirus (HPV). In a study, arguments for and against mandatory vaccination were attributed to fictional experts whose appearances were intentionally varied (e.g., denim shirt and beard versus suit and grey hair). When the individualistic-looking expert opposed mandatory vaccination, individualistic people became even more opposed. When the communitarian/egalitarian-looking expert touted the vaccine's safety, egalitarian folks became even more supportive. Then the roles were switched (the individualistic-looking expert supported the vaccine and the communitarian-looking expert opposed it). Rather than maintaining their opinions, people shifted their positions to match the position of the expert they identified with. "The experts whom laypersons see as credible, we have found, are ones whom they perceive to share their values." It seems "people deal with evidence selectively to promote their emotional interest in their group."

This selective perception of evidence is troubling because it leads groups of people (say, those who buy into the conspiracy theory about climate change vs. those who accept the scientific consensus on the matter) who share the same desired outcomes of health, safety, and economic well-being to advocate contrasting remedies, not all of which are equally likely to bring about those outcomes. Lex non novit patrem, nec matrem; solam veritatem: "the law knows neither father nor mother, only the truth."

Conclusion? Science needs better marketing, the object of which should be to create an environment for the public's open-minded consideration of the best science, rather than to promote particular stances. Also, people would be more likely to reach a correct position on issues by consciously acting against the tendency to "resist scientific evidence that could lead to restrictions on activities valued by their group," that is, by remaining open-minded. Topple that tower of preconceived notions, leaving in its wake an open mind. Sublato fundamento, cadit opus! ("the foundation being removed, the structure falls")

*excerpt below:
"People with individualistic values, who
prize personal initiative, and those with hierarchical
values, who respect authority, tend to dismiss
evidence of environmental risks, because
the widespread acceptance of such evidence
would lead to restrictions on commerce and
industry, activities they admire. By contrast,
people who subscribe to more egalitarian and
communitarian values are suspicious of commerce
and industry, which they see as sources
of unjust disparity. They are thus more inclined
to believe that such activities pose unacceptable
risks and should be restricted. Such differences,
we have found, explain disagreements in environmental-
risk perceptions more completely
than differences in gender, race, income, education
level, political ideology, personality type
or any other individual characteristic" (Kahan, D. M., Braman, D., Gastil, J., Slovic, P. & Mertz, C. K. J. Empir. Legal Stud. 4, 465–505 (2007).

Also relevant, some research indicating that, as one researcher wrote, "Humans are not only prone to make biased predictions, we're also damnably overconfident about our predictions and slow to change them in the face of new evidence."
