The text below was modified slightly from a comment I left over at Crooked Timber. After writing it, I thought it held up ok as a separate piece of writing, divorced from the comment thread, so I’m just posting it here with minimal alterations:
Science is hard. It’s genuinely difficult to achieve even a small amount of mastery in your own alleged area of expertise. There’s so much of it, and so much more appearing every day. There are varying responses to this problem. One is to write off any results that disagree with conclusions one has already reached by other means. Another is to set up institutions in which legitimate inquiries after truth can actually be carried out and debated. That’s a great meta-solution, in my view, but unsurprisingly it comes with its own meta-problems. Now you’ve got a whole other layer of professional scientists who, to the untutored observer, appear interposed priestlike between you and the truth. As with any sufficiently complex institution (i.e. one involving more than five people), mystification sets in. If you’re already determined to disregard what the scientists are saying in the first place, what is in reality an imperfect mechanism for adjudicating truth claims begins to look like a conspiracy to suppress your great uncle’s naturopathic cure for cancer. And the thing about conspiracies is that they can never be disproven; any evidence against the conspiracist conclusion merely becomes additional proof that those who offer it are in on the conspiracy.
In the right (wrong) sorts of circumstances, this problem becomes a horrible vicious circle. It can only be resolved by taking a step back and trying to understand science as a human institution and scientists as human practitioners; in other words, trying to figure out what scientists are doing and why. That is also very hard, especially if you come from outside a scientific discipline, because you’ll be entering discussions in which you lack the requisite terminology for understanding all the little details. That’s why science communication is a two-way street: if the average person bears some responsibility for trying to understand how science gets done, then scientists bear a commensurate responsibility to explain that process in a way that’s understandable. Sadly, scientists have often failed at this task; those who can do it well, like Carl Sagan, Neil deGrasse Tyson, and P.Z. Myers, are worth their weight in gold because they’re quite rare.
The problem with people like global warming deniers and the anti-vax crowd is that everything they do undermines these institutions. If you only care about being right instead of getting it right (parsing the distinction is left as an exercise for the reader), then all this stuff like peer review and independent verification is just so much cruft to be discarded when it runs up against something you badly want to be true. The danger is that sooner or later you’ll saw off the branch you’re sitting on, as the Russian expression goes, and when you actually need those mechanisms and institutions to function properly because they affect your own life, they won’t.