More Loaded Dice
Several years ago I had an exchange on this blog with Professor Robert Altemeyer over his claim that authoritarianism was more common on the political right than on the political left. I argued that the survey on which his claim was based was, probably not intentionally, loaded. Questions about respect for authority consistently referred to authorities more popular on the right than on the left, while questions about people bravely defying authority referred to forms of defiance more popular on the left than on the right. Hence people on the left would appear, by their scores on his questions, less authoritarian than they were, and people on the right more. I recently encountered the same problem in a different context, this time in an article describing a study that purported to show that people on the right are more often misinformed about public issues than people on the left.
The obvious way to rig the results of such a poll is to select questions where the answer you consider mistaken is more popular with one side than the other. Most people who believe Obama was not born in the U.S. are on the right. Most people who believe the Chamber of Commerce used foreign money to influence the most recent election are on the left. By my count, for at least seven of the eleven questions the answer that the study's authors considered misinformed was a view more popular with the right than the left. Only one, the Chamber of Commerce question, went the other way.
A second problem with the study was that, for at least three of its eleven questions (whether the stimulus had saved several million jobs, whether the economy was recovering, whether Obamacare increased the deficit), the right answer was unclear. In none of the three did the study's authors provide adequate support for their view, which, in each case, coincided with the claims of the Administration.
I first heard of the study via a critical piece on Reason's blog. A while later, I came across another reference to it, a Usenet post by someone who obviously approved of its conclusions. I responded and pointed out the problems.
With regard to the three questions where the study's answer was less obviously correct than its authors thought, I can easily imagine a reasonable person disagreeing with me, arguing that the study at worst mildly exaggerated how clear the right answer was. I do not, however, see how any reasonable person could fail to see the way in which the selection of questions was biased, once it was pointed out.
I am now waiting to see if there is anyone reading that particular Usenet thread who is willing to admit that the evidence for a conclusion he likes is bogus.