A commenter on a recent post of mine pointed me at an old post by Michael Huemer, “Is Critical Thinking Epistemically Responsible?”, in which he asks the same question I do but reaches a different conclusion. This post is my side of the argument; I have invited Michael to respond, possibly on his Substack, but have not yet gotten an answer. If and when he does I will add a link to his post here.
Consider three possible approaches to the problem of forming an opinion on a controversial issue. One is to look at the arguments and evidence as best you can and on that basis form your opinion, what Huemer describes as “critical thinking” and rejects. Another is to find out who the experts are and what conclusion they support. A third option is to conclude that you do not have an adequate basis for an opinion. While both of us agree that the third option is sometimes correct, he rejects the first as unworkable and argues for trusting the experts. I take the opposite position.
The extreme version of the first approach would be to go to the sources of information on which the experts base their views, read and evaluate them, examine the arguments, and reach your own conclusion on that basis. Doing that is a lot of work and likely to require expertise you do not have.
The same approach on easy mode is to find people, ideally smart people, arguing for each side, evaluate their arguments as best you can, and form your conclusion accordingly. This requires much less expertise, both because partisans of both sides will have done much of the work for you and because you can often evaluate arguments on grounds that do not require technical knowledge of the subject being argued about. If you find that people on one side are dishonest and those on the other side are not, that gives you some basis for a conclusion. If, more plausibly, you conclude that some people on one or both sides are dishonest, making incoherent arguments, or giving an incompetent account of parts of the issue on which you happen to have relevant expertise, you can reject those people and their arguments and form your conclusion on the basis of what is left.
Huemer rejects that approach as too difficult, in favor of the alternative of believing the experts:
Obvious point: experts tend to be better than non-experts at correctly assessing difficult issues, due to their obvious cognitive advantages over lay people. E.g., the experts typically have greater than average intelligence, much greater knowledge about the issue in question, and have also spent more time thinking about the issue than you. That’s why they’re called “experts”.
E.g., say you’re an ordinary person who wants to form an opinion about the wisdom of gun control laws. Well, there are smart people who have devoted many years to studying that. Do you suppose they learned anything during those years? How could they possibly fail to be more reliable than you? Are we going to say that intelligence has no effect on ability to figure out the truth? Are we going to say that knowledge of the subject also has no effect?
He rejects the obvious reply, the difficulty of knowing which experts to trust:
For most controversial issues, it is a lot easier to judge who is an expert on X than it is to judge the truth of X itself.
One of the first things that struck me about our disagreement was that we were writing for different imagined audiences. I am writing for smart people. He is writing for ordinary people. The clearest evidence:
the experts typically have greater than average intelligence
So do the people I am writing for.
More than One Input
I have in mind controversial issues that have been publicly discussed, e.g., abortion, gun control, global warming.
One problem with Huemer’s argument is that an expert’s expertise rarely covers enough of the issue to derive a conclusion from his own knowledge alone. A climatologist knows more about climate than I do, so the expert consensus on what is going to happen to global temperature, assuming I can figure out what it is, is a better guess than I can produce for myself. But the climatologist has no expertise in economics, which is one of the things needed to work out the consequences of that change — and I do. He probably has little expertise in statistics; I am better off figuring out for myself, with the assistance of information from statisticians, whether to believe Michael Mann’s hockey stick diagram. His expertise in climatology tells him nothing about the effect of CO2 concentration on crop yields, a question it may not have occurred to him to think about. His conclusions about the consequences of climate change depend on quite a lot of things he is not an expert in. Once I have accepted his prediction of future climate, I have no reason to give special weight to his conclusions about its effect on human welfare or about the cost of reducing CO2 production, both of which the policy conclusions that people argue about depend on, so there is little reason to prefer his conclusions to mine.
That is one example that I happened to have looked at in considerable detail. The same would be true of gun control, where one of the main controversies (over concealed carry) was set off by an article by two economists offering evidence for an economic point I had made, years before, in my Price Theory. For the abortion controversy the issues are religious and moral questions on which, as best I can tell, there are no experts, at best people whose writing helps the reader think through the issue for himself.
Those are the three issues he starts his argument with.
Bias is the Rule, not the Exception
Huemer’s basic argument takes it for granted that the experts’ highest priority is scientific truth. For most people it isn’t. While he mentions the possibility of bias, it does not seem to have occurred to him that all three of the controversies he mentioned at the beginning of his post are currently subjects of political, ideological, and, in the case of abortion, religious disagreements. He, like me, has spent his life in academia and surely realizes that for many, perhaps most, academics, coming out on the wrong side of any of those three issues, the side that the people important to them reject, would be socially, perhaps professionally, costly.
His subject, after all, is “controversial issues.”
For a fourth example, consider the controversy over the origin of Covid. The relevant experts are virologists. Is there any doubt that most virologists do not believe, and very much want other people not to believe, that their colleagues, doing the sort of work they themselves do, were responsible for a pandemic that killed millions of people?
Experts on Both Sides
In case the experts disagree, you could try to figure out what most experts think, or what the best experts think, or something like that.
In the case of a controversial issue it is almost certain that there will be people with some claim to expertise on both sides. To decide which ones to trust, which are the best, you have to actually look at their arguments and figure out which ones are right — counting heads doesn’t do it.
Credentials don’t do it either. Paul Krugman has the ultimate credential, a Nobel prize. Back when he was an academic economist he had the same view of minimum wage laws as the rest of us, that they reduced employment for low-skill workers, and said so, having read and rejected the Card and Krueger article that some offer as evidence against that view. When he became a professional left-wing public intellectual his view reversed.
Judging Arguers as Well as Arguments
It does not require expertise in a technical field to recognize a bad or dishonest argument about that field. If someone offers a positive correlation between the existence of the death penalty and the murder rate in U.S. states as proof that the death penalty does not deter, without discussing the obvious problem of reverse causality (states with high murder rates are more likely to adopt or retain the death penalty), you know that he is either dishonest or intellectually incompetent, either of which is grounds to ignore his conclusion on the subject.

If you observe that an elementary textbook on climate science has a chapter on effects of climate change which does not mention a single positive effect, it does not take much knowledge of the field to conclude that the author is biased and to suspect, since the textbook is in its third edition, that the field is biased as well. That is a reason to ignore that author’s conclusions and to discount arguments based on how many climate scientists share them.

If you read in the Summary for Policy Makers (SPM) of the IPCC report that climate change is projected to increase the proportion of high-end tropical cyclones and then, looking in the body of the report, discover that the increase in the proportion of high-end cyclones is due not to a projected increase in their number but to a projected decrease in the number of low-end cyclones, you can conclude that the SPM is deliberately giving a biased presentation — and also have some reason to trust the body of the report. If you have read enough news coverage of climate issues to realize that reporters rarely read more than the SPM, you also have a reason to heavily discount news stories about climate change.
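To make the cyclone point concrete, here is a toy calculation; the numbers are made up purely for illustration and are not taken from the IPCC report.

```python
# Made-up counts: the share of high-end cyclones rises even though
# their projected number does not increase at all.
high_now, low_now = 20, 80         # hypothetical current counts
high_proj, low_proj = 20, 60       # high-end unchanged, low-end falls

share_now = high_now / (high_now + low_now)       # 20 / 100 = 0.20
share_proj = high_proj / (high_proj + low_proj)   # 20 / 80  = 0.25

print(f"Proportion of high-end cyclones: {share_now:.0%} -> {share_proj:.0%}")
```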
None of this requires expertise in either criminology or climate science, just a set of generally applicable intellectual tools that an educated individual should have and the willingness to cast a critical eye on what the experts write.
The Incentive Problem
Suppose that most people act as Huemer argues they should. There is now an incentive for partisans to try to control who counts as an expert — to deny funding, promotions, and publication to people on the other side of the controversy. The effect is both to bias the result of the approach Huemer argues for and to corrupt the scientific enterprise.
Michael Mann to Phil Jones on the subject of an article in Climate Research criticizing Mann’s work: “I think we should stop considering Climate Research as a legitimate peer‐reviewed journal. Perhaps we should encourage our colleagues … to no longer submit to, or cite papers in, this journal. We would also need to consider what we tell or request our more reasonable colleagues who currently sit on the editorial board.” (from the Climategate emails)
When a measure becomes a target, it ceases to be a good measure.
P.S. Max More has a post on the issue, focused on whether cryonic suspension will work, that demonstrates some of the problems with relying on experts.
A commenter points me at another Huemer post along similar lines that covers some of my points.
It turns out that Huemer was answering my emails; my failure to realize it was my own fault. He agrees that we were writing for different people and that my view is correct for the sufficiently intelligent; see the comments for more details.
Thanks for your thoughts, David. I think you are not receiving my email messages, but I'll just post a couple of initial reactions here.
One is that I don't think we have a large substantive disagreement, since I think you are giving advice for especially smart and knowledgeable people similar to yourself. I agree that it's better for you, David Friedman, to think issues through for yourself, in about the way you currently are. I just think there are very few David Friedmans in the world, and that the great majority of people (even the great majority who are interested in these kinds of controversial issues) will mess it up.
When I mentioned climate change, I was mostly thinking about the questions of (i) whether global warming is happening, and (ii) what is causing it. I think we agree that those questions are best addressed by consulting expert climate scientist opinion. I also agree with you that climate scientists are not experts on the human impacts of climate change (which depend on how humans will respond).
About the Krugman case: Your comments on that seem persuasive, but notice that you didn't try to present the economic arguments directly. You alluded to the fact that the great majority of economists ("the rest of us") think minimum wage reduces employment, you mentioned that even the earlier Krugman agreed with that view, then suggested a reason why his current position might be biased. To me, this sounds closer to my advice in the "critical thinking" post than yours.
"Trust the experts" has less appeal if you know something of the history of science, particularly applied science. I've read that early in the 19th century, a group of British experts went to Portugal, where there was an epidemic, found no visible evidence of any kind of harmful substance travelling from one person to another, and concluded that quarantine was based on Papist superstition. There is the famous case of Semmelweiss starting to clean his hands before delivering babies, and avoiding most cases of childbed fever---and being driven out of the profession by other doctors. There is the widespread support of doctors and biologists for eugenics in the early 20th century, including forced sterilization of large numbers of people in the United States. In all these cases, experts were giving practical and moral advice based partly on their specialized knowledge and partly on the value judgments common in their class---and were harmfully wrong. Should we suppose that our experts now are immune to that sort of error? That seems wildly optimistic, and is susceptible to becoming self-serving.