63 Comments

Thanks for your thoughts, David. I think you are not receiving my email messages, but I'll just post a couple of initial reactions here.

One is that I don't think we have a large substantive disagreement, since I think you are giving advice for especially smart and knowledgeable people similar to yourself. I agree that it's better for you, David Friedman, to think issues through for yourself, in about the way you currently are. I just think there are very few David Friedmans in the world, and that the great majority of people (even the great majority who are interested in these kinds of controversial issues) will mess it up.

When I mentioned climate change, I was mostly thinking about the questions of (i) whether global warming is happening, and (ii) what is causing it. I think we agree that those questions are best addressed by consulting expert climate scientist opinion. I also agree with you that climate scientists are not experts on the human impacts of climate change (which depend on how humans will respond).

About the Krugman case: Your comments on that seem persuasive, but notice that you didn't try to present the economic arguments directly. You alluded to the fact that the great majority of economists ("the rest of us") think minimum wage reduces employment, you mentioned that even the earlier Krugman agreed with that view, then suggested a reason why his current position might be biased. To me, this sounds closer to my advice in the "critical thinking" post than yours.

author

Glad to hear from you. No idea what the email problem is — replying to my emails should automatically put the right email address on your replies.

I agree with you on the question of whether warming is happening and the cause, although I am less certain about the latter — climate is a very complicated system, and although the sign of the effect of additional CO2 is pretty clear on theoretical grounds, the magnitude is not, so it is possible that something else is responsible for much of the warming. But the important issue is what, if anything, should be done about climate change.
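
(Purely as an illustration of the sign-versus-magnitude point: under the common logarithmic approximation for CO2 forcing, warming from a doubling is the climate sensitivity itself, so the sign is fixed while the size tracks a sensitivity value that is uncertain. The sensitivity range in the sketch below is an assumed illustrative spread, not a figure from either post.)

```python
# Illustrative sketch only: warming from a doubling of CO2 under the
# common approximation delta_T ~= S * log2(C / C0), where S is the
# (uncertain) equilibrium sensitivity in degrees C per doubling.
# The sensitivity values are assumptions chosen to show the spread.
from math import log2

C0, C = 280.0, 560.0  # pre-industrial CO2 and a doubling, in ppm

for S in (1.5, 3.0, 4.5):  # low / central / high assumed sensitivity
    print(f"S = {S:.1f} C/doubling -> warming ~ {S * log2(C / C0):.1f} C")

# The sign is positive for any S > 0; the magnitude spans a factor of
# three across this assumed range, which is the point about uncertainty.
```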


Fair point. Perhaps there aren't any experts on what should be done. Or perhaps the experts are the people in the Copenhagen Consensus project.

I assume only that human activity is a significant contributor to the warming; I think that is the consensus. That's compatible with there being other significant contributors.

Maybe my messages have been spam-blocked?

author

It's my fault. They got sorted out of my Inbox into the Libertarian folder, a subfolder of a subfolder, which might explain why I didn't see the New Mail color. Found them now.


What about option 4: figure out what policy option is best for you, and choose to support those experts advocating the alleged truth that leads to the policy you want.

This might be the worst option, but I believe it is, in fact, the most common/popular in those cases where option 5 doesn't settle the matter.

Option 5: whatever my tribe claims, I'm loyal to it.


If you trust the experts, you'll (bear with me) think there's no such thing as global warming. That's if the first experts you find are the ones who say the false experts who believe in global warming are propagandists working for the carbon trader-legislative complex. If you find climate change supporters first, then you'll believe in climate change and think the "deniers" are the fake self-acclaimed experts. (David fits in neither role, complicating matters further. :-) )

I think there are roughly zero controversies for which this isn't the case. Ken Ham is an expert on the origin of species. There are even experts who will tell you that HIV does not cause AIDS, and that the "experts" claiming it does are the fakes. So, good luck believing the experts - if you can figure out who they are with anything less difficult than critical thinking!


"Trust the experts" has less appeal if you know something of the history of science, particularly applied science. I've read that early in the 19th century, a group of British experts went to Portugal, where there was an epidemic, found no visible evidence of any kind of harmful substance travelling from one person to another, and concluded that quarantine was based on Papist superstition. There is the famous case of Semmelweiss starting to clean his hands before delivering babies, and avoiding most cases of childbed fever---and being driven out of the profession by other doctors. There is the widespread support of doctors and biologists for eugenics in the early 20th century, including forced sterilization of large numbers of people in the United States. In all these cases, experts were giving practical and moral advice based partly on their specialized knowledge and partly on the value judgments common in their class---and were harmfully wrong. Should we suppose that our experts now are immune to that sort of error? That seems wildly optimistic, and is susceptible to becoming self-serving.


I have figured out that expert advice is generally better than doing your own research when certain conditions are met:

1. When experts have some real skin in the game for being right.

For example, I trust my car mechanic a lot more than Bill Nye. My car mechanic will go out of business if his expert opinion is consistently wrong.

It is easier to trust an expert talking about the gravitational field of the Moon than about the climate impact of human activity, because a person is unlikely to be emotionally/financially dependent on what that opinion is when it comes to the former.

However, as we move into more esoteric topics, it is generally harder to understand the incentive structure. For example, saying certain things about the Moon's gravity might create perverse incentives for that researcher to start the next project and receive more funding.


My test for how seriously to take the "experts" in a field: does being wrong cause the "experts" to exit the field?


Some random comments (not a single coherent argument).

1) What first came to my mind here was the idea of the fox and the hedgehog. I don't think "always trust the experts" is likely to be a good policy - but I feel the same way about "always examine the evidence for myself".

2) Does Michael Huemer follow his own advice? In particular, is this argument he's making based on the consensus of expert opinion about how best to make decisions? Or is this his own argument, the result of him examining the evidence personally?

3) It's one thing to "trust the experts". It's another thing to trust a soundbite summary of expert consensus. And that may be true even if the experts themselves produced the soundbites (intentionally dumbing down the topic to the level that they feel the average person can handle), let alone when the summary was produced by journalists.

4) Beware of experts attempting to convince "the average person". Or maybe just beware of experts attempting to convince. Some even admit that they've studied what the average person finds most convincing, and chosen to produce that, fallacies and all, rather than the sort of arguments that might convince those they regard as their peers.


A random response to "some random comments".

1- I recommend reading Huemer's articles on the subject. As he recently said in response to Friedman's blog post, they don't "have a large substantive disagreement". His arguments defend the thesis that we should prefer two strategies over the free-thinking way (strategy #3) when we're ***average people*** trying to answer controversial questions on which we're not experts.

Strategy #1: Credulity. "You canvass the opinions of a number of experts, and adopt the belief held by most of them. In the best case, you find a poll of the experts; failing that, you may look through several books and articles and identify their overall conclusions."

Strategy #2: Skepticism. "You give up on finding the answer, i.e., immediately suspend judgement."

2- Huemer seems to think that there are times when "the non expert" may have a "best chance of reaching a true conclusion" following strategy #3. As he said in his 2005 paper on Critical Thinking: "It seems, then, that the sort of consideration suggested favors the adoption of Critical Thinking only if (a) something about the experts renders them less able than ordinary people to implement the techniques of critical thinking, or (b) the experts have not generally tried to implement those techniques. There may well be cases in which one or the other of these conditions holds. If there are, and the lay person has good reason to believe he is dealing with such a case, then the approach of Critical Thinking is probably his best bet."
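
(As a purely illustrative aside on why Strategy #1 can beat solitary reasoning under favorable assumptions: if each expert is independently right with a probability above one half, the majority view of a panel is right more often than any single judge. The accuracies and panel size below are made-up numbers, not anything Huemer or Friedman claims.)

```python
# Hypothetical sketch of the "adopt the majority expert view" strategy.
# Each of n experts is assumed independently right with probability p;
# the lone reasoner is right with probability q. All numbers are assumed.
from math import comb

def majority_correct(n: int, p: float) -> float:
    """Probability that a majority of n independent judges is right (odd n)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n // 2 + 1, n + 1))

p_expert, q_self, n = 0.7, 0.55, 11
print(f"single expert         : {p_expert:.2f}")
print(f"majority of {n} experts: {majority_correct(n, p_expert):.2f}")
print(f"thinking it through   : {q_self:.2f}")

# Independence does the heavy lifting; correlated errors or a biased
# choice of "experts" (see the comments above) erode the advantage.
```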


*thoughtful*

As you recognized, I was responding entirely to David Friedman's post, and had never read anything by Michael Huemer.

There's a long answer in me trying to get out, but I'd need to actually read Huemer, and then give the ideas time to get properly absorbed and responses considered. Otherwise I'd just give you a "fast system" response, which would be about as likely to be useful as such responses generally are in unfamiliar situations ;(

Other than that, while I'm not a specialist in any currently controversial areas - or if I am, not a specialist in the precise sub-area - I'm smarter than the average bear, and often have relevant experience. I don't have a good mental model of a person who considers themselves neither intelligent nor expert, and suspect that while I could advise them (based on a bad mental model), they'd do well to reject my advice as inapplicable. I simply don't know what course of action would make sense *to them*, either a priori, or based on its results.


Here you can find Huemer's blog posts: https://fakenous.substack.com/p/is-critical-thinking-epistemically https://fakenous.substack.com/p/on-challenging-the-experts

On his website you can find his paper on epistemic responsibility and Critical Thinking. https://owl232.net/


>Does Michael Huemer follow his own advice? In particular, is this argument he's making based on the consensus of expert opinion about how best to make decisions? Or is this his own argument, the result of him examining the evidence personally?

a- Michael is not a regular person; he is brilliant.

b- He is an expert on epistemology.


He’s brilliant, but like all brilliant folk is able to lie to himself so well he believes his own lies: non-truths. David too, probably also you and me, as well as every expert.


Yes, none of us can deny that. But in general, experts are in a better (epistemic) position to find the truth. If you have a medical treatment A that cures the patient 50% of the time, and another treatment B with a reliability of 20%, you should choose A. Or do you think that you're generally in a better epistemic position than the experts to find the truth? It seems really plausible to me that it's generally better to follow the expert consensus or to be skeptical about the questions you're trying to answer.
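
(A minimal worked version of that comparison; the cure rates are the ones in the comment, and the patient count is an arbitrary number added only to make the expected difference concrete.)

```python
# Expected cures under each treatment. Rates come from the comment above;
# the patient count is an arbitrary illustration.
cure_rate = {"A": 0.50, "B": 0.20}
patients = 100

for name, rate in cure_rate.items():
    print(f"treatment {name}: expected cures out of {patients} = {rate * patients:.0f}")

print("choose treatment", max(cure_rate, key=cure_rate.get))
```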


When buying and selling are controlled by legislation, the first things to be bought and sold are legislators (P. J. O'Rourke). And when 'trust the experts' becomes a rule governing belief formation, the first thing to be governed is who counts as an "expert".

Goodhart's law deserves to be known more widely; I would argue it is one of the most fundamental laws in the social sciences. Here are my attempts to explain it:

https://triangulation.substack.com/p/harvards-plagiarism-scandal-reveals

https://triangulation.substack.com/p/goodharts-law-in-education

author

George Stigler has an essay on the subject that you would enjoy, probably in _The Intellectual and the Marketplace_. It's a story about an educational reformer who tries various ways of fixing academic incentives, each of which gets gamed.


I probably spent too much time trying to convince my graduate MPA students that it was impossible to implement a policy that couldn't be gamed. I had a series of "Laws" about this. Among them were: "Every law or policy has at least one loophole that allows it to be gamed. Corollary: writing a new law or policy aimed at closing a loophole at the very least creates a new loophole, whether it closes the old one or not. Second corollary: finding and exploiting loopholes pays off better than finding and closing loopholes, so there will be more loophole seekers/exploiters than loophole seekers/closers."

I advised them to try to figure out the loopholes in advance and see what level of exploitation they felt was worth living with and justifying.

I used Hurricane Katrina as an example. You can carefully vet everyone to see who 'deserves' and will use tax-dollar assistance wisely, and take two years to figure out who they are and how much to give, or you can load up credit cards with $2K and hand them out to everyone in the appropriate zip codes, knowing that at least 10% will go to the ineligible or crooked, or be used for silly things. Pick one and defend it. I was brutal that way. ;-)
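
(A back-of-envelope sketch of that trade-off. The card value, waste rate, and delay come from the comment; the household count and the assumed cost of waiting are hypothetical inputs, and different assumptions can flip the answer, which is the point of the exercise.)

```python
# Hypothetical comparison of the two relief policies described above.
# Household count and cost-of-delay are made-up inputs for the exercise.
households = 100_000        # assumed affected households
card_value = 2_000          # dollars per household, as in the comment
waste_rate = 0.10           # share misused under the fast policy, as in the comment
delay_years = 2             # vetting time under the careful policy, as in the comment
cost_of_waiting = 500       # assumed dollars per household per year of delay

fast_waste = households * card_value * waste_rate
slow_delay_cost = households * cost_of_waiting * delay_years

print(f"fast policy : ~${fast_waste:,.0f} lost to ineligible or silly uses")
print(f"slow policy : ~${slow_delay_cost:,.0f} lost to two years of waiting")
```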


Thanks David. Just downloaded the book. I believe you're referring to the first essay of the book. The reformer's name is Seguira.

author

Sounds right. I read it a long time ago. Stigler has several spoof essays, all good.


As a note on the utility of listing arguments against your own position: I think the chapter in The Machinery of Freedom where you went through potential problems with your ideas was what actually convinced me to start calling myself an ancap.


Are you sure Huemer rejects evidence and arguments in favor of authority? He relies on evidence and arguments all the time. I've never known him to appeal to his own authority as an expert. Experts seem to be least reliable concerning matters involving values--such as ethics, politics, and religion--or when they have other reasons to be biased. In many fields experts have no apparent reason to be biased, and in those fields--especially in fields that require expertise well beyond one's own capacity--one may have no reasonable alternative to relying on expert opinion.


It's difficult to trust "experts" when you have NGOs naming unqualified people as "experts" in order to further their agendas. Titles such as "Chief Scientist" are affixed to mouthpieces for pay.

From experience I know you can't trust anything out of the Rocky Mountain Institute. Yet a "report" (which was retracted a few days later) from RMI claiming that gas stoves cause asthma was used, and is still being used, as justification for a campaign against gas stoves by regulatory bodies in the US government.

Yet the RMI continues to be a go-to for media outlets seeking "neutral" experts.

Or consider the case of Mark Jacobson. He's been soundly criticized and defeated in court, yet Stanford University continues to support him. Why would I trust any "expert" from Stanford at this point? Being associated with Stanford is clearly no indication of quality. The university has demonstrated that it is no purveyor of reliable scientific discourse.


Two of my favorite thinkers debating one of the most important questions. What a joy! I hope this is a continuing debate. It might even be worth a podcast or live debate between you two.

The extreme historical example is Lysenkoism. Despite significant breakthroughs by Russian biologists in the late 19th and early 20th centuries, Lysenkoism nearly completely ruined an entire scientific field in Russia for a very long time, largely through the mechanisms David points out at the end.

This topic is all the more important given the use of scientific experts to justify global near-totalitarianism during the COVID epidemic. Expertise has been weaponized by governments and corporations, and I can sympathize with average IQ people starting to question expertise. Of course, people still need to act in the world, and it's all too easy for an average IQ person to fall prey to alt-experts.

Inculcating steelman thinking into people seems to be very difficult. It's not clear to me there's a great way out of this situation for the average person until the incentives are fixed, in the same way that Soviet biology only recovered once political pressures subsided.


It's worth noting, however, that some of what I was told in the 1970s was Lysenkoist rubbish has since become established consensus biology.

The experiences of the parents can and do affect the biology of children and even grandchildren, and it took some extremely solid evidence to get through the wall of "this is a crackpot idea". Look up the term "epigenetics" for more details.


While it's true that some conclusions of Lysenkoism have ended up partly true, the more concerning thing about it, in the context of this discussion, was its explicit denial of Mendelian genetics (so much so that geneticists were literally murdered). My point was that science can be heavily corrupted by incentives. Of course, Lysenkoism is an extreme example where disagreeing scientists were murdered. In the U.S., they're just called conspiracy theorists.


And their careers and platforms are "cancelled." That's a lot worse than name calling.


"If someone offers a positive correlation between the existence of the death penalty and the murder rate in U.S. states as proof that the death penalty does not deter without discussing the obvious problem of reverse causality ..."

I get that correlation does not tell us causation, but I don't know what you mean regarding reverse causality. Surely you don't mean that having the death penalty causes people to commit murder, so what is this obvious problem?


States with higher murder rates might be more likely to choose to have a death penalty?
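
(A small simulation of that mechanism, with made-up parameters: the death penalty is given zero deterrent effect, but states with higher underlying murder rates are more likely to adopt it, so the cross-state comparison still shows higher murder rates where the penalty exists.)

```python
# Illustrative simulation of reverse causality. The death penalty has NO
# effect on murder here; high-murder states are simply more likely to
# adopt it. All parameters are arbitrary choices for the sketch.
import random
import statistics

random.seed(0)
n_states = 50

latent = [random.uniform(1.0, 12.0) for _ in range(n_states)]  # underlying murder rates
has_dp = [random.random() < (0.8 if rate > 6.0 else 0.2) for rate in latent]
observed = [rate + random.gauss(0, 1.0) for rate in latent]    # note: no deterrence term

with_dp = statistics.mean(m for m, d in zip(observed, has_dp) if d)
without_dp = statistics.mean(m for m, d in zip(observed, has_dp) if not d)

print(f"mean murder rate, death-penalty states : {with_dp:.1f}")
print(f"mean murder rate, other states         : {without_dp:.1f}")
# A naive cross-sectional reading would call this evidence that the death
# penalty fails to deter, even though its causal effect here is exactly zero.
```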


Yes, that makes sense as a reverse causality.

But does it happen? And do the rates then drop to about the same level as in states without it?

So do we see a change in murder rate when death penalty laws come and go?


Ah, Richard "I realize social science is fake, but I still want to be a technocrat so I'm going to act like it's not" Hanania.


Cross examination of experts in court is interesting. It’s conducted by non-experts and judged by a non-expert. My impression over 30 years is that true experts are very rare, and even true experts only have expertise over a very narrow range. There are no experts as soon as the questions broaden out.


I listened to "dueling" experts for 23 days in a "bad baby" case. When reaching a verdict (unanimous, although we needed only 6/8) it was interesting to listen to how my jury mates evaluated the various experts. We had at least 4 on the jury who I believe would probably qualify as experts in their own fields, and they were the most hesitant to substitute expert testimony for their own good sense.

We also had a very tired 25-year-old (he worked all night, then attended the trial during the day), a quite nice, reasonably intelligent young man who swung like a weather vane, accepting the word of whichever expert's testimony was being discussed.

In the end, although I disagreed a bit about the outcome, I accepted the verdict and award as quite probably the best possible result. I was at that moment proud of our jury system and that jury. Looking back, I still am. They basically accepted the 'experts' as information, but not as determinative or final.


In my opinion that's how experts should be taken. I am a top expert within parts of my field, and somewhat of an expert within other parts. But if I don't keep up with the most recent changes or information, or if I'm tired or don't think through the conclusions in specific circumstances, I could be wrong. If I had to estimate, I'm right about stuff in my job 90-95% of the time overall, and 99%+ when I'm dealing with the topics I am best at, but maybe 70% on topics I'm not as sure about or weird corner cases, even within my normal job.

In most circumstances I double-check myself and do the necessary research to confirm what I thought was true actually is. Sometimes I find I made a mistake and was wrong, or that some specific intricacy of the situation was unusual enough to be different from normal. Sometimes other people notice and point out to me the error. This is okay, it happens, and there's no way to fully prevent it. Even experts are fallible. This experience tells me that it's both okay and necessary to double-check what even experts say, even in their own fields.
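
(A toy version of that arithmetic, using the comment's rough accuracy figures; the fraction of mistakes that a double-check catches is an assumed number, added only to show how review shifts the error rate.)

```python
# How double-checking changes the error rate. Base accuracies are the
# rough figures from the comment; the catch rate for a re-check is assumed.
check_catch_rate = 0.8  # assumed share of mistakes caught on review

for label, accuracy in [("best topics", 0.99), ("overall", 0.92), ("corner cases", 0.70)]:
    error = 1 - accuracy
    after_check = error * (1 - check_catch_rate)
    print(f"{label:12s}: {error:.1%} error unchecked -> {after_check:.1%} after a check")
```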


As an "average" person I believe there are few instances where pure expertise sways me. As others have noted, today's expert may well be tomorrow's fool, even in the most objective pursuit. In the back of my mind is "well, that's what is believed now". And when it comes to extremely complex multidisciplinary problems, like climate change, there can only be consolidators who evaluate many experts to form an opinion, much as I would, though likely with a higher capacity and more perseverance. For some of the "softer" issues, abortion for one, there is little legitimate expertise, only well-read opinion.


Agreed. I have pretty strong memories of different times in my life that archeologists have discovered some previously unknown city/culture/earliest human and changed what was taught about it. At no point did anyone acknowledge that they were *wrong* in what was taught before the change, nor acknowledge that they may be wrong about what they're saying now. Too many incentives to claim to be right, and too few to hedge appropriately.


I favor your side, David, although perhaps you are not so different from Michael if he is talking about people of average intelligence and you are not.

In some areas -- especially in fields that are new or not well established -- it can be tricky to figure out who the experts are. You may have to dig into relevant work to even figure that out. I wrote about the expertise issue in relation to cryonics:

https://biostasis.substack.com/p/who-are-the-experts-on-cryonics


Also relevant:

https://www.niemanlab.org/2021/08/trusting-science-leaves-people-vulnerable-to-believing-pseudoscience-new-research-finds/

“‘Importantly, the conclusion of our research is not that trust of science is risky but rather that, applied broadly, trust in science can leave people vulnerable to believing in pseudoscience.’ [The researchers] suggest a more sustainable solution for curbing misinformation is helping the public develop a type of scientific literacy known as methodological literacy. People who understand scientific methods and research designs can better evaluate claims about science and research, they explain.“
