23 Comments

“it may be in your interest to become a person who does not always act in your own interest.”

It is hard to say what is and isn’t in my interest, as this shows. The immediate consequence of an act may be something I would prefer, while its long-term effect on my subsequent opportunities makes it less appealing. This argues against thinking of my interest as a unified, consistent thing and for thinking of it instead as a bag of competing interests. But the fact that I must decide how to act narrows the scope: I can’t both do the thing and not do it. This doesn’t quite erase the distinction between altruism and selfishness, but it definitely complicates it.

In my view, your explanation of why virtue pays is the best available (though I am familiar with it from David Gauthier's /Morals by Agreement/, ch. VI). To rearrange the quote: if "you are engaged in voluntary interactions with people who correctly perceive what you will or will not do" then, yes, "it is in [your] selfish interest to be thought to be honest; the easiest way of achieving that result is to be honest". Sometimes the interest-maximizing behavioral disposition would have one do things that are not interest-maximizing. But it is pretty far from what people want a justification of virtue to look like...

For why engage in voluntary interactions at all? After all, a disposition to reckless aggression is likely to be very beneficial, especially if one is stronger than others, and even if it would sometimes lead one into pointless fights (though, if it really is effective, that won't happen all that often). And this sort of thing is no fiction, and in some people it is quite natural.

Further, even if we just stick with voluntary interactions, it does not follow that "it is in your self-interest to be committed to act in ways that maximize the summed benefit to the group of people with whom you are interacting". Two problems here.

First, a more plausible alternative, it seems to me, is that one's interactions with others should be calibrated to the sort of person one is dealing with: a disposition to exploit those weaker than oneself (to develop the first point, where "exploit" is defined in terms of the objective interests of each party), a disposition to deal "fairly" with equals in power (the case you seem to have most in mind), and a disposition to acquiesce to those who are stronger (the flip side of the first point, for, as Hobbes would put it, why kick against the pricks?).

And second, even if we are dealing with equals, I am not sure why a utilitarian outcome ("the summed benefit to the group of people") should be expected. A more plausible alternative, in my view, is for the "fair" outcome in this case to be determined by a would-be /bargain/ between the relevant parties, and that is unlikely to match the utilitarian outcome. A bargain is also more likely to result in "rights" being respected, since there is no tyranny of the majority when any party can walk away (again, assuming voluntary interactions).

So we get something like negative rights amongst people of roughly equal power and nous, but otherwise the strong and cunning exploiting the weak and naive. Better than nothing, I guess.
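
To make the bargain-versus-utilitarian point concrete, here is a toy numerical sketch (my own illustration, not the commenter's; the particular utility functions and the use of the Nash product as the bargaining solution are assumptions). With one party whose utility is linear and one whose utility is diminishing, maximizing summed utility hands nearly the whole surplus to the first party, while a bargaining solution splits it far more evenly.

```python
# Toy comparison of a utilitarian split vs. a Nash bargaining split.
# Two parties divide a surplus of 100; A has linear utility, B has
# diminishing (square-root) utility; the disagreement point is (0, 0).
import math

SURPLUS = 100.0

def u_a(x: float) -> float:          # A's utility from share x
    return x

def u_b(y: float) -> float:          # B's utility from share y
    return math.sqrt(y)

def best_split(objective) -> float:
    """Grid-search A's share (0..100 in steps of 0.1) that maximizes the objective."""
    shares = [i / 1000 * SURPLUS for i in range(1001)]
    return max(shares, key=objective)

# Utilitarian rule: maximize the summed utility.
util_x = best_split(lambda x: u_a(x) + u_b(SURPLUS - x))

# Nash bargaining rule: maximize the product of gains over the disagreement point.
nash_x = best_split(lambda x: u_a(x) * u_b(SURPLUS - x))

print(f"utilitarian split: A gets {util_x:.1f}, B gets {SURPLUS - util_x:.1f}")
print(f"Nash bargain:      A gets {nash_x:.1f}, B gets {SURPLUS - nash_x:.1f}")
```

Under these assumed utilities the utilitarian rule gives A roughly 99.7 of the 100, while the bargain, which each side must be willing to accept, gives A only about two thirds.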

I never understood why Newcomb's Paradox is supposed to be a difficult decision. Taking both boxes when the alien predicts you will take one gains you an extra $1,000 on top of $1,000,000, a 0.1% increase in your return and effectively zero compared to the error bars on whether the $1M will be there at all. Taking one box when the alien predicted two has an opportunity cost of only $1,000, but gains valuable information about the fallibility of the alien.
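
A quick expected-value calculation makes the asymmetry vivid (a sketch using the standard $1,000 / $1,000,000 payoffs; the predictor's accuracy p is a free parameter, not something specified in the comment):

```python
# Expected payoffs in Newcomb's problem as a function of predictor accuracy p.
SMALL = 1_000        # visible box, always available
BIG = 1_000_000      # opaque box, filled only if one-boxing was predicted

def expected_value(one_box: bool, p: float) -> float:
    """Expected payoff given the predictor is correct with probability p."""
    if one_box:
        # The big box is full iff the predictor correctly foresaw one-boxing.
        return p * BIG
    # Two-boxing: you always get the small box; the big box is full only
    # when the predictor wrongly expected you to one-box.
    return SMALL + (1 - p) * BIG

for p in (0.5, 0.9, 0.99, 1.0):
    print(f"p={p:.2f}  one-box: {expected_value(True, p):>12,.0f}   "
          f"two-box: {expected_value(False, p):>12,.0f}")
```

Only when the predictor is essentially a coin flip does two-boxing come out ahead; at any appreciable accuracy the extra $1,000 is noise.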

Seems to me the con man and the honest man share the same point of view in that neither is concerned with whether his self-directed action turns out to be maximally beneficial to him; i.e., both are willing to take risks. Virtue as a commitment strategy, on the other hand, is other-directed and seeks to minimize risk.

"Kim" has always struck me as the ultimate "in theory this, in practice that" fiction. If you count the sheer number of creepy middle-aged men of various marital states through whose hands he passes, the chances of him making it to puberty without being raped and then murdered are in practice, nil.

So you're saying that religious people are less likely to steal if they believe in an omnipresent, omnipotent G-d?

And other people are less likely to steal simply because they don't want to be known as the type of person who steals?

Theoretically, you convinced me that the first type of person is more trustworthy.
