Global Warming: The Hidden Assumption
On the face of it, there is no obvious reason why making the world a few degrees warmer would be a bad thing. Yet many people regard global warming not merely as a random element that might make things better or worse but as something obviously bad and obviously worth going to a lot of effort to prevent. Part of the reason, I think, is an unstated assumption: that, absent human intervention, climate is stable. Given that assumption, it seems natural enough to worry about the destabilizing effect of human actions such as increasing the amount of carbon dioxide in the atmosphere. We know the present situation is tolerable; who knows what change might bring?
That assumption is contradicted by massive geological evidence. As my geologist wife likes to point out, at various points in the past million years England has gone from being warm enough for hippos to live there to being buried under a mile of ice. And while the major glaciations are spaced out at intervals of about a hundred thousand years, they are separated by multiple smaller swings in climate. On an even shorter time scale, it appears that more than half of the temperature increase from 1600 to the present, at least in Europe, merely brought the temperature back up to where it was in 1100, before the start of the Little Ice Age.
If earth's climate is inherently unstable, with or without human interference, the argument that we should play safe by not interfering looks a lot weaker.
All of which raises a factual question to which I think I know the answer but am not sure. If we consider global warming in the context not of the effects of human action but of past swings in climate, which is more dangerous: the hot end of the range or the cold? Given that the cold end involved glaciers covering much of North America and northern Europe, my guess is that the cold end is worse, but I don't have any very detailed idea of just how hot the hot end got, or where.