Monday 17 January 2011

Being Certain about Climate Change Uncertainty

By Martin Sewell, Senior Research Associate, 4CMR

There are aspects of climate change about which we are almost certain (the physical chemistry), and areas in which uncertainty is rife (e.g. the effect of clouds, the ocean, the response of biological processes, climate change mitigation). My view is that we must engage with uncertainty explicitly, and the best way to do so is with a probability distribution: the wider the distribution, the greater the uncertainty.
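
As a minimal illustration of that last point (all numbers here are purely hypothetical, not estimates of any real climate quantity), consider two distributions with the same best estimate but different spreads; the wider one concedes more uncertainty about the same quantity:

```python
# Two normal distributions with the same best estimate but different
# spreads; the wider one expresses greater uncertainty. All numbers
# are illustrative, not estimates of any real climate quantity.
from scipy.stats import norm

best_estimate = 3.0  # e.g. degrees C of warming (purely illustrative)

for name, sigma in [("narrow", 0.5), ("wide", 1.5)]:
    lo, hi = norm(loc=best_estimate, scale=sigma).interval(0.90)
    print(f"{name} distribution: central 90% interval = [{lo:.2f}, {hi:.2f}]")
```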

The 18th-century philosopher (and economist) David Hume pointed out that ‘even after the observation of the frequent or constant conjunction of objects, we have no reason to draw any inference concerning any object beyond those of which we have had experience’. In other words, one can never generalize beyond one’s data without making subjective assumptions, so science always involves a degree of uncertainty.

What is the best way of communicating uncertainty? In March 1951, the CIA secretly warned US officials that a Soviet attack on Yugoslavia should be considered a ‘serious possibility’. When Sherman Kent, a CIA intelligence analyst, asked his colleagues what probability they attached to an attack on Yugoslavia in 1951, he was shocked to receive responses ranging from 20% to 80%. In 1964 Kent wrote the seminal Words of Estimative Probability, in which he attempted to quantify qualitative judgements and eliminate what he termed ‘weasel’ words. For example, he recommended that ‘probable’ mean 63–87%, and ‘almost certain’ 87–99%. Since then, the BBC and the IPCC have also given serious consideration to how to communicate uncertainty. My view is that we should use probability.
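
Kent’s scale lends itself to a simple lookup table. A minimal sketch: the ‘probable’ and ‘almost certain’ bands are the ones quoted above, while the remaining entries are reconstructed from Kent’s published table and should be treated as indicative:

```python
# Kent's estimative scale as a lookup table. The 'probable' and 'almost
# certain' bands are the ones given in the post; the other entries are
# reconstructed from Kent's table and should be treated as indicative.
KENT_SCALE = {
    "almost certain":       (0.87, 0.99),
    "probable":             (0.63, 0.87),
    "chances about even":   (0.40, 0.60),
    "probably not":         (0.20, 0.40),
    "almost certainly not": (0.02, 0.12),
}

def estimative_phrase(p):
    """Return Kent's phrase for a point probability p, if one applies."""
    for phrase, (lo, hi) in KENT_SCALE.items():
        if lo <= p <= hi:
            return phrase
    return "no agreed phrase"   # Kent left gaps between some bands

print(estimative_phrase(0.75))  # -> probable
```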

Probability is conceptualised differently from discipline to discipline. Starting with the physics and chemistry, there is no significant doubt that CO2 is a greenhouse gas, or that levels of anthropogenic CO2 in the atmosphere are increasing. The theory of anthropogenic global warming is built on peer-reviewed science that has accumulated since Fourier in the 1820s. In 2004, Naomi Oreskes analysed 928 abstracts, selected using the keywords ‘global climate change’, published in refereed scientific journals, and found that none of the papers disagreed with the theory of anthropogenic climate change. According to the IPCC, it is extremely likely (>95% probability) that human activities have exerted a substantial net warming influence on climate since 1750, very likely (>90% probability) that anthropogenic greenhouse gas increases caused most of the observed increase in global average temperatures since the mid-20th century, and virtually certain (>99% probability) that global warming will continue in the future. The apparently counter-intuitive result that the past is less certain than the future implies that we are pretty sure of the physics, less sure of the historical measurements, and that the projected future rise in global temperature is steeper than the rise already observed.
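
Read the other way round, the IPCC terms quoted above amount to a threshold scale for turning a probability into a phrase. A minimal sketch, using only the thresholds quoted in this post (the input probabilities are purely illustrative, and the full IPCC guidance defines further categories):

```python
# The IPCC likelihood thresholds quoted in this post, as a simple scale.
# The input probabilities below are purely illustrative.
def ipcc_term(p):
    if p > 0.99:
        return "virtually certain"
    if p > 0.95:
        return "extremely likely"
    if p > 0.90:
        return "very likely"
    return "below 'very likely'"

for claim, p in [("attribution of past warming", 0.93),
                 ("continued future warming", 0.995)]:
    print(f"{claim}: {ipcc_term(p)}")
```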

Climate sensitivity is a measure of how responsive the temperature of the climate system is to a change in radiative forcing. It is usually expressed as the temperature change associated with a doubling of the concentration of CO2 in the Earth’s atmosphere. The distribution of climate sensitivity estimates has a positive skew: the compounding effect of essentially linear feedbacks dominates the system’s sensitivity, and because the uncertainty in those feedbacks does not diminish with time, the estimates are not expected to improve. So in terms of climate sensitivity, we have a situation where we are pretty certain of the uncertainty.
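
One standard way to see why the sensitivity distribution is skewed (e.g. Roe and Baker, 2007) is that if the net feedback factor f is roughly normally distributed, the sensitivity S = S0 / (1 - f) inherits a long right tail, because values of f approaching 1 amplify S without bound. A minimal sketch, with illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(0)

S0 = 1.2                # no-feedback sensitivity, deg C per doubling (illustrative)
f = rng.normal(0.65, 0.13, size=100_000)  # net feedback factor (illustrative)
f = f[f < 1.0]          # f >= 1 would be a runaway; excluded from the sketch

S = S0 / (1.0 - f)      # feedbacks amplify the no-feedback response

print(f"median sensitivity: {np.median(S):.1f} C")
print(f"mean sensitivity:   {np.mean(S):.1f} C  (mean > median: right skew)")
print(f"95th percentile:    {np.percentile(S, 95):.1f} C")
```

Note that a symmetric (normal) uncertainty in f produces a markedly asymmetric uncertainty in S, which is the sense in which the skew is structural rather than an artefact of any particular study.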

Moving to economics, it seems clear that economic models contain greater uncertainty than climate models. Economic forecasting is notoriously difficult, largely because we can’t predict human creativity in innovation. Energy-environment-economy (E3) models, the type that can capture the interdependencies we need to make fully informed decisions, contain greater uncertainty still.

Let me illustrate this with an example: two popular solutions to the problem of how to mitigate climate change are a carbon tax and emissions trading (cap and trade). In theory the two are equivalent, but they are logically opposed regarding where the uncertainty lies: a carbon tax fixes the rate of taxation and allows emissions to vary, whilst emissions trading fixes emissions and allows the cost of compliance to vary. The physical scientist’s solution would be to fix emissions (implying a preference for emissions trading), whilst the economist’s solution would be to internalize the externality with a carbon tax; however, because we introduce greater uncertainty when considering variations in human behaviour, my preference lies with taxation. Of this, at least, I am certain!
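
A toy model makes the asymmetry concrete (all names and numbers here are hypothetical): suppose the marginal benefit of emitting is linear in emissions and subject to random cost shocks, and that firms emit until that marginal benefit equals the carbon price. A tax pins down the price and lets emissions absorb the shocks; a cap pins down emissions and lets the permit price absorb them:

```python
import numpy as np

rng = np.random.default_rng(1)

# Marginal benefit of emitting (equivalently, the marginal abatement cost
# at the margin): MB(e) = a - b*e + shock. Firms emit until MB equals the
# carbon price. All parameters are hypothetical.
a, b = 100.0, 1.0
shocks = rng.normal(0.0, 10.0, size=10_000)  # cost uncertainty

# Carbon tax: the price is fixed at t, so emissions vary with the shock.
t = 40.0
emissions_under_tax = (a + shocks - t) / b

# Cap and trade: emissions are fixed at the cap, so the permit price varies.
cap = 60.0
price_under_cap = a + shocks - b * cap

print(f"tax: price fixed at {t}, emissions std = {emissions_under_tax.std():.1f}")
print(f"cap: emissions fixed at {cap}, price std = {price_under_cap.std():.1f}")
```

Under these (symmetric) assumptions the two instruments simply relocate the same variance, which is why the choice between them turns on where uncertainty is more tolerable.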

4 comments:

  1. On that basis, since none of the extreme weather during the age of CO2 obsession has been historically abnormal in frequency, strength, or normalized damage levels, can we say that the probability of dangerous climate change is ~0%?

  2. "none of the extreme weather during the age of CO2 obsession has been historically abnormal in frequency, strength, or normalized damage levels"
    I would argue that during the last 30 years there have been many examples of extreme weather, for example the 2003 European heat wave. On this basis alone, it would be higher than 0%.

  3. I will have to agree with Frontiers; there hasn't been an extreme weather event in many years. The English Hail storm of 1087 AD was an extreme weather event.

  4. Ian - in terms of the UK, the Great Storm of 1987, which occurred on the night of 15-16 October 1987, might be considered an extreme weather event.

    As with smoking and cancer, it is impossible to prove the cause of one event (a smoker dies of lung cancer), but easy to prove the probable cause of an increased number by assessing many such events (20-a-day smokers are XX% more likely to die of lung cancer than those who don't smoke).
