Bruce Hall observes that climate forecasters probably need to adjust their confidence intervals. For example:
- NOAA predicted at the beginning of this season that there was an 85% chance of an above-normal hurricane season. In fact, the season was well below average. I haven’t done the math, but my guess is that if their forecast showed an 85% probability of above normal, the actual year probably fell in the bottom 1% of the forecast distribution.
- The UK Met Office predicted that there was a 60% probability that world temperatures in 2007 would be the highest in the last 100+ years, i.e., higher than temperatures in 1998. In fact, it looks like 2007 will be among the coolest years in decades, and will come in as much as a half degree C below 1998, a huge difference. Again, I have not run the numbers, but it is safe to say that this outcome would probably have been in the bottom 1% of the original forecast distribution.
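To see why an outcome like this is so damning, it helps to make the reasoning concrete. Here is a minimal sketch of the percentile logic, assuming (purely for illustration; these are not NOAA’s actual model, threshold, or numbers) that a forecast of named storms can be summarized as a normal distribution, with the stated "85% chance of above-normal" pinning down the forecast mean relative to the normal-season cutoff:

```python
from statistics import NormalDist

# Hypothetical parameters for illustration only (not NOAA's actual model):
threshold = 12.0   # assumed cutoff for a "normal" season, in named storms
sigma = 3.0        # assumed standard deviation of the forecast

# Back out the implied forecast mean from P(X > threshold) = 0.85:
# mean = threshold + sigma * z, where z is the 85th-percentile z-score.
mean = threshold + sigma * NormalDist().inv_cdf(0.85)

forecast = NormalDist(mu=mean, sigma=sigma)

# Where does a well-below-average season land in that distribution?
observed = 7.0     # assumed outcome, well below the normal cutoff
percentile = forecast.cdf(observed)
print(f"Observed season sits at the {percentile:.1%} percentile")
```

Under these assumed numbers the observed season falls well inside the bottom 1% of the forecast distribution, which is the shape of the argument being made here: a single such miss is bad luck, but a pattern of them means the stated confidence intervals were too narrow.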
If all your forecasts are coming out in the bottom 1% of the forecast range, then it is safe to assume that you are not forecasting very well. Which reminds me of Michael Mann, who said with famous confidence that there was a 95-99% probability that 1998 was the hottest year in the last 1000, which is an absurd claim. (Mann now denies having said this, but he is actually on film saying it, about 25 seconds into the linked clip.)