CO2 and Drought

One of the sloppier predictions about global warming is that it will cause massive droughts. We have certainly seen this line of reasoning over the last week as the media attempts to hang the blame for the Southern California fire damage on CO2, when in fact most of the blame lies with rapid home construction in areas known to have high fire danger.

I suppose the layman’s logic is as follows: it is usually hot when we have droughts, and it is hot in deserts, so therefore if the world gets hotter, we will have droughts and deserts. Of course, this logic is silly, but it is nonetheless prevalent (does no one remember that rain forests are hot too?).

In fact, one almost certain effect of global warming will be an increase in the evaporation rate of the oceans.  Megatons more water are put into the sky as air and ocean temperatures rise.  Presumably, much of this water will fall as rain somewhere, so it would probably be more logical to guess that warming would cause more rain rather than less.

As Steven Malloy points out, while temperatures have risen about 0.6C over the last century, rainfall in the US and Southern California has actually increased:

During the period 1900-2005, precipitation seems to have actually increased in areas above 30 degrees north latitude — including California and the rest of the U.S. — according to the most recent assessment from the United Nations’ Intergovernmental Panel on Climate Change.

This does not mean, of course, that droughts haven’t occurred in North America over the last 100 years, but it doesn’t support a link between rising global temperature and increased drought.

Examining the occurrence of drought in southern California since 1900 is also illuminating.

According to data maintained by the federal National Climatic Data Center, drought conditions are no stranger to southern California.

During the period 1900 to 2005, moderate-to-severe drought conditions occurred in Southern California during 34 of those 106 years — that is, about one-third of the time.

Comparing the southern California drought record against the global temperature record reveals the following:

— During the period 1900-1940, when most of the 20th century’s one-degree Fahrenheit temperature increase occurred, there were 7 years of moderate-to-severe drought.

— During the period 1941-1975, when global temperatures cooled, giving rise to concerns of a looming ice age, there were 11 years of moderate-to-severe drought.

— During the period 1976 to 1990, when global temperatures rose back to the 1940 level, there were 8 years of moderate-to-severe drought.

— Since 1991, when global temperatures rose slightly past the 1940 levels, there have been 7 years of drought.

In fact, just last week I posted drought maps that showed that while Southern California has had drought conditions over the last year…

[Map: 12-month Standardized Precipitation Index, through September 2007]

…they have had absolutely average rainfall over the last five years, and North America has been downright soggy:

[Map: 60-month percent-of-normal precipitation for the US, through September 2007]

On Monday, I will be releasing my climate video, tentatively titled "What is Normal?" A recurring theme in that video is our current inability to accept extremes in the weather as "normal," which in fact they are.

  • Jody

    What’s even worse is that the positive-feedback assumption in CO2-induced global warming relies on more water vapor entering the air.

  • gmee

    Doesn’t increased water vapor in the atmosphere buffer against warming? More clouds reflecting more light?

    Normal is what we live in and survive in at the moment. So if we’re living on a glacier, or sub-Sahara, that’s normal.

  • markm

    gmee: It depends… Clouds in the day-time have a cooling effect, but at night-time they hold the heat in very effectively. This summer in Michigan, it’s been very common to see clouds form at night (when the air cools a little), then dissipate soon after the sun comes up. So we gained heat in the day, cooled off to the dew point at night, and then stayed at the same temperature until sun-up. The next day, the sun would raise the temperature higher and pump more vapor into the air from Lake Michigan, so the dew point was higher and part of the added heat was retained. The cycle repeated, heating more every day until a cold front pushed through, with thunderstorms on its leading edge.

    OTOH, if the clouds had stayed through the day, they would have reflected most of the sun and there would have been a cooling trend. And you can have entirely different weather on one side of a hill than on the other side three miles away…

    So the effects of water vapor depend entirely on cloud formation and dissipation and local and daily weather cycles – and if I’ve been properly informed, the weather modeling programs don’t include clouds at all, let alone use a grid fine enough to capture local effects.

  • Jim Evans

    I am skeptical that man has contributed to global warming, and I understand in theory what you’re saying. But even with my skepticism, it’s hard for me to completely ignore that the area I live in is undergoing massive drought, unprecedented at least in the last century, with the outlook for the next six months being drier and warmer than usual. If I, a skeptic, wonder if there’s a connection, I’m sure that GW believers are absolutely convinced there’s a connection. And of course there are those who use the drought to further their argument and the hysteria about coming environmental catastrophe. Naturally, they are ignoring the fact that the drought here in the southeast is due in part to man’s misuse of water resources. Perhaps the dried-up lakes would have plenty of water in them if society had practiced better water resource conservation.

  • morganovich

    deserts are also very cold. Antarctica is the largest desert in the world.

    if you want to see a world with a drought ecology, look to the ice ages. the combination of water tied up in glaciers and lower global temperatures driving less evaporation left most of the rest of the globe in desert.

    back in the jurassic, when greenland was covered in boreal forest all the way to the poles, it got a lot more rain, as did the world as a whole.

    these days every bad thing but tooth decay seems to be attributed to “global warming” and i suspect it’s only a matter of time before the ADA weighs in on the topic…

  • I’m no AGW enthusiast, but even I can see some problems in the argument you are making here.

    1. It’s not really that hot in rainforests. They are rather mild because the moist air stabilizes the temperature while the leaves and clouds keep insolation down. Rainforest plants turn heat into chemical energy while desert rocks merely re-radiate that heat.
    http://earthobservatory.nasa.gov/Newsroom/NewImages/images.php3?img_id=4803
    Look at the Amazon, Congo, and Indian/Asian rainforest areas in the linked image; all are blue (below average).

    2. The cooling effect of clouds is modeled and not large, possibly even negative (and therefore warming). It has much to do with where the heat is radiated/reflected: high or low in the atmosphere.
    http://www.realclimate.org/index.php?p=212

    I don’t know of any attempts to reconcile this model with the previous insolation data set – any hints? An obvious one is to conclude the model is wrong, but that would only be an assertion without empirical backup (“no evidence of black swans” is not the same as “evidence of no black swans”).

    3. The Steven Malloy quote may be misleading since the time spans of the periods are unequal. The first (lowest average temperature) is 41 years, 1900-1940 inclusive, with 7 drought years (17.1%). The second is 35 years with 11 drought years (31.4%). The third is 8 out of 15 (53.3%). The most recent (highest average temperature), assuming it runs through 2007, is 7 out of 17 (41.2%). Looking only at the raw number of drought years in each period (7, 11, 8, 7), it looks like there was more drought in the “cool” past, but looking at the percentages (17.1, 31.4, 53.3, 41.2%), it looks like there is more drought in the “warm” present.

    4. The last map uses the post-1940 period as its baseline and finds – shockingly enough – that there isn’t much change. How would it compare to a baseline compiled before 1940 (if such exists)?
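    The period arithmetic in point 3 can be double-checked with a short script. A sketch only: the drought-year counts are taken from the Malloy excerpt quoted in the post, year spans are counted inclusively, and the open-ended final period is assumed to run through 2007.

    ```python
    # Moderate-to-severe drought years per period in Southern California,
    # per the Malloy excerpt quoted in the post.
    periods = {
        (1900, 1940): 7,
        (1941, 1975): 11,
        (1976, 1990): 8,
        (1991, 2007): 7,  # assumption: the open-ended period runs through 2007
    }

    for (start, end), drought in periods.items():
        span = end - start + 1          # inclusive count of years in the period
        pct = 100 * drought / span
        print(f"{start}-{end}: {drought}/{span} drought years ({pct:.1f}%)")
    ```

    Counting raw drought years favors the “cool” middle of the century; counting percentages favors the “warm” recent decades, which is the point being made here.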

    I recall reading somewhere that the official definition of drought is when rainfall is one-sixth the average. Since rainfall appears to vary around a mean in a bell curve distribution, you will have a drought roughly once every six years, or to put it another way, one-sixth of the country is probably always in a drought (I’m sure I’m fudging the statistics substantially here). The point is, I never get excited about this stuff because it is always in flux. The SW was in a severe drought two years ago, we were suffering flooding last year, and I believe we are in a slightly-below-average year this year, yet each has been claimed as a sign of AGW.

    Which leads me to the other point I would like to make, which is that Warren has hit on exactly one-half of a fallacy. The other half is the claim that all periods of high precipitation are *also* claimed to be signs of AGW. Similarly, both heat and cold waves are signs, the latter being claimed as AGW-induced “extreme” events. In Boolean terms, it looks like this:

    If A, then B
    If Not-A, then B

    Simplifies to

    If (A OR Not-A), then B

    The quantity (A OR Not-A) is always true

    Therefore, B (regardless of A)

    where B is “AGW is true” and A is “hot conditions exist” or “humid conditions exist” (and therefore Not-A is “cold …” or “dry …”)
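    The reduction can be verified mechanically with a quick truth table (a minimal sketch):

    ```python
    def implies(p, q):
        # Material implication: "if p then q" is false only when p is true and q is false.
        return (not p) or q

    for a in (True, False):
        for b in (True, False):
            # Asserting both "if A then B" and "if not-A then B" ...
            both = implies(a, b) and implies(not a, b)
            # ... is exactly the same as asserting B outright, whatever A is.
            assert both == b

    print("'If A then B' plus 'if not-A then B' reduces to just B.")
    ```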

    When someone finds that everything is evidence of that which they believe, that isn’t science, it’s dogma. Better to be honest and just say that you believe B to be true and dispense with irrelevant discussions of A and Not-A.

    There are some big problems with this the way I have presented it, but I have presented it essentially the way it is popularly argued by non-specialists, which is the problem Warren is addressing in the first place. One problem is with the definition of A and Not-A, and whether they are within a “normal” variation (what is normal?).

  • Mesa Econoguy

    Gee, since you’re no AGW enthusiast, quoting realclimate.org websites, how ‘bout you publish your data sets and multivariable models on a public drive?

  • happyjuggler0

    A drought is when there isn’t enough new water (usually defined by rainfall and capture) to replace the water used. The more water used, the harder it is to replace. Human population is growing, and they have to live somewhere, so that uses more water, making it harder for 100% water replacement to occur. Hence drought, and “severe drought”.

    Anyone who has studied economics even a little bit ought to have alarm bells ringing in his head whenever he hears the word “shortage”. In economics you learn that there is no such thing as a shortage, but rather that it is more properly framed as “a shortage at a price”. If there is a persistent shortage of something, then the price is too low.

    What is needed is private property rights in water. Failing that, due to concerns over “natural monopolies”, I suggest pricing water at or near replacement levels, assuming replacement came from desalination plants. Then put the new “profit” into a “permanent dividend fund” similar to Alaska’s and divide it up equally among every man, woman, and child citizen who resides in the area in question. You’ll have near-instant water conservation, and the “needy poor” will likely make money on the deal, so shut up already about the price hike being “regressive”. No need for government intrusion on how you water your lawn or how long you run a high-pressure shower. Consider this link entitled I Can Fix the Water “Shortage” in Five Minutes.

  • Gee, since you’re no AGW enthusiast, quoting realclimate.org websites, how ‘bout you publish your data sets and multivariable models on a public drive?

    I’m not sure what you mean. Are you really saying that if I’m a skeptic, I’m not allowed to read or note what the non-skeptics are saying, and that I must have my own data and models to share? You didn’t really read my whole comment, did you?