I wouldn’t say that I am a total sun hawk, meaning that I believe the sun and natural trends are 100% to blame for global warming. I don’t think it unreasonable to posit that once all the natural effects are unwound, man-made CO2 may be contributing a 0.5-1.0C-per-century trend (note this is far below alarmist forecasts).
But the sun almost had to be an important factor in late 20th century warming. Previously, I have shown this chart of sunspot activity over the last century, demonstrating a much higher level of solar activity in the second half than the first (the 10.8 year moving average was selected as the average length of a 20th century sunspot cycle).
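For readers curious about the mechanics, the smoothing described above is just a centered moving average over the monthly sunspot series, with a window of roughly 10.8 years (about 130 months). Here is a minimal sketch of that computation; the function name and the choice of a 130-month window are my own illustration, not the exact procedure used to build the original chart, and real sunspot data would have to be loaded separately.

```python
def centered_moving_average(values, window):
    """Centered moving average over a list of numbers.

    Returns a list the same length as `values`, with None wherever a full
    window does not fit (the ends of the series).
    """
    half = window // 2
    out = []
    for i in range(len(values)):
        lo = i - half
        hi = lo + window  # slice of exactly `window` points
        if lo < 0 or hi > len(values):
            out.append(None)  # incomplete window at the edges
        else:
            out.append(sum(values[lo:hi]) / window)
    return out

# For monthly sunspot counts, a ~10.8-year cycle corresponds to ~130 months:
# smoothed = centered_moving_average(monthly_sunspot_counts, 130)
```

A centered (rather than trailing) window keeps the smoothed curve aligned in time with the raw data, which matters when comparing it visually against a temperature series.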
Alec Rawls has an interesting point to make about how folks are considering the sun’s effect on climate:
Over and over again the alarmists claim that late 20th century warming can’t be caused by the solar-magnetic effects because there was no upward trend in solar activity between 1975 and 2000, when temperatures were rising. As Lockwood and Fröhlich put it last year:
Since about 1985,… the cosmic ray count [inversely related to solar activity] had been increasing, which should have led to a temperature fall if the theory is correct – instead, the Earth has been warming. … This should settle the debate.
Morons. It is the levels of solar activity and galactic cosmic radiation that matter, not whether they are going up or down. Solar activity jumped up to “grand maximum” levels in the 1940’s and stayed there (averaged across the 11 year solar cycles) until 2000. Solar activity doesn’t have to keep going up for warming to occur. Turn the gas burner under a pot of stew to high and the stew will heat. You don’t have to keep turning the flame up further and further to keep getting heating!
Update: A commenter argues that I am simplistic and immature in this post. I find this odd, I guess, for the following reason. One group tends to argue that the sun is largely irrelevant to the past century’s temperature increases. Another argues that the sun is the main or only driver. I argue that the evidence seems to point to it being a mix, with the sun explaining some but not all of the 20th century increase, and I am the one who is simplistic?
The commenter links to this graph, which I will include. It is a comparison of the Hadley CRUT3 global temperature index (green) and sunspot numbers (red):
Since I am so ridiculously immature, I guess I don’t trust myself to interpret this chart, but I would have happily used this chart myself had I had access to it originally. It’s wildly dangerous to try to visually interpret data and data correlations, but I don’t think it is unreasonable to say that there might be a relationship between these two data sets. Certainly not 100%, but then again the same could easily be said of the relationship of temperature to CO2. The same type of inconsistencies the commenter points out in this correlation could easily be made for CO2 (e.g., why, if CO2 was increasing, and in fact accelerating, were temps in 1980 lower than 1940?).
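One way to get past eyeballing the chart is to compute a Pearson correlation coefficient between the two series directly. The sketch below is generic and the sample numbers are purely illustrative placeholders, not actual HadCRUT3 or sunspot values; aligning the real series by year and choosing a smoothing window would be the hard part in practice.

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series.

    Returns a value in [-1, 1]: +1 for a perfect positive linear
    relationship, -1 for a perfect negative one, near 0 for none.
    """
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative only -- two made-up decadal series, not real climate data:
# r = pearson_r(smoothed_sunspots_by_decade, temp_anomaly_by_decade)
```

A high r would not settle causation, of course, any more than a high r between CO2 and temperature would; it just puts a number on what the eye is guessing at.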
The answer, of course, is that climate is complicated. But I see nothing in this chart that is inconsistent with the hypothesis that the sun might have been responsible for half of the 20th century warming. And if CO2 is left with just 0.3-0.4C warming over the last century, it is a very tough road to get from past warming to sensitivities as high as 3C or greater. I have all along contended that CO2 will likely drive 0.5-1.0C warming over the next century, and see nothing in this chart that makes me want to change that prediction.
Update #2: I guess I must be bored tonight, because commenter Jennifer has inspired me to go beyond my usual policy of not mixing it up much in the comments section. A lengthy response to her criticism is here.