Using Computer Models To Launder Certainty

(cross posted from Coyote Blog)

For a while, I have criticized the practice, in both climate and economics, of using computer models to increase our apparent certainty about natural phenomena. We take shaky assumptions and guesstimates of certain constants and natural variables and plug them into computer models that produce projections with triple-decimal precision. We then treat the output with a reverence that does not match the quality of the inputs.

I have had trouble finding precisely the right words to explain this sort of knowledge laundering. But this week I was presented with an excellent example from climate science, courtesy of Roger Pielke, Sr. This is an excerpt from a recent study trying to figure out whether a high climate sensitivity to CO2 can be reconciled with the lack of ocean warming over the last 10 years (bold added).

“Observations of the sea water temperature show that the upper ocean has not warmed since 2003. This is remarkable as it is expected the ocean would store the lion’s share of the extra heat retained by the Earth due to the increased concentrations of greenhouse gases. The observation that the upper 700 meters of the world ocean have not warmed for the last eight years gives rise to two fundamental questions:

  1. What is the probability that the upper ocean does not warm for eight years as greenhouse gas concentrations continue to rise?
  2. As the heat has not been stored in the upper ocean over the last eight years, where did it go instead?

These questions cannot be answered using observations alone, as the available time series are too short and the data not accurate enough. We therefore used climate model output generated in the ESSENCE project, a collaboration of KNMI and Utrecht University that generated 17 simulations of the climate with the ECHAM5/MPI-OM model to sample the natural variability of the climate system. When compared to the available observations, the model describes the ocean temperature rise and variability well.”

Pielke goes on to deconstruct the study, but just compare the two bolded statements. First, there is not sufficiently extensive and accurate observational data to test the hypothesis. BUT, we can create a model, validate that model against this same observational data, and then use the model to draw all kinds of conclusions about the problem being studied.

This is the clearest, simplest example of certainty laundering I have ever seen.  If there is not sufficient data to draw conclusions about how a system operates, then how can there be enough data to validate a computer model which, in code, just embodies a series of hypotheses about how a system operates?

A model is no different from a hypothesis embodied in code. If I have a hypothesis that the average width of neckties in this year’s Armani collection drives stock market prices, creating a computer program that predicts stock market prices falling as ties get thinner does nothing to increase my certainty in this hypothesis (though it may be enough to get me media attention). The model is merely a software implementation of my original hypothesis. In fact, the model likely has to embody even more unproven assumptions than my hypothesis, because in addition to assuming a causal relationship, it also has to be programmed with specific values for the strength of that relationship.
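To see how little the software adds, here is a toy sketch of the necktie "model" (every name and number below is invented): the precise-looking output is just the assumed sensitivity constant echoed back at us.

```python
# A deliberately silly "model": the causal assumption restated in code.
# Running it cannot confirm the assumption it was built from.

def tie_width_model(tie_width_cm, sensitivity=-120.0, baseline=10_000.0):
    """Predict a stock index level from necktie width.

    `sensitivity` is the assumed (unproven) strength of the causal link;
    it is an input to the model, not something the model demonstrates.
    """
    return baseline + sensitivity * (tie_width_cm - 8.0)

# Thinner ties -> higher predicted index, with spurious 3-decimal precision.
for width in (9.0, 8.0, 7.0):
    print(f"tie width {width} cm -> predicted index {tie_width_model(width):.3f}")
```

However precise the printout, every digit is traceable to the `sensitivity` value we typed in; no new evidence about the tie-price link has been created.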

This is not just a climate problem.  The White House studies on the effects of the stimulus were absolutely identical.  They had a hypothesis that government deficit spending would increase total economic activity.  After they spent the money, how did they claim success?  Did they measure changes to economic activity through observational data?  No, they had a model that was programmed with the hypothesis that government spending increased job creation, ran the model, and pulled a number out that said, surprise, the stimulus created millions of jobs (despite falling employment).  And the press reported it like it was a real number.

Postscript: I did not get into this in the original article, but the other mistake the study seems to make is to validate the model on a variable that is irrelevant to its conclusions. In this case, the study seems to validate the model by showing it correctly simulates past upper ocean heat content numbers (you remember, the ones that are too few and too inaccurate to validate a hypothesis). But the point of the paper seems to be to understand whether what might be excess heat (if we believe the high sensitivity number for CO2) is going into the deep ocean or back into space. And I am sure I can come up with a number of combinations of assumptions that match the historic ocean heat content numbers. The point is finding the right one, and to do that requires validation against observations of deep ocean heat and of radiation to space.
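The postscript's point can be illustrated with a toy heat-budget sketch (all fractions below are invented, not from the study): two calibrations that reproduce the upper-ocean record identically while disagreeing completely about where the rest of the heat went. Matching the upper-ocean record alone cannot choose between them.

```python
# Invented numbers throughout: a 1 W/m^2 imbalance split three ways.
def heat_budget(imbalance, frac_to_deep, frac_to_space):
    """Return (upper-ocean, deep-ocean, escaped-to-space) shares of a heat imbalance."""
    upper = imbalance * (1.0 - frac_to_deep - frac_to_space)
    return upper, imbalance * frac_to_deep, imbalance * frac_to_space

# Two hypothetical calibrations tuned to the SAME upper-ocean history:
a = heat_budget(1.0, frac_to_deep=0.5, frac_to_space=0.1)  # "heat hid in the deep ocean"
b = heat_budget(1.0, frac_to_deep=0.1, frac_to_space=0.5)  # "heat escaped to space"

print(a)  # identical upper-ocean term (0.4)...
print(b)  # ...but opposite deep-ocean / space terms
```

Only observations of the variables the calibrations disagree on (deep ocean heat, radiation to space) could distinguish the two.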

169 thoughts on “Using Computer Models To Launder Certainty”

  1. renewable guy:

    NetDr:

    So far the CAGW hysteria is short on testable results.

    By predicting everything they predict nothing. Saying that every weather event is climate is silly.

    So inadvertently your name speaks a profound truth.

    ###################################

    http://www.skepticalscience.com/empirical-evidence-for-global-warming.htm

    The skeptic argument…
    There’s no empirical evidence
    “There is no actual evidence that carbon dioxide emissions are causing global warming. Note that computer models are just concatenations of calculations you could do on a hand-held calculator, so they are theoretical and cannot be part of any evidence.” (David Evans)

    What the science says…
    Direct observations find that CO2 is rising sharply due to human activity. Satellite and surface measurements find less energy is escaping to space at CO2 absorption wavelengths. Ocean and surface temperature measurements find the planet continues to accumulate heat. This gives a line of empirical evidence that human CO2 emissions are causing global warming.
    **********
    Recent studies show that the particular isotope of CO2 attributed solely to mankind is also produced naturally. As the oceans warm they hold less CO2. Exciting news !

    The RATE OF WARMING is what is significant. Non engineers think it is a Boolean function 1/0 but it isn’t. The “C” in CAGW requires a RATE OF WARMING.

    We have never seen that rate even for a short time so far in recorded temperatures.

    The 1978 to 1998 run up was only 1.2 ° C per century.

    Even the oceans’ warming has “paused” [stopped] according to the Met Office.

    http://www.metoffice.gov.uk/news/releases/archive/2011/ocean-warming

    So both the atmosphere and the oceans are no longer warming?

  2. Renewable

    The truth shall set you free.

    Hansen’s model scenario “B” predicted:

    http://cstpr.colorado.edu/prometheus/archives/hansenscenarios.png

    1988 = .4 [measured]

    2010 = 1.0 predicted

    So predicted warming was .6 ° C.

    Actual Warming.

    1988 = .4 [measured]

    2010 = .63

    http://data.giss.nasa.gov/gistemp/graphs/Fig.A2.txt

    Actual warming [scenario “B”] .23

    So predicted / actual = .6 / .23 = 260%. So predicted warming was 260% of actual warming, and you claim it was close??????????

    I don’t understand the verbal tap dance about 1984. Skeptical science has no respect for truth.

    The prediction was made [or shown to congress] in 1988 and the temperature was known at that time. Hansen should have known that the climate is a negative feedback system and a high is always followed by a low to drive toward the “set point”. [Like 1998 high was followed by 1999 and 2000]

    Ignorance is no defense when you claim to understand climate well enough to predict 100 years in the future.

    Sounds like more humma humma to me.

    Skeptical Science is notoriously lousy at math and logic.

    A child could do better.

  3. Still no response from

    Yeah, Everyone’s Opinion is Equally Valid in Science:

    The sock puppet has been sacked.

  4. NetDr:

    Did you consider the average trend line from Hansen’s projections?

    The five year mean at 2008 was .55
    1988 mean was .25
    This would suggest about .21C/decade

    A trend line would put Hansen’s average projection at about .8 to .85

    This would suggest about .28 to .30/decade.

    .28 – .21 = .07

    .07/.21 x 100% = 33%

    Your denier math and logic is very good.

    Not so good for your science math and logic.

    I sense cherry picking here.

    Hansen guessed high on CO2 emissions and climate sensitivity.
    Like I said, Hansen got the trend right. When you change the values of his model to present-day knowledge, the model he used gets even closer to present-day temp observations.

  5. The more data the better. There are biases in the argo data that your link doesn’t discuss. It’s starting to work its way through straightening out the data and its meaning.

    http://www.skepticalscience.com/Ocean-Cooling-Corrected-Again.html

    Upper ocean warming (0-700 metres) is slower than that observed during the 1990s, but the oceans are still gaining heat. Indeed, the slow-down is to be expected if recent papers on increased reflective aerosols in the atmosphere are correct.

  6. Lindzen vs Hansen. Hansen wins hands down. Lindzen’s point of view does not even come close to matching the temperature record.

  7. renewable guy:

    NetDr:

    Did you consider the average trend line from Hansen’s projections.

    The five year mean at 2008 was .55
    1988 mean was .25
    This would suggest about .21C/decade
    *********
    My calendar says 2011, how about yours?

    Let’s talk about the present, or is that “cherry picking”?

  8. Renewable

    Why do you keep trying to prove that Hansen’s model was doing pretty well in 2007?

    That is cherry picking.

    YOU ARE IN DENIAL. Dr Hansen’s model is doing poorly as of 2010 and 2011 will be even worse.

    Jan – July = 45 41 57 54 42 50 = .48 average

    I can hardly wait for the entire year to be done; Dr Hansen’s model will look far worse than it does now.

  9. netdr:
    renewable guy:

    NetDr:

    Did you consider the average trend line from Hansen’s projections.

    The five year mean at 2008 was .55
    1988 mean was .25
    This would suggest about .21C/decade
    *********
    My calendar says 2011, how about yours?

    Let’s talk about the present, or is that “cherry picking”?

    ################################

    I can see that it’s not to your advantage to look at the trend. If I were you I would do the same.

    But………………… I’m not you.

    As I have said before, it’s the trend. And if you look at the further projections by Hansen, it flattens out for a while.

    Next year is an El Niño year, which is predicted to be a record temperature year.

    3C is on track by the end of the century.

    http://en.wikipedia.org/wiki/File:Risks_and_Impacts_of_Global_Warming.png

    3 degrees centigrade gets us into the 1 thru 4 category.

    From what I have read of the IPCC they are conservative. This will have a higher chance of being stronger than they say.

  10. netdr:
    Renewable

    Why do you keep trying to prove that Hansen’s model was doing pretty well in 2007?

    That is cherry picking.

    YOU ARE IN DENIAL. Dr Hansen’s model is doing poorly as of 2010 and 2011 will be even worse.

    Jan – July = 45 41 57 54 42 50 = .48 average

    I can hardly wait for the entire year to be done, Dr Hansen’s model will look far worse than it does now.

    ############################

    You are actually making my case, Net. It’s the long term trends that the deniers like to ignore, and that is what Hansen definitely has right. It’s really not about being totally predictive of exact temperatures. I’ve never made that case. It’s your strawman point.

    It is your choice to ignore the long term trend.

  11. TheChuckr:
    renewable guy:
    The Chuckr:
    No GCM has shown skill in predicting long-term climate trends. Pielke Sr. has written numerous papers on this topic, as have others; or maybe Dessler, Romm, and Schmidt don’t take Pielke Sr. seriously, either.

    #################
    source?

    http://pielkeclimatesci.wordpress.com/category/climate-models/

    ########################

    Do you know what models do well and what they don’t do well?

  12. TheChuckr:
    The science community knows a fraud when they see one. That is why Spencer has been criticized for the model in his paper, which rang through the echo chamber of the conservative media.

  13. renewable

    Alarmists think reality is a crutch.

    My calendar says 2011 and if that is an uncomfortable fact so be it.

  14. *****Do you know what models do well and what they don’t do well? ********
    **********
    They don’t predict future warming well at all.

    In fact they now call them “projections”. [which policy should not be based upon ?]

    They are good for printing out reams of paper to burn on a cold winter night. [Due to global warming ?]

  15. http://en.wikipedia.org/wiki/Attribution_of_recent_climate_change

    Some scientists do disagree with the consensus: see list of scientists opposing global warming consensus. For example Willie Soon and Richard Lindzen[36] say that there is insufficient proof for anthropogenic attribution. Generally this position requires new physical mechanisms to explain the observed warming.[37]

    ###########################

    PDO anyone?

  16. http://en.wikipedia.org/wiki/Attribution_of_recent_climate_change

    Attribution of recent climate change is the effort to scientifically ascertain mechanisms responsible for recent changes observed in the Earth’s climate. The effort has focused on changes observed during the period of instrumental temperature record, when records are most reliable; particularly on the last 50 years, when human activity has grown fastest and observations of the troposphere have become available. The dominant mechanisms (to which recent climate change has been attributed) are the result of human activity. They are:[1]

    increasing atmospheric concentrations of greenhouse gases

    global changes to land surface, such as deforestation

    increasing atmospheric concentrations of aerosols.

    There are also natural mechanisms for variation including climate oscillations, changes in solar activity, variations in the Earth’s orbit, and volcanic activity.

    Attribution of recent change to anthropogenic forcing is based on the following facts:

    The observed change is not consistent with natural variability.
    Known natural forcings would, if anything, be negative over this period.
    Known anthropogenic forcings are consistent with the observed response.
    The pattern of the observed change is consistent with the anthropogenic forcing.

    ###############

    The scientists have laid out their case for why the earth is warming over mostly the last 50 years.

  17. http://en.wikipedia.org/wiki/Attribution_of_recent_climate_change

    Recent reports from the Intergovernmental Panel on Climate Change (IPCC) have concluded that:

    “Most of the observed increase in globally averaged temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations.”;[2] It is extremely unlikely (<5%) that the global pattern of warming during the past half century can be explained without external forcing (i.e., it is inconsistent with being the result of internal variability), and very unlikely that it is due to known natural external causes alone. The warming occurred in both the ocean and the atmosphere and took place at a time when natural external forcing factors would likely have produced cooling.[3]

    "From new estimates of the combined anthropogenic forcing due to greenhouse gases, aerosols, and land surface changes, it is extremely likely that human activities have exerted a substantial net warming influence on climate since 1750."[1]

    "It is virtually certain that anthropogenic aerosols produce a net negative radiative forcing (cooling influence) with a greater magnitude in the Northern Hemisphere than in the Southern Hemisphere."[1]

    ##############

    The panel defines "very likely," "extremely likely," and "virtually certain" as indicating probabilities greater than 90%, 95%, and 99%, respectively.[1] The IPCC's attribution of recent global warming to human activities is a view shared by most scientists,[4][5]:2 and is also supported by a number of scientific organizations (see scientific opinion on climate change).

  18. http://en.wikipedia.org/wiki/Attribution_of_recent_climate_change

    Estimates of internal variability from climate models, and reconstructions of past temperatures, indicate that the warming is unlikely to be entirely natural.

    Climate models forced by natural factors and increased greenhouse gases and aerosols reproduce the observed global temperature changes; those forced by natural factors alone do not.[17]

    “Fingerprint” methods indicate that the pattern of change is closer to that expected from greenhouse gas-forced change than from natural change.[18]

    The plateau in warming from the 1940s to 1960s can be attributed largely to sulphate aerosol cooling.[19]

  19. Renewable

    You keep thinking it is 2007.

    The TREND as measured by 5 year averages is:

    http://cstpr.colorado.edu/prometheus/archives/hansenscenarios.png

    1988 = .25
    Predicted 2010 = 1, 1, 0.85, 0.85, 0.85 = .91 average

    [I picked the values off the graph, you might disagree slightly.]
    Actual 2010 = 0.55 0.58 0.44 0.57 0.63 = .55 average

    So predicted = .91-.25 = .66

    Actual was .55 – .25 = .3

    Ratio = .66/.3 = 220%, more than double what actually happened

    YOU ARE IN DENIAL !

    As of 2011 Hansen’s model is worthless.

    There was a simulated volcano in the model which depressed temperatures prior to 2010 so the more of those years you pull into your TREND the better he looks. As you can see using 5 year averages doesn’t make the model look any better.

    Pretend it is 2007 again!

    I can see why you prefer those cherries ! Your TRENDS all stop at 2007 which everyone agrees was when Dr Hansen’s model looked best.

    Since then warming has stopped when the model predicted it would increase sharply.

  20. http://en.wikipedia.org/wiki/Attribution_of_recent_climate_change

    Attribution requires demonstrating that a signal is:

    unlikely to be due entirely to internal variability;

    consistent with the estimated responses to the given combination of anthropogenic and natural forcing

    not consistent with alternative, physically plausible explanations of recent climate change that exclude important elements of the given combination of forcings.

    Detection does not imply attribution, and is easier to show than attribution. Unequivocal attribution would require controlled experiments with multiple copies of the climate system, which is not possible. Therefore, attribution, as described above, can only be done within some margin of error. For example, the IPCC’s Fourth Assessment Report says

    “it is extremely likely that human activities have exerted a substantial net warming influence on climate since 1750,” where “extremely likely” indicates a probability greater than 95%.[1]

  21. Estimates of internal variability from climate models, and reconstructions of past temperatures, indicate that the warming is unlikely to be entirely natural.

    Climate models forced by natural factors and increased greenhouse gases and aerosols reproduce the observed global temperature changes; those forced by natural factors alone do not.[17]
    **********
    That is the lamest reason to believe in global warming that has ever been advanced.

    The models don’t even pretend to include clouds or ocean cycles or aerosols, all of which radiate heat away from earth.

    That is like simulating a car without a radiator and predicting it will overheat.

  22. NetDr:

    Climate is about 30 year trends. Why are you focused on 2010?

    Gavin Schmidt is better at this than I am. 3rd graph down.

    http://www.realclimate.org/index.php/archives/2009/12/updates-to-model-data-comparisons/

    The trends are probably most useful to think about, and for the period 1984 to 2009 (the 1984 date chosen because that is when these projections started), scenario B has a trend of 0.26+/-0.05 ºC/dec (95% uncertainties, no correction for auto-correlation). For the GISTEMP and HadCRUT3 data (assuming that the 2009 estimate is ok), the trends are 0.19+/-0.05 ºC/dec (note that the GISTEMP met-station index has 0.21+/-0.06 ºC/dec). Corrections for auto-correlation would make the uncertainties larger, but as it stands, the difference between the trends is just about significant.

    0.26 - 0.21 = 0.05; 0.05/0.21 x 100% = 24% high

    0.26 - 0.19 = 0.07; 0.07/0.19 x 100% = 37% high

    Whichever you want to compare to, the trends are upward in both the temperature record and the model.
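For what it is worth, the percentage arithmetic in this comment checks out; a short script using the trend values quoted above reproduces it:

```python
# Reproducing the comparison above, using the trend values
# (degrees C per decade) quoted from the RealClimate post.
scenario_b = 0.26
observed = {"HadCRUT3": 0.19, "GISTEMP met-station": 0.21}

for name, obs in observed.items():
    pct_high = (scenario_b - obs) / obs * 100
    print(f"Scenario B runs {pct_high:.0f}% above {name}")
```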

  23. netdr:
    Estimates of internal variability from climate models, and reconstructions of past temperatures, indicate that the warming is unlikely to be entirely natural.

    Climate models forced by natural factors and increased greenhouse gases and aerosols reproduce the observed global temperature changes; those forced by natural factors alone do not.[17]
    **********
    That is the lamest reason to believe in global warming that has ever been advanced.

    The models don’t even pretend to include clouds or ocean cycles or aerosols, all of which radiate heat away from earth.

    That is like simulating a car without a radiator and predicting it will overheat.

    ############################

    Dr Dessler has confirmed cloud feedback on short-term data as mostly positive feedback.

    Care to explain your position?

    What do you know that they don’t?

    Are the scientists careless and reckless with their studies?

    Do all models not simulate clouds?

    Ocean cycles aren’t included? Can you show that?

    There is also empirical evidence besides models. Disproving models, if you are able, does not make global warming go away.

    Doubting does not explain why we are warming.

  24. “it is extremely likely that human activities have exerted a substantial net warming influence on climate since 1750,” where “extremely likely” indicates a probability greater than 95%.[1]

    ######################

    This is the IPCC consensus. Substantial to me means way more than half of the temp increase is from human influence.

    This decade will be warmer than the last decade. CO2 pretty much guarantees that.

  25. NetDr
    http://www.grida.no/publications/other/ipcc_tar/?src=/climate/ipcc_tar/wg1/007.htm

    Figure 4: Simulating the Earth’s temperature variations, and comparing the results to measured changes, can provide insight into the underlying causes of the major changes.

    A climate model can be used to simulate the temperature changes that occur both from natural and anthropogenic causes. The simulations represented by the band in

    (a) were done with only natural forcings: solar variation and volcanic activity. Those encompassed by the band in

    (b) were done with anthropogenic forcings: greenhouse gases and an estimate of sulphate aerosols, and those encompassed by the band in

    (c) were done with both natural and anthropogenic forcings included.

    From (b), it can be seen that inclusion of anthropogenic forcings provides a plausible explanation for a substantial part of the observed temperature changes over the past century, but the best match with observations is obtained in

    (c) when both natural and anthropogenic factors are included.

    These results show that the forcings included are sufficient to explain the observed changes, but do not exclude the possibility that other forcings may also have contributed. The bands of model results presented here are for four runs from the same model. Similar results to those in (b) are obtained with other models with anthropogenic forcing. [Based upon Chapter 12, Figure 12.7]

    #######################################

    This is all hindcasting.

  26. netdr:
    renewable

    Alarmists think reality is a crutch.

    My calendar says 2011 and if that is an uncomfortable fact so be it.

    ######################################

    Fill me in, Net. What are you talking about?

  27. netdr:
    *****Do you know what models do well and what they don’t do well? ********
    **********
    They don’t predict future warming well at all.

    In fact they now call them “projections”. [which policy should not be based upon ?]

    They are good for printing out reams of paper to burn on a cold winter night. [Due to global warming ?]
    #####################

    You might get it some day or you will take denial to your grave. Your choice.

    You and the scientists are on different wavelengths. Possibly you should correct them and show them how to do it right.

  28. netdr:
    renewable guy:

    NetDr:

    Did you consider the average trend line from Hansen’s projections.

    The five year mean at 2008 was .55
    1988 mean was .25
    This would suggest about .21C/decade
    *********
    My calendar says 2011, how about yours?

    Let’s talk about the present, or is that “cherry picking”?

    ############################
    Mine says 2011 also. Climate is more than the average temperature of 2010. The purpose wasn’t to know the exact temperature at a future date. It’s to show the trend of human influence on the climate.

  29. Renewable Guy:

    Two statements of yours:

    1. “The five year mean at 2008 was .55 … This would suggest about .21C/decade”
    2. “Climate is about 30 year trends.”

    If the climate is about 30 year trends (and it is), stop using a 5 year trend to say Hansen was not that far off from reality.

    What you are trying to do with Hansen is worse than cherry-picking. Cherry-picking is merely taking several ways of looking at the same data and choosing the one most favorable to your point, preferring it to other ways without a good reason, just because. What you are doing is *inventing* an invalid way to look at the data and preferring it to valid ways. That’s so wrong, I don’t know what to call it.

  30. Sorry, the above should be not a 5 year trend, but a non-trend. Renewable takes two 5 year means placed 20 years apart and calls the straight line between these means a trend. His method is highly dependent on the choice of end points and the window width, and does not account for the middle values, which is why this is not how scientists compute trends.
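The difference between the two methods is easy to demonstrate on synthetic data (the series below is invented: a small linear trend plus noise). A least-squares fit uses every point, while the endpoint method depends entirely on the two windows chosen:

```python
# Sketch comparing the criticized endpoint method with an ordinary
# least-squares trend. Synthetic, invented data throughout.
import random

def endpoint_trend(xs, ys, window=5):
    """Difference of first and last `window`-point means (the criticized method)."""
    first = sum(ys[:window]) / window
    last = sum(ys[-window:]) / window
    span = (sum(xs[-window:]) - sum(xs[:window])) / window  # distance between window centres
    return (last - first) / span

def ols_trend(xs, ys):
    """Ordinary least-squares slope, using all the points."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)

random.seed(42)
years = list(range(1984, 2011))
temps = [0.02 * (y - 1984) + random.gauss(0, 0.1) for y in years]  # 0.02/yr + noise

print(f"OLS trend:      {ols_trend(years, temps):.3f} per year")
print(f"endpoint trend: {endpoint_trend(years, temps):.3f} per year")
```

On noiseless data the two agree; with noise, the endpoint estimate swings with whatever happens to land in the two chosen windows, which is the fragility being criticized.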

  31. Renewable Guy:

    “If you choose to read the article, there is discussion about the argo float data and the accuracy of short term trends. There is a section called the more data the better.”

    Of course, I did read the paper. Yes, there is discussion about the argo float data and the accuracy of short term trends. You seem to imply that this discussion contains something that casts doubt or even invalidates the findings in the paper, but this is not the case. The discussion is this:

    [Argo floats] This monthly dataset (Fig. 1) uses only data from the Argo array of profiling floats. Heat content is evaluated down to 900 m depth.

    [The accuracy of short term trends] The objective is to estimate the linear trend in heat content. However, there is an obvious one year periodicity in the data (Fig. 1a) as noted by Willis et al. (2008b). Proper assessment of trend needs to take this into account, especially when the data are over a 4.5-year interval.

    So what? The first is a simple statement of what the data was. The second is an equally simple statement that the data contains a clear periodical signal and computing a trend should take this into account. The paper did take this into account (look up the word “sinusoidal” in text).

    Now, the main result of the paper is this:

    The 95% confidence intervals on the trend are from -0.148 x 10^22 to -0.550 x 10^22 J/yr. This result clearly excludes warming as a possible interpretation of this data. Examination of residuals from the model fit shows no evidence of nonlinearities, indicating a constant linear cooling trend.

    So, what was that “If you choose to read …” about? The implication that “discussion about the argo float data and the accuracy of short term trends” somehow weakens the paper is wrong.
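For readers curious what "taking the annual cycle into account" looks like, here is a minimal sketch on synthetic data (the trend, amplitude, and noise level are invented, not the paper's values): the heat-content series is regressed on a constant, a linear term, and an annual sinusoid, so the yearly cycle does not alias into the trend estimate.

```python
# Synthetic sketch of "linear trend + annual cycle" regression.
# All numbers are invented; only the structure of the fit matters.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0, 4.5, 1 / 12)  # 4.5 years of monthly data

# True (invented) process: a cooling trend plus a one-year sinusoid plus noise.
ohc = -0.35 * t + 0.8 * np.sin(2 * np.pi * t) + rng.normal(0, 0.1, t.size)

# Design matrix: [1, t, sin(2*pi*t), cos(2*pi*t)]
X = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
coef, *_ = np.linalg.lstsq(X, ohc, rcond=None)

print(f"fitted trend: {coef[1]:.3f} per year (true value in this sketch: -0.35)")
```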

    “The long term trend is still a warming earth.”

    Where does this come from? The paper says otherwise.

  32. Renewable Guy:

    On von Schuckmann, 2011, from skepticalscience:

    “The warming trend observed is slightly smaller than that seen in Von Schuckmann (2009), where the authors measure down to ocean depths of 2000 metres, and found a warming trend of 0.77 ±0.11 watts per square metre. However, it completely refutes a recent (2010) skeptic paper which suggested the oceans were cooling, based on the upper ocean down to 700 metres. Clearly much heat is finding its way down into deeper waters. And although small in comparison, the deep ocean is gaining heat too.”

    Sorry, no.

    So, there are papers that show that the oceans are cooling and papers that show that the oceans are warming, based on the same data. I understand that folks like skepticalscience want to immediately assume their case and label the first kind of papers unscientific, but the proper way to do this is to look into the data and the analysis done to it, and find out exactly why the two kinds of papers come to different results from the same data. That’s how science works.

    Now, I might not be a climate scientist, but when I look into von Schuckmann, 2011, I see that they throw away what potentially is quite a bit of original data by using the following procedure: the data points are averaged into boxes, then the values in boxes with less than 10 data points are thrown away and replaced with the spatial mean. The paper notes that replacing the thrown away data with zeros (which seems to make sense, since we are talking about “anomalies”) is wrong because doing this “results in an underestimation of the global trend”.

    This might be OK, but, I have to say, this does look like something worth exploring further due to the smell of circular logic again (the anomalies are not zero, because zeros are inconsistent with the assumed global trend, and then: the global trend computed from all anomalies, including those that could have been filled with zeros but weren’t, coincides with the assumed global trend).

    Regardless, if the above procedure of throwing away data and replacing it with approximations is really the reason why the two kinds of papers come to different results as regards ocean heat trends, this absolutely has to be explored further. If the raw data says “cooling”, but the averaged, filtered, backfilled with interpolations data says “warming”, that’s a fundamental conflict. You can’t resolve it by saying that the raw data is wrong.
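A toy sketch (all numbers invented) of the backfill choice described above shows how the global mean depends on it: sparse boxes filled with the spatial mean of the well-sampled boxes pull the average one way, while zero-filling pulls it another.

```python
# Toy illustration, invented numbers: how the backfill rule for sparse
# grid boxes changes the global-mean anomaly.
def global_mean(boxes, fill, min_obs=10):
    """Average box-mean anomalies; sparse boxes get the `fill` treatment."""
    good = [sum(v) / len(v) for v in boxes if len(v) >= min_obs]
    filler = sum(good) / len(good) if fill == "spatial_mean" else 0.0
    vals = [sum(v) / len(v) if len(v) >= min_obs else filler for v in boxes]
    return sum(vals) / len(vals)

# Three well-sampled boxes averaging 0.2, plus one sparse (2-point) box:
boxes = [[0.2] * 12, [0.3] * 15, [0.1] * 11, [0.05, 0.05]]

print(round(global_mean(boxes, fill="spatial_mean"), 6))  # sparse box inherits 0.2 -> 0.2
print(round(global_mean(boxes, fill="zero"), 6))          # sparse box counts as 0 -> 0.15
```

Neither choice is "the data"; both are modeling decisions, which is exactly why the divergence between raw and backfilled results needs to be explained rather than assumed away.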

    All in all, the existence of both papers that show that the oceans are cooling and papers that show that the oceans are warming is exactly what I was arguing when I said that there is no definitive research that allows one to ultimately conclude that ‘It’s the thermal lag’. If you beg to differ, Renewable, I am listening.

  33. Finally, this:

    “No Malcolm, it was so wrong it couldn’t make it into the quality journals. They don’t publish far right propaganda.”

    …is a cop out. We are talking about science. If you want to say that a particular paper is bad science, go ahead and point out the flaws in that paper. Insisting that a paper is bad science because it has been published in a wrong journal is more often than not a sign that you can’t find anything else wrong with it.

  34. renewable guy:
    TheChuckr:
    renewable guy:
    The Chuckr:
    No GCM has shown skill in predicting long-term climate trends. Pielke Sr. has written numerous papers on this topic, as have others; or maybe Dessler, Romm, and Schmidt don’t take Pielke Sr. seriously, either.

    #################
    source?

    http://pielkeclimatesci.wordpress.com/category/climate-models/

    ########################

    Do you know what models do well and what they don’t do well?

    August 6, 2011, 6:39 pm

    Temperature trend but not magnitude, not rainfall, not ocean heat content, not sea level rise, and not regional phenomena.

    renewable guy:
    TheChuckr:
    The science community knows a fraud when they see one. That is why Spencer has been criticized for the model in his paper, which rang through the echo chamber of the conservative media.

    August 6, 2011, 6:43 pm

    Ad homs, as usual when you cannot challenge the science. Spencer is not the only scientist who has come to the conclusion that cloud feedbacks are likely negative and that there is a large discrepancy between the “satellite observations and IPCC models in their co-variations between radiation and temperature”, causing the IPCC models to overestimate global warming (you can look up references yourself).

    You also conveniently forget that the CAGW cabal have actively worked to prevent publication of articles that disagree with the “consensus”, even to the point of trying to remove the editors of journals that allow such articles to be published. If their science was any good, they wouldn’t have to defend their consensus view by such actions, which amount to scientific censorship.

  35. Renewable

    You are in DENIAL about Dr Hansen’s flawed model.

    As I showed above, even using 5 year averages to establish the TREND, he is wrong by 220%.

    The mumbo jumbo Skeptical Science engages in is childish drivel. I have pointed out their errors many times and they go [snip].

  36. renewable guy:

    netdr:
    renewable guy:

    NetDr:

    Did you consider the average trend line from Hansen’s projections?

    The five year mean at 2008 was .55
    1988 mean was .25
    This would suggest about .21C/decade
    *********
    My calendar says 2011, how about yours?

    Let’s talk about the present, or is that “cherry picking”?

    ############################
    Mine says 2011 also. Climate is more than the average temperature of 2010. The purpose wasn’t to know the exact temperature at a future date. It’s to show trends with human influence on the climate.

    **********************************
    A 5 year trend is a fair way to establish a trend.

    It guards against a particularly hot El Nino year or cold La Nina year from skewing the data.

    Since the model was presented to Congress in 1988, there are only 23 years of data available to evaluate.

    Starting your trend before this date is sloppy thinking.

    Skeptical Science’s rationale for using 1984 is laughable.

    Using 1984 is cherry picking at its finest, and unfair because the data from 1984, ’85, ’86, and ’87 was already known in 1988. [Of course the model matches it!] Sloppy thinking, sloppy sloppy.

  37. A 5 year trend is a fair way to establish a trend
    S/B
    A 5 year average is a fair way to establish a trend.

  38. Renewable

    Skeptical Science’s humma humma about trends is junk.

    The actual trend from 1988 to present is .0152 per year. [No crisis in sight, is there?]

    http://www.woodfortrees.org/plot/hadcrut3vgl/from:1988/to:2012/plot/hadcrut3vgl/from:1988/to:2012/trend

    #Least squares trend line; slope = 0.0152067 per year

    The projected warming [using 5 year averages] was .032 per year.

    [1988 5 year avg: .388; projected 2010 5 year avg: .744]

    Dr Hansen’s model predicts 200% of actual warming. That is not skillful, despite the attempts of Skeptical Science to cherry pick and mislead you.
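    The comparison being made here can be sketched in a few lines. This is a minimal illustration: the slope figures are the ones quoted in this thread taken at face value, and `ols_slope` is a generic helper, not the woodfortrees code:

```python
# Sketch of the two numbers being compared above: an ordinary
# least-squares slope (the statistic woodfortrees reports) versus a
# projected slope, expressed as a ratio of projected to observed.
def ols_slope(years, values):
    """Ordinary least-squares trend, in units per year."""
    n = len(years)
    mx = sum(years) / n
    my = sum(values) / n
    num = sum((x - mx) * (y - my) for x, y in zip(years, values))
    den = sum((x - mx) ** 2 for x in years)
    return num / den

# On perfectly linear data the estimator recovers the slope exactly:
years = list(range(1988, 2012))
linear = [0.0152 * (y - 1988) for y in years]
trend = ols_slope(years, linear)  # ~0.0152 per year

# Ratio of the projected slope to the observed slope quoted above:
projected, observed = 0.032, 0.0152
ratio = projected / observed  # ~2.1, i.e. just over double
```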

    Twenty years from now, when temperatures are even lower than today, the good ship Global Warming will have long since sunk, but you will still be in DENIAL.

    Al Gore’s rantings at Aspen show that his cause is leaking badly. Damaged by too much skepticism!

  39. Renewable

    Here is what doesn’t convince anyone not already a believer.

    Denning’s Views on Pitfalls to Avoid

    An example of “what doesn’t work” in speaking with audiences such as those at the Heartland conference, Denning wrote, “is the condescending argument from authority that presumes that the Earth’s climate is too complicated for ordinary people to understand, so that they have to trust the opinions of experts.”

    [As I always said: “People who let others think for them because they think they aren’t capable of it ARE ABSOLUTELY RIGHT! They know their limitations.” — Netdr]

    “Appeals to ‘overwhelming scientific consensus’ are more likely to confirm the audience’s suspicions of some kind of nefarious conspiracy than to change minds,” [Group think isn’t nefarious, but it is illogical. — Netdr] Denning wrote, and “even the concept of peer review can sound sinister.” [Peer review by like-minded individuals isn’t peer review at all, is it? — Netdr]
    Trust a skeptic to do a good job of finding your errors. Just ask Dr Mann.

  40. BTW

    The Skeptical Science method of analyzing Dr Hansen’s model performance was an amazing example of cherry picking and twisted thinking. If I made a model and evaluated it that way, my boss would fire me.

    How they thought it was fair to start at 1984 is beyond comprehension.

    A fair method is to use 5 year averages to avoid spikes, then evaluate predicted vs actual.

    By that method Hansen’s model was an epic fail! 220% wrong.

  41. Malcolm:
    Sorry, the above should be not a 5 year trend but a non-trend. Renewable takes two 5 year means placed 20 years apart and calls the straight line between these means a trend. His method is highly dependent on the choice of end points and the averaging radius, and does not account for middle values, which is why this is not how scientists compute trends.

    #########################

    I dropped that method and went with one already done by Gavin Schmidt. Taking a five year average is more in tune with the average temp than a single year.
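    The difference between the two estimators under discussion can be illustrated with synthetic numbers (illustrative only, not HadCRUT3 or GISS data; both helper functions are sketches):

```python
# Illustration (synthetic numbers, not HadCRUT3) of why a line through
# two endpoint means is not a trend: it ignores every value in the
# middle of the series, while a least-squares fit uses them all.
def ols_slope(years, values):
    """Ordinary least-squares trend, in units per year."""
    n = len(years)
    mx, my = sum(years) / n, sum(values) / n
    num = sum((x - mx) * (y - my) for x, y in zip(years, values))
    den = sum((x - mx) ** 2 for x in years)
    return num / den

def endpoint_slope(years, values, window=5):
    """Slope of the line through the first and last `window`-year means."""
    fy, ly = sum(years[:window]) / window, sum(years[-window:]) / window
    fv, lv = sum(values[:window]) / window, sum(values[-window:]) / window
    return (lv - fv) / (ly - fy)

years = list(range(1988, 2012))
base = [0.015 * (y - 1988) for y in years]

# Perturb a single mid-series year (say, a large 2005 spike):
spiked = base[:]
spiked[years.index(2005)] += 0.3

# The endpoint method is completely blind to the change, because 2005
# falls in neither averaging window; the least-squares slope responds.
assert endpoint_slope(years, spiked) == endpoint_slope(years, base)
assert ols_slope(years, spiked) != ols_slope(years, base)
```

    The endpoint estimate is unchanged by anything that happens between the two windows; the least-squares fit uses every year.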

  42. Malcolm:
    Finally, this:

    “No Malcolm, it was so wrong it couldn’t make it into the quality journals. They don’t publish far right propaganda.”

    …is a cop out. We are talking about science. If you want to say that a particular paper is bad science, go ahead and point out the flaws in that paper. Insisting that a paper is bad science because it has been published in the wrong journal is more often than not a sign that you can’t find anything else wrong with it.
    #########################

    http://www.skepticalscience.com/spencers-misdiagnosis-of-surface-temperature-feedback.html

    http://www.skepticalscience.com/just-put-the-model-down-roy.html

    I had already read these and I believe I had talked about them earlier on here. If not, you can go look for yourself.

    Climate Progress also did some of their own work criticizing Roy’s science. Roy is a well known denier working for right wing think tanks. It’s very easy for me to scorn him. He just isn’t really honest. He claims his paper is science, but it is really a story to bolster denial.

  43. http://www.skepticalscience.com/Ocean-Cooling-Corrected-Again.html

    Covering ocean depths to 2000 meters is more thorough coverage than 700 meters. If you read the Skeptical Science article, there is a large amount of uncertainty in the ARGO data because it is so new.

    Getting the ocean temperature data down to lower uncertainties is important right now and will take time. It would be unusual for the atmosphere to be heating up and the ocean cooling.

    Ocean warming in context
    The warming trend observed is slightly smaller than that seen in Von Schuckmann (2009), where the authors measure down to ocean depths of 2000 metres, and found a warming trend of 0.77 ±0.11 watts per square metre. However, it completely refutes a recent (2010) skeptic paper which suggested the oceans were cooling, based on the upper ocean down to 700 metres. Clearly much heat is finding its way down into deeper waters. And although small in comparison, the deep ocean is gaining heat too.

    Upper ocean warming (0-700 metres) is slower than that observed during the 1990s, but the oceans are still gaining heat. Indeed, the slow-down is to be expected if recent papers on increased reflective aerosols in the atmosphere are correct.

    Conclusion
    The ARGO network was completed in November 2007, and only since then has the network been able to provide more robust short-term trends. Over the period 2005-2010 the oceans (10-1500 meters down) have gained heat at a rate of 0.55 watts per square meter, but error uncertainty is almost 20%. Uncertainty will reduce as the length of the observational record increases, but Von Schuckmann and Le Traon (2011) caution that this is provided no more systematic errors remain in the network.

  44. Do you know what models do well and what they don’t do well?

    August 6, 2011, 6:39 pm

    Temperature trend but not magnitude, not rainfall, not ocean heat content, not sea level rise, and not regional phenomena.

    ###################################

    Can you show me your sources on this?

    ###################################

    renewable guy:
    TheChuckr:
    The science community knows a fraud when they see one. That is why Spencer has been criticized for the model in his paper, which rang through the echo chamber of the conservative media.
    #############
    http://www.skepticalscience.com/just-put-the-model-down-roy.html
    #################
    August 6, 2011, 6:43 pm

    Ad homs, as usual when you cannot challenge the science. Spencer is not the only scientist who has come to the conclusion that cloud feedbacks are likely negative and that there is a large discrepancy between the “satellite observations and IPCC models in their co-variations between radiation and temperature”, causing the IPCC models to overestimate global warming (you can look up references yourself).

    #############################

    I need to know if you are just asserting an opinion or basing it on some fact. If you are basing it on Coyote Blog, that would be a poor source of information. This goes to credibility. Many deniers such as yourself don’t like to expose themselves to highly credible sources.

    #######################

    We are following the worst case scenario of the IPCC.

    ######################

    You also conveniently forget that the CAGW cabal have actively worked to prevent publication of articles that disagree with the “consensus”, even to the point of trying to remove the editors of journals that allow such articles to be published. If their science was any good, they wouldn’t have to defend their consensus view by such actions, which amount to scientific censorship.

    ##########################

    Could you give an example?

  45. netdr:
    Renewable

    You are in DENIAL about Dr Hansen’s flawed model.

    As I showed above, even using 5 year averages to establish the TREND, he is wrong by 220%.

    The mumbo jumbo Skeptical Science engages in is childish drivel. I have pointed out their errors many times and they go [snip].
    ##################

    http://www.realclimate.org/index.php/archives/2009/12/updates-to-model-data-comparisons/

    You have such a narrow focus, which is what denialists do.

    I’m going with Gavin Schmidt’s analysis.

  46. netdr:
    BTW

    The Skeptical Science method of analyzing Dr Hansen’s model performance was an amazing example of cherry picking and twisted thinking. If I made a model and evaluated it that way, my boss would fire me.

    How they thought it was fair to start at 1984 is beyond comprehension.

    A fair method is to use 5 year averages to avoid spikes, then evaluate predicted vs actual.

    By that method Hansen’s model was an epic fail! 220% wrong.

    ############################################

    You have given your opinion but have failed to make your point.

    There are other years which have 0% difference. Should Hansen’s model be evaluated by those years also?

  47. netdr:
    Renewable

    Here is what doesn’t convince anyone not already a believer.

    Denning’s Views on Pitfalls to Avoid

    #############################

    So are you a Heartland follower?

    The conservative white male syndrome. A lot of people on this site seem to fit that stereotype.

    Just keep ignoring what you can’t deal with. It gives you the illusion of confidence.

  48. http://cstpr.colorado.edu/prometheus/archives/hansenscenarios.png

    If you look at the graph, you will notice a gray bar that says “Estimated temperature during Altithermal and Eemian times”.

    One of Hansen’s points later on is that we are rising above the highest temperature of the Holocene (our time) and will approach the average temperatures of the Eemian. During that time the sea level was several meters higher. With society’s present level of carbon use, we can go much higher than this. This graph is part of the argument for policy change.
