Rebane's Ruminations
August 2012

Hockeystick Hansen Strikes Again (updated 7aug12)

“In fact, [climate change] has now driven our climate outside the range that has existed the last 10,000 years…” — Dr. James Hansen (from NPR)

George Rebane

There’s a coordinated blitz to recover ground lost to climate change skeptics in the past couple of years.  The lost ground is uneven – in fact, in California there’s actually been a gain as the rogue CARB continues creating chaos – but there’s been enough doubt expressed by politicians across the country so that another application of bogus science is called for to shore up the politically correct belief systems that have recently shown some fraying at the edges.

And who but that intrepid NASA scientist Hockeystick Hansen himself has leaped into the very breach of the breech with a Washington Post op-ed piece, and a report purportedly published by the National Academy of Sciences.  (I couldn’t find it, and their search engine has no knowledge of a James Hansen.  Hmmm.)  But there’s enough in the political coverage (here and here) of ol’ Hockeystick’s re-emergence to piece together the elements of his current assault on the nation’s credibility.

RR readers have been exposed to his shenanigans in past years when he first introduced his notorious global temperature ‘hockeystick’ (here and here).  And, in addition to scores of scientists, tireless bloggers like Anthony Watts, Steve McIntyre, and our own Russ Steele have spent years laying bare the errors and lack of science that the AGW (anthropogenic global warming) crowd assembled by the UN has been inflicting on the world’s public policies.

Well, now Dr Hansen claims that all these extreme hot weather events we’ve had in the last few years are proof positive of AGW.  BTW, the cold weather events during the same interval are just that, ‘weather’, and not climate change.  You gotta have a PhD to tell the difference.

And this time he’s not bringing up another collection of dodgy computer models like those that created the hockeystick.  No, now he’s appealing to established statistics and recorded historical weather data – you know, bell curves and all that.  Well, actually it’s hard to tell what kind of statistics he’s using, because the media are guaranteed to muck up what he really did.  All we know from the reports is that he compared data from the base period of 1951-1980 with data from 1981-2010 – two thirty-year periods from a (climate) process that changes on the order of centuries.  Best to watch the NPR video to get the gist of his arguments.

Assuming he carefully tallied up weather events of various intensities for the base period, he could then make a histogram that plots weather intensity (x-axis) against the number of such incidents for each level of weather intensity (y-axis).  He then fits a bell curve (Gaussian distribution) to this histogram.  Then he does the same thing for the 1981-2010 interval data, and lays the two bell curves over each other.  And by Jove, it appears that the most recent bell curve is visibly shifted to the right of the base period bell curve.
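
For the technically inclined, here’s a minimal Python sketch of that procedure.  The anomaly samples below are synthetic stand-ins invented for illustration – not Hansen’s gridded station data – but the mechanics (tally, fit a Gaussian, overlay) are the same:

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    base = rng.normal(0.0, 1.0, 3000)    # stand-in for 1951-1980 anomalies
    recent = rng.normal(0.5, 1.1, 3000)  # stand-in for 1981-2010 anomalies

    # Histogram each sample, then fit a bell curve (MLE mean and std dev)
    counts, edges = np.histogram(base, bins=40)
    mu_b, sd_b = norm.fit(base)
    mu_r, sd_r = norm.fit(recent)
    print(f"base:   mean {mu_b:+.2f}, sd {sd_b:.2f}")
    print(f"recent: mean {mu_r:+.2f}, sd {sd_r:.2f}")  # visibly shifted right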

The extreme weather events are those represented by the right tails of the two bell curves (see Hansen’s figures in the video), and the most recent 30 years shows a higher number for any intensity level of weather events.  So there you have it, slam dunk, end of story, AGW is here, let’s get that cap n’tax legislation going again.
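
The tail arithmetic is what makes the comparison look so dramatic: even a modest rightward shift of the bell curve multiplies the area under its right tail.  A quick check with assumed fitted parameters (my illustrative numbers, not Hansen’s published values):

    from scipy.stats import norm

    mu_b, sd_b = 0.0, 1.0  # base period fit (assumed for illustration)
    mu_r, sd_r = 0.5, 1.1  # recent period fit (assumed for illustration)

    for x in (2.0, 3.0):   # 'extreme' thresholds in base-period sigmas
        p_b = norm.sf(x, mu_b, sd_b)  # P(anomaly > x), base period
        p_r = norm.sf(x, mu_r, sd_r)  # P(anomaly > x), recent period
        print(f"threshold {x}: base {p_b:.4f}, recent {p_r:.4f}, "
              f"ratio {p_r / p_b:.1f}x")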

But not so fast.  There are a couple of huge assumptions baked into this little statistical hocus-pocus.  First, his sample intervals are too small to support the assertion that his thirty-year tallies correctly captured the real distributions of the random weather processes that any given operating climate regime will contain.  In fact, skeptics point out that looking at data windows the size of a century or so reveals that 1951-1980 was an extraordinarily mild period in the world’s weather when compared with what came before.  Such alternating periods of intense and mild years, each longer than Hansen’s thirty-year looks, are quite normal in a stable climate process.

Second, he hasn’t told us what the normal variability is in such bell curves as we slide a 30-year window from, say, 1850 to 2000.  We know that year by year such bell curves will shift left and right, and get wider and narrower.  (Here we’re looking at the auto-correlation of the intense weather process that he’s reporting on, but that’s getting too techie for the scope of this post.)  Finally, we know that he has included wildfires as ‘weather events’ without acknowledging the ongoing and insane forest fuels build-up policy of the US Forest Service, and the increased use of wildlands by the public.  These factors have increased significantly between the two 30-year study periods.
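
Here’s a sketch of that missing baseline-variability check, under assumptions of my own choosing: generate a stable 150-year record with some year-to-year persistence (an AR(1) process – a crude stand-in, not a climate model), slide a 30-year window along it, and watch the fitted bell-curve parameters wander with no climate change at all:

    import numpy as np

    rng = np.random.default_rng(1)
    years, phi = 150, 0.5      # persistence parameter phi is assumed
    a = np.zeros(years)        # annual anomalies
    for t in range(1, years):
        a[t] = phi * a[t - 1] + rng.normal()

    # Fitted bell-curve parameters for successive 30-year windows
    for start in range(0, years - 29, 30):
        w = a[start:start + 30]
        print(f"window {start:3d}-{start + 29:3d}: "
              f"mean {w.mean():+.2f}, sd {w.std(ddof=1):.2f}")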

In sum, we haven’t even gotten to the more seminal questions of 1) what actually argues for man-made climate effects, and 2) whether we actually know how to purposely, and without doing greater harm, alter climate in a desirable direction.  For example, we know that historically higher temperatures don’t go in lockstep with atmospheric carbon levels, as claimed in the public policy (read political) arena.  The main political thrust of the UN’s fraud-weighted Intergovernmental Panel on Climate Change continues to be the implementation of the UN’s frequently denied Agenda 21 through frontal attacks like California’s AB32.  These are now being rammed down our throats by CARB, and by the more nefarious ‘sustainability’ and ‘smart growth’ projects fostered by the various worldwide ICLEI (‘International Council for Local Environmental Initiatives’) chapters, of which over 500 are in the US, with California claiming 100 of them.

Exit question:  Is Nevada County getting ready to set up its own ICLEI chapter, now formally renamed as ‘ICLEI – Local Governments for Sustainability’?  (Gotta hide that “International Council” stuff else the natives start getting restless.)

[7aug12 update]  For completeness, here is Anthony Watts’ response to the latest from Hansen.  And Fred Krupp, president of the Environmental Defense Fund, argues (celebrates?) in the 7aug12 WSJ that there is ‘A New Climate-Change Consensus’ between skeptics (mostly conservatives) and true believers (mostly liberals).  To the mix we now add the recent work of Richard Muller, former skeptic and UC Berkeley physicist also cited by Krupp, who has reanalyzed the historical temperature data and fit some new regression models that show a recent increase in temperature.  He does no climate physics to identify the contributing factors to that rise, let alone their magnitudes; he simply concludes that it is overwhelmingly anthropogenic in origin since, well, what else could it be.

[Technical Appendix]  A more informative and defensible approach to investigating the imputed increase in bad weather incidents is to look at them in terms of arrivals, and then compute the probability of a given number arriving in, say, a recent time interval.  I hope the non-technical readers haven’t rolled their eyes yet, because this approach is quite accessible.

Let’s start with a slight detour by baking a raisin cake.  Suppose we intend to bake a raisin cake in a standard 8x8x2 inch cake pan.  The volume of this cake will be 128 cubic inches.  The recipe calls for a cup of raisins, which here we’ll count out as 300 raisins.  We pour the raisins into the gooey batter, mix everything thoroughly, and stick the cake in the oven.  Out comes a beautiful raisin cake that has an average density of 300/128 = 2.34 raisins per cubic inch.

Now the raisins are distributed randomly as a result of our pouring them into the gooey batter in the mixing bowl and turning on the mixer.  If we cut off a 4×2 inch piece – 4×2×2 = 16 cubic inches, given the cake’s 2-inch depth – we’d expect to find about 16×2.34 = 37.5, or about 38 raisins in that piece.  I said ‘about’ since the actual number of raisins will vary; that number is really what’s called a random variable – there may be more or fewer raisins in another 16 cubic inch piece we cut from the cake.

An important and interesting question to ask at this point is, ‘What is the probability that we’ll find x raisins in a given volume V of the cake in which the expected number of raisins is λ (Greek letter lambda)?’  Here our example volume was V = 16 cu in, and λ = 37.5.  Skipping a bunch of mathematical details, the answer to the question comes from a formula called the Poisson distribution, which expresses (please, nobody panic!) P(x), the probability of finding x raisins, as

    P(x) = λ^x e^(−λ) / x!
The x! term is the factorial, which is illustrated by the example 3! = 3*2*1 = 6.  Also, it turns out that 0! = 1 by definition.  The natural base (a property of our universe) e = 2.718… .  So if we wish to know the probability that our 16 cu in piece has only x = 20 raisins in it, we just plug into the formula and compute (easy to do on a spreadsheet that has a Poisson function built in).

    P(20) = 37.5^20 e^(−37.5) / 20! ≈ 0.0006
So the chances of finding exactly 20 raisins in that volume of cake are mighty slim, as you would expect.  A more useful question might be ‘what’s the probability of finding at least 20 raisins in our piece of cake?’  This is computed as the complement of the probability of finding zero, one, two, …, or nineteen raisins in the piece.  Or P(x ≥ 20) = 1 – [P(0) + P(1) + P(2) + … + P(19)].  If you go through the calculations, you’ll find that P(x ≥ 20) = 0.999, almost a certainty that you’ll find 20 or more raisins in a 16 cu in piece of cake.
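
For readers without a spreadsheet handy, the same two numbers take a few lines of Python:

    from scipy.stats import poisson

    lam = 37.5                   # expected raisins in a 16 cu in piece
    print(poisson.pmf(20, lam))  # P(exactly 20) -> about 0.0006
    print(poisson.sf(19, lam))   # P(20 or more) = 1 - P(x <= 19) -> ~0.999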

In a similar manner, we can use the formula to answer such questions about the raisins in pieces of various sizes, or within various limits like, say, ‘what’s the probability that the number of raisins found in a 25 cubic inch piece is between 50 and 75?’  Now using λ = 25*2.34 = 58.5, the expected number of raisins in 25 cu in, the answer is about 0.84, or about a 5 out of 6 chance.  That’s all well and good, but what’s it have to do with severe weather incidents and climate change?
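
A quick check of that figure before moving on (the exact value depends on whether the endpoints count; here ‘between’ is read as strictly between, i.e. 51 through 74):

    from scipy.stats import poisson

    lam = 25 * 2.34                                  # 58.5 expected raisins
    p = poisson.cdf(74, lam) - poisson.cdf(50, lam)  # P(50 < x < 75)
    print(round(p, 2))                               # close to the 0.84 above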

To answer that let’s trade in raisin cake volumes for time intervals, and the number of raisins for the number of specifically defined severe weather events.  If we take a sufficiently long (think credible) time interval such as, say, from 1850 to 1950, and count the number of, say, known hurricanes observed in the Atlantic during that time, we’ll get an average hurricane incidence or arrival rate of so many hurricanes a year.

If we now assume that the same weather process that operated over the 1850-1950 ‘control interval’ is still in operation today – in other words, that climate has not changed significantly since 1950 – then we can use the Poisson distribution formula to compute the probabilities that we would have seen the actual recorded number of hurricanes (or any other ensembles of weather events) during the recent years.  This will give us a quantifiable level of confidence about what has or has not happened to climate.
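
Here’s a sketch of that computation with invented placeholder numbers – the rate and the recent count below are hypothetical, not actual hurricane statistics:

    from scipy.stats import poisson

    rate = 5.0          # hurricanes/year over the 1850-1950 control
                        # interval (hypothetical figure)
    years = 10
    lam = rate * years  # expected count in a recent decade
    observed = 65       # hypothetical recent count

    # If the climate process is unchanged, how surprising is 'observed'?
    p = poisson.sf(observed - 1, lam)  # P(x >= observed)
    print(f"P(at least {observed} hurricanes in {years} years) = {p:.3f}")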

We can also compare such recent probabilities to those that occurred for similar spans of years in the control interval.  I would venture that such probabilities are not sufficiently different to cause any reasonable person to worry that some fundamental climate process has changed markedly enough to give the results computed for the last thirty-year interval.  And that’s the proper way to do the analysis, and to avoid the synthesized pyrotechnics generated by Dr James Hansen.

I invite people – Anthony Watts? Steve McIntyre? – who have the data and have studied the parameters of severe weather events to perform this analysis and write a report that would be much more revealing than the one described by Dr Hansen.


155 responses to “Hockeystick Hansen Strikes Again (updated 7aug12)”

  1. TomKenworth

    The future beckons to STEM students who make the “right” choices:
    http://www.towson.edu/main/abouttu/newsroom/cybersecurity081712.asp

  2. George Rebane

    TomK 1017pm – I think I mentioned this solution under an earlier post. Designer chemicals including tracer radioactive (very low level) isotopes can be injected with the fracking mud. They will permeate anywhere that CH4 does. If they show up in your sink where none had been observed before, you can make a case that it was due to fracking.
    I’m sure that the energy companies have gone all around the territory and baselined the potable water sources. It would be foolish for them not to have done that, given the potential for subsequent legal costs. Anyone have any data on that?

  3. TomKenworth

    That would be only if you ignored the law of gravity. Are the tracer isotopes as light as methane, and are they in a gaseous state? Yes, those folks probably do have such data. No, they are not giving it out until they have to in a court of law to prove their case; otherwise they’d be sharing it already to prove their innocence before it goes to court.

  4. George Rebane

    TomK 211pm – no gravity need be ignored; methane and other injectable gases can be made radioactive (some are so naturally). Remember, you are assuming here that methane is leaking out of the return path of the drill pipe, and not percolating upward through thousands of feet of rock into the water table, which percolation would have occurred without fracking.
