Friday, April 30, 2010

Where are the tone trolls when you need them

Coby Beck brings word from the Charlottesville Hook that the VA Attorney General, Ken Cuccinelli, has told UVa to turn over every piece of paper Michael Mann ever touched when he was there, including that used in the nether regions. Chip Knappenberger, to give him credit, points out that this is not a good thing,

Sorry, but I can’t agree with Dr. Battig or Dr. Singer on this one.

Cuccinelli is taking things too far. Way too far. This has all the trappings of a witch hunt, plain and simple.

It does not strike me as being much of a stretch that it is not far along this path before scientists at Virginia’s public universities become political appointees, with whoever is in charge deciding which science is acceptable, and prosecuting the rest. Say good-bye to science in Virginia. Who is going to sign up to do it?

UPDATE: Steve McIntyre comes in on Chip's side of the register, condemning the Cooch

but old S. Fred goes nuclear

There is a good chance that Virginia’s Attorney-General Ken Cuccinelli will come up with the “smoking gun” — where other so-called investigations have only produced one whitewash after another.

We know from the leaked e-mails of Climategate that Prof. Michael Mann was involved in the international conspiracy to “hide the decline” [in global temperatures], using what chief conspirator Dr. Phil Jones refers to as “Mike [Mann]’s trick.” Now at last we may find out just how this was done.

A lot is at stake here. If the recent warming is based on faked data, then all attempts to influence the climate by controlling the emissions of the so-called “pollutant” carbon dioxide are useless – and very costly. This includes the UN Climate Treaty, the Kyoto Protocol, the Waxman-Markey Cap & Trade (Tax) bill, the EPA “Endangerment Finding” based on the UN’s IPCC conclusion, and the upcoming Kerry-Lieberman-Graham bill in the US Senate.

There go all the windfarms, both onshore and offshore, the wasteful ethanol projects, and the hydrogen economy. Maybe Al Gore will cough up some of his ill-gotten $500 million, gained from scaring the public, from carbon trading, carbon footprints, and all the other scams.

So – good luck, Ken Cuccinelli. We are with you all the way.

S. Fred Singer, PhD
Professor Emeritus of Environmental Sciences, University of Virginia
Chairman, Virginia Scientists and Engineers for Energy and Environment

Eli, being the ever hopeful bunny, assumes that Prof. Curry will make a large donation toward covering Prof. Mann's legal fees.

Tuesday, April 27, 2010

In which Ethon dines at the Wegman buffet

Eli has been sanely jealous of Deep Climate and John Mashey, who have been picking over the rotting corpse of the Wegman report and inspiring others. It has emerged that Wegman and his colleagues plagiarized Ray Bradley's book on dendroclimatology, altering the conclusions, of course, to suit Joe Barton's purpose. As pointed out by Dave in the ever so nice "Judith Curry throws Susan Solomon under the bus" thread at Kloor's bar and grill, Wegman and Said appear to suffer from McIntyre dyslexia (you know, the evil eye syndrome where things get twisted).

Re: Wegman: Regardless of the plagiarism claims, there is some pretty bad stuff in that report.

eg: compare Wegman’s statement:

“Both Esper et al. (2002) and Moberg et al. (2005) indicate that current global temperatures are not warmer than the medieval warm period.”

With Esper (2002):
“annual temperatures up to AD 2000 over extra-tropical NH land areas have probably exceeded by about 0.3 C the warmest previous interval over the past 1162 years. ”

and Moberg (2005):
“We find no evidence for any earlier periods in the last two millennia with warmer conditions than the post-1990 period—in agreement with previous similar studies”

This appears to be a habit with the GMU two. They even snuck one past Wm Connolley, the ever vigilant attack beast of the Wikipedia, copying paragraphs out of the Wikipedia into their report in areas where Wegman and his student Said were supposed to be expert.

Thus, Eli was surprised to see Ethon at the window with some delicious bits of Wegman in his beak. What have you brought? the bunny asked. A large bit of delicious karmic payback: Deep Climate blew up Wegman and Said with their own mine. Do you realize, Ethon said, that Wegman and Said published a conference paper on how to use computers to detect fraud? And it is available for all to see, you know, "Text Mining with Applications to Fraud Discovery"

Ooo, Eli said, that's interesting, but we want something more nourishing. Well, the big bird said, looking into his stash, how about this:

Brokerage – Social relations can be considered to be channels that transport information, services, or goods between people or organizations. From a bird’s eye view, social structure helps to explain how information, goods or even attitudes and behavior diffuses within a social system. Network analysis reveals social structure and helps to trace the routes that goods and information may follow. Some social structures permit rapid diffusion of information, whereas others contain sections that are difficult to reach. We can also focus on the position of specific people or organizations within the network. In general, being well connected is advantageous. Contacts are necessary to have access to information and help. The number and intensity of a person’s ties are called his or her sociability or social capital. Social capital is known to correlate positively to age and education in Western societies. Some people occupy central or strategic positions within the system of channels and are crucial for the transmission process. Some positions may exert pressure on their occupants, but they also yield power and profit. The direction of ties is not very important in social network structures that capture the exchange of information.

Part III – Brokerage

In quite a few theories, social relations are considered as channels which transport information, services or goods between people or organizations. In this perspective, social structure helps to explain how information, goods or even attitudes and behavior diffuses within a social system. Network analysis reveals social structure and helps to trace the routes which goods and information may follow. Some social structures permit rapid diffusion of information whereas others contain sections which are difficult to reach.

This is a bird’s eye view of an entire social network. However, we can also focus on the position of specific people or organizations within the network. In general, being well connected is advantageous. Contacts are necessary to have access to information and help. The number and intensity of a person’s ties are called his or her sociability or social capital. Social capital is known to correlate positively to age and education in Western societies. Some people occupy central or strategic positions within the system of channels and are crucial for the transmission process. Some positions may exert pressure on their occupants, but they also yield power and profit. The direction of ties is not very important in social network structures that capture the exchange of information.


Which talon is holding the Wegman report and which the real thing?

Yes, good bunnies, the first passage is from page 21 of the Wegman report, the second from Exploratory Social Network Analysis with Pajek, by W. de Nooy, A. Mrvar and V. Batagelj, published January 25, 2005, Chapter 6, page 109, which can be found as an Adobe Acrobat file by any plagiarism software available at your local university. For those not in the know, and the Rabett was not, Pajek is software for viewing large networks.

But said Ethon, there is more, these social network specialists, Wegman and Said, went on

Centrality – This is one of the oldest concepts in network analysis. Most social networks contain people or organizations that are central. Because of their position, they have better access to information, and better opportunity to spread information. This is known as the ego-centered-approach to centrality. The network is centralized from socio-centered perspective. The notion of centrality refers to the positions of individual vertices within the network, while centralization is used to characterize an entire network. A network is highly centralized if there is a clear boundary between the center and the periphery. In a highly centralized network, information spreads easily, but the center is indispensable for the transmission of information.
In this chapter we present the concepts of centrality and centralization which are two of the oldest concepts in network analysis. Most social networks contain people or organizations that are central. Because of their position, they have better access to information, and better opportunity to spread information. This is known as the ego-centered-approach to centrality. Viewed from a socio-centered perspective the network as a whole is more or less centralized. Note that we use centrality to refer to the positions of individual vertices within the network, whereas we use centralization to characterize an entire network. A network is highly centralized if there is a clear boundary between the center and the periphery. In a highly centralized network, information spreads easily, but the center is indispensable for the transmission of information.

For those interested in time lines, the Wegman report was published in July 2006, the Pajek book in January 2005.
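Since Wegman and Said are the text mining experts, the bunnies can do a little text mining of their own. A minimal sketch using nothing fancier than the Python standard library; wegman_text and pajek_text are just the two passages above pasted into strings (placeholders here, not anyone's actual code).

```python
# Quantify the overlap between the two passages above with plain difflib.
# Paste the Wegman report paragraph into wegman_text and the de Nooy/Mrvar/Batagelj
# paragraph into pajek_text; this is a sketch, not an industrial plagiarism engine.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    # normalize whitespace and case so line wrapping doesn't matter
    a = " ".join(a.split()).lower()
    b = " ".join(b.split()).lower()
    return SequenceMatcher(None, a, b).ratio()

wegman_text = "..."   # paragraph from page 21 of the Wegman report
pajek_text = "..."    # paragraph from Chapter 6 of the Pajek book

print(f"similarity: {similarity(wegman_text, pajek_text):.2f}")   # near 1.0 means copied
```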

There is more from de Nooy, Mrvar and Batagelj, but why should Eli have all the fun? And Barry Bickmore has the last word on whether Christopher Monckton is a member of the British House of Lords.

Friday, April 23, 2010

Ray Pierrehumbert on the habitable zone


The habitable zone is the region around a star where an Earth-like planet can sustain liquid water. Ray Pierrehumbert is lecturing at the University of Toronto on New Worlds, New Climates, and Steve Easterbrook has an excellent report. The figure to the left, stolen from Steve, who took it from Rockstrom et al., Nature 461, 472-475 (24 Sept 2009), shows the green zone for Earth-like planets; the red is where we are. The news is not good.

Thanks to Jim Eager for having FTFL to the lecture slides

Comments?

The keyboard of doom, or Deep Climate on Wegman

Deep Climate does it again


Comments

Thursday, April 22, 2010

Spot the Blog

A bit ago Ethon pointed out that you can judge how serious someone is by whether they call out the really nasty or nonsensical stuff coming from their side of the tracks. On the nasty side, Eli noticed that the Dear Senator, James Inhofe, was going around threatening to toss climate scientists into the gulag unless they straightened out, and wondered who was going to step up. Suffice it to say that some folk stepped right up and called out those who dared to mention that Inhofe was doing his junior McCarthy bit.

Into this closed circle steps Julian Sanchez to start a war about the Republican war on reality where

Reality is defined by a multimedia array of interconnected and cross promoting conservative blogs, radio programs, magazines, and of course, Fox News. Whatever conflicts with that reality can be dismissed out of hand because it comes from the liberal media, and is therefore ipso facto not to be trusted. (How do you know they’re liberal? Well, they disagree with the conservative media!)
Sanchez calls this epistemic closure. Today, over at the National Review blog, Jim Manzi takes it the next mile
I started to read Mark Levin’s massive bestseller Liberty and Tyranny a number of months ago as debate swirled around it. I wasn’t expecting a PhD thesis (and in fact had hoped to write a post supporting the book as a well-reasoned case for certain principles that upset academics just because it didn’t employ a bunch of pseudo-intellectual tropes). But when I waded into the first couple of chapters, I found that — while I had a lot of sympathy for many of its basic points — it seemed to all but ignore the most obvious counter-arguments that could be raised to any of its assertions. This sounds to me like a pretty good plain English meaning of epistemic closure. The problem with this, of course, is that unwillingness to confront the strongest evidence or arguments contrary to our own beliefs normally means we fail to learn quickly, and therefore persist in correctable error.
Manzi turns to the part of Levin's book about global warming because, he says, this is something he knows about; OTOH, this really could be any issue. Manzi concludes
It was awful. It was so bad that it was like the proverbial clock that chimes 13 times — not only is it obviously wrong, but it is so wrong that it leads you to question every other piece of information it has ever provided.
and indeed, this IS the basic point. If someone is not only wrong but obviously wrong, digs in, refuses to admit they were wrong, and drags it out forever, you can discount everything else they say. And if you are on the same side of the argument as they are, and Manzi almost certainly is on the same side as Levin, they do terrible harm to your side of the argument.

Manzi, of course, is now under friendly fire.

Wednesday, April 21, 2010

In which it is shown how the NIPCC and Fred Singer omit, distort, and yes, flat out lie about sea level rise estimates.

Martin Stolpe from the Klimakrise brings news of how the quote mine over at Heartland is flooding out. He came across a graphic being flogged by EIKE (the German version of Heartland) purporting to show how IPCC estimates of sea level rise by 2100 have changed over the years.

Eli has been led to believe that the title reads "The IPCC's estimates of sea level rise in the 21st century. The IPCC estimates are more and more converging to the actual value of 20 cm/century. (compare Figs. 4-5)". Martin tracks this graphic down to the NIPCC report (the bunnies know the one, where Singer got $100K+ from Heartland for editing), where we find the same graphic

Figure 19: Estimates of sea-level rise to Year 2100 from IPCC reports of 1990, 1995, 2001, and 2007. Note the strong reduction in estimated maximum rise, presumably based on better data and understanding. Also shown are the published seal level rise values of Hansen (H) [2006], Rahmstorf (R) [2007], and Singer (S) [1997]. Both H and R are well outside of the maximum IPCC values. The ongoing rate of rise in recent centuries has been 18 cm per century; therefore, the incremental rate of rise for IPCC 2007 would be 0 to 41 cm, and about 0 to 2 cm for Singer.
FWIW, Eli did not make that up about how the seals are rising; that's how it appeared on the page. The problem is that the estimates shown for the FAR, the SAR, the TAR and the AR4 (the first, second, third and fourth IPCC assessment reports) are, hmm, how can Eli say this nicely....... Oh yes: a stone lie, probably a lie, closer, and a major quote mine.

Following Martin, what did the FAR say about sea level rise?
This present assessment does not foresee a sea level rise of ≥ 1 metre during the next century.
They show a figure for a range of scenarios with a maximum rise by 2100 of 110 cm and a minimum of 15 cm. Where did the 367 cm come from? Oh yes, again as Martin points out, they mention a 1986 study which predicted a rise of 367 cm.

Now a good quote miner would point out that when Fred and Co. said
"Estimates of sea-level rise to Year 2100 from IPCC reports of 1990"
well yes, there is an estimate of 367 cm from the 1986 study which was cited in the FAR and this would simply be excused as an example of a quote mine expert at work. But Eli reads a bit further in the NIPCC and finds in the text
Successive IPCC reports have reduced their estimates of projected sea-level rise, as shown in Figure 19, and are coming closer to a value of 18 cm per century.
which clearly attributes the estimate to the IPCC. Game, set, match, lie.

We also can look at what the 1995 SAR WGI report said about sea level rise (BTW, if you go to Amazon, you can read the SAR in great part)
Average sea level is expected to rise as a result of thermal expansion of the oceans and melting of glaciers and ice-sheets. For the IS92a scenario, assuming the "best estimate" values of climate sensitivity and of ice melt sensitivity to warming, and including the effects of future changes in aerosol, models project an increase in sea level of about 50 cm from the present to 2100. This estimate is approximately 25% lower than the "best estimate" in 1990 due to the lower temperature projection but also reflecting improvements in the climate and ice melt models. Combining the lowest emission scenario (IS92c) with the "low" climate and ice melt sensitivities gives a projected sea level rise of about 15 cm from the present to 2100. The corresponding projection for the highest emission scenario (IS92e) combined with "high" climate and ice melt sensitivities gives a sea level rise of about 95 cm from the present to 2100.
The SAR does list an estimate of 3-124 cm from a 1993 paper by Wigley and Raper which is mentioned in Table 7.8. Is that what the quote miners dug out?

How about the 2001 TAR? In the Summary for Policymakers one reads
Global mean sea level is projected to rise by 0.09 to 0.88 m between the years 1990 and 2100, for the full range of SRES scenarios, but with significant regional variations. This rise is due primarily to thermal expansion of the oceans and melting of glaciers and ice caps. For the periods 1990 to 2025 and 1990 to 2050, the projected rises are 0.03 to 0.14 m and 0.05 to 0.32 m, respectively.
And the AR4? Well, Fred gets the 18-59 cm projection right but, of course, swallows the caveat
Because understanding of some important effects driving sea level rise is too limited, this report does not assess the likelihood, nor provide a best estimate or an upper bound for sea level rise. Table SPM.1 shows model-based projections of global average sea level rise for 2090-2099.[10] The projections do not include uncertainties in climate-carbon cycle feedbacks nor the full effects of changes in ice sheet flow, therefore the upper values of the ranges are not to be considered upper bounds for sea level rise. They include a contribution from increased Greenland and Antarctic ice flow at the rates observed for 1993-2003, but this could increase or decrease in the future.[11] {3.2.1}
More coming. Eli let this one out before it was finished. Apologies.

Will this do?


There has been a request (actually several) for Lindsay Lohan. Eli is not THAT good, but will Natalie Imbruglia do?
Eli may take this down if Ms. Rabett comes calling.

Sunday, April 18, 2010

Eli can retire Part X - The grim reaper is a hot head

The US EPA responses to challenges to its Endangerment Finding for increasing CO2 concentrations blow hot and cold.

Comment (5-24):
Several commenters (e.g., 3347.3, 11453.1, 3187.3) note that the April 2009 TSD indicated that cold-related deaths presently exceed heat-related deaths in the United States and that this provides evidence that a warming climate will have beneficial effects on temperature-related mortality. Commenters note that on page 70 of the April 2009 TSD, EPA states that 5,983 heat-related deaths were reported in the United States between 1979 and 2002. In the same timeframe, 16,555 people died of extreme cold. A commenter (3187.3) provides a paper by Goklany (2007), which indicates that death from extreme cold exceed death from extreme heat.

Response (5-24):
We have revised the TSD’s estimates of heat-related deaths based on the latest findings of the assessment literature (Karl et al., 2009). Based on these results, other supporting evidence presented in the TSD, and additional evidence cited below, we have determined that the available literature strongly supports the conclusion that extreme heat is, on an average annual base, the leading cause of weather-related death in the United States. We agree that the April 2009 TSD contained statistics that could be interpreted as suggesting that cold-related mortality has recently been higher in the United States than heat-related mortality. The cold-related mortality statistics in the TSD from Ebi et al. (2008) are similar to those cited by Goklany (2007). However, the methods and data for estimating heat-related mortality were recently updated and these revised values are presented in Karl et al. (2009).

The more recent heat-related mortality numbers from Karl et al. (2009) reflect results from the Centers for Disease Control and Prevention (CDC). CDC (2006) reports more than 3,400 deaths from 1999 to 2003 for which exposure to extreme heat was listed as either a contributing factor or the underlying cause of death. This result of roughly 680 heat-related deaths per year is almost identical to the 689 deaths per year from cold exposure reported by Ebi et al. (2008) and summarized in the TSD. CDC (2006) suggests that even the revised heat-related mortality numbers may underestimate total heat-related mortality, noting: “Because heat-related illnesses can exacerbate existing medical conditions and death from heat exposure can be preceded by various symptoms, heat-related deaths can be difficult to identify when illness onset or death is not witnessed by a clinician. In addition, the criteria used to determine heat-related causes of death vary among states. This can lead to underreporting heat-related deaths or to reporting heat as a factor contributing to death rather than the underlying cause.” This issue has long been recognized in attempting to estimate the mortality impact of extreme heat using information from death certificates (American Medical Association Council on Scientific Affairs, 1997). As noted in a subsequent response (5-29), cold-related deaths are likely also underestimated. One complication with these death certificate–based estimates of extreme cold and heat is they are not limited to periods that would be considered heat waves or cold snaps in the location where the death occurs. Therefore, while these results are based in a consistent methodology and data source, they have an uncertain overlap with the occurrence of the weather events of primary interest to the TSD, cold snaps and heat waves. As a result, these data alone do not provide strong evidence the heat-related mortality is presently greater than cold-related mortality.

However, we note that alternative and much higher estimates of heat-related mortality come from analyses of daily urban summertime mortality patterns in Kalkstein and Greene (1997) and Davis et al. (2003a), which use a different methodology to compute heat-related deaths compared to CDC (2006). These studies first define extreme heat events by identifying threshold conditions for an event in a location and then calculate the number of extreme heat–attributable deaths based on differences in daily deaths on extreme heat days compared to longer-term averages. In these studies, heat’s mortality impact is quantified in terms of the excess deaths that result during the extreme heat conditions. By evaluating changes in daily deaths attributable to all causes, this approach also effectively eliminates differences or restriction in using certain causes of death as potential sources of bias in estimating the extreme heat’s mortality impact. This method is also more consistent with the view that heat waves are effectively identified through exceptional weather conditions that result in increases in daily mortality (e.g., Confalonieri et al., 2007; U.S. EPA, 2006a). Although differences in the time series, definitions of urban populations, and other analytical methods prevent an exact comparison of the results in these two studies, both studies (Kalkstein and Greene, 1997; Davis et al., 2003a) estimate that there are approximately 1,700–1,800 excess deaths per year during extreme heat events based on an evaluation of a subset of approximately 40 U.S. metropolitan areas (see U.S. EPA, 2006a). These estimates of extreme heat’s mortality impact are much higher than the corresponding death certificate–based estimates for heat as well as the Ebi et al. (2008) estimate for cold-related mortality summarized in the TSD.

We also note that Davis et al. (2004) find that the net impact of the observed temperature increase from 1964 to 1998 (considering both reduced temperature mortality in winter and increased temperature mortality in summer) was an extra 2.9 deaths (per standard million) per city per year in 28 major U.S. cities. This indicates that extreme heat has been the larger cause of mortality in the recently observed record when temperatures have warmed.
Furthermore, we note that the USGCRP assessment (Karl et al., 2009) specifically refers to a recent study by Borden and Cutter (2008), which concludes heat is the most deadly natural hazard in the United States. It also cites Medina-Ramon and Schwartz (2007), which found that in 50 U.S. cities between 1989 and 2000, extreme heat increased death rates 5.7% while extreme cold increased death rates by only 1.6%. These results are summarized in the TSD.

Though we are aware of a recent study by Andersen and Bell (2009) that finds a similar mortality risk for extremely hot and cold days based on the synthesis of results from 107 U.S. communities (contrasting with Medina-Ramon and Schwartz), Andersen and Bell are clear that cold temperatures more indirectly affect mortality than heat. In addition to the longer lag times for exposure incorporated for the effects of extreme cold (up to 25 days, compared a one-day lag for heat), they note that infectious diseases, which are more common in industrialized countries during colder weather (when people spend more time indoors and in proximity) could account for a substantial portion of the cold-related effect.

Summarizing, both recent studies and the assessment literature provide strong evidence that heat-related mortality presently exceeds cold-related mortality in the United States.
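The excess-mortality bookkeeping in the Kalkstein and Greene / Davis et al. style of study is simple enough to sketch. This is Eli's toy version, not anyone's published method: given daily deaths and daily temperatures for one city, flag the extreme heat days above a threshold and count the deaths above the long-term daily average.

```python
# Toy version of the "excess deaths during extreme heat" bookkeeping described above.
# `daily_temp` and `daily_deaths` are assumed equal-length arrays for one city;
# this is an illustration, not the method of any of the cited studies.
import numpy as np

def excess_heat_deaths(daily_temp, daily_deaths, pct=95.0):
    temp = np.asarray(daily_temp, dtype=float)
    deaths = np.asarray(daily_deaths, dtype=float)
    threshold = np.percentile(temp, pct)     # define "extreme heat" days
    hot = temp >= threshold
    baseline = deaths[~hot].mean()           # long-term average on non-heat days
    excess = (deaths[hot] - baseline).sum()  # deaths above baseline on heat days
    return threshold, excess
```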


Comments?

Saturday, April 17, 2010

Carrot Eater Wants a Pony


More pony lore. Place your orders.



Rabett Run is a full service blog

Thursday, April 15, 2010

Eli can retire Part IX - PPP, MER, GDP AEIOU

From the US EPA responses to challenges to its Endangerment Finding for increasing CO2 concentrations, something economical

Comment (4-44):
A commenter (4632) objects to the emission scenarios produced by the IPCC because of critiques by Castles and Henderson (2003a, 2003b, 2005) that highlight the use of market exchange rates (not purchasing power parity) and the implausible assumption that poor nations will equalize per capita emissions with rich nations.

Response (4-44):
Both IPCC (2007) and CCSP (2007b) address the issue of using market exchange rate (MER) versus purchasing power parity (PPP) approaches in determining future gross domestic product (GDP) growth rates, in response to the critiques by Castles and Henderson (2003a, 2003b, 2005). The IPCC (Fisher et al., 2007) states the following:

--------------------------
In the debate on the use of exchange rates, market exchange rates (MER) or purchasing power parities (PPP), evidence from the limited number of new PPP-based studies indicates that the choice of metric for gross domestic product (GDP), MER or PPP, does not appreciably affect the projected emissions, when metrics are used consistently. The differences, if any, are small compared to the uncertainties caused by assumptions on other parameters, e.g. technological change (high agreement, much evidence).
--------------------------

The IPCC adds the caveat that, unlike emissions, the numerical expression of GDP does depend on conversion methods. CCSP (2007b) notes that while MER is used to set the base year of the models in that assessment, “growth prospects and other parameters for the world’s economies were assessed relative to their own historical performance” in order to avoid potential issues arising from interactions between the MER/PPP issue and assumptions regarding convergence.

While we find that both the IPCC and CCSP approaches yield credible estimates of future emissions that have been well supported by the literature, the robustness of conclusions based on emission projections developed through different means adds even more confidence that the TSD is appropriately summarizing the best existing science.
Comments?

Wednesday, April 14, 2010

The Implications of Uncertainty in Administering Upper Bayesian Shortage

Sometimes what you read is better than what they write

Denialists denied again

The Independent Inquiry headed by Lord Oxburgh into the Climatic Research Unit has reported. The wild charges (hi there Steve :) against the CRU were emphatically rejected by the Inquiry. To the right, Nelson well expresses our views on this failed denialist jihad. Like Tamino, Eli fears he will age waiting for apologies.

UPDATE: Eli will put up a separate post on this later, but Rabett Run has many visitors at the moment. Besides the sporting aspect, there is a great reason to shove these reports vigorously down the throats of some who will not be mentioned. Besides their uncalled for denigration of honorable scientists and excellent science, the alphabet soup of denialist think tanks has based its petitions to the US EPA for reconsideration of the CO2 Endangerment Finding on the CRU Emails. The various official inquiries reporting back on how vicious and vacuous the jihad has been strip the petitions and petitioners of standing. The more HaHa we generate, the more naked they are left standing in the public square. Believe Eli, an unclothed Tony Watts is an ugly thing. (Not that yrs truly is svelte, mind you, but he is cute, ask Ms. Rabett.)

The charge of the Committee was

The Panel was set up by the University in consultation with the Royal Society to assess the integrity of the research published by the Climatic Research Unit in the light of various external assertions. The Unit is a very small academic entity within the School of Environmental Sciences. It has three full time and one part time academic staff members and about a dozen research associates, PhD students and support staff. The essence of the criticism that the Panel was asked to address was that climatic data had been dishonestly selected, manipulated and/or presented to arrive at pre-determined conclusions that were not compatible with a fair interpretation of the original data.
Keith Briffa (dendrochronology), Phil Jones (surface temperature reconstructions) and their colleagues had been accused of data murder and rape by the usual suspects. The Inquiry finds these accusations to be infamous lies.

The Inquiry first recognizes something important about tree rings
Chronologies (transposed composites of raw tree data) are always work in progress. They are subject to change when additional trees are added; new ways of data cleaning may arise (e.g. homogeneity adjustments), new measurement methods are used (e.g. of measuring ring density), new statistical methods for treating the data may be developed (e.g. new ways of allowing for biological growth trends).
Much of the criticism came from the fly-in-amber school of science, where nothing ever changes, where initial publications must be perfect. The Inquiry report remarks that the nature of the dendro beast (and indeed, most other science) implies choices in data selection guided by experience, expertise and statistics. They ding the CRU for not having sufficient statistical expertise, but conclude
8. After reading publications and interviewing the senior staff of CRU in depth, we are satisfied that the CRU tree-ring work has been carried out with integrity, and that allegations of deliberate misrepresentation and unjustified selection of data are not valid. In the event CRU scientists were able to give convincing answers to our detailed questions about data choice, data handling and statistical methodology. The Unit freely admits that many data analyses they made in the past are superseded and they would not do things that way today.
The Inquiry demurs,
9. We have not exhaustively reviewed the external criticism of the dendroclimatological work,
Wise folk, wading into the Climate Audit swamp requires at least six months of cleaning under the nails afterwards, and they continue with extreme British understatement
but it seems that some of these criticisms show a rather selective and uncharitable approach to information made available by CRU. They seem also to reflect a lack of awareness of the ongoing and dynamic nature of chronologies, and of the difficult circumstances under which university research is sometimes conducted. Funding and labour pressures and the need to publish have meant that pressing ahead with new work has been at the expense of what was regarded as non-essential record keeping. From our perspective it seems that the CRU sins were of omission rather than commission.
The last bit is the one you are going to see at Climate Audit
Although we deplore the tone of much of the criticism that has been directed at CRU, we believe that this questioning of the methods and data used in dendroclimatology will ultimately have a beneficial effect and improve working practices
without the first line, which will be left in the quote mine.

On to Phil Jones and the CRUTEM surface temperature reconstructions.
4. Like the work on tree rings this work is strongly dependent on statistical analysis and our comments are essentially the same. Although there are certainly different ways of handling the data, some of which might be superior, as far as we can judge the methods which CRU has employed are fair and satisfactory. . . .

All of the published work was accompanied by detailed descriptions of uncertainties and accompanied by appropriate caveats. The same was true in face to face discussions.
5. We believe that CRU did a public service of great value by carrying out much time-consuming meticulous work on temperature records at a time when it was unfashionable and attracted the interest of a rather small section of the scientific community.
Reaching an overall conclusion about the CRU's work
1. We saw no evidence of any deliberate scientific malpractice in any of the work of the Climatic Research Unit and had it been there we believe that it is likely that we would have detected it. Rather we found a small group of dedicated, if slightly disorganised, researchers who were ill-prepared for being the focus of public attention. As with many small research groups their internal procedures were rather informal.
The Inquiry picks up on James Annan's point that governments are insisting on charging for data they create while demanding that it be free to all. And, horrors, they come pretty close to recommending that FOI laws be modified to prevent their vexatious use
4. A host of important unresolved questions also arises from the application of Freedom of Information legislation in an academic context. We agree with the CRU view that the authority for releasing unpublished raw data to third parties should stay with those who collected it.
George Monbiot will call it a whitewash.

Eli thanks Tracy and http://nelsonhaha.com for the appropriate comment

Tuesday, April 13, 2010

A Puzzler: where did Loehle go too far?


what's too far said he

where you are said she - ee cummings
The bunnies have been asking what's wrong with the rather strange roughly cylindrical brown object that Craig Loehle left lying on the sidewalk at Atmospheric Environment. The constant nattering has woken Eli from his retirement nap so he might as well have a go.

This is a case where, when you see the trick, you want to go punch the referees, decapitate the editor and shoot the author into space with a one way ticket. It starts with a paper by Hofmann, Butler and Tans in the same journal, which looked at how fast the CO2 mixing ratio has been growing since ~1800. They find a close correlation between population growth and the mixing ratio; indeed, in 1987 (note added in proof) Newell and Marcus had suggested that monitoring CO2 mixing ratio growth was a reasonable way of monitoring global population.

HBT show that the CO2 mixing ratio has been growing very close to exponentially since ~1960, when monitoring at Mauna Loa started, although going back further than that on the basis of ice cores shows slower growth in earlier times. They then try to tie everything together to suggest
Besides showing the insight gained by removing pre-industrial CO2 in explaining the curvature in the Mauna Loa CO2 record, and organizing the confusion on growth rates, does this new analysis of the CO2 record have any further use for the future? One possibility is to use it to watch for the expected and necessary break of the exponential nature of CO2 and its growth rate. Fig. 4 shows an exponential fit to the past 14 years of Mauna Loa anthropogenic CO2 data and the residual between the data and the exponential function. The residual has varied from zero less than 1% over this time period. A sustained downward trend of the residual by more than 1% (in the absence of any known major volcanic event) would be a clear sign of a change in the nature of anthropogenic atmospheric carbon dioxide, and a possible gauge of progress in the inevitable need to limit atmospheric CO2. Similarly, a break in the close relation between anthropogenic CO2 and population would signal that a change in ‘‘business as usual’’ had occurred.
For anyone who needs a shorter HBT: they said watch for deviations from exponential growth over short to medium term intervals to spot changes in emissions. They did not predict future emissions, but they did discuss the entire record.
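Just to make the HBT residual check concrete, here is a minimal sketch of the idea, assuming the bunnies have annual Mauna Loa means sitting in arrays called years and co2 (hypothetical names, not HBT's code): subtract the 280 ppm pre-industrial baseline, fit an exponential to a recent window, and watch the percentage residual.

```python
# A toy version of the Hofmann/Butler/Tans residual check, not their code.
# `years` and `co2` are assumed to hold annual Mauna Loa means; the 280 ppm
# pre-industrial baseline follows their paper.
import numpy as np
from scipy.optimize import curve_fit

def anthro_exp(t, b, c):
    # anthropogenic CO2 modeled as a pure exponential in time
    return b * np.exp(c * (t - 1959.0))

def residual_check(years, co2, window=14):
    t = np.asarray(years, dtype=float)[-window:]
    anthro = np.asarray(co2, dtype=float)[-window:] - 280.0
    (b, c), _ = curve_fit(anthro_exp, t, anthro, p0=(35.0, 0.02))
    fit = anthro_exp(t, b, c)
    resid_pct = 100.0 * (anthro - fit) / fit
    return b, c, resid_pct   # a sustained drift past ~1% would flag a change
```

Now let Eli see what Loehle did, according to Loehle: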
A paper by Hofmann et al. (2009, this journal) is critiqued. It is shown that their exponential model for characterizing CO2 trajectories for historical data is not estimated properly. An exponential model is properly estimated and is shown to fit over the entire 51 year period of available data. Further, the entire problem of estimating models for the CO2 historical data is shown to be ill-posed because alternate model forms fit the data equally well. To illustrate this point the past 51 years of CO2 data were analyzed using three different time-dependent models that capture the historical pattern of CO2 increase. All three fit with R² > 0.98, are visually indistinguishable when overlaid, and match each other during the calibration period with R² > 0.999. Projecting the models forward to 2100, the exponential model comes quite close to the Intergovernmental Panel on Climate Change (IPCC) best estimate of 836 ppmv. The other two models project values far below the IPCC low estimates. The problem of characterizing historical CO2 levels is thus indeterminate, because multiple models fit the data equally well but forecast very different future trajectories.
Watch the pea. The IPCC estimate is for A PARTICULAR EMISSION SCENARIO; it is not a projection of the last fifty years of measured CO2 mixing ratios, nor is it the only emission scenario. Fast work there, folks. Still, Eli is a trusting bunny, so let us go on. The three fitting forms and Loehle's values for the parameters are
  • the exponential, a + b exp(ct),
    a = 259.4, b = 2.978 × 10⁻¹³, and c = 0.01677

  • the quadratic, a + bt + ct²,
    a = 45 318.5, b = -46.7684, and c = 0.0121
    (gives a really bad fit; a is adjusted to 45 504 for a decent fit)

  • and the saturated, c + a(1 - exp[b(t - d)])²,
    a = 616.694, b = -0.0067, c = 311, and d = 1945.7
and this is what Craig shows,
but Eli made a somewhat fuller picture, using Loehle's values (click on the graphs for better views)

The exponential model, although not perfect, does match the CO2 mixing ratio much better at times earlier than 1959, so, contrary to Loehle, the three models are not equally plausible. But there's more. Eli also did his own fit.
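For bunnies who want to poke at this at home, here is roughly how such a fit goes. A sketch, not Loehle's code: it assumes annual means in arrays years and co2 (hypothetical names) and uses Loehle's three functional forms, with his reported parameters as starting guesses.

```python
# Sketch of fitting Loehle's three functional forms to annual CO2 data.
# `years` and `co2` are assumed arrays of annual means; not Loehle's code.
import numpy as np
from scipy.optimize import curve_fit

def exponential(t, a, b, c):
    return a + b * np.exp(c * t)

def quadratic(t, a, b, c):
    return a + b * t + c * t**2

def saturated(t, a, b, c, d):
    return c + a * (1.0 - np.exp(b * (t - d)))**2

def fit_all(years, co2):
    t = np.asarray(years, dtype=float)
    y = np.asarray(co2, dtype=float)
    fits = {}
    # starting values taken from Loehle's reported parameters
    fits["exp"], _ = curve_fit(exponential, t, y, p0=(259.4, 2.978e-13, 0.01677), maxfev=20000)
    fits["quad"], _ = curve_fit(quadratic, t, y, p0=(45500.0, -46.77, 0.0121), maxfev=20000)
    fits["sat"], _ = curve_fit(saturated, t, y, p0=(616.7, -0.0067, 311.0, 1945.7), maxfev=20000)
    return fits   # evaluate each form at 2000, 2050 and 2100 to see how they diverge
```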

The saturating curve here is much closer to the data and to the exponential curve. Still, young bunnies, there is something junior-high-school wrong with what Craigie poo did. Flip to the continuation for the answer.


------------------------------------------------






Loehle extrapolated all of his fits from the year dot. Little ones can spot this in his value for b in the exponential fit, for a in the quadratic, and for d in the saturated one. If the time in the three fitting forms is replaced by [t - 1985], you get much improved fits, especially for the saturated form. The quadratic fit still sucks at early times (and late ones), but even it comes closer to the other two.
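The fix costs one line. Here is a sketch of the saturated fit done both ways, with the same hypothetical years and co2 arrays as above; recentering time on 1985 keeps the parameters at sane magnitudes and the extrapolation honest.

```python
# Same saturated form, fit with raw calendar years and with time recentered on 1985.
# `years` and `co2` are assumed arrays of annual means; a sketch, not Loehle's code.
import numpy as np
from scipy.optimize import curve_fit

def saturated(t, a, b, c, d):
    return c + a * (1.0 - np.exp(b * (t - d)))**2

def fit_saturated(years, co2, center=0.0):
    t = np.asarray(years, dtype=float) - center
    y = np.asarray(co2, dtype=float)
    p0 = (616.7, -0.0067, 311.0, 1945.7 - center)   # Loehle-ish starting values, shifted
    popt, _ = curve_fit(saturated, t, y, p0=p0, maxfev=20000)
    return popt

# raw = fit_saturated(years, co2)                # the "year dot" parameterization
# centered = fit_saturated(years, co2, 1985.0)   # the [t - 1985] version
# compare saturated(2100, *raw) with saturated(2100 - 1985, *centered)
```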

The extrapolations are shown below.


Year    Model       Year 0 Loehle   Year 0 Eli   Year 1985 Eli
2000    Exp         370             369          369
2000    Quad        367             367          370
2000    Saturated   368             370          370
2050    Exp         513             513          510
2050    Quad        479             479          490
2050    Saturated   467             500          512
2100    Exp         846             846          819
2100    Quad        651             651          668
2100    Saturated   567             729          834


The referees and the editor missed the sleight of hand; they also did not read Hofmann et al. The graphs demonstrate that quadratic fits are not industrial grade for this problem, and the table and graphs show that Loehle's fits are fishy. Doesn't anyone check the programming? Where are the auditors when you need them?

Eli has placed an Excel spreadsheet at the memory hole for those who want to play around with the numbers.

Monday, April 12, 2010

Eli can retire Part VIII - The EPA reads Rabett Run

Well, well, well, Eli discovers that the EPA reads everything, including Rabett Run. We are honored. From the US EPA responses to challenges to its Endangerment Finding for increasing CO2 concentrations

Comment (3-45):
A number of commenters believe that anthropogenic global warming is impossible, many citing arguments made by Gerlich and Tscheuschner (2009). Several commenters (e.g., 0430) note that the greenhouse effect is not like a real greenhouse. Several claim that it is thermodynamically impossible because heat cannot be transferred from a cool substance to a warmer substance (0430, 2210.5): for example, blankets cannot make you warmer than body temperature (1707, 0183.1,). Another thermodynamic argument for the impossibility of the greenhouse effect was proposed by two commenters (2210.3, 4509) citing Gerlich and Tscheuschner (2009) who states that the greenhouse effect as commonly formulated violates the Second Law of Thermodynamics. Another commenter (0711.1) requests evidence of any peer reviewed climate change paper that does not rely on computer simulation. Another theory (2887.1) holds that long-wave radiation will cause increased evaporation of the surface ocean, negating any heat increase. One commenter (0535) submitted a non-peer reviewed paper providing a different explanation for the net energy budget of the Earth, with no role for warming by CO2.

Response (3-45):

The evidence for the atmospheric greenhouse effect is well supported by the scientific literature.

The objections raised by a number of commenters to the basic thermodynamics are without grounds. We are well aware that the greenhouse effect is not at all like a real greenhouse. However, the analogy of a blanket is a little bit better: and indeed, sufficiently insulating blankets can cause overheating. GHGs (blankets) will, by reducing the rate of heat loss, raise the surface temperature of the Earth (body) until a new thermodynamic balance is achieved between incoming solar radiation (internal body heating) and outgoing thermal radiation (in the case of a blanket, including convection and non-radiative processes). This process works regardless of whether the atmosphere (blanket) is cooler than the surface (body). We are aware of the paper by Gerlich and Tscheuschner, and we have determined that the conclusions of the paper are inconsistent with the well-supported literature regarding the mechanism of the greenhouse effect. For example, as a disproof of the greenhouse effect, the paper by Gerlich and Tscheuschner presents the example of a pot of water, noting that the bottom of the pot will be cooler if it is filled with water than if it is empty. Contrary to the assertion in the paper, the primary thermal effect of adding water to the pot is not a reduction in heat transfer, but rather an increase of thermal mass. We assert that a more appropriate example for the paper to have examined would have been the addition of a lid to a pot of water, which reduces the rate of heat loss, and leads to an increase of heating of the water compared to a case with no lid. The paper by Gerlich and Tscheuschner is also inconsistent with the scientific literature with regards to the interpretation of radiative balance diagrams and the assertion that there is no “mean temperature” of the Earth, in contrast to the hundreds of peer-reviewed publications and many assessment reports which use both concepts.
You read it first at Rabett Run
This, of course, neglects the latent heat carried away from the pot and thus the heating element by evaporation of the water in the pot. Since it is well known that people who are physics obsessed are often forgetful, we postulate that the housewife forgets that she has put the pot on the range, and all the water boils away. At that point, when all the water has evaporated, measurements show that the heating element rises to a higher temperature than it was before the tea pot was placed on it.
and, of course, there are the famous Rabett blanket posts
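For bunnies who want numbers to go with the blanket, here is the toy one-layer greenhouse calculation. This is the standard textbook cartoon, not the EPA's analysis and certainly not a climate model, but it shows the point: a single IR-absorbing layer raises the surface temperature even though the layer is colder than the surface.

```python
# Zero-dimensional energy balance: bare planet vs. one perfectly IR-absorbing layer.
# Standard textbook cartoon, not the EPA's calculation.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # solar constant, W m^-2
ALBEDO = 0.3

absorbed = S0 * (1.0 - ALBEDO) / 4.0    # ~238 W m^-2 averaged over the sphere

T_bare = (absorbed / SIGMA) ** 0.25     # ~255 K with no atmosphere
T_surface = 2.0 ** 0.25 * T_bare        # ~303 K under one absorbing layer
T_layer = T_bare                        # the layer itself sits at ~255 K

print(f"no atmosphere: {T_bare:.0f} K, surface under layer: {T_surface:.0f} K, "
      f"layer: {T_layer:.0f} K (colder than the surface it warms)")
```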

EPA Rocks!!

Sunday, April 11, 2010

Monckton jumps the land shark. Gets eaten


UPDATE: John Nielsen-Gammon drops by to say that he found the same errors in May 2009 and received roughly the same pretentious short shrift from Chris Monckton. Eli rather suspects that you can tell Chris he is wrong, show him in detail, prove it in spades, and what ya wanna bet, Monckton will keep repeating the nonsense ad infinitum till you bite em.
------------------------------------
In this case, the land shark has pretty teeth too, and his name is Barry Bickmore, a professor of geology at Brigham Young University. Utah is a REAL Republican state, and BYU is an institution of the Church of Latter Day Saints (Mormons). Bickmore is a devout Mormon. As an important sign of the worm turning, the tale first appears under the byline of Judy Fahys at The Salt Lake Tribune, a major Utah newspaper, and is soon to appear at Real Climate.

"The moral of the story is not that amateurs should stay out of the debate about climate change," writes Bickmore, who noted that he is a geologist rather than a climate scientist.

"Rather, the moral is that when you see a complete amateur raising objections about a highly technical subject, claiming that he or she has blown the lid off several decades of research in the discipline, you should be highly suspicious."

Bickmore has solved two mysteries. The first is where Monckton's claimed IPCC CO2 mixing ratio projections came from. What the IPCC did was feed the A2 emission scenario into a bunch of models and see what they predicted for CO2 mixing ratios. Since this was an emission scenario, and there were a bunch of scenarios of which A2 was among the most extreme, the range of predictions is mostly lower; however, we do appear to be following A2 fairly closely. Monckton also messed up the results of the modeling exercise because he does not understand the difference and relationship between an emissions scenario and a resulting model prediction of mixing ratios. Eli will let Prof. Bickmore explain, honor among bloggers and all that.

Even better, as The Salt Lake Tribune explains, Monckton and Bob Ferguson, president of Monckton's fiction publisher, tried to threaten Bickmore.

Monckton did not respond to requests for comment. Nor did Bob Ferguson, president of the Science and Public Policy Institute, which sponsored Monckton's recent appearance at Utah Valley University.

But, in their emails to Bickmore over the past few weeks, Monckton and Ferguson accuse the scientist at LDS Church-owned BYU of personal attacks, and both threaten him.

Ferguson ends a Thursday e-mail by hinting there might be repercussions through their shared faith.

"I trust you are gentleman and Christian enough to not bear such false witness," Ferguson concludes. "If not, I will seek both professional and ecclesiastical redress for 'conduct unbecoming'."

In an exchange a week earlier, Monckton said that Bickmore's "unjustifiable and gratuitous remarks about my habitual mendacity are to be drawn to the attention of the President of the University... to be investigated as a disciplinary matter." Monckton also said he had spoken with "some of the University's leading supporters" about Bickmore's role in the university's decision not to host Monckton's climate-change speech.

"This, too, I understand, is to be referred to the University as a disciplinary matter, since the University prides itself on allowing academic freedom," Monckton wrote.

Bickmore said Friday he is not aware of any investigation or disciplinary action. And university spokesman Michael Smart said none was in the works.

"Barry Bickmore is not and has not been under academic investigation," Smart said. "There is no basis for any accusation that he is."

Emails, Barry got Emails, and Bickmore bit back by writing to the House of Lords to ask if Monckton was a member, and got a succinct reply
"Christopher Monckton is not and has never been a Member of the House of Lords. There is no such thing as a 'non-voting' or 'honorary' member."
As Bickmore told the Salt Lake Tribune
The false claims undermine Monckton's credibility in a way that is easy for anyone to understand, said Bickmore. They open a window onto the skeptic's scientific claims, like his assertion that the Intergovernmental Panel on Climate Change is wrong about global warming.
Eli awaits details appearing at Real Climate, but the first bite was scrumptious

Saturday, April 10, 2010

Eli can retire Part VII - The EPA plays so's your old man

Well, well, well, Eli discovers that the EPA reads everything, and laughs at some of them. From the US EPA responses to challenges to its Endangerment Finding for increasing CO2 concentrations

Comment (1-12):
Several commenters (1924, 2898.1, 3214.1, 3330.1, 3389, 3446.2, 3560.1, 3679.1, 3748.1, 3969.1, and 4172) argue that EPA should base its endangerment finding on the recent Nongovernmental International Panel on Climate Change (NIPCC) report entitled Climate Change Reconsidered, instead of IPCC and CCSP reports.

Response (1-12):
EPA has reviewed and considered the NIPCC report and found that it lacks the rigorous procedures and transparency required to serve as a foundation for the endangerment analysis. A review of the NIPCC Web site indicates that the NIPCC report was developed by “two co-authors” and “35 contributors and reviewers” from “14 countries” (http://www.nipccreport.org/index.html). The organization does not appear to have established any procedures for author selection and provides no evidence that a transparent and open public or expert review was conducted. Thus, the NIPCC’s approach stands in sharp contrast to the clear, transparent, and open procedures of the IPCC, CCSP, USGCRP, and NRC. Relying on the work of the major assessment reports is a sound and reasonable approach. See Section III.A. of the Findings, “The Science on Which the Decisions Are Based,” for our response to comments on the use of the assessment literature and previous responses in this section regarding our treatment of new and additional scientific literature provided through the public comment process.

Although EPA sees no reason to base the endangerment analysis on the NIPCC, we did thoroughly review the report and the associated references. For EPA’s responses to comments and literature provided on specific climate science issues in the TSD, including the work of the NIPCC, please refer to the appropriate Response to Comment volumes.
Don't wanna mess with those guys

Friday, April 09, 2010

The sadists, the masochists and the scientists

Andy Revkin moves to the opinion side of the house, George Monbiot tries to pull out of the CRU hole, Stoat foams at the mouth about the whole thing

And (rather small beer by comparison) Monbiot thinks that climate scientists are like paedophiles (I exaggerate just a little for effect, you understand). But its still hopeless stuff, even if mt likes it. The best defence of Monbiot I can think of is that he is just using this incident to push his pet point of view with no great interest in reality, which is sad.
and, predictably, CP Snow's two cultures business comes up again, even after Eli dipped into that one.
So, what's going on, Eli asks?

Well, IEHO, there is a disjunction, but it's a strange one. Scientists can't figure out why reporters keep giving space to denialists who keep on burning them, and, of course, the denialists keep on burning them and the reporters keep on taking it.
Eli is pleased to note that Monbiot provided an important clue to what's happening. The reporters are the masochists, the denialists the sadists, and the scientists? We just want to be left alone to indulge in some interesting straight sex.

What's your pleasure?

Thursday, April 08, 2010

Eli can retire Part VI - Going where the sun don't shine

Ms. Rabett told Eli to get off his tired old, well, the bunnies know what, and go where the sun don't shine. So the Rabett dialed up the US EPA responses to challenges to its Endangerment Finding for increasing CO2 concentrations and considered the matter of solar influences

Comment (3-35):
A number of commenters (e.g., 0670) argue that the sun is the primary driver of global temperature changes. Several commenters (3323.1, 4003, 4041.1, 4932.1, and 5158) referred to a new 2009 paper by Scafetta and Willson suggesting that the IPCC used faulty solar data in dismissing the direct effect of solar variability on global temperatures. Commenters also cite other research by Scafetta and others that suggests that solar variability could account for up to 68% of the increase in Earth’s global temperatures. One commenter (1616.1) attributes 0.14°C of the warming since 1950 to increased solar irradiance, and another 25% of warming since 1979, as in Scafetta and West (2006) (3596.1). Another commenter (7031) states that the correlation between solar variations such as sunspots and global climate has been pointed out by several scientists, such as Scafetta and West (2008). A number of specific climate-related regional phenomena have been related by commenters (e.g., 3596.1) to solar variability, such as sea surface temperature, floods, droughts, monsoons, and North Atlantic drift ice.

Response (3-35):
We have reviewed the comments and the literature submitted and have determined that changes in solar irradiance are not a sufficient explanation for recent climate change. The contention that direct solar variability can explain recent warming is not supported by the bulk of the scientific literature. As the TSD notes, the IPCC Fourth Assessment Report estimates that changes in solar irradiance since 1750 are estimated to cause a radiative forcing of +0.12 (+0.06 to +0.30) W/m2, or approximately 5% of the combined radiative forcing due to the cumulative (1750–2005) increase in atmospheric concentrations of CO2, CH4, and N2O (2.30 W/m2 with an uncertainty range of +2.07 to +2.53 W/m2). The natural 11-year cycle of solar irradiance has a magnitude of less than 2 W/m2 at the distance of the Earth—which, once corrected for albedo and distribution over the surface area of the planet, is a magnitude of less than 0.35 W/m2.

In addition, Karl et al. (2009) state that “if most of the observed temperature change had been due to an increase in solar output rather than an increase in GHGs, Earth’s atmosphere would have warmed throughout its full vertical extent, including the stratosphere. The observed pattern of atmospheric temperature changes, with its pronounced cooling in the stratosphere, is therefore inconsistent with the hypothesis that changes in the Sun can explain the warming of recent decades. Moreover, direct satellite measurements of solar output show slight decreases during the recent period of warming.” A number of other recent studies also show results that contrast with the interpretation that solar variability is driving recent warming. Both Lockwood and Fröhlich (2008) and Lean and Rind (2009) show that the solar contribution to warming in recent decades has been small or negative, consistent with the IPCC attribution of most of the warming in recent decades to anthropogenic GHGs.

The attribution of components of historical climate change to solar activity involves a number of issues. The first is the actual reconstruction of historical solar activity: even for the last three decades there is some controversy, as is evident in the differences between Scafetta and Willson (2009), which uses a total solar irradiance composite from the Active Cavity Radiometer Irradiance Monitor (ACRIM) analysis of satellite data, and Lockwood and Fröhlich (2008), which uses a composite based on the Physikalisch-Meteorologisches Observatorium Davos (PMOD) analysis of satellite data. These two composites don’t even agree on the sign of the solar irradiance trend over this time period.

Lockwood and Fröhlich analyze both datasets and find that the ACRIM dataset is inconsistent with methods of historical reconstruction that have shown correlations between historical solar activity and climate. Krivova, Solanki, and Wenzler (2009) also find no evidence of an increase in total solar irradiance (TSI) between 1986 and 1996 using an analysis based on magnetograms. Scafetta and Willson, on the other hand, claim that the PMOD approach requires a correction of the data from the Earth Radiation Budget (ERB) experiment on the Nimbus-7 satellite, and that this correction has been rejected by one of the scientists on the Nimbus-7 team (D.V. Hoyt, personal communication to Scafetta, 2008). Neither dataset shows an increase of solar irradiance between the minima of 1986 and 2008, which would be required in order to explain warming over that period.

Therefore, reconstructions of recent solar variability do not agree with one another; one shows no trend, and under the Lockwood and Fröhlich reconstruction the solar contribution during this period would have been a cooling, not a warming, influence.

The second issue is that in order for solar irradiance to be a major driver of recent warming, there must be an amplification effect that operates on solar irradiance but not on forcing due to GHGs. Studies such as Scafetta (2009) rely on a significantly larger sensitivity factor for solar irradiance than is used for GHG climate sensitivity. Additionally, the Scafetta study relies on a “slow lag” solar response, and the timescale chosen has itself been the subject of dispute. The climate sensitivity Scafetta uses for this slow-lag response is 0.46 K/Wm-2. Note that this is a sensitivity to total solar irradiance; converted to a sensitivity per unit of globally averaged radiative forcing (multiplying by 4 for geometry and dividing by 0.7 for albedo), it becomes (4 × 0.46)/0.7 ≈ 2.6 K/Wm-2. This can be compared to the canonical climate sensitivity range of 2°C to 4.5°C for a doubling of CO2, or about 0.5 K/Wm-2 to 1.2 K/Wm-2. Additionally, Scafetta claims that solar variability accounts for most of the recent warming and that GHG sensitivity is on the low end of the range: this means that Scafetta is effectively claiming that sensitivity to solar variability is on the order of five times the sensitivity to forcing by GHGs, without a good mechanism to explain this extreme difference. Although it is not impossible that there are differences between solar- and GHG-induced changes, the evidence for an amplification of the magnitude needed to explain recent warming is weak. For example, while Meehl et al. (2009) find that an amplification of solar cycle variability is needed to explain certain patterns of tropical Pacific climate response, the authors note: “This response also cannot be used to explain recent global warming because the 11-year solar cycle has not shown a measurable trend over the past 30 years.”
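[Again, not part of the EPA response, just a sanity check in the same units. The 3.7 W/m2 for doubled CO2 and the 0.3 albedo are the standard assumptions, not numbers from the EPA text.]

# The same arithmetic as the paragraph above, spelled out.
ALBEDO = 0.3          # standard planetary albedo
F_2XCO2 = 3.7         # W/m2, canonical forcing for a doubling of CO2

# Scafetta's slow-lag sensitivity is quoted per W/m2 of *total* solar
# irradiance; convert it to a sensitivity per W/m2 of globally averaged
# forcing (multiply by 4 for geometry, divide by 1 - albedo):
scafetta_per_tsi = 0.46                                     # K per W/m2 of TSI
scafetta_per_forcing = scafetta_per_tsi * 4 / (1 - ALBEDO)  # ~2.6 K per W/m2

# The IPCC-style 2 to 4.5 C per doubling of CO2, expressed the same way:
ipcc_low, ipcc_high = 2.0 / F_2XCO2, 4.5 / F_2XCO2          # ~0.5 to ~1.2

print(round(scafetta_per_forcing, 1))              # 2.6
print(round(ipcc_low, 2), round(ipcc_high, 2))     # 0.54 1.22
print(round(scafetta_per_forcing / ipcc_high, 1))  # more than 2x the high end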

Moreover, the amplification needed there is nowhere near as large as the Scafetta sensitivity, and the behavior it explains is geographically localized, which is different from a global increase in sensitivity.

Some other authors also show correlations between solar variability and regional trends. Eichler et al. (2009) find a strong correlation between solar activity (as reconstructed from carbon-14 and beryllium-10 proxies) and temperatures in the Siberian Altai region. However, the authors note that “underlying physical processes are still not yet understood” in terms of amplifying a weak solar signal (in terms of radiative forcing) in order to see larger effects, and also that “[i]n large spatial scale hemispheric or global reconstructions the solar signal may therefore even vanish” because the “main effect of solar forcing is presumably on location, routes, and stability of atmospheric pressure systems, which all act on regional scales.” The conclusion of the Eichler work is that while solar activity was a main driver of temperature variations in the Altai region preindustrially, during the industrial period only CO2 concentrations show a significant correlation with the temperature record. They did find agreement with the Northern Hemisphere (NH) temperature reconstruction of Scafetta and West (2007), in that at most approximately 50% of the observed global warming in the last 100 years can be explained by the sun. Note that this 50% is an upper limit on the explanatory power of solar variability, and it applies to the full century. Therefore, for the last 50 years, this conclusion is still consistent with the IPCC (2007b) statement that “[m]ost of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations.”

Therefore, to summarize: attempting to attribute late-20th century temperature change mainly to solar variability requires choosing a specific solar dataset, assuming a simplified model with different “fast” and “slow lag” responses based on timescales from a controversial paper, and assuming that the climate system is several times more sensitive to changes in solar irradiance (or other, non-radiative changes in the sun) than it is to changes in GHG forcing. All three of these assumptions are counter to the conclusions of the IPCC and CCSP assessments and not viewed as established conclusions in the literature. While science in this area will continue to evolve, our review did not uncover any compelling alternatives to the science represented in the assessment literature, and summarized in the TSD.
Ho Ho Hummmm.

Greenhouse in a Java jar

While Ethon was out in Boulder pecking away, he thought it would be nice to bring Eli a gift. The PhET project has a huge number of really cool (the Bunnies ARE nerds :) science simulations, among which is the greenhouse in a Java jar, which, among other things, lets you play with the level of greenhouse gases, the solar insolation, the clouds, and more. Enjoy

As you change the conditions, the temperature of the surface changes, just as it does naturally.
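For the bunnies who prefer their toys in source form, here is a zero-dimensional energy-balance sketch of the same physics the sliders poke at. To be clear, this is not the PhET code, just a single-slab toy model; the "greenhouse" parameter is a made-up knob for the fraction of surface emission the atmosphere absorbs.

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def surface_temperature(insolation=1361.0, albedo=0.3, greenhouse=0.77):
    """Equilibrium surface temperature (K) of a single-slab toy model.

    'greenhouse' is the fraction of surface emission absorbed by the
    atmospheric slab: 0 gives the bare-rock ~255 K, ~0.77 lands near the
    observed ~288 K.
    """
    absorbed = insolation * (1.0 - albedo) / 4.0
    # A slab absorbing a fraction f of the surface emission re-radiates half
    # of it back down, so the surface balance is sigma*T^4*(1 - f/2) = absorbed.
    return (absorbed / (SIGMA * (1.0 - greenhouse / 2.0))) ** 0.25

print(surface_temperature(greenhouse=0.0))     # no greenhouse: ~255 K
print(surface_temperature())                   # default knobs: ~288 K
print(surface_temperature(insolation=1400.0))  # turn up the sun: warmer
print(surface_temperature(albedo=0.35))        # more reflection: cooler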

Wednesday, April 07, 2010

Muir Russell and the Wayback Machine


Eli has been thinking a bit about the Muir Russell Inquiry into the stolen Climatic Research Unit emails. One of the issues (and with McIntyre and his ilk about, when has it not been an issue) is what sharing of data and methods is required by the act of publication. Now Eli is an OLD bunny. Maybe not quite so old that he wrote his thesis with a quill pen, but old enough that he did the drawings in India ink and lettered them with Leroy templates. He remembers when the copying machine was a Rexograph, mimeographs being too expensive, and when you paid a couple of hundred bucks for the original and three carbon copies of your thesis. The original went to the university library, one copy to you, one copy to your adviser, and the third to University Microfilms, who, well, microfilmed it.

In those days you actually paid for reprints of your papers, because once the type was set, the offprints were cheaper than copying, and people actually sent you postcards from third world countries like England and Japan, begging for copies, which you mailed off, because you wanted to keep them coming to get the stamps to give to your kid brother, who was grateful for a nanosecond, and postage cost nothing or was paid for by the department.

There were NO data repositories, no hard discs, paper tape took a lot of room, and magnetic tape was something you oohed and aahed about. This, as all things, changed, and as it changed, journal requirements changed too. Today, Eli wandered into the depths of the library to look at some dead trees. Specifically Nature. Turns out that in 1996 Nature's requirements changed from

Nature requests authors to deposit sequence and crystallography data in the databases that exist for this purpose
those fields being the first to establish such data archives, to the current
Materials: As a condition of publication authors are required to make materials and methods used freely available to academic researchers for their own use. Supporting data sets must be made available on the publication date from the authors directly and by posting on Nature's web site, by deposition in the appropriate database or on the internet.
Most other journals have much less stringent policies, and these too have changed over time. Data retention is another area where "forever" is not the answer. In the US, NIH policy is
Period of retention. Data should be retained for a reasonable period of time to allow other researchers to check results or to use the data for other purposes. There is, however, no common definition of a reasonable period of time. NIH generally requires that data be retained for 3 years following the submission of the final financial report. Some government programs require retention for up to 7 years. A few universities have adopted data-retention policies that set specific time periods in the same range, that is, between 3 and 7 years. Aside from these specific guidelines, however, there is no comprehensive rule for data retention or, when called for, data destruction.
King's College London has a flow chart for the engineering bunnies where the recommendation is seven years of retention for funded research and four for unfunded.

All this goes to the accusations against Phil Jones and the CRU for "destroying data". It's been clearly established that the CRU was never a repository for data from the National Meteorological Services, but there has been plenty of noise that they had an obligation to plasticize every piece of paper in the place.

Nonsense. As Jones wrote:

No one, it seems, cares to read what we put up on the CRU web page. These people just make up motives for what we might or might not have done.

Almost all the data we have in the CRU archive is exactly the same as in the Global Historical Climatology Network (GHCN) archive used by the NOAA National Climatic Data Center [see here and here].

The original raw data are not “lost.” I could reconstruct what we had from U.S. Department of Energy reports we published in the mid-1980s. I would start with the GHCN data. I know that the effort would be a complete waste of time, though. I may get around to it some time. The documentation of what we’ve done is all in the literature.

If we have “lost” any data it is the following:

1. Station series for sites that in the 1980s we deemed then to be affected by either urban biases or by numerous site moves, that were either not correctable or not worth doing as there were other series in the region.

2. The original data for sites for which we made appropriate adjustments in the temperature data in the 1980s. We still have our adjusted data, of course, and these along with all other sites that didn’t need adjusting.

3. Since the 1980s as colleagues and National Meteorological Services (NMSs) have produced adjusted series for regions and or countries, then we replaced the data we had with the better series.

In the papers, I’ve always said that homogeneity adjustments are best produced by NMSs. A good example of this is the work by Lucie Vincent in Canada. Here we just replaced what data we had for the 200+ sites she sorted out.

The CRUTEM3 data for land look much like the GHCN and NASA Goddard Institute for Space Studies data for the same domains.

Apart from a figure in the IPCC Fourth Assessment Report (AR4) showing this, there is also this paper from Geophysical Research Letters in 2005 by Russ Vose et al. Figure 2 is similar to the AR4 plot.

I think if it hadn’t been this issue, the Competitive Enterprise Institute would have dreamt up something else!

Yes indeedy