Thursday, March 25, 2010

CLearn's Misrepresentations

This site, a supposed climate simulator, was developed to help those of us who cannot possibly understand their complex climate models, as a way of showing the threat due to AGW.

John Sterman, one of the authors, put it this way in the TV show The Agenda:

"The strong scientific consensus about the causes and risks of climate change stands in stark contrast to widespread confusion and complacency among the public. Science is clearly not the bottleneck to action. Many people believe we can "wait and see" whether climate change will turn out to be very harmful to human welfare, and, if so, then take action. Wait and see works well in simple systems, with short time delays. But the climate is a complex dynamic system with multiple feedback processes, accumulations, nonlinearities and very long time delays. If climate change turns out to cause serious damage to our welfare, it will be too late to take action. The only way people can explore the dynamics of such large, complex systems is through simulations. Existing large-scale climate models, as wonderful as they are, weren't designed to facilitate learning among policymakers and the public, to whom they are black boxes. And the cycle time for running simulations is too long to be useful in the fast-paced world of negotiations and policy, or in a sound-bite world of channel surfing and short attention spans. The C-ROADS climate policy model we developed runs essentially instantly, so people get immediate feedback on the consequences of their assumptions and can try many experiments in a short time."

Let me see if I understand this correctly. Climate models are very complex and take a long time to churn out a scenario, which is too long and too complex for the public. So they "dumbed" it down so it runs the same simulation in an instant? Really?

Give me a break. All they are doing is pushing the final result even further from reality, and peddling it as reality. What a fraud.

Let's have a look at specific examples.

As with all simulations where you get to input the values, the limits and assumptions can be exposed by putting in extreme numbers and looking at the results. So I did just that.

At the bottom of this page you can enter a goal for the maximum CO2 you want and see how it affects the graphs. So I picked 7000 ppm (20 times today's level, as it was 250 million years ago). Nothing happened. No change. So I tried roughly double today's level, 700 ppm. Again, no change. What gives?

So I tried the bottom of the range needed to support plant life, 150 ppm. No change after running.

If you go to the Data section, it shows the data generated by each run. All identical even if I choose 450 or 250 for my CO2 targets. Something is definitely not right here.

Sea level rise is one of my favorite AGW items to debunk because the real in situ measured data is clear. The rate since 1900 is 1.74 mm/year, with the current TOPEX and Jason satellite data showing 3.2 to 2.4 mm/year since 1998.

This simulation generates the exact same sea level rise outcome regardless of the CO2 I enter.

I copied the data and pasted it into Excel. I wanted to see what acceleration rate they were using, that is, the rate of increase in the rate of increase. To get to the target of 2 meters in 100 years, the acceleration of the rate of increase has to be about 4% per year.

By 2100 their increase is 896.3 mm, or 0.896 meters. Half the scare level. That means their acceleration has to be around 2%. But it's not. It's really weird.
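The Excel check described above can be sketched in a few lines of code. The numbers below are a hypothetical series for illustration, not the simulator's actual output: given a column of cumulative sea-level values, it recovers the year-over-year rate and the growth of that rate, which is the "acceleration" being tested.

```python
def rates_and_growth(levels_mm):
    """levels_mm: cumulative sea level (mm), one value per year.
    Returns the year-over-year rates (mm/yr) and the fractional
    growth of those rates, i.e. the acceleration being tested."""
    rates = [b - a for a, b in zip(levels_mm, levels_mm[1:])]
    growth = [r2 / r1 - 1.0 for r1, r2 in zip(rates, rates[1:]) if r1 != 0]
    return rates, growth

# Hypothetical series: a 3.0 mm/yr rate compounding at 2% per year
levels, rate = [0.0], 3.0
for _ in range(10):
    levels.append(levels[-1] + rate)
    rate *= 1.02

rates, growth = rates_and_growth(levels)
```

Run on a series built with a steady 2% growth, the recovered growth is flat at 2%; run on the simulator's pasted data, any saw-tooth or sudden 2050 drop in `growth` shows up immediately.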

Can someone explain the mathematical formula that accelerates the rate of sea level rise in a saw-tooth manner and then, suddenly in 2050, has the rate decrease?

This has gotta be bogus data. Well, of course it is because this apparent acceleration doesn't exist!

This is their data from 2000 to 2010.

With their acceleration, their current rate comes to 3.88 mm/year. One problem: the measured sea level rate from TOPEX and Jason is 2.4 mm/year! So reality has already shown this site to be a fraud.

Of course, a prediction is not evidence, nor a certainty of what the future will be. Passing this nonsense off as such is also fraud.

Now temperature. This is the graph they present of what the future temperature will be. It barely changes with different CO2. Notice anything missing from this plot?

No error bars!

No degrees of probability. Instead, a line of certainty. It is not physically possible to know the future global mean temperature with that certainty. Thus omitting the error bars, the uncertainty of the simulation, is a fraud. No one in science omits error bars; you would fail physics 101 if you did.

Wednesday, March 24, 2010

Specific Stations: 4333 (Ottawa)

These are the temperature and precipitation graphs for Station 4333, Ottawa.

Note that it is doing what the other stations in Southern Ontario are doing: cooling summers and less-cold winters. Typical Narrowing.

This is the number of days above 30C:

Clearly dropping. Almost half the number of days above 30C compared with the 1920s. Notice the relative stability since the end of the 1940s.

This is the number of days below -20C:

Clearly dropping. Fewer really cold days. Half the number of days below -20C. Good for your heating bills!

This is the number of days below zero:

Clearly dropping. The growing season is 30 days longer than it was in the early 1900s. Good for our gardens.

This is the precipitation data for this station:

There is a slight increase in rain, and a slight decrease in snow. This makes sense: with half as many days below zero, those early spring and late fall days that used to snow now rain. However, the total precipitation is not beyond what has happened in the past; it is a cycle.

Tuesday, March 23, 2010

Summer Seasonal Precipitation Trends

Summer (June, July and August):

British Columbia:

VAVENBY (note the significant drop off in rain in the last two years, something's up with that).

Central Canada:
Calgary Airport. Though it has the longest temperature record for Alberta, its precipitation data ends in 1988.


Birtle, MB

Southern Ontario:

Southern Quebec:

East Coast:
Cartwright, NFD

Over All Average:
This is the summer average of all stations, not just those above, including stations with short datasets:

The overall average tracks the number of stations per year too closely, and hence this last graph may or may not mean anything.

Spring Seasonal Precipitation Trends

The next 4 posts will look at precipitation data for the same locations as the yearly precipitation posts, but broken out by season.

First, Spring (March, April and May):

British Columbia:

VAVENBY (note the significant drop off in rain in the last two years, something's up with that).

Central Canada:
Calgary Airport. Though it has the longest temperature record for Alberta, its precipitation data ends in 1988.

MUENSTER, Sask (this too shows a significant drop off in rain and snow in the last 2 years)

Birtle, MB

Southern Ontario:

Southern Quebec:

East Coast:
Cartwright, NFD

Over All Average:
This is the spring average of all stations, not just those above, including stations with short datasets:

It is clear from this average of all stations that, overall, there is nothing unusual in Canada's precipitation.

Sunday, March 21, 2010

No Acceleration in Sea Level Rise.

Proponents of AGW throw out the scare of rising sea levels in the future due to thermal expansion and melting ice around the world.

The official IPCC position is:

"Global average sea level rose at an average rate of 1.8 [1.3 to 2.3] mm per year over 1961 to 2003. The rate was faster over 1993 to 2003, about 3.1 [2.4 to 3.8] mm per year. Whether the faster rate for 1993 to 2003 reflects decadal variability or an increase in the longer-term trend is unclear"

This "faster rate" from 1993 to 2003, only 10 years, comes from TOPEX and Jason measurements:

A paper by Holgate showed that the rate fluctuated over the past 110 years of known in situ measurements. The abstract reads:

Nine long and nearly continuous sea level records were chosen from around the world to explore rates of change in sea level for 1904–2003. These records were found to capture the variability found in a larger number of stations over the last half century studied previously. Extending the sea level record back over the entire century suggests that the high variability in the rates of sea level change observed over the past 20 years were not particularly unusual. The rate of sea level change was found to be larger in the early part of last century (2.03 ± 0.35 mm/yr 1904–1953), in comparison with the latter part (1.45 ± 0.34 mm/yr 1954–2003). The highest decadal rate of rise occurred in the decade centred on 1980 (5.31 mm/yr) with the lowest rate of rise occurring in the decade centred on 1964 (−1.49 mm/yr). Over the entire century the mean rate of change was 1.74 ± 0.16 mm/yr. (Holgate, 2007)

You can see this clearly in a number of in situ stations from around the world, such as these:


Global stations can be found here. However, you have to be careful to pick a location that is on a stable craton (land mass), because tectonics alters the apparent rate at locations where there is subduction, glacial rebound or rifting.

In all cases, not one global station shows any acceleration in the rate of sea level rise.

In fact, this study shows no change in many of the Pacific islands, including Tuvalu, the island that claims it is being inundated and swallowed up by the sea. Tuvalu is the poster child of sea level rise, yet this study states:
"If the depression of the 1998 cyclone is ignored there was no change in sea level at Tuvalu between 1994 and 2008; 14 years, despite 14 separate tsunami events. The claim of a trend of +6.0 mm/yr is without any justification."

Maldives is another poster child, with the island's President at Copenhagen making his claim of being swallowed up. Yet this study finds no increase:

"The Maldives have a uniquely position in sea level research (as discussed in Integrated Coastal Zone Management, No. 1, 2000, p. 17-20). In the last decade, they have attracted special attention because, in the IPCC-scenario, the Maldives would be condemned to become flooded in the next 50-100 years. Our research data do not lend support to any such flooding scenario, however. On the contrary, we find no signs of any on-going sea level rise. Our results comes from visits to numerous islands including extensive work on Hulhudoo and Guidhoo in the north, in Viligili and Loshfuchi (the site of "the reef woman") in the middle, and in Addu in the south. This includes coring, levelling, sampling and dating (35 C14-dates). Present sea level was reached at about 4500 BP. In the last 4000 years, sea level oscillated around the present. At 3900 BP, there was a short and sharp sea level high-stand at about +1.2 m. For the last millennium, a detailed sea level record is established: +0 m 1000-800 BP, +60 cm 800-300 BP, 0 to just below 0 in the 18th century AD, +30 cm 1790-1970 AD, fall to 0 in ~1970 up to today. At about 1970, sea level fell by 20-30 cm (presumably due to increased evaporation). This is recorded in storm level, high-tide level, mean sea level and in lake and lagoon levels (from the north to the south). In the last decade, there are no signs of any rise in sea level. Hence, we are able to free the islands from the condemnation to become flooded in the 21st century (bold added)."

Sea level rise has two components: steric and mass. Steric covers the salinity and thermal effects that change the volume of a given amount of water, whereas mass is a change in the actual amount of water in the ocean. Steric is influenced by the temperature of the seas, and mass is changed by the inflow of water from ice sources.

A summary of this can be found here:

Even though TOPEX shows 3.2 mm/yr since 1993, this paper, which closes the budget between the two components, shows a rate of 1.8 mm/yr, in close agreement with Holgate.

Thus no acceleration.

But this has not stopped AGW proponents from claiming there is an acceleration:
"Multi-century sea-level records and climate models indicate an acceleration of sea-level rise, but no 20th century acceleration has previously been detected. A reconstruction of global sea level using tide-gauge data from 1950 to 2000 indicates a larger rate of rise after 1993 and other periods of rapid sea-level rise but no significant acceleration over this period. Here, we extend the reconstruction of global mean sea level back to 1870 and find a sea-level rise from January 1870 to December 2004 of 195 mm, a 20th century rate of sea-level rise of 1.7 ± 0.3 mm yr−1 and a significant acceleration of sea-level rise of 0.013 ± 0.006 mm yr−2. This acceleration is an important confirmation of climate change simulations which show an acceleration not previously observed. If this acceleration remained constant then the 1990 to 2100 rise would range from 280 to 340 mm, consistent with projections in the IPCC TAR." (Church & White, 2006)

I fail to see how an acceleration of 0.013 mm/yr², about 0.76% of the 1.7 mm/yr rate per year, constitutes a "significant" acceleration!!

It has also not stopped them from claiming future increases far beyond this average. We have seen all the wild speculations: 20 feet in 100 years, 50 feet in 100 years, etc. Anything to make the scariest claims. But how realistic are any of these? It's simple to show. One only has to plot the rate over the past 110 years of in situ measurements and then extrapolate the various scenarios.

As you can see, to get a linear run up to 2 meters in 100 years would require an instant jump in the current rate of rise from 1.74 mm/year to 20 mm/year. The longer this snap-up fails to happen, the steeper the rate to reach 2 meters must be.

But of course that is not what these guys claim. Their claim is a gradual increase, as in the curved red line. But what they don't tell you is that this is a growth curve. It is simple enough to show what the growth rate must be to reach 2 meters in 100 years. You simply put the data into a spreadsheet and have it calculate the growth required to meet that target. It's about 4% per year. At that rate the doubling time is around 18 years, not 9.

This means that roughly every 18 years the rate of sea level rise would double from what it was 18 years before. This is the only way to get from here to 2 meters in 100 years. By the last year, 2099, the rate of sea level rise would have to be a staggering 80 mm/year or so: nearly half of all the rise seen in the previous 110 years, in that single year. Sorry, that is just not credible.
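The spreadsheet calculation can be reproduced with a simple bisection. The assumptions are stated up front: a starting rate of 1.74 mm/yr (Holgate's century mean), a 100-year horizon, and compound growth of the rate; the solver finds the growth needed to accumulate 2 meters.

```python
import math

TARGET_MM = 2000.0   # 2 metres of total rise
YEARS = 100
START_RATE = 1.74    # mm/yr, the observed 20th-century mean rate

def total_rise(g):
    """Cumulative rise after YEARS years if the rate starts at
    START_RATE and grows by a factor (1 + g) every year."""
    return sum(START_RATE * (1.0 + g) ** t for t in range(YEARS))

# Bisect for the growth rate g that hits the 2 m target
lo, hi = 0.0, 0.2
for _ in range(60):
    mid = (lo + hi) / 2.0
    if total_rise(mid) < TARGET_MM:
        lo = mid
    else:
        hi = mid
g = (lo + hi) / 2.0

doubling_years = math.log(2.0) / math.log(1.0 + g)  # rule-of-72 style
final_rate = START_RATE * (1.0 + g) ** (YEARS - 1)  # rate in year 100
```

Under these assumptions the solver lands near 4% per year, a doubling time of about 18 years, and a final-year rate near 80 mm/yr; a different starting rate or horizon shifts the exact figures but not the conclusion.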

In September 2008, three researchers, all supporters of AGW, dropped a bombshell paper in which they calculated the maximum possible rise in sea level by 2100.

"On the basis of climate modeling and analogies with past conditions, the potential for multimeter increases in sea level by the end of the 21st century has been proposed. We consider glaciological conditions required for large sea-level rise to occur by 2100 and conclude that increases in excess of 2 meters are physically untenable. We find that a total sea-level rise of about 2 meters by 2100 could occur under physically possible glaciological conditions but only if all variables are quickly accelerated to extremely high limits. More plausible but still accelerated conditions lead to total sea-level rise by 2100 of about 0.8 meter. These roughly constrained scenarios provide a "most likely" starting point for refinements in sea-level forecasts that include ice flow dynamics." (Pfeffer et al., 2008)

This set off a scurry of activity at RealClimate, who were not too happy with the paper. Pfeffer even had to come to RC to defend it.

But of course, we all know projections of what MIGHT be are a far cry from what WILL be. The current TOPEX rate of 3.2 mm/year is just one increase of several that Holgate noted over the past 110 years. Until that rate goes beyond that normal variation, there is no scientific justification for claiming a true acceleration.

Here is a case in point. This paper claims acceleration at Brest France:

"Three linear trend periods can indeed be distinguished in the Brest MTL time series over the period 1807–2004:
(a) 1807–1890, over which the sea level rate is estimated at −0.09±0.15 mm/year
(b) 1890–1980, at +1.3±0.15 mm/year
(c) 1980–2004, at 3.0±0.5 mm/year" (Wöppelmann et al., 2006)

So let's have a look at these "accelerations" on the graph. (I emailed Wöppelmann questioning this "acceleration" and asking for his data. He did not reply. Sound familiar?)

It is clear from this graph that there have been other times in the past where the rate was higher in short periods than it is today. It’s a huge leap of faith to claim these three periods are any kind of significant acceleration.

They note that this "acceleration" is "…estimated at 0.0071±0.0008 mm/year² over the period 1807–2004 using a simple quadratic least-squares adjustment." With an overall rate of 1.00 mm/year for Brest, an "acceleration" of 0.0071 is 0.71% growth per year. At that rate of growth the most sea level can rise in 100 years is about 23 cm, roughly 9 inches, the upper end of the IPCC prediction. That's a far cry even from the 80 cm "more plausible" case of Pfeffer et al.

Suffice it to say, there is no credible evidence for any kind of acceleration in sea level rise, which AGW theory requires to happen. It's just not there. Those such as Church and Wöppelmann are grasping at straws. This is not only unscientific, it is disgraceful to try to deceive the public and policy makers this way. It is clear to anyone that there is no acceleration.

"there is no discernible divergence in the rate of sea-level rise over the past two centuries to suggest a connection with the documented increase in atmospheric CO2 concentration… the rate of sea-level rise has been linear over this time period and shows no indication of the pronounced mid-20th-century increase" (Larsen and Clark, 2006)

Church, J. A., and N. J. White (2006), A 20th century acceleration in global sea-level rise, Geophys. Res. Lett., 33, L01602, doi:10.1029/2005GL024826.

Holgate, S. J. (2007), On the decadal rates of sea level change during the twentieth century, Geophys. Res. Lett., 34, L01602, doi:10.1029/2006GL028492.

Larsen, C.E. and I. Clark. 2006. A search for scale in sea-level studies. Journal of Coastal Research, 22(4) ,788–800.

Wöppelmann, G., N. Pouvreau, and B. Simon (2006), Brest sea level record: a time series construction back to the early eighteenth century, Ocean Dynamics, 56, 487–497.

Wöppelmann, G., B. Martin Miguez, M.-N. Bouin, and Z. Altamimi. (2007) Geocentric sea-level trend estimates from GPS analyses at relevant tide gauges world-wide. Global and Planetary Change, 57, 396–406.

Pfeffer, W. T., J. T. Harper, and S. O'Neel (2008), Kinematic Constraints on Glacier Contributions to 21st-Century Sea-Level Rise, Science, 321(5894), 1340–1343.

Precipitation Trends for All Canada

You cannot add together the precipitation from different locations; you cannot even average them, because different geographical areas get different amounts depending on the topography and proximity to lakes. For example, the area west of the Niagara Escarpment gets more snow and rain because of winds off Lake Huron. Streamers can dump as much as 2 feet of snow in one run in this area while Toronto gets nothing. Thus the deposit of snow and rain involves local factors in the initiation of precipitation and has nothing to do with global temperatures.

That said, we will look at stations with at least 80 years of data to see if there is any change in precipitation.

British Columbia:


Central Canada:
Calgary Airport. Though it has the longest temperature record for Alberta, its precipitation data ends in 1988.


Birtle, MB

Southern Ontario:

Southern Quebec:

East Coast:
Cartwright, NFD

Over All Average:
This is the average of all stations, not just those above, including stations with short datasets:

It is clear from this average of all stations that, overall, there is nothing unusual in Canada's precipitation. As noted above, however, one cannot put too much credence in this graph because not all stations are represented in all years. In fact, the number of stations per year with precipitation data looks like this:

This means that in the early and recent years, the precipitation average will be skewed by the geographical and regional differences among the stations that are present. Thus the all-Canada average is unreliable for determining trends.

All these show are changes over time. There is nothing to link them with AGW or CO2 emissions. It's just normal, random, long-term change that would happen whether we emitted CO2 or not.

The next post on precipitation will look at the rain and snowfall for the various months to see if there are seasonal shifts.

Wednesday, March 3, 2010

Playing with Anomalies

When you see plots, you will mostly see them as an anomaly. What this means is that a baseline is computed from the dataset, and that baseline is subtracted from every subsequent value. The choice of the baseline is arbitrary; it will be a range between two years.

The UK Met office explains anomaly as:

Absolute temperatures are not used directly to calculate the global-average temperature. They are first converted into ‘anomalies’, which are the difference in temperature from the ‘normal’ level. The normal level is calculated for each observation location by taking the long-term average for that area over a base period. For HadCRUT3, this is 1961–1990.

For example, if the 1961–1990 average September temperature for Edinburgh in Scotland is 12 °C and the recorded average temperature for that month in 2009 is 13 °C, the difference of 1 °C is the anomaly and this would be used in the calculation of the global average.

One of the main reasons for using anomalies is that they remain fairly constant over large areas. So, for example, an anomaly in Edinburgh is likely to be the same as the anomaly further north in Fort William and at the top of Ben Nevis, the UK’s highest mountain. This is even though there may be large differences in absolute temperature at each of these locations.

The anomaly method also helps to avoid biases. For example, if actual temperatures were used and information from an Arctic observation station was missing for that month, it would mean the global temperature record would seem warmer. Using anomalies means missing data such as this will not bias the temperature record.

So they calculate a base line for each station for each month. The claim is that this levels out the absolute temps.
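The per-station, per-month baseline procedure described above can be sketched directly. The numbers below are hypothetical monthly means, not real station data; they just show the mechanics of subtracting a per-month baseline.

```python
def monthly_anomalies(records, base_years):
    """records maps year -> list of 12 monthly mean temps (deg C).
    The baseline is the per-month average over base_years; the
    anomaly is each value minus its month's baseline."""
    baseline = [sum(records[y][m] for y in base_years) / len(base_years)
                for m in range(12)]
    anoms = {y: [temps[m] - baseline[m] for m in range(12)]
             for y, temps in records.items()}
    return anoms, baseline

# Hypothetical station: 1962 is 2 C warmer than 1961 in every month,
# and 2009 matches 1962, so 2009 sits 1 C above the 1961-1962 baseline
records = {
    1961: [-8.0, -6.0, 0.0, 7.0, 14.0, 19.0, 21.0, 20.0, 15.0, 8.0, 1.0, -6.0],
    1962: [-6.0, -4.0, 2.0, 9.0, 16.0, 21.0, 23.0, 22.0, 17.0, 10.0, 3.0, -4.0],
    2009: [-6.0, -4.0, 2.0, 9.0, 16.0, 21.0, 23.0, 22.0, 17.0, 10.0, 3.0, -4.0],
}
anoms, baseline = monthly_anomalies(records, [1961, 1962])
```

The HadCRUT3 version is the same idea with 30 baseline years (1961-1990) and thousands of stations gridded together.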

Here is the range of temps for July 1980 for Southern Ontario. This year was picked because all but 3 stations have data for it, the most of any year.

You can see that there is a 5C difference around the province and those differences tend to be geographic, well, somewhat. This then could be considered the "type" temperature regime for the region for July. The yellow dot is the average of all those stations and represents the balance point for the region. Were these weights and not temps, you could balance the region on the end of your finger on that point. (one would have to check every month to see if it follows the same pattern, wanna bet it doesn't?). But this is only the "type", that location is only the balance point, because 1980 happens to be only 1 of 3 years, out of more than 100 years, that has this many stations represented.

In this example Harrow is the hottest location; however, that station has only a small number of years. This means that for years when Harrow is not represented, the balance point will have moved. The fewer the stations, the more skewed the balance point can be.

So to counter this problem, the anomaly is calculated for each region. So let's see how that works.

For Ottawa (stn 4333) I calculated the "normal" base line using their date range (1961-1990) for each month. I then linked the raw average data for that station to this recordset and got the temperature monthly anomaly above. The moving average is 120 months (10 years).

Then I averaged the monthly baseline to give a baseline for the entire years 1961-1990, and plotted that with the average of the yearly means to give this anomaly. The moving average is 10 years.

Hold on a second. How can the slopes be different? The yearly anomaly trend regression line slope is TEN TIMES larger than the monthly regression line. However, look at the two on top of each other.

The reason for the slope difference is the unit of the x-axis. The monthly series has 12 × 109 points, one per month, whereas the yearly series has just 109, one per year, so a trend of X degrees per year shows up as roughly X/12 degrees per month. Either method can be used to generate the anomaly graphs, so for simplicity we will use the yearly normal plots.
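That unit effect is easy to demonstrate with a synthetic series. This sketch fits the same noiseless 0.01 C/year trend once at monthly resolution and once at yearly resolution; the only assumption is the made-up trend itself.

```python
def slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

YEARS = 109
# A pure 0.01 C/year warming trend, no noise, sampled every month
monthly = [0.01 * m / 12.0 for m in range(YEARS * 12)]
slope_per_month = slope(range(YEARS * 12), monthly)

# The same data averaged into yearly means
yearly = [sum(monthly[y * 12:(y + 1) * 12]) / 12.0 for y in range(YEARS)]
slope_per_year = slope(range(YEARS), yearly)

ratio = slope_per_year / slope_per_month  # one x-step is 12x longer
```

The two fits describe the identical trend; only the units of the slope differ, by a factor of 12.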

As we have seen in other posts, what is important is not the trend of the average of the yearly means, but the trends of the hottest and coldest days of the year, those two converging trends that must be part of a cycle. Let's see anomalies of those:

This is the anomaly for the max temps for each year:

This is the average of the actual max temps for each year:

They match up pretty closely, especially their slopes and R². I would expect this. Except for one thing. Both the average and the anomaly claim that the max temps of the year are increasing. But for this location, they are not. The summer temps are decreasing. This average of the max temps is for the entire year, and we know that the winters are warming, so that is what pulls the max temps up. In other words, this is not a depiction of range, but the same old problem of AVERAGES being influenced by changes in the ranges within each year.

So we can check that. Let's see what these anomalies look like using the highest of the maximum temperatures. This is the anomaly of each years highest maximum (summer) temps.

It shows that the hottest temperatures of the year are increasing, getting hotter. But when you plot the ACTUAL highest maximum temperatures in each year you get this:

It's decreasing!! Thus anomalies, because they are a comparison to a "base line" of years, can produce a trend that in reality is not there. Thus anomalies cannot show us what is physically going on within the extreme ranges of the years. And it is those extreme ranges that AGW proponents claim are changing for the worst. They're not. Cooler summers (not as many heat waves) and less-cold winters (fewer deep-freeze days) is not bad at all, but quite good.

Let's see what happens if we choose a different baseline, say the entire range of data (why they chose 1961-1990 is a mystery).

Pretty close to the average mean anomaly in the second graph from the top. The slopes are close; so is the R². So why restrict the baseline to a subset of the records, and not the entire set, if they give the same results? Let's try a really short range for the base, say a period where temps were increasing (the range they chose, 1961-1990, sits inside the drop in temps during the 1945-1975 period). So we will choose the range 1980-2000, the so-called hottest period.

Notice there is no change in the slope, the shape of the graph, or the R². All that has changed is the b in y = mx + b, which is expected: all you are doing by changing the base years is shifting the graph up or down.

So there is no mathematical reason to choose a specific range of years. If you want to see the anomaly for any given station to compare to other stations in different regions, then one should use the entire range of years to get the base line.
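The baseline-shift argument can be verified with a synthetic station record. The series below is hypothetical (a small trend plus a deterministic wiggle); changing the baseline window moves the anomaly curve up or down by a constant but leaves the fitted slope untouched.

```python
def anomaly_series(values, years, base_start, base_end):
    """Anomaly = value minus the mean over the chosen baseline window."""
    base = [v for v, y in zip(values, years) if base_start <= y <= base_end]
    return [v - sum(base) / len(base) for v in values]

def fit(xs, ys):
    """Least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return m, my - m * mx

years = list(range(1900, 2010))
# Hypothetical yearly means: a slow trend plus a 7-year wiggle
temps = [5.0 + 0.005 * (y - 1900) + 0.1 * ((y % 7) - 3) for y in years]

a1 = anomaly_series(temps, years, 1961, 1990)  # 1961-1990 style window
a2 = anomaly_series(temps, years, 1900, 2009)  # the entire record

m1, b1 = fit(years, a1)
m2, b2 = fit(years, a2)
# Same slope either way; only the intercept shifts with the baseline
```

Since each baseline subtracts a single constant from the whole series, every choice of window gives the same trend, differing only in where zero sits.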

Part two of this will show how the anomalies differ from region to region to see if their premise of anomalies as a tool to see the over all change is true.

About Me

jrwakefield (at) mcswiz (dot) com