
Monday, September 23, 2019

Canada’s global warming models threw out actual historical data and substituted models of what the temperature should have been

Environment Canada, led by Justin Trudeau-appointed Environment Minister Catherine McKenna, is all-in on the hypothesis that manmade global warming is an existential threat to humanity. Handing control of energy use to the government is apparently so important that actual historical data that might raise doubt about the extent of purported warming over time must be thrown out and replaced by “models” of what the “scientists” think the historical temperature record must have been.

In other words, the computer models Canada uses to measure and project “global warming” are themselves based on other computer models. The expression “garbage in, garbage out” refers to the vulnerability of any computer model to poor-quality input data, which raises the awkward question of the quality of the models used in place of actual historical data. It also raises the question of why this scrapping of actual data and substitution of guesses (a.k.a. models) was not made clear from the outset.
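To make the phrase “modelled historical data” concrete, here is a minimal sketch of the general idea, not Environment Canada’s actual method. It assumes, purely for illustration, that the substituted record is a simple per-year ensemble mean of simulated series; the station, the numbers, and the averaging choice are all invented, and the article does not say how the 24 models were actually combined.

import statistics

# Hypothetical illustration only: standing in a multi-model "historical
# simulation" ensemble mean for an observed record. All values and the
# averaging method are invented for this sketch.

# Observed annual mean temperatures (C) for a fictional station, 1950-1954
observed = {1950: 5.1, 1951: 4.8, 1952: 5.6, 1953: 5.0, 1954: 4.7}

# Output of three hypothetical historical simulations (the article cites 24)
simulated = {
    "model_a": {1950: 4.6, 1951: 4.9, 1952: 5.0, 1953: 5.2, 1954: 5.3},
    "model_b": {1950: 4.8, 1951: 5.0, 1952: 5.1, 1953: 5.3, 1954: 5.4},
    "model_c": {1950: 4.5, 1951: 4.7, 1952: 4.9, 1953: 5.1, 1954: 5.2},
}

# "Modelled historical data": the per-year ensemble mean, used in place of
# the observed series.
modelled = {
    year: statistics.mean(run[year] for run in simulated.values())
    for year in observed
}

for year in sorted(observed):
    print(f"{year}: observed {observed[year]:.1f} C, modelled {modelled[year]:.2f} C")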

We only know about this fundamental issue because of the efforts of an intrepid reporter in Ottawa, who digs through Canadian government documents. Lorrie Goldstein explains in the Toronto Sun:
Canadians already suspicious of Prime Minister Justin Trudeau’s carbon tax are likely to be even more suspicious given a report by Ottawa-based Blacklock’s Reporter that Environment Canada omitted a century’s worth of observed weather data in developing its computer models on the impacts of climate change.

The scrapping of all observed weather data from 1850 to 1949 was necessary, a spokesman for Environment Canada told Blacklock’s Reporter, after researchers concluded that historically, there weren’t enough weather stations to create a reliable data set for that 100-year period.

“The historical data is not observed historical data,” the spokesman said. “It is modelled historical data … 24 models from historical simulations spanning 1950 to 2005 were used.”

These computer simulations are part of the federal government’s ClimateData.ca website launched by Environment Minister Catherine McKenna on Aug. 15.

She described it as “an important next step in giving our decision-makers even greater access to important climate data for long-term planning. The more each of us uses this type of information, the more it will help.”

Blacklock’s Reporter notes that in many cases the data that were scrapped indicated higher temperatures in the past:
For example, Vancouver had a higher record temperature in 1910 (30.6C) than in 2017 (29.5C).

Toronto had a warmer summer in 1852 (32.2C) than in 2017 (31.7C).

The highest temperature in Moncton in 2017 was four degrees cooler than in 1906.

Brandon, Man., had 49 days in 1936 when the average daily temperature was above 20C, compared to only 16 in 2017, with a high temperature of 43.3C in 1936 versus 34.3C in 2017.

Those of us who are castigated as “science deniers” for questioning the output of the models forecasting doom must point out that real scientists don’t hide or downplay the sources of the data they use as inputs; they are completely...

Read more: HERE
