Category Archives: Data quality

Day/Night temperature spread fails to confirm IPCC prediction

By David Mason-Jones, 

Research by Dr. Lindsay Moore

The work of citizen scientist Dr. Lindsay Moore has failed to confirm an important IPCC prediction about what will happen to the spread between maximum and minimum temperatures due to the Enhanced Greenhouse Effect. The IPCC’s position is that this spread will narrow as a result of global warming.

Moore’s work focuses on the remote weather station at Giles in Western Australia, which is run by Australia’s peak weather monitoring body, the Bureau of Meteorology (BoM).

Why Giles? 

Giles is the most remote weather station in mainland Australia and its isolation in a desert makes it an ideal place to study the issue of temperature spread. It is virtually in the middle of the continent. It is far from influencing factors such as the Urban Heat Island effect, land-use changes, encroachment by shading vegetation, shading by buildings and so on, which can potentially corrupt the data. Humidity is usually low and stable and the site is far from the sea. In addition, as a sign of its importance in the BoM network, Giles is permanently staffed.

As stated, the IPCC hypothesis is that the ‘gap’ will become steadily smaller as the Enhanced Greenhouse Effect takes hold. As temperature rises the gap will narrow and this will result in an increase in average temperature, so says the IPCC.

Moore’s research indicates that this is just not happening at this showcase BoM site. It may be happening elsewhere, and this needs to be tested in each case against the range of all data-corrupting effects, but it is not happening at Giles.

Notes about the graphs. The top plot line shows the average Tmax for each year – that is, the average maximum daytime temperature. The middle plot shows the average Tmin for each year – that is, the average minimum night time temperature.

The lower plot shows the result of the calculation Tmax-Tmin. In laypersons’ terms it is the result you get when you subtract the average yearly minimum temperature from the average yearly maximum temperature. If the IPCC hypothesis is valid, then the lower plot line should be falling steadily through the years because, according to the IPCC, more carbon dioxide in the atmosphere should make nights warmer. Hence, according to the IPCC’s hypothesis, the gap between Tmax and Tmin will become smaller – ie the gap will narrow. But the plot line does not show this.
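For readers who want to reproduce the calculation themselves, the sketch below (Python with the pandas library) shows one way to derive the three plot lines from a file of daily observations downloaded from the BoM. It is only an illustration: the file name and the column names ‘date’, ‘tmax’ and ‘tmin’ are assumptions, not the headers the Bureau actually uses, and it is not the code used in Dr. Moore’s analysis.

    import pandas as pd

    # Daily observations, e.g. exported from BoM Climate Data Online
    # (file name and column names here are hypothetical).
    daily = pd.read_csv("giles_daily.csv", parse_dates=["date"])
    daily["year"] = daily["date"].dt.year

    # Average Tmax and Tmin for each year (the top and middle plot lines).
    annual = daily.groupby("year").agg(tmax_avg=("tmax", "mean"),
                                       tmin_avg=("tmin", "mean"))

    # The lower plot line: yearly average maximum minus yearly average minimum.
    annual["spread"] = annual["tmax_avg"] - annual["tmin_avg"]

    # If the IPCC prediction held, 'spread' should decline steadily over the years.
    print(annual)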

The IPCC’s reasoning for its narrowing prediction is that global warming will be driven more by a general rise in minimum temperatures than by a general rise in maximums. This is not my assertion, nor is it Dr. Moore’s; it is the assertion of the IPCC and can be found in the IPCC’s AR4 Report.

Dr. Moore states, “In the AR4 report the IPCC claims that elevated CO2 levels trap heat, specifically the long wave radiation escaping to space.

“As a result of this the IPCC states that, ‘almost everywhere night time temperatures increase more than day time temperatures, that decreases in the number of frost days are projected over time, and that temperatures over land will be approximately twice the average global temperature rise’,” he says, citing pages 749-750 of the AR4 report.

So where can we go to find evidence of whether the IPCC assertion of a narrowing Tmax-Tmin spread is happening or not? Giles is a great starting point. Can we use the BoM’s own publicly available data to either confirm, or disprove, the narrowing prediction? The short answer is – yes we can.

But, before we all get too excited about the result Dr. Moore has found, we need to recognise the limitation that this is just one site and, to the cautious scientific mind, may still be subject to some bizarre influence that somehow skews the result away from the IPCC prediction. If anyone can suggest what viable contenders for ‘bizarre influences’ might be at Giles we would welcome them in the comments section of this post. 

The caution validly exercised by the rigorous scientific mind can be balanced by the fact that Giles is a premier, permanently staffed and credible site. The station was set up with great care, and for very specific scientific purposes, during the Cold War as part of the British nuclear test program in Australia in the 1950s. It was also important in supplying timely and accurate meteorological data for rocket launches from the Woomera Rocket Range in South Australia during development of the Bluestreak Rocket as part of the British/Australian space program. This range extended almost all the way across Australia from the launching site at Woomera to the arid north-west of Western Australia.

In the early years there were several other weather monitoring stations along the track of the range. Such has been the care and precision of the operation of the station that Giles has the characteristics of a controlled experiment. 

Dr. Moore states, “Giles is arguably the best site in the World because of its position and the accuracy and reliability of its records which is a constant recognised problem in many sites. Data is freely available on the BoM website for this site.”

With regard to the site validly having the nature of a controlled experiment, something about the method of analysis is also notable. The novel approach of deriving the spread Tmax-Tmin on a daily basis neatly avoids metadata issues that have plagued the reliability of data from other stations and sometimes skewed results from other supposedly reliable observation sites.
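As an illustration of that daily-basis approach, the short sketch below forms the difference Tmax-Tmin for each day and only then averages it by year, so both terms of every difference come from the same day at the same site. Again the file and column names are assumptions, not Dr. Moore’s actual code.

    import pandas as pd

    daily = pd.read_csv("giles_daily.csv", parse_dates=["date"])

    # Per-day spread: both values share the same day, site, screen and observer.
    daily["spread"] = daily["tmax"] - daily["tmin"]

    # Annual average of the daily spread.
    annual_spread = daily.groupby(daily["date"].dt.year)["spread"].mean()
    print(annual_spread)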

“I would argue that the only change in environmental conditions over the life of this station is the increase in CO2 from 280 to 410 ppm,” he says.

“In effect this is, I suggest, a controlled experiment with the only identifiable variable input being CO2 concentration,” he says.  

The conclusion reached by Dr. Moore is that an examination of the historical records for this site, using data accessed through the BoM website, unequivocally shows NO significant reduction in Tmax-Tmin. It also shows no rise in Tmin. Anyone can research this data on the Bureau of Meteorology website as it is not paywalled. It is sound data from a government authority, freely available to citizens and other researchers.

Dr. Moore concludes, “The logical interpretation of this observation is that, notwithstanding any other unidentified temperature influencing factor, the Enhanced Greenhouse Effect due to elevated CO2 had no discernible effect on temperature spread at this site. And, by inference, any other site.”

He further states, “On the basis of the observations I have made, there can be no climate emergency due to rising CO2 levels, whatever the cause of the rise. To claim so is just scaremongering.

“Any serious climate scientist must surely be aware of such basic facts yet, despite following the science for many years, I have never seen any discussion on this specific approach,” he says.

Finally, Dr. Moore poses a few questions and makes some pertinent points:

He asks, “Can anyone explain, given the current state of the science, why there is no rise in minimum temperatures (raw) or, more importantly, no reduction in the Tmax-Tmin spread, over the last 65 years of records despite a significant rise in CO2 levels at Giles (280-410ppm), as projected by the IPCC in their AR4 report?” He notes that other published research indicates similar temperature profiles across the whole of the central Australian region as well as at similarly qualified North American and world sites.

Seeking further input, he asks, “Can anyone provide specific data that demonstrates that elevated CO2 levels actually do increase Tmin as predicted by the IPCC?” And further, “Has there been a reduction in frost days in pristine sites as predicted by the IPCC?”

On a search for more information, he queries, “Can anyone explain why the CSIRO ‘State of the Climate’ statement (2020) says that Australian average temperatures have risen by more than 1 deg C since 1950 when, clearly, there has been no such rise at this pristine site?” With regard to this question, he notes that Giles should surely be the ‘go to’ reference site in the Australian Continent.

Again he tries to untangle the web of conflicting assertions by reputedly credible scientific organisations. He notes that, according to the IPCC, rising average temperatures are attributable to a rise in minimum temperatures. For the CSIRO State of the Climate statement to be consistent with this, it would necessitate a rise of around 2 deg C in Tmin. But, at Giles, there was zero rise. He also notes that, according to the IPCC, temperature rises over land should be double world average temperature rises. But he can see no data to support this.

Dr. Moore’s final conclusion: “Through examination of over 65 years of data at Giles it can be demonstrated that, in the absence of any other identifiable temperature forcing, the influence of the Enhanced Greenhouse Effect at this site appears to be zero,” he says. “Not even a little bit!” 

David Mason-Jones is a freelance journalist of many years’ experience. He publishes the website www.bomwatch.com.au

Dr. Lindsay Moore, BVSc. For approaching 50 years Lindsay Moore has operated a successful veterinary business in a rural setting in the Australian State of Victoria. His veterinary expertise is in the field of large animals and he is involved with sophisticated techniques such as embryo transfer. Over the years he has seen several major instances in veterinary science where something that was once accepted on apparently reasonable grounds, and adopted in the industry, was later proven to be incorrect. He is aware that this phenomenon is not confined to the field of veterinary science but happens in other scientific fields as well. The lesson he has taken from this is that science needs to advance with caution and that knee-jerk assumptions that ‘the science is settled’ can lead to significant mistakes. Having become aware of this problem he has become concerned about how science is conducted and how it is used. He has been interested in the global warming issue for around 20 years.

General link to Bureau of Meteorology website is www.bom.gov.au

Ocean surface temperature limit

Guest post

Richard Willoughby[1]

http://www.bomwatch.com.au/

contact: scientist@bomwatch.com.au

Main points

  • Observations of open ocean surface temperature over at least the last 50 years provide clear evidence that the annual average ocean surface temperature is limited to 30 degrees Celsius, with short-lived excursions up to 32 degrees Celsius. This observation contradicts the predictions of climate models that show tropical ocean surface temperature perpetually rising.
  • The formation of clouds in response to the surface temperature of tropical oceans limits the surface insolation. Once the surface temperature reaches 32C the cloud becomes sufficiently persistent that surface heat input and heat output reach a balance. The consequence is that 32C is the maximum possible temperature of open ocean water.
  • The ocean surface temperature limiting mechanism is tightly linked to the atmospheric water content.  Once the atmospheric water content reaches an equivalent depth of 45mm, a process of cyclic deep convection sets in whereby water vapour is literally catapulted as high as 12,000 metres into the atmosphere by convective instability.  This is a well-defined physical process detailed in the paper.
  • The paper challenges the concept of the “Greenhouse Effect”, in which radiating gases such as water vapour are assumed only to cause surface heating. The key failure of climate models is the parameterising of clouds, which are, in fact, formed from different phases of atmospheric water. Once the atmospheric water reaches 45mm, the mode change to cyclic cloudburst results in the atmospheric water becoming a cooling agent rather than a warming agent at lower levels.

Background

The satellite era has provided abundant data with broad coverage of the global energy balance, atmospheric conditions and surface temperature. Observation of the ocean surface temperature data from satellites provides compelling evidence that the ocean surface temperature rarely exceeds 30 degrees Celsius. Rare exceptions like the Persian Gulf, which reaches up to 35 degrees Celsius, provide the key to identifying that cyclic cloudburst is a surface temperature limiting process. Cloudburst is rare in the Persian Gulf and it is the only sub-tropical ocean surface exceeding 28 degrees Celsius that has not experienced a tropical cyclone in recorded history.

What we did

In addition to the satellite data, surface-level data from the tropical ocean moored buoy arrays were evaluated at one-hour intervals to observe the surface temperature limiting process in operation, displaying the exquisite precision of this natural atmosphere-ocean system. A series of charts within the paper demonstrates the same process across the three tropical oceans, separated by thousands of kilometres and at different times of the year, all regulating ocean surface temperature to a limit of 30 degrees Celsius.

A single-column atmospheric model was developed through the course of this study to quantify convective instability, and that led to the understanding that the increasing persistence of high-level cloud reduces surface insolation by increasing cloud short-wave reflection. Clear-sky conditions diminish rapidly once surface temperature exceeds 28 degrees Celsius.

Detailed month-by-month analysis of the top-of-atmosphere radiation balance and the level of atmospheric water unearthed a key parameter: the role of water vapour pivots about a level of 45mm. Below that level, water vapour is a warming agent through long-wave radiation absorption, while above that level it is a cooling agent, through convective instability dominating cloud formation and increased cloud short-wave radiation reflection.

Principal findings

  • Current climate models assume the ocean surface temperature can continue to increase without constraint. Clouds in climate models are parameterised and their formation is not tightly linked to surface temperature. These fundamental flaws mean climate models predict physically impossible outcomes. They are not representative of Earth’s climate system. The warming trend all climate models predict for the Nino34 region is clearly not evident in the actual recorded data over the last four decades.
  • Until climate models can replicate the physics of deep convection, tightly linked to surface temperature rather than the naive parameterisation of clouds, they will remain nothing more than extended weather models with useful predictive ability of a few days.
  • The behaviour of water in Earth’s atmosphere follows clearly defined processes that depend on the physical properties of water, in all phases, and atmospheric pressure. The only way the open ocean surface temperature can exceed 32C is through a substantial increase in atmospheric pressure. There is strong proxy evidence that higher ocean surface temperatures were recorded in the Cretaceous period, when atmospheric pressure was approximately 10% higher than at present. This is consistent with the temperature limiting process detailed in the paper.
  • Observations of the attributes of water in the atmosphere contradict the heat trapping assumption of atmospheric water described by the “Greenhouse Effect”.  Water in the atmosphere is not heat trapping but rather a temperature regulating component that increases radiating power (the sum of reflected short wave radiation and emitted long wave radiation) when the surface warms and reduces radiating power when the surface cools through reduced cloud cover enabling more surface insolation. 

An important link – find out more

The page you have just read is the basic cover story for the full paper. If you are stimulated to find out more, please link through to the full paper – a scientific Report in downloadable pdf format. This Report contains far more detail including photographs, diagrams, graphs and data and will make compelling reading for those truly interested in the issue.

Click here to download the Full Paper including photos graphs and tables


[1] Professional Electrical Engineer consulting in engineering risk for major projects, with an enduring interest in natural catastrophes and changing climate.

Charleville, Queensland Australia

Note: If you want to link directly to the Full Paper supporting this Brief Summary, click here. The Full Paper is a PDF of 30 pages and includes more detailed discussion, tables of the raw data used, additional photos, and graphs of data from Charleville and the other ‘nearby’ weather stations which the BOM used to homogenise the poor-quality Charleville data.

If you would prefer to read this Brief Summary first, then a link to the Full Paper is provided at the end of this Summary.

Are Australia’s automatic weather stations any good?

Part 4: Effect on temperature extremes

Dr. Bill Johnston[1]

scientist@bomwatch.com.au

Main points

  • There is no change in the climate at Charleville or the ten other sites that were used to homogenise maximum temperature at Charleville in 1996.
  • For the people of central Queensland, save for a sudden shift in the subtropical ridge such as happened in November 1924, the climate they have is the climate they will experience for the foreseeable future. Droughts will come and go every 9 to 12 years but they are not increasing in severity or frequency. Their effects are just more visible from space, on the evening news and on social media.
  • It is also no warmer in Central Queensland than it has been in the past. Conflating the use of rapid-sampling automatic weather station (AWS) probes housed in infrequently serviced 60-litre Stevenson screens under adverse conditions with “the climate” is misleading in the extreme.
  • Using faulty (correlated) data to correct faults in Australian Climate Observations Reference Network – Surface Air Temperature (ACORN-SAT) data has no scientific or statistical merit and should be abandoned.
  • There is no end to the damage that the Bureau’s infrequently maintained AWS network and biased homogenisation methods can inflict on Australia’s economy and our national prosperity. With trust in our institutions at an all-time low, an open inquiry is long overdue.

Background

The United States Army Air Force (USAAF) used the aerodrome at Charleville in southwestern Queensland as a heavy bomber transit base during World War II. The operational precinct, including the original QANTAS hangar, the tower, meteorological (met) office, refuelling and other facilities, was located south of Warrego Highway near the town racecourse (Figure 1).

Astonishingly, although Charleville is one of 112 ‘high quality’ Australian Climate Observations Reference Network – Surface Air Temperature (ACORN-SAT) sites used to monitor long-term temperature trends, the Bureau of Meteorology (BoM) did not know where their original office was, or that the instrument enclosure and cloud-base searchlight were located 160 m southwest at Latitude ‑26.4103o, Longitude 146.2561o.

Assertions that “The Bureau’s methods have been extensively peer-reviewed and found to be among the best in the world”, and that “the community can have confidence the Bureau is providing an accurate estimate of Australia’s true temperature trend”, are demonstrably not true. In 2011, ‘world-leading, expert’ peer reviewers did not evaluate the integrity of metadata used to make adjustments to ACORN-SAT data or investigate the soundness of individual datasets. BoM has also misrepresented the usefulness of temperatures observed under variable conditions, using a range of instruments at different locations, for accurately measuring trend and changes in the climate.

Figure 1. Located in central Queensland (top image), the Charleville aerodrome was used by the USAAF as a heavy bomber transit base during WWII. Facilities including the control tower, radio-met office and met enclosure were located in the northwestern sector of the aerodrome adjacent to the heavy-duty E-W runway (right; section of the plan from the National Archives of Australia (Item ID, 1972125) shown in Appendix 1 of the accompanying report).

About the data for Charleville airport.

From 1943 to 1956, when Bureau staff moved from the Aeradio office to the Department of Civil Aviation Flight Services Unit (FSU) near the terminal and the site moved about 580 m southeast, ‘raw’ daily maximum temperature data was made up. Metadata seems deliberately vague about what went on, where the data came from and the process used to fill the 13-year gap. As if to re-write the climate history of central Queensland, data reconstruction diminished the impact of the long-running inter-war drought on temperature extremes, particularly during the most devastating years to 1947.

In September 1990 the site moved 150 m nearer the runway, where temperature was measured by a Micromac automatic weather station that only reported hourly data. There is some confusion about when a 60-litre screen was installed, and whether the 230-litre screen was used by a replacement AWS before it became the primary instrument on 1 November 1996. It appears that, in lieu of manual data that may not have been digitised, BoM relied on the Micromac AWS, despite its shortcomings, for reporting daily maxima and minima (Tmax and Tmin) until 1996. The site moved about 600 m west to its present position in October 2003 (Figure 2), which was when the previous 230-litre screen was decommissioned.

Site and instrument comparisons.

The BoM seems to have gone to extraordinary lengths to obstruct instrument (thermometers vs. AWS and 230-litre vs. 60-litre screens) and inter-site comparisons. Aerial photographs show the FSU site was operational at least until 13 January 1992 and that an instrument was located near the Micromac site possibly as early as 1963; however, there is no overlapping data for the 1990 move. There is also no data that allows different screens to be compared between 8 December 1999, when the 60-litre screen was installed, and the site move on 13 September 2003, when the 230-litre screen was decommissioned. Furthermore, the site diagram for Charleville Aero (44021) dated 28 May 2002 (before the move) shows only one screen, while the plan for Charleville Aero Comparison (given the new ID, 44221) dated 7 May 2005, despite being the same site, shows only a single large screen even though it had been decommissioned two years earlier!

Occurrence of temperature extremes.

Analysis shows there was no change in the real climate at Charleville. Despite this, the distribution of upper-range values (temperatures greater than the 95th day-of-year percentiles) was distorted (skewed higher) by the AWS and 60-litre screen. Pre- and post-2013 percentile differences showed the AWS most likely exceeded its calibrated range, peaking 2.1oC above pre-2013 percentiles. To avoid unrealistic numbers of observations exceeding the old century benchmark of 37.8oC (100oF), values greater than the 86th percentile (35.7oC) were most likely culled or moderated in situ, or by post-acquisition processing, before being logged.
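To make the percentile comparison concrete, the sketch below shows one way such day-of-year percentiles could be computed and compared before and after 2013. It is a hedged illustration only: the file and column names are hypothetical and it is not the code used in the accompanying report.

    import pandas as pd

    daily = pd.read_csv("charleville_tmax.csv", parse_dates=["date"])  # hypothetical file
    daily["doy"] = daily["date"].dt.dayofyear

    pre = daily[daily["date"].dt.year < 2013]
    post = daily[daily["date"].dt.year >= 2013]

    # 95th percentile of Tmax for each day of the year, before and after 2013.
    p95_pre = pre.groupby("doy")["tmax"].quantile(0.95)
    p95_post = post.groupby("doy")["tmax"].quantile(0.95)

    # Positive differences indicate upper-range values skewed higher after 2013.
    diff = (p95_post - p95_pre).dropna()
    print(diff.describe())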

Homogenisation is fatally flawed.

Headed by a case study of data for Mungindi Post Office, the ten sites used by BoM to homogenise Charleville Tmax for a change in 1996 (probably related to a new AWS replacing the Micromac before becoming the primary instrument on 1 November) were also analysed.

Using highly correlated comparator datasets to make adjustments is fatally flawed. Data for Mungindi and the nine other, mostly manual, datasets owed their correlation with Charleville to embedded parallel faults – alignment of screen and site changes, site moves, watering, and the change to the AWS and small screen at Longreach. As at Charleville, unbiased analysis found no trend or change at any of the sites that was attributable to the climate.

Using faulty (correlated) data to correct faults in ACORN-SAT data has no scientific or statistical merit and should be abandoned.

Figure 2. The Charleville Airport met-enclosure and 230-litre Stevenson screen photographed in 1954 (top image). The structure to the right protected the cloud-base searchlight. Buildings in the distance included the powerhouse, balloon-filling shed and Aeradio office. Although the original screen was oriented to the north (not to the south, which is the standard) the contrast between the grassy WW-II site and the red bare-soil at the current site (lower image) is stark. The current 60-litre Stevenson screen houses temperature probes for the rapid-sampling automatic weather station. Transient air-pockets rising from dry bare surfaces, which during the heat of the day may reach temperatures of >60oC, result in temperature spikes that are unlikely to represent the ‘true’ temperature of the local climate. (Photographs courtesy of the Airways Museum and BoM.)    

Implications     

The Bureau’s temperature datasets are used widely to inform, discuss, develop/support policy and shape public opinion, as well as for a wide range of scientific studies related to the climate and the biosphere more generally. Scores, if not hundreds, of peer-reviewed scientific papers published in top-shelf journals over the past few decades are predicated on the notions that (i) data relied upon truly reflect the climate; (ii) data are fit for purpose – that conditions under which measurements were made were not biased by site changes and that data were carefully curated; and (iii) methods used by the Bureau for quality control, including homogenisation, are transparent and replicable across multiple sites, datasets and regions.

However, none of the underlying assumptions has been found to be true. While all sites thus far examined by BomWatch have changed and moved in ways that affect data, no datasets show trend or change that could be unambiguously attributed to the climate. Metadata is universally inexact and misleading. That pre-1956 data at Charleville were made up, and that data for Bollon and Injune were probably in-filled or estimated for several years, undermines claims that the Bureau’s data handling methods are amongst the best in the world.

  • That homogenisation uses highly correlated neighbours that embed parallel faults to adjust ACORN-SAT data, which is used by CSIRO in State of the Climate reports to justify and inform policy and which in turn is unquestioningly reported on by activist organisations and media such as the Climate Council, WWF, The Conversation, the ABC and The Guardian, is a scientific travesty of monumental proportions. Further, it is the Bureau that bears the responsibility.
  • That users of the Bureau’s data including professors and students at Australian universities have failed to check soundness of the data they use, seriously undermines their credibility and the standing of the institutions they represent. The multiple failures of down-stream peer review are endemic and beyond redemption.
  • Finally, failure to undertake due diligence by those purporting to be ‘battling’ climate change – the Great Barrier Reef Foundation, Farmers for Climate Action, the Murray-Darling Basin Authority, banks like ANZ, corporates such as QANTAS and BHP, lobbyists for the electricity industry, committees advising ministers, the Australian Research Council, CSIRO and academies of sciences – fuels widespread disdain for elitist institutions and entities whose trustworthiness is questionable.

There is no end to the damage that the Bureau’s data and data handling methods can inflict on Australia’s economy, our children’s future prospects and our national prosperity.

14 February 2021  


[1] Former NSW Department of Natural Resources senior research scientist and former weather observer.

An important link – find out more

The page you have just read is the basic cover story for the full paper. If you are stimulated to find out more, please link through to the full paper – a scientific Report in downloadable pdf format. This Report contains far more detail including photographs, diagrams, graphs and data and will make compelling reading for those truly interested in the issue.

To read the Full Paper supporting this Brief Summary, click here.

Rutherglen, Victoria, Australia

Are Australia’s automatic weather stations any good?

Part 3: Non-climate biases

Dr. Bill Johnston[1]

(Read time: 6 minutes)

Historic climate data collected to monitor and describe the weather are notoriously problematic to use in bolt-on experiments that aim to determine trend and change in the climate. The instruments used and the conditions under which observations were made are rarely known, and it is the earliest data that form the baseline for establishing long-term trend. Pre-1940 data for Rutherglen Research (Bureau ID 082039), for example, were observed at a site in a wheat paddock 1 km north of the current site; thermometers were most likely exposed in a small metal-roofed Stevenson screen, not a standard 230-litre one, and according to Simon Torok (1996)[2] observations from 1912 were combined with those from the Post Office (about 12 km away) to form a composite series. Metadata does not mention where the original site was or for what period data were combined. Cross-referenced with Torok’s notation for May 1939 that the screen “opens to the west”, a step-down in average maximum temperature (Tmax) of -0.71oC in 1941 indicated the screen was replaced in 1940 and that the previous screen was biased high. For data up to 1966 the step-change resulted in a spurious negative trend that was unrelated to the climate.

Background

  • Marketing campaigns by taxpayer-funded institutions, universities, the Australian Academy of Science, the ABC, The Guardian, the Nine network (ie, the former ‘Fairfax’ press) and The Conversation, together with ardent climate catastrophists, have groomed the public relentlessly for nearly three decades to believe that temperature is bound to increase in line with CO2. The expectation of ‘trend’ has distracted us into questioning whether temperature is increasing at 0.5oC/century or 1.5oC/century, rather than investigating whether a real trend exists at all (i.e. whether apparent trend in the data is spuriously due to something else).
  • Data-fiddling hardly describes the fraught process of creating homogenised ACORN-SAT datasets (Australian Climate Observations Reference Network – Surface Air Temperature) used by CSIRO in their State of the Climate reports, and those used by the Bureau of Meteorology to create graphs and pictures for recent bushfire Royal Commissions, inquiries and inquests. In addition, area-weighted daily temperatures that form the gridded AWAP (Australian Water Availability Project) dataset are not corrected for individual site biases. With its hard-wired AWS network, the Bureau and its colleagues in CSIRO, the University of NSW and others like James Cook, Monash, Melbourne and Wollongong universities are able to create and market whatever climate they want the community to believe.
  • Winding back the blinds to let some light in takes an enormous amount of effort. It is impossible, for example, to have anything published in a scientific journal that rocks their boat. It is unarguable that the Bureau and friends have re-written Australia’s climate history – the droughts, floods, bushfires and extremes that have fashioned the country – in favour of a narrative which they fervently protect. For instance, although the community is led to believe climate change threatens the survival of the Great Barrier Reef, no such change or warming is detectable in climate datasets for sites adjacent to the Reef. The story was made up, so handpicked ‘expert’ committees running the agenda and WWF-related campaigners like the Climate Council and Farmers for Climate Action mislead everyone. For a scientist it is disheartening that, while agricultural industries are blamed and pushed to the brink, billions of taxpayer dollars are being spent to “save the Reef” from a problem that doesn’t exist.

Investigating automatic weather stations

Starting with raw daily maximum temperature data (Tmax) from Climate Data Online, statistical code that derives annual averages and other attributes, and as much metadata as can be found, we at www.bomwatch.com.au/ recently set out to investigate biases and other problems in Australia’s automatic weather station (AWS) network. Manual stations that report their data every morning are always a day late, and since the network was integrated across the nation, the temperature and rainfall maps used by news networks and the temperatures reported through the day are produced using AWS data alone. So it is very important to have confidence that the technology is sound and that the Bureau’s data handling methods are beyond reproach.

Our AWS-project was introduced using Amberley RAAF as the case study and six widely dispersed airport sites as blind-study replicates. Aerial photographs, maps and plans and other metadata showed unequivocally that like at Cairns, Townsville and Rockhampton, homogenisation attributed the effect of site changes to the climate.

Metadata for Cape Leeuwin was also sparse and incomplete. The original site near the lighthouse was pinpointed using maps and photographs, which also confirmed that a large standard 230-litre screen was in use before the site moved north in October 1978. The extreme environment affected the data – wind-shaking damaged thermometers or caused them to reset, while fog, mist, rain and sleet driven into the screens by strong winds affected observations, which are assumed to be dry-bulb. Moving the AWS beside the drop-off to the ocean in April 1999 reduced rainfall-catch by 26% (Figure 1). Allegedly one of the most important sites in the AWS network, Cape Leeuwin’s data, including AWS data, were not useful for determining long-term trend or change.

Figure 1. The AWS at Cape Leeuwin was relocated beside the drop-off to the ocean in 1999, where it is over-exposed to strong to gale force winds.

The Rutherglen Institute in northeastern Victoria started life in 1895 as a viticulture college (Figure 2). Early research was instrumental in identifying insect (Phylloxera) resistant vines and rootstock.

Figure 2. The Rutherglen viticulture college in 1908. The building housed schoolrooms, amenities and single-bed accommodation for wards of the State and local children (State Library of Victoria photograph.)

Due to Government antipathy the college almost failed. Proposals included turning it into a consumptive sanatorium (for TB sufferers), dispensing with staff and selling it off, or making it a boarding school for educating neglected children and those of local farmers in the science and practice of agriculture.

Figure 3. The Rutherglen automatic weather station as it stands today, around 1 km away from its original position. A move of this distance can be expected to create a significant step change in the data. The problem is that the Australian Bureau of Meteorology is unaware of numerous site changes that have taken place at many of its stations in Australia. The Rutherglen weather station is a case in point.

Analysis of Rutherglen Tmax since 1912 found no trend or change attributable to the climate. Despite repeated assurances that ACORN-SAT sites have been thoroughly researched, metadata was misleading and incomplete. For instance, although the site moved twice during the period, no metadata was available from December 1949 until May 1975 (26 years). A 1941 aerial photograph pinpointed the original site, while objective statistical analysis provided unequivocal evidence in both time and rainfall domains that trend was due to site changes, not the climate. The analysis was supported by close examination of ten AWS-only datasets for sites used to homogenise Rutherglen ACORN-SAT. Those sites also showed no trend or change attributable to the climate.

Picking and choosing changepoints and using up to 40 highly correlated comparator datasets to make adjustments has no scientific or statistical merit. The process is demonstrably biased and should be abandoned. Peer review in 2012 ignored obvious problems, while the Technical Advisory Forum that was supposed to provide oversight of the homogenisation process over the three years to 2017 failed to do its work (http://media.bom.gov.au/releases/379/bureau-welcomes-release-of-technical-advisory-forum-report/).

If there is no evidence of increasing trends in individual datasets then the Bureau and CSIRO’s claims that temperature has increased since the 1950s are baseless. Instruments and observers change; data are imprecise; site control is ad hoc – a garden over there, a new shed or concrete path; mowing the grass or not, or using herbicide, introduces artificial changes in datasets. Due to infrequent maintenance, screens also deteriorate, paint flakes off, dust and grime accumulate, wasps build nests on sensor probes, and ignoring changes in nearby land use introduces non-climatic bias which on warm days may be as high as 1 to 2oC.

An important link – find out more

The page you have just read is the basic cover story for the full paper. If you are stimulated to find out more, please link through to the full paper – a scientific Report in downloadable pdf format. This Report contains far more detail including photographs, diagrams, graphs and data and will make compelling reading for those truly interested in the issue.

Click here for a full paper showing air photos, graphs, tables of data and relevant information

For historic datasets used as baselines it is even worse, particularly if they don’t know where the original site was or what it was like; and for ACORN-SAT sites such as Cairns, Townsville, Rockhampton, Amberley, Cape Leeuwin and Rutherglen, where they failed to undertake the site research they said they did. By failing in their due diligence, the Bureau and others may have misled the Minister to whom they ultimately report and misled the public to whom they owe a duty of care.

Research on long-term data and automatic weather stations is continuing.


[1] Former NSW Department of Natural Resources research scientist and weather observer.

[2] Torok SJ (1996). Appendix A1, in: “The development of a high quality historical temperature data base for Australia”. PhD Thesis, School of Earth Sciences, Faculty of Science, The University of Melbourne.

Cape Leeuwin, Western Australia

Part 2: Issues affecting performance

 Dr. Bill Johnston[1]

Synopsis

AWS data for Cape Leeuwin don’t reflect the weather and are unlikely to be useful for detecting trends and changes in the climate.

Background

With the longest single-site temperature and rainfall record in Western Australia, the automatic weather station (AWS) at Cape Leeuwin is regarded as one of the most important in the Bureau of Meteorology’s AWS network (Figure 1). However, situated at the southwest extremity of the continent, it is also one of the most exposed of all the Bureau’s weather stations.

Figure 1. Signage at Cape Leeuwin explaining the importance of the automatic weather station.

Cape Leeuwin’s rugged rocky headland marks the junction of the Indian and Southern oceans. Cold, sub-polar low-pressure systems sweep in during winter and the sub-tropical high-pressure belt squeezes them south in summer. As there are no obstacles south or west to break-up or divert the low-pressure troughs, associated frontal systems and often gale-force winds; Cape Leeuwin experiences some of the most intense weather systems affecting mainland Australia. (BJ photo.)

While the general circulation moves from west to east at a rate of about 650 km/day, the mid-latitude belt of high-pressure shifts from around -40o Latitude (the Latitude of Flinders Island in Bass Strait) to around -30o Latitude (about the Latitude of the NSW-Queensland border) in April and back again around early October. This N-S seesaw, which follows the sun, is the reason Australian climates are strongly seasonal (moist during winter in the south when low pressure systems prevail and moist in the north during summer due to intrusions of the tropical monsoon and vice versa).

Weather-watchers know that when the sub-tropical ridge settles a few degrees further south in winter, say south of Dubbo (-32o Latitude), approaching low pressure systems are also forced south and southern Australia is drier than normal, and vice versa. Similarly, the position of the high-pressure ridge in summer determines the southern reach of the monsoon.

Observing the weather in situations that are open and exposed, subject to wild fluctuations, extremely strong winds and heavy seas is a challenging occupation, especially when manual observations were made at regular intervals both day and night. Wind was so strong and unrelenting that it shook the Stevenson screen, reset thermometers and resulted in breaks in their mercury and alcohol columns (Figure 2). It also drove mist, fog, rain and sea-spray into the screen, wetting instruments and interfering with measurements which are assumed to be dry-bulb.

Figure 2. A Fahrenheit meteorological minimum temperature thermometer showing breaks in the alcohol column (adjacent to the 68o, 73o and 75oF indices). Drawn down by the meniscus (sitting at 106.8oF) the metal bar marks the lowest reading of the day (97.3oF in this case). Not shown is a large vapour break in the lower stem of the thermometer, which after four decades of storage rendered it unserviceable. The interval scale in the photograph is magnified by about 50% compared to the actual scale, which was about 2.5 mm (1/10th of an inch) per interval. (BJ photo.)

Minimum temperature observations were noted as problematic in 1916, but the problem was not specified. The original Stevenson screen was an ‘old observatory pattern’, which by 1926 had been lowered to within 2 feet of the ground, apparently to reduce wind-shaking. A decade later, in June 1936, a new screen was installed; but was it in the same place or did the site move? Then there was no news about the state of the screen or the condition of the instruments for 28 years – until July 1964, when it was reported that the door, which faced south into the weather, was damaged. No news also for the 14 years before the site moved in October 1978. Perhaps by then the screen was neglected like the one in Figure 3.

Despite repeated assurances that Australian Climate Observations Reference Network – Surface Air Temperature (ACORN-SAT) sites like Cape Leeuwin were well researched and peer reviewed, metadata is poor to non-existent. Peer-reviewers did not investigate whether documentation was adequate, or whether sites were well maintained and data were fit for purpose. It is not that long ago that the site moved in October 1978, but the site summary metadata mis-specified its previous location and did not report the move. Apparently the Bureau had no records or photographs or files. However, a 1950s plan in a file at the National Archives of Australia unequivocally showed the ‘wind gauge’ northwest of the lighthouse and the Stevenson screen northeast (Figure 4), where it was also visible in photographs taken in 1975.

Figure 3. A neglected 230-litre screen no longer in use at an agricultural research centre in NSW. (BJ photo.)

The importance of metadata cannot be overstated. Scientists say that if sites and instruments remained the same, changes in the data are attributable to the climate. Ignoring (or not documenting) changes that happened, or adjusting data for changes that had no impact (or using unspecified, unreplicable statistical tests to make adjustments), is one way to create pre-determined trends. Subjective homogenisation methods based on faulty metadata are unscientific; they result in trends that have nothing to do with the climate and should be abandoned.

Figure 4. Page 104 from file ‘Cape Lewin (Leeuwin) – transport lighthouse’ National Archives of Australia Barcode 1929975; showing location of the wind vane (anemometer) and Stevenson screen, which was about 38 m east of the path.

Using Cape Leeuwin as a case study, the physically based reference frame outlined and demonstrated in Part 1 of the series is applied to the problem of exploratory data analysis (EDA) – the rapid assessment of AWS datasets used to construct daily temperature maps and to homogenise ACORN-SAT data used to calculate Australia’s warming. The approach was developed from previous original research published on http://www.bomwatch.com.au. In line with the theme of “Are Australia’s automatic weather stations any good?”, this paper outlines an EDA methodology that can be applied by climate scientists to investigate the fitness of individual datasets for determining trend and changes in the climate.

The approach uses Pearson’s linear correlation coefficient (r) to measure the strength, significance and sign (+ or -) of the association between mean maximum temperature (Tmax) and its expected deterministic covariable, rainfall; and non-parametric LOWESS regression (LOcally Weighted Scatterplot Smoothing) to visualise the shape of the association, the spread of values about the line and identify outliers relative to 95% bootstrapped confidence intervals for the fit.
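A minimal sketch of that approach is given below, using Python with scipy and statsmodels: Pearson’s r for the strength and sign of the Tmax ~ rainfall association, and a LOWESS fit to visualise its shape. The bootstrapped confidence band is omitted for brevity and the ten annual values are invented for illustration; this is not the analysis code from the paper.

    import numpy as np
    from scipy.stats import pearsonr
    from statsmodels.nonparametric.smoothers_lowess import lowess

    # Invented annual rainfall (mm) and mean Tmax (deg C) pairs.
    rain = np.array([620., 540., 810., 455., 700., 390., 585., 660., 720., 500.])
    tmax = np.array([19.9, 20.3, 19.2, 20.8, 19.6, 21.1, 20.1, 19.8, 19.5, 20.5])

    # Strength, sign and significance of the association (expect a negative r).
    r, p = pearsonr(tmax, rain)
    print(f"r = {r:.2f}, P = {p:.3f}")

    # Non-parametric LOWESS fit to visualise the shape of the association.
    fit = lowess(tmax, rain, frac=0.7)     # rows of (rainfall sorted, smoothed Tmax)
    for x, y in fit:
        print(f"rain {x:6.0f} mm  ->  smoothed Tmax {y:5.2f} C")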

Leaving aside problems evident in data from 1907, which were also investigated, the AWS, which commenced operating in February 1993, was moved from a relatively sheltered position between two sheds to an up-draft zone closer to the 13 m drop-off above the ocean in April 1999. Corresponding with the move, rainfall-catch abruptly declined by 26%. Mean Tmax at the new site was biased high relative to pre-1993 manual data and, although not significant, Tmax implausibly increased with rainfall. Seven of 27 datapoints were identified as outliers, including 2011 (21.20oC) and 2012 (21.13oC), which were an unlikely 1.61oC and 1.53oC higher than pre-AWS Tmax (19.6oC).

It was concluded that as AWS data don’t reflect the weather they are unlikely to be useful for detecting trends and changes in the climate.

An important link – find out more

The page you have just read is the basic cover story for the full paper. If you are stimulated to find out more, please link through to the full paper – a scientific Report in downloadable pdf format. This Report contains far more detail including photographs, diagrams, graphs and data and will make compelling reading for those truly interested in the issue.

Click here to download the full paper including photographs and tables of data

(In line with our open-data policy, datasets used in the study are provided in Appendix 2 (pp. 11-16).)

We also welcome comments on the methodology.


[1] Former NSW Department of Natural Resources research scientist and weather observer.

Case Study Amberley, Queensland, Australia

Part 1: A robust fitness test for maximum temperature data

Dr. Bill Johnston[1]

Synopsis

The first post in this series about the reliability of automatic weather stations (AWS) provides physical justification for using the First Law of thermodynamics as the reference frame against which weather station data may be assessed. While advection warms the air, which increases maximum temperature (Tmax), latent heat removed by evaporation of rainfall is locally cooling. The First Law theorem predicts a dynamic balance between average annual Tmax and annual rainfall, which is evaluated statistically.

Adopting Amberley RAAF as the case study, analysis of Tmax ~ rainfall residuals identified non-rainfall changes that impacted on trend, while statistical significance and the variation explained objectively measured overall conformance with the First Law theorem. Analysis of Tmax for six other sites showed the methodology was robust, replicable, widely applicable and therefore useful for benchmarking the operational performance of Australia’s automatic weather station network.
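The sketch below illustrates the general idea of that residual analysis under stated assumptions: annual mean Tmax is regressed on annual rainfall, and the residuals are then examined for a step change unrelated to rainfall. The numbers and the position of the step are invented; this is not the Amberley analysis itself.

    import numpy as np

    # Invented annual rainfall (mm) and mean Tmax (deg C) for ten years.
    years = np.arange(1990, 2000)
    rain  = np.array([820., 640., 905., 550., 760., 480., 700., 615., 830., 560.])
    tmax  = np.array([29.1, 29.6, 28.8, 30.0, 29.3, 30.2, 29.5, 29.8, 29.0, 30.1])
    tmax[5:] += 0.6        # a hypothetical site change adds a step from 1995 onward

    # First Law expectation: Tmax declines as rainfall increases (negative slope).
    slope, intercept = np.polyfit(rain, tmax, 1)
    resid = tmax - (slope * rain + intercept)

    # Compare mean residuals before and after a candidate changepoint.
    before, after = resid[:5], resid[5:]
    print(f"slope = {slope:.4f} C/mm (expected negative)")
    print(f"residual shift = {after.mean() - before.mean():.2f} C")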

Data are coarse and all sites were poorly documented. Incomplete and misleading metadata, inadequate site control and biased homogenisation methods undermine claims that Australia’s climate has changed or warmed. 

Background

With their current focus on climate warming, climate scientists need robust quality assurance methods for verifying that maximum temperature data (Tmax) used for testing hypotheses about the historic and contemporary climate, and its possible changes going forward, are fit for purpose. Although thousands of datasets are potentially accessible, not all are equally useful. The question examined in this series of posts about automatic weather stations is: are their data fit for the purpose of determining change or long-term trends, or are they only useful for monitoring day-to-day weather?

The problem

Since the 1990s Australia’s Bureau of Meteorology has all but abandoned its monitoring network. Met-offices like those at Ceduna, Oodnadatta, Cobar and Mildura sit empty; some, like Hobart and Sydney Observatory, have been sold or re-purposed, while others, like Coffs Harbour, Albany, Canberra, Tindal and Mount Isa, have been demolished and removed. Rapid-sampling platinum resistance probes have replaced thermometers, automatic weather stations have replaced observers, and the 230-litre Stevenson screens used to house instruments have been replaced by 60-litre ones, with that program accelerating over recent years. Due to their restricted capacity to buffer against transient eddies, the 60-litre screens are likely to be biased high on warm days (Figure 1).

Sensitive instruments housed in 60-litre screens, hard-wired to the Bureau’s computer in Melbourne, are a potent source of on-going bias. Using neighbouring sites that have changed to AWS more or less in unison to cross-validate and adjust each other’s data in real time reinforces, rather than corrects, potential errors and embedded anomalies. It is unlikely, for example, that data for any sites are truly independent, and it is likely that the combined behaviour of infrequently maintained, unmanned sites operating with small screens enhances warming. There are other problems too, including the propensity for 1-second observations to report random spikes – flurries of warmer air rising from pavements or created by vehicles or people going past. Of the 86,400 one-second values recorded by an AWS each 24 hours, only two carry forward as data – the highest is the maximum for the day and the lowest is the minimum.
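The sketch below, using a synthetic temperature trace, illustrates the point about 1-second spikes: because only the highest and lowest of the 86,400 one-second samples survive as data, a single transient eddy can set the recorded daily maximum.

    import numpy as np

    rng = np.random.default_rng(1)
    seconds = np.arange(86_400)

    # A synthetic daily temperature trace with small 1-second noise.
    trace = 22 + 8 * np.sin((seconds / 86_400) * np.pi) + rng.normal(0, 0.1, 86_400)
    trace[50_000] += 1.5   # one brief flurry of warm air passing the probe

    print(f"daily maximum recorded:    {trace.max():.1f} C")
    print(f"maximum without the spike: {np.delete(trace, 50_000).max():.1f} C")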

Error control using cross validation also requires that neighbouring site data be of acceptable quality and unaffected by parallel non-climate effects, which is impossible to gauge from an office hundreds or thousands of kilometres from the site.

Figure 1. Internal view of the large Stevenson screen at Sydney Observatory in 1947 (Top above, black and white image) and the small screen at Wagga Wagga airport in June 2016 (lower above, colour image). While thermometers in the 230-litre screen are exposed on the same plane, the electronic probe at Wagga Wagga is placed behind the frame about 2 cm closer to the rear of the screen, which faces north to the sun. According to metadata, the 60-litre screen at Wagga Wagga was installed on 10 January 2001 and although thermometers were removed on 28 April 2016 intercomparative data is unavailable.

The burning question is whether data reported by AWS reflect the change in screen size (60-litres vs. 230-litres), behaviour of the electronic instrument (vs. observed thermometers), conversion of electrical resistance to temperature (calibration error), data processing (detection and filtering of erroneous values; averaging; cross-validation); reduced site control and maintenance (grass mowing, cleaning equipment etc.); the climate, or something else.

The First Law Theorem

The First Law of Thermodynamics, which is a universal theorem, is used as a reference frame for assessing the fitness of maximum temperature (Tmax) data. Data are expected to behave rationally and not in some random, chaotic way, and the methodology outlined in the following posts has been devised to test and evaluate conformance with the theorem. Using Amberley RAAF as a case study, the first paper in this series outlines the physical basis underpinning the approach. Six widely dispersed airport sites were also analysed to replicate and verify that the methods are robust and widely applicable. Subsequent posts will evaluate AWS datasets from across Australia and those for sites in the vicinity of capital cities.

As droughts are always hot and dry, and rainy years mild and moist, a lack of negative correlation between Tmax and rainfall, low explanatory power (variation explained), confounded signals and weird values – high Tmax in rainy years and vice versa – are causes for concern.

The First Law theorem provides a rational basis for determining if Australia’s automatic weather stations are any good, or if the data they produce consists of incoherent random numbers that bear little relationship to the ‘real’ climate.

An important link – find out more

The page you have just read is the basic cover story for the full paper. If you are stimulated to find out more, please link through to the full paper – a scientific Report in downloadable pdf format. This Report contains far more detail including photographs, diagrams, graphs and data and will make compelling reading for those truly interested in the issue.

Click here to download the full paper including photographs and tables of data


[1] Former Senior Research Scientist. Email: scientistATbomwatch.com.au

About

Welcome to BomWatch.com.au, a site dedicated to examining Australia’s Bureau of Meteorology, climate science and the climate of Australia. The site presents a straight-down-the-line understanding of climate (and sea level) data and objective and dispassionate analysis of claims and counter-claims about trend and change.

BomWatch delves deeply into the way in which data has been collected, the equipment that has been used, the standard of site maintenance and the effect of site changes and moves.

Dr. Bill Johnston is a former senior research scientist with the NSW Department of Natural Resources (abolished in April 2007); which in previous guises included the Soil Conservation Service of NSW; the NSW Water Conservation and Irrigation Commission; NSW Department of Planning and Department of Lands. Like other NSW natural resource agencies that conducted research as a core activity including NSW Agriculture and the National Parks and Wildlife Service, research services were mostly disbanded or dispersed to the university sector from about 2005.

BomWatch.com.au is dedicated to analysing climate statistics to the highest standard of statistical rigour.

Daily weather observations undertaken by staff at the Soil Conservation Service’s six research centres at Wagga Wagga, Cowra, Wellington, Scone, Gunnedah and Inverell were reported to the Bureau of Meteorology. Bill’s main fields of interest have been agronomy, soil science, hydrology (catchment processes) and descriptive climatology, and he has maintained a keen interest in the history of weather stations and climate data. Bill gained a Bachelor of Science in Agriculture from the University of New England in 1971, a Master of Science from Macquarie University in 1985 and a Doctor of Philosophy from the University of Western Sydney in 2002, and he is a member of the Australian Meteorological and Oceanographic Society (AMOS).

Bill receives no grants or financial support or incentives from any source.

BomWatch accesses raw data from archives in Australia so that the most authentic original source-information can be used in our analysis.

How BomWatch operates

BomWatch is not intended to be a blog per se, but rather a repository for analyses and downloadable reports relating to specific datasets or issues, which will be posted irregularly so they are available in the public domain and can be referenced to the site. Issues of clarification, suggestions or additional insights will be welcome.   

The areas of greatest concern are:

  • Questions about data quality and data homogenisation (is data fit for purpose?)
  • Issues related to metadata (is metadata accurate?)
  • Whether stories about datasets are consistent and justified (are previous claims and analyses replicable?)

Some basic principles

Much is said about the so-called scientific method of acquiring knowledge by experimentation, deduction and testing hypotheses using empirical data. According to Wikipedia the scientific method involves careful observation, rigorous scepticism about what is observed … formulating hypotheses … testing and refinement etc. (see https://en.wikipedia.org/wiki/Scientific_method).

The problem for climate scientists is that data were not collected at the outset for measuring trends and changes, but rather to satisfy other needs and interests of the time. For instance, temperature, rainfall and relative humidity were initially observed to describe and classify local weather. The state of the tide was important for avoiding in-port hazards and risks and for navigation – ships would leave port on a falling tide for example. Surface air-pressure forecasted wind strength and direction and warned of atmospheric disturbances; while at airports, temperature and relative humidity critically affected aircraft performance on takeoff and landing.

Commencing in the early 1990s, the ‘experiment’, which aimed to detect trends and changes in the climate, has been bolted on to datasets that may not be fit for purpose. Further, many scientists have no first-hand experience of how data were observed or of other nuances that might affect their interpretation. Also, since about 2015, various data arrive every 10 or 30 minutes on spreadsheets, to newsrooms and television feeds, largely without human intervention – there is no backup paper record and no way to certify that those numbers accurately portray what is going on.

For historic datasets, present-day climate scientists had no input into the design of the experiment from which their data are drawn and in most cases information about the state of the instruments and conditions that affected observations are obscure.

Finally, climate time-series represent a special class of data for which usual statistical routines may not be valid. For instance, if data are not free of effects such as site and instrument changes, naïvely determined trend might be spuriously attributed to the climate when in fact it results from inadequate control of the data-generating process: the site may have deteriorated for example or ‘trend’ may be due to construction of a road or building nearby. It is a significant problem that site-change impacts are confounded with the variable of interest (i.e. there are potentially two signals, one overlaid on the other).

What is an investigation and what constitutes proof?

The objective approach to investigating a problem is to challenge the straw-man argument that there is NO change, NO link between variables, NO trend; everything is the same. In other words, test the hypothesis that the data consist of random numbers or, as is the case in a court of law, that the person in the dock is unrelated to the crime. The task of an investigator is to open-handedly test that case. Statistically called a NULL hypothesis, the question is evaluated using probability theory, essentially: what is the probability that the NULL hypothesis is true?

In law a person is innocent until proven guilty and a jury holding a majority view of the available evidence decides ‘proof’. However, as evidence may be incomplete, contaminated or contested, the person is not necessarily totally innocent – he or she is simply not guilty.

In a similar vein, statistical proof is based on the probability that the data do not fit the mathematical construct that would hold if the NULL hypothesis were true. As a rule of thumb, if there is less than (<) a 5% probability (stated as P < 0.05) that the NULL hypothesis is supported, it is rejected in favour of the alternative. Where the NULL is rejected the alternative is referred to as significant. Thus in most cases ‘significant’ refers to a low P level. For example, if the test for zero slope finds P is less than 0.05, the NULL is rejected at that probability level and the trend is ‘significant’: were there truly no trend, an association this strong would arise by chance less than 1 time in 20. In contrast, if P > 0.05, the trend is not distinguishable from zero trend.
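As a hedged illustration of the zero-slope test described above, the sketch below fits a trend to synthetic annual temperatures that contain no real trend and reports the P value; the numbers are invented and the code only demonstrates the principle.

    import numpy as np
    from scipy.stats import linregress

    years = np.arange(1960, 2020)
    temps = 19.5 + np.random.default_rng(0).normal(0, 0.4, years.size)  # no built-in trend

    result = linregress(years, temps)
    print(f"slope = {result.slope:.4f} C/yr, P = {result.pvalue:.2f}")
    if result.pvalue < 0.05:
        print("NULL rejected: the trend is statistically significant")
    else:
        print("NULL not rejected: no evidence of a trend")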

Combined with an independent investigative approach BomWatch relies on statistical inference to draw conclusions about data. Thus the concepts briefly outlined above are an important part of the overall theme. 

Using the air photo archives available in Australia, Dr Bill Johnston has uncovered accurate and revealing information about how site changes have been made and how these have affected the integrity of the data record.