
Published by: Dr. Dick van der Wateren

Tags: Climate of the Past, climate variability, Meteorology, Weather Stations, Homogenisation




Homogenisation improves quality of historical climate record

26.01.2012

Raw weather data, whether from automatic weather stations or human-operated ones, cannot be used directly to study climate variability. The data contain various local signals that may obscure the climate signal. Examples of such local signals are the urban heat island effect or variations in measurement methods between stations. Before the weather data can be used for climate studies, they need to be corrected. A new article in Climate of the Past reports on a Europe-wide homogenisation study.

News item submitted by COSIS Member Victor Venema

To study climatic variability, the original observations are indispensable, but they are not directly usable. Besides real climate signals, they may also contain non-climatic changes. Correcting the data to remove these non-climatic influences is called homogenisation. The best-known non-climatic change is the urban heat island effect. The temperature in cities can be warmer than in the surrounding countryside, especially at night. Thus, as cities grow, one may expect that temperatures measured in cities become higher. On the other hand, many stations have been relocated from cities to nearby, typically cooler, airports.

Other non-climatic changes can be caused by changes in measurement methods. Meteorological instruments are typically installed in a screen to protect them from direct sun and wetting. In the 19th century it was common to use a metal screen on a north-facing wall. However, the building may warm the screen, leading to higher temperature measurements. When this problem was recognised, the so-called Stevenson screen was introduced, typically installed in gardens, away from buildings. This is still the most common weather screen, with its characteristic double-louvred door and walls. Nowadays automatic weather stations, which reduce labour costs, are becoming more common; they protect the thermometer with a number of white plastic cones. This necessitated a change from manually read liquid-in-glass thermometers to automated electrical resistance thermometers, which reduces the recorded temperature values.

Ingeborg Auer (Zentralanstalt für Meteorologie und Geodynamik, Wien, Austria) gives a further example of a change in the measurement method: “The precipitation amounts observed in the early instrumental period, let's say before 1900, are biased 10% lower than nowadays because the measurements were often made on a roof.” At the time, instruments were installed on rooftops to ensure that the gauge was never shielded from the rain, but it was later found that, due to the turbulent flow of the wind over roofs, some rain droplets and especially snowflakes did not fall into the opening. Consequently, measurements are nowadays performed closer to the ground.

To reliably study the real development of the climate, non-climatic changes have to be removed. For this, the small differences between one station and its direct neighbours are utilized. Because neighbouring stations share the strong natural climatic variability, non-climatic changes (usually shelter and instrument changes or station moves) stand out much more clearly in such difference series than in the record of a single station by itself. This method does not work when changes are applied to a whole country's network at once. Such network-wide changes are less problematic, however, because they are typically well documented.
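As a concrete illustration of this relative homogenisation idea, here is a minimal Python sketch. It is not one of the algorithms tested by HOME; the data, the jump size and the detection statistic are all invented for illustration. A candidate record is compared with one neighbour, the shared climate signal cancels in the difference, and a simple scan locates the most likely jump:

```python
import numpy as np

def detect_break(candidate, neighbour):
    """Locate the most likely non-climatic jump in a candidate series.

    Relative homogenisation: both stations share the regional climate
    signal, so their difference series is dominated by local,
    non-climatic changes (station moves, new instruments, ...).
    This is a toy single-breakpoint scan, not a HOME algorithm.
    """
    diff = candidate - neighbour          # shared climate signal cancels out
    n = len(diff)
    best_k, best_stat = None, 0.0
    for k in range(12, n - 12):           # require at least one year on each side
        left, right = diff[:k], diff[k:]
        # t-like statistic for a shift in the mean at position k
        se = np.sqrt(left.var(ddof=1) / k + right.var(ddof=1) / (n - k))
        stat = abs(left.mean() - right.mean()) / se
        if stat > best_stat:
            best_k, best_stat = k, stat
    return best_k, best_stat

# Toy monthly data: shared climate signal plus a 0.5 degC jump at month 120
rng = np.random.default_rng(0)
climate = np.cumsum(rng.normal(0, 0.05, 360))   # slow regional variability
candidate = climate + rng.normal(0, 0.3, 360)
candidate[120:] += 0.5                          # inhomogeneity, e.g. a relocation
neighbour = climate + rng.normal(0, 0.3, 360)

k, stat = detect_break(candidate, neighbour)
print(f"most likely break at month {k} (statistic {stat:.1f})")
```

In this toy setup the inserted jump dominates the difference series, which is exactly why relative methods are far more sensitive than inspecting a single record on its own.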

To study the performance of the various homogenisation methods, the COST Action HOME has performed a test with artificial climate data. The advantage of artificial data is that the non-climatic changes are known to those who created the data. The artificial data mimic climate networks and their data problems with unprecedented realism. The artificial data may have a warming trend, a cooling trend or no trend at all, to ensure objective testing of the methods. The main novelty is that the test was blind: while homogenising the data, the scientists did not know which station contained which non-climatic problem. The artificial data were generated, and the analysis of the results was performed, by independent researchers who did not homogenise the data themselves. Consequently, the COST Action is confident that the results are an honest appraisal of the true power of homogenisation algorithms.
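To make the blind-test setup concrete: artificial series can be generated so that the truth (where the breaks are and how large they are) is stored separately from the data handed to the testers. The Python sketch below is only schematic, with invented break rates and jump sizes; the real HOME benchmark simulated complete networks with far more realistic statistics:

```python
import numpy as np

rng = np.random.default_rng(42)

def make_benchmark_station(n_months=600, break_rate=1/180):
    """Create one artificial station series with known inhomogeneities.

    A highly simplified version of the benchmark idea: the 'truth'
    (break positions and sizes) is recorded separately, so blind
    homogenisation results can be scored afterwards.
    """
    clean = rng.normal(0.0, 0.5, n_months)   # homogeneous 'climate' series
    perturbed = clean.copy()
    breaks = []
    for month in range(n_months):
        if rng.random() < break_rate:        # on average one break per 15 years
            size = rng.normal(0.0, 0.8)      # jump size in degC
            perturbed[month:] += size
            breaks.append((month, size))
    return perturbed, breaks                 # 'breaks' stays hidden from testers

series, truth = make_benchmark_station()
print(f"inserted {len(truth)} hidden breaks: {truth}")
```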

Some people who remain sceptical of climate change claim that the adjustments climatologists apply to the data, to correct for the issues described above, lead to overestimates of global warming. The results clearly show that homogenisation improves the quality of temperature records and makes the estimates of climatic trends more accurate. Enric Aguilar (Center on Climate Change (C3), Universitat Rovira i Virgili, Tarragona, Spain) states: “Our experiments confirm that homogenization methods applied in past studies improve the reliability of the climate studies, thus help to make better climate change evaluations, but also that we can still do better by using state-of-the-art homogenisation methods, like those advised by COST-HOME.”

In the past it was customary in homogenisation to compare a station with its neighbours by creating a reference time series, averaged over multiple neighbouring stations. Due to the averaging, the influence of random non-climatic factors is strongly reduced. Thus, if a jump was found in the difference time series between a station and its reference, the jump was assumed to be in the station and not in the reference, which was assumed to be homogeneous. Olivier Mestre (Meteo France, Toulouse, France) explains: “In recent years climatologists and statisticians have worked on advanced statistical methods that do not need a homogeneous reference. The traditional methods reduced the influence of non-climatic factors on the temperature measurements, but the complex modern methods clearly improved the data much more.” This finding could only be reached using the benchmark data simulating complete networks with realistic non-climatic problems. Thus it can now be recommended with confidence that climatologists use the new methods. Tamas Szentimrey (Hungarian Meteorological Service, Budapest, Hungary): “The recommendations are not only based on the numerical results, but also on a deep mathematical understanding of the algorithms. The mathematical basis is the key.”
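The traditional composite-reference approach can be sketched in a few lines of Python; the data and the number of neighbours are invented, and real implementations weight neighbours by correlation and handle missing values:

```python
import numpy as np

rng = np.random.default_rng(1)
climate = np.cumsum(rng.normal(0, 0.05, 360))       # shared regional signal
candidate = climate + rng.normal(0, 0.3, 360)
candidate[200:] += 0.6                              # hidden jump in the candidate
neighbours = [climate + rng.normal(0, 0.3, 360) for _ in range(5)]

def composite_reference(series_list, weights=None):
    """Average several neighbouring series into one reference series.

    Averaging damps the random noise of any single neighbour, so a
    jump in candidate-minus-reference is attributed to the candidate.
    """
    return np.average(np.vstack(series_list), axis=0, weights=weights)

reference = composite_reference(neighbours)
difference = candidate - reference                  # inspect this series for jumps
print(f"mean difference before/after month 200: "
      f"{difference[:200].mean():.2f} / {difference[200:].mean():.2f}")
```

The weak point is visible in the code: attributing every jump in the difference series to the candidate only works if the averaged reference is itself homogeneous, which is precisely the assumption the newer methods drop.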

The scientific article describing this study, with 31 authors, has just been accepted by the peer-reviewed journal Climate of the Past, a respected international open-access, open-review journal of the European Geosciences Union. Articles in open-access journals can be read freely by anyone; the costs of publication are borne by the authors. Open-access publishing makes it easier for researchers, including those from poorer countries, to stay up to date and to participate in science. The general public can also profit from open-access publishing, as access to the primary source can make the public debate on current scientific issues in newspapers and blogs better informed. Victor Venema: “Open access publishing is an exciting new possibility. Especially for this topic, we felt it was important that everyone can read the article.” Climate of the Past is also an open-review journal: this new way of reviewing scientific articles is public, so everyone can respond to the initial draft of the paper, and everyone can read these comments as well as the comments of the official peer reviewers of the manuscript.

The International Surface Temperature Initiative (ISTI) is working on an open and transparent framework for creating and hosting global temperature datasets. The main feature will be provenance, meaning that every temperature value can be traced back to its origin. The database will contain digital images of the original records, the keyed numbers, the temperature values in a common format, as well as quality-controlled and homogenised data. To be able to study the performance of the software performing all these steps, a similar artificial temperature dataset will be generated, building upon the valuable groundwork of the COST Action efforts. Kate Willett (UK Met Office, Exeter, United Kingdom): “The experience of HOME will help our Initiative to generate the best possible dataset to validate the homogenisation algorithms.”
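The provenance chain described here, from scanned image to homogenised value, can be pictured as a record type that keeps all processing stages of one observation together. The Python sketch below is purely hypothetical; it is not the actual ISTI databank format:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TemperatureRecord:
    """One observation with its full provenance chain (hypothetical).

    Illustrates the ISTI idea that every value can be traced back to
    its origin; field names and formats are invented for this sketch.
    """
    station_id: str
    date: str                                  # e.g. "1923-07"
    scan_image: str                            # digital image of the original record
    keyed_value: float                         # number as keyed in from the image
    converted_value_c: float                   # value in the common format (degC)
    qc_flags: List[str] = field(default_factory=list)
    homogenised_value_c: Optional[float] = None  # filled in after adjustment

rec = TemperatureRecord(
    station_id="DE-000123",
    date="1923-07",
    scan_image="scans/DE-000123/1923.png",
    keyed_value=21.4,
    converted_value_c=21.4,
)
rec.qc_flags.append("plausible-range-ok")
print(rec)
```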

This study would have been impossible without the support of COST, which finances the collaboration of European researchers. The Action included researchers from 27 COST countries, as well as Andorra, Australia, Canada and the USA.

Article in Climate of the Past (open access):

Victor Venema and 31 co-authors: Benchmarking homogenization algorithms for monthly data, Clim. Past, 8, 89-115, 2012.

