Climate Monitoring

‘Perhaps some day in the dim future it will be possible to advance the computations faster than the weather advances… But that is a dream.’
Lewis Fry Richardson, 1922

Since the 19th century, meteorology has become a networked science. Meteorologists have used the latest communications technologies to exchange and compile observations, sending data by post, by telegraph, by wireless and now via the internet.

The ambition to create a truly global science of weather has resulted in an evolving set of institutions: from 19th century international scientific conferences to the World Meteorological Organization, an agency of the United Nations founded in 1950 which now has 190 member states.

Lewis Fry Richardson’s distant dream of numerical weather prediction became reality during the second half of the 20th century, driven by the development of computing. Computing is vital not only to the forecasting abilities we now take for granted, but also to the more complex, longer-term study of climate – and particularly to studies of climate change.

Modern supercomputers are immensely powerful, but they are not the only way to perform the necessary calculations. An alternative is to use a vast network of ordinary computers: this form of distributed computing can, in aggregate, harness even more computing power than a single machine.
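As a rough sketch of the idea (not the actual infrastructure used by climateprediction.net), the example below farms independent model runs out to separate worker processes standing in for volunteers’ computers; the run_model function and its numbers are invented for illustration.

```python
from concurrent.futures import ProcessPoolExecutor
import random

def run_model(work_unit):
    """Stand-in for one independent climate-model run (the real runs take days)."""
    rng = random.Random(work_unit)        # each work unit carries its own seed
    warming = 0.8 + rng.gauss(0.0, 0.2)   # invented output: simulated warming in degrees C
    return work_unit, warming

if __name__ == "__main__":
    work_units = range(100)               # in reality, hundreds of thousands of runs
    # Worker processes stand in for the volunteers' home computers.
    with ProcessPoolExecutor() as pool:
        results = dict(pool.map(run_model, work_units))
    print(f"collected {len(results)} completed runs")
```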

[Video: animated globe showing results from climateprediction.net]

The animated globe above shows results from the climateprediction.net distributed computing project, based at the University of Oxford. Thousands of volunteers all over the world donate idle time on their home computers to run global and regional climate models.

The globe shows two simulations of a global climate model running from 1900 through to 2020. One scenario is ‘natural’, based on the level of greenhouse gases as it might have been without human CO2 emissions; the other is ‘forced’, taking actual emissions into account. The changing images show the mean seasonal temperature averaged over a 10-year period.

The model is run hundreds of thousands of times, each run using slightly different physics and initial conditions. Such repetition helps to quantify the uncertainties and to refine the predictions, providing a better scientific basis for addressing one of the biggest global issues of the 21st century.
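The ensemble idea can be illustrated with a toy calculation (this is not the climateprediction.net model itself, whose physics is vastly more complex; the toy_run function, the parameter values and the forcing numbers below are all invented for the example). Many runs with slightly perturbed parameters yield a range of outcomes rather than a single answer, and the spread of that range is a measure of the uncertainty.

```python
import random

def toy_run(sensitivity, forcing):
    """Toy zero-dimensional model: warming proportional to sensitivity times forcing (invented)."""
    return sensitivity * forcing

def ensemble(n_runs, forcing, seed=0):
    rng = random.Random(seed)
    # Perturb an uncertain model parameter slightly for each run, as in a perturbed-physics ensemble.
    runs = sorted(toy_run(rng.gauss(0.8, 0.2), forcing) for _ in range(n_runs))
    low, mid, high = runs[n_runs // 20], runs[n_runs // 2], runs[-n_runs // 20]
    return round(low, 2), round(mid, 2), round(high, 2)   # ~5th, 50th and 95th percentiles

natural = ensemble(10_000, forcing=0.3)   # invented forcing: greenhouse gases without human emissions
forced = ensemble(10_000, forcing=1.0)    # invented forcing: actual emissions included
print("natural scenario warming (5th/50th/95th percentile):", natural)
print("forced scenario warming  (5th/50th/95th percentile):", forced)
```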

The Case of Ozone
The investigation of atmospheric ozone has a special connection with Oxford.
Spectrographs being tested at Boars Hill, Oxford in 1926
A spectrograph displayed in the exhibition was designed and built by the Oxford physicist Gordon Dobson in the mid-1920s. The spectrograph splits incoming light into a frequency spectrum, and early tests of the instrument were conducted at Boars Hill, Oxford. Dobson wanted to investigate the unexpectedly high temperatures inferred in the stratosphere, and believed that the explanation might lie in the presence of ozone – a molecule of three oxygen atoms, O3, rather than the much more common O2.

The success of Dobson’s observations led him to build five more spectrographs and send them elsewhere in Europe; they were subsequently moved to stations all over the world. Dobson redesigned the instrument in the late 1920s and the network eventually grew to more than 100 instruments across the globe. Data from the spectrophotometer version were used to detect the ozone hole over the Antarctic in the 1980s and are still collected today.

Download a PDF of Dobson’s 1968 paper Forty Years’ Research on Atmospheric Ozone at Oxford.
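The principle behind the ozone measurement can be sketched, in greatly simplified form, using the Beer-Lambert law: ozone absorbs one ultraviolet wavelength much more strongly than another nearby one, so comparing the ratio of the two intensities above the atmosphere with the ratio measured at the ground reveals how much ozone the light has passed through. The sketch below illustrates that principle only, not Dobson’s actual procedure: the real instrument uses standard wavelength pairs and corrects for scattering, and the column_ozone_du function and its numbers are invented for the example.

```python
import math

def column_ozone_du(ratio_top, ratio_ground, alpha_diff, mu):
    """
    Greatly simplified Beer-Lambert retrieval (illustrative only).
    ratio_top    -- intensity ratio of the two wavelengths outside the atmosphere
    ratio_ground -- the same ratio as measured at the ground
    alpha_diff   -- difference in the ozone absorption coefficients of the two wavelengths
    mu           -- relative slant-path length of sunlight through the ozone layer
    Returns the total ozone column in Dobson units (1 DU = 0.001 cm of pure ozone at standard conditions).
    """
    column_cm = math.log10(ratio_top / ratio_ground) / (alpha_diff * mu)
    return column_cm * 1000.0

# Invented numbers, chosen so the answer lands near a typical mid-latitude value of about 300 DU.
print(round(column_ozone_du(ratio_top=4.0, ratio_ground=1.0, alpha_diff=1.0, mu=2.0), 1), "DU")
```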