Sandbags, severe flood warnings, mass evacuations – desperate measures as a three-metre-high wall of water bears down on a densely populated coastline. This isn’t fiction. Nor is it an account of tragic events far away in the Indian Ocean. This happened in November 2007 in England, when a huge storm surge welled up in the North Sea and then smashed into the country’s east coast.
On this occasion sea defences held fast and no one was injured. But a similar incident in 1953 left over 300 people dead. Climatologists now warn that extreme weather events such as this are set to become ever more common, as we start to feel the effects of global climate change.
There are other signs too. According to NASA, 2005 was the warmest year we’ve seen since records began. Malaria has been reported in Nairobi, Kenya as the city’s high altitude becomes warm enough for mosquitoes carrying the disease to survive there.
“There’s no question that we are seeing the effects of climate change already,” says Spencer Weart, director of the Center for History of Physics, at the American Institute of Physics, Maryland.
In February 2007, the Intergovernmental Panel on Climate Change (IPCC) published its Fourth Assessment Report, which concluded that the world will probably see a temperature rise of between 1.8 and 4 degrees Celsius by the end of the century. “We know the last time the world was three degrees warmer for any long period of time, the sea level was six metres higher,” says Weart. That’s enough to submerge coastal cities from New York to Shanghai.
Piecing together the scientific evidence for climate change has been a long and difficult process. It has occupied thousands of bright minds for more than a century – and has been fraught with controversy and political intrigue.
The story began with the discovery, by French scientist Joseph Fourier, of what we now call the ‘greenhouse effect’. In 1824, Fourier calculated that the Earth was a lot warmer than it should be according to estimates based purely on its distance from the Sun. He surmised that the planet’s atmosphere must be slowing the rate at which the planet radiates heat away into space. Light from the Sun passes through the atmosphere to warm the land and oceans below, but the atmosphere prevents this heat escaping – just like the glass in a greenhouse.
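Fourier’s puzzle can be illustrated with a simple modern energy-balance estimate (a back-of-envelope sketch, not his original calculation): balancing the sunlight the Earth absorbs against the heat it radiates gives an airless planet a temperature far below what we actually observe.

```python
# Effective (no-atmosphere) temperature of Earth from a solar energy balance.
# Illustrative modern values: solar constant 1361 W/m^2, albedo 0.3.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0              # solar constant at Earth's orbit, W/m^2
ALBEDO = 0.3            # fraction of sunlight reflected straight back to space

# Absorbed sunlight (spread over the whole sphere) = emitted thermal radiation:
#   S * (1 - albedo) / 4 = SIGMA * T^4
t_eff = (S * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25
print(round(t_eff))  # about 255 K, i.e. roughly -18 degrees Celsius

# Earth's observed mean surface temperature is about 288 K (+15 degrees C);
# that ~33-degree gap is the warming Fourier attributed to the atmosphere.
```

The same mismatch – a calculated temperature well below the observed one – is what led Fourier to suspect the atmosphere was trapping heat.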
In 1896, a Swedish chemist took the theory further. Svante Arrhenius noticed that the gas carbon dioxide (CO2) was especially good at trapping heat radiation. He soon realised what this meant: that the massive increases in atmospheric CO2, caused by coal burning during the industrial revolution, would fuel Fourier’s greenhouse effect and lead to global warming of the planet.
Far from triggering wide-scale panic, Arrhenius’s prediction was actually welcomed. During the late 19th century geologists had turned up evidence for what they perceived as a far greater climate menace. They discovered that the Earth’s prehistory had been punctuated by long periods of glaciation – when the planet’s surface froze over for hundreds of thousands of years at a time. Arrhenius had actually been investigating the greenhouse effect as a possible explanation of these ‘Ice Ages’. Instead, he stumbled upon a mechanism whereby human CO2 emissions could save us from a frosty fate. And it wasn’t the only benefit. Global warming could be advantageous to food production, bringing marginal lands into cultivation. “Vast land areas in the northern hemisphere could become good agricultural land if the weather warmed,” says Jack Meadows, a science historian at Loughborough University. “Canada and the eastern part of Russia was very much a wheat area then, but limited by the short growing season.”
But this was all still theoretical – hard evidence for the changing climate had yet to be seen. It’s true that, in the 15th century, Christopher Columbus had reported that there was less rainfall in the Canary Islands as the inhabitants cut the forest down. During the 17th and 18th centuries, northern Europe experienced an era of extreme cold when the River Thames would regularly freeze – known as the Little Ice Age.
These were regional effects though. It wasn’t until later that the signs of climate change on a global scale began to surface. “In the 1930s older people began to say, ‘you kids have it easy – when I went to school there were blizzards, or the rivers froze,’” says Weart.
A British scientist by the name of Guy Callendar first gathered numerical data to support the anecdotes. In 1938, he published figures showing that between 1890 and 1935, the Earth had warmed by about half a degree Celsius. He pointed out that carbon dioxide levels had risen by ten per cent in this time, owing to the industrial revolution. It was the first solid scientific observation linking climate warming with human carbon emissions.
But nobody believed it. By this time, other research had suggested that the Earth’s oceans absorb excess carbon dioxide, and thus act as a natural brake on the greenhouse effect. At any rate, even Callendar believed that the warming effect would still be largely beneficial.
It wasn’t until 1957 that this thinking was overturned. Roger Revelle, at the Scripps Institution of Oceanography in California, with his colleague Hans Suess, showed that as the ocean absorbs CO2 it becomes more acidic. This radically limits the amount of carbon it can soak up.
Fears become fact
Revelle also reasoned that human CO2 emissions must now be increasing exponentially. Not only was the world’s population doubling every few decades, but the carbon output per capita was itself doubling on the same timescale. His fears became fact just a year later, when Charles David Keeling, also at Scripps, began a careful programme of CO2 monitoring. He found the concentration of the gas to be 315 parts per million, compared with 280ppm in the pre-industrial 19th century. By 1961, Keeling’s continuing measurements had charted an inexorable year-on-year rise.
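The force of Revelle’s arithmetic is easy to check with illustrative numbers (the doubling period here is an assumption for the sketch, not his figure): when two factors each double over the same period, their product quadruples.

```python
# Sketch of Revelle's reasoning with illustrative (not historical) numbers:
# if population and per-capita emissions each double every ~30 years,
# total emissions quadruple over the same period.
population = 1.0   # arbitrary starting units
per_capita = 1.0

for period in range(3):  # three ~30-year periods
    population *= 2
    per_capita *= 2
    total = population * per_capita
    print(f"after period {period + 1}: total emissions x{total:g}")
# Total emissions grow fourfold each period: 4, 16, 64 - exponential growth.
```

Two modest doublings compound into a fourfold rise per generation, which is why Revelle expected emissions to run away rather than level off.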
The prevailing attitude of the time was to equate – without question – all human industry with progress. The work of Revelle and Keeling shook this philosophy. But the real body blow came from another quarter entirely: nuclear weapons. “People thought that bomb tests were causing earthquakes and droughts and floods. Fall-out was going around the world,” says Weart. “So human industry is not necessarily all progressive, and human beings can change the entire global environment.”
This awareness – along with concerns over issues such as pollution and chemical pesticides – sparked the first glimmers of the environmental movement. Climate change now had a public voice. Scientists started to listen, and carried out research into its consequences. “By the late 1960s, you began to find scientists warning that global warming may very well be a danger,” says Weart.
So began a long and intense phase of climate research. The number of scientific papers published on the subject leapt from two or three a year to between 20 and 30. By 1977, there was a chorus of scientists calling for policy steps to curb carbon emissions. Otherwise, they warned, the world was going to get dramatically warmer in the 21st century, with potentially catastrophic results.
Controversy was also brewing though. The new research revealed just what an intricate system the climate really is – dependent on many, many variables, with no single ‘master key’ to unlock its behaviour. This led some scientists to doubt that human activity was the root cause of global warming. These ‘climate sceptics’ blamed natural factors, such as variations in the brightness of the Sun.
Nevertheless, the evidence for anthropogenic global warming continued to accumulate. Scientists found that it wasn’t just carbon dioxide that causes global warming, but also other gases given off by human activity, such as methane and nitrous oxide – a greenhouse gas 300 times more potent than CO2. Chlorofluorocarbons (CFCs) made headlines. Used widely as refrigerator coolants and aerosol propellants (although now banned), they were up to 8,000 times as harmful as CO2. Worse still, it emerged that CFCs were ripping a hole in the Earth’s ozone layer – a protective blanket in the upper atmosphere that blocks harmful rays from the Sun.
The news enraged environmentalists, who stepped up their campaigning. In response, climate sceptics defended their position even more aggressively. The debate became a highly polarised, and sometimes even irrational stand-off.
There was only one way to break the stalemate: more cold, hard scientific research. “Over the last 20 years, the international scientific community has been carrying out research on many, many facets of the climate system,” says Alan Thorpe, chief executive of the UK’s Natural Environment Research Council. “Determining the cause of climate change has required improved understanding of many environmental factors, as well as improvements in our ability to model the climate system.”
That’s been achieved through massive advances in computer modelling. Increases in computing power have enabled basic models of the atmosphere, first formulated in the 1980s, to become vastly more realistic. They now accurately reflect the climate’s impact on land masses, oceans and ice sheets. The treatment of physical processes, such as atmospheric chemistry, has been greatly refined.
The results of this modelling have now all but ruled out natural processes as the chief cause of climate change. One study carried out last year by Mike Lockwood, of the Rutherford Appleton Laboratory, near Oxford, examined the last 20 years’ worth of solar variations – one of the major factors cited by climate sceptics. Lockwood found that if the Sun were the main contributor to climate change then the planet should actually be getting cooler – in clear contradiction of the evidence. Consequently, the IPCC stated in its 2007 Assessment Report that the likelihood of global warming since the mid-20th century being caused mainly by human greenhouse gas emissions is now in excess of 90 per cent.
In 2006, the Stern Review of the economic impact of climate change was published. It found that the dollar cost of inactivity – and so suffering the consequences of global warming – grossly outweighed the cost of taking early action to stop it. Meanwhile, military commanders warned of the potential threat to national security posed by mass migrations of refugees from flooded lands, and by conflicts over resources as starving nations went to war for food.
These strands came together to build a compelling case for governments to act. The Kyoto Protocol, capping carbon emissions, has now been ratified by 182 signatories – the notable exception being the United States. In 2009, world representatives are due to meet for talks in Copenhagen to finalise the terms of a new treaty that will replace Kyoto once it expires in 2012.
There’s still much to be done to avert environmental hell and high water. But understanding the science of global climate change may yet save the world. And it has achieved something else equally incredible – teaching national leaders to look beyond both their borders and short electoral terms. That’s got to be a good thing.
Dr Paul Parsons was formerly editor of award-winning science and technology magazine Focus. His latest book, The Science of Doctor Who, was long-listed for the 2007 Royal Society Prize for Science Books.
Timeline: The weathermen
1824: French scientist Jean Baptiste Joseph Fourier (1768–1830) predicts the greenhouse effect – in which the Earth’s atmosphere traps heat and makes the planet warmer than it otherwise would be.
The French polymath is best known for the ‘Fourier series’ – a mathematical tool for breaking down complex functions into sums of simple waves – useful in physics, such as heat flow and wave theory, and in communications. It was Fourier who discovered the ‘greenhouse effect’.
1896: Swedish chemist Svante Arrhenius (1859–1927) calculates that industrial carbon dioxide emissions will enhance the greenhouse effect, leading to global warming. He didn’t know it had already begun.
This Swedish chemist was first to realise that human emissions of carbon dioxide could aggravate the greenhouse effect. Arrhenius was also an early proponent of ‘panspermia’ – the theory (now taken very seriously) that spores of life can drift between the planets.
1938: The first evidence for global warming is brought to light by British researcher Guy Stewart Callendar (1898–1964). He links rising temperatures with increasing carbon dioxide levels since 1850.
A steam engineer by profession, Callendar became intrigued by the impact of steam power, and other products of the industrial age, upon the environment. He was first to tally soaring temperatures since the mid-19th century with increasing carbon dioxide levels.
1957: Roger Revelle in the USA calculates that much less CO2 is absorbed by the Earth’s oceans than originally thought, leaving a great deal to warm the planet.
1958: Charles David Keeling (1928–2005) begins careful measurements of atmospheric CO2 levels at Mauna Loa Observatory in Hawaii. In 1961 he publishes evidence for a relentless year-on-year rise.
Geochemist and oceanographer Keeling extended Callendar’s work into the modern age, charting the ongoing rise in atmospheric CO2. His graph became known as the ‘Keeling curve’. In 2002, he received the US National Medal of Science for his climate research.
1967: The first reliable computerised climate simulation yields grim predictions – doubling of carbon dioxide levels from pre-industrial times will raise the Earth’s temperature by 2.3 degrees Celsius.
1998: American climatologists Michael Mann (1965–), Raymond Bradley and Malcolm Hughes publish the ‘hockey stick’ graph, showing a sharp upturn in global temperatures since the industrial revolution.
Mann led a team who put forward the ‘hockey stick’ graph. A key piece of evidence for anthropogenic global warming, it plotted northern hemisphere temperatures over the last 1,000 years, showing a sharp rise at the start of the 20th century.
2007: The Intergovernmental Panel on Climate Change reports that the likelihood of human carbon dioxide emissions being responsible for most observed climate warming is now over 90 per cent.
Why didn’t we listen?
For decades, the atomic bomb put climate change in the shade
Global warming was initially seen as a good thing – it could stave off the next Ice Age, and open up more land for food production. That changed dramatically when the first computer simulations of the 21st-century climate threw up grim prognostications of soaring temperatures and rising sea levels. That was back in 1967. Why has it taken so long for the world to heed the warnings?
Firstly, the year 2000 seemed a long way away. Those who had lived through the first half of the 20th century had seen terrific short-term upheavals, including two world wars and economic catastrophe. By the Sixties, other very immediate threats to the planet had emerged – such as the Cold War and the atomic bomb.
“It would be hard to expect anybody to take action until they saw changes on a timescale that was meaningful to them,” says Spencer Weart, director of the Center for History of Physics, at the American Institute of Physics, Maryland. “It was hard to imagine planning even two years ahead.” Scientists were listening, though. Come the 1970s, many were devoting serious research effort to climate change. The problem was that convincing governments to change tack on any scientific issue required scientific consensus. Yet the climate is complex – understanding it to the point where consensus could be reached was to take many years of research.
By the end of the 1990s most agreed it was more likely than not that humans were to blame. At the same time, the end of the Cold War and increased international stability allowed environmental issues to rise up the public agenda. “People were now willing to plan 50 years ahead,” says Weart.
Under pressure from their electorates to take action, and with scientific evidence mounting, policy makers had little choice but to make fighting climate change a priority.
This article was first published in the October 2008 edition of BBC History Magazine