
Focus | CNRS International Magazine | Essential

Climate models are computer programs used to characterize the planet and its various components, including landmasses, topography, vegetation, volcanoes, atmosphere, oceans, and icecaps. The ocean and atmosphere are subdivided into hundreds of thousands of small areas where the laws of physics and chemistry apply, and which interact with one another. Based on observation data at a given moment in time, the model calculates for each area what should happen 20 or 30 minutes later, and so on, in order to model the climate over decades or even centuries. This requires extraordinary computing power. And, of course, the models take into account the effects of human-induced greenhouse gases.

The modeling groups that contributed to AR5 agreed upon a working framework called CMIP5, which defines a set of simulations to conduct for past, present, and future climates. “For example, we first verify that our models are able to reproduce the natural variability of the climate that existed prior to the mid-19th century,” explains Pascale Braconnot of the IPSL¹ modeling team, who developed one of the two French climate models.² A model must indeed be able to reproduce the characteristics of the different seasons and of pseudo-cyclical climatic phenomena like El Niño or the North Atlantic Oscillation. “We then simulate the period between 1860 and 2005 to quantify the role of the various phenomena, both natural and human-induced, that could modify the climate: variations in the quantity of solar energy, volcanic eruptions and natural aerosols, but also greenhouse gases, human-made particles, and changes in land use (e.g., deforestation).” These studies have confirmed, with a high degree of certainty, the role of human activity in recent global warming.

Predictive models are based on four hypotheses: physical scenarios of how the amount of solar energy absorbed by the Earth could evolve.
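The grid-and-time-step idea described above can be illustrated with a deliberately tiny sketch. This is not a climate model: real models apply the full equations of physics and chemistry to hundreds of thousands of 3-D cells, whereas this toy uses a one-dimensional ring of cells and simple heat diffusion as a stand-in. The function names and the diffusion rule are illustrative assumptions.

```python
# Toy illustration of grid-based time stepping (NOT a real climate model):
# the domain is split into cells, and each short time step updates every
# cell from its neighbors, just as models chain 20-30 minute increments
# into simulations spanning decades or centuries.

def step(grid, alpha=0.1):
    """Advance one time step: each cell relaxes toward its neighbors' mean."""
    n = len(grid)
    new = grid[:]
    for i in range(n):
        left, right = grid[(i - 1) % n], grid[(i + 1) % n]  # periodic ring
        new[i] = grid[i] + alpha * ((left + right) / 2 - grid[i])
    return new

def simulate(grid, steps):
    """Chain many short steps into a long run."""
    for _ in range(steps):
        grid = step(grid)
    return grid

# A 1-D "planet" with one warm cell; diffusion spreads the anomaly
# across the ring while conserving the total heat.
temps = [0.0] * 8
temps[0] = 10.0
result = simulate(temps, 200)
```

The key property, shared with real models, is that only local interactions between neighboring cells are computed at each step, yet long-run, large-scale behavior emerges from chaining millions of such steps.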
Over the course of the 20th century, the overall average temperature rose by about 0.9°C due to an energy surplus, called radiative forcing, of 1.8 W/m². “This may seem insignificant, given that the Earth’s surface receives an average of 200 W/m²,” Braconnot points out. “But it isn’t when you consider that the energy difference between the last Ice Age and the modern era is only 3 to 6 W/m².” The four possible evolutions of this radiative forcing between now and 2100 have been set at between 2.6 and 8.5 W/m², to account for all possible trends in human greenhouse gas emissions. Depending on the scenario, temperatures in the 21st century are expected to rise by 2 to 5°C, according to the official results used as the scientific basis for AR5.

The current trend in greenhouse gas emissions points to severe global warming, with a foreseeable temperature rise of nearly 5°C, unless drastic changes in energy policy are implemented very rapidly. The best-case scenario of a 2°C rise assumes that at least some of the CO2 will be cleared from the atmosphere by 2070. In other words, temperatures could rise this century as much as they would during a deglaciation, a process that normally unfolds over several millennia.

01. See note 02, page 20.
02. The other model was developed by the French meteorological service Météo France.

Contact information: Pascale Braconnot > pascale.braconnot@lsce.ipsl.fr

09. Models evaluate factors like the evolution of average precipitation according to various scenarios. [Maps: precipitation difference (mm/day) between the periods 1971-2000 and 2071-2100, under the modest and pessimistic scenarios, for the CNRM-Cerfacs and IPSL models. © IPSL/CNRM/CERFACS]
10. Modeling systems, like this one at the LSCE, use complex algorithms to predict the future of our climate. © S. Renard/CEA-LSCE
11. For climate simulations, the ocean and atmosphere are divided into small subzones.
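The scenario arithmetic quoted in the article can be sanity-checked with a toy linear extrapolation. To be clear, this scaling is my illustrative assumption, not the modelers' method: real models resolve feedbacks (clouds, ice, ocean heat uptake) and do not scale warming linearly with forcing, which is why their projected range is 2 to 5°C rather than the naive figures below.

```python
# Back-of-the-envelope check (linear extrapolation for illustration only):
# the article quotes ~0.9 C of 20th-century warming for ~1.8 W/m2 of
# radiative forcing, i.e. roughly 0.5 C per W/m2 if the response were linear.

OBSERVED_WARMING_C = 0.9     # 20th-century warming quoted in the article
OBSERVED_FORCING_WM2 = 1.8   # associated radiative forcing, W/m2

degrees_per_wm2 = OBSERVED_WARMING_C / OBSERVED_FORCING_WM2  # 0.5 C per W/m2

# The four AR5 scenarios span roughly 2.6 to 8.5 W/m2 of forcing by 2100;
# only the two extremes are shown here.
scenario_forcings_wm2 = {"best case": 2.6, "worst case": 8.5}
naive_warming_c = {name: f * degrees_per_wm2
                   for name, f in scenario_forcings_wm2.items()}
# The naive values roughly bracket the 2-5 C range the full models
# produce once feedbacks are included.
```

The point of the exercise is the order of magnitude: even a crude linear scaling of the quoted numbers lands in the same few-degree range as the scenarios, which is comparable to the energy shift of a deglaciation.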

