Last week, the Nobel physics prize was (half) awarded to Suki Manabe and Klaus Hasselmann for their work on climate prediction and the detection and attribution of climate change. This came as quite a surprise to the climate community – though it was welcomed warmly. We’ve discussed the early climate model predictions a lot (including some from Manabe and his colleagues), and we’ve discussed detection and attribution of climate change as well, though with less explicit discussion of Hasselmann’s contribution. Needless to say these are big topics which have had many inputs from many scientists over the years.
But RC has a more attuned audience to these topics than most, and so it might be fun to dive into the details of their early work to see what has stood the test of time and what has not, and how that differs (if it does) from their colleagues and rivals at the time.
Manabe’s Climate Modeling
Fortunately, Manabe recently wrote a retrospective on his early work in response to receiving the Crafoord prize in 2018. That paper (Manabe, 2019) gives a good overview of Manabe’s particular philosophy of climate modeling, which was very much focused on getting things to work, and not worrying too much about the details. He makes an eloquent argument for a hierarchy of modeling where simpler, functional models can contribute a lot to understanding in advance of the more complete and more detailed versions turning up. In this, he is in violent agreement with Isaac Held, his colleague at GFDL, and indeed most climate scientists.
But let’s go back to the beginning. Manabe’s early focus was on radiative-convective equilibrium, culminating in his seminal 1967 paper with his longtime collaborator Richard Wetherald (who passed away in 2011). The Manabe and Wetherald (1967) paper has been described as the most influential climate paper ever.
The key aspects were the inclusion of water vapour feedback as temperatures increased, and the use of ‘convective adjustment’ to maintain stability of the lower atmospheric column. While not a great parameterization of the complexity of real convection, it served to keep the troposphere and surface linked in ways that match what happens in the real world. In practice, it was a big advance towards realism over the work of Plass or Möller from a few years before (despite the lack of cloud feedback). Two examples of the sensitivity of their model (which have mostly held up) are useful to look at:
What they showed are the distinct fingerprints of two kinds of forcing: increasing solar activity, which warms all parts of the atmosphere, and carbon dioxide increases, which warm the surface and troposphere but cool the stratosphere and above. The source of this result is the spectral resolution of the radiative transfer model they were using, but oddly enough they don’t discuss it at all. In a subsequent short paper, Manabe (1970), Manabe extends this result to predict a temperature increase by 2000 of 0.8ºC based on a 25% increase in CO2, which was pretty close. (Funnily enough, this paper appeared in a volume about environmental risks that was edited by a young(er) S. Fred Singer, before his turn to the dark side.)
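To give a flavor of the ‘convective adjustment’ idea, here is a minimal sketch – not Manabe and Wetherald’s actual scheme – that assumes equal-mass layers and a fixed critical lapse rate of 6.5 K/km, and relaxes any unstable pair of adjacent layers back to that lapse rate while conserving their total energy:

```python
# Toy convective adjustment, illustrative only (not the 1967 scheme).
# Assumes equal-mass layers and a fixed 6.5 K/km critical lapse rate.
def convective_adjustment(temps, dz=1.0, gamma_crit=6.5, n_iter=200):
    """temps: layer temperatures (K), bottom first; dz: layer thickness (km)."""
    T = list(temps)
    for _ in range(n_iter):
        adjusted = False
        for i in range(len(T) - 1):
            lapse = (T[i] - T[i + 1]) / dz  # lapse rate between layers i, i+1
            if lapse > gamma_crit + 1e-9:
                # Reset the pair to exactly the critical lapse rate;
                # their mean temperature (i.e. energy) is conserved.
                mean = 0.5 * (T[i] + T[i + 1])
                T[i] = mean + 0.5 * gamma_crit * dz
                T[i + 1] = mean - 0.5 * gamma_crit * dz
                adjusted = True
        if not adjusted:
            break
    return T

# A super-adiabatic initial profile gets relaxed toward 6.5 K/km throughout.
profile = convective_adjustment([300.0, 280.0, 270.0, 255.0])
```

The point of schemes like this is exactly what the text describes: energy that would pile up at the surface gets mixed upward, keeping the surface and troposphere thermodynamically linked.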
Manabe’s subsequent work led to the development of the GFDL GCM, initially just including the atmosphere, but eventually with an ocean, and then the transient results shown in Manabe and Stouffer (1993). Famously, these early results were half the input into the Charney report’s estimate of climate sensitivity in 1979 (the other half being the preliminary results from Jim Hansen’s model at GISS). Both these predictions have been evaluated in recent years to see how well they did. The time series were included in the Hausfather et al (2020) paper and in the latest IPCC report:
The next step in climate modeling was to couple dynamic ocean models to the atmospheric models, and again, Manabe and his colleagues were pioneers (notably Manabe and Bryan (1969), but more comprehensively in Manabe et al. (1975), and Bryan et al. (1975)). But as expectations increased that coupled models could help climate predictions, there was a growing realization that there was a problem with how they were being designed.
The basic issue stems from the different timescales of the ocean and atmosphere. Given the ocean temperatures, an atmospheric model will equilibrate in a year or so of model time (maybe a decade if you care about the water vapour distribution in the upper stratosphere). However, given information from the atmosphere, an ocean model takes centuries to millennia to equilibrate the deep ocean. The tail of the age distribution for water parcels in the deep Pacific can reach 10,000 years or so. But back in the day, running a coupled model anything like that long was prohibitively expensive. So in order to get a coupled model simulation for the near-present, the ocean needed to be ‘spun up’ for a good while on its own, and then, once it was in equilibrium, the coupling was turned on, and voilà! a coupled simulation of the present-day climate. Except…
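The timescale separation is easy to quantify. For a system relaxing exponentially toward equilibrium, the time to close, say, 99% of the gap is about 4.6 e-folding times – the e-folding timescales below (~1 year for the atmosphere, ~1000 years for the deep ocean) are illustrative round numbers, not values from the papers:

```python
import math

# Time for an exponentially relaxing system, dT/dt = -(T - Teq)/tau,
# to close `fraction` of its initial gap to equilibrium.
def years_to_equilibrate(tau_years, fraction=0.99):
    return -tau_years * math.log(1.0 - fraction)

atmos = years_to_equilibrate(1.0)          # ~4.6 years for the atmosphere
deep_ocean = years_to_equilibrate(1000.0)  # ~4600 years for the deep ocean
```

A thousand-fold gap in relaxation timescales is why the ocean had to be spun up separately: running the full coupled system for thousands of simulated years was simply not affordable.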
… it didn’t generally work. The newly coupled model would drift away from today’s climate, sometimes with a collapse of the Atlantic overturning circulation, other times just towards a much warmer or cooler climate, or with a terrible ‘double ITCZ’. This was a problem because it’s not at all clear that the sensitivity of the simulated climate (which was off in serious ways) would be the same as the sensitivity of the real world. This stymied progress for a while (maybe a decade or so) as people worked to understand why the models drifted so much, and to find ways to fix it.
When I started in climate modeling (in the early 1990s), this was still a relevant issue, though two approaches had been adopted. One, advocated by Manabe’s group (and Hasselmann’s!), was the imposition of ‘flux corrections’ or ‘flux adjustments’ (Manabe and Stouffer, 1988; Sausen et al., 1988): artificial fluxes added at the ocean-atmosphere boundary, diagnosed so that both the ocean and the atmosphere stayed stable at the observed climatology, and then held fixed in all subsequent sensitivity experiments. This (by design) produced a good climatology, but effectively buried the models’ poor physics. The other approach was to work with models that had offsets from the real world (which you would keep trying to reduce) but whose sensitivities were more physically coherent.
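A toy single-box version makes the flux-adjustment logic concrete. All the numbers here are made up for illustration: the ‘model’ flux is deliberately biased, the correction is diagnosed as whatever flux is needed to hold the observed temperature steady, and it is then applied as a fixed term when running coupled:

```python
# Toy flux adjustment on a single ocean box with heat capacity C.
# Numbers are illustrative only, not from any actual model.

def model_flux(T_ocean):
    # A deliberately biased net flux (W/m^2): the model's natural
    # equilibrium sits at 14.0 degC rather than the observed 16.0.
    return -2.0 * (T_ocean - 14.0)

T_obs = 16.0                          # observed climatology (degC)
flux_adjustment = -model_flux(T_obs)  # flux needed to hold T_obs steady

def step(T, dt=0.1, C=10.0, corrected=True):
    F = model_flux(T) + (flux_adjustment if corrected else 0.0)
    return T + dt * F / C

# Start both runs from observations; one uses the fixed correction.
T_corr, T_free = T_obs, T_obs
for _ in range(1000):
    T_corr = step(T_corr, corrected=True)
    T_free = step(T_free, corrected=False)
# T_corr holds at 16.0; T_free drifts to the biased equilibrium near 14.0
```

This is exactly the trade-off described above: the corrected run has a perfect climatology, but only because the bias in `model_flux` has been papered over rather than fixed.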
The implications of the two approaches are difficult to assess without a perfect model simulation to compare against, and if we had that, there’d be no need to worry about drifts. Thus during the early 90s there was a fair bit of unresolved, almost religious, discussion about what should be done. Manabe was vocal that you needed a reasonable model to play with and make progress, while others were of the opinion that the sensitivity of a flux-corrected model wasn’t informative of the real world, and that using flux corrections as a crutch was actually holding back work on the physics that would (eventually) remove the need for the corrections in the first place. (Minor aside: I was a co-author on a paper that assessed this concept for a slightly simpler class of model, and found that the ‘flux-corrected’ version was not predictive of the ‘true’ sensitivity (Bjornsson et al., 1997).)
Over time the issue more or less resolved itself as models got incrementally better and computational resources increased so that longer coupled model simulations could be done more routinely. Occasionally, the issue still comes up (i.e. Gnanadesikan et al., 2018), but I think it’s fair to say that few modelers think it’s a useful tool anymore. For context, 10 out of 17 models in CMIP2 (~1995) used flux adjustments, and 6 out of 24 in the CMIP3 ensemble (~2001), but none in CMIP5 or CMIP6, while each generation has greater skill than the previous one. In his 2018 retrospective, Manabe doesn’t discuss the issue at all.
The proof of the pudding in climate model terms, though, is the quality and skill of the predictions. A recent paper (Stouffer and Manabe, 2017) assessed how good the Stouffer et al. (1989) predictions were. These came from an idealized 1%-per-year increasing CO2 experiment after 70 years, when CO2 has approximately doubled, and so the result is warmer than we would expect for 2020, but the pattern is quite robust:
Not too shabby!
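(As a quick sanity check on the idealized scenario: compounding 1% per year really does double CO2 in about 70 years.)

```python
import math

# Compounding growth of 1%/yr: doubling time is ln(2)/ln(1.01).
years_to_double = math.log(2) / math.log(1.01)  # ~69.7 years
ratio_after_70_years = 1.01 ** 70               # ~2.01, i.e. doubled CO2
```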
Hasselmann’s Statistical Insights
[I have to admit to not knowing Hasselmann’s oeuvre as well as Manabe’s, and to my recollection I don’t think we’ve met, so this might need some amendment…]
I think the key paper to look at is Hasselmann (1979), which really set the stage for formal methods of detection and attribution of climate change. Later papers, notably Hasselmann (1997) (pdf), extended this to multi-variate attribution problems (written in tensor notation no less, so that probably helped 😉). The basic idea is that although there are a vast number of degrees of freedom in the atmosphere/climate system, you can make a lot of progress by reducing the degrees of freedom and looking just at changes in the dominant modes of variability, comparing them with expected patterns from simulations. A key insight is that, depending on how the noise and the forced patterns line up, the ‘optimal’ pattern to detect might not be what you first thought. But note that this was written when “continuous model time series of comparable length to analyzed global or hemispheric data [were] not available”, so the paper is mainly conceptual. It was really only in the late 1980s, and more clearly and with more models in the mid-1990s, that the data became available to really apply these methods.
The challenge with all detection & attribution (D&A) work is that it must rely on counterfactuals – i.e. estimates of how the climate would behave in special cases – for instance, if the only forcing was greenhouse gases, or if there were only natural forcings or only internal variability. Since the real world has all of these things going on at the same time, it’s hard to extract them from the observations, particularly since good direct observations don’t stretch back more than a century or so, and proxy climate observations have their own, larger, uncertainties. But even with perfect observations, getting a full characterization of internal variability would be hard, and perhaps impossible. So in practice, the ‘noise ellipsoid’ in the above figure is almost always taken from control runs of coupled climate models which, as Hasselmann acknowledged, were not available in 1979.
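The core of the optimal fingerprint idea can be sketched in a few lines. In this toy version (patterns and covariances are made-up numbers, and the covariance would in practice be estimated from those model control runs), the expected forced pattern is g and the noise covariance is C; the ‘optimal’ fingerprint is f = C⁻¹g rather than g itself, because it down-weights directions where internal variability is large:

```python
import numpy as np

# Sketch of Hasselmann-style optimal fingerprinting with toy numbers.
g = np.array([1.0, 0.5, -0.2])   # expected forced (signal) pattern
C = np.diag([0.1, 2.0, 0.5])     # noise covariance (from control runs)

f_optimal = np.linalg.solve(C, g)  # optimal fingerprint, C^{-1} g
f_naive = g                        # raw pattern projection for comparison

def snr(f):
    # Signal-to-noise ratio of the detection variable d = f.x,
    # when x = g + noise with covariance C.
    return (f @ g) / np.sqrt(f @ C @ f)

# Rotating toward C^{-1} g never reduces the signal-to-noise ratio,
# which is the 'key insight' about the optimal pattern mentioned above.
```

This is why the ‘optimal’ pattern to detect may not look like the forced response at all: components of the signal sitting in noisy directions (the second entry of g here) get heavily discounted.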
Hasselmann’s work before this paper focused heavily on measurements and understanding of ocean waves and on the role of ‘random’ weather forcing on long-term ocean variability, and that work has been widely cited; afterwards, he played a key role in developing the MPI climate model (i.e. Cubasch et al. (1992)). But much of his later well-cited work built off the 1979 paper and involved further refinements on the theme of D&A, often working with Gabi Hegerl (i.e. Hegerl et al., 1996; Hegerl et al., 1997).
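That ‘random weather forcing’ idea (Hasselmann’s stochastic climate model) is simple enough to demonstrate directly: a slow ocean integrating memoryless white-noise ‘weather’ produces red-noise variability with long-term memory. The parameters below are illustrative, not from the original paper:

```python
import random

# Toy stochastic climate model: dT/dt = -lam*T + white noise,
# discretized with an Euler step. The damping lam and step dt are
# illustrative; the result is an AR(1) ('red noise') process.
random.seed(1)
lam, dt, n = 0.1, 1.0, 50_000
T, series = 0.0, []
for _ in range(n):
    T += dt * (-lam * T + random.gauss(0.0, 1.0))
    series.append(T)

# Lag-1 autocorrelation should be near the AR(1) value (1 - lam*dt) = 0.9:
# strong year-to-year persistence despite completely memoryless forcing.
mean = sum(series) / n
num = sum((series[i] - mean) * (series[i + 1] - mean) for i in range(n - 1))
den = sum((x - mean) ** 2 for x in series)
lag1 = num / den
```

The practical consequence for D&A is the one described above: slow components of the climate system generate substantial low-frequency ‘noise’ all by themselves, which is exactly what any detection scheme has to see through.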
These were very much the ideas that set the discussions in climate science in the 1990s. As you will recall, Hansen had declared in 1988 that “the greenhouse effect is here!”, based on a 3-sigma signal detected with the original GISS model. But the ocean model used there was a simple ‘Q-flux’ module with no dynamics, and so had no ENSO or other coupled modes of variability. The implicit estimate of the internal variability was, to be clear, not widely accepted. There are a couple of articles and responses from the time that give a flavor, for instance “Hansen vs. the World” by Richard Kerr, reporting from a workshop where Manabe and Hasselmann’s coauthors (notably Cubasch and Barnett) were present, and the responses from Wally Broecker and James Risbey.
Hasselmann himself commented on this in Science in 1997, after the 1995 Second Assessment Report from IPCC declared that “the balance of evidence” suggested that the greenhouse gas signal had indeed been detected. The figure he showed there:
… supported the IPCC conclusion, and his last line is worth repeating:
It would be unfortunate if the current debate over this ultimately transitory issue should distract from the far more serious problem of the long-term evolution of global warming once the signal has been unequivocally detected above the background noise.
Today, 24 years later, the detection and attribution of anthropogenic climate change is “unequivocal”, but we are still being distracted by ultimately transitory issues…
What if the prize had been given a decade ago?
The two restrictions on the award of disciplinary Nobel prizes are that the awardees must still be alive, and that there is a limit of three laureates per prize. For advances made in the 1960s and 1970s, the first is extremely relevant, and makes the second condition somewhat less so. But without wishing to take anything away from the two awardees this year, ten years ago the decision would have been much tougher. Norman Phillips published what is recognised as the first GCM in 1955 – he died in 2019. Akio Arakawa was the conceptual leader of climate modeling, directly influencing both Manabe and Hansen – he died earlier this year. Of the published papers predicting global warming in the 1970s (as catalogued in the Hausfather et al paper), the authors Rasool, Schneider, Benton, Sawyer, Broecker and Mitchell have all passed. Only Nordhaus and Manabe are still alive – though now both have won Nobel prizes.
But the building of climate models and their application is broader than can be recognized like this. There are no prizes for the people who actually wrote the code for the models – people like Gary Russell or Ernst Maier-Reimer (nicely eulogized by Hasselmann), the specialists who designed the parameterizations, or the teams that developed the inputs and processed the outputs, or the technicians who kept the old supercomputers running. In recent papers documenting model development, it’s not unusual to have dozens of authors – not at the level of the CERN collaborations, but significantly beyond the Nobel limit. The huge advances in understanding we’ve seen since the 1970s have been the work of thousands of smart and dedicated people all around the world, only a few of whom will ever be recognized as widely as this. We should always remember this while we celebrate the winners.
Finally, while it is many scientists’ dream to win a Nobel Prize, Hasselmann’s statement that he would rather have “no global warming and no Nobel Prize” captures the ambiguity that many of us feel in successfully predicting events and trends that we don’t want to come true.
- S. Manabe, “Role of greenhouse gas in climate change”, Tellus A: Dynamic Meteorology and Oceanography, vol. 71, pp. 1620078, 2019. http://dx.doi.org/10.1080/16000870.2019.1620078
- S. Manabe, and R.T. Wetherald, “Thermal Equilibrium of the Atmosphere with a Given Distribution of Relative Humidity”, Journal of the Atmospheric Sciences, vol. 24, pp. 241-259, 1967. http://dx.doi.org/10.1175/1520-0469(1967)024<0241:TEOTAW>2.0.CO;2
- S. Manabe, “The Dependence of Atmospheric Temperature on the Concentration of Carbon Dioxide”, Global Effects of Environmental Pollution, pp. 25-29, 1970. http://dx.doi.org/10.1007/978-94-010-3290-2_4
- S. Manabe, and R.J. Stouffer, “Century-scale effects of increased atmospheric CO2 on the ocean–atmosphere system”, Nature, vol. 364, pp. 215-218, 1993. http://dx.doi.org/10.1038/364215a0
- Z. Hausfather, H.F. Drake, T. Abbott, and G.A. Schmidt, “Evaluating the Performance of Past Climate Model Projections”, Geophysical Research Letters, vol. 47, 2020. http://dx.doi.org/10.1029/2019GL085378
- S. Manabe, and K. Bryan, “Climate Calculations with a Combined Ocean-Atmosphere Model”, Journal of the Atmospheric Sciences, vol. 26, pp. 786-789, 1969. http://dx.doi.org/10.1175/1520-0469(1969)026<0786:CCWACO>2.0.CO;2
- S. Manabe, K. Bryan, and M.J. Spelman, “A Global Ocean-Atmosphere Climate Model. Part I. The Atmospheric Circulation”, Journal of Physical Oceanography, vol. 5, pp. 3-29, 1975. http://dx.doi.org/10.1175/1520-0485(1975)005<0003:AGOACM>2.0.CO;2
- K. Bryan, S. Manabe, and R.C. Pacanowski, “A Global Ocean-Atmosphere Climate Model. Part II. The Oceanic Circulation”, Journal of Physical Oceanography, vol. 5, pp. 30-46, 1975. http://dx.doi.org/10.1175/1520-0485(1975)005<0030:AGOACM>2.0.CO;2
- S. Manabe, and R.J. Stouffer, “Two Stable Equilibria of a Coupled Ocean-Atmosphere Model”, Journal of Climate, vol. 1, pp. 841-866, 1988. http://dx.doi.org/10.1175/1520-0442(1988)001<0841:TSEOAC>2.0.CO;2
- R. Sausen, K. Barthel, and K. Hasselmann, “Coupled ocean-atmosphere models with flux correction”, Climate Dynamics, vol. 2, pp. 145-163, 1988. http://dx.doi.org/10.1007/BF01053472
- H. Bjornsson, L.A. Mysak, and G.A. Schmidt, “Mixed Boundary Conditions versus Coupling with an Energy–Moisture Balance Model for a Zonally Averaged Ocean Climate Model”, Journal of Climate, vol. 10, pp. 2412-2430, 1997. http://dx.doi.org/10.1175/1520-0442(1997)010<2412:MBCVCW>2.0.CO;2
- A. Gnanadesikan, R. Kelson, and M. Sten, “Flux Correction and Overturning Stability: Insights from a Dynamical Box Model”, Journal of Climate, vol. 31, pp. 9335-9350, 2018. http://dx.doi.org/10.1175/JCLI-D-18-0388.1
- R.J. Stouffer, and S. Manabe, “Assessing temperature pattern projections made in 1989”, Nature Climate Change, vol. 7, pp. 163-165, 2017. http://dx.doi.org/10.1038/nclimate3224
- R.J. Stouffer, S. Manabe, and K. Bryan, “Interhemispheric asymmetry in climate response to a gradual increase of atmospheric CO2”, Nature, vol. 342, pp. 660-662, 1989. http://dx.doi.org/10.1038/342660a0
- K. Hasselmann, “Multi-pattern fingerprint method for detection and attribution of climate change”, Climate Dynamics, vol. 13, pp. 601-611, 1997. http://dx.doi.org/10.1007/s003820050185
- U. Cubasch, K. Hasselmann, H. Höck, E. Maier-Reimer, U. Mikolajewicz, B.D. Santer, and R. Sausen, “Time-dependent greenhouse warming computations with a coupled ocean-atmosphere model”, Climate Dynamics, vol. 8, pp. 55-69, 1992. http://dx.doi.org/10.1007/BF00209163
- G.C. Hegerl, H. von Storch, K. Hasselmann, B.D. Santer, U. Cubasch, and P.D. Jones, “Detecting Greenhouse-Gas-Induced Climate Change with an Optimal Fingerprint Method”, Journal of Climate, vol. 9, pp. 2281-2306, 1996. http://dx.doi.org/10.1175/1520-0442(1996)009<2281:DGGICC>2.0.CO;2
- G.C. Hegerl, K. Hasselmann, U. Cubasch, J.F.B. Mitchell, E. Roeckner, R. Voss, and J. Waszkewitz, “Multi-fingerprint detection and attribution analysis of greenhouse gas, greenhouse gas-plus-aerosol and solar forced climate change”, Climate Dynamics, vol. 13, pp. 613-634, 1997. http://dx.doi.org/10.1007/s003820050186
- K. Hasselmann, “Are We Seeing Global Warming?”, Science, vol. 276, pp. 914-915, 1997. http://dx.doi.org/10.1126/science.276.5314.914
This is a contributed article