Month: October 2014

How will Antarctica’s ice sheet contribute to 21st century sea level rise?

by Professor Martin Siegert, Co-director, Grantham Institute

On 27th October I convened a meeting at the Royal Society of London to discuss the results of a recent 20-year research horizon scanning exercise for Antarctic Science (Kennicutt et al. 2014). Part of the discussion focused on the research needed to better quantify Antarctica’s likely contribution to sea level rise in the coming decades and beyond, as published in the new Intergovernmental Panel on Climate Change (IPCC) Synthesis Report.

The report states that, ‘Global mean sea level rise will continue during the 21st century, very likely at a faster rate than observed from 1971 to 2010, and will likely be in the ranges of 0.26 to 0.55 m [in the lowest emissions scenario] … and … 0.45 to 0.82 m [in the highest emissions scenario – the closest to “business as usual”]’. It also states that, ‘Based on current understanding, only the collapse of marine-based sectors of the Antarctic ice sheet, if initiated, could cause global mean sea level to rise substantially above the likely range during the 21st century.’ There is medium confidence that any additional sea level rise would be no more than tens of centimetres.

One of the speakers at the event, Prof. David Vaughan, the Director of Research at the British Antarctic Survey, supported the IPCC’s position by remarking that he knew of no glaciologist who would strongly advocate a different position to this, given the evidence at hand. As a glaciologist myself, I can easily accept Prof. Vaughan’s comment and I don’t believe it is controversial among the community. I was, however, provoked by it to consider the relevant issues a little further, given the uncertainties noted by the IPCC, and to take the opportunity to discuss it with colleagues at the meeting.

Could ice sheet collapse lead to further sea level rise?

Historically, ice sheet responses to global warming have been responsible for sea level changes of a metre or more per century. As the glaciers retreated after the last ice age, sea levels rose by an average of over a metre per century between 20,000 years ago and 10,000 years ago – a total of 120 m. Records also show that the rate of sea level rise can exceed this, however. During the so-called ‘meltwater pulse 1a’ (MWP1a) episode around 15,000 years ago, an increase of around 7 m per century took place. The cause of MWP1a remains uncertain, with some pointing to the rapid decay of the North American ice sheet, whereas others link the change to Antarctica. It may be that both ice sheets were involved to some degree, and the details of the issue remain hotly debated. The point to note is that changes in the cryosphere are certainly capable of causing global sea level to rise at a higher rate than the IPCC suggests.
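The arithmetic behind these rates is easy to check. The sketch below uses only the figures quoted in the paragraph above to convert the total post-glacial rise into an average rate per century, and to compare that with the MWP1a estimate:

```python
# Post-glacial sea level rise: ~120 m between 20,000 and 10,000 years ago.
total_rise_m = 120.0
duration_centuries = (20_000 - 10_000) / 100  # 100 centuries

average_rate = total_rise_m / duration_centuries  # metres per century
print(f"Average post-glacial rate: {average_rate:.1f} m/century")  # 1.2

# Meltwater pulse 1a: ~7 m per century, i.e. roughly six times the average.
mwp1a_rate = 7.0
print(f"MWP1a was ~{mwp1a_rate / average_rate:.0f}x the long-term average rate")
```

The comparison makes the key point concrete: MWP1a ran at several times the already substantial long-term deglacial average.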

It is worth considering whether we can rule out the possibility of a new meltwater pulse being locked in somewhere in Antarctica or Greenland, ready to be released to the ocean once some threshold has been reached. As the IPCC notes, several regions of the West Antarctic ice sheet (in particular) and East Antarctic ice sheet appear close to or at a physical threshold of change, where grounded ice retreat into deeper (below sea level) terrain leads to further accelerated loss of ice to the sea (often referred to as marine ice sheet instability). Papers earlier this year by Joughin et al. (2014) and Rignot et al. (2014) point to such irreversible change having already begun in the Amundsen Sea region of West Antarctica. According to Joughin et al. (2014) the full effects of such change may take several hundred years, in line with the IPCC’s position. Evidence from the other side of West Antarctica demonstrates a region the size of Wales being highly sensitive to future ocean warming (Ross et al. 2012), and that such warmth may be delivered within a few decades (Hellmer et al. 2012). Across the continent in East Antarctica, the structure of the underlying bedrock reveals evidence of major ice recession in the past (Young et al. 2011), hinting that the ice sheet response to warming is not necessarily restricted to West Antarctica. Indeed, while West Antarctica may be losing mass more quickly than anywhere else on the planet, the greatest potential for sea level change lies in East Antarctica, which is about ten times greater in volume.

So, after considering Prof. Vaughan’s point that no glaciologist would differ markedly from the IPCC on Antarctic ice sheet collapse, I returned a question to him and those gathered: how can we be sure that the Antarctic ice sheet won’t respond to ocean warming more quickly than expected in certain regions? The answer is we can’t be certain even though, like Joughin et al. (2014), we may consider it unlikely. While I did not dispute Prof. Vaughan’s point, in the light of both recent findings and more established figures on how ice sheets can change during episodes of global warming, there is surely a non-zero risk of much greater sea level rise over the coming decades than the IPCC alludes to.

Quantifying this risk is difficult – maybe impossible at present – and as a consequence is likely to be highly controversial, which is why the IPCC does not tackle it. The problem is that quantifying a non-zero risk of global sea level rise over 1 m in the next 100 years is a far more challenging problem – for both scientists and decision makers – than restricting the debate to what we consider most likely. Maintaining this restriction on the debate is neither safe nor sensible, however.

Glaciologists will point to the research needed on the Antarctic ice sheet’s sensitivity to ocean warming to advance the debate. In 20 years as a glaciologist, I have been surprised on numerous occasions by what we discover about the flow and form of past and present ice sheets. I am utterly certain that amazing new discoveries lie ahead. For this reason, an appropriately sceptical scientific attitude is to accept that our knowledge of Antarctica remains woefully inadequate to be certain about future sea level rise, and to always challenge the consensus constructively.

The solution lies in our ability to model the ice-ocean system in a way that allows confident predictions of the ice sheet response to ocean warming. To do this we need two things. The first is better input data, by way of high-precision mapping of the landscape beneath the ice sheet in the regions most sensitive to change, and in areas where no data have yet been collected (there are several completely unexplored parts of the continent). The data collected would also allow us to better understand the process of ice flow in key regions of potential change. The second advance needed is in the coupling of ice-sheet and ocean models. Both are challenging, but well within our abilities. Indeed, the horizon scanning exercise discussed last week made such investigations a priority.

The costs of decarbonising the UK

By Dr Flora Whitmarsh, Grantham Institute

The costs associated with reducing emissions in the UK have been discussed recently in the press. In an article in the Mail on Sunday, David Rose made the claim that energy policies shaped by the so-called “Green Blob” – a term coined by Owen Paterson for what he called “the mutually supportive network of environmental pressure groups, renewable energy companies and some public officials” – will cost the UK up to £400 billion by 2030, and that bills will rise by at least a third.

How much will action on climate change actually cost? The quoted figure of £400 billion equates to 1-1.5% of cumulative UK GDP over the next sixteen years. The most recent analysis carried out by the Intergovernmental Panel on Climate Change suggests that the costs of a low carbon economy would be around 1-4% of GDP globally by 2030. Analysis carried out by the AVOID consortium, which includes Grantham Institute researchers, found that the costs of staying within 2°C were 0.5-4% of global GDP, and a report on the costs of mitigation co-authored by the Grantham Institute put the costs at around 1% of global GDP. The figure quoted in the Mail on Sunday for the overall costs of decarbonisation is of the order of magnitude projected by experts, but these figures do not take into account the co-benefits of mitigation, such as improved air quality and energy security. In fact, a recent report by Cambridge Econometrics asserts that the UK’s decarbonisation pathway would lead to a net increase in GDP of 1.1% by 2030, due to structural changes in the economy and job creation resulting from the low-carbon transition.
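As a rough sanity check on that percentage, the sketch below divides the £400 billion figure by sixteen years of UK GDP. The annual GDP figures are illustrative assumptions (they do not appear in the article), and ignoring growth slightly overstates the share:

```python
# Rough check: is £400bn really 1-1.5% of cumulative UK GDP over 16 years?
# Annual GDP figures below are illustrative assumptions, not from the article.
total_cost_bn = 400.0
years = 16

for annual_gdp_bn in (1_600.0, 1_800.0):
    cumulative_gdp_bn = annual_gdp_bn * years  # ignores growth, for simplicity
    share = 100 * total_cost_bn / cumulative_gdp_bn
    print(f"GDP £{annual_gdp_bn:.0f}bn/yr -> cost is {share:.2f}% of cumulative GDP")
```

With these assumed GDP figures the share comes out at roughly 1.4-1.6%, broadly consistent with the 1-1.5% quoted once GDP growth over the period is allowed for.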

Whilst these estimates relate to the economy-wide cost of using low-carbon energy rather than carbon-intensive sources such as fossil fuels, it is not immediately clear from them what this means for the cost of living. The rising cost of household energy is a key concern for people in the UK who have already seen significant increases in the average bill since 2004 mainly due to the rising cost of gas. In a report published in 2012, the Climate Change Committee concluded that support for low carbon technologies would add an average of £100 (10%) onto energy bills for a typical household by 2020 – where a typical household is one that uses gas for heating, and electricity for lighting and appliances. A further increase of £25 per household is projected by 2030, but this is less than in a scenario with high levels of investment in gas-fired power generation.

Furthermore, this could be partially offset by improvements in energy efficiency. The Climate Change Committee expects that by 2020 the replacement of old inefficient boilers will reduce bills by around £35 on average. The use of more efficient lights and appliances could reduce bills by a further £85, and improved efficiency in heating, mainly due to insulation, could save another £25 on average. However, making these savings would depend on having the right policies in place to encourage energy efficiency.
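Tallying the Climate Change Committee figures quoted above shows how the offsetting works out for a typical household (a simple illustrative sum, assuming all the savings are realised):

```python
# Climate Change Committee figures quoted above, for a typical household (£/year).
low_carbon_support_2020 = 100  # added to a typical bill by 2020

# Potential efficiency savings by 2020:
savings = {
    "boiler replacement": 35,
    "efficient lights and appliances": 85,
    "heating efficiency (mainly insulation)": 25,
}

total_savings = sum(savings.values())
net_2020 = low_carbon_support_2020 - total_savings
print(f"Total potential savings by 2020: £{total_savings}")  # £145
print(f"Net change to a typical 2020 bill: £{net_2020:+d}")
```

On these figures the potential savings (£145) exceed the low-carbon support cost (£100), giving a net reduction of £45 – but only if the right efficiency policies are in place.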

In defence of biomass energy

By Professor Colin Prentice, AXA Chair in Biosphere and Climate Impacts

Further to previous posts on this blog regarding Owen Paterson’s recent speech to the Global Warming Policy Foundation, I would like to take this opportunity to correct his dismissive statement about biomass energy as a potential contribution to decarbonized energy production in the UK. This is what the former Environment Secretary said:


Biomass is not zero carbon. It generates more CO2 per unit of energy even than coal. Even DECC admits that importing wood pellets from North America to turn into hugely expensive electricity here makes no sense if only because a good proportion of those pellets are coming from whole trees.

The fact that trees can regrow is of little relevance: they take decades to replace the carbon released in their combustion, and then they are supposed to be cut down again. If you want to fix carbon by planting trees, then plant trees! Don’t cut them down as well. We are spending ten times as much to cut down North American forests as we are to stop the cutting down of tropical forests.

Meanwhile, more than 90 percent of the renewable heat incentive (RHI) funds are going to biomass. That is to say, we are paying people to stop using gas and burn wood instead. Wood produces twice as much carbon dioxide than gas.

There are two misconceptions here.

(1) It is extremely relevant that ‘trees can regrow’ – this is the whole reason why biomass energy is commonly accounted as being carbon neutral! To be genuinely carbon neutral, of course, every tonne of biomass that is burnt (plus any additional greenhouse gas emissions associated with its production and delivery to the point of use) has to be replaced by a tonne of new biomass that is growing somewhere else. This is possible so long as the biomass is obtained from a sustainable rotation system – that is, a system in which the rate of harvest is at least equalled by the rate of regrowth, when averaged over the whole supply region.
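The rotation condition described above can be written as a simple balance check. This is a schematic sketch, not an accounting standard, and the figures are invented for illustration:

```python
def is_carbon_neutral_rotation(harvest_t_per_yr: float,
                               supply_chain_t_per_yr: float,
                               regrowth_t_per_yr: float) -> bool:
    """Schematic check of the rotation condition: carbon released by harvest
    plus supply-chain emissions must be at least matched by regrowth,
    averaged over the whole supply region."""
    return regrowth_t_per_yr >= harvest_t_per_yr + supply_chain_t_per_yr

# Illustrative figures (tonnes of carbon per year, invented for the example):
print(is_carbon_neutral_rotation(1_000, 50, 1_100))  # True: regrowth keeps pace
print(is_carbon_neutral_rotation(1_000, 50, 900))    # False: stocks drawn down
```

The second case is the situation the next paragraphs discuss: harvest without replenishment, which is not carbon neutral.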

Now it has been pointed out several times in the literature (e.g. Searchinger et al., 2009; Haberl et al., 2012) that if biomass is burnt for energy and not replenished (for example, if trees are cut down and the land is then converted to other uses), then it is not carbon neutral. Indeed, the carbon intensity of this form of energy production is at least as high as that of coal. Paterson may have been influenced by a report on this topic (RSPB, Friends of the Earth and Greenpeace, 2012) which drew attention to the “accounting error” by which energy derived from biomass might be classed as carbon neutral while actually being highly polluting. But this refers to an extreme scenario, whereby increased demand for forest products leads to no increase in the area covered by forests. In this scenario, biomass energy demand would have to be met from the existing (global) forest estate, drawing down the carbon stocks of forests and forcing builders to substitute concrete and other materials for wood. This would certainly be undesirable from the point of view of the land carbon balance; and carbon accounting rules should recognize the fact.

Nonetheless, this extreme scenario is implausible. It assumes that the value of biomass as fuel would be comparable to that of timber (highly unlikely) and, more generally, that there would be no supply response to increased demand. In more economically plausible scenarios, the increased demand for biomass fuel is met by an increase in the use of by-products of timber production (which today are commonly left to decay or burnt without producing any energy), and by an increase in the amount of agriculturally marginal land under biomass production – including non-tree energy crops such as Miscanthus, as well as trees.

Paterson’s blanket dismissal of the potential for biomass production to reduce CO2 emissions is therefore not scientifically defensible. Sustainable biomass energy production is entirely possible, already providing (for example) nearly a third of Sweden’s electricity today. It could represent an important contribution to decarbonized energy production in the UK and elsewhere.

(2) It might seem to be common sense that planting trees (and never cutting them down) would bring greater benefits in extracting CO2 from the atmosphere than planting trees for harvest and combustion. All the same, it is wrong. The point is that just planting trees produces no energy, whereas planting trees for biomass energy production provides a substitute for the use of fossil fuels. There is an enormous difference. Indeed, it has long been known that even an absurdly optimistic reforestation scenario (converting all the land that people have ever deforested back into forests) would reduce atmospheric CO2 concentration by only a trivial amount, relative to the projected increases due to burning fossil fuel (House et al., 2002; Mackey et al., 2013).

I thank Jeremy Woods (Imperial College) and Jonathan Scurlock (National Farmers Union) for their helpful advice on this topic, and suggestions to improve the text.

 

References

Haberl, H. et al. (2012) Correcting a fundamental error in greenhouse gas accounting related to bioenergy. Energy Policy 45: 18-23.

House, J.I., I.C. Prentice and C. Le Quéré (2002). Maximum impacts of future reforestation or deforestation on atmospheric CO2. Global Change Biology 8: 1047-1052.

Mackey, B. et al. (2013) Untangling the confusion around land carbon science and climate change mitigation policy. Nature Climate Change 3: 552-557.

RSPB, Friends of the Earth and Greenpeace (2012) Dirtier than coal? Why Government plans to subsidise burning trees are bad news for the planet. http://www.rspb.org.uk/Images/biomass_report_tcm9-326672.pdf

Searchinger, T. et al. (2009) Fixing a critical climate accounting error. Science 326: 527-528.

 

Has climate change been exaggerated? Fact-checking Owen Paterson’s comments

By Dr Flora Whitmarsh, Grantham Institute

In a lecture to the Global Warming Policy Foundation, the former UK Environment Secretary Owen Paterson has criticised the current government’s climate and energy policies, suggesting there is too much emphasis on renewables and that the consequences of climate change have been exaggerated. A discussion of Mr Paterson’s comments on UK energy policy appears in another Grantham blog by Dr Simon Buckle. Here I will discuss one of the reasons for Paterson’s position, the belief that climate change has been exaggerated.

Paterson suggested that the Earth has not warmed as much as had been predicted, “ … I also accept the unambiguous failure of the atmosphere to warm anything like as fast as predicted by the vast majority of climate models over the past 35 years, when measured by both satellites and surface thermometers. And indeed the failure of the atmosphere to warm at all over the past 18 years – according to some sources. Many policymakers have still to catch up with the facts.”

If we look back to earlier attempts to quantify global warming, it is now becoming clear that while these attempts were not perfect, they were not hugely inaccurate either. Natural climate variation is more significant than global warming over shorter time periods, but about 25 years have now passed since the earliest attempts to produce policy-relevant projections of rate of warming, and subsequent publications have started to assess how accurate these projections were.

Early climate projections

In late 2013, the Intergovernmental Panel on Climate Change (IPCC), a body reporting to the UN, released the first volume of its Fifth Assessment Report. This volume contained an in-depth summary of scientific knowledge about climate science. Scientific understanding of the climate has come a long way since the IPCC released their First Assessment Report in 1990, but the basics of the greenhouse effect were well understood at the time. The projections of future temperature rise in the 1990 report represent the earliest attempt to produce a scientific consensus of opinion regarding the severity of global warming.

A paper published in 2010 by Frame and Stone checked the projections in the First IPCC Report against observed temperature rise. Under the “business as usual” emissions scenario, the IPCC’s best estimate for the projected temperature increase between 1990 and 2010 was 0.55°C, within a range of uncertainty. According to two different data sets, temperatures actually increased by 0.35°C (HadCRUT3) or 0.39°C (GISTEMP) during that period. This is just outside the broader range given by the IPCC, but the IPCC’s range was intended to reflect the uncertainty in the effects of greenhouse gas emissions on the long-term warming trend; no attempt was made to include natural climate variability. Frame and Stone performed calculations to account for natural variability using two plausible methods. Both methods showed that the measured temperature increase is consistent with the IPCC’s projections when natural variability is taken into account. In addition, emissions have not followed precisely the trajectory used by the IPCC, although on this timescale the difference is probably not very significant.
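Expressed as decadal rates, the comparison looks like this (a simple sketch using only the numbers quoted above):

```python
# IPCC FAR best estimate vs observations, 1990-2010 (two decades).
projected = 0.55  # °C over 1990-2010, business-as-usual best estimate
observed = {"HadCRUT3": 0.35, "GISTEMP": 0.39}

period_decades = 2.0
print(f"Projected rate: {projected / period_decades:.3f} °C/decade")  # 0.275
for name, delta in observed.items():
    rate = delta / period_decades
    print(f"{name}: {rate:.3f} °C/decade ({100 * delta / projected:.0f}% of projection)")
```

The observed warming came in at roughly two thirds of the best-estimate projection – the right magnitude, with the shortfall attributable largely to natural variability, as Frame and Stone showed.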

Another early attempt to make policy-relevant projections was published by Hansen et al. in 1988, and results from this work were presented in testimony to the US Congress in the same year. Analysis published in 2006 by Hansen et al. demonstrated that the 1988 calculations had been remarkably accurate, with the observed temperatures closely matching those projected under the most realistic emissions scenario. The exceptionally close agreement between the model projections and the observations may have been coincidental, since the sensitivity of the climate to carbon dioxide in Hansen’s original model was near the top of the currently accepted range. Nevertheless, the temperature increases projected by the model were close to the observations available in 2006.

It is reassuring that these early projections have proved to be of the right magnitude, even if the exact rate of warming was not captured. It is worth bearing in mind that the original projections were made about 25 years ago, and the subsequent analyses referenced here were carried out in 2006 and 2010, meaning that only 18-20 years of data were used. This is still not long enough to iron out the full effects of natural variability. Nevertheless, it is now clear that the planet is warming and that humans are responsible, something that could not be concluded unequivocally from the evidence available 25 years ago. It is testament to this overwhelming evidence that those opposed to action on climate change now rely on relatively minor criticisms of climate science to form the basis of their opposition.

Coming to Paterson’s second point, it is indeed true that there has been no significant increase in global surface temperatures in the 21st century so far. However, global warming is not expected to lead to a linear increase in surface temperatures. Indeed, the First Assessment Report of the IPCC, published in 1990, stated that “The [average global surface temperature] rise will not be steady because of the influence of other factors.” Other factors – notably solar cycles, volcanic eruptions and natural climate variation – are known to affect global surface temperatures. The lack of surface temperature increase this century is due to a combination of factors, but almost certainly there has been some contribution from natural changes in the amount of heat taken up by the ocean. It is important to note that the overall heat content of the planet continues to increase and this is still contributing to sea level rise and ice melt.

The impacts of climate change

Paterson continued, “I also note that the forecast effects of climate change have been consistently and widely exaggerated thus far.

“The stopping of the Gulf Stream, the worsening of hurricanes, the retreat of Antarctic sea ice, the increase of malaria, the claim by UNEP that we would see 50m climate refugees before now – these were all predictions that proved wrong.”

There is a hierarchy of uncertainty in climate change prediction. The increase in surface temperatures at a global level due to the greenhouse effect is well understood scientifically. The total amount of heat in the earth system is increasing due to greenhouse gas emissions, which is having the effect of melting ice and snow and warming the ocean, lower atmosphere and Earth surface. All of these impacts, along with ocean acidification from increasing atmospheric carbon dioxide concentrations, are almost certain to continue. Increasing temperatures will also have more complex dynamic effects, including on ocean currents and atmospheric circulation – key aspects of climate variability – as well as on weather patterns, including extreme weather. These impacts are generally harder to predict because there are more factors involved. Putting all of this together and trying to predict the effect of climate change on humans or ecosystems is even more complicated.

The large-scale Atlantic Ocean circulation, of which the Gulf Stream forms a part, is driven in part by processes in the North Atlantic that depend on the density of the water in the region. Polar ice melt and changing rainfall patterns due to climate change both have the effect of depositing relatively fresh (and therefore low-density) water in the North Atlantic, meaning this process could be affected by climate change. The possibility of a complete shutdown of this North Atlantic circulation has been discussed based on the results of simplified models that show this as a possible outcome. However, the mainstream scientific consensus has never been that this is likely. Again, it is worth going back to older IPCC reports, which form the most comprehensive overview of the scientific understanding of climate change at the time they were written. At the time of the IPCC’s Second Assessment Report in 1995, the available models suggested that the ocean overturning circulation would weaken due to climate change. Subsequent reports in 2001 and 2007 also projected a slowdown and discussed the possibility of a shutdown, but neither report predicted a complete shutdown before 2100. By the time of the latest IPCC report in 2013, the overturning circulation was projected to weaken by between 11% and 34% by 2100. A slowdown has not yet been detected in the observations; this is likely due to the significant natural variability in the strength of the overturning circulation and the limited observational record.

There is more than one way that climate change can affect hurricanes (or tropical cyclones more generally). Heavy rain is almost certainly becoming more frequent and intense globally, and this includes rain that falls during tropical cyclones. In addition there could be an effect on wind speeds or on the frequency of tropical cyclones. The IPCC’s Fifth Assessment Report reported observational evidence that the strongest tropical cyclones in the North Atlantic have become more intense and more frequent since the 1970s, although there is no evidence of a global trend.

There has been a global decline in ice and snow due to climate change. Taking sea ice specifically, Arctic and Antarctic sea ice have different characteristics. Arctic sea ice is longer lived and is declining in both area and mass. Antarctic sea ice is not declining in area; the ice is more mobile than in the Arctic, so its behaviour is more complex, and since its thickness has not been accurately measured it is not known whether it has gained or lost mass overall. Sea ice is not to be confused with the Greenland and Antarctic ice sheets, both of which are losing mass. This is discussed in more detail in a previous blog.

Coming back to the hierarchy of uncertainty, changes in malaria incidence and the numbers of potential climate refugees are in the most uncertain group of impacts. These changes depend on the detailed changes in climate in the location under discussion and the response of humans or mosquitos/malarial parasites to that. A change in malaria incidence is still possible, but this remains the subject of research. As well as local climate conditions, the number of climate change refugees would also depend on the response from local people, governments or other organisations in adapting to the effects of climate change. The number of unknowns here makes it very difficult to predict how many people might be displaced by climate change, but this does not undermine our confidence in climate science itself.

 

 

Paterson misses the point

By Dr Simon Buckle, Grantham Institute

Owen Paterson’s remarks on the UK response to climate change miss the point. I do not disagree with him that the UK decarbonisation strategy should be improved. In particular, there is a need for a more effective strategy on energy demand. However, my preferred policy and technology mix would be very different to his, and would include the acceleration and expansion of the CCS commercial demonstration programme in order to reduce the energy penalty and overall costs of CCS. And without CCS, there is no responsible way to use the shale gas he wants the UK to produce in the coming decades – or any other fossil fuel – for electricity generation or in industrial processes.

However, these are second order issues compared to his call for scrapping the 2050 targets and the suspension of the UK Climate Change Committee.  On current trends, by the end of the century, the surface temperature of our planet is as likely as not to have increased by 4°C relative to pre-industrial conditions.  The present pause in the rise of the global mean surface temperature does not mean we do not need to be concerned.   We are fundamentally changing the climate system, raising the likelihood of severe, pervasive, and irreversible impacts on society and the natural systems on which we all depend.

A cost-effective policy to limit these very real climate risks must be based on concerted, co-ordinated and broad-based mitigation action.  This is needed to deliver a substantial and sustained reduction in global greenhouse gas emissions, which continue on a sharply rising trajectory.  The best way to create the conditions for such action by all the major emitting economies – developed and developing, in different measure – is through the UN negotiation process, supplemented by bodies such as (but not confined to) the Major Economies Forum.  The focus of this process is now on achieving a deal covering emissions beyond 2020, due to be finalised at the Paris summit at the end of next year.

There are encouraging signs of progress, e.g. in both the US and China, and the EU is due to agree its own 2030 targets at the end of this month.  But the process is difficult and protracted.  I agree with Paterson that 2050 is not the be all and end all.  I have argued here that the Paris talks should focus on how the next climate agreement can help us collectively to achieve a global peak in emissions before 2030, the first necessary step to any stringent mitigation target, rather than trying to negotiate a deal covering the whole period to 2050.

If Paris is a success, we might then re-assess whether or not the UK’s current mitigation targets are adequate or not.  But we are rapidly running out of time to achieve what the world’s governments profess to be their aim of limiting global warming to at most 2 degrees Celsius above pre-industrial levels.  The longer we delay mitigation action, the more difficult that challenge will be and the more expensive.  At some point soon it will become impossible in practical terms.

Given its leadership on this issue over many decades, UK action to scrap the Climate Change Act and/or suspend or abolish the Climate Change Committee would be severely damaging.  Seeking short-term domestic political advantage – which is what this move appears to be – through recommendations that would undermine national, European and international efforts to limit climate risks is irresponsible.   Sadly, this seems to be what the so-called political “debate” in the UK has been reduced to.

2°C or not 2°C – should we ditch the below 2°C target for global warming?

By Professor Joanna Haigh, Co-Director, Grantham Institute

A commentary published in Nature this week has opened up a discussion about the value of using the goal of keeping global warming to below 2°C.

David Victor and Charles Kennel are concerned that the below 2°C target for global warming is not useful, partly because they consider it is no longer achievable and partly because global mean surface temperature does not present a full picture of climate change.  The problem comes, of course, in identifying an alternative approach to establishing what is required from attempts to mitigate global warming.

The 2°C target is in a sense nominal, in that there is no precise threshold at which everything goes from bearable to unbearable, but it does have the advantage of being easy to understand, for both policy makers and the wider public. The proposed alternative indicators, including ocean heat content and high-latitude temperature, have scientific validity, but the implications of changes in these parameters may not be obvious to people living away from these areas. Furthermore, monitoring any measure on a real-time basis will not avoid the intrinsic variability seen in the global temperature record. Ocean heat content shows an apparently unremitting upward trend at present, but a climate change denialist would have been happy to point out a “hiatus” in that trend during the 1960s.

Victor and Kennel also state that “the 2°C target has allowed politicians to pretend that they are organizing for action when, in fact, most have done little”. This criticism would apply to any target – provided it is still sufficiently distant to remain broadly plausible – and their proposal for “a global goal for average [greenhouse gas] concentrations in 2030 or 2050” would provide equal opportunity for prevarication.

According to a review of recent emissions reduction modelling studies conducted for the AVOID 2 programme, co-authored by Ajay Gambhir of the Grantham Institute, it is still possible to meet the 2°C target, provided that a broad portfolio of technologies is available and that there are no significant delays in globally coordinated mitigation action. A continuation of relatively weak policies to 2030, or the absence of specific technologies such as carbon capture and storage, could however greatly increase mitigation costs and, in some models, render the target unachievable.

Nevertheless, Victor and Kennel are right to point out the problems with the current over-simplistic approach and I hope that their initiation of a search for “indicators of planetary health” will spur someone to invent a useful new measure for monitoring and assessing climate change.

 

Read more about Grantham Institute research on climate mitigation