Blog posts

UNFCCC climate negotiations: reflections from the Rhine

By Dr Simon Buckle, Grantham Institute

I spent a few days at the recent Bonn climate change conference (4-15 June) during the High Level Ministerial events on 5-6 June. Not that these were the most interesting things happening there. Unsurprisingly, by and large, Ministers did not stray from well-rehearsed positions, reflecting the continued skirmishing over the interpretation of the UN Framework Convention on Climate Change (UNFCCC) term “common but differentiated responsibilities” in a world that is radically different from the one in which the Convention was conceived.

More interesting were the briefing session on the UN Secretary General’s forthcoming climate summit in New York on 23 September, and a series of special events where negotiators could hear from and question IPCC authors about the implications of the IPCC AR5 reports for the UN negotiations and for the review underway of the long-term target (2°C or 1.5°C?). The latter is a key issue for vulnerable countries (e.g. small island states), given the very different potential implications for sea-level rise. It’s worth looking at some of the webcasts.

A particularly revealing moment came during a special event to engage with Observers to the UNFCCC process organised by the co-chairs of the so-called “Ad Hoc Working Group on the Durban Platform for Enhanced Action” (ADP). The ADP is the subsidiary body charged with developing the Paris 2015 agreement as well as trying to identify ways to enhance mitigation action before 2020.

As part of the ADP process, early in 2015 countries should notify the UNFCCC Secretariat of their “intended nationally determined contributions” (INDCs) to emissions reductions in the period after 2020. I therefore asked the ADP co-chairs how the UN process would ensure that these bottom-up contributions – and the aggregate global emissions they implied – would be consistent with the long-term climate targets that countries had committed to. This was indeed a critical issue, they said, but they had as yet no idea how it might be achieved.

This seems to me to be a pretty fundamental problem. An aspiration to achieve “a carbon neutral world in the second half of the century” (the current UNFCCC thinking on such an overarching aim) is in my view just not good enough to constrain climate risks and achieve a cost-effective transition to a low-carbon world. In particular, it says nothing about the emissions path over the next 30 years or so.

We know that effective international action on climate is difficult to achieve given concerns over competitiveness and free-riding. So instead of the top-down approach of earlier climate agreements (i.e. Kyoto, with its modest targets and timetables) or the bottom-up approach that has now emerged by default from the political trauma of the 2009 Copenhagen summit, we now need a hybrid approach. What I have in mind is that the aggregate bottom-up emissions pledges under the UN process need to be supplemented and given coherence by a political commitment among the major emitting economies – notably the US, China and the EU – to achieve a clear, measurable and negotiable near-term mitigation objective, which would be reflected within the Paris agreement. My own suggestion is a commitment to peak global fossil-fuel carbon dioxide emissions by 2030, or earlier if possible, with a subsequent decline.

Recent developments in the US and China suggest that such a goal is not impossible, particularly if the EU can get its act together and agree its 2030 targets at the October Council. Moreover, my own calculations suggest a global peak in fossil carbon dioxide emissions could be achieved while allowing developing country emissions (e.g. in India) to continue to grow for some time to come, as they must in any politically viable deal.

Of course, achieving a global peak in carbon dioxide emissions is just a first step towards a longer-term objective, and if it is achieved too late, or the peaking level is too high, we may not be able to achieve some of the more stringent climate targets. But if we focus only on the long-term target, we will end up in a zero-sum negotiation over the level and shares of a corresponding fixed carbon budget. The urgent need now is to reverse the continuing growth in global CO2 emissions, a necessary first step towards any long-term goal. The pace of emissions reductions after the peak and the eventual level of emissions in the second half of the century can be agreed at future summits.

No doubt several other ideas for making Paris a success are being discussed by governments in private as I write. One might be to build flexibility into the Paris agreement itself and have short commitment periods, perhaps to 2025 initially but agreed on a rolling five-year basis thereafter, mirroring the UK approach to setting carbon budgets. There needs to be far more public discussion of these alternatives. But such technical suggestions are worthless without political leadership. The next 18 months present an unprecedented opportunity to shift the world decisively onto a cost-effective, low climate-risk development path, with myriad benefits for our wellbeing and economic development. The Secretary General’s September summit will be an early indication of whether our political leaders are ready to take this step.

 

What is the best way to write about climate change in fiction?

By Dr Flora Whitmarsh, Grantham Institute

Last week I attended Weather Fronts, an event organised by Tipping Point. The event brought climate scientists together with writers of fiction and poetry to discuss how authors can bring climate change into their work.

Climate change is a global problem and solving it requires collective action. When too many citizens fail to exercise their voice, it is harder for such problems to be adequately addressed at the societal level. Artists have a voice they can use to communicate about the things that concern them. Writing about global warming of course has the potential to raise awareness of its impacts and possible solutions. Novels or poems can be more engaging for some audiences than scientific documents or news reports.

After two thought-provoking keynote speeches from John Ashton and Professor Chris Rapley, and a writer’s panel with Maggie Gee, Jay Griffiths, Gregory Norminton, and Ruth Padel, much of the time was spent in small group discussions. We talked about diverse subjects including utopia and dystopia in fiction, uncertainty in climate modelling, and who should take decisions about climate change. I met novelists, graphic novelists and poets who were passionate about the environment, many of whom are already writing about the subject.

Literature would seem to be the perfect medium to bring climate change to life, but the muse is fickle, and anybody setting out to write fiction with an explicit message could struggle to create engaging art. The danger is in creating something more akin to propaganda: a story with an obvious message, and unrealistic or one-dimensional characters who function as little more than a mouthpiece for the author’s own opinions.

We often hear that the best way to write about climate change is to create a “positive vision of the future”, something that probably works well in science communication and outreach. But in art this can be less effective, because one runs the risk of the message being too obvious.

It is often said of climate change communication that we should avoid overly frightening or guilt-inducing messages, because these risk leaving audiences feeling fearful and powerless. At the conference one writer suggested that dystopias are easier to write than utopias, implying it could be difficult to create art with a positive message about the future. However, a future in which climate change has been solved need not be a utopia. One solution to all of this is to write a futuristic story about something else entirely, but set it in a world powered by renewable energy, without explicitly alluding to climate science or policy.

The conflict between creating good art and giving it some sort of message cannot be easily solved, but one experienced writer summed up the answer in the words of Emily Dickinson: “Tell all the truth, but tell it slant”. In other words, in fiction it can often be more effective to hint at your message or arrive at it in a roundabout way than to spell it out explicitly.

 

Science and an open society

By Dr Simon Buckle, Grantham Institute

Professor Lennart Bengtsson’s resignation from the GWPF (Global Warming Policy Foundation) Academic Advisory Council has received wide coverage and raises important issues.

Whatever anyone’s views are on the role, motivation and integrity of the GWPF in this matter, it is up to individual academics whether or not to associate themselves with it in an advisory role.

It is regrettable that perceived political stances on the climate issue are apparently now affecting academic activity in this way. The Grantham Institute at Imperial has always opposed such behaviour, believing that scientific progress requires an open society. We try to engage with a wide range of figures, some with radically different views on climate change.

The outcome in this case is probably a reflection of the “us and them” mentality that has permeated the climate science debate for decades, and which is in part an outcome of – and reaction to – external pressure on the climate community. But we must be clear: this is not a justification. Concerted external pressure – if that is what it was – on Professor Bengtsson to resign from his GWPF role was wrong and misjudged.

Academic work on climate science and responses to climate variability and change should be politically neutral.  Policy towards climate is inevitably value-based and hence political.  We need the insights from high quality research and analysis to ensure our policy and political choices are as well informed as can be – importantly including social, political and economic research as well as that from the physical sciences and engineering.

What we learn from this event is that maintaining a healthy separation between science and politics – on either side of the political debate – is a continual but necessary challenge.  We have to keep the scientific endeavour as free as possible from political contention over policy responses.  All serious scientific voices on climate change therefore deserve both respect and to be heard. But given the enormity of the issues, these views require rigorous scrutiny and testing.

This episode should not distract us from the fact that we are performing a very dangerous experiment with the Earth’s climate. Even by the end of this century, on current trends, we risk changes of a magnitude unprecedented in the last 10,000 years. How we respond to that is a matter of public policy, on which of course scientists have both a voice and often strong opinions, but as citizens not as policy experts.

Collaboration with Stanford University and Biofuels Research at the Joint BioEnergy Institute

By C. Chambon, Research Postgraduate, Department of Chemistry

As part of a group of six Imperial students who visited California, I travelled to San Francisco to work on two projects: the New Climate Economy project, and a research collaboration with the Joint BioEnergy Institute.

The New Climate Economy project is a government-commissioned project looking at how economic goals can be achieved in a way that also addresses climate change. The Innovation stream, led by Stanford University and the Grantham Institute at Imperial, is focused on the potential economic and environmental impact of disruptive technologies. Beginning in January, a group of six Imperial students each focused on a different technology for the project, researching and preparing case studies for our weekly teleconferences. The topics researched were as varied as solar PV, nanomaterials, customer segmentation and the smart grid. My focus was on carbon capture and storage or utilisation (CCUS) technologies, and the policies needed to support them.

[Photo: The Imperial team at Stanford University]

In Palo Alto, we worked together with Stanford students to construct a business model for each of our technology clusters. Our research findings were presented to NRG Energy’s newly formed Station A, a kind of skunkworks for energy resilience within NRG, a wholesale power company. The collaboration was a successful and productive one, and several of us will continue to work with the New Climate Economy project to publish our research. The work will contribute to the UNFCCC COP negotiations in Paris in 2015.

During the latter half of the trip, I combined visits to Stanford with research for my PhD at Lawrence Berkeley National Lab across the bay. The San Francisco Bay Area is renowned as a bioscience and biotech hub, and is home to over 200 bioscience companies, start-ups and research institutes. One of these is the Joint BioEnergy Institute (JBEI), a branch of Lawrence Berkeley National Lab in the Berkeley hills. JBEI is a U.S. Department of Energy bioenergy research center dedicated to developing second-generation biofuels: advanced liquid fuels derived from the solar energy stored in plant biomass. The cellulosic biomass of non-food plants and agricultural waste can be converted to petrol, diesel and jet fuel, whilst the non-cellulosic part is a promising candidate to replace aromatic chemicals.

[Photo: The Joint BioEnergy Research Center in Emeryville, California]

My project at JBEI looked at the upgrading of lignin, extracted from the non-cellulosic part of woody biomass, into aromatic building-blocks. This experience was a valuable addition to my PhD project, which looks at the valorisation of lignin from pine wood to improve the economics of the biorefinery.  A highlight of my stay was a visit to the scaled-up biorefining facilities at LBNL, where a one-of-a-kind reactor is used to convert biofeedstocks into fuels. It was a very inspiring glance into the future of biorefining and I look forward to working closely with LBNL researchers and others working in the field of bioenergy.

The challenge of seasonal weather prediction

By Hannah Nissan, Research Assistant in Regional Climate Modelling, Physics

In April 2009 the UK Met Office issued their now infamous forecast: “odds-on for a BBQ summer”. By the end of August, total precipitation since June had climbed to 42% above average levels for 1971-2000 (UKMO, 2014). Why is it so challenging to provide seasonal forecasts several months ahead?

A question which arises often in conversations about climate change is “how can we predict the climate in 50 years when we can’t even get the weather right next week?” While we have no skill in forecasting the weather on a particular day in 2060, we can make much more confident projections about the average conditions at that time. Some explanation of this comes from the mathematician and meteorologist Ed Lorenz, who made an accidental discovery while rerunning a simplified weather simulation on his computer (Lorenz, 1963). Taking a seemingly harmless shortcut, he rounded a number to a couple of decimal places, and the result was a completely different weather forecast. His finding came to be known as the “butterfly effect”, which invokes a powerful analogy: tomorrow’s weather forecast depends so strongly on getting today’s weather right (a significant challenge) that forgetting to account for the effect of a butterfly’s flapping wings is enough to derail it. In contrast, climate is an average of the weather over a few decades, and doesn’t suffer as strongly from this debilitating “initial conditions” problem.
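
To make the butterfly effect concrete, here is a minimal sketch (in Python; my illustration, not part of Lorenz’s paper or any forecasting system) of the three-variable model from Lorenz (1963). The parameter values are Lorenz’s standard ones; the crude Euler integration, step size and initial conditions are arbitrary choices for demonstration. Two runs that differ only in the fourth decimal place of one variable soon bear no resemblance to each other:

```python
# Lorenz's 1963 three-variable convection model, integrated twice from
# initial conditions differing by 0.0001 in x. The growing separation
# between the runs is the "butterfly effect" in action.

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz-63 system by one crude Euler step."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

state_a = (1.0, 1.0, 1.0)     # our best estimate of "today's weather"
state_b = (1.0001, 1.0, 1.0)  # the same estimate, rounded slightly differently

for step in range(3001):
    if step % 500 == 0:
        # The separation grows roughly exponentially until it saturates
        # at the size of the attractor, i.e. total forecast divergence.
        print(f"t = {step * 0.01:4.1f}  |x_a - x_b| = {abs(state_a[0] - state_b[0]):.6f}")
    state_a = lorenz_step(*state_a)
    state_b = lorenz_step(*state_b)
```

Note that the long-run statistics of the two runs (the “climate” of the model) remain essentially identical; it is only the step-by-step “weather” that becomes unpredictable.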

The distinction becomes rather blurred when we turn to seasonal prediction. When looking at longer-term forecasts, say for this coming summer or next winter, correctly characterising the initial conditions of parts of the climate system still matters. This is particularly true for slowly varying fields like soil moisture and the upper ocean levels, and to a lesser extent for the atmosphere. However, on these timescales other challenges become increasingly important. As we move from forecasting days ahead to weeks, months, seasons and decades, the number and complexity of physical processes that must be well described by the computer models used to simulate the weather increases. Delivering better forecasts requires improvements on both these fronts, which compete for limited computer resources.

The benefits of developing more sophisticated prediction models must be balanced against the critical need for an indication of the uncertainty attached to a forecast. To do this, meteorologists create not just one weather forecast, but a whole ensemble of predictions starting from slightly different initial conditions. Other ensembles, designed to capture the uncertainty in the physics of the model itself, are also simulated. The spread of the ensembles tells us something about the range of possible outcomes and their likelihoods. We can never have perfect knowledge of the current state of the weather, nor a perfect model, so running lots of ensembles to develop a probabilistic weather forecast in this way is really important.
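
As a toy illustration of this idea (mine, not any operational centre’s configuration), one can run the same simple chaotic model many times from slightly perturbed starting points and watch the spread of outcomes grow. The model, the perturbation size and the ensemble size below are all arbitrary choices:

```python
# A toy ensemble forecast: integrate a simple chaotic system from many
# slightly perturbed initial conditions and summarise the spread, which
# is a measure of forecast confidence at each lead time.
import random
import statistics

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

random.seed(42)
# Each of the 50 members starts from the best estimate of the current
# state plus a small random error representing imperfect observations.
members = [(1.0 + random.gauss(0.0, 1e-3), 1.0, 1.0) for _ in range(50)]

for step in range(1, 2001):
    members = [lorenz_step(*m) for m in members]
    if step % 500 == 0:
        xs = [m[0] for m in members]
        # A narrow spread means a relatively predictable situation;
        # a wide spread means low confidence in any single forecast.
        print(f"t = {step * 0.01:4.1f}  mean x = {statistics.mean(xs):7.3f}"
              f"  spread = {statistics.stdev(xs):6.3f}")
```

Operational systems do the same thing with vastly more complex models, and add perturbations to the model physics as well as to the initial state.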

To illustrate the trade-offs involved between running ensembles and improving model complexity, consider an example from the UK Met Office. The wetness and storminess of UK winters are strongly linked with the North Atlantic Oscillation (NAO), a pattern of sea-level pressure between Iceland and the Azores. By investing in research to uncover some of the important physical drivers of the NAO and including these in their model, the Met Office have recently achieved a significant improvement in NAO forecasts (Scaife, 2014a,b). At the European Centre for Medium-Range Weather Forecasts (ECMWF), improving the way convection is modelled delivered better tropical precipitation forecasts and greater skill in predicting seasonal European weather (Bechtold, 2013).

Predictability itself is not a fixed quantity, but varies with each situation. Some weather systems are strongly influenced by larger-scale phenomena happening over longer periods of time, for which forecasts can be quite reliable. For example, weather in much of the tropics depends on the El Niño Southern Oscillation (ENSO), a pattern of sea surface temperature and atmospheric pressure in the tropical Pacific which persists for several months. Others may be dominated by brief, local processes that are much harder to predict, like convection[1]. In general, models are able to simulate the cascade of energy downwards, from large movements like the jet stream down to small waves and turbulence. The reverse case is not so easy: it is a major challenge to represent the effects of processes occurring at small physical scales and short time periods, like turbulence and convection, on the wider weather. Consistent with the butterfly effect, some of the effects of small-scale processes are inherently unpredictable and must be represented by random noise.

A good test of the usefulness of a seasonal forecast is whether it offers an improvement over simply looking at the average conditions. In other words, can the forecast beat the simple assumption that summer temperatures this year will be the same as they have been on average over the last 30 years? Weather prediction models beat such statistical forecasts in the tropics, where the influence of ENSO is strong and fairly predictable. This has not in general been the case over Europe and in other higher-latitude regions, where lots of different phenomena interact (Buizza, 2014). However, the latest forecast systems are starting to show some skill even here (Scaife, 2014b).
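
One common way to formalise this test (a standard definition, not specific to any of the studies cited here) is a mean-squared-error skill score: 1 means a perfect forecast, 0 means no better than climatology, and negative values mean worse than climatology. A sketch with made-up numbers:

```python
# Mean-squared-error skill score of a seasonal forecast relative to a
# climatological baseline: SS = 1 - MSE(forecast) / MSE(climatology).
def skill_score(forecasts, observations, climatology):
    n = len(observations)
    mse_forecast = sum((f - o) ** 2 for f, o in zip(forecasts, observations)) / n
    mse_climatology = sum((climatology - o) ** 2 for o in observations) / n
    return 1.0 - mse_forecast / mse_climatology

# Hypothetical summer-mean temperatures (deg C) over ten years:
observed = [16.2, 17.1, 15.8, 16.9, 17.4, 15.5, 16.0, 17.8, 16.4, 15.9]
climatology = 16.5  # the 30-year average, used as the "no skill" baseline
forecast = [16.0, 17.3, 16.1, 16.5, 17.0, 15.9, 16.2, 17.2, 16.6, 16.1]

print(f"Skill relative to climatology: {skill_score(forecast, observed, climatology):.2f}")
# Prints ~0.78 here: this hypothetical forecast beats the 30-year average.
```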

Temperature forecasts several months ahead are often better than looking at long-term data. Predictive skill for precipitation, however, is much worse. This is because rainfall is driven partly by local processes, right down to how individual raindrops are formed. Temperature, on the other hand, tends to be controlled by larger, more predictable features (Buizza, 2014). That said, the disastrous floods in Pakistan in 2010 were well forecast a week ahead by ECMWF because, in that particular situation, the rainfall was controlled by large air movements that were relatively well understood (Hoskins, 2012).

The challenging reality is that predictability varies from case to case according to the physical factors controlling each phenomenon. Extracting predictive information across both space and time scales can allow us to unpick these convoluted problems and make real improvements in seasonal prediction (Hoskins, 2012).

With thanks to Brian Hoskins for his helpful review comments.

References

Lorenz, 1963. Deterministic non-periodic flow. JAS 20:130-141.

UKMO, 2014. Summer 2009. http://www.metoffice.gov.uk/climate/uk/2009/summer.html. Accessed 20/03/2014.

Bechtold, P., N. Semane, P. Lopez, J.-P. Chaboureau, A. Beljaars, N. Bormann, 2013: Representing equilibrium and non-equilibrium convection in large-scale models. ECMWF RD TM 705, available at http://www.ecmwf.int/publications/library/ecpublications/_pdf/tm/701-800/tm705.pdf

Buizza, R., 2014. Coupled prediction: opportunities and challenges. Seminar, Imperial College London, 18th March.

Scaife, A., 2014a. Forecasting European winters: present capability and potential improvements. Willis Research Network Seminar: Forecasting, Flood & Fortitude: An Afternoon with Willis, 18th March.

Scaife, A., 2014b. Skilful long range prediction of European and North American winters. GRL, in press.

Hoskins, B., 2012. The potential for skill across the range of the seamless weather-climate prediction problem: a stimulus for our science. QJRMS 139(672):573-584.


[1] Convection is when local heating causes air to rise.

Stranding our fossil assets or stranding the planet

By Helena Wright, Research Postgraduate, Centre for Environmental Policy

Earlier this month Carbon Tracker came to Imperial College London to discuss their report on ‘Unburnable Carbon’. The report outlines research showing that 60-80% of the coal, oil and gas reserves of publicly listed companies are ‘unburnable’ if the world is to have a chance of keeping global warming below the globally agreed limit of 2°C. The event was followed by a lively debate.

The research, led by the Grantham Research Institute at LSE and the Carbon Tracker Initiative, outlines the thesis that a ‘carbon bubble’ exists in the stock market, as companies with largely ‘unburnable’ fossil fuel reserves are being overvalued.

In fact, the OECD Secretary-General Angel Gurria recently said:

“The looming choice may be either stranding those [high carbon] assets or stranding the planet.”

Digging a hole: ever deeper extraction, ever higher risks

The report found that despite these systemic risks, companies spent $674 billion last year to find and ‘prove’ new fossil fuel reserves. Capital expenditure has been increasing while production has been decreasing, with reserves ever harder to reach.

Companies like Exxon and Shell have been spending record sums trying to prove reserves that ultimately risk being stranded in future. The research by Carbon Tracker suggests this is a faulty business model, and in fact risks inflating the ‘carbon bubble’.

If these high levels of capital expenditure continue, we will see over $6 trillion allocated to developing fossil fuel supplies over the next decade – a huge sum of wasted capital. Luke Sussams outlined evidence that some companies are now starting to pick up on this and rein in their capital expenditure.

Investors and regulators are now picking up on the issue.  A Parliamentary Report on the ‘carbon bubble’ was released last week, and Chair of the House of Commons EAC, Joan Walley MP, said: “The UK Government and Bank of England must not be complacent about the risks of carbon exposure in the world economy”.

Carbon Entanglement: Getting out of the bubble

One issue that has been highlighted is the fact that some OECD governments receive rents and revenue streams from fossil fuels.  There is also a policy and credibility issue.  If businesses do not believe governments are serious about tackling climate change, they may carry on investing in fossil fuels and perpetuate the entanglement.

It seems that investors are currently backing a dying horse. But continued expenditure on finding new fossil fuel reserves might also be testament to the failures of recent climate policy.

Some have argued the ‘carbon bubble’ thesis relies on the assumption that governments will act on climate change. But arguably, there is not a question of ‘whether’ this government regulation will happen, but merely a matter of ‘when’.   There is a systemic financial risk to fossil assets, whether the necessary government regulation happens pre-emptively, or as a result of severe climatic disruption.

In the discussion that followed, the audience discussed whether the ‘carbon bubble’ will actually burst, and several participants suggested it was likely to burst unless it is deflated in a measured way. An audience member asked: “Don’t the investors have the information already?” and various participants felt they do not, demonstrating the need for enhanced disclosure on carbon risk.

Finally, the discussion turned to institutional investors who are investing in fossil fuels. Some commentators recognise the irony: how can a pension fund claim to be helping pensioners while potentially risking the lives of their grandchildren? It has also been found that several universities, including Imperial College, invest in fossil fuels, sparking a recent petition. The risks of climate change highlighted in the recently released IPCC AR5 report are driving calls for all types of investors to recognise the risks of high-carbon investment.

New Climate Economy Collaboration with Stanford University

By Phil Sandwell, Research postgraduate, Department of Physics and Grantham Institute for Climate Change

This March, six Imperial students travelled to Palo Alto, California, to work with Stanford University students on the innovation stream of the New Climate Economy.

The Global Commission on the Economy and Climate was established to investigate the economic benefits and costs associated with climate change mitigation and adaptation. Its flagship project is the New Climate Economy, a worldwide collaboration of internationally renowned research institutions organised into several work streams. The innovation stream was spearheaded by Stanford University and the Grantham Institute at Imperial College London.

The aim of this part of the project was to analyse how disruptive technologies, techniques or methods could develop, overtake their incumbent (and generally environmentally damaging) predecessors and mitigate greenhouse gas emissions. These ranged from carbon capture and storage to 3D printing, with my focus being concentrated photovoltaics (CPV).

Beginning in January, we held weekly video conferences with the two Stanford professors facilitating the course. Using their guidance and experience, we established the current limitations of our chosen technologies, how they are likely to advance and the conditions under which their development can be accelerated.

After travelling to Palo Alto, we were divided into groups with the Stanford students based on the themes of our research, for example electric vehicles and car sharing. We then integrated our findings, investigating the synergies and similar themes, and together built models to quantify the potential for greenhouse gas emissions reduction and how it could become achievable.

My research led to the conclusion that CPV can become economically competitive with the most common solar technology, flat-plate crystalline silicon cells, in the near future. Provided certain conditions are met (for example, the cost per watt continuing to decline as currently projected), CPV would compare favourably in regions with high direct normal irradiance, such as the Atacama Desert in Chile, the Australian outback and South Africa. One possible application of CPV would be supplying these countries’ mining industries, displacing their current fossil-fuel-intensive electricity generation and providing an environmentally responsible alternative – with even less embedded carbon and energy than silicon cells.

This project was a valuable addition to my PhD project, the focus of which is to investigate how several different photovoltaic technologies can mitigate greenhouse gas emissions. Collaborating on this project introduced me to interesting new ways of approaching my work, as well as identifying parallels between my research and that of others in the field of renewable energy technology.

A slow start to a global climate treaty

By Gabriele Messori, Stockholm University (former Imperial PhD student)

The United Nations’ climate negotiations usually gain the press spotlight once a year, when the big Conference of the Parties (COP) meeting takes place. The most recent COP, which took place in Warsaw last November, was discussed on this blog here. However, the efforts to design a global climate treaty under the umbrella of the United Nations are ongoing, and additional negotiations take place throughout the year. These are particularly important in preparing the ground for the COPs, and provide the occasion to iron out the contrasts which might hamper later work.

The most recent of these meetings took place last week in Bonn, Germany. Formally, this was the fourth part of the second session of the Ad Hoc Working Group on the Durban Platform for Enhanced Action, or ADP 2-4. The focus was on two distinct topics: firstly, the ongoing effort to design a global climate treaty, which should be agreed upon by 2015 and implemented by 2020; and secondly, the promotion of ambitious mitigation plans for the period before 2020. However, several points of contention emerged in the talks.

Far from reaching a quick consensus on the key topics, the participating countries raised several procedural issues which bogged down the discussions. These ranged from trivial aspects, such as the size of the meeting rooms assigned to the different groups, to more important considerations on the modality of the negotiations.

The crucial point was whether to proceed with informal consultations or establish contact groups. In the jargon of the United Nations, a contact group is an open-ended meeting which provides a more formal setting for the negotiations. A contact group needs to be officially established and its sessions are generally open to observers. The last two years have seen the negotiations carried out as informal consultations. Some countries, including the EU, opposed the creation of a contact group. Many others, including the least developed countries, argued that a new, formal setting was needed.

The latter proposal was finally adopted, thus establishing a contact group. However, the debate that preceded the decision was lengthy. While having an appropriate setting for the negotiations is important, the focus should always remain on climate change, which is the reason these meetings exist in the first place!

A second crucial discussion concerned the Nationally Determined Contributions (NDCs). These are national plans for action on climate change, made by all countries participating in the talks, and should form an important part of the 2015 climate treaty. At present, there is still no clarity on fundamental points such as the form the NDCs should take, the topics they should address and the mechanisms for evaluating their progress. There is also strong disagreement on how the burden of action should be shared between developed, developing and least developed countries. This is just a small selection of the unanswered questions concerning the national contributions; the complete list is much longer. Positions on these key aspects vary greatly. As an example, Brazil explicitly asked for the contributions to encompass the full range of actions needed to tackle climate change, including both mitigation and adaptation. Tuvalu, on the other hand, clearly stated that the NDCs should focus primarily on mitigation. Agreeing on the nature of the NDCs is one of the most challenging aspects of the negotiations.

On a more positive note, the work on pre-2020 action included for the first time technical expert meetings. These are meetings where experts can share best practices on renewable energy and energy efficiency with the country delegates. The meetings were praised by the vast majority of countries, and there were requests by a number of delegates, including those of the EU and the USA, to arrange similar meetings in future negotiations.

The week-long talks in Bonn also addressed many other topics, including transparency and equity in the 2015 climate agreement and climate finance.

Leaving aside the contrasts over specific items of the agenda, and considering the larger picture, the impression in Bonn was of a framework that is still missing some of its essential elements. While the technical expert meetings had a promising start, a lot still needs to be done both in terms of pre-2020 action and the 2015 climate treaty. In Warsaw last year, countries agreed to present their Nationally Determined Contributions “well in advance” of the 2015 COP, which will take place in Paris. In order for this to happen, there needs to be a rapid acceleration of the negotiations, and issues such as procedural aspects need to be dealt with swiftly, so that the discussion may focus on more concrete aspects of action on climate change.

2014 – A pivotal year for CCS?

By Dr Niall Mac Dowell, Centre for Environmental Policy

For centuries, the world’s economies have been underpinned by fossil fuels. Historically, this has primarily meant oil and coal, but since the mid-1980s natural gas has become increasingly important. Over recent decades there has been an increasing focus on electricity generation from renewable sources, and since about 1990 carbon capture and storage (CCS) has become an important part of the conversation around the mitigation of our greenhouse gas (GHG) emissions.

The role of CCS in addressing our GHG mitigation targets is clear and unambiguous – see for example the IEA CCS technology roadmaps, which show that by 2050 almost 8 GtCO2/yr needs to be sequestered via CCS, a cumulative total of 120 GtCO2 in the period from 2015 to 2050. Tellingly, this means that we need to see real action on the commercial-scale deployment of CCS globally by 2015, such that we have at least 30 installations around the world actively capturing and sequestering CO2 from a range of industrial and power-generation plants. Currently, there are 8 CCS projects around the world which are actively capturing and sequestering CO2 – primarily in North America (Shute Creek, Val Verde, Enid Fertilizer and Century Plant in the US and the Weyburn-Midale project in Canada) and Europe (Sleipner and Snøhvit in Norway), although Algeria has also been operating the In Salah project since 2004.
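
As a back-of-the-envelope check on those roadmap figures (my arithmetic, not the IEA’s), the quoted cumulative total implies an average capture rate of roughly 3.4 GtCO2/yr over the period, somewhat below what a straight-line ramp to the 2050 rate would deliver – i.e. deployment is assumed to build up gradually at first:

```python
# Rough consistency check on the quoted IEA roadmap numbers:
# 8 GtCO2/yr captured by 2050, 120 GtCO2 cumulative over 2015-2050.
years = 2050 - 2015                    # 35 years
end_rate = 8.0                         # GtCO2/yr reached in 2050
linear_ramp_total = 0.5 * end_rate * years
implied_average = 120.0 / years

print(f"A straight-line ramp would capture ~{linear_ramp_total:.0f} GtCO2")  # ~140
print(f"120 GtCO2 implies an average of ~{implied_average:.1f} GtCO2/yr")    # ~3.4
```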

However, it is notable that none of these plants are capturing CO2 emitted from power stations; rather they are capturing from industrial sources from which CO2 arises in a stream suitable for transport and storage. This is particularly important as CO2 emissions from power generation represent the single largest source of global emissions.

For this reason, it is particularly encouraging to note the UK’s leadership position in this area. Following the UK’s legally binding commitment to cut its GHG emissions by 80% by 2050, the Department of Energy and Climate Change (DECC) has recently signed agreements for Front End Engineering Design (FEED) studies for two commercial-scale CCS projects: the Peterhead project and the White Rose project.

These are two really exciting projects, both of which represent world firsts. The Peterhead project is a collaboration between Shell and SSE and is a retrofit of a post-combustion capture plant to an existing power station. This project is intended to operate in a base-load fashion and follows on from the Boundary Dam CCS project in Canada, which also uses Shell technology. However, a key distinction between the Boundary Dam and Peterhead projects is the CO2 source: Boundary Dam is a coal-fired power plant whereas Peterhead is gas-fired. From an engineering perspective, these plants present significantly different CCS challenges, and therefore the Peterhead project represents a real step forward.

It is, of course, important to acknowledge the significance of the Boundary Dam project. Returning to the IEA’s CCS technology roadmaps, we can see that CCS on coal-fired power plants is of vital global importance, potentially contributing about 40% of emissions mitigation in both OECD and non-OECD countries.

The White Rose project, on the other hand, is an example of oxy-combustion technology applied to a coal-fired power plant. This project is a collaboration between Alstom, Drax Power and BOC. Instead of performing a retrofit, the White Rose project is building a brand new, state-of-the-art 450MWe super-critical power plant with the capacity to co-fire biomass and coal, which, when combined with CCS, can lead to the plant producing carbon-negative electricity. Importantly, the White Rose plant will have an emphasis on the generation of flexible power; something which is key as more and more intermittent renewable energy enters our energy system.

Thus, 2014 is the year when CCS on power generation becomes a reality. Given that fossil fuels will remain a vital part of the world’s energy landscape for some time to come – with some sources indicating that they will account for over 66% of the world’s energy by 2100 – it is almost impossible to overemphasise the importance of our ability to use them in an environmentally benign and sustainable way. For this reason, I believe 2014 represents a pivotal year; one which, in time, we will look back on as the dawn of the age of sustainable fossil fuels.

Moving from tactics to strategy: extreme weather, climate risks and the policy response

By Dr Flora MacTavish and Dr Simon Buckle

In the press coverage of the recent floods, there has been a lot of discussion about whether the authorities could have been better prepared or responded more effectively. The National Farmers Union has called for the reintroduction of river dredging, although experts argue that dredging may be limited in its effectiveness. Local authorities have been criticised by experts for distributing sand bags rather than encouraging the use of more effective alternatives such as wooden or metal boards.

These are essentially tactical issues, however. It is the government and local authorities that have the vital strategic responsibility for fully embedding weather and climate risks into decisions on the level and focus of investment into flood defences and planning regulations about what can be built and where.

The persistence of the weather pattern that has caused this winter’s exceptional rainfall and floods has been very unusual.  However, as the Adaptation Sub-Committee of the UK Climate Change Committee noted in their 2011 report, heat waves, droughts and floods are all expected to get worse as a result of climate change. The recent Intergovernmental Panel on Climate Change assessment of the science (AR5) concluded that average precipitation was very likely to increase in the high and some of the mid latitudes, with a likely increase in the frequency and intensity of heavy precipitation events over land (see our note on The Changing Water Cycle).  If we are to improve our resilience, we need to get the strategic policy framework and incentives right.

Unfortunately, for flood risk this doesn’t seem to be happening yet, despite the Pitt Review after the 2007 floods. In 2011, the Climate Change Committee noted a decline in urban green space in each of the six local authority areas studied, and an increase in hard surfacing in five of the six. The Committee’s 2012 report showed that the UK has become more exposed to future flood risk. It judged that four times as many households and businesses in England could be at risk of flooding in the next twenty years if further steps are not taken to prepare for climate change.

In particular it noted that:

  • Development within the floodplain in England has grown at a faster rate (12%) than development outside it (7%) over the past ten years;
  • One in five properties built in the floodplain were in areas of significant flood risk;
  • Levels of investment in flood defences and uptake rates of protection measures for individual properties will not keep pace with the increasing risks of flooding due to climate change.

The Committee has acknowledged that the economic and social benefits of new developments may not always be outweighed by the risks of building on flood plains.  Decision makers should weigh up the trade-offs between long term risks such as climate change and other shorter term priorities, but the Committee judged that this was not happening “widely or consistently” at the time they wrote their 2011 report.

The government is in the process of trying to implement the Flood Re scheme to address concerns over the affordability and availability of flood insurance, but as our colleagues at the Grantham Research Institute at LSE have noted in their response to the government consultation,

“The design of the Flood Re scheme, which is expected to last until at least 2035, has not taken into account adequately, if at all, how flood risk is being affected by climate change. For this reason, it is likely to be put under increasing pressure and may prove to be unsustainable because the number of properties in future that will be at moderate and high probability of flooding has been significantly underestimated.”

Whether or not these particular floods are due to climate change, this is the sort of thing we expect to see more of in the future. When the immediate crisis is over, the government needs to think hard about its strategic response, which must include mitigation action as well as measures to develop greater resilience to weather and climate related risks.

Workshop on climate science needed to support robust adaptation decisions

By Dr Simon Buckle

I just wanted to highlight the great event we held last week with Judy Curry at Georgia Tech on how we can use climate science to help us make better decisions – in business, government, health and development.  Do have a look at the presentations from the really diverse group we managed to assemble in Atlanta, from international organisations, business, development agencies, NGOs and research.

A few points strike me as worth (re)emphasising:

  • Climate models are extremely valuable tools for assessing climate change over the rest of this century, but even the most advanced climate models are not yet able to provide detailed information with sufficient confidence on the variability and change of regional climate in the next few decades. This will take time and money (higher resolution, more computational power);
  • Trying to forecast the climate in 5, 10 or 20 years’ time is therefore right at the research frontier, but many decision makers aren’t as hung up over the uncertainties in climate projections as the scientists. They’re used to dealing with uncertainty, and some of the factors they need to take into account are far more uncertain than how the climate will change;
  • It’s the holistic view of risk that matters – in other words, how climate variability and change interact with other factors such as population, urbanisation, economic growth, degradation of ecosystems and land use change;
  • Scientists working on decision-relevant issues need to think really hard about the decision-making context. Who is making decisions? What is the motivation? What is being decided, and what are the relevant timescales? Are the research methods and outputs relevant and informative? Are there alternative approaches that might increase the robustness of decision making in the face of uncertainty? Are the limitations of the research transparent to the decision makers who might use it?
  • Many different approaches are emerging from collaboration among decision makers and scientists that can supplement the valuable insights gleaned from climate models and help inform robust decision making in the face of climate variability and change;
  • Even if some prominent UK politicians still have their heads in the sand over climate risks, major businesses, governments and development organisations are already factoring climate into their decision making.

You can read a more detailed summary of the workshop on the Grantham Institute website.

Climate change and health risks – new commission launched

By Siân Williams, Research postgraduate, Department of Physics and Grantham Institute for Climate Change

[Photo: Georgina Mace and other panellists at the UCL Institute of Global Health event on 16 January. Photo: S. Williams]

In 2009 a joint report between University College London and The Lancet stated, “Climate change is the biggest risk to global health of the 21st century”. The work highlighted extreme weather events, changing patterns of disease and food and water insecurity.

Now a second UCL-Lancet commission is underway. Last month, UCL’s Institute of Global Health hosted a launch event for the report entitled ‘Climate crisis: emergency actions to protect human health’.

The event was chaired by UCL’s Anthony Costello, head of the first Lancet commission. Panellists involved in the new commission include scientists and economists from UCL, Tsinghua University in Beijing and the Stockholm Resilience Centre.

The new commission is structured with five working groups. Its aims range from drawing out the key implications of the IPCC’s Fifth Assessment Report on health through to assessing the financial and policy mechanisms available to governments to protect their citizens against the worst impacts of climate change.

The format of the event allowed a wide range of issues to be discussed. These included the impacts of climate on mental health and the disillusion felt by many towards the COP international climate negotiation process.

Isobel Braithwaite, from the student-led Healthy Planet organisation, commented that “The first UCL-Lancet commission really served to shift the discussion away from climate change being just about ice caps and polar bears to an issue that’s ultimately about people’s health, so it’s exciting to hear that a second commission’s now underway. It sounds like this second one will go into much more depth on the actions we need to take to avert the major health crisis posed by unmitigated climate change”.

At the most recent COP negotiations in Warsaw, Healthy Planet formed part of the protest movement against Poland’s plans for future coal plants. The topic of climate and health will continue to be in the spotlight during Healthy Planet’s national conference, which will take place over the first weekend of March. Tickets for the event are available here.

The TROPICS research cruise from Tenerife to Trinidad: Tracing oceanic processes using corals and sediments

By Torben Struve, Research Postgraduate, Department of Earth Science & Engineering and Grantham Institute for Climate Change


How to start a retrospective on two amazing months at sea? Probably at the beginning! In the beginning there was…an idea! The idea was to reconstruct abrupt changes in chemistry and ocean circulation in the Equatorial Atlantic Ocean to learn about global climate and deep-water habitats. The plan was to do so by collecting sediments, seawater and deep sea corals and analysing all of these for their geochemical composition.

Developing this idea into our actual scientific cruise, JC094, took several years of planning and preparation, led by principal investigator and chief scientist Dr. Laura Robinson (University of Bristol) and funded by the European Research Council. The closer the day of embarkation came, the busier the participants became: everyone had to pass medical examinations and safety training courses, and all the scientific equipment had to be sorted before leaving port, as it is too late to receive mail deliveries once at sea!

Left: The RRS James Cook at the dock in Tenerife (Photo by: Torben Struve). Right: Science party of expedition JC094. Standing row (left to right): Martin Bridger, James Cooper, Paul Morris, Lucy Woodall, Mélanie Douarin, Stephanie Bates, Michelle Taylor, Allison Jacobel, Veerle Huvenne, Leigh Marsh, Vanessa Fairbank, Kais Mohamed Falcon, Shannon Hoy, Maricel Williams, Peter Spooner, Laura Robinson, Marcus Badger. Sitting row: Jesse van der Grient, Kate Hendry, Torben Struve, Hong Chin Ng. (Photo by: Sam Crimmin)

We were lucky that our vessel, the 89.5 m long RRS James Cook, was docked in Southampton before our cruise, giving us the opportunity to spend a few days at the National Oceanography Centre (NOC) in Southampton preparing the science facilities on board, so that the labs and our equipment would be ready to go once we were at sea. Our floating laboratory sailed ahead of us, and we met her again for embarkation in Tenerife on 13th October. That afternoon we left the port of Tenerife. Although this was our last land experience for seven weeks, every participant of this multi-national expedition (British, US, French, Dutch, Belgian, Malaysian, Spanish and German) was excited about finally launching JC094.

Our aim was to collect a wide range of sample material in order to unravel modern and past secrets of the deep equatorial Atlantic Ocean.

The Atlantic Ocean is separated into two basins by the Mid-Atlantic Ridge (MAR, part of a global submarine mountain range), allowing only restricted deep-water exchange between these basins via the Vema Fracture Zone. The measurement of modern seawater properties is crucial for achieving our scientific goals. The distribution patterns of deep-sea species in the modern ocean are poorly understood and are, besides seafloor topography, most likely linked to seawater chemistry. Reconstructions of past ocean properties (paleoceanography) are based on proxies extracted from marine archives, i.e. past seawater properties are reconstructed with chemical tracers extracted, for instance, from marine carbonates such as foraminifera (single-celled organisms) shells or deep-sea corals. Such proxy work relies on modern calibrations of the chemical tracer, extracted from live specimens, against seawater.

For this purpose we aimed to collect seawater, sediment and a wide range of biological samples, including the much sought-after deep-sea corals. Our five sampling locations spanned the equatorial Atlantic from east to west: Carter and Knipovich seamounts in the eastern basin, the Vema Fracture Zone at the Mid-Atlantic Ridge, and the Vayda and Gramberg seamounts in the western basin.

During the expedition the science party was divided into two 12-hour shifts (4 am to 4 pm and 4 pm to 4 am), covering the full 24-hour day. Each scientist was trained in various methods and techniques in order to help with all the different types of sampling applied during JC094: seawater sampling with a CTD rosette, hydroacoustic surveying, long and short coring, as well as collecting and processing coral samples gathered with the remotely operated vehicle (ROV) ISIS.

Cruise track of JC094 from Tenerife to Trinidad. EBA: Carter seamount; EBB: Knipovich seamount; VEM: Vema Fracture Zone; VAY: Vayda seamount; GRM: Gramberg seamount. (Map created by: Shannon Hoy)

Seawater sampling with a CTD rosette:

Seawater is usually sampled with a CTD rosette (conductivity-temperature-depth), which measures various seawater properties online and collects seawater samples at particular depths with the 24 Niskin bottles attached to the frame. At every sample location we started our scientific programme with a CTD profile. A CTD profile across the entire water column (~4500 m water depth) took about 4 hours, making sample collection a time-consuming business. Once the rosette was back on deck, the actual work started: sampling the Niskin bottles for dissolved oxygen, carbonate chemistry, radiocarbon, nutrients and trace elements, following a strict scheme. This could usually be done within one 12-hour shift, and the day shift (4 am to 4 pm) had the privilege of processing all the CTD rosettes during JC094.

Meanwhile, the ship moved on for hydroacoustic surveying of the sampling location. Such hydroacoustic surveys are crucial to determine good locations for sediment coring and ROV dives since most deep-sea floor in the area has never been mapped.

Photo 3 (seawater): Procedure of seawater sampling with a CTD rosette. (1) Sensors reporting back to main lab computer, (2) recovery of the CTD rosette, (3,4) seawater sampling from Niskin bottles and (5) sealed and labeled seawater samples for oxygen isotope analyses (Photos by: (1,2) Torben Struve, (3) Mélanie Douarin, (4,5) Vanessa Fairbank)

Deep-sea sediment sampling:

The sediment coring efforts focused on the recovery of surface material, combined with long cores reaching back to at least the Last Glacial Maximum, i.e. 20,000 years ago. The rate of sediment deposition in the deep sea is on the order of 1-3 cm per 1,000 years, and the sediment may be dominated by foraminiferal shells. During JC094 we used two different coring techniques: long coring and short coring.

Long coring allows deep penetration of a metal barrel (we used 12 m long barrels) into the sediment, providing long sediment records. Once on deck, long cores are cut into 1.5 m segments, split into two halves and sub-sampled for chemical and physical analyses. As a result of the long coring technique, the top part of the sediment column (the sediment-seawater interface) is disturbed or lost.
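
A quick bit of arithmetic (my illustration, using the deposition rates quoted above) shows why even a fraction of a 12 m core comfortably reaches back past the Last Glacial Maximum:

```python
# At deep-sea deposition rates of 1-3 cm per 1,000 years, how deep is
# the Last Glacial Maximum (~20,000 years ago), and how much time could
# a full 12 m core barrel span?
core_length_cm = 12 * 100   # 12 m barrel
lgm_age_kyr = 20            # age of the LGM, in thousands of years

for rate in (1.0, 3.0):     # deposition rate in cm per 1,000 years
    lgm_depth_cm = rate * lgm_age_kyr
    core_span_kyr = core_length_cm / rate
    print(f"At {rate:.0f} cm/kyr: the LGM is only ~{lgm_depth_cm:.0f} cm down; "
          f"a 12 m core spans ~{core_span_kyr:.0f} thousand years")
```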

Photo 4 (Long coring): Long coring work flow. (1) Coring device is back at the surface, (2) metal barrel needs to be aligned along starboard before it can be craned back on deck, (3) pulling the core liner (yellow tube) holding the sediments out of the metal barrel and cutting the liner into sections, (4) splitting the core liner sections into two halves: work and archive, (5) archive half of a ~five metre long sediment core and (6) D-tube which is used for long-term storage of sediment core sections. (Photos by: (1,2,4) Torben Struve, (3,5) Mélanie Douarin, (6) Stephanie Bates)

Megacoring allows the collection of undisturbed short cores, so the two coring techniques complement one another. Most of the short cores were sliced, bagged and stored right away, whereas some were investigated with respect to anthropogenic impact, e.g. microplastics.

Photo 5 (Megacoring): Processing samples from a megacorer. (1) Recovering the megacorer, (2) a single megacore tube on the sediment extraction table, (3) slicing sediments of a megacore tube and (4) sliced and bagged sediment samples. (Photos by: (1) Hong Chin Ng, (2, 4) Mélanie Douarin, (3) Jesse van der Grient)

ROV (Remotely Operated Vehicle) dives:

The main focus of this expedition was diving with the ROV ISIS, which is basically a robot the size of a small car, connected to the ship by a cable. An onboard CTD reported seawater properties, various cameras allowed online seafloor observation, two robotic arms used various tools for selective sample collection, and a hydroacoustic system allowed ultra-high resolution seafloor mapping. At any time during a dive, at least three scientists and two pilots (rotating with replacement teams) were in the control unit making sure that we got the most out of every single dive. Such dives could be quite long, and during JC094 we also set a new record for the longest ISIS dive: 43 hours and 43 minutes!

Photo 6: Operating the ROV ISIS from RRS James Cook. (1) Deploying the ROV, (2) insight into the control unit on deck housing screens for the various cameras and instruments onboard ISIS, (3) sample collection at the seafloor with one of the two mechanical arms, (4) recovery of ISIS with the port side A-frame crane and (5) ISIS is back on deck with some unexpected bycatch: fishing lines. (Photos by: (1) Mélanie Douarin, (2) Torben Struve, (3) ISIS, (4,5) Vanessa Fairbank)

Our ROV dive efforts focused on the collection of live and fossil (i.e. dead) sample material, in particular deep-sea corals. For investigations of past ocean properties, deep-sea corals have the advantage of growing in places where sediment deposition is lacking or discontinuous, for instance on the steep slopes of seamounts and in high-current environments. Our cruise track across the Atlantic was designed to target seamounts rising more than 4,000 m above the seafloor, allowing us to collect samples over a wide depth range. Live coral specimens are used for calibration and method development, so that these methods may eventually be applied to fossil deep-sea corals to reveal secrets about past ocean properties.

Besides deep-sea corals, live specimens of various other deep-sea species were collected for DNA analyses, which allow conclusions to be drawn about the distribution patterns of deep-sea species.

Furthermore, we ran ultra-high-resolution seafloor and habitat mapping campaigns with the ROV to investigate potential links between bathymetry and deep-sea species’ habitats. Such data may be combined with the seawater data to unravel major biogeographical relationships between deep-sea biology, hydrography and bathymetry.

Photo 7: Impressions of deep-sea ROV dives during JC094. (Photos by: (1) Jesse van der Grient, others by ISIS)

Sample recovery from the ROV on deck had to be done quickly: all biological samples, including live and fossil deep-sea corals, were transferred into the cold-room lab for identification, separation and documentation. Sediment and seawater samples collected with the ROV were processed separately from the biological samples. The fossil corals were then moved to the deck lab for drying, sorting and identification – one of the most puzzling tasks on board!

Photo 8: Processing samples collected with the ROV. (1) Sample recovery from the various trays and boxes on ISIS, (2) samples placed in buckets, sorted by water depth and location, (3) sorting, documenting and archiving of biological samples in the cold room, (4) fossil deep-sea coral samples are transferred from the cold room into the deck lab for drying and identification, (5) dried and sorted fossil coral samples waiting for (6) photographing and bagging. (Photos by: (1) Vanessa Fairbank, (2,6) Mélanie Douarin, (3,4) Torben Struve, (5) Hong Chin Ng)

And so we moved across the Atlantic Ocean, collecting thousands of samples over the seven weeks, with everybody involved in processing all types of samples – which kept the work from becoming monotonous. Eventually the time came to say goodbye, and after seven amazing weeks on board RRS James Cook, expedition JC094 ended in Port of Spain, Trinidad. Everybody carried home memories of a great experience and of scientific success at sea. Now we’re looking forward to receiving the samples for detailed chemical analyses.

Find out more on the Tropics project website.

Beyond adaptation: loss and damage negotiation at the United Nations

By Gabriele Messori, Research postgraduate in the Department of Physics

The 19th Conference of the Parties of the United Nations Framework Convention on Climate Change (UNFCCC) took place last month in Warsaw, Poland. These conferences are at the core of the international negotiations on climate change, and set the scene for future climate policies around the world. By most accounts, the Warsaw meeting had mixed results – it marked progress in some areas and stagnation in others. One of the most contentious negotiation streams, and one where some measure of progress was made, was loss and damage.

The current approach to climate change rests on two pillars: mitigation and adaptation. Mitigation is concerned with minimizing climate change itself. Adaptation responds to the failure of mitigation to prevent climate change, and aims to adjust to a climate different from the one we have been familiar with in the past. However, even adaptation has its limits. For example, in the future it might become unfeasible for low-lying island states to adapt to rising sea levels. Loss and damage is meant to tackle cases where both mitigation and adaptation have failed. The concept stems from the realization that a changing climate will imply rising human and economic losses for our planet.

From a scientific standpoint, the problem of loss and damage is very complex. Any agreement on damage associated with climate change will need clear guidelines defining what can be ascribed to climate change and what cannot. The issue becomes particularly contentious for extreme events: it is very hard, if not impossible, to attribute a single weather event to changes in the state of the climate.

From a political point of view, the contentiousness of loss and damage arises mainly from two distinct considerations. The first is that the main beneficiaries of a loss and damage agreement would be low-income, low-resilience countries; a number of developed nations therefore view an agreement as a very costly form of climate finance. The second is that loss and damage is intimately tied to the idea of historic responsibility. The developed countries have been emitting greenhouse gases for much longer than the least developed and developing ones, and are therefore responsible for a large part of cumulative emissions to date. Because of this, a loss and damage agreement is legally fraught, with developed countries fearing that it might open the question of liability for climate change.

In Warsaw, the negotiations on loss and damage were aimed at establishing a global framework to tackle the issue. Some of the developed countries, notably Australia, initially refused to commit any finance specifically to loss and damage. At the other end of the scale, a large coalition of small island states, least developed and developing countries demanded a comprehensive agreement, explicitly addressing “permanent losses and irreversible damage, including non-economic losses”. As expected, the final result was a compromise between these two extremes.

The “Warsaw international mechanism for loss and damage associated with climate change impacts” requires (developed) countries to provide financial, technological and capacity-building support to address the adverse effects of climate change. Next year the executive committee of the mechanism will develop a work plan, and a review will take place in 2016. After 2016, an “appropriate decision on the outcome of this review” will be made. Crucially, the agreement explicitly mentions that “loss and damage associated with the adverse effects of climate change includes, and in some cases involves more than, that which can be reduced by adaptation”. At the same time, however, the mechanism is placed under the adaptation pillar of the negotiations. This was bitterly opposed by many countries, which asked for loss and damage to be entirely distinct from adaptation.

The agreement was a hard-fought compromise between very distant negotiating positions, and will hopefully provide the foundations for an effective global loss and damage framework. The 2016 review, and the subsequent “appropriate decision”, remain important open questions for the future of the mechanism. A lot will depend on how the different parties approach the review process. In addition, the scientific question of how to ascribe specific damage to climate change has largely been overlooked. Ultimately, only time will tell how effective the compromise reached in Warsaw will be.

Seasonal chill

By Professor Sir Brian Hoskins

The US has been suffering from icy weather and snow storms in recent days. This image from NOAA shows the surface air temperature anomaly for the week 2–8 December – that is, the difference from the mean temperature for this time of the year.
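For readers unfamiliar with the term “anomaly”, here is a minimal sketch of the calculation in Python. The regions and temperatures below are invented purely for illustration; NOAA’s actual data and processing are far more involved.

```python
# A temperature anomaly is the observed temperature minus the long-term
# mean (the "climatology") for the same place and time of year.
# All numbers below are invented for illustration.

climatology = {"central US": -2.0, "western Siberia": -18.0}  # long-term 2-8 Dec mean, deg C
observed = {"central US": -12.0, "western Siberia": -8.0}     # this year's 2-8 Dec mean, deg C

for region in climatology:
    anomaly = observed[region] - climatology[region]
    sign = "colder" if anomaly < 0 else "warmer"
    print(f"{region}: {anomaly:+.1f} degC ({sign} than usual)")

# central US: -10.0 degC (colder than usual)
# western Siberia: +10.0 degC (warmer than usual)
```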

It was very cold over North America (where we get lots of news from!) but very warm in Eurasia and parts of the Arctic (where we don’t!). This is the sort of thing the atmosphere can do on short timescales by settling into a particular pattern of weather.

[Image: NOAA surface air temperature anomaly, 2–8 December]