Author: Emma Critchley

Responding to environmental change

This blog post is part of a series on the Responding to Environmental Change event, organised by the Natural Environment Research Council (NERC) funded Doctoral Training Partnerships at Imperial (SSCP) and at the University of Reading and the University of Surrey (SCENARIO).

A recent event in London brought together emerging environmental scientists (PhD students and early career researchers) with leaders from business, policy and academia to explore the challenges posed by environmental change and opportunities to work in collaboration to respond to these.

Communities today find themselves and the environments they live in under increasing pressure. This is driven by growing populations, urban expansion and improving living standards that place increasing stress on natural resources. Added to this is the rising threat from environmental hazards and environmental change.

Research, development and innovation within the environmental sciences and beyond offers the opportunity to manage these pressures and risks, exploring how we can live sustainably with environmental change, whatever its drivers.

Discussion at the event covered three key societal challenges and their implications for business and policy. Summaries of the talks, captured by students who attended the event, can be found below.

Benefiting from natural resources

Natural resources are fundamental for wellbeing, economic growth and sustaining life. Greater demand for food, water and energy requires better management and use to reduce stress on natural systems and ensure a sustainable future.

Read more in a report by Jonathan Bosch, a first year SSCP PhD student researching transitions to low-carbon energy systems.

Resilience to environmental hazards

Environmental hazards are becoming more frequent and severe, with potentially serious impacts on people, supply chains and infrastructure globally. Advancing our knowledge and understanding of these hazards, and the processes involved, will allow us to better predict, plan for and manage the risks in order to increase resilience to these changes.

Read more in the report by Malcolm Graham, a first year SSCP PhD student researching saline intrusion in coastal aquifers.

Managing environmental change

In addition to natural variability, human activities are causing rapid, large-scale climate and environmental change. Understanding how these processes work as a whole Earth system can improve our understanding of the impacts of these changes and inform responsible management of the environment.

Read more in a report by Rebecca Emerton, a first year SCENARIO PhD student researching approaches to global forecasting of flood risk.

The Road to Paris 2015 – COP 21

Matthew Bell, Chief Executive at the Committee on Climate Change, concluded the event with a talk on the road to Paris and the issues that could be faced in the climate negotiations.

Read more in a report by Samantha Buzzard, a third year NERC PhD student at Reading investigating the role of surface melt in the retreat and disintegration of Antarctic ice shelves.


Watch videos of all the talks on our YouTube channel.

Find out more about the Science and Solutions for a Changing Planet DTP at Imperial College London.

Find out more about the SCENARIO DTP at the University of Reading and University of Surrey.

Benefiting from natural resources

This blog post by Jonathan Bosch, an SSCP DTP student, is part of a series on Responding to Environmental Change, an event organised by the Natural Environment Research Council (NERC) funded Doctoral Training Partnerships at Imperial (SSCP), and the University of Reading and the University of Surrey (SCENARIO).

See the full list of blogs in this series here.

Natural resources are fundamental to human well-being, economic growth, and other areas of human development. Greater demand for food, water and energy resources against the current backdrop of climate change and population growth requires better management and more efficient use of natural resources to reduce the resulting stress on the earth’s natural systems.

In this “benefiting from natural resources” section of the programme there were three talks from representatives of three distinct sectors, presenting how the respective areas of industry, regulatory bodies, and academia are currently dealing with the management of natural resources.

Sustainable business

Andy Wales, Director of Sustainable Development at SABMiller plc, a multinational beer and soft drinks producer, made an arresting case for why sustainability is not only important to the company’s business model but vital to its continued success.

SABMiller presents itself as a local beer brand, although it operates in 40 countries. As such, the business is exposed to the perturbations and vulnerabilities of local water supplies above all, but also of grain and packaging supply chains. And with 80% of its income coming from developing markets, it cannot secure its future profitability without smart resource management. Procuring primary products from local markets is important to achieving this, so water management is critical.

The ‘Prosper’ programme sits on five sustainable development pillars and has as its catchphrase: “When business does well, so do local communities, economies and the environment around us. When they prosper, we do.” The five pillars are associated with a “thriving, sociable, resilient, clean and productive world”. Encompassed in these areas is an acknowledgement that water supplies are not all that matters: ‘clean’ – reducing the company’s carbon footprint – and ‘productive’ – food and land security – are also central to ensuring a profitable future.

A number of case studies went some way towards demonstrating the achievements of Prosper to date. Already, $40m in efficiency savings has been achieved by programmes implemented in Colombia and India, using a systems approach that helped farmers choose better crop types – reducing water consumption by 30% and raising crop yields by 20%.

In Bogotá, Colombia, Prosper highlighted issues of poor land management that caused regular and intolerable spikes in water prices. Water run-off was high, and the productive yield of food crops and milk production was low. The problem was tackled by simply changing the breed of local milk cows to one better suited to the local ecological conditions. The result was an increase in the region’s milk yield and a sharp reduction in water run-off, securing milk and water availability for all users.

Prosper continues to forge collaborations worldwide at the nexus of water, food and energy security. A partnership with WWF will continue development in that direction.

Environmental regulation

Miranda Kavanagh, Executive Director of the Evidence Directorate at the Environment Agency, focused her talk on ‘fracking’, or hydraulic fracturing, one of the unconventional techniques of oil and gas extraction currently attracting worldwide media attention for its as yet undetermined environmental risks.

The Environment Agency’s role is to deliver on a policy framework set by the relevant government departments, principally Defra. It has three specific roles in achieving this objective: regulating industries and activities that can potentially harm the environment; advising government, industry and the public about more sustainable approaches to the environment; and carrying out operational work to protect and improve the environment.

The Environment Agency (EA) is guided by its Evidence Directorate, which aims to use evidence to “guide and inspire” its actions and those of the bodies it advises. It states that the agency must use the best available evidence, use environmental data to support the decisions of others, and develop a joined-up approach to evidence, among other equally impressive visions.

On fracking, the EA must pragmatically balance the needs and interests of different groups concerned with the environment, resource exploitation and people, as Kavanagh clarified in the Q&A session. As well as the purely environmental impacts, the EA must consider the effects on people and communities, but also the need for fuel exploitation and energy security – areas of interest to both government and the energy sector.

These needs were highlighted in a report by the UK Institute of Directors on the potential contribution of shale gas, which estimated the social benefits of one scenario to include a likely decrease in the use of imported gas, 70,000 energy jobs and a net benefit to the Treasury.

These benefits are offset by the environmental risks, which are complex and in some cases undetermined. The known risks involve a range of air, land and water pollution, the release of chemical and radioactive substances, and a range of spatially and temporally dependent risks that will affect exploited regions differently and on differing timescales. For example, groundwater contamination may take decades to become detectable.

The EA works in many areas to produce evidence for the advice on, and regulation of, potential future fracking operations. It was, for example, instrumental in producing a UK geological map of the subsurface extent of shales and their vertical separation from aquifers – an important preliminary risk assessment providing a broad geological understanding of the distribution of our groundwater resource. This type of evidence gathering must be done for the full range of environmental concerns listed above.

Also highlighted were collaborations and opportunities that the EA is eager to develop. NERC Fellowships and various PhD studentships are ongoing, spanning projects from evaluating methodologies for assessing environmental risks to the invention and patenting of new instruments for air-quality monitoring and other applications.

The EA welcomes partnerships, particularly those aligned with its Collaborative Research Priorities. Its expertise and extensive datasets are important resources that other organisations with similar resources and objectives can draw on to make progress on key questions in applied environmental science.

Ecosystem services

Elizabeth Robinson, Professor of Environmental Economics at the University of Reading, presented two projects demonstrating how scientists and social scientists can work – and have been working – together to improve our ability to benefit from natural resources.

In the past there was little need to actively manage our natural resource base, but the pressures of climate variability and population growth have made optimising the use of these resources much more important. Understanding the relationship between ecosystem services – measurable by ecological scientists – and agricultural intensity – shaped by management and social structures – therefore demands collaboration between the disciplines.

But what is the relationship between these inextricably connected issues? As a trained economist, Robinson was concerned with ‘drawing a curve’ between these two dimensions – one that would describe how and why a change in the intensity of agriculture affects the ecosystem services critical to the sustainability and well-being of communities.

In Ghana, cocoa production was investigated to understand why some farmers choose to intensify their agriculture and others do not, and why some forms of intensification damage the ecosystem more than others. Ecologists were employed to determine the relationship between intensity and ecosystem services, while social scientists interviewed farming communities to discover how the forest land was managed and what factors limited farmers’ ability to manage the land in ways that benefit both yields and ecosystems.

It was found that complicated social factors often affect how farms and forest land are managed, including whether farmers can use or afford fertiliser, whether they shift cultivation to newly converted forest when soils are exhausted, and even whether they benefit from pollination by nearby forests. Many local perceptions of resource space and property rights restricted farmers’ ability to optimise their practices even when they wished to do so. Among many other constraints, poverty, labour availability and wages, and institutional contexts all affect the outcomes when farmers attempt to intensify.

Ultimately, a simple behavioural model can attempt to capture these ecological boundaries and social constraints, and can be used to propose routes toward an optimal balance between ecosystem services and farmers’ preferences and resources.
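
To make that idea concrete, here is a minimal sketch of a bio-economic trade-off of the kind described above: pick the agricultural intensity that maximises farm profit while keeping a stylised ecosystem-service index above a floor. The functional forms and numbers are illustrative assumptions, not the model used in the Ghana study.

```python
import numpy as np

def profit(x):
    """Farm profit as a function of agricultural intensity x in [0, 1]."""
    return 100.0 * np.sqrt(x)          # diminishing returns to intensification

def ecosystem_service(x):
    """Stylised ecosystem-service index, declining as intensity rises."""
    return 1.0 - 0.8 * x ** 2

ES_FLOOR = 0.5                          # minimum acceptable service level (assumed)

# Grid-search the feasible intensities and pick the most profitable one.
x = np.linspace(0.0, 1.0, 1001)
feasible = ecosystem_service(x) >= ES_FLOOR
best = x[feasible][np.argmax(profit(x[feasible]))]

print(f"optimal intensity: {best:.2f}, profit: {profit(best):.1f}, "
      f"services: {ecosystem_service(best):.2f}")
```

A fuller bio-economic model would add the social constraints discussed above – credit, labour availability and property rights – as further restrictions on the feasible set.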

The second case study concerned managing fisheries in Tanzania, where such efforts typically begin only when falling stocks become an issue. This project also highlighted the need to examine the socio-economic situation and implement credible solutions, which may indeed lead to a slower recovery of the ecology, but which resolve societal tensions and allow the fishing communities a reliable income without a total fishing ban. A bio-economic model was indispensable in this project too.

Watch a video of the talk on our YouTube channel.

Resilience to environmental hazards

This blog post by Malcolm Graham, an SSCP DTP student, is part of a series on Responding to Environmental Change, an event organised by the Natural Environment Research Council (NERC) funded Doctoral Training Partnerships at Imperial (SSCP), and the University of Reading and the University of Surrey (SCENARIO).

See the full list of blogs in this series here.

Environmental hazards are becoming more frequent and severe, with potentially serious impacts on people, supply chains and infrastructure globally. Advancing our knowledge and understanding of these hazards, and the processes involved, will allow us to better predict, plan for and manage the risks in order to increase resilience to these changes.

This session offered perspectives from academia (Imperial College London), the world of (re)insurance (Willis Re) and the charity sector (Oxfam).

Evaluating risks

David Simmons, Head of Strategic Capital and Result Management at Willis Re, began proceedings and impressed us by speaking with no slides or notes, describing it as a ‘liberating’ experience. Despite (or perhaps helped by) the absence of visual aids, his delivery was engaging and humorous.

His talk focussed on the world of reinsurance, which he assured us was the ‘sexy’ part of the insurance sector, specialising as it does in catastrophe risk. He contrasted this with the banal nature of regular insurance work and the social death that ensues for most practitioners.

We were told that reinsurance, which covers the insurance companies themselves against major disasters, is suffering from too much capital. Stoically, David explained the reasons behind this: essentially, due to financial uncertainty in other sectors, no one else could offer the low risk and high returns on investment now commonplace in the reinsurance industry. This he attributed to a much greater understanding of catastrophe risk over the last few years than had previously existed.

Following on from Don Friedman’s modelling of hurricanes in the 1980s, which provided a basis for hazard and probability analysis, David explained how there has since been massive investment in producing ever more reliable models to understand these elements. Indeed, the process of developing models in itself seems to have driven the understanding of various components and allowed constraints to be placed on the ‘unknown unknowns’, a Rumsfeldism which seems to make its way into most talks on modelling these days.

The price of reinsurance has apparently dropped substantially in recent times, driven by the unprecedented levels of investment. In particular, we were told that reinsurance for many parts of the developing world comes at negligible cost, due in part to a reduction in the number of deaths from droughts as a result of more reliable aid. Although this is clearly a positive development, David was keen to point out that the arrival of aid was often too slow to prevent significant human suffering and damage to assets and infrastructure. The focus has therefore turned to more timely interventions and having better systems in place for disaster response.

We learnt that insurers are now playing an important role in driving best practice from governments, with many African countries having to present draft disaster response plans, audited reports on actual responses implemented by the government and the results of anti-corruption tests before they can join insurance programs.

David’s talk closed with commentary on the growth of various large-scale insurance schemes, many of them covering multiple countries. He cited the example of the African Risk Capacity, which is expanding from 5 to 10 members, and a scheme in the Caribbean which is now expanding into Latin America. He did highlight some pitfalls with the more inclusive approach to insurance, contrasting the approach to flood insurance in the UK, where higher risk properties pay an additional premium, with the French system where all households pay the same, thereby removing some of the incentive for individuals to reduce their risk.

Improving resilience

Our second talk of the session came from Martin Rokitzki, former resilience advisor for climate change adaptation at Oxfam. Humbly professing to be ‘the least scientific person in the room’, he could nevertheless point to 15 years of practical experience working on climate change and environmental issues.

His talk began by looking at what is actually meant by the term ‘resilience’, which appears to have numerous definitions relating to one’s ability to cope, adapt, prepare or thrive when faced with shocks, stresses or uncertainties.

When presented with such an uncertain framework, we were unsurprised to learn that there is no ‘cookie-cutter or cook-book’ for resilience and that the term may be applied to a huge range of social and economic groups. By talking about his experiences with Oxfam, Martin was at least able to narrow his focus to addressing the resilience of the world’s poor.

Even within this constraint, understanding hazards and impacts was presented as a multi-faceted exercise. Variations in the spatial extent of damage, its intensity, duration, rate of onset and level of predictability could all have profound effects on the planning process. Counterintuitively, Martin felt that slow-onset hazards were often the hardest to address and his talk focussed on how to deal with challenges of that nature, such as the East African food crisis, glacier melt in Nepal and salt intrusion in Tuvalu.

We were told that Oxfam’s approach to resilience involves 5 key areas: livelihood viability (i.e. the economic buffer to disaster); innovation potential; contingency resources and support access (i.e. provision of aid); integrity of the natural and built environment (in the case of the extreme poor, they are directly dependent on the surrounding natural environment); and social and institutional capacity (i.e. governance).

In contrast to the preceding speaker, Martin’s presentation abounded with eye-catching schematics, highlighting various approaches to disaster management. Key to these were the integration of policy and projects to get a successful outcome. To illustrate this, he presented us with the ‘Cycle of Drought Management’ which moves through stages of preparedness, disaster response and relief, reconstruction and mitigation. Alas, the paucity of data in 80-90% of affected areas means that the preparedness stage is often a huge challenge. Our presenter highlighted this as a key reason for Oxfam to collaborate more closely with scientists.

Towards the end of his talk, Martin touched on Oxfam’s R4 approach to risk, encompassing risk reduction (managing resources well), risk transfer (insurance), risk taking (credit for investment) and risk reserves (savings). Without this sort of strategy, seasonal food shortages could easily become year-round famines. As part of this, Oxfam has been helping to administer financial services in remote rural areas and developing a focus on flexible and forward-looking decision making.

Martin’s final message was that we need more collaboration between the ‘thinkers’ and the ‘doers’ – a clear call for the science community to engage more directly and more frequently with aid agencies and other environmental organisations.

Assessing impacts

Our final speaker of the session was Imperial’s very own Professor Ralf Toumi, who described his ongoing work on the OASIS project, an open-access framework for assessing the impacts of extreme weather events on the built environment.

His main driver for the project was the limited number of companies providing assessments of risk in this area, thereby giving a fairly narrow field of views on risk to the insurance sector. He reflected that this has not been helped by a continuing barrier of information between researchers and insurers and the ‘black box’ approach to disaster modelling which exists within the commercial world.

Following the previous speaker’s flurry of eye-catching diagrams, Ralf was not shy to present a few schematics of his own, illustrating the concepts behind OASIS. These highlighted the user’s ability to select combinations of models to give a tailor-made view of risk, including a broader spread of results and a greater understanding of model bias and uncertainty. To highlight the point, Ralf asserted that vulnerability modelling (i.e. the damage caused by an event) has a much greater level of uncertainty than hazard modelling. Indeed, one of the key challenges of the OASIS project has apparently been to get hold of datasets on damage, information which some players in the industry have been reluctant to provide.
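
The core idea – pairing interchangeable hazard and vulnerability models to expose the spread of possible losses, rather than accepting a single black-box answer – can be sketched in a few lines. This is a stylised illustration, not OASIS’s actual interface; the models and numbers are entirely invented.

```python
import itertools

exposure = 1_000_000.0  # insured value at a single site, arbitrary units

# Two hypothetical hazard models, each estimating peak gust (m/s) for the same event.
hazard_models = {
    "hazard_A": lambda: 42.0,
    "hazard_B": lambda: 48.0,
}

# Two hypothetical vulnerability models mapping gust speed to a damage ratio in [0, 1].
vulnerability_models = {
    "vuln_X": lambda v: min(1.0, (v / 80.0) ** 2),
    "vuln_Y": lambda v: min(1.0, max(0.0, (v - 25.0) / 50.0)),
}

# Every hazard/vulnerability pairing yields its own loss estimate.
losses = {
    f"{h_name}+{v_name}": exposure * v(h())
    for (h_name, h), (v_name, v) in itertools.product(
        hazard_models.items(), vulnerability_models.items())
}

for combo, loss in sorted(losses.items()):
    print(f"{combo}: loss = {loss:,.0f}")
print(f"spread: {min(losses.values()):,.0f} to {max(losses.values()):,.0f}")
```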

A further challenge, we were told, is the effect of giving insurers greater experience in using this modelling framework: the desire for greater complexity. Whilst models appear ever more powerful (a 30-year dataset can apparently now be used to estimate a 1-in-1000-year event!), there is a serious challenge in translating this complexity from the academic and journal environment to insurance professionals. There has also been a need to standardise the wide array of data formats associated with OASIS’ component models.
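
For a sense of how a 30-year record can be stretched to a 1-in-1000-year estimate – and why such extrapolation needs careful translation – here is a minimal extreme value sketch on synthetic data. All numbers are invented, and in practice the confidence interval around such an estimate is very wide.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# 30 years of synthetic annual-maximum river levels (cm) -- purely illustrative.
annual_max = rng.gumbel(loc=500.0, scale=80.0, size=30)

# Fit a Generalised Extreme Value distribution by maximum likelihood.
shape, loc, scale = genextreme.fit(annual_max)

# The 1-in-1000-year level is the quantile exceeded with probability 1/1000 per year.
T = 1000
level = genextreme.ppf(1.0 - 1.0 / T, shape, loc, scale)
print(f"estimated 1-in-{T}-year level: {level:.0f} cm")
```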

Despite these challenges, it appears that OASIS is flourishing. Our presenter proudly displayed a series of media articles after their press release went viral, along with a list of 44 members of the OASIS Loss Modelling Framework, a list that includes numerous insurance and reinsurance companies. Their many associate members include a variety of government bodies, academic institutions and IT companies.

Long-term planning

A combined question and answer session followed on the three presentations. It began with the question of how all these ‘big complex’ models have been validated with data. Professor Toumi agreed that validation is a huge issue, although hazard validation is much easier to do, using historical datasets, than validating predictions of damage, which sometimes diverge wildly. David Simmons was able to point to a recent paper he had written on model validation and highlighted that the non-stationary world we live in means that there are never sufficient data. Nevertheless, he believed that even non-validated models are better than nothing and that the modelling process aids understanding as much as the end result. He also highlighted that satellite datasets can act as a useful first-pass method for validating models.

The second question focussed on how we transition from looking at short-term resilience to combatting longer-term changes. Martin Rokitzki responded that although we live in a short-term world, transformative scenario planning is more commonly done nowadays, which is often based on narratives rather than data alone. Adaptive management is also more common.

Another audience member (currently working for one of the London mayoral candidates) wondered what question we should pose to mayoral candidates of large cities in relation to risk management and resilience. The panel were somewhat stumped by this, but (ironically) opted to answer the question about a question with another question: Martin Rokitzki wondered who has responsibility for risk management – should adaptation be a government service, or should it be borne by individuals or even the private sector? David Simmons cited an example of the World Bank trying to cover industrial areas against earthquakes and reward good design through financial incentives. Unfortunately, the scheme struggled due to a lack of political will to take decisions that might be unpopular with electorates, despite having clear long-term benefits.

The final question related to the possible impacts of a catastrophic asteroid strike, and the huge disparity between the insurance fund set aside to cover Florida’s coastline against storm damage and flooding ($2 trillion) and the much smaller sum assigned globally for larger-scale catastrophes like asteroid impacts ($0.5 trillion). David Simmons responded that the insurance industry focuses on the short term, partly due to the five-year tenure of most CEOs, which puts asteroid impacts beyond its timescale of concern. Another contributor to the disparity is that flood insurance in Florida is governed by a regulator. Despite this, David felt that Florida now has enough reinsurance capacity and that there is now a need to better understand hazards like asteroids.

And as we all dwelt on what sort of cosmic destruction may be in store, the session was brought to a close, leaving us with the much simpler conundrum of what to have for our lunch.

Watch a video of the talk on our YouTube channel.

Managing environmental change

This blog post by Rebecca Emerton, a SCENARIO DTP student at the University of Reading, is part of a series on Responding to Environmental Change, an event organised by the Natural Environment Research Council (NERC) funded Doctoral Training Partnerships at Imperial (SSCP), and the University of Reading and the University of Surrey (SCENARIO).

See the full list of blogs in this series here.

In addition to natural variability, human activities are causing rapid, large-scale climate and environmental change. Understanding how these processes work as a whole Earth system can improve our understanding of the impacts of these changes and inform responsible management. One key challenge is how we monitor and record environmental data, and the role this data can play in managing the environment.

The third challenge area of the Responding to Environmental Change event explored the management of environmental change, including how environmental data is monitored and recorded, and challenges faced in utilising this data.

Monitoring the environment from space

Jacquie Conway, Head of Institutional Relations UK within Airbus Defence and Space – Geo-Intelligence, opened the afternoon with a discussion of the practical applications of Earth Observation (EO) data. A key question was posed: “Why space?” EO provides evidence used to assess how much land change is occurring, where that change is taking place, and its causes and impacts, alongside uses in model validation and in exploring possible future changes. Examples included forest mapping and monitoring, to identify degradation and illegal logging and their changes over time; food security and crop sustainability, through analysis of drought areas and possibilities for improved farming management practices; and urban planning, through monitoring land-use change and developing cities. Disaster management is also key, with EO data and mapping used in emergency response, recovery and preparation.

The challenges associated with EO and Big Data are continuously evolving, with increasing volume, diversity and value of EO data used in conjunction with non-space data. Aspects such as quality, continuity, timeliness and uniqueness of data are significant in approaching the Big Data challenge. Emerging solutions include the Airbus Processing Cloud, a platform for hosted processing with successful processing and reprocessing campaigns already completed: where the data processing for one mission previously took more than 700 days, it can now be done in just two weeks. Alongside data processing, the platform will enable the development of new products and services through a partnership approach, with the intent to support SMEs, research organisations and universities, among others.

Copernicus, the European flagship Earth Observation programme for monitoring environmental change, was introduced by Jacquie Conway and discussed further by Dr Farhana Amin (Defra). Led by the EU and coordinated by the European Space Agency (ESA), Copernicus is the European response to a global need to manage the environment, providing the data necessary for operational monitoring of the environment and for civil security. With a €3.8bn investment in Copernicus, six missions (each with two satellites) will be launched, producing up to 8TB of new, open-access data on the environment per day. These missions will provide valuable information for land, marine and atmosphere monitoring, alongside emergency management, security and climate change services.

Environmental policy and regulation

Dr Amin gave a policy perspective on managing environmental change, highlighting Defra’s responsibilities for policy and regulation on environment, food and rural affairs, including protection from floods and plant and animal diseases, alongside improving the environment and rural services. The statutory obligations of Defra range from monitoring pesticide residues on food to managing natural resources through monitoring of air quality and biodiversity. Emphasis was placed on evidence-based policy, using observations, knowledge and scientific research as the basis for all policies. Examples were given of current programmes such as the Clean Seas Environment Monitoring Programme, run by Cefas, which aims to detect long-term trends in the quality of the marine environment through the collection of high-quality, standardised data. Other examples include the monitoring of bathing water quality, and UK surveillance schemes involving partnerships between the Joint Nature Conservation Committee (JNCC), NGOs, research bodies and volunteers to monitor wintering and breeding birds, butterflies, bats, plants and other mammals.

Satellite applications also have a long history of use within Defra, for research and monitoring of land use, roads and marine environments, and GPS data for forestry monitoring, flood monitoring and field sample collections. Again, challenges with EO were discussed, such as the highly complex processes involved, the need for high quality data and regular analysis, working around multiple partners and methodologies, and the resource intensive nature of environmental monitoring.

Understanding the ‘Critical Zone’ for life

Professor Anne Verhoef (University of Reading) provided a research perspective on managing environmental change, discussing steps towards an improved understanding and sustainable management of the ‘Critical Zone’ (CZ), which extends from groundwater reservoirs through the soil to the surface and lower atmosphere – in other words, the zone in which we live. The CZ affects food, water and energy resources, and plays a major role in our weather and (micro)climate, while also allowing us to mitigate the effects of extreme events and environmental change. Advances in monitoring the CZ at many time and space scales (for improved understanding and management) include novel monitoring of field-scale soil moisture and a wireless underground sensor network. Also on the theme of Earth Observation, imaging techniques such as X-ray CT imaging and remote sensing play a role in understanding and managing the CZ.

Another key aspect is modelling of the CZ, using various models to study part or all of it, such as land surface models (within global circulation models, e.g. JULES), groundwater models, and Soil-Vegetation-Atmosphere-Transfer (SVAT) models. SVAT models can further be coupled with remote sensing (EO) data of multiple types and at a range of spatio-temporal scales, leading to more generic tools for environmental research and management. Versatile tools exist for calculating crop yield, photosynthesis and more, such as the SCOPE model, an SVAT model supporting the direct interpretation of EO data. It was concluded that making models more realistic, and combining them with EO and remote sensing products alongside novel in-situ monitoring techniques (for improved ground data), will improve our understanding of the CZ and help move towards sustainable management of environmental change.
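
As a toy illustration of that coupling idea – not JULES, SCOPE or any model from the talk – the sketch below runs a one-bucket soil-moisture model forward on synthetic forcing and nudges it toward assumed satellite retrievals; every number here is invented.

```python
import numpy as np

rng = np.random.default_rng(0)
days = 30
rain = rng.exponential(2.0, days)      # mm/day, synthetic forcing
evap = np.full(days, 3.0)              # mm/day, constant atmospheric demand
capacity = 100.0                       # bucket (root-zone) capacity, mm

sm = 50.0                              # initial soil moisture, mm
gain = 0.5                             # nudging weight toward the EO retrieval
eo_obs = {9: 62.0, 19: 40.0}           # day -> assumed satellite soil-moisture retrieval (mm)

for t in range(days):
    # Water-balance update: add rain, remove evaporation, keep within the bucket.
    sm = float(np.clip(sm + rain[t] - evap[t], 0.0, capacity))
    # Simple assimilation step: pull the model state toward the satellite value.
    if t in eo_obs:
        sm += gain * (eo_obs[t] - sm)

print(f"soil moisture after {days} days: {sm:.1f} mm")
```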

Benefits of collaboration for sustainable management

The similarities and differences between the business, policy and research perspectives, and the common challenges they face in using EO data to manage environmental change, show the benefits of collaboration and partnerships, alongside the advances and extensive work already underway towards sustainable management of the changing environment.

Watch a video of the talk on our YouTube channel.

The Road to Paris 2015 – the UK’s position

This blog post by Samantha Buzzard, a NERC student at the University of Reading, is part of a series on Responding to Environmental Change, an event organised by the Natural Environment Research Council (NERC) funded Doctoral Training Partnerships at Imperial (SSCP), and the University of Reading and the University of Surrey (SCENARIO).

See the full list of blogs in this series here.

To conclude the Responding to Environmental Change meeting, Matthew Bell, Chief Executive of the Committee on Climate Change, outlined the position of the UK in relation to climate change and the issues that could be faced at the Paris Climate Conference (COP 21) at the end of this year. At the beginning of his talk he emphasised that the credibility of the Committee on Climate Change depends on properly interpreting the science of climate change, and that the Committee should feed back into the scientific community by signalling gaps in the evidence and determining what research would be most valuable in the long term.

The UK at present

Matthew made it clear that most of the debate in the UK was not about whether climate change is happening, but about the uncertainty in the levels of change and its impacts. This was highlighted only a few days ago when David Cameron, Nick Clegg and Ed Miliband made a pre-election pledge to uphold the Climate Change Act, which holds the UK to a statutory 2050 target for emissions reductions. Indeed, when the act was first introduced in 2008 it received massive cross-party support, with only three MPs voting against it.

It is because of this act that Matthew was able to speak to us: it established the Committee on Climate Change as an independent advisor that reports to the government annually on the UK’s progress towards meeting the five-year, legally binding carbon budgets set for the country in order to meet the 2050 emissions target (the Committee also suggests the levels at which these five-year budgets should be set when they are planned). The Committee also assesses the country’s adaptation to climate change, ensuring that actions taken are in line with the level of risk expected.

The UK’s five-year carbon budgets. The UK met the first budget, but largely due to the economic slowdown. (Source: Matthew Bell, Committee on Climate Change)

Issues in Paris

There will be many areas under discussion at COP 21, ranging from pledges and their monitoring once made, to support from high-income to low-income countries (both financial and otherwise), to the actions required from ‘international’ sectors such as aviation and shipping.

However, the focus here was on the wider co-benefits of tackling climate change. Matthew stressed that when looking at these issues the Committee has to take into account a range of factors. Although scientific knowledge is key, areas such as technology, the impact of actions on the competitiveness of UK industry, social circumstances (particularly fuel poverty) and fiscal circumstances all have to be considered. There is a trade-off to be made between the cost of mitigation and how much risk to ecosystems and certain parts of the planet we are willing to accept. Furthermore, there are both benefits and costs of tackling climate change, some of which are outlined below:

Benefits:

  • Improved air quality
  • More active lifestyles
  • Fewer (net) road traffic accidents
  • Time savings from reduced congestion
  • Less water abstraction
  • Improved health from better diet

Costs:

  • Landscape impact of renewables
  • Hazardous waste (and risk of major incidents) from nuclear
  • Road accidents from walking and cycling
  • Air quality impacts of biomass for heat
  • Air quality and upstream fuel impacts of coal carbon capture and storage

Some work has been done to calculate the net impact of tackling climate change, but the error bars are large and more work is needed. The current recommendation from the Committee on Climate Change would cost less than 1% of the UK’s GDP.

The UK leading into Paris

The UK is currently in a good position leading up to COP 21, having met the first of our five-year carbon budgets – although it must be stressed that this was largely due to the financial crisis and economic slowdown rather than specific policies. There is still a lot to do to meet the second and third budgets, and the fourth will be a very big step down.

A key stage in reaching these targets will be a largely decarbonised power sector by 2030. Matthew suggested that a focus for future research could be the wider use of low-carbon heat, for example in 15% of homes by 2030. To ensure the success of policies relating to these changes, more research also needs to be done into behaviours: what prevents people from taking up green actions, and what determines their reactions to environmental policies?

It was emphasised that we also have a poor evidence base and a lack of data for working with the industry and agriculture sectors, so these areas need greater attention in future. Furthermore, despite success in reducing vehicle emissions by more than expected (due to EU regulation), it will now be even more challenging to reduce them further.

The Committee is due to release a progress report on both adaptation and mitigation in June, outlining the key risks to achieving the 2050 carbon target. It will also advise on the level of the fifth carbon budget (covering 2028–32, as budgets are set 12 years in advance) in December, while COP 21 is taking place.

Help in different areas will be important to the Committee this year and well beyond. From scientists, better near-term climate models, better monitoring and understanding of the full life cycle of greenhouse gas emissions and their wider environmental impacts, and closer links between the science of diversity, ecology and evolution and policy debates about climate will all be helpful for the Committee’s work. This will need to be combined with a better understanding of people’s behaviours, the optimal balance between adaptation and mitigation, and the best timing and level (local, regional or national) at which to apply measures.

Watch a video of the talk on our YouTube channel.

Ocean warming in the media

A recent paper on ocean warming has been reported on in a number of newspaper articles, most recently by Christopher Booker in the Sunday Telegraph.

The author of the paper, Professor Carl Wunsch of MIT, wrote a letter to the editor of the Sunday Telegraph in response to Christopher Booker’s article. As the letter has yet to be published in the Sunday Telegraph, we have decided, with Professor Wunsch’s permission, to post it here.

Dear Editor,

In the Sunday Telegraph of 27 July 2014, Christopher Booker pretends to understand a highly technical paper on ocean warming to such a degree that he can explain it to his lay-audience. Had he made the slightest effort to contact me, I could have told him that the paper in reality says that the ocean is warming overall at a rate consistent with previous values – but that parts of the deepest ocean appear to be cooling. This inference is not a contradiction to overall warming. He imputes to me a wish to hide my views: nothing could be further from the truth. I believe that global warming is an extremely serious threat, but how that threat will play out in detail is scientifically still poorly understood. Anyone who interprets the complexity of change to mean global warming is not occurring and is not worrying, is ignorant enough to regard The Great Global Warming Swindle as a documentary – it is an egregious propaganda piece.

Carl Wunsch

Harvard University and Massachusetts Institute of Technology

Grantham Institute welcomes results of Energy and Climate Change Committee review of IPCC WG1 report

The House of Commons Energy and Climate Change Committee report on the Working Group 1 contribution to the IPCC Fifth Assessment Report, which is published today, has found the IPCC process to be robust. The committee launched an inquiry into the IPCC WG1 report in October 2013, following criticism by some commentators of the IPCC review process and its conclusions.

The Grantham Institute submitted written evidence to the committee (you can read our evidence here) and our Chair Professor Sir Brian Hoskins was called before the committee to give oral evidence.

The committee found that “the IPCC has responded extremely well to constructive criticism in the last few years and has tightened its review processes to make its Fifth Assessment Report (AR5) the most exhaustive and heavily scrutinised Assessment Report to-date. The MPs call on the IPCC to continue to improve its transparency, however. The IPCC would benefit, they say, from recruiting a small team of non-climate scientists to observe the review process and the plenary meetings where the Summary for Policymakers is agreed.”


Commenting on the report Professor Joanna Haigh, Co-Director Grantham Institute said:

“Having assessed a significant quantity of submitted evidence, both written and oral, this report is overwhelmingly supportive of both the procedures and the conclusions of the IPCC. It concludes that the WG1 report is the best available summary of the state of the science of climate change, that improvements to IPCC procedures since the Fourth Assessment have ensured “the highest quality of scholarship” and that there is no scientific basis for downgrading the UK’s ambition to reduce greenhouse gas emissions.

In terms of procedures it recommends two areas of further improvement – the appointment by governments of some non-climate scientists as members of the Executive Committee, and to observe the review process, and a greater level of transparency in plenary meetings discussing the Summary for Policymakers – but these recommendations in no way reflect concern about the content of the Assessment. A whole chapter of the report is devoted to examining criticisms that have been levelled, from both inside and outside the scientific community, on the scientific conclusions but none is found to have significant bearing.

Such a robust report from an all party parliamentary committee surely means that we can now reduce efforts spent on dealing with the constituencies working to discredit the IPCC, concentrate on understanding the science behind climate and climate change and do our best to make sure that the government plays a leading role in achieving a global deal on climate change.”


Professor Sir Brian Hoskins, Chair of the Grantham Institute said:

“The committee recognises that the recent WG1 report of IPCC gives a very good summary of the science relevant to climate change, whilst there are some remaining issues on transparency.

“The question now is how do we respond to the risk posed by climate change, and I am pleased to see that the Report is clear: it supports the basis for the advice given by the Climate Change Committee and the path the UK is taking towards its 2050 carbon reduction target, in particular the 4th Carbon Budget recently confirmed by Government, and it advises that the UK Government at the top level should play a major role in international discussions leading up to Paris 2015.”


Sticking to the budget

Following on from Simon Buckle’s post this morning, another piece of good news on emissions reductions: the UK government has announced that it will not amend the fourth carbon budget, after reviewing its commitments in light of progress within the EU.

The carbon budget for 2023–27 therefore remains at 1,950 MtCO2e, keeping the UK on track to reduce greenhouse gas emissions by 80% by 2050, relative to 1990 levels.

This decision is in line with advice from the Committee on Climate Change given in December 2013, that there was no basis to change the fourth carbon budget.

You can read more about the review in our background note on the fourth carbon budget.

UNFCCC climate negotiations: reflections from the Rhine

By Dr Simon Buckle, Grantham Institute

I spent a few days at the recent Bonn climate change conference (4–15 June) during the High Level Ministerial events on 5–6 June. Not that these were the most interesting things happening there. Unsurprisingly, by and large, Ministers did not stray from well-rehearsed positions, reflecting the continued skirmishing over the interpretation of the UN Framework Convention on Climate Change (UNFCCC) term “common but differentiated responsibilities” in a world that is radically different from the one in which the Convention was conceived.

More interesting were the briefing session on the UN Secretary General’s forthcoming climate summit in New York on 23 September and a series of special events where negotiators got the chance to hear from and question IPCC authors about the implications of the IPCC AR5 reports for the UN negotiations and the review underway of the long-term target (2°C or 1.5°C?), a key issue for vulnerable countries (e.g. small island states) given the very different potential implications of sea-level rise. It’s worth looking at some of the webcasts.

A particularly revealing moment came during a special event to engage with Observers to the UNFCCC process organised by the co-chairs of the so-called “Ad Hoc Working Group on the Durban Platform for Enhanced Action” (ADP). The ADP is the subsidiary body charged with developing the Paris 2015 agreement as well as trying to identify ways to enhance mitigation action before 2020.

As part of the ADP process, early in 2015 countries should notify the UNFCCC Secretariat of their “intended nationally determined contributions” (INDCs) to emissions reductions in the period after 2020. I therefore asked the ADP co-chairs how the UN process would ensure that these bottom-up contributions – and the aggregate global emissions that they implied – would be consistent with the long-term climate targets that countries had committed to. This was indeed a critical issue, they said, but they had as yet no idea how it might be achieved.

This seems to me to be a pretty fundamental problem. An aspiration to achieve “a carbon neutral world in the second half of the century” (the current UNFCCC thinking on such an overarching aim) is in my view just not good enough to constrain climate risks and achieve a cost-effective transition to a low-carbon world. In particular, it says nothing about the emissions path over the next 30 years or so.

We know that effective international action on climate is difficult to achieve, given the concerns over competitiveness and free-riding. So instead of the top-down approach of earlier climate agreements (i.e. Kyoto, with its modest targets and timetables) or the bottom-up approach that has now emerged by default from the political trauma of the 2009 Copenhagen summit, we now need a hybrid approach. What I have in mind is that the aggregate bottom-up emissions pledges under the UN process need to be supplemented and given coherence by a political commitment among the major emitting economies – notably the US, China and the EU – to achieve a clear, measurable and negotiable near-term mitigation objective, which would be reflected within the Paris agreement. My own suggestion is that this objective should commit to achieving a global peak in fossil-related carbon dioxide emissions by 2030, or earlier if possible, with a subsequent decline.

Recent developments in the US and China suggest that such a goal is not impossible, particularly if the EU can get its act together and agree its 2030 targets at the October Council.  Moreover my own calculations suggest a global peak in fossil carbon dioxide emissions could be achieved while allowing developing country emissions (e.g. in India) to continue to grow for some time to come, as they must in any politically viable deal.

Of course achieving a global peak in carbon dioxide emissions is just a first step on the road to a longer term objective and if it is achieved too late or the peaking level is too high, we may not be able to achieve some of the more stringent climate targets.  But if we just focus on the long term target, we will end up in a zero-sum negotiation over the level and shares of a corresponding fixed carbon budget consistent with this target.  The urgent need at this point in time is to reverse the continuing growth in global CO2 emissions, a necessary first step in achieving any long-term goal.  The pace of emissions reductions after the peak and the eventual level of emissions in the second half of the century can be agreed at future summits.

No doubt there are several other ideas for making Paris a success being discussed by governments in private as I write.  One might be to build in flexibility to the Paris agreement itself and have short commitment periods, perhaps to 2025 initially but agreed on a rolling five years basis thereafter, mirroring the UK approach to setting carbon budgets.  And there needs to be far more public discussion about these alternatives.  But these technical suggestions are worthless without political leadership. The next 18 months presents an unprecedented opportunity to shift the world decisively onto a cost-effective, low climate-risk development path, with myriad benefits for our wellbeing and economic development.  The Secretary General’s September summit will be an early indication of whether our political leaders are ready to make this step or not.


What is the best way to write about climate change in fiction?

By Dr Flora Whitmarsh, Grantham Institute

Last week I attended Weather Fronts, an event organised by Tipping Point. The event brought climate scientists together with writers of fiction and poetry to discuss how authors can bring climate change into their work.

Climate change is a global problem and solving it requires collective action. When too many citizens fail to exercise their voice, it is harder for such problems to be adequately addressed at the societal level. Artists have a voice they can use to communicate about the things that concern them. Writing about global warming of course has the potential to raise awareness of its impacts and possible solutions. Novels or poems can be more engaging for some audiences than scientific documents or news reports.

After two thought-provoking keynote speeches from John Ashton and Professor Chris Rapley, and a writer’s panel with Maggie Gee, Jay Griffiths, Gregory Norminton, and Ruth Padel, much of the time was spent in small group discussions. We talked about diverse subjects including utopia and dystopia in fiction, uncertainty in climate modelling, and who should take decisions about climate change. I met novelists, graphic novelists and poets who were passionate about the environment, many of whom are already writing about the subject.

Literature would seem to be the perfect medium to bring climate change to life, but the muse is fickle, and anybody setting out to write fiction with an explicit message could struggle to create engaging art. The danger is in creating something more akin to propaganda: a story with an obvious message, and unrealistic or one-dimensional characters who function as little more than a mouthpiece for the author’s own opinions.

We often hear that the best way to write about climate change is to create a “positive vision of the future”, something that probably works well in science communication and outreach. But in art this can be less effective, because one runs the risk of the message being too obvious.

It is often said of climate change communication that we should avoid overly frightening or guilt-inducing messages, because this risks evoking fear and powerlessness. At the conference one writer suggested that dystopias are easier to write than utopias, implying it could be difficult to create art with a positive message about the future. However, a future in which climate change has been solved need not be a utopia. One solution to all of this is to write a futuristic story about something else entirely, but set it in a world powered by renewable energy, without explicitly alluding to climate science or policy.

The conflict between creating good art and giving it some sort of message cannot be easily solved, but one experienced writer summed up the answer in the words of Emily Dickinson: “Tell all the truth, but tell it slant”. In other words, in fiction it can often be more effective to hint at your message or arrive at it in a roundabout way than to spell it out explicitly.


Science and an open society

By Dr Simon Buckle, Grantham Institute

Professor Lennart Bengtsson’s resignation from the GWPF (Global Warming Policy Foundation) Academic Advisory Council has received wide coverage and raises important issues.

Whatever anyone’s views are on the role, motivation and integrity of the GWPF in this matter, it is up to individual academics whether or not to associate themselves with it in an advisory role.

It is regrettable that perceived political stances on the climate issue are apparently affecting academic activity to this degree. The Grantham Institute at Imperial has always opposed such behaviour, believing that scientific progress requires an open society. We try to engage with a wide range of figures, some with radically different views on climate change.

The outcome in this case is probably a reflection of the “us and them” mentality that has permeated the climate science debate for decades, which is in part an outcome of – and reaction to – external pressure on the climate community. But we must be clear: this is not a justification. Concerted external pressure – if that is what it was – on Professor Bengtsson to resign from his GWPF role was wrong and misjudged.

Academic work on climate science and responses to climate variability and change should be politically neutral.  Policy towards climate is inevitably value-based and hence political.  We need the insights from high quality research and analysis to ensure our policy and political choices are as well informed as can be – importantly including social, political and economic research as well as that from the physical sciences and engineering.

What we learn from this event is that maintaining a healthy separation between science and politics – on either side of the political debate – is a continual but necessary challenge.  We have to keep the scientific endeavour as free as possible from political contention over policy responses.  All serious scientific voices on climate change therefore deserve both respect and to be heard. But given the enormity of the issues, these views require rigorous scrutiny and testing.

This episode should not distract us from the fact that we are performing a very dangerous experiment with the Earth’s climate.  Even by the end of this century, on current trends we risk changes of a magnitude unprecedented in the last 10,000 years.  How we respond to that is a matter of public policy, on which scientists of course have a voice and often strong opinions – but as citizens, not as policy experts.

Collaboration with Stanford University and Biofuels Research at the Joint BioEnergy Institute

By C. Chambon, Research Postgraduate, Department of Chemistry

As part of a group of six Imperial students who visited California, I travelled to San Francisco to work on two projects: the New Climate Economy project, and a research collaboration with the Joint BioEnergy Institute.

The New Climate Economy project is a government-commissioned project examining how economic goals can be achieved in a way that also addresses climate change. The Innovation stream, led by Stanford University and the Grantham Institute at Imperial, is focused on the potential economic and environmental impact of disruptive technologies. Beginning in January, a group of six Imperial students each focused on a different technology for the project, researching and preparing case studies for our weekly teleconferences. The topics researched were as varied as solar PV, nanomaterials, customer segmentation and the smart grid. My focus was on carbon capture, utilisation and storage (CCUS) technologies, and the policies needed to support them.

[Image: The Imperial team at Stanford University]

In Palo Alto, we worked with Stanford students to construct a business model for each of our technology clusters. Our research findings were presented to NRG Energy’s newly formed Station A, a kind of skunkworks for energy resilience within NRG, a wholesale power company. The collaboration was a successful and productive one, and several of us will continue to work with the New Climate Economy project to publish our research. The work will contribute to the UNFCCC COP negotiations in Paris in 2015.

During the latter half of the trip, I combined visits to Stanford with research for my PhD at Lawrence Berkeley National Lab across the bay. The San Francisco Bay Area is renowned as a bioscience and biotech hub, home to over 200 bioscience companies, start-ups and research institutes. One of these is the Joint BioEnergy Institute (JBEI), a branch of Lawrence Berkeley National Lab in the Berkeley hills. JBEI is a U.S. Department of Energy bioenergy research center dedicated to developing second-generation biofuels: advanced liquid fuels derived from the solar energy stored in plant biomass. The cellulosic biomass of non-food plants and agricultural waste can be converted to petrol, diesel and jet fuel, whilst the non-cellulosic part is a promising candidate for replacing fossil-derived aromatic chemicals.

[Image: The Joint BioEnergy Research Center in Emeryville, California]

My project at JBEI looked at upgrading lignin, extracted from the non-cellulosic part of woody biomass, into aromatic building-blocks. This experience was a valuable addition to my PhD project, which focuses on the valorisation of lignin from pine wood to improve the economics of the biorefinery.  A highlight of my stay was a visit to the scaled-up biorefining facilities at LBNL, where a one-of-a-kind reactor is used to convert biofeedstocks into fuels. It was an inspiring glimpse into the future of biorefining, and I look forward to working closely with LBNL researchers and others in the field of bioenergy.

The challenge of seasonal weather prediction

By Hannah Nissan, Research Assistant in Regional Climate Modelling, Physics

In April 2009 the UK Met Office issued their now infamous forecast: “odds-on for a BBQ summer”. By the end of August, total precipitation since June had climbed to 42% above average levels for 1971-2000 (UKMO, 2014). Why is it so challenging to provide seasonal forecasts several months ahead?

A question which arises often in conversations about climate change is: “how can we predict the climate in 50 years when we can’t even get the weather right next week?” While we have no skill in forecasting the weather on a particular day in 2060, we can make much more confident projections about the average conditions at that time. Some explanation of this comes from the mathematician and meteorologist Ed Lorenz, who made an accidental discovery while rerunning weather calculations on his computer (Lorenz, 1963). Taking a seemingly harmless shortcut, he rounded a number to a couple of decimal places – and got a completely different weather forecast. His finding came to be known as the “butterfly effect”, which invokes a powerful analogy: tomorrow’s weather forecast depends so strongly on getting today’s weather right (a significant challenge) that neglecting the effect of a butterfly’s flapping wings is enough to derail it. Climate, in contrast, is an average of the weather over a few decades, and doesn’t suffer as strongly from this debilitating “initial conditions” problem.
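
Lorenz’s discovery is easy to reproduce. The minimal Python sketch below integrates his 1963 three-variable system twice, from two starting points that differ by one part in a million, and watches the two “forecasts” drift apart. The parameter values are Lorenz’s classic choices; the perturbation size and integration settings are illustrative, not meteorological.

```python
import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz (1963) equations."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def integrate(state, dt=0.01, steps=3000):
    """Simple fourth-order Runge-Kutta integration of the Lorenz system."""
    trajectory = [state]
    for _ in range(steps):
        k1 = lorenz(state)
        k2 = lorenz(state + 0.5 * dt * k1)
        k3 = lorenz(state + 0.5 * dt * k2)
        k4 = lorenz(state + dt * k3)
        state = state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        trajectory.append(state)
    return np.array(trajectory)

# Two nearly identical "todays": the second differs by one part in a million.
run_a = integrate(np.array([1.0, 1.0, 1.0]))
run_b = integrate(np.array([1.0 + 1e-6, 1.0, 1.0]))

# The separation grows roughly exponentially until the runs decorrelate.
separation = np.linalg.norm(run_a - run_b, axis=1)
print(separation[[0, 500, 1000, 2000, 3000]])
```

Run it and the separation climbs from 10⁻⁶ to the size of the attractor itself: the rounding error has become the forecast.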

The distinction becomes rather blurred when we turn to seasonal prediction. For longer-term forecasts, say for this coming summer or next winter, correctly characterising the initial conditions of parts of the climate system still matters. This is particularly true for slowly varying fields like soil moisture and the upper ocean, and to a lesser extent for the atmosphere. On these timescales, however, other challenges become increasingly important. As we move from forecasting days ahead to weeks, months, seasons and decades, the number and complexity of physical processes that must be well described by the computer models used to simulate the weather increases. Delivering better forecasts requires improvements on both fronts, which compete for limited computing resources.

The benefits of developing more sophisticated prediction models must be balanced against the critical need for an indication of a forecast’s uncertainty. To provide this, meteorologists create not just one weather forecast but a whole ensemble of predictions starting from slightly different initial conditions. Further ensembles are run to capture the uncertainty in the physics of the model itself. The spread of the ensemble tells us something about the range of possible outcomes and their likelihoods. We can never have perfect knowledge of the current state of the weather, nor a perfect model, so running many ensemble members to build a probabilistic weather forecast in this way is essential.
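
Continuing the toy Lorenz model above, here is a sketch of the ensemble idea: perturb the best guess of “today’s” state with small random noise, run each member forward, and read probabilities off the spread of outcomes. The ensemble size and noise scale are arbitrary illustrative choices, nothing like operational values.

```python
import numpy as np  # reuses lorenz() and integrate() from the sketch above

rng = np.random.default_rng(42)
n_members = 50
base = np.array([1.0, 1.0, 1.0])  # our best estimate of "today's weather"

# Each ensemble member starts from a slightly different guess at today's state.
finals = np.array(
    [integrate(base + rng.normal(scale=1e-4, size=3))[-1] for _ in range(n_members)]
)

# The ensemble mean is the central forecast; the spread measures uncertainty.
print("ensemble mean:  ", finals.mean(axis=0))
print("ensemble spread:", finals.std(axis=0))

# A probabilistic statement: the fraction of members with x positive.
print("P(x > 0) ~", (finals[:, 0] > 0).mean())
```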

To illustrate the trade-offs between running ensembles and improving model complexity, consider an example from the UK Met Office. The wetness and storminess of UK winters are strongly linked to the North Atlantic Oscillation (NAO), a pattern of sea-level pressure between Iceland and the Azores. By investing in research to uncover some of the important physical drivers of the NAO and including these in their model, the Met Office have recently achieved a significant improvement in NAO forecasts (Scaife, 2014a,b). At the European Centre for Medium-Range Weather Forecasts (ECMWF), improving the way convection is modelled delivered better tropical precipitation forecasts and greater skill in predicting seasonal European weather (Bechtold, 2013).

Predictability itself is not a fixed quantity but varies with each situation. Some weather systems are strongly influenced by larger-scale phenomena evolving over longer periods, for which forecasts can be quite reliable. For example, weather in much of the tropics depends on the El Niño Southern Oscillation (ENSO), a pattern of sea surface temperature and atmospheric pressure in the tropical Pacific which persists for several months. Other systems may be dominated by brief, local processes that are much harder to predict, like convection[1].  In general, models are able to simulate the downward cascade of energy, from large movements like the jet stream down to small waves and turbulence. The reverse is not so easy: it is a major challenge to represent the effects of processes occurring at small physical scales and over short time periods, like turbulence and convection, on the wider weather. Consistent with the butterfly effect, some of the effects of these small-scale processes are inherently unpredictable and must be represented by random noise.

A good test of the usefulness of a seasonal forecast is whether it offers an improvement over simply looking at average conditions. In other words, can we do better than just saying that summer temperatures this year will be the same as they have been on average over the last 30 years? Weather prediction models beat such statistical forecasts in the tropics, where the influence of ENSO is strong and fairly predictable. This has not generally been the case over Europe and other higher-latitude regions, where many different phenomena interact (Buizza, 2014).  However, the latest forecast systems are starting to show some skill even there (Scaife, 2014b).
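
One standard way to put a number on “better than climatology” is a mean-squared-error skill score, which is 1 for a perfect forecast, 0 for a forecast no better than the long-term average, and negative when climatology wins. This is a common verification measure rather than one the articles cited here specify, and the toy numbers below are invented purely for illustration.

```python
import numpy as np

def msss(forecast, observed, climatology):
    """Mean-squared-error skill score: 1 = perfect, 0 = no better than
    climatology, negative = worse than climatology."""
    mse_forecast = np.mean((np.asarray(forecast) - np.asarray(observed)) ** 2)
    mse_climatology = np.mean((np.asarray(climatology) - np.asarray(observed)) ** 2)
    return 1.0 - mse_forecast / mse_climatology

# Toy example: summer-mean temperatures (deg C) over five years.
observed = [16.2, 17.1, 15.8, 16.9, 17.4]
climatology = [16.5] * 5                       # the 30-year average, reused every year
forecast = [16.4, 17.0, 16.0, 16.6, 17.1]      # a hypothetical seasonal forecast

print(f"skill score: {msss(forecast, observed, climatology):.2f}")  # ~0.86
```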

Temperature forecasts several months ahead are often better than looking at long-term data. Predictive skill for precipitation, however, is much lower, because rainfall is driven partly by local processes, right down to how individual raindrops form. Temperature, on the other hand, tends to be controlled by larger, more predictable features (Buizza, 2014). That said, the disastrous floods in Pakistan in 2012 were well forecast a week ahead by ECMWF because, in that particular situation, the rainfall was controlled by large air movements that were relatively well understood (Hoskins, 2012).

The challenging reality is that predictability varies from case to case according to the physical factors controlling each phenomenon. Extracting predictive information across both space and time scales can allow us to unpick these convoluted problems and make real improvements in seasonal prediction (Hoskins, 2012).

With thanks to Brian Hoskins for his helpful review comments.

References

Lorenz, 1963. Deterministic non-periodic flow. JAS 20:130-141.

UKMO, 2014. Summer 2009. http://www.metoffice.gov.uk/climate/uk/2009/summer.html. Accessed 20/03/2014.

Bechtold, P., N. Semane, P. Lopez, J.-P. Chaboureau, A. Beljaars, N. Bormann, 2013: Representing equilibrium and non-equilibrium convection in large-scale models. ECMWF RD TM 705, available at http://www.ecmwf.int/publications/library/ecpublications/_pdf/tm/701-800/tm705.pdf

Buizza, R., 2014. Coupled prediction: opportunities and challenges. Seminar, Imperial College London, 18th March.

Scaife, A., 2014a. Forecasting European winters: present capability and potential improvements. Willis Research Network Seminar: Forecasting, Flood & Fortitude: An Afternoon with Willis, 18th March.

Scaife, A., 2014b. Skilful long range prediction of European and North American winters. GRL, in press.

Hoskins, B., 2012. The potential for skill across the range of the seamless weather-climate prediction problem: a stimulus for our science. QJRMS 139(672):573-584.


[1] Convection occurs when local heating causes air to rise.

Stranding our fossil assets or stranding the planet

By Helena Wright, Research Postgraduate, Centre for Environmental Policy

Earlier this month Carbon Tracker came to Imperial College London to discuss their report on ‘Unburnable Carbon’.  The report outlines research showing that between 60% and 80% of the coal, oil and gas reserves of publicly listed companies are ‘unburnable’ if the world is to have a chance of keeping global warming below the globally agreed limit of 2°C.  The event was followed by a lively debate.

The research, led by the Grantham Research Institute at LSE and the Carbon Tracker Initiative, outlines the thesis that a ‘carbon bubble’ exists in the stock market, as companies with largely ‘unburnable’ fossil fuel reserves are being overvalued.

In fact, the OECD Secretary-General Angel Gurria recently said:

“The looming choice may be either stranding those [high carbon] assets or stranding the planet.”

Digging a hole: ever deeper extraction, ever higher risks

The report found that, despite these systemic risks, companies spent $674 billion last year to find and ‘prove’ new fossil fuel reserves.  Capital expenditure has been increasing while production has been decreasing, as reserves become ever harder to reach.

Companies like Exxon and Shell have been spending record sums trying to prove reserves that ultimately risk being stranded in the future. The research by Carbon Tracker suggests this is a faulty business model, and one that risks further inflating the ‘carbon bubble’.

If these high levels of capital expenditure continue, over $6 trillion will be allocated to developing fossil fuel supplies over the next decade – a huge sum of potentially wasted capital.  Luke Sassams outlined evidence that some companies are now starting to pick up on this and rein in their capital expenditure.

Investors and regulators are now picking up on the issue.  A Parliamentary report on the ‘carbon bubble’ was released last week, and the Chair of the House of Commons Environmental Audit Committee, Joan Walley MP, said: “The UK Government and Bank of England must not be complacent about the risks of carbon exposure in the world economy”.

Carbon Entanglement: Getting out of the bubble

One issue highlighted is that some OECD governments receive rents and revenue streams from fossil fuels.  There is also an issue of policy credibility: if businesses do not believe governments are serious about tackling climate change, they may carry on investing in fossil fuels and perpetuate the entanglement.

It seems that investors are currently backing a dying horse. But continued expenditure on finding new fossil fuel reserves might also be testament to the failures of recent climate policy.

Some have argued that the ‘carbon bubble’ thesis relies on the assumption that governments will act on climate change. But arguably it is not a question of ‘whether’ such government regulation will happen, merely a matter of ‘when’.  There is a systemic financial risk to fossil assets whether the necessary regulation happens pre-emptively or as a result of severe climatic disruption.

In the discussion that followed, the audience debated whether the ‘carbon bubble’ will actually burst; several participants suggested it was likely to do so unless it is deflated in a measured way. An audience member asked: “Don’t the investors have the information already?” Various participants felt they do not, demonstrating the need for enhanced disclosure of carbon risk.

Finally, the discussion turned to the institutional investors who invest in fossil fuels.  Some commentators recognise the irony: how can a pension fund claim to be helping pensioners while potentially risking the lives of their grandchildren?  Several universities, including Imperial College, have also been found to invest in fossil fuels, sparking a recent petition. The risks of climate change highlighted in the recently released IPCC AR5 report are driving calls for all types of investors to recognise the risks of high-carbon investment.

New Climate Economy Collaboration with Stanford University

By Phil Sandwell, Research postgraduate, Department of Physics and Grantham Institute for Climate Change


This March, six Imperial students travelled to Palo Alto, California, to work with Stanford University students on the innovation stream of the New Climate Economy.

The Global Commission on the Economy and Climate was established to investigate the economic benefits and costs associated with climate change mitigation and adaptation. Its flagship project is the New Climate Economy, a worldwide collaboration of internationally renowned research institutions organised into several research streams. The innovation stream was spearheaded by Stanford University and the Grantham Institute at Imperial College London.

The aim of this part of the project was to analyse how disruptive technologies, techniques or methods could develop, overtake their incumbent (and generally environmentally damaging) predecessors and mitigate greenhouse gas emissions. These ranged from carbon capture and storage to 3D printing, with my focus being concentrated photovoltaics (CPV).


Beginning in January, we held weekly video conferences with the two Stanford professors facilitating the course. Using their guidance and experience, we established the current limitations of our chosen technologies, how they are likely to advance and the conditions under which their development can be accelerated.

After travelling to Palo Alto, we were divided into groups with the Stanford students based on the themes of our research, for example electric vehicles and car sharing. We then integrated our findings, investigating synergies and common themes, and together built models to quantify the potential for greenhouse gas emissions reductions and how they might be achieved.

My research led to the conclusion that CPV could become economically competitive with the most common solar technology, flat-plate crystalline silicon cells, in the near future. Provided certain conditions are met (for example, the cost per watt continuing to decline as currently projected), CPV would compare favourably in regions with high direct normal irradiance, such as the Atacama Desert in Chile, the Australian outback and South Africa. One possible application of CPV would be supplying these countries’ mining industries, displacing their current fossil-fuel-intensive electricity generation and providing an environmentally responsible alternative – with even less embedded carbon and energy than silicon cells.
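
To give a feel for the kind of cost projection involved, here is a minimal sketch of an experience-curve calculation, in which cost per watt falls by a fixed fraction for each doubling of cumulative installed capacity. Every number in it (starting costs, growth factors, the 20% learning rate) is an invented placeholder for illustration, not a figure from the project.

```python
import math

def projected_cost(cost0, growth_factor, learning_rate):
    """Experience curve: cost falls by `learning_rate` (e.g. 20%) for each
    doubling of cumulative capacity; `growth_factor` is the factor by which
    cumulative installed capacity grows over the projection period."""
    b = math.log2(1.0 - learning_rate)  # progress exponent (negative)
    return cost0 * growth_factor ** b

# Hypothetical inputs: CPV starts costlier but from a tiny installed base,
# so the same market expansion gives it many more cost-cutting doublings.
cpv = projected_cost(cost0=2.50, growth_factor=100, learning_rate=0.20)
si = projected_cost(cost0=0.80, growth_factor=4, learning_rate=0.20)
print(f"projected CPV: ${cpv:.2f}/W, silicon: ${si:.2f}/W")
```

Under these made-up assumptions the two technologies approach cost parity, which is the shape of the argument: a young technology on a steep part of its learning curve can close a large initial cost gap.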

This project was a valuable addition to my PhD, which investigates how several different photovoltaic technologies can mitigate greenhouse gas emissions. Collaborating on the project introduced me to interesting new ways of approaching my work, as well as identifying parallels between my research and that of others in the field of renewable energy technology.