Month: April 2014

Collaboration with Stanford University and Biofuels Research at the Joint BioEnergy Institute

By C. Chambon, Research Postgraduate, Department of Chemistry

As part of a group of six Imperial students who visited California, I travelled to San Francisco to work on two projects: the New Climate Economy project, and a research collaboration with the Joint BioEnergy Institute.

The New Climate Economy project is a government-commissioned project looking at how economic goals can be achieved in a way that also addresses climate change. The Innovation stream, led by Stanford University and the Grantham Institute at Imperial, is focused on the potential economic and environmental impact of disruptive technologies. Beginning in January, a group of six Imperial students each focused on a different technology for the project, researching and preparing case studies for our weekly teleconferences. The topics researched were as varied as solar PV, nanomaterials, customer segmentation and the smart grid. My focus was on carbon capture and storage or utilisation (CCUS) technologies, and the policies needed to support them.

Pic 1
The Imperial team at Stanford University

In Palo Alto, we worked together with Stanford students to construct a business model for each of our technology clusters. Our research findings were presented to NRG Energy’s newly formed Station A, a kind of skunkworks for energy resilience within NRG, a wholesale power company. The collaboration was a successful and productive one, and several of us will continue to work with the New Climate Economy project to publish our research. The work will contribute to the UNFCCC COP negotiations in Paris in 2015.

During the latter half of the trip, I combined visits to Stanford with research for my PhD at Lawrence Berkeley National Lab across the bay. The San Francisco Bay Area is renowned as a bioscience and biotech hub, and is home to over 200 bioscience companies, start-ups and research institutes. One of these is the Joint BioEnergy Institute (JBEI), a branch of Lawrence Berkeley National Lab in the Berkeley hills. JBEI is a U.S. Department of Energy bioenergy research center dedicated to developing second-generation biofuels. These are advanced liquid fuels derived from the solar energy stored in plant biomass. The cellulosic biomass of non-food plants and agricultural waste can be converted to petrol, diesel and jet fuel, whilst the non-cellulosic part is a promising candidate to replace aromatic chemicals.

Pic 2
The Joint BioEnergy Research Center in Emeryville, California

My project at JBEI looked at the upgrading of lignin, extracted from the non-cellulosic part of woody biomass, into aromatic building-blocks. This experience was a valuable addition to my PhD project, which looks at the valorisation of lignin from pine wood to improve the economics of the biorefinery. A highlight of my stay was a visit to the scaled-up biorefining facilities at LBNL, where a one-of-a-kind reactor is used to convert biofeedstocks into fuels. It was an inspiring glimpse into the future of biorefining, and I look forward to working closely with LBNL researchers and others working in the field of bioenergy.

The challenge of seasonal weather prediction

By Hannah Nissan, Research Assistant in Regional Climate Modelling, Physics

In April 2009 the UK Met Office issued their now infamous forecast: “odds-on for a BBQ summer”. By the end of August, total precipitation since June had climbed to 42% above average levels for 1971-2000 (UKMO, 2014). Why is it so challenging to provide seasonal forecasts several months ahead?

A question which arises often in conversations about climate change is “how can we predict the climate in 50 years when we can’t even get the weather right next week?” While we have no skill in forecasting the weather on a particular day in 2060, we can make much more confident projections about the average conditions at that time. Some explanation of this comes from mathematician and meteorologist Ed Lorenz, who made an accidental discovery while rerunning weather calculations on his computer (Lorenz, 1963). Taking a seemingly harmless shortcut, he rounded a number to a couple of decimal places, and the result was a completely different weather forecast. His finding was coined the “butterfly effect”, which invokes a powerful analogy: tomorrow’s weather forecast depends so strongly on getting today’s weather right (a significant challenge) that forgetting to account for the effect of a butterfly’s flapping wings is enough to derail it. In contrast, climate is an average of the weather over a few decades, and doesn’t suffer as strongly from this debilitating “initial conditions” problem.
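
Lorenz’s accidental experiment is simple enough to repeat in miniature. The sketch below is a toy illustration (the starting values, step size and run length are my own choices, not Lorenz’s originals): it integrates his 1963 equations twice, once from a chosen starting point and once from the same point rounded to three decimal places, and the two “forecasts” rapidly diverge.

```python
# Toy reproduction of Lorenz's rounding experiment with his 1963 system.
# Parameters sigma, rho, beta are Lorenz's classical values; the initial
# state, step size and run length are illustrative assumptions.

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz-63 system one step with simple Euler integration."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

def run(state, n_steps):
    for _ in range(n_steps):
        state = lorenz_step(state)
    return state

exact = (1.000127, 2.000127, 3.000127)
rounded = (1.000, 2.000, 3.000)   # the "harmless" rounding to 3 decimals

a = run(exact, 2000)    # 20 time units
b = run(rounded, 2000)
print(a)
print(b)  # a tiny initial difference has grown to fill the attractor
```

Running both trajectories side by side makes the point of the butterfly effect: the difference between the two final states is of the same order as the weather itself, not of the original rounding error.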

The distinction becomes rather blurred when we turn to seasonal prediction. When looking at longer term forecasts, say for this coming summer or next winter, correctly characterising the initial conditions of parts of the climate system still matters. This is particularly true for slowly varying fields like soil moisture and the upper ocean, and to a lesser extent for the atmosphere. However, on these timescales other challenges become increasingly important. As we move from forecasting days ahead, to weeks, months, seasons and decades, the number and complexity of physical processes that must be well described by the computer models used to simulate the weather increases. Delivering better forecasts requires improvements on both these fronts, which compete for limited computer resources.

The benefits of developing more sophisticated prediction models must be balanced against the critical need for an indication of the uncertainty attached to a forecast. To do this meteorologists create not just one weather forecast, but a whole ensemble of predictions starting from slightly different initial conditions. Further ensembles are run to capture the uncertainty in the physics of the model itself. The spread of the ensembles tells us something about the range of possible outcomes and their likelihoods. We can never have perfect knowledge of the current state of the weather, nor a perfect model, so running lots of ensembles to develop a probabilistic weather forecast in this way is really important.
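
To make the ensemble idea concrete, here is a toy example with entirely invented numbers: ten equally weighted ensemble members forecasting total summer rainfall are turned into a probabilistic statement by counting the fraction of members that exceed a threshold.

```python
# Toy probabilistic forecast from an ensemble (all numbers invented).
# Each member is treated as equally likely, so the probability of an
# event is simply the fraction of members in which it occurs.

ensemble_rain_mm = [180, 210, 250, 265, 300, 320, 340, 410, 455, 520]
threshold_mm = 300  # hypothetical "wet summer" threshold

p_wet = sum(1 for m in ensemble_rain_mm if m > threshold_mm) / len(ensemble_rain_mm)
print(f"P(wet summer) = {p_wet:.0%}")  # prints: P(wet summer) = 50%
```

Real operational systems use tens of members and far more careful calibration, but the principle is the same: the ensemble spread is what turns a single deterministic forecast into a statement of likelihood.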

To illustrate the trade-offs involved between running ensembles and improving model complexity, consider an example from the UK Met Office. The wetness and storminess of UK winters are strongly linked with the North Atlantic Oscillation (NAO), a pattern of sea-level pressure between Iceland and the Azores. By investing in research to uncover some of the important physical drivers of the NAO and including these in their model, the Met Office have recently achieved a significant improvement in NAO forecasts (Scaife, 2014a,b). At the European Centre for Medium-Range Weather Forecasts (ECMWF), improving the way convection is modelled delivered better tropical precipitation forecasts and greater skill in predicting seasonal European weather (Bechtold, 2013).

Predictability itself is not a fixed quantity, but varies with each situation. Some weather systems are strongly influenced by larger scale phenomena happening over longer periods of time, for which forecasts can be quite reliable. For example, weather in much of the tropics depends on the El Nino Southern Oscillation (ENSO), a pattern of sea surface temperature and atmospheric pressure in the tropical Pacific which persists for several months. Other systems may be dominated by brief, local processes that are much harder to predict, like convection[1]. In general, models are able to simulate the cascade of energy downwards, from large movements like the jet stream down to small waves and turbulence. The reverse case is not so easy: it is a major challenge to represent the effects of processes occurring at small physical scales and short time periods, like turbulence and convection, on the wider weather. Consistent with the butterfly effect, some of the effects of small-scale processes are inherently unpredictable and must be represented by random noise.

A good test for the usefulness of a seasonal forecast is whether it offers an improvement over simply looking at the average conditions. In other words, does the forecast do a better job than simply saying that summer temperatures this year will be the same as they have been on average over the last 30 years? Weather prediction models beat statistical forecasts in the tropics, where the influence of ENSO is strong and fairly predictable. This has not in general been the case over Europe and in other higher latitude regions, where lots of different phenomena interact (Buizza, 2014). However, the latest forecast systems are starting to show some skill even here (Scaife, 2014b).
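
This comparison against the long-term average can be made quantitative with a mean-squared-error skill score: one minus the ratio of the forecast’s error to the error of always predicting climatology. The sketch below uses invented numbers purely for illustration; a positive score means the forecast adds value over climatology, zero means it does no better, and negative means it does worse.

```python
# Mean-squared-error skill score of a forecast relative to climatology.
# All values below are invented for illustration.

observed    = [15.2, 16.8, 14.9, 17.5, 16.1]  # e.g. seasonal-mean temperatures, deg C
forecast    = [15.0, 16.5, 15.4, 17.0, 16.3]  # hypothetical model forecasts
climatology = 16.0                            # long-term average used as baseline

def mse(pred, obs):
    """Mean squared error between paired predictions and observations."""
    return sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)

mse_f = mse(forecast, observed)
mse_c = mse([climatology] * len(observed), observed)
skill = 1 - mse_f / mse_c
print(f"skill score = {skill:.2f}")  # prints: skill score = 0.86
```

In this toy case the forecast comfortably beats the 30-year average; for real seasonal precipitation forecasts outside the tropics, as the text notes, the score often hovers near zero.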

Temperature forecasts several months ahead are often better than a simple long-term average. Predictive skill for precipitation, however, is much worse. This is because rainfall is driven partly by local processes, right down to how individual raindrops are formed. Temperature, on the other hand, tends to be controlled by larger, more predictable features (Buizza, 2014). That said, the disastrous floods in Pakistan in 2012 were well forecast a week ahead by ECMWF because, in that particular situation, the rainfall was controlled by large air movements that were relatively well understood (Hoskins, 2012).

The challenging reality is that predictability varies from case to case according to the physical factors controlling each phenomenon. Extracting predictive information across both space and time scales can allow us to unpick these convoluted problems and make real improvements in seasonal prediction (Hoskins, 2012).

With thanks to Brian Hoskins for his helpful review comments.

References

Lorenz, 1963. Deterministic non-periodic flow. JAS 20:130-141.

UKMO, 2014. Summer 2009. http://www.metoffice.gov.uk/climate/uk/2009/summer.html. Accessed 20/03/2014.

Bechtold, P., N. Semane, P. Lopez, J.-P. Chaboureau, A. Beljaars, N. Bormann, 2013: Representing equilibrium and non-equilibrium convection in large-scale models. ECMWF RD TM 705, available at http://www.ecmwf.int/publications/library/ecpublications/_pdf/tm/701-800/tm705.pdf

Buizza, R., 2014. Coupled prediction: opportunities and challenges. Seminar, Imperial College London, 18th March.

Scaife, A., 2014a. Forecasting European winters: present capability and potential improvements. Willis Research Network Seminar: Forecasting, Flood & Fortitude: An Afternoon with Willis, 18th March.

Scaife, A., 2014b. Skilful long range prediction of European and North American winters. GRL, in press.

Hoskins, B., 2012. The potential for skill across the range of seamless weather-climate prediction problem: a stimulus for our science. QJRMS 139(672):573-584.


[1] Convection is when local heating causes air to rise

Stranding our fossil assets or stranding the planet

By Helena Wright, Research Postgraduate, Centre for Environmental Policy

Earlier this month Carbon Tracker came to Imperial College London to discuss their report on ‘Unburnable Carbon’. The report outlines research showing that between 60 and 80% of the coal, oil and gas reserves of publicly listed companies are ‘unburnable’ if the world is to have a chance of keeping global warming below the globally-agreed limit of 2°C. The event was followed by a lively debate.

The research, led by the Grantham Research Institute at LSE and the Carbon Tracker Initiative, outlines the thesis that a ‘carbon bubble’ exists in the stock market, as companies with largely ‘unburnable’ fossil fuel reserves are being overvalued.

In fact, the OECD Secretary-General Angel Gurria recently said:

“The looming choice may be either stranding those [high carbon] assets or stranding the planet.”

Digging a hole: ever deeper extraction, ever higher risks

The report found that despite these systemic risks, companies spent $674 billion last year to find and ‘prove’ new fossil fuel reserves. Capital expenditure has been increasing while production has been decreasing, as reserves become ever harder to reach.

Companies like Exxon and Shell have been spending record sums trying to prove reserves that ultimately risk being stranded in future. The research by Carbon Tracker suggests this is a faulty business model, and in fact risks inflating the ‘carbon bubble’.

If these high levels of capital expenditure continue, we will see over $6 trillion allocated to developing fossil fuel supplies over the next decade – a huge sum of wasted capital. Luke Sussams outlined evidence that some companies are now starting to pick up on this and rein in their capital expenditure.

Investors and regulators are now picking up on the issue. A Parliamentary Report on the ‘carbon bubble’ was released last week, and the Chair of the House of Commons EAC, Joan Walley MP, said: “The UK Government and Bank of England must not be complacent about the risks of carbon exposure in the world economy”.

Carbon Entanglement: Getting out of the bubble

One issue that has been highlighted is the fact that some OECD governments receive rents and revenue streams from fossil fuels. There is also a policy credibility issue. If businesses do not believe governments are serious about tackling climate change, they may carry on investing in fossil fuels and perpetuate the entanglement.

It seems that investors are currently backing a dying horse. But continued expenditure on finding new fossil fuel reserves might also be testament to the failures of recent climate policy.

Some have argued the ‘carbon bubble’ thesis relies on the assumption that governments will act on climate change. But arguably, it is not a question of ‘whether’ this government regulation will happen, but of ‘when’. There is a systemic financial risk to fossil assets, whether the necessary regulation happens pre-emptively or as a result of severe climatic disruption.

In the discussion that followed, the audience discussed whether the ‘carbon bubble’ will actually burst, and several participants suggested it was likely to burst unless it is deflated in a measured way. An audience member asked: “Don’t the investors have the information already?” and various participants felt they do not, demonstrating the need for enhanced disclosure on carbon risk.

Finally, the discussion turned to institutional investors who are investing in fossil fuels. Some commentators recognise the irony: how can a pension fund claim to be helping pensioners while potentially risking the lives of their grandchildren? It has also been found that several universities, including Imperial College, invest in fossil fuels, sparking a recent petition. The risks of climate change highlighted in the recently released IPCC AR5 report are driving calls for all types of investors to recognise the risks of high-carbon investment.

New Climate Economy Collaboration with Stanford University

By Phil Sandwell, Research postgraduate, Department of Physics and Grantham Institute for Climate Change


This March, six Imperial students travelled to Palo Alto, California, to work with Stanford University students on the innovation stream of the New Climate Economy.

The Global Commission on the Economy and Climate was established to investigate the economic benefits and costs associated with climate change mitigation and adaptation. The flagship project of this is the New Climate Economy, a worldwide collaboration of internationally renowned research institutions. One such stream, focusing on innovation, was spearheaded by Stanford University and the Grantham Institute at Imperial College London.

The aim of this part of the project was to analyse how disruptive technologies, techniques or methods could develop, overtake their incumbent (and generally environmentally damaging) predecessors and mitigate greenhouse gas emissions. These ranged from carbon capture and storage to 3D printing, with my focus being concentrated photovoltaics (CPV).


Beginning in January, we held weekly video conferences with the two Stanford professors facilitating the course. Using their guidance and experience, we established the current limitations of our chosen technologies, how they are likely to advance and the conditions under which their development can be accelerated.

After travelling to Palo Alto, we were divided into groups with the Stanford students based on the themes of our research, for example electric vehicles and car sharing. We then integrated our findings, investigating the synergies and similar themes, and together built models to quantify the potential for greenhouse gas emissions reduction and how it could become achievable.

My research led to the conclusion that CPV can become economically competitive with the most common solar technology, flat-plate crystalline silicon cells, in the near future. Provided certain conditions are met (for example, the cost per watt continuing to decline as currently projected), CPV would compare favourably in regions with high direct normal irradiance, such as the Atacama Desert in Chile, the Australian outback and South Africa. One possible application of CPV would be supplying these countries’ mining industries, displacing their current fossil fuel-intensive electricity generation and providing an environmentally responsible alternative – with even less embedded carbon and energy than silicon cells.

This project was a valuable addition to my PhD project, the focus of which is to investigate how several different photovoltaic technologies can mitigate greenhouse gas emissions. Collaborating on this project introduced me to interesting new ways of approaching my work, as well as identifying parallels between my research and that of others in the field of renewable energy technology.

A slow start to a global climate treaty

By Gabriele Messori, Stockholm University (former Imperial PhD student)

The United Nations’ climate negotiations usually gain the press spotlight once a year, when the big Conference of the Parties (COP) meeting takes place. The most recent COP, which took place in Warsaw last November, was discussed on this blog here. However, the efforts to design a global climate treaty under the umbrella of the United Nations are ongoing, and additional negotiations take place throughout the year. These are particularly important in preparing the ground for the COPs, and provide the occasion to iron out disagreements which might hamper later work.

The most recent of these meetings took place last week in Bonn, Germany. Formally, this was the fourth part of the second session of the Ad Hoc Working Group on the Durban Platform for Enhanced Action, or ADP 2-4. The focus was on two distinct topics: firstly, the ongoing effort to design a global climate treaty, which should be agreed upon by 2015 and implemented by 2020; secondly, the promotion of ambitious mitigation plans for the period before 2020. However, several points of contention emerged in the talks.

Far from reaching a quick consensus on the key topics, the participating countries raised several procedural issues which bogged down the discussions. These ranged from trivial aspects, such as the size of the meeting rooms assigned to the different groups, to more important considerations on the modality of the negotiations.

The crucial point was whether to proceed with informal consultations or establish contact groups. In the jargon of the United Nations, a contact group is an open-ended meeting which provides a more formal setting for the negotiations. A contact group needs to be officially established and its sessions are generally open to observers. The last two years have seen the negotiations carried out as informal consultations. Some countries, including the EU, opposed the creation of a contact group. Many others, including the least developed countries, argued that a new, formal setting was needed.

The latter proposal was finally adopted, thus establishing a contact group. However, the debate that preceded the decision was lengthy. While having an appropriate setting for the negotiations is important, the focus should always remain on climate change, which is the reason these meetings exist in the first place!

A second crucial discussion concerned the Nationally Determined Contributions (NDCs). These are national plans for action on climate change, made by all countries participating in the talks, and should form an important part of the 2015 climate treaty. At present, there is still no clarity on fundamental points such as the form the NDCs should take, the topics they should address and the mechanisms for evaluating their progress. There is also a strong disagreement on how the burden of action should be shared between developed, developing and least developed countries. This is just a small selection of the unanswered questions concerning the national contributions; the complete list is much longer. Positions on these key aspects vary greatly. As an example, Brazil explicitly asked for the contributions to encompass the full range of actions needed to tackle climate change, including both mitigation and adaptation. Tuvalu, on the other hand, clearly stated that the NDCs should focus primarily on mitigation. Agreeing on the nature of the NDCs is one of the most challenging aspects of the negotiations.

On a more positive note, the work on pre-2020 action included for the first time technical expert meetings. These are meetings where experts can share best practices on renewable energy and energy efficiency with the country delegates. The meetings were praised by the vast majority of countries, and there were requests by a number of delegates, including those of the EU and the USA, to arrange similar meetings in future negotiations.

The week-long talks in Bonn also addressed many other topics, including transparency and equity in the 2015 climate agreement and climate finance.

Leaving aside the disagreements over specific items of the agenda, and considering the larger picture, the impression in Bonn was of a framework that is still missing some of its essential elements. While the technical expert meetings had a promising start, a lot still needs to be done both in terms of pre-2020 action and the 2015 climate treaty. In Warsaw last year, countries agreed to present their Nationally Determined Contributions “well in advance” of the 2015 COP, which will take place in Paris. In order for this to happen, there needs to be a rapid acceleration of the negotiations, and procedural issues need to be dealt with swiftly, so that the discussion may focus on more concrete aspects of action on climate change.