By Dr Flora Whitmarsh, Grantham Institute
The recently published 2015 Global Peace Index, produced by the Institute for Economics and Peace, found that although OECD countries became more peaceful in 2014, there has been a substantial increase in annual war-related deaths since 2010, and there are now more refugees than at any time since the Second World War. It is currently difficult to give a definitive answer as to whether climate change could exacerbate these problems.
Climate change is likely to exacerbate a range of environmental problems including heat waves, water shortages, and extreme weather and flooding, but whether or not this will lead to increased rates of armed conflict is still the subject of research. According to the IPCC’s Fifth Assessment Report, some of the factors that increase the risk of conflict – recession, low per capita incomes, and inconsistent state institutions – are sensitive to climate change, but this does not necessarily mean that climate change will lead directly to higher rates of war and unrest.
Environmental factors more generally do play a key role in triggering unrest. An example of this is the role of water shortages in causing the recent Syrian war: farmers who had lost their livelihoods were forced to migrate to the cities, where they struggled to integrate and find work. The resulting uprisings then played a part in triggering war. Although conflict is sometimes triggered by a single environmental problem which acts as a tipping point, environmental factors usually combine with other factors to exacerbate war and unrest.
In his book, Global Crisis: War, Climate Change and Catastrophe in the Seventeenth Century, Geoffrey Parker argues that the unusually cool conditions (a 3 degree lower global temperature) during the seventeenth century “Little Ice Age” played a key role in sparking the steep rise in conflict sometimes referred to as the General Crisis. Surviving records from Europe and Asia paint a picture of extremely widespread famine and instability. In Europe for example, almost no country avoided war during the 1640s, and between 1618 and 1678, “Poland was at peace for only 27 years, the Dutch Republic for only 14, France for only 11, and Spain for only 3.”
Parker argues that the seventeenth-century environmental deterioration, caused by a decrease in solar energy reaching Earth and an increase in volcanic and El Niño activity, had “few parallels” and that “the frequency of wars and state breakdown created unprecedented political, social and economic instability.” Although the modern world is more integrated than the world of the seventeenth century, the argument linking periods of environmental instability with increased rates of conflict is potentially applicable to human-induced climate change.
Dr Mark Workman of Imperial College London said, “The hypothesis for the causes of the General Crisis is very compelling”, but cautioned that “in the modern globalised and increasingly interconnected world the causative mechanisms are very controversial and I think the jury’s still out as to the degree to which environmental impacts are important as a cause in contemporary conflict.”
But environmental problems don’t always lead to conflict. “One thing that’s been missing from the current narrative is that there are opportunities for confidence building measures and co-operation,” says Dr Workman. Disputes over trans-boundary rivers often lead to increased cooperation because neighbouring regions or countries have an incentive to work together to make sure water is distributed fairly. “As history has demonstrated, through confidence building measures and sensible mechanisms of water allocation or resource allocation you can actually find ways to make one plus one equal two”, continued Dr Workman.
More research is needed to explain the precise mechanisms by which environmental factors spark conflict, and how this relates to climate change. Key to this is acquiring more and better data. Relevant data can be hard to obtain at present; for example the global FAO food price index is of little use in evaluating stressors in Tunisia and Syria, both of which have very localised food markets. In addition, research has tended to centre on the Middle East and North Africa Region, so there is a need for more widely applicable research.
A high proportion of conflicts worldwide now occur within, rather than between, countries; the same is likely to be true of conflicts involving environmental problems, so future work is likely to focus especially on this area. Dr Workman says, “Environmental enhanced conflict will be more a subnational manifestation, because there is world trade at a national level, whereas within fragile conflict-affected states, which are home to nearly 20% of the world’s population, there are limited trade and transport mechanisms to address the resource arbitrage.”
With its wealth of technical expertise, the Grantham Institute at Imperial College London is well placed to bring together the fragmented work and carry out new research in this area. There will be three key steps involved in taking this forward. First, knowledge gaps need to be identified. Second, theories about how conflict interacts with environmental factors need to be verified using case studies. The third and most challenging step is developing ways of predicting how environmental factors may exacerbate or cause conflict in the future.
A publication released last week has highlighted an agenda for research.
This blog post by Malcom Graham, an SSCP DTP student, is part of a series on Responding to Environmental Change, an event organised by the Natural Environment Research Council (NERC) funded Doctoral Training Partnerships at Imperial (SSCP), and the University of Reading and the University of Surrey (SCENARIO).
See the full list of blogs in this series here.
Environmental hazards are becoming more frequent and severe, with potentially serious impacts on people, supply chains and infrastructure globally. Advancing our knowledge and understanding of these hazards, and the processes involved, will allow us to better predict, plan for and manage the risks in order to increase resilience to these changes.
This session offered perspectives from academia (Imperial College London), the world of (re)insurance (Willis Re) and the charity sector (Oxfam).
David Simmons, the Head of Strategic Capital and Result Management at Willis Re, began proceedings and impressed us by speaking with no slides or notes, describing it as a ‘liberating’ experience. Despite (or perhaps helped by) the absence of visual aids, his delivery was engaging and humorous.
His talk focussed on the world of reinsurance, which he assured us was the ‘sexy’ part of the insurance sector, specialising as it does in catastrophe risk. He contrasted this with the banal nature of regular insurance work and the social death that ensues for most practitioners.
We were told that reinsurance, which covers the insurance companies themselves against major disasters, is suffering from too much capital. Stoically, David explained the reasons behind this: essentially, due to financial uncertainty in other sectors, no one else could offer the low risk and high returns on investment now commonplace in the reinsurance industry. This he attributed to a much greater understanding of catastrophe risk over the last few years than had previously existed.
Following on from Don Friedman’s modelling of hurricanes in the 1980s, which provided a basis for hazard and probability analysis, David explained how there has since been massive investment in producing ever more reliable models to understand these elements. Indeed, the process of developing models in itself seems to have driven the understanding of various components and allowed constraints to be placed on the ‘unknown unknowns’, a Rumsfeldism which seems to make its way into most talks on modelling these days.
The price of reinsurance has apparently dropped substantially in recent times, driven by the unprecedented levels of investment. In particular, we were told that reinsurance for many parts of the developing world comes at negligible cost, due in part to a reduction in the number of deaths from droughts as a result of more reliable aid. Although this is clearly a positive development, David was keen to point out that the arrival of aid was often too slow to prevent significant human suffering and damage to assets and infrastructure. The focus has therefore turned to more timely interventions and having better systems in place for disaster response.
We learnt that insurers are now playing an important role in driving best practice from governments, with many African countries having to present draft disaster response plans, audited reports on actual responses implemented by the government and the results of anti-corruption tests before they can join insurance programs.
David’s talk closed with commentary on the growth of various large-scale insurance schemes, many of them covering multiple countries. He cited the example of the African Risk Capacity, which is expanding from 5 to 10 members, and a scheme in the Caribbean which is now expanding into Latin America. He did highlight some pitfalls with the more inclusive approach to insurance, contrasting the approach to flood insurance in the UK, where higher risk properties pay an additional premium, with the French system where all households pay the same, thereby removing some of the incentive for individuals to reduce their risk.
Our second talk of the session came from Martin Rokitzki, former resilience advisor for climate change adaptation at Oxfam. Humbly professing to be ‘the least scientific person in the room’, he could nevertheless point to 15 years of practical experience working on climate change and environmental issues.
His talk began by looking at what is actually meant by the term ‘resilience’, which appears to have numerous definitions relating to one’s ability to cope, adapt, prepare or thrive when faced with shocks, stresses or uncertainties.
When presented with such an uncertain framework, we were unsurprised to learn that there is no ‘cookie-cutter or cook-book’ for resilience and that the term may be applied to a huge range of social and economic groups. By talking about his experiences with Oxfam, Martin was at least able to narrow his focus to addressing the resilience of the world’s poor.
Even within this constraint, understanding hazards and impacts was presented as a multi-faceted exercise. Variations in the spatial extent of damage, its intensity, duration, rate of onset and level of predictability could all have profound effects on the planning process. Counterintuitively, Martin felt that slow-onset hazards were often the hardest to address and his talk focussed on how to deal with challenges of that nature, such as the East African food crisis, glacier melt in Nepal and salt intrusion in Tuvalu.
We were told that Oxfam’s approach to resilience involves 5 key areas: livelihood viability (i.e. the economic buffer to disaster); innovation potential; contingency resources and support access (i.e. provision of aid); integrity of the natural and built environment (in the case of the extreme poor, they are directly dependent on the surrounding natural environment); and social and institutional capacity (i.e. governance).
In contrast to the preceding speaker, Martin’s presentation abounded with eye-catching schematics, highlighting various approaches to disaster management. Key to these were the integration of policy and projects to get a successful outcome. To illustrate this, he presented us with the ‘Cycle of Drought Management’ which moves through stages of preparedness, disaster response and relief, reconstruction and mitigation. Alas, the paucity of data in 80-90% of affected areas means that the preparedness stage is often a huge challenge. Our presenter highlighted this as a key reason for Oxfam to collaborate more closely with scientists.
Towards the end of his talk, Martin touched on Oxfam’s R4 approach to risk, encompassing risk reduction (managing resources well), risk transfer (insurance), risk taking (credit for investment) and risk reserves (savings). Without this sort of strategy, seasonal food shortages could easily become year-round famines. As part of this, Oxfam has been helping to administer financial services in remote rural areas and developing a focus on flexible, forward-looking decision making.
Martin’s final message was that we need more collaboration between the ‘thinkers’ and the ‘doers’ – a clear call for the science community to engage more directly and more frequently with aid agencies and other environmental organisations.
Our final speaker of the session was Imperial’s very own Professor Ralf Toumi, who described his ongoing work on the OASIS project, an open access model for looking at the impacts of extreme weather events on the built environment.
His main driver for the project was the limited number of companies providing assessments of risk in this area, thereby giving a fairly narrow field of views on risk to the insurance sector. He reflected that this has not been helped by a continuing barrier of information between researchers and insurers and the ‘black box’ approach to disaster modelling which exists within the commercial world.
Following the previous speaker’s flurry of eye-catching diagrams, Ralf was not shy to present a few schematics of his own, illustrating the concepts behind OASIS. These highlighted the user’s ability to select combinations of models to give a tailor-made view of risk, including a broader spread of results and a greater understanding of model bias and uncertainty. To highlight the point, Ralf asserted that vulnerability modelling (i.e. the damage caused by an event) has a much greater level of uncertainty than hazard modelling. Indeed, one of the key challenges of the OASIS project has apparently been to get hold of datasets on damage, information which some players in the industry have been reluctant to provide.
A further challenge, we were told, is the effect of giving insurers greater experience in using this modelling framework: the desire for greater complexity. Whilst models appear to be ever more powerful (a 30 year dataset can apparently now be used to predict a 1 in 1000 year event!) there is a serious challenge to translate this complexity from the academic / journal environment to insurance professionals. There has also been a need to standardise the wide array of different data formats associated with OASIS’ component models.
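As an aside on how such tail extrapolations work in principle, the sketch below fits a generalised extreme value (GEV) distribution to a short record of annual maxima and reads off a 1-in-1000-year return level. It is only an illustration of the statistical idea, on synthetic data, and is not the method used by any particular catastrophe model.

```python
# Illustrative only: extrapolating a long return period from a short record
# by fitting a generalised extreme value (GEV) distribution to annual maxima.
# The data are synthetic; real catastrophe models are far more elaborate.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)

# Pretend we have 30 years of observed annual maximum losses (arbitrary units).
annual_maxima = genextreme.rvs(c=-0.1, loc=100.0, scale=20.0, size=30,
                               random_state=rng)

# Fit a GEV distribution to the 30-year record.
shape, loc, scale = genextreme.fit(annual_maxima)

# The 1-in-1000-year event is the level exceeded with probability 1/1000 in any year.
level_1000yr = genextreme.isf(1.0 / 1000.0, shape, loc=loc, scale=scale)
print(f"Estimated 1-in-1000-year level: {level_1000yr:.1f}")

# With only 30 data points the fitted shape parameter, and hence this tail
# estimate, carries very large uncertainty - which is precisely the
# translation challenge described above.
```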
Despite these challenges, it appears that OASIS is flourishing. Our presenter proudly displayed a series of media articles after their press release went viral, along with a list of 44 members of the OASIS Loss Modelling Framework, a list that includes numerous insurance and reinsurance companies. Their many associate members include a variety of government bodies, academic institutions and IT companies.
A combined question and answer session followed on the three presentations. It began with the question of how all these ‘big complex’ models have been validated with data. Professor Toumi agreed that validation is a huge issue, although hazard validation is much easier to do, using historical datasets, than validating predictions of damage, which sometimes diverge wildly. David Simmons was able to point to a recent paper he had written on model validation and highlighted that the non-stationary world we live in means that there are never sufficient data. Nevertheless, he believed that even non-validated models are better than nothing and that the modelling process aids understanding as much as the end result. He also highlighted that satellite datasets can act as a useful first-pass method for validating models.
The second question focussed on how we transition from looking at short-term resilience to combatting longer-term changes. Martin Rokitzki responded that although we live in a short-term world, transformative scenario planning is more commonly done nowadays, which is often based on narratives rather than data alone. Adaptive management is also more common.
Another audience member (currently working for one of the London mayoral candidates) wondered what question we should pose to mayoral candidates of large cities in relation to risk management and resilience. The panel were somewhat stumped by this, but (ironically) opted to answer the question about a question with another question: Martin Rokitzki wondered who has responsibility for risk management. Should adaptation be a government service, should it be borne by individuals or even by the private sector? David Simmons cited an example of the World Bank trying to cover industrial areas against earthquakes and reward good design through financial incentives. Unfortunately, the scheme struggled through a lack of political will to take decisions which might be unpopular with their electorates, despite having clear long-term benefits.
The final question related to the possible impacts of a catastrophic asteroid strike and the huge disparity between the insurance fund set aside to cover Florida’s coastline against storm damage and flooding ($2 trillion) and the much smaller sum assigned globally for larger-scale catastrophes such as asteroid impacts ($0.5 trillion). David Simmons responded that the insurance industry focuses on the short term, partly due to the five-year tenure of most CEOs, which puts asteroid impacts beyond the timescale of concern. Another contributor to the disparity is that flood insurance in Florida is governed by a regulator. Despite this, David felt that Florida now has enough reinsurance capacity and that there is now a need to better understand hazards like asteroids.
And as we all dwelt on what sort of cosmic destruction may be in store, the session was brought to a close, leaving us with the much simpler conundrum of what to have for our lunch.
Watch a video of the talk on our YouTube channel.
By Dr Flora MacTavish and Dr Simon Buckle
In the press coverage of the recent floods, there has been a lot of discussion about whether the authorities could have been better prepared or responded more effectively. The National Farmers Union has called for the reintroduction of river dredging, although experts argue that dredging may be limited in its effectiveness. Local authorities have been criticised by experts for distributing sand bags rather than encouraging the use of more effective alternatives such as wooden or metal boards.
These are essentially tactical issues, however. It is the government and local authorities that have the vital strategic responsibility for fully embedding weather and climate risks into decisions on the level and focus of investment into flood defences and planning regulations about what can be built and where.
The persistence of the weather pattern that has caused this winter’s exceptional rainfall and floods has been very unusual. However, as the Adaptation Sub-Committee of the UK Climate Change Committee noted in their 2011 report, heat waves, droughts and floods are all expected to get worse as a result of climate change. The recent Intergovernmental Panel on Climate Change assessment of the science (AR5) concluded that average precipitation was very likely to increase in the high and some of the mid latitudes, with a likely increase in the frequency and intensity of heavy precipitation events over land (see our note on The Changing Water Cycle). If we are to improve our resilience, we need to get the strategic policy framework and incentives right.
Unfortunately, for flood risk, this doesn’t seem to be happening yet, despite the Pitt Review after the 2007 floods. In 2011, the Climate Change Committee noted a decline in urban green space in each of the six local authority areas studied, and an increase in hard surfacing in five of the six. The Committee’s 2012 report showed that the UK has become more exposed to future flood risk. It judged that four times as many households and businesses in England could be at risk of flooding in the next twenty years if further steps are not taken to prepare for climate change.
In particular it noted that:
The Committee has acknowledged that the economic and social benefits of new developments may not always be outweighed by the risks of building on flood plains. Decision makers should weigh up the trade-offs between long term risks such as climate change and other shorter term priorities, but the Committee judged that this was not happening “widely or consistently” at the time they wrote their 2011 report.
The government is in the process of trying to implement the Flood Re scheme to address concerns over the affordability and availability of flood insurance, but as our colleagues at the Grantham Research Institute at LSE have noted in their response to the government consultation,
“The design of the Flood Re scheme, which is expected to last until at least 2035, has not taken into account adequately, if at all, how flood risk is being affected by climate change. For this reason, it is likely to be put under increasing pressure and may prove to be unsustainable because the number of properties in future that will be at moderate and high probability of flooding has been significantly underestimated.”
Whether or not these particular floods are due to climate change, this is the sort of thing we expect to see more of in the future. When the immediate crisis is over, the government needs to think hard about its strategic response, which must include mitigation action as well as measures to develop greater resilience to weather and climate related risks.
By Professor Sir Brian Hoskins
The US has been suffering from icy weather and snow storms in recent days. This image from NOAA shows the surface air temperature anomaly for the week 2-8 December – that is the difference from the mean temperature for this time of the year.
It was very cold over North America (where we get lots of news from!) but very warm in Eurasia and parts of the Arctic (where we don’t!). This is the sort of thing the atmosphere can do on short timescales through having a particular pattern of weather.
Governments, planning authorities, companies and individuals all need information about the impact of climate change on extreme weather in the future. A recent paper [1] investigates changes in extremes both globally and locally.
We can be relatively certain of the signature of climate change on a global or continental scale. On the other hand, estimating changes on a country-wide scale is harder and estimating them on a local scale (i.e. to the nearest few kilometres) is very difficult.
In the new study, scientists calculated the proportion of global land area in which certain weather extremes are expected to increase. This can be projected more accurately than the change at an individual location.
The researchers found that the hottest temperatures will get even hotter in half of the global land area within 30 years. This is more useful than simply saying that extreme temperatures will become hotter on average globally. It is also more accurate than making predictions about particular locations.
What impact is climate change having on extreme temperatures and precipitation?
Climate change is already having an impact on the weather we experience. As mean temperatures have increased, the extreme hottest and coldest temperatures have also gone up.
Rainfall patterns have shifted, and there has been an increase in heavy rainfall over most land areas since 1950 (IPCC, 2013).
Globally, these trends are expected to continue over the coming decades. Future trends at a local scale may be obscured by the natural variability of the climate.
Why is this paper interesting?
This study confirms that there are significant uncertainties in how extreme temperature and rainfall will change locally. In some regions, there will probably be no change in extreme weather over the next thirty to fifty years. A small proportion of places may even see a reduction in extreme temperatures or heavy rainfall events over this time period.
Importantly, these uncertainties are mainly due to the natural variability of the climate. Uncertainties in climate models are much less important. Even a perfect model could not accurately project changes in extremes locally on a thirty to fifty year timescale.
Despite this uncertainty, the study found that it is possible to estimate changes in extreme weather as a proportion of global land area. This technique turned out to be more reliable than making projections for individual regions. It is also more informative than the global average change in extremes.
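A heuristic way to see why the aggregated statistic is more reliable (a back-of-the-envelope argument, not a result quoted from the paper): if local trends are obscured by roughly independent internal variability of standard deviation $\sigma$, a projection for any single region carries noise of order $\sigma$, whereas a statistic pooled over $N$ regions, such as the affected fraction of land area, has its noise reduced roughly as
$$\sigma_{\text{pooled}} \sim \frac{\sigma}{\sqrt{N}},$$
so chance regional fluctuations largely cancel when many regions are combined.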
This technique is not completely new, but the paper adds to a body of research looking at how to predict future extremes.
What does this mean for decision makers?
Reinsurance firms and commodities traders both operate in global markets. They could use this technique to evaluate changes in risk, helping to set premiums or prices more appropriately.
Local planners will probably still opt to build resilience to climate threats. This study does not provide any new information about the impacts of climate change on a local level.
This study does however provide further evidence that it is not possible to predict the pace of change locally. There is little reason to wait for more certainty in the science before beginning to build in greater resilience to extremes.
What methods were used in the paper?
The paper measures the proportion of global land area that will experience an increase or decrease in extremes, assessing changes in the frequency of:
Climate models were used to estimate the proportion of global land area in which the above four measures will increase or decrease, and by how much. Firstly, the results of computational simulations using many different climate models were analysed to provide a measure of model uncertainty.
Secondly, the same climate model was run several times with slightly different atmospheric initial conditions. This provided a measure of the uncertainty due to climate variability.
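To make the two-step design concrete, the hypothetical sketch below computes the fraction of land grid cells showing an increase in an extreme index, first across a multi-model ensemble and then across initial-condition runs of a single model. All arrays and parameter values are invented for illustration; this is not the authors’ code or data.

```python
# Illustrative sketch of the study's two sources of spread (synthetic data only):
# (1) spread across different climate models, and
# (2) spread across runs of one model from slightly different initial conditions.
import numpy as np

rng = np.random.default_rng(0)
n_cells = 5000        # imaginary land grid cells
true_signal = 0.3     # forced change in some extreme index (arbitrary units)
n_members = 20

def land_fraction_increasing(changes):
    """Fraction of land cells in which the extreme index increases."""
    return float(np.mean(changes > 0.0))

# (1) Multi-model ensemble: each model adds a structural bias plus internal noise.
multi_model = [true_signal + rng.normal(0.0, 0.2)
               + rng.normal(0.0, 1.0, n_cells) for _ in range(n_members)]

# (2) Single-model initial-condition ensemble: same model, different weather noise.
single_model = [true_signal + rng.normal(0.0, 1.0, n_cells)
                for _ in range(n_members)]

for name, ensemble in [("multi-model", multi_model),
                       ("initial-condition", single_model)]:
    fractions = [land_fraction_increasing(m) for m in ensemble]
    print(f"{name}: land fraction with an increase = "
          f"{np.mean(fractions):.2f} +/- {np.std(fractions):.2f}")

# If the spread in (2) is comparable to the spread in (1), internal variability,
# not model error, dominates the uncertainty, while the aggregated land fraction
# itself remains a relatively stable quantity - the paper's central point.
```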
What were the main conclusions?
There are significant uncertainties in how temperature and precipitation extremes will change on a regional scale. These uncertainties are largely due to climate variability, rather than any errors in how models represent climate processes.
The proportion of global land area that will be affected by changes in extremes can be estimated much more reliably. This technique should prove useful to organisations who can make use of global scale information about climate change impacts.
Half of land areas are expected to have hotter temperature extremes within 30 years.
By the period 2016-2035, the proportion of land area experiencing more intense precipitation is expected to increase.
Further reading from the Grantham Institute
Grantham Briefing Note 1: The slowdown in global mean surface temperature rise
Grantham Briefing Note 5: The changing water cycle
References
[1] Fischer, E. M., Beyerle, U. & Knutti, R. (2013) Robust spatially aggregated projections of climate extremes. Nature Climate Change.
[2] IPCC, 2013: Climate Change 2013: The Physical Science Basis. Contribution of Working Group I to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change [Stocker, T. F., D. Qin, G.-K. Plattner, M. Tignor, S. K. Allen, J. Boschung, A. Nauels, Y. Xia, V. Bex and P. M. Midgley (eds.)]. Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA, in press. Available on the web at: http://www.climatechange2013.org/report/review-drafts/.
BBC’s Question Time on 14 November saw Lord Lawson citing the IPCC findings to support one of his arguments. Did I dream that? Then I realised that, of course, the reference to the IPCC was incomplete and misleading so I knew I was awake and back in the strange media-distorted world of the UK debate on climate change.
According to the Daily Express, Lord Lawson said that “If you look at the inter-governmental panel on climate change they say there is absolutely no connection between climate change and tropical storms.” Wrong, but convenient for someone who argues we probably don’t need to do anything much about climate change.
What the IPCC actually said in the admirably cautious Technical Summary of the Fifth Assessment Report (AR5) was that:
“Globally, there is low confidence in attribution of changes in tropical cyclone activity to human influence. This is due to insufficient observational evidence, lack of physical understanding of the links between anthropogenic drivers of climate and tropical cyclone activity, and the low level of agreement between studies as to the relative importance of internal variability, and anthropogenic and natural forcings.”
So far so good for Lord Lawson, but then, the IPCC goes on to say:
“Projections for the 21st century indicate that it is likely that the global frequency of tropical cyclones will either decrease or remain essentially unchanged, concurrent with a likely increase in both global mean tropical cyclone maximum wind speed and rain rates (Figure TS.26). The influence of future climate change on tropical cyclones is likely to vary by region, but there is low confidence in region-specific projections. The frequency of the most intense storms will more likely than not increase substantially in some basins. More extreme precipitation near the centers of tropical cyclones making landfall are likely in North and Central America, East Africa, West, East, South and Southeast Asia as well as in Australia and many Pacific islands.” (my emphasis).
In making this statement, the IPCC reflects the fact that, while the science is by no means settled, there are a number of studies that provide physical mechanisms linked to climate change that suggest the frequency of the most intense storms would increase with warming. As I understand it, the warmth of the near surface ocean provides the basic fuel for the cyclone: the winds spiralling around the system evaporate water which cools the ocean and puts latent heat into the atmosphere. When the air rises and the water condenses in deep convection in the storm the heating leads to extra ascent, drawing in more air and leading to faster surface winds. The warmer the ocean is, the more fuel there is for a potential tropical cyclone, and the stronger they could be. Many other aspects come into play such as the changing winds with height and the temperature of the atmosphere up to 15 km. However, in a warmer world, the potential for stronger storms is there.
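One common way to formalise this “ocean as fuel” picture is Emanuel’s potential intensity estimate for the maximum sustainable surface wind speed (a standard reference point I include for illustration, not something spelled out in the IPCC text quoted above):
$$ V_{\max}^{2} \approx \frac{C_k}{C_D}\,\frac{T_s - T_o}{T_o}\,\bigl(k_s^{*} - k\bigr), $$
where $T_s$ is the sea surface temperature, $T_o$ the much colder outflow temperature aloft, $C_k/C_D$ the ratio of the exchange coefficients for enthalpy and momentum, and $k_s^{*} - k$ the difference between the saturation enthalpy at the sea surface and the enthalpy of the overlying air. A warmer sea surface raises both the thermodynamic efficiency factor and the enthalpy difference, which is the formal counterpart of the “more fuel, potentially stronger storms” argument.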
Indeed, based on this sort of evidence, the quote highlighted above is a statement that the IPCC judges there is more than a 50% chance that the frequency of the most intense storms will increase substantially in some ocean basins. So if Typhoon Haiyan was not affected by climate change and yet was still one of the most powerful storms ever to make landfall, it’s clear that the Philippines and other regions exposed to tropical cyclones have a lot to worry about unless there are “substantial and sustained reductions of greenhouse gas emissions” (SPM Section E). It would be good to see Lord Lawson quoting that particular part of the IPCC AR5 report!