The high severity and increasing frequency of major flood events worldwide have driven flooding up the political and risk management agenda. David Martin, Technical Manager of flood risk assessment and modelling consultants Ambiental, discusses the development of a new flood modelling system in the UK, how it can be deployed over the internet using cloud computing, and the advantages of this approach. The risk of flooding to people and property is greater than ever. In the UK, climate scientists are predicting an increase in the frequency of heavy, monsoon-like rainfall, whilst across the Atlantic the US is still reeling from the 2008 hurricane season, one of the most damaging on record. The forecast for 2009 does little to reassure.

Flooding is now a key component of risk management. In the UK, both strategic decision makers and the general public are coming to terms with an increasingly erratic climate, expanding floodplains and recent failures in intra-urban drainage systems. It is no longer safe to assume that if you haven’t flooded before, you won’t flood in the future.

Increased Values at Risk of Flooding
A look at the facts and figures quickly justifies the attention flooding is receiving. Statistics from the US Federal Emergency Management Agency (FEMA) show a total insured value of over $1 trillion in force (for policies which are subject to some degree of flood risk) as part of the National Flood Insurance Program (NFIP). In the UK, the Environment Agency (EA) has previously estimated that around 10% of the population of England and Wales, and assets to the value of £200 billion, are at risk of flooding, although this figure may be much higher if the risk from surface water flooding (induced where local sewer systems cannot cope with heavy rainfall) is fully considered.

However, development within the floodplain still continues. Whilst UK planning policy is steering new proposals out of known floodplains, the guidance focuses on ‘high vulnerability’ schemes such as houses or schools, whilst being far less stringent for commercial buildings.

Furthermore, current policy allows for all but the highest risk developments to go ahead where all alternative ‘low risk’ sites have been ruled out. The problem is compounded by the inevitable concentration of high value assets in high risk areas; low lying areas near rivers and coastal regions have historically attracted human settlement for a variety of reasons including access to fertile land and transport. Indeed, many major international cities like London, Tokyo and New York are located next to rivers or within the coastal floodplain.

Increasing levels of urbanisation and affluence, as well as the tendency to locate high-value assets such as power stations next to water mean that the number and value of properties in the floodplain are continually on the rise.

Conversely, where new development is pushed to the urban fringes, or ends up occupying central urban green space, this places more pressure on the storm drainage network. Greenfield land has a convenient tendency to soak up rainfall and alleviate excess surface water flow, whilst paved areas have the opposite effect. Ageing drainage networks in cities such as London and New York City can no longer cope with the type of rainfall now being experienced.

However, traditional methods of flood risk assessment may not be adequate for predicting areas at risk of flooding, especially in urban areas.


The need for new approaches to flood risk management, mapping & modelling
There have been huge spending pledges towards flood management in the UK and USA. As our understanding of this peril increases, there has been a changing focus in the way that this management is carried out. A more holistic and sustainable approach is being adopted in many countries, identifying how the consequences of flooding can be reduced, as opposed to simply defending areas with ever-increasing levees and barriers.

Regardless of the adopted approach, a sound understanding of the hazard itself is required. Prediction, an integral component of flood management, is achieved using flood modelling tools and techniques.

Flood modelling can be used to analyse, predict and model flood events. Detailed flood models can play a key role in helping to protect people and property from the potentially devastating effects of flooding. Quite simply a flood model determines the volume and passage of water which would be expected during a major flood event.

Depending on the application and level of complexity required, different flood models examine flood risk at different scales with varying levels of accuracy, from detailed, single site analyses to wide-area, catchment-based studies. There tends to be a trade-off between the cost of the modelling exercise and its resolution. Increasing resolution increases cost in terms of data collection, processing and the sophistication of the model required.

Flood modelling can be undertaken using 1-dimensional, 2-dimensional or 3-dimensional (1-d, 2-d or 3-d) modelling techniques:

  • 1-d hydrodynamic flood models are applied to cross-sectional data of the river channel and floodplain. This technique effectively “fills in” the floodplain with the excess / overbank water calculated at a specific point in the channel (hence, 1-dimensional), but does not consider the way in which the water behaves between the sampled cross-sections, or when manipulated by obstacles on the floodplain.
  • 2-d flood modelling differs from 1-d flood modelling in that it simulates differing flow conditions across a potential floodplain, represented as a 2-dimensional grid of cells. A 2-d hydrodynamic model divides the floodplain into a gridded domain (down-stream and cross-stream directions), so complex obstacles such as buildings can be taken into account.
  • 3-d river flood modelling allows flows in three directions (i.e. down-stream, cross-stream and vertically through the channel) to be modelled within a 3-d gridded domain. This is useful for studies of complex flow structures (e.g. turbulence, eddy dynamics) or for detailed, site-specific flood modelling and risk assessment studies.
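To make the 2-d idea concrete, the following is a minimal sketch of a storage-cell scheme of the kind the list describes. It is an illustration only, not Flowroute's actual numerics: each cell carries a ground elevation and a water depth, and each step routes a fraction of any water-surface difference to lower neighbours, so terrain and obstacles steer the flow.

```python
# Toy 2-d storage-cell flood scheme (illustrative, not a production solver).
def step(ground, depth, relax=0.25):
    rows, cols = len(ground), len(ground[0])
    # water surface = ground elevation + water depth
    surface = [[ground[r][c] + depth[r][c] for c in range(cols)] for r in range(rows)]
    out = [row[:] for row in depth]
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((0, 1), (1, 0), (0, -1), (-1, 0)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    diff = surface[r][c] - surface[nr][nc]
                    if diff > 0:
                        # move water downhill, capped by what the cell holds
                        flow = min(relax * diff, depth[r][c] / 4)
                        out[r][c] -= flow
                        out[nr][nc] += flow
    return out

ground = [[2.0, 1.5, 1.0],
          [1.5, 1.0, 0.5],
          [1.0, 0.5, 0.0]]          # terrain sloping towards one corner
depth = [[0.0] * 3 for _ in range(3)]
depth[0][0] = 1.0                   # a point source of flood water
for _ in range(50):
    depth = step(ground, depth)

total = sum(sum(row) for row in depth)
print(f"total water conserved: {total:.4f}")
print(f"depth in lowest cell:  {depth[2][2]:.3f}")
```

Real 2-d codes solve the shallow-water equations rather than this simple relaxation, but the cell-by-cell interaction pattern is the same, which is why domain size drives computational cost so strongly.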

The required computational power for each technique increases by orders of magnitude as ‘dimensions’ are added (rendering 3-d modelling impractical, and indeed unnecessary, for the majority of commercial, wide-area projects).

1-d modelling uses surveyed cross-sections to characterise the river channel, together with hydraulic equations to estimate flood stages for a given flow. This approach works well for rapid, wide-area river flood zoning projects in rural areas, but may not be appropriate when accurate predictions of flow paths, of the depth of water on the floodplain, or of the speed at which water is moving are required.
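The hydraulic equations referred to here are typically of the Manning type, which relates discharge to channel geometry, slope and roughness. The sketch below shows the principle with invented numbers (the channel dimensions, design flow and roughness coefficient are illustrative, not from any real survey):

```python
# Manning's equation for a rectangular channel, plus a bisection solve
# for the flood stage (water depth) that carries a given design flow.
def manning_discharge(depth, width, slope, n):
    """Q = (1/n) * A * R^(2/3) * sqrt(S) for a rectangular channel."""
    area = width * depth              # flow area A (m^2)
    wetted = width + 2 * depth        # wetted perimeter P (m)
    radius = area / wetted            # hydraulic radius R = A / P
    return (1.0 / n) * area * radius ** (2.0 / 3.0) * slope ** 0.5

def stage_for_flow(q_target, width, slope, n):
    """Find the depth giving discharge q_target (Q increases with depth)."""
    lo, hi = 0.0, 50.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if manning_discharge(mid, width, slope, n) < q_target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# e.g. a hypothetical design flow of 500 m^3/s in a 40 m wide channel
depth = stage_for_flow(500.0, width=40.0, slope=0.001, n=0.035)
print(f"estimated flood stage: {depth:.2f} m")
```

A 1-d model applies this kind of calculation at each surveyed cross-section; the shortcoming described next is what happens between and beyond those sections.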

[Figure 2]

More importantly, 1-d flood modelling, using software such as HEC-RAS from the US Army Corps of Engineers, does not enable the prediction of detailed flow paths in the wide gaps between surveyed cross-sections. Typically, a flood map generated using a 1-d model simply extends the flood depth calculated in the channel across the floodplain.

Therefore 1-d modelling has obvious shortcomings when, for example, flood flow paths associated with complex urban floodplains, breaching of flood defences, coastal inundation or surface water flows have to be modelled, mapped and communicated effectively.


Improved 2-dimensional flood modelling using Flowroute
2-d modelling represents the next generation of flood modelling and is becoming increasingly popular in countries such as Australia and the UK, where there are ambitious strategies in place to increase the accuracy and detail of existing nationwide flood maps.

In time, this is likely to translate to the US, given the increasing emphasis on flood management and defences set out by the Obama administration. 2-d models such as Ambiental’s Flowroute™ are being used for international flood modelling projects to create and distribute detailed river, coastal (storm-surge), defence breach, surface water, dam breach and sewer flood maps and models for at-risk areas.

Flowroute was developed in collaboration with flood scientists at Cambridge University. Incorporating over 15 person-years of R&D, this modular software platform can be used to predict almost any form of flooding, providing information on the depth, duration and the extent of flooding as well as water velocities and flow-path evolution down to the level of individual streets and buildings.

The software has the capability to be fully customised. Every single simulated cell can be assigned unique characteristics including: in/out flow, rainfall, standing or dynamic water elevation, limited capacity/surcharging drain, no-flow area, burst pipe etc. This capability allows a modeller to simulate a considerable range of unique scenarios including joint probability events.
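A per-cell configuration of this kind can be pictured as a grid of attribute records. The sketch below is a hypothetical illustration of the concept (the field names are invented for this example, not Flowroute's actual schema):

```python
# Hypothetical per-cell scenario configuration for a 2-d flood model grid.
from dataclasses import dataclass

@dataclass
class Cell:
    ground_elevation_m: float
    inflow_m3s: float = 0.0          # e.g. a burst pipe or river inflow
    rainfall_mmhr: float = 0.0       # direct rainfall onto the cell
    drain_capacity_m3s: float = 0.0  # limited-capacity / surcharging drain
    blocked: bool = False            # no-flow area such as a solid building

# Build a small grid, then customise individual cells for one scenario.
grid = {(r, c): Cell(ground_elevation_m=10.0) for r in range(3) for c in range(3)}
grid[(1, 1)].blocked = True          # a building footprint
grid[(0, 0)].inflow_m3s = 2.5        # burst water main scenario
print(sum(cell.blocked for cell in grid.values()), "blocked cell(s)")  # → 1 blocked cell(s)
```

Joint probability events fall out naturally from this design: assigning, say, both a rainfall rate and a raised standing-water elevation to coastal cells simulates a storm coinciding with a high tide.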

To date the flood prediction system has been used by planners, utilities companies, engineers, insurers and local government to predict with increased precision and accuracy the flood hazard resulting from a wide range of flood sources and scenarios.

Figure 2 shows a Flowroute model of the 2007 UK summer floods loaded into Google Earth, when surface water (otherwise known as ‘pluvial’ flooding) inundated much of the city of Kingston-upon-Hull. The predictive pluvial flood map was found to be over 80% accurate when compared against actual flood survey data collected in the field.

Until recently, there have been two principal issues surrounding wide-area 2-d modelling. Firstly, this type of modelling is extremely computationally demanding – a model domain may contain 10 million cells, all of which need to interact with their neighbouring cells 50 times a second for anything up to 100 simulated hours.
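A back-of-envelope calculation using the figures above shows the scale of the problem (the processing rate assumed at the end is hypothetical, for illustration only):

```python
# Order-of-magnitude estimate of the workload quoted in the text.
cells = 10_000_000            # cells in the model domain
updates_per_sim_second = 50   # cell interactions per simulated second
sim_hours = 100               # simulated duration

total_updates = cells * updates_per_sim_second * sim_hours * 3600
print(f"{total_updates:.2e} cell updates")  # 1.80e+14

# At an assumed 100 million cell updates per second on one core:
rate = 100_000_000
days = total_updates / rate / 86_400
print(f"~{days:.0f} days on a single core at that rate")
```

Around 10^14 cell updates is why, as noted below, unoptimised code on older desktop hardware could take a year or more, and why code optimisation and multi-core parallelism matter so much.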

[Figure 3]

This requires an immensely high number of calculations, which a few years ago would have taken the average desktop PC a year or more to compute. Although computer power has increased significantly over the years, it is the specialist optimisation of the code base and the intelligent linking of multiple processor cores that has really driven a step-change improvement in 2-dimensional flood modelling.

This step-change means that 2-dimensional flood modelling is now the de facto standard in many commercial and academic flood modelling applications. Recent code developments have improved Flowroute’s efficiency to the point that a standard office PC can now generate a city-wide flood map, down to the level of individual buildings, in a few days. A cluster of PCs can be deployed to generate highly detailed national-scale flood mapping products within a matter of weeks.

Figure 3 provides an example of a river defence breach model on the River Thames, London. 2-dimensional flood inundation simulations such as this can now be carried out in a matter of hours.

The second issue has been data availability. In the past, detailed topographic data was either too expensive to be commercially viable, or simply not available. However, huge swaths of high-resolution digital elevation model data have now been collected for many countries in Europe and North America. There has also been a marked increase in the availability of good-quality digital mapping, which can be used to increase the realism of the digital floodplain and therefore enhance accuracy.

It is vital that high quality and high resolution topography is used within flood models – unsuitable topographic data can significantly increase the uncertainty of model predictions.


Breaking new ground - Cloud computing as a platform for improved flood modelling

Recent developments in the field of cloud computing have allowed us to greatly increase our capacity to process very large areas and multiple scenarios in parallel, enabling us to run any number of return periods (e.g. a 1-in-100-year event) at the same time. This scalability means that Flowroute™ is an effective platform for both deterministic and probabilistic flood modelling approaches. The technology not only enables us to model domains the size of London (c. 1,600 km²) at high grid resolution within a matter of days, but also to model any number of other domains at the same time.
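The pattern behind this scalability is simple: each return period (or scenario) is an independent job, so adding nodes increases throughput almost linearly. A hedged sketch, in which `run_flood_model` is a stand-in for a full 2-d simulation rather than any real API:

```python
# Scenario-parallel pattern: each return period is an independent job.
from concurrent.futures import ThreadPoolExecutor

def run_flood_model(return_period_years):
    # Placeholder for a full 2-d simulation of the given event magnitude;
    # the rating formula below is purely illustrative.
    peak_flow = 100.0 * (return_period_years ** 0.3)
    return return_period_years, round(peak_flow, 1)

return_periods = [5, 10, 25, 50, 100, 200, 1000]
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_flood_model, return_periods))
for rp, flow in results:
    print(f"1-in-{rp} year event: peak flow {flow} m^3/s")
```

In a cloud deployment the worker pool becomes a fleet of virtual machines rather than local threads, but the map-over-scenarios structure is the same.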

Traditionally, the high accuracy requirements of deterministic flood modelling have been at odds with probabilistic modelling, the latter being of interest primarily for insurance applications. In deterministic modelling, a single event is specified, simulated and calibrated, generating a flood zone and/or depth map – for example, for the 1-in-100-year flood from a given river.

[Figure 4]

In contrast, a probabilistic approach involves running hundreds, or even thousands, of simulations at a much lower grid resolution so as to be computationally feasible. This approach can be used to generate maps showing the annual probability of flooding at discrete locations within the floodplain, or loss estimates for reinsurance premium setting.
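A minimal Monte Carlo sketch shows how annual probabilities emerge from many simulations. The rainfall and depth relationships below are invented purely for illustration; in practice each sample would be a coarse-grid flood simulation:

```python
# Monte Carlo estimate of annual flood probability at a single location.
import random

random.seed(42)

def simulated_annual_max_depth():
    # Stand-in for one coarse-grid simulation of a synthetic year:
    # heavier rainfall years produce deeper water at the site.
    rainfall = random.expovariate(1.0 / 60.0)    # annual-max storm (mm), mean 60
    return max(0.0, (rainfall - 120.0) / 100.0)  # flood depth at the site (m)

years = 10_000
flooded = sum(1 for _ in range(years) if simulated_annual_max_depth() > 0.0)
probability = flooded / years
print(f"annual flood probability at site: {probability:.3f}")
print(f"≈ 1-in-{1 / probability:.0f} year risk")
```

Repeating this at every cell of a coarse grid yields the probability maps described above; feeding the depths through vulnerability curves yields the loss estimates used for premium setting.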

Although the probabilistic method can be used effectively to analyse and communicate the spectrum of probabilities associated with flooding from multiple sources (e.g. rivers, storm surge, dam burst) at a single location, when it is based upon lower-quality data, errors propagate and uncertainties increase. Each technique has its clear merits and pitfalls but, until now, the modeller has had to choose which method to follow.

However, by coupling the software with Compute-Cloud technology, doors have been opened with respect to on-demand, mass-market 2-d flood modelling. By combining a rapid 2-d code base with almost limitless, on-demand computing power, we are finally starting to bridge the gap between deterministic and probabilistic modelling. It is now possible to have the best of both worlds; detailed, national scale flood hazard maps with reduced uncertainty, lower cost and a more rapid turnaround. It is now up to government agencies, insurers, planners and risk managers to grasp the technology and realise these benefits.

--------------------------------------------------------------------------------------------

David Martin is Technical Manager of Ambiental Technical Solutions (www.ambiental.co.uk), a company that is developing pioneering new technology to help tackle the risks of environmental catastrophes.

Author of this article
Jeff Thurston

Jeff Thurston holds a Master of Science Degree in Geographic Information Science from Manchester Metropolitan University, UK and graduated in Forest Technology from Lakehead University in Canada. Jeff also graduated with an Advanced Diploma in Geographic Information Systems (UNIGIS) from Simon Fraser University, Vancouver, Canada. Previously, Jeff worked at the University of Alberta located in Edmonton, Canada where he managed research facilities for inter-disciplinary research projects. He is based in Berlin, Germany.