Standards dramatically advance streamflow and flood forecasting in the US and elsewhere – first in a 5-part blog series

This is a story about how water data standards, hard computational work, high-performance computing, serendipity, and synergy led to an operational capability for nationwide forecasting of streamflow and flooding at high resolution, in near-real-time. This capability has been evolving for several years, but has gone into hyper-drive in just the last couple of years. 

In May 2014, the opening of a new building on the campus of the University of Alabama in Tuscaloosa triggered the sequence of events that is improving flood forecasting throughout the U.S. and the world. The new building is the U.S. National Water Center (NWC), built by the National Weather Service (NWS) and federal agency partners: the U.S. Geological Survey (USGS), the U.S. Army Corps of Engineers (USACE), and the Federal Emergency Management Agency (FEMA). 


Figure 1. The US National Water Center at the University of Alabama in Tuscaloosa

At the NWC opening ceremony, Professor David Maidment of the University of Texas at Austin called for a transformative change to be brought about by this new resource. Currently, NWS operations are conducted by 13 regional centers around the U.S., each running its own weather and river forecasting models that are not integrated with those of the other regions. There has been no national-scale model of surface water flows. The creation of the NWC provided an opportunity to focus resources on understanding the nation’s water situation at a comprehensive, continental scale. This could help advance the U.S. Open Water Data Initiative (OWDI) (http://acwi.gov/spatial/owdi/). 

Dr. Maidment had an idea of how this new center could leverage that opportunity: by coordinating research that had until then been promising but not yet robust enough for this task. He put out a call to the NWS and to academia to rise to the challenge of integrating continental-scale models of weather, land-surface dynamics, and hydrologic flow. The community met that challenge through the National Flood Interoperability Experiment (NFIE, https://www.cuahsi.org/NFIE) and blew away hurdles that had been thought to be show-stoppers. They are now feeding near-term, high-resolution rainfall forecasts through a single land-surface model framework to produce national forecasts of runoff and streamflow routing. These forecasts estimate the flow of water (in cubic feet or cubic meters per second) through 2.7 million stream segments, averaging 2 miles (3 km) in length, seamlessly across the entire continental U.S. The forecasts are updated every hour of every day. 
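To make those outputs concrete, here is a minimal Python sketch of what a per-reach hourly forecast might look like. The reach ID, flow values, and data layout are hypothetical illustrations, not the NWS's actual output format; the only hard fact the sketch relies on is the unit conversion (1 cubic foot ≈ 0.0283168 cubic meters).

```python
# Illustrative sketch only: the reach ID and flow values below are
# hypothetical, not actual NWS model output.
CFS_TO_CMS = 0.0283168  # 1 cubic foot per second in cubic meters per second

def cfs_to_cms(flow_cfs):
    """Convert streamflow from cubic feet per second to cubic meters per second."""
    return flow_cfs * CFS_TO_CMS

# One way to hold a forecast: reach ID -> list of (forecast hour, flow in cfs)
forecast = {
    1234567: [(0, 350.0), (1, 420.0), (2, 510.0)],  # hypothetical reach
}

for reach_id, series in forecast.items():
    for hour, flow_cfs in series:
        print(reach_id, hour, round(cfs_to_cms(flow_cfs), 2))
```

Scaling this up to 2.7 million reaches, refreshed hourly, is what makes the high-performance-computing side of the story necessary.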

This is valuable and timely information at a scale that is useful at the local level for emergency response planning and mitigation. While some states and sophisticated urban areas have more detailed models to anticipate flooding, this will enable even the most rural areas to be alerted to extreme events much sooner than previously possible. 

There is still work to be done, but the initial capability will become operational for the nation at the NWC during 2016. More than twenty federal, state, and local agencies, academic research centers, and commercial software vendors are cooperating and coordinating to refine and expand on the initial capabilities. One of these efforts is led by Professor Jim Nelson at Brigham Young University (BYU) in Provo, Utah. The BYU team has developed a Python web application framework called Tethys (www.tethysplatform.org), which provides model integration and visualization tools. They have adapted the U.S. NWS approach for use in global applications, using weather forecasts and related models from the European Centre for Medium-Range Weather Forecasts (ECMWF) in Reading, UK. This builds on an international initiative known as the Global Flood Awareness System (GloFAS, http://www.globalfloods.eu/). They are conducting outreach and education to bring this capability to the U.S., Latin America, Europe, New Zealand, and elsewhere. An important aspect of this forecasting workflow is that it is scalable and cloud-based; it does not require costly high-performance computing for smaller watersheds and regions, so it can be implemented by developing countries using their own resources and trained staff. 

Over the next three months, we will provide a summary of what’s been done so far, the underlying challenges and technologies, and what’s next. This story is organized in four parts: 

  1.  U.S. national streamflow and flood forecasting capability
  2.  Computational framework and outreach for global applications
  3.  OGC WaterML (http://www.opengeospatial.org/standards/waterml) and NetCDF (http://www.opengeospatial.org/standards/netcdf)
  4.  OGC Timeseries Conceptual Model and TimeseriesML (http://www.opengeospatial.org/standards/requests/137)


Parts 1 and 2 summarize the data, models, and workflows being developed. Part 3 is about two key OGC data exchange standards that made this possible. Part 4 is about the evolution of WaterML into a broader standard for representing time series in general, not limited to observations of water resources. Stay tuned!
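As a small preview of the kind of data exchange Part 3 will cover, the sketch below parses a simplified, WaterML-like time series using only Python's standard library. The XML here is purely illustrative: the element names and the gauge ID are invented for this example and do not follow the actual WaterML 2.0 schema, which encodes time series as sequences of time-value pairs with richer metadata.

```python
import xml.etree.ElementTree as ET

# Simplified, illustrative XML loosely modeled on the time-value-pair idea
# that WaterML uses; the element names and gauge ID here are hypothetical.
doc = """
<timeseries site="hypothetical-gauge-01" variable="streamflow" unit="m3/s">
  <point><time>2015-06-01T00:00:00Z</time><value>12.4</value></point>
  <point><time>2015-06-01T01:00:00Z</time><value>15.9</value></point>
</timeseries>
"""

root = ET.fromstring(doc)
series = [(p.findtext("time"), float(p.findtext("value")))
          for p in root.findall("point")]
print(root.get("site"), series)
```

The value of a shared standard is that a consumer can write one parser like this and read time series from any conforming producer, whether the data describe a stream gauge in Alabama or a forecast reach in New Zealand.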