Data Is Integral To Understanding Hurricane Harvey

9/2/2017 | SAMANTHA FOX


Samantha Fox | Senior Product and Data Analyst

Houston has flooded. Trillions of gallons of water have fallen on the city. Rivers are gushing, freeways have become massive concrete canals, and areas once believed to be safe from flooding are under feet of water. Dams and reservoirs are trying to keep up with the overwhelming volume, but there is simply nowhere for the water to go. This is the worst natural disaster the region has experienced in modern times.

But how do we know this is the worst natural disaster the region has experienced in modern times? Terminology like “500-year flood” is appropriate to convey the gravity of this catastrophic event, but what does that presumed baseline actually mean? A 500-year flood is simply a statement of probability: an event with a 0.2% chance of occurring in any given year. Such events can and do occur more often than once every 500 years. In fact, Houston has endured three major floods in just the last three years: the Memorial Day floods of 2015, the Tax Day floods of 2016, and now Hurricane Harvey.
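The arithmetic behind that 0.2% figure is worth spelling out. A quick sketch (illustrative only, assuming each year is independent) shows why "500-year" events keep showing up within a single lifetime:

```python
# The "500-year flood" label means a 0.2% annual exceedance probability,
# not a once-every-500-years schedule.
ANNUAL_PROB = 0.002  # 1-in-500 chance in any given year

def prob_at_least_one(years: int, p: float = ANNUAL_PROB) -> float:
    """Chance of at least one such flood over a span of years,
    assuming independent years."""
    return 1 - (1 - p) ** years

print(f"30 years:  {prob_at_least_one(30):.1%}")   # roughly 5.8%
print(f"100 years: {prob_at_least_one(100):.1%}")  # roughly 18%
```

Over a 30-year mortgage, the odds of seeing a "500-year" flood are already close to 6 percent, and that is for a single independent event type; Houston's three floods in three years suggest the underlying probabilities themselves may need revisiting.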

Emergency response efforts, news reports, and hydrologic predictions rely on an elaborate system of data monitoring networks to understand flows across a flooded region. Federal agencies come together with state and local forces to measure water levels and understand how the system will change over time. When disaster strikes, emergency monitoring networks are set up to inform on-the-ground decision making. Unfortunately, we only have visibility into data up to a point. Our standard data monitoring networks are ill-prepared for extreme circumstances, even though this is when their numbers are needed most.

At Water Sage, we have assessed the available data to make two distinct observations:

1. At the time of the hurricane, river gages reported discharge levels thousands of percent higher than normal.

2. Many river gages have gone offline, meaning they have been inundated and overwhelmed, creating gaps in the real-time information available.


In the figure above, we can see where measurements have exceeded historical averages by orders of magnitude. The size of the dot at each gage location signifies the percent increase over the 12-year average. Compared to their average levels since 2005, some gages are reading as much as 28,000 percent higher.
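To make that figure concrete: a reading 28,000 percent above average is about 281 times the average flow. A minimal sketch of the calculation, with hypothetical numbers (the specific gage values here are invented for illustration):

```python
def percent_increase(current: float, average: float) -> float:
    """Percent change of a current reading relative to a long-term average."""
    return (current - average) / average * 100.0

# A hypothetical gage averaging 10 cfs would need to read 2,810 cfs
# to show a 28,000% increase -- i.e., 281x its normal flow.
print(percent_increase(2810, 10))  # 28000.0
```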

The US Army Corps of Engineers (USACE) released water from the Addicks and Barker Reservoirs before the storm to make room for incoming water, since these reservoirs exist for flood control. Due to the heavy rainfall, however, the reservoirs refilled in just days.


At the Whiteoak Bayou at Main St, Houston gage, we can see a dramatic increase in gage height. While the historic average measurement is just half a foot, in a matter of days readings climbed to more than 30 feet. Seeing this data illustrates the gravity and magnitude of the flooding event.


When data is unavailable, tracking these extraordinary events becomes much harder. Above, we can see the individual gages that have gone offline since Hurricane Harvey made landfall. Fewer gages survived the flooding and produced continuous data than went offline.
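The kind of tally behind that map is straightforward once gage statuses are in hand. A sketch with made-up gage names and statuses (everything here is hypothetical, standing in for whatever status feed a monitoring network exposes):

```python
from collections import Counter

# Hypothetical snapshot of a small network's reporting status.
gage_status = {
    "Gage A": "online",
    "Gage B": "offline",
    "Gage C": "offline",
    "Gage D": "online",
    "Gage E": "offline",
}

counts = Counter(gage_status.values())
print(f"{counts['offline']} offline vs {counts['online']} online")
```

In Houston's case, as in this toy snapshot, the offline count exceeded the online count.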


The gage shown above, north of Addicks Reservoir, shows flood conditions prior to the releases, but the gage was rendered unable to properly measure during heavy rainfall and backflow, making it difficult to fully assess conditions on the ground. Since Addicks reached capacity even after USACE releases, the area north of the reservoir likely flooded – but the gage that would measure that has gone radio silent.


Addicks Reservoir itself also experienced a reduction in data. Outflow dramatically increased to over 4,000 cfs over a couple of days, but backflow conditions soon caused the gage to go offline.
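For a sense of scale, a back-of-the-envelope conversion shows what a sustained 4,000 cfs release amounts to per day (using the standard definition of 1 acre-foot = 43,560 cubic feet):

```python
SECONDS_PER_DAY = 86_400
CUBIC_FEET_PER_ACRE_FOOT = 43_560

def cfs_to_acre_feet_per_day(cfs: float) -> float:
    """Convert a steady flow in cubic feet per second to acre-feet per day."""
    return cfs * SECONDS_PER_DAY / CUBIC_FEET_PER_ACRE_FOOT

print(round(cfs_to_acre_feet_per_day(4000)))  # ~7,934 acre-feet per day
```

Roughly 8,000 acre-feet of water per day was leaving the reservoir when the gage went dark, a reminder of how much volume these measurements are tracking.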

Why does data like this matter? There are crisis conditions on the ground, and the first responders and victims of the flooding know the magnitude firsthand; they don't need these numbers today. But as these events unfold, federal, state, and local agencies do, as do the media producing reports for public consumption. Assessments in the aftermath of Harvey, and planning for future events, will require reliable, complete data. These data gaps only handicap that planning.

It's obviously not as simple as building more gages: when thousands or millions of tons of water come crashing against this infrastructure, it is going to break. As we contemplate the massive investments in new infrastructure that will be required over the next decade, we can't forget the importance and value of monitoring technology. Can we deploy other technologies to augment our monitoring network and ensure data is available when these historic events happen? Imagine a world where our technology has caught up with our reality, and we can count on sensors to get help where, and when, it's needed most.