Thank you for your support of DisasterMapping

As we approach 35,000 visits to this blog and the single largest traffic day in the past two years with 1,200+ unique visitors, I wanted to share some thoughts about the direction of the site.

Originally, when I started blogging, I was interested in the application of geographic information systems and mapping to disaster response and emergency management.  Over time, I’ve learned a few key lessons about disaster communication that will guide where we go from here. Nature abhors a vacuum, and good old-fashioned journalism combined with critical thinking provides an opportunity to answer the questions that so many people are asking.

On peak days, most of the traffic to this site is driven not by Twitter or Facebook but by everyday web searches on Google, Yahoo, and Bing.  People are looking for answers to the events they’ve seen unfold in the world around them.

Likewise, when media (mainstream or social) or officials exaggerate or provide erroneous information, people notice.  A great example of this is a tweet I just saw from Jason Prentice: “So, NBC Nightly News leads with Texas tornadoes ‘Out of Nowhere’ while @CBSNews has accurate stat on 26 minute lead time. Who wins?”

In the world of disaster response, people have long memories.  People remember when you mess up. People remember when you didn’t do your homework.  People seek out more trustworthy sources when you’ve proven untrustworthy in the past.  And finally, when you can’t give people the answers they’re looking for, they will search out other sources, even if those sources don’t paint a complete picture.

That brings me back to the purpose of the site. I share thoughts and ideas in order to stimulate discussion.  If that process helps people engage with the world around them and ask questions that help solve disaster management challenges, then I’ve done my job to contribute to the dialogue.

Did you find an article interesting?  If so, I encourage you to share it with the people around you. Start talking about the ideas and thoughts, and ask questions, because it is through that process that we will find the answers to many of our disaster management challenges.

With that said, I want to thank you again for your support and for taking time to read the articles on this site.


Central Plains Blizzard: Snow Amounts Likely to Take Many By Surprise

A Blizzard Warning and a Winter Storm Warning have been issued for parts of the Southern and Central Plains, the second such blizzard in a week for some residents.  The forecast for this event readily illustrates a key concept in disaster preparedness.  Currently, the National Weather Service is forecasting a foot as the upper limit of the snowfall values in Kansas and 15″ as the upper limit in the extreme northeastern part of the Texas Panhandle.  But the highest snowfall totals for this storm could be much, much higher.

Recently, the NOAA Hydrometeorological Prediction Center (HPC) began issuing probabilistic snowfall graphics (shown below) that depict the snowfall amounts expected to be exceeded 90%, 75%, 50%, 25%, and 10% of the time.


HPC 50th Percentile Snow Forecast
(Click for larger image)


HPC 90th Percentile Snow Forecast
(Click for larger image)

Essentially, the 50th percentile snow forecast is the amount that will be exceeded 50% of the time, with the other 50% of outcomes falling below it.  For planning purposes, this is the amount to use if you’re going to play the middle of the road.

The 90th percentile forecast is quite different: it shows the amounts that will be exceeded only 10% of the time.  While everyone wants snowfall forecasts to be accurate, the 10% exceedance value is a great resource for “planning for the worst,” while the 50% value represents “hoping for the best.”
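To make the percentile idea concrete, here is a minimal sketch of how 50th and 90th percentile values can be pulled from a set of forecasts. The snowfall numbers are made up for illustration; they are not actual HPC data.

```python
def percentile(values, q):
    """Percentile with linear interpolation between sorted data points."""
    s = sorted(values)
    idx = (len(s) - 1) * q / 100.0  # fractional position in the sorted list
    lo = int(idx)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (idx - lo)

# Hypothetical ensemble of snowfall forecasts (inches) for one location.
ensemble = [4.0, 5.5, 6.0, 6.5, 7.0, 8.0, 9.0, 11.0, 13.0, 16.0]

p50 = percentile(ensemble, 50)  # half the forecasts are above, half below
p90 = percentile(ensemble, 90)  # only 10% of forecasts exceed this amount

print(f"50th percentile (middle of the road): {p50:.1f} in")
print(f"90th percentile (plan for the worst):  {p90:.1f} in")
```

In this toy ensemble the middle-of-the-road value is 7.5 inches, but the plan-for-the-worst value is over 13 inches, and that kind of spread is exactly what the HPC graphics are designed to convey.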

The wild card in events like this is thundersnow. Areas where thundersnow occurs can receive locally higher snowfall amounts. The snow probabilities point to this possibility, although the bands of intense snowfall will not cover the entire area. Where the most intense bands set up will dramatically affect the amounts of snow those areas receive. This again points to the importance of the 10% exceedance threshold: most people will receive snow amounts closer to the 50th percentile, but there will be pockets where people receive substantially more.


Comparison between the NWS forecast (left) and the HPC 90th percentile / 10% exceedance forecast (right). Current as of 0800 CT on 2/24/2013

You may be saying right now, “That’s great, but I hate math and I hate probabilities.”  Communicating potential risk, especially for low-probability, high-impact events, is critical for anticipating the worst and taking protective action while hoping for the best.  If there were a 10% chance of an intruder in your house coming after you and your family, would you take protective action?  If there were a 1 in 10 chance that you could lose your job, would you start developing a backup plan?

We’ll see how this specific event unfolds, but the current forecast (left in the map above) isn’t even at the levels depicted in the 50th percentile forecast, so it is likely underestimating snow amounts.  Between that and the incredible disparity between the forecast and the 10% exceedance snowfall amounts, this is a classic example of an event that can, and likely will, catch people by surprise.

Surviving the Coming 50% Budget Cuts – Part II

Marc Faber (Link to Business Insider Story)

As reported by the Drudge Report and Business Insider this afternoon, Marc Faber, the investor and economist known for his spot-on assessments of the world economy and author of the Gloom, Boom and Doom report, told CNBC in an interview on Monday that the debt burden in the U.S. and other Western countries will continue to increase, leading to a “colossal mess” within the next five to ten years, and that bureaucracies in the U.S., as well as in Europe, are far too big and are a burden on the economy.

So, what does he propose as the solution?  “My medicine for the U.S. is: Reduce government by minimum 50 percent.  The impact would be immediately an improvement in the economy.”

50%?  Really?  That’s out of touch… that’s extreme… that’s… that’s… just the right answer? Continue reading

Hurricane Storm Surge Scales – What Does the National Hurricane Center Think?

Link: NHC Views on Storm Surge Scales
Released September 10, 2010

There are scales for tornadoes, hurricanes, earthquakes, and other natural hazards.  In the aftermath of Hurricane Isaac, there were numerous calls for the National Hurricane Center to add a storm surge component back into the hurricane scale.  As an August 31st article in the New York Times put it, “Climatologists like Kerry Emanuel of the Massachusetts Institute of Technology have said that any classification should include both wind speed and surge. Otherwise, he argues, coastal residents can be easily misled.”

In 2010, the National Hurricane Center removed the verbiage in the Saffir-Simpson Scale that referred to storm surge in hurricanes.  In a one-page document posted on the National Hurricane Center (NHC) website earlier this week, the NHC Public Affairs staff shared the reasons why storm surge was removed from the scale and elaborated on the decision: Continue reading

Lessons from History… If “You Didn’t Build That” Then Who Did?

“Now, we can look abroad and see large cities, handsome villages, fine fields, and rich gardens. We see good, smooth roads, strong bridges, and well finished houses.”

One of the greatest challenges of humanity is the personal and collective failure to learn from history.  When we experience a natural disaster or calamity, we often say, “Wow, this is the worst event since…” or “I’ve never seen anything like this…”  When we say such things, however, we join in that failure to know and remember history, and to look at both the good and the bad. Continue reading

Can Big Data Help Streamline Government?

Yesterday, I read a good article about #BigData from Tibco entitled “The 4 Biggest Problems with Big Data.”  Its four main points are critical for business growth and profitability, as well as for addressing key gaps in marketing and outreach capabilities.  These are great concepts for businesses whose profit-and-loss numbers determine whether or not they survive.  But when I look at the incredible amount of information, regulations, laws, and documents, as well as the diversity of programs that government creates and runs at the federal, state, and local levels, I wonder why more people aren’t preaching the message of Big Data as a means to streamline government. Continue reading

The Costs of Data Quality Failure

The first response I hear when I start to talk about “Big Data” and information sharing is that we don’t have enough people, time, or resources to slow down, look through all of that information, and make sense of it.  It’s just too daunting a task.

As I look more and more into “Big Data” and the role of understanding what is going on within billions of records, I have come to realize that we don’t have time NOT to engage in this discussion.  This point was driven home extremely well in a recent blog post entitled “The Costs of Data Quality Failure.”  The following statement from that post should be a wake-up call to everyone who thinks this isn’t important.

“A recent report from Artemis Ventures indicated that poor data quality costs the United States economy roughly $3.1 trillion per year. To provide some perspective on this unimaginably large figure, that’s twice the size of the US Federal deficit. An estimate from the US Insurance Data Management Association puts the cost of poor quality data at 15% to 20% of corporations’ operating revenue.”

Can you live with your organization, agency, or company losing 15-20% every year? Put another way, how effective are you really when faced with 15-20% cuts each and every year? (You might want to look at a post from November entitled “Surviving the Coming 50 Percent Budget Cuts.”)
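A quick back-of-the-envelope calculation, using illustrative numbers only, shows how fast repeated cuts compound:

```python
budget = 100.0  # index the starting budget at 100
for year in range(1, 6):
    budget *= 1 - 0.15  # a 15% cut, repeated every year
    print(f"Year {year}: {budget:.1f}% of the original budget remains")
```

After five consecutive 15% cuts, less than 45% of the original budget remains; at 20% per year, the picture is even bleaker.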

You can choose to ignore the issue, but it is like any other financial or systemic problem: the longer it goes unaddressed, the more challenging the fix becomes.

Operational Context – Extreme Rainfall

Today’s post is part 3 of a series on Operational Context (view the other posts in the series here).  In this third installment, we will take the same questions we’ve been asking throughout the series and apply them to rainfall amounts, a topic tied very closely to the last post on drought.  Many of the same datasets serve both extreme rainfall and drought analysis; with drought, you’re simply looking for an absence of rainfall.  This will also tie into the next post, Operational Context – Flooding, which will look at extreme rainfall events and their impact as the water runs off.  So, let’s look at today’s theme: extreme rainfall.

There are several ways to measure or estimate rainfall.  Most people know about rain gages and how to use them to measure rainfall (see examples in the post on Innovative Ways to Teach the 3 M’s – Math, Maps, Measurement).  Rain gages are great for measuring what fell in a specific location, but there will always be gaps in coverage.  Rainfall can be enhanced or reduced by terrain or other geographic features, so point data by itself isn’t enough.  There are rain gages at major airports, and there are companies with portable weather stations that can measure rainfall.  Additionally, did you know that you can participate in rainfall measurement directly through CoCoRaHS (the Community Collaborative Rain, Hail, & Snow Network)? Continue reading

Operational Context – Drought

Day 1 of this series on Operational Context covered earthquakes.  The response to the first post of this multi-part series was incredible: over 2,000 views of yesterday’s post on earthquakes.  I have only been blogging for a few months, so I’m not used to this level of traffic yet, and I was completely amazed at the response you’ve given this series.  I’m used to far fewer visitors each day, but then all of a sudden this “extreme event” occurred.  Time will tell whether this becomes a pattern, but it is my hope that you have connected, and will continue to connect, with the topics discussed here.  Regardless, thank you for taking time out of your day to learn and share your feedback on my posts.

I share these thoughts on yesterday’s blog activity because it mirrors the point of this series as well as today’s topic.  There are times when we go a long period without major disasters, and other times when we experience event after event.  In my home state, we had a record-setting earthquake and a major hurricane within a few days of one another.  Several years ago, we had back-to-back major snowstorms dumping feet of snow on many parts of the state.  And there are other periods when you may go several years without a major event.

History doesn’t always teach us lessons in “neat packages,” with back-to-back events that let us learn and immediately apply those lessons to the next disaster.  This is especially true for hazards that rear their ugly head only a few times each generation.  Continue reading

Operational Context – Earthquakes

When it comes to being aware of and ready for major earthquakes, you might think of the San Andreas Fault, or places like Japan, Chile, China, Mexico, or Indonesia.

However, in the past year there have been two extremely significant earthquakes in the United States that were “outliers” compared with previous events.  Both were felt over large areas and measured above 5.5 on the Richter Scale.

I just recently discovered an incredible post by the US Geological Survey (USGS) on the Oklahoma earthquake.  In that post, the following image paints a very clear picture: Continue reading