The Case for Evidence-Based Disaster Technology Response

“The plural of ‘anecdote’ is not ‘data’.”

A disaster happens somewhere in the world. Disaster technologists and digital humanitarians mobilize. Maps are crowdsourced, satellite dishes and networks are deployed, UAVs are flown, apps are hacked in marathon sessions, social media is mined. All of this happens incredibly rapidly because of the army of passionate individuals and organizations that fundamentally believe the rapid flow of information helps save lives and speed recovery for affected communities.

Eventually, as things move from response to recovery, most of these individuals and organizations will document what they did and the lessons learned for the next time around. Pictures will be shared on social media and in press releases. The community of technology humanitarians will prepare for the next event… knowing there is always a next event.

But something fundamental is missing from our community: how do we actually know we made a difference in the outcome of the response?

Many of us (including myself!) have plenty of anecdotes and war stories about how something we did made a difference, but anecdotes are just that. Let’s face it… I make my living working in this space. Of course I want to believe that my work, my time away from my family, all those long hours in hardship conditions actually make a difference. Organizations want to tell the most positive story about their efforts, to ensure future funding, volunteers, and missions.

Loading a helicopter with tech in Vanuatu (Cyclone Pam, 2015) – let’s use evidence to know how to best use our capabilities and skill.  This photo is not necessarily evidence of an effective response, as much as I might hope it is. 🙂

It’s called “confirmation bias.” As humans, we naturally seek out information that supports our beliefs and positions, and tend to avoid information that would call those beliefs and positions into question – or discredit them entirely!  The photo above is an example… it’s certainly visually compelling, and I’d like to think that good things came from being out in that rain that day, but we need data to really measure effectiveness!

Our community of emergency techies is relatively small, and the whole multi-discipline sphere is relatively young. By way of analogy, this reminds me of where emergency medicine was in its first few decades. During the dawn of the Emergency Medical Services (EMS) field, most treatments were based on assumptions (“backboard all suspected C-spine patients”). Eventually, evidence-based medicine came into the field, with the results that we don’t backboard anywhere near as often as we used to, that we know how incredibly important chest compressions are in CPR, that high-flow oxygen can actually harm a heart attack patient, and that therapeutic hypothermia for a cardiac arrest patient actually makes sense.

So, while anecdotes make for great memories, they aren’t terribly useful for the long-term evolution of our field. Just as in EMS and many other emergency disciplines, a collection of stories cannot be considered evidence of efficacy, because of confirmation bias. One positive story about disaster technology response and one negative story simply cancel each other out.

We need to move beyond anecdotes and move towards true evidence-based disaster technology response.

So what does that look like?

What technology interventions actually help the situation on the ground? How will we know this?

What technology interventions don’t help, and should be avoided? How will we know this?

While the field demands innovation and exploration, how do we ensure that we aren’t just enamored with our own technology at the expense of other meaningful activities that support disaster response?

We have to get beyond the feels. Many digital volunteers are moved by crisis to “do something.” But the goal of “doing something” should not be the social reinforcement that a volunteer or disaster worker gets on their Facebook page. It should not be to demonstrate some whiz-bang technology in crisis. After all, a fancy map that isn’t used by anybody to make a decision is merely a graphical representation of disaster trivia. A UAV that gathers video that isn’t used by anybody to make a decision is just a pretty YouTube video. In both cases, the intention and the tech are great, but the outcome is lousy.

If these other things happen along the way, so much the better. But our first duty should be to focus on the outcome… verifiable, evidence-based outcomes.

I think there’s a strong role for academia in this effort.

We should look to the area of crisis informatics to help inform the practitioner about these questions of efficacy. Researchers can help us develop the metrics, engage peer review, and help move us to that next level that our increasing workload and set of expectations demands. We in turn can influence the research questions and work to connect the results of academia to the work being done by practitioners.

By driving towards evidence-based operations, we are helping to mature our work and minimize wasted effort and cost. But most importantly, we enshrine the beneficiary of our work at the absolute center of the digital humanitarian universe. The disaster victims and responders who need information to make smarter, better decisions for themselves deserve nothing less.

The move towards evidence-based response will take all of us … let me know your thoughts in the comments.


6 thoughts on “The Case for Evidence-Based Disaster Technology Response”

  1. Rakesh – very well stated. There are many notional indications of the value that various tech help provides, but not much in terms of evidence-based assessments. And there are probably tech-related deliverables that are not being used. It would be helpful to know – for all involved. Would be great to have an academic study of the value provided during any specific response. A study after the response, using interviews perhaps of those who produced tech support and those who reportedly used that particular product, whether it was a situation report, a map, a cell tower, a server, or translation support.

    1. I’d love to know whether Humanity Road is engaging with others in looking at this area. I think we need to absolutely care about whether all of this work matters to victims or responders. Otherwise, it’s just a lot of effort for little or no return.

  2. Well said Rakesh. I believe in this a lot, and it is more the focus of my research. There are many people working on analytics and the latest and greatest tech, which is a form of innovation that is needed. But we must understand the relative usefulness of these advances or risk aimless innovation.

    1. One of the other challenges we need to square away is how to get the work done by academia back into the community of people doing the work. When I was at NSF a few months back, it was clear to me that the academic community values the act of publishing – often in journals I’ve never heard of, at conferences I don’t go to, etc. There is little or no current reward to getting this info back into the hands of people who might actually make some use of the results. This is just how it currently is (no slam against academia) – but in order for this to really work, everyone has to make sure we close the loop.

      For me, the fundamental question is personal: How will I know if I am doing a good job and moving the needle?

      It can only be truly answered by being focused on outcomes.

      1. Thanks for a fantastic summary of a necessary discussion. Your comment takes me back to one of my grad school classes where the professor took the opportunity provided by the mid-term to give an impassioned, semi-spontaneous talk on folks needing to get out of academic studies and into the field to implement interventions. He taught one class a year, spent most of the rest of his time in the field doing humanitarian work, and argued that when it came to basic refugee work (it was a course focused on water and refugee issues), folks seemed to prefer studying their studies to actually doing.

  3. I guess I’m a little late to join these comments, but anyway: I think we need convergence. Both in theory, but also from a pragmatic point of view. You need to get what you know out there, but it wouldn’t hurt to know who can do what, and whether or not they’re around to do it and deliver results. And we need to be able to do this even if we did not know each other before, and did not share any common goal (or friend/relation) but the wish to remedy the situation at that particular point in space and time. And that should be enough to give us the possibility to team up on that specific subject. And maybe save more lives, and/or get better results than when we all do our thing independently. And that’s something that we’ve got to build. Not something that will gradually come into existence …
    At least, that’s what my opinion on the matter is …
    Greetings from Belgium …
