Where were you when the lights went out?

Olga Van Broeck

Senior UX Consultant

We all remember what we were doing when two planes hit the Twin Towers or when a tsunami washed out vast parts of Southeast Asia. You can bet your best boots that the 1.4 million people in Hawaii last January 13th will remember that day for the rest of their lives. At 8:07 AM, all residents and tourists received a mobile alert that a ballistic missile was on its way. And it was no drill. Repeat. No drill.

No wait, it was a drill after all. The first task of the day for a staff member of the Hawaii Emergency Management Agency (HIEMA) was to test the internal missile alert system. But instead of staying internal, the missile alert was sent out to all cell phones on Hawaiian territory, causing widespread panic. Not until 38 minutes after the warning went out did the state issue a correction: the alarm was false, there was no missile on the way, please finish your fruit juice.

 

The false alarm that was sent out on Saturday, January 13th.

 

Don’t shoot the operator

The poor operator was said to be very sorry for his mistake, but the damage was done. The staff member was temporarily reassigned and crowned scapegoat king in worldwide media coverage. After the biggest mayhem had died down, however, two mock screenshots of the employee's screen emerged. It doesn't take Hercule Poirot to figure out how and why things went so terribly wrong.

On both screenshots, the user interface appears to be nothing more than a list of unclearly labeled hyperlinks. There is no clear distinction between the test and the real thing, or between a missile alert and a landslide. Both images include the False Alarm BMD option, which would send a correction in case an alert was mistakenly issued. This option, however, was not added until after the false alarm debacle of January 13th.

 

The first facsimile screenshot provided by HIEMA on Monday, January 15th

 

The second facsimile screenshot provided by HIEMA on Tuesday, January 16th. This screen should be a more accurate representation of the employee's screen.

 

Smells like covfefe

As UX professionals, we know that every user-unfriendly cloud has a silver lining. The Hawaii missile alert incident and the direct impact of its bad UI will become the textbook example of ‘how not to…’. It will be mentioned in many UI and UX books to come. But because writing books takes time and we would like HIEMA to start redesigning that interface sooner rather than later, we provide a shortlist of suggestions here:

1. It is key to use clear labels, in plain English and without acronyms if possible. A good label is clear, short and descriptive. Every label should contain a verb and an object. Make sure to avoid all caps in labels: all caps reduce the readability and the comprehension of the text.

In this specific case, you should make a clear distinction between test commands and real-life commands (but if the labels are well written, they will get the job done).

2. Make sure that an important or irreversible command takes more than one click to execute. After the initial command is given, users should get a meaningful alert warning them about a potentially hazardous consequence. The message should, yet again, be clear and descriptive. People tend not to read short yes/no question-and-answer messages. A better solution is to ask the user a question that recaps the command they have just initiated.

3. If possible, provide an option to recall or amend the executed action in case a mistake has been made. This option might be best placed at the end of a section: if the labeling has been done right, you should never need it. But if something does go wrong, the option needs to be easy to find. So, easy but not too easy.
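The three suggestions above can be sketched in code. The following Python snippet is a minimal, hypothetical illustration (the class and method names are our own invention, not part of HIEMA's actual system): it recaps the exact command instead of showing a bare yes/no prompt, requires the operator to re-type the label to confirm, and offers a recall option for when things go wrong anyway.

```python
# Hypothetical sketch of a safer alert flow. All names are illustrative;
# this is not a representation of HIEMA's real software.

class AlertConsole:
    def __init__(self):
        self.sent_alerts = []

    def request_alert(self, label):
        # Suggestion 2: recap the command the operator just initiated,
        # rather than asking a generic yes/no question.
        return (f"You are about to send a real alert: '{label}'. "
                f"Type the label to confirm.")

    def confirm_alert(self, label, typed_confirmation):
        # Re-typing the label forces the operator to read it;
        # a mismatch aborts without sending anything.
        if typed_confirmation != label:
            return "Confirmation mismatch: nothing was sent."
        self.sent_alerts.append(label)
        return f"Alert sent: {label}"

    def recall_alert(self, label):
        # Suggestion 3: a corrective action that issues a
        # follow-up false-alarm message for a sent alert.
        if label in self.sent_alerts:
            return f"False alarm correction issued for: {label}"
        return "No such alert to recall."
```

The re-typing step is deliberately heavier than a single "OK" button: for an irreversible, state-wide broadcast, a little friction is a feature, not a bug.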

 

That’s the way we do it

Keeping those things in mind, our version of the interface would look like this: 

 

 

I think it is safe to say that if the same employee had used the interface above, the chances of picking the wrong option would have been a lot slimmer. And if he had made the wrong choice, a clear alert would have appeared:

 

 

And even if the employee had been persistent and the alert had been sent, a clear corrective action would have become available.

As Don Norman states in his Design of Everyday Things: “Human error usually is a result of poor design”. So think about how your interfaces and software impact the daily lives of real people. Use your little grey cells, mes amis; think first, design later.

 
