THIS IS NOT A DRILL: Don't blame the button presser, blame the button designer

On Saturday morning, people in Hawaii faced a false but terrifying nuclear threat after an employee at the Hawaii Emergency Management Agency erroneously sent a public emergency alert reading “BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL” to smartphones across the state. The mistake caused widespread panic and fear during the 38 minutes it took Hawaii authorities to correct the error.

How could such a massive error have happened? According to the Washington Post, the unidentified employee saw two options in a drop-down menu on a computer program — “Test missile alert” and “Missile alert” — and incorrectly picked the latter, the real-life option, which triggered alerts on the public’s smartphones and television screens.
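None of the reporting describes the agency’s software internals, so the sketch below is purely illustrative. It assumes a hypothetical send_alert function and shows one form the missing safeguard could take: a live dispatch demands an explicit typed confirmation that a test does not, so a single wrong pick in a drop-down menu can never, on its own, push a statewide warning.

```python
from enum import Enum
from typing import Optional


class AlertMode(Enum):
    TEST = "test"  # internal drill: nothing is broadcast to the public
    LIVE = "live"  # real warning pushed to phones and TV


def send_alert(message: str, mode: AlertMode, confirmation: Optional[str] = None) -> str:
    """Dispatch an emergency alert (hypothetical safeguard sketch).

    LIVE alerts require the operator to retype an exact confirmation
    phrase, so confirming a real broadcast feels nothing like running
    a routine test.
    """
    if mode is AlertMode.TEST:
        return f"[TEST ONLY - not broadcast] {message}"

    expected = "SEND LIVE ALERT"
    if confirmation != expected:
        raise PermissionError(
            f"Live alert blocked: operator must type '{expected}' to confirm."
        )
    return f"[BROADCAST TO PUBLIC] {message}"


if __name__ == "__main__":
    msg = "BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER."

    # A drill runs without friction.
    print(send_alert(msg, AlertMode.TEST))

    # A live broadcast without the typed phrase is refused.
    try:
        send_alert(msg, AlertMode.LIVE)
    except PermissionError as err:
        print(err)
```

The particular mechanism matters less than the principle: the step that confirms a destructive action should look and feel different from the step that confirms a harmless one.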

Design failure, not human error, to blame

The employee responsible for the push alert failure has reportedly been reassigned but not fired. Instead of blaming the person who pushed the wrong button, federal regulators blamed the state system that allowed one person to have so much power. “Based on the information we have collected so far, it appears that the government of Hawaii did not have reasonable safeguards or process controls in place to prevent the transmission of a false alert,” FCC Chairman Ajit Pai said in a statement on Sunday.

User interface expert Don Norman blamed the error on the “incompetent design” of the alert system that allowed it to occur.

In 2003, Norman wrote an oft-cited essay on how system failures within organizations too often get attributed simply to human error, which lets the underlying design flaws go unexamined and the errors recur.

“If we assume that the people who use technology are stupid…then we will continue to design poorly conceived equipment, procedures, and software, thus leading to more and more accidents, all of which can be blamed upon the hapless users rather than the root cause — ill-conceived software, ill-conceived procedural requirements, ill-conceived business practices, and ill-conceived design in general,” Norman wrote. “It is far too easy to blame people when systems fail.”

Under this logic, yes, humans can cause errors, but far more attention and responsibility should fall on the processes that made such an error possible in the first place.

When a contractor can disable the president’s account

Norman’s philosophy applies to another recent incident in which one worker’s mistake had an outsized impact. In November, social media company Twitter blamed a contractor, on his last day of work, for disabling President Trump’s Twitter account for 11 minutes.

In a later interview, the contractor, Bahtiyar Duysak, explained that his action was a “mistake” because “he never thought the account would actually get deactivated,” according to TechCrunch. The New York Times reported that Twitter employees had long expressed concerns that high-level accounts like those belonging to the U.S. president were too easily accessible to hundreds of the company’s workers. Because the procedure allowed contractors like Duysak to reach those accounts, a system failure was that much more likely to occur.
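Neither report describes Twitter’s internal tooling, so the following is only a hypothetical sketch of the kind of safeguard the employees were asking for: a “two-person rule” in which ordinary accounts can be handled by a single operator, while accounts flagged as high impact require a second, distinct approver before anything destructive happens.

```python
from typing import Optional

# Hypothetical set of accounts flagged as high impact.
PROTECTED_ACCOUNTS = {"@POTUS", "@realDonaldTrump"}


class ProtectedAccountError(Exception):
    """Raised when a high-impact action lacks the required second approval."""


def deactivate_account(handle: str, operator: str, approver: Optional[str] = None) -> str:
    """Deactivate an account, enforcing a two-person rule for protected handles."""
    if handle in PROTECTED_ACCOUNTS and (approver is None or approver == operator):
        raise ProtectedAccountError(
            f"{handle} is protected: a second, distinct approver is required."
        )
    suffix = f" (approved by {approver})" if approver else ""
    return f"{handle} deactivated by {operator}{suffix}"


if __name__ == "__main__":
    # Routine case: one operator is enough.
    print(deactivate_account("@spam_bot_123", operator="support_agent"))

    # A protected account with no second approver is blocked.
    try:
        deactivate_account("@realDonaldTrump", operator="support_agent")
    except ProtectedAccountError as err:
        print(err)

    # With a distinct approver, the action goes through.
    print(deactivate_account("@realDonaldTrump", operator="support_agent",
                             approver="duty_manager"))
```

The point is the same as in the Hawaii case: no single person should be able to complete a high-impact action alone.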

The nuclear threat triggered by a dropped socket wrench

This is not the first time a nuclear close call has been caused by a human error that a flawed system made possible. In 1980, two airmen were performing maintenance on a U.S. Air Force Titan II ballistic missile in Arkansas when a socket fell from a wrench and dropped down the shaft of the missile silo. The socket pierced a fuel tank, and the leaking fuel later exploded, throwing the warhead out of the silo and killing one airman.

If the warhead had actually detonated, the maintenance mishap could have rendered part of Arkansas uninhabitable and cost countless lives. No safeguard existed for such a fluke event. As one of the technicians in the missile’s control room put it, an accident like this “wasn’t on the checklist.”

“If the system worked properly, someone dropping a tool couldn’t send a nuclear warhead into a field,” Eric Schlosser, who wrote a book on the Air Force incident, said in a PBS documentary.

These examples show that blaming a person is easy when an error makes headlines and causes panic. What matters more, designers like Norman argue, is fixing the underlying problem so that the same mistakes and fluke events cannot happen again.