Human error vs. technical difficulties

Opinion by Sarah Myers
Jan. 19, 2018, 3:00 a.m.

Recently, an Amtrak train traveling a new route from Seattle to Portland derailed, killing three people. So far, it’s been reported that the train was going 80 mph in a 30-mph zone. Trains don’t get speeding tickets, but there are supposed to be safeguards in place to regulate their speeds. These safeguards can be run by people or by computers. Computer-run systems can step in automatically and slow a speeding train; systems run by humans cannot. A computer speed control system had been installed along this particular section of track, but unfortunately it had not yet been activated.
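To make that distinction concrete, here is a minimal sketch, written in Python purely for illustration. It is not Amtrak’s actual control logic; the function name and return strings are invented, and only the 80 mph speed and 30-mph limit come from the reporting above.

```python
# Illustrative only: the difference between an automatic speed-control
# system that is active and one that is installed but not switched on.

SPEED_LIMIT_MPH = 30  # the posted limit reported for this stretch of track


def enforce_speed(current_speed_mph: float, system_active: bool) -> str:
    """Return the action an automatic speed-control system would take."""
    if not system_active:
        # The derailment scenario: hardware installed but not yet
        # activated, so no intervention occurs at any speed.
        return "no intervention"
    if current_speed_mph > SPEED_LIMIT_MPH:
        return "apply brakes automatically"
    return "no action needed"


print(enforce_speed(80, system_active=False))  # no intervention
print(enforce_speed(80, system_active=True))   # apply brakes automatically
```

The point of the sketch is the first branch: an automatic system that hasn’t been activated behaves exactly like no system at all.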

This is a good example of the limits of technology meant to keep people safe. As the world becomes more connected, safeguards meant to protect people from tech (and vice versa) become more and more commonplace. Unfortunately, these safeguards aren’t necessarily becoming more effective.

Amtrak knows that computer-run systems are safer; it’s why a computer-run system had just been installed along the section of track where the crash took place. According to Amtrak, the system simply wasn’t yet operational. That raises an awkward question: Why didn’t the company wait to open the route until the speed control system was working? Admittedly, some older routes still use human-run systems, so doing without the computer isn’t completely crazy, but opening a new route with a system known to be less safe still seems unnecessarily reckless.

Of course, there’s also the awkward question of how a fully trained Amtrak employee, accompanied by a second employee (who was meant to be getting firsthand experience as part of their own training), could possibly think that going 80 mph in a zone marked for 30 mph was a good idea. Two human errors in judgment combined to create a tragedy.

Electronic medical record (EMR) systems demonstrate this problem depressingly well. EMR systems are meant to collect and store patients’ medical information, including past illnesses, past and current medications, test results, other doctors the patient has seen and the patient’s family’s medical history. This information can be vital to ensuring that patients aren’t taking medications that conflict with one another and that they receive the best possible care. To keep all that important information safe, EMR systems are designed with an extraordinary number of warnings. Changing a patient’s address requires clicking “okay” on at least two warning messages. Adding medications, past illnesses or test results requires clicking through multiple warning messages and entering one’s login credentials.
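A hypothetical sketch of that warning-heavy flow looks something like the Python below. The function and field names are invented for illustration and aren’t taken from any real EMR product; what the sketch shows is how stacking confirmations onto a routine edit makes the “okay” click automatic.

```python
# Invented example: layered confirmation dialogs on a routine record edit.

def confirm(message: str, user_reads_warnings: bool) -> bool:
    """Simulate a warning dialog; a fatigued user approves without reading."""
    if not user_reads_warnings:
        return True  # the mindless "okay" click
    answer = input(f"WARNING: {message} Proceed? (y/n) ")
    return answer.strip().lower() == "y"


def change_address(record: dict, new_address: str,
                   user_reads_warnings: bool = False) -> None:
    # Even a routine address change is gated by at least two warnings.
    if (confirm("You are about to change this patient's address.",
                user_reads_warnings)
            and confirm("The existing address will be overwritten.",
                        user_reads_warnings)):
        record["address"] = new_address


patient = {"name": "Jane Doe", "address": "1 Old Rd"}
change_address(patient, "2 New Ave")  # both warnings dismissed unread
print(patient["address"])             # 2 New Ave
```

Notice that the warnings changed nothing: the edit goes through exactly as if the dialogs weren’t there, because the user has learned to treat them as obstacles rather than information.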

As someone who has used EMR systems, I have a lot of experience with clicking “okay” on a warning message without reading or noticing the message. My experience isn’t necessarily universal; I’ve discussed it with doctors and nurses, but haven’t found a full-scale study confirming that a large portion of medical professionals routinely ignore error warnings. That being said, I’m willing to semi-confidently predict that this is a common problem.

Mindlessly clicking through error warnings is more time-efficient but ultimately reckless. Some warnings contain important information, such as an alert that the data just entered is illogical, or that the user has just deleted information they might not have meant to delete.

Sometimes, safety features fail. More often, humans fail to enact safety measures. It’s time to fix that, starting with Amtrak.


Contact Sarah Myers at smyers3 ‘at’ stanford.edu.

Sarah Myers '21 is pursuing a BA in International Relations while also studying Physics, Mandarin, and German. She enjoys writing about politics, ethics, and current events. She spends her free time reading and convincing herself that watching Chinese television counts as studying Mandarin.
