In Swedish we often refer to den mänskliga faktorn ("the human factor") when a user of a system messes up and causes an accident, even though the system was working as intended. For instance, a pilot causing a crash, a nurse using medical equipment incorrectly, or a bus driver causing an accident.
When reading about such events, it is easy for an engineer to think:
"Ah, stupid user. The system worked as intended and then this clumsy human screws up."
The designers of the system will most likely feel relieved when the human factor is mentioned.
"Thank god, it was not our design. It was the user."
It is easy to blame a person when something goes wrong. And of course people make mistakes once in a while, especially in stressful situations. We all know this. Engineers know this too. That is why we should design with it in mind.
Many years ago, in 1991, there was a plane crash in Gottröra in Sweden. Shortly after take-off, both engines stopped working. The pilots tried to recover the situation but were hindered by an automatic system they had never used before. Warning lamps were lit all over the place. Not one or two, but almost all of them.
The pilots, with the help of another pilot who happened to be a passenger, did the best they could given the situation. They ignored the system, ignored the rules, and crash-landed safely in a field. All passengers survived.
At times, the human factor saves the day.
Photo: SCANPIX/Leif R Jansson
As engineers, as designers, we should design with the most complex component of the system in mind: the human. We should make the best possible use of them, instead of using them as a scapegoat when things go wrong.