In this episode of Beneficial Intelligence, I discuss blaming the humans. A system failure is often attributed to fallible humans. That way, you don't have to admit embarrassing shortcomings in your system.
A recently declassified report showed that a weapons officer blamed for accidentally firing a missile back in the 1980s was actually the victim of a system error. Boeing initially tried to pin the blame for the 737 MAX-8 crashes on pilot error. Last year, Citibank accidentally paid out $900 million instead of the few million they intended. They blamed a bank employee, not the archaic banking system that allowed the error.
If we look only at the last link of an accident chain, we find a human. But behind the human error is a system that created the situation where the human could err. The Harpoon missile system was eventually fixed. The Boeing 737 flight control software was fixed. And Citibank is looking at a long-overdue replacement of their arcane backend systems.
As a CIO or CTO, you need to make sure your organization extracts maximum learning when something goes wrong. Check some of the post-mortem reports from unfortunate incidents. If the error is blamed on a human who should simply have acted differently, the analysis has not reached the root cause.
Beneficial Intelligence is a weekly podcast with stories and pragmatic advice for CIOs, CTOs, and other IT leaders. To get in touch, please contact me at firstname.lastname@example.org