The opening keynote at Velocity Conference Europe 2014 was all about how human error is too often the easiest way to explain away a failure, and how a different approach is needed. The speaker, Steven Shorrock, is the European Safety Culture Programme Leader at Eurocontrol, the body that covers almost all aspects of air traffic management over Europe.
Steven, a bit tongue in cheek, defines human error as “someone did something that they weren’t supposed to do, according to someone”. More seriously, what we usually call a “human error” has some specific characteristics: it points to individuals in the context of a complex system; it stigmatizes and scapegoats; and it singles out abilities that are vital for success (e.g. the capacity to adapt and learn), abilities that should not be suppressed. Human error tends to be an after-the-fact social judgment: outcome and hindsight have a great influence on our thinking. Steven also noted how the English language has many words for error, but few for success. Humans naturally focus on the bad events.
The fact is that humans are just humans. Steven highlighted the well-known fact that people’s working memory can only hold 7 +/- 2 items at any given moment. If we combine this “limitation” with the fact that individuals work in ever more complex systems, it becomes clear that simply attributing errors to humans is not enough to improve matters.
The purpose of any service is to meet some kind of demand, not to prevent failures; this holds even for safety-critical services, so simplistic approaches like putting the entire burden on individuals are not enough. Steven proposes that a system needs to be studied in the normal context of work, together with field experts. How does the system meet the demand that justifies its existence, and how does pressure affect it (more pressure leads to more errors)? Any service requires certain resources (e.g. people with certain skills) that can be constrained by any number of factors (e.g. is the necessary expertise available?). Most importantly, how do the people who work within that system adjust and vary their performance? What trade-offs do they make? Steven argues that these adjustments and trade-offs are not a bad thing. In fact, they are a necessity.
Eurocontrol applies a Systems Thinking approach to safety, based on ten principles, and has published a set of learning cards that describes them:
Field Expert Involvement. The people who do the work are the specialists in their work and are critical for system improvement.
Local Rationality. People do things that make sense to them given their goals, understanding of the situation and focus of attention at that time.
Just Culture. People usually set out to do their best and achieve a good outcome.
Demand and Pressure. Demands and pressures relating to efficiency and capacity have a fundamental effect on performance.
Resources and Constraints. Success depends on adequate resources and appropriate constraints.
Interactions and Flows. Work progresses in flows of inter-related and interacting activities.
Trade-offs. People have to apply trade-offs in order to resolve goal conflicts and to cope with the complexity of the system and the uncertainty of the environment.
Performance Variability. Continual adjustments are necessary to cope with variability in demands and conditions.
Emergence. System behaviour in complex systems is often emergent; it cannot be reduced to the behaviour of components and is often not as expected.
Equivalence. Success and failure come from the same source - ordinary work.