Saturday, December 3, 2022

Learning from flying

I enjoy watching the TV series "Air Crash Investigations".

Initially, without having seen it, I thought it would be a sort of sensationalized tabloid program made to attract eyeballs for advertisers; nothing wrong with that, of course, that's how TV works.

But it is not. It is a seriously crafted documentary series with excellent production values and carefully developed analysis of crash investigations.

A project manager can learn from most episodes, or use them to illustrate lessons about aspects of good project management practice.

Various episodes have taught:

Welcome Bad News

Good crew communication is vital: every member must be able to bring an error, a risk or a threat to the attention of the other crew members. Bad news has to be welcomed by everyone.

Use Checklists

Checklists: make them, use them, and keep revising them as they are tested in use.

Communicate reliably

Before landing, the flight crew does a landing briefing for the particular airport. Every time. One crew skipped it: roles on the flight deck became confused, a wrong decision was made, and the aircraft crashed into a mountain. All dead.

Disambiguate language

The famous Tenerife disaster seems to have arisen in part from ambiguity in terminology. The captain of the plane that ran into the other aircraft on the same runway, and the air traffic controller, each used the phrase 'cleared for take-off' to mean different things. The captain appeared to interpret it as 'off you go'; the controller meant it as 'stand by for take-off'. The controller should have said 'hold for departure', as he knew another plane was on that runway. The captain would then have confirmed 'holding for departure' and awaited the instruction 'cleared for take-off', which would signal a clear runway as well as clearance for the flight plan after departure.

Use fail-safe markers

A couple of episodes featured problems with pitot tubes and static ports being blocked, giving false airspeed and altitude readings that led to crashes.

In one case the blockage was by insects during storage; in the other, the openings were taped over during maintenance. These sensors are critical to aircraft operation. On both occasions the openings should have been protected by covers carrying a large tag reading 'Remove before flight'.

Do not 'multi-task' because you cannot!

The futility (and fatal consequences) of 'multi-tasking'. In an environment demanding focused attention, one cannot split attention between disconnected streams of 'flow'. In one episode an air crew broke the 'sterile cockpit' rule before take-off. While they were working down the take-off checklist, a flight attendant, a personal friend, entered the cockpit and they all chatted about dinner at the destination. Then the checklist was completed...but it wasn't; critical items had been omitted. The plane proceeded to take off without the flaps being extended. It crashed with almost total loss of life.

Always use units when giving quantities

An aircraft was refuelled at an airport during a transition from imperial to metric units. A flight requiring 1000 kg of fuel was provided with 1000 lb: about 454 kg, less than half of what was needed. No units were written on the documentation, which allowed the miscommunication. Plane fell from sky.
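Carrying the unit with every number is cheap insurance in project documents and data as well. Below is a minimal sketch in Python, my own illustration rather than anything from the episode, of a quantity type that keeps its unit attached so a bare number can never be misread:

```python
from dataclasses import dataclass

# Conversion factors into kilograms; a deliberately tiny, illustrative table.
TO_KG = {"kg": 1.0, "lb": 0.45359237}

@dataclass(frozen=True)
class Mass:
    value: float
    unit: str  # must be one of the keys in TO_KG

    def in_kg(self) -> float:
        if self.unit not in TO_KG:
            raise ValueError(f"Unknown unit: {self.unit}")
        return self.value * TO_KG[self.unit]

required = Mass(1000, "kg")
loaded = Mass(1000, "lb")

# Comparing in a common unit makes the shortfall obvious.
print(f"Required: {required.in_kg():.0f} kg, loaded: {loaded.in_kg():.0f} kg")
# Required: 1000 kg, loaded: 454 kg
```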

Never assume/don't break the rules

A crew routinely minimised the fuel load on a particular flight. They then defaulted to that risky routine and used the same load on the same route flown in the reverse direction. Because of the altitudes of the airports involved, starting from a lower-altitude airport than usual meant more fuel was needed for the greater climb. Plane ran out of fuel and fell from sky. All dead.

Confirmation bias

Another fuel story. A flight lost its bearings due to instrument failure and sought to regain them by tuning to a particular radio station. The crew picked up a broadcast they took to be from the station they expected and planned from that. They were wrong: it was a station hundreds of miles from where they thought they were. They didn't cross-check the assumption and had no 'devil's advocate'. They crashed into unexpected terrain: a mountain. All dead.

It's always the system/drive out fear

The aim of crash investigations is not to level blame, but to find causes. The fault needs to be found and eliminated, and the faults fall into one of three categories:

  • technical: equipment or performance failure
  • systems: processes don't connect with each other adequately or are internally deficient
  • management: recruitment, training, coordination, resourcing

Only on the rarest of occasions are people taken to court. The aim of investigations is to learn and so increase safe performance, so while participants might be concerned that they will be held responsible for an error, they seem assured that honesty is essential and that only the most egregious personal negligence might bring consequences.

As Deming says: 'drive out fear'. Fear in an organisation or process leads to stifled communication, error, deceit, concealment, and inevitable failure. See the first lesson: Welcome Bad News. As Deming also says, if there's a problem in an activity or organisation, first examine the system that produces the problem...it is probably also its cause.

The Flaw of Averages

Never assume that an average multiplied by the number of units will be accurate. Remember the Normal Curve. Remember Standard Deviations. It can all go horribly wrong if you apply an average to a small sample/population.

One small aircraft's load was assessed using the average mass of a passenger and the average mass of luggage per person. That might have worked had there been 200 passengers, but there were only 16. The aircraft was overloaded and crashed on take-off. Everyone died.

Another factor was that the average was out of date and didn't take into account the gormandizing tendency of too many modern Americans.
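To make the point concrete, here is a small Monte Carlo sketch in Python. The weight figures and the 5% margin are my own assumptions, not from the episode; the point is only that, for the same average, a small group strays much further (in relative terms) from its expected total than a large one:

```python
import random

# Assumed figures for illustration only; not taken from the episode.
MEAN_KG = 84    # assumed true average mass per passenger, bag included
SD_KG = 18      # assumed spread between individual passengers
MARGIN = 1.05   # the load sheet allows the average-based total plus 5%

def overload_probability(n_passengers, trials=20_000):
    """Estimate how often the real total exceeds the average-based allowance."""
    allowance = MEAN_KG * n_passengers * MARGIN
    over = 0
    for _ in range(trials):
        total = sum(random.gauss(MEAN_KG, SD_KG) for _ in range(n_passengers))
        if total > allowance:
            over += 1
    return over / trials

for n in (16, 200):
    print(f"{n:>3} passengers: allowance exceeded on about "
          f"{overload_probability(n):.1%} of simulated flights")

# With these assumed numbers, roughly 18% of 16-passenger flights bust the
# 5% margin, versus a small fraction of a percent with 200 passengers.
```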

Fixation

If you concentrate too hard on one thing (over-focus), you could die. One airline captain concentrated so hard on his altitude that he forgot to pay attention to his airspeed. I think they all died too.

You need a few systematic preventatives. One project rule: anyone can bring bad news at any time to anyone...and be rewarded for doing so (e.g. 'Thanks Kevin, I'm glad you spotted that.'). Anyone who disparages bad news leaves the project. It could be fatal.

Similar to fixation error is continuation bias: a person is inclined, sometimes subconsciously, to continue on a course of action even when objective signs indicate it is the wrong one. The tip here is that when things change, check all the antecedent conditions and the possible consequences.

Fatigue

No one can 'tough out' tiredness or lack of sleep. Attention fails, reactions slip, critical evaluation goes out the window. You are more useless than a drunk, because at least a drunk is obvious. In one episode, fixation on the part of the captain, and the co-pilot's failure to even notice three separate, obvious signs of danger, led to a crash.