Minutes of the Meeting of the Liverpool Society of Anaesthetists
Wednesday 7th October 2015

“Doctor as Hero and Hazard”

Captain Trevor Dale, Atrainability

The meeting was opened by the President, Dr Ewen Forrest, who welcomed members to the first LSA meeting of the year. He offered congratulations to Professor Jennifer Hunter (Life Vice-President), who has been awarded an MBE for services to medicine in the recent Queen's Birthday Honours. He also awarded the first ‘John Utting Travel Scholarship’ to Dr Luke Winslow for his poster presentation on the ‘Peri and Post Operative Care of High Risk Surgical Patients’ at the European Society of Intensive Care Medicine Annual Congress in Berlin. Dr Winslow will present his work orally at our February meeting.

Dr Forrest then introduced the speaker, Captain Trevor Dale, a former British Airways (BA) pilot who retired in 2005. He was a pioneer of crew resource management, introduced to improve decision-making skills and communication on the flight deck, and he is the founder of ‘Atrainability’, a company that provides human factors training in the aviation and healthcare industries.

Captain Dale began his talk by commenting on the use of checklists when he was flying Boeing 747s: the same laminated version, with no tick boxes and nowhere to sign (unlike the WHO checklist), was used for every flight. Since retirement he had worked in healthcare, starting at Great Ormond Street Hospital (GOSH). Following the Bristol Royal Infirmary cardiac scandal, GOSH thought that its own morbidity and mortality data could be improved, and he collected data on non-clinical skills. He gave the example of the first arterial switch operation he saw on a tiny baby: there was no discussion with members of the theatre team prior to surgery, in stark contrast to procedures on the flight deck. By way of comparison, Captain Dale commented that the average short-haul pilot will do eight briefings and debriefings on a four-sector day. Human factors training had been introduced to a very sceptical peer group at BA; subsequently, aviation has embedded it, along with risk management training, into everything it does.

He thought that the WHO checklist had initially been poorly designed and poorly introduced. He said that the aviation industry had similarly had a difficult time getting pilots to use checklists, even when their own lives were at risk.

Captain Dale explained that when he joined BA in 1971, the Captain was God. There was no such thing as first-name terms; he was called Captain on and off the flight deck. Black box recordings show that people often do not know how to be assertive and have real difficulty in challenging authority and hierarchy. Similarly, junior doctors find it very difficult to challenge their consultants. This has since changed greatly in aviation.

During his time in healthcare he had felt intimidated by several senior professionals, and those people seemed to enjoy their position of power. At BA, people who could not work with others were ‘let go’, as their behaviour impaired safety. He quoted a number of examples of team members who did not know the roles or names of others present in the operating theatre. He said the ‘Hello, my name is’ campaign started by Dr Kate Granger can seem a bit robotic but may help.

He talked about what makes us human: being individual, compassionate, creative, emotional, subject to stress, interactive, proactive, reactive, empathetic, fallible, vulnerable, risk-taking, caring and so on. He then talked about what it is to be professional: trusted, considered, reliable, qualified, accountable, skilled and so on. To be these things all the time is hard work, but people have expectations of us.

Errors happen even when good people are trying their best. He showed a slide of Victoria Pendleton, the Olympic cyclist, overtaking Jessica Varnish in an error of judgement that caused them to be disqualified from the team sprint at the London 2012 Olympics. He gave other examples, such as Formula One teams and the Kegworth air disaster. All these events have human factors at their core and were therefore avoidable. In aviation, around 90% of accidents involve avoidable error (a recent exception being the BA Las Vegas fire in September 2015).

He then explained the Kegworth air disaster. One of the aircraft's two engines lost power owing to a fractured fan blade. Misinterpreting the information available, the crew shut down the wrong engine and the symptoms went away (confirmation bias). It was only when a precautionary landing at East Midlands Airport was attempted that the consequences of the error were realised. The passengers at the back of the plane could not believe the captain had made the most fundamental error of shutting down the wrong engine, as they could see the smoke. The same happens in anaesthesia, e.g. blocking the wrong side while the patient is awake. His advice was not to ask leading questions and to avoid interruptions and distractions.

He asked: if good people are doing their best, what good is blame? Blaming someone may be a nice feeling for a very short time. The NHS is riddled with people who love to blame, and the media contribute to this. He discussed the factors that make the day more difficult, e.g. challenges, disorganisation, poor teamwork, changes to plan, personality clashes and repetitiveness. He showed the ‘Swiss Cheese’ model of accident causation and thought that there was an element of luck at work when the holes do not line up. Root cause analysis can help to identify and remedy the causes. In a work context, human factors encapsulate the environment, the organisation and the job, whereas individual characteristics influence behaviour.

Captain Dale said he learnt to fly in the 1970s; in the 1980s there was one plane crash a month. The infamous KLM Tenerife crash in 1977, in which 583 people died, was caused by KLM's most senior pilot taking off in fog. As a consequence, the aeronautical side of NASA went on to fund original research into human factors at the University of Texas. He showed a slide on the history of aviation human factors, explaining that in the 1980s the focus was on the individual, in the 1990s on crew resource management, while today the focus is on the system. He showed a graph of BA air safety reporting which demonstrated how it had changed over time, with high-risk reports decreasing and low-risk ones increasing as a result of a no-blame reporting culture. He also showed a slide of a BA plane that had its wing destroyed during an attempted take-off in Johannesburg in 2013; the crew kept their jobs despite the error because of the no-blame culture, as analysis of any error is deemed far more important than apportioning blame.

Captain Dale then described the recent BA Las Vegas flight on which an engine exploded on take-off. The crew would have discussed the eventuality of abandoning take-off as part of the pre-flight briefing, so in that rare event all team members would know their roles. He asked whether a pilot thinks about their own jeopardy or about the 400 people sitting behind them. The lives of all depend on checklists, and their importance is constantly emphasised in training so that they become a routine part of everyday practice. He thought that the culture of an organisation is what happens at 3am on a weekend with no oversight on the floor, and that a safety culture is one that is informed, flexible, learning and just. The aim of a safety culture is to avoid, trap and mitigate threats: the purpose of the WHO checklist. He said the high-performing team is one that already has plan B in place. During the debriefing you should ask: is there one thing you did well, one thing you could do more of and one thing you could do less of? Use ‘could’ as opposed to ‘should’ or ‘would’. Briefing should plan for the expected, prepare for the unexpected, involve everyone, encourage questions, build the team, break down hierarchy and review. The benefits are team familiarity (so members are more likely to share concerns), known theatre roles (shared situational awareness) and explicit planning.

Captain Dale then showed examples of disasters that happened when checklists were not used properly. He said that one hospital had a 1:300 error rate for wrong-side burr hole surgery (pre WHO checklist). His team put in a training programme despite some surgical resistance and, despite the cynicism of two team members, 21,000 operations have subsequently been performed without a wrong-side error. His team helped to give the theatre staff the knowledge and confidence to challenge seniors. Finally, he showed a Boeing 747 checklist, an aide-mémoire for briefing prior to take-off and before landing, with no tick boxes. He quoted Dekker (Just Culture, 2007): “It is important not to blame individuals for what went wrong, but to understand why what they did at the time made sense to them.”

Dr Forrest then invited questions for Captain Dale, after which Dr Amit Dawar offered a vote of thanks.

Gemma Redmond
Honorary Secretary
October 2015