Award-Winning Animation: Systems Thinking – A New Direction in Healthcare Incident Investigation


The public’s affection for the NHS is beyond question, but healthcare is under huge pressure. Overstretched budgets, rising costs, staff shortages and increased demand, alongside new technology and innovation, all increase the likelihood of safety incidents. With around 100 serious safety incidents reported daily across the NHS, a rate largely unchanged over the past two decades, we need to find more effective ways to keep patients safe. Analyses of safety incidents reveal a wide range of contributing factors, and staff, through no fault of their own, often inherit existing system problems. For example, staff shortages mean higher workloads, which increase turnover and create further shortages. Hiring temporary staff to ease the pressure brings unfamiliarity with the workplace, which leads to more interruptions and, ironically, an even higher workload. It’s fertile ground for mistakes.

Systems thinking therefore encourages the view that incidents are not usually caused by a single catastrophic decision or action, but by dynamic interactions between people, tasks, technology and working conditions, including management, regulation and policy, interactions that typically escape analysis.

Take a medication error as an example. A diabetes specialist nurse writes the recommended dosage on an assessment form. A busy prescribing doctor misreads it and prescribes 100 units instead of 10. The outcome? The patient is discharged but soon afterwards readmitted to an already overworked Emergency Department. Root cause analysis identifies the mistake and may recommend retraining and personal reflection by those involved. Certainly, next time, the doctor will take more care. But what if, next time, it’s a different doctor? The same incident could happen again.

Systems thinking allows us to explore the underlying dynamic interactions between people, technologies and policies within and across levels of the whole system. It highlights a clear feedback mechanism for the purchasing team, regulatory bodies and manufacturers regarding the risk of confusion over the medicine’s name, and requests improvement. It gives prescribing responsibility to the specialist nurse, so that the number of potentially unsafe interactions is reduced regardless of changes in staff. It identifies the need for an adequate workload and recommends that staff be reminded of their responsibility to voice their queries.

Because systems thinking relies on more than the single actions of individuals, it offers an opportunity for longer-term learning and lasting change. And if immediate action is impossible, organizations can at the very least accumulate an evidence base for future change. By empowering people to speak up and use their skills and knowledge to act safely, systems thinking places the emphasis on staff as a resource for safety rather than a potential source of problems.

If you’re looking for effective and sustainable ways to prevent patient harm from safety incidents, embrace and encourage systems thinking in your investigations. For more information, please visit our website.
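The reinforcing staffing loop described above (shortages raise workload, higher workload drives turnover, turnover deepens the shortage, and temporary cover brings interruptions that push workload up again) can also be sketched as a toy system dynamics model. The short Python sketch below is purely illustrative: the variable names, rates and starting values are assumptions chosen for the example, not NHS figures or anything taken from the animation.

    # Toy system dynamics sketch of the reinforcing staffing loop described in the post.
    # Every parameter below is an illustrative assumption, not NHS data.

    def simulate_staffing(months=24, established_posts=100.0, staff=90.0):
        history = []
        for month in range(months):
            shortage = max(established_posts - staff, 0.0)   # unfilled posts
            temp_cover = 0.5 * shortage                      # temporary staff hired to plug the gap
            # Temporary staff are unfamiliar with the workplace, so they count for less
            # and generate interruptions that push the effective workload up further.
            workload = established_posts / max(staff + 0.7 * temp_cover, 1.0)
            interruptions = 0.2 * temp_cover
            effective_workload = workload * (1.0 + 0.05 * interruptions)
            leavers = 0.02 * staff * effective_workload      # higher workload -> higher turnover
            recruits = 0.1 * shortage                        # recruitment lags behind the shortage
            staff = staff - leavers + recruits
            history.append((month, round(staff, 1), round(effective_workload, 2)))
        return history

    for month, staff, workload in simulate_staffing():
        print(f"month {month:2d}: staff {staff:5.1f}, relative workload {workload:4.2f}")

In this toy run, staffing drifts downward month on month while relative workload creeps up: the kind of slow, structural drift that retraining one doctor after one incident cannot reverse, and exactly the sort of interaction a systems view is meant to surface.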

2 Replies to “Award-Winning Animation: Systems Thinking – A New Direction in Healthcare Incident Investigation”

  1. There is a mention here of systems problems and systems thinking. The AcciMap, while not a full systems analysis, moves in that direction, but the solution proposed here is not a systems intervention. It may be an effective solution for this problem (although that would have to be evaluated), but the concern with non-systems interventions is that they fix a local issue while possibly compromising safety somewhere else. For this to be a systems solution, we would have to take the insight that communications pass through too many people and ask whether the same thing is happening in other parts of the system, and possibly throughout the system. A systems intervention would then target a system-wide redesign. The problem with the AcciMap strategy, and as far as I can tell with every other form of hazard analysis, is that there is no explicit design strategy. I suspect that is because analysts think the hard part is identifying the hazard, and that once the hazard is identified, the solution is obvious. In human systems at least, that is rarely the case.

  2. Thank you, Gyuchan. Great video. It is especially interesting to me that Root Cause Analysis, in your video, is framed as an NHS 'performance-management tool' targeting clinician errors. This was never the intention when RCAs were introduced to Australian healthcare. Instead, it was meant as a 'systems-oriented' tool that was supposed to avoid blame. However (aside from the obvious shortcomings of applying RCA methodology in complex systems), there is a fundamental problem that healthcare executives and clinicians have not yet understood: when a serious adverse event occurs, two distinct kinds of analysis are needed, analysis for determining accountability, and analysis for preventing future adverse events. Both processes are important and necessary (yes, some unsafe acts are blameworthy), but they should be kept separate; otherwise any well-intended systems improvement tool (RCA, the London Protocol, etc.) just becomes another (albeit more sophisticated) blame weapon.
