Slide 1: The paradoxes of patient safety

Safe primary care, Module 2: The paradoxes of patient safety.

This module brings us to the worm that lies at the core of the patient safety apple—the fact that doctors (in particular) are raised in a culture of trained perfectibility that equates errors with incompetence and has little tradition of sharing experiences of unintended patient harm, and yet the tasks they perform are among the most error-provoking on the face of the earth, and the people they treat are highly vulnerable and more likely than most to suffer serious harm from human error. The module also previews two extreme views of the nature and management of error: the person model and the system model. Although the latter has been strongly endorsed by several prestigious patient safety publications, it is our contention that both extremes can lead to undesirable consequences. The module ends with a discussion of a middle way in which system principles are retained but those at the sharp end are also given training in how to spot dangerous and risky situations. In short, we outline ways in which junior doctors and nurses are given 'error wisdom', or what the NPSA has termed 'foresight training'. James Reason, Professor Emeritus, University of Manchester, United Kingdom.
Slide 2: Paradox 1 – on the one hand

Professional training in health care – medical education in particular – seems to rest on the belief that one can learn to be infallible. After a long, arduous and expensive education, you are expected to get everything right. Mistakes are taken to reflect ignorance. There is little or no tradition in health care of learning from mistakes.

Medical schools take it as a given that if you study hard and have the appropriate postgraduate experience, you will get it right. Errors are stigmatized and marginalized. They are neither shared nor much talked about. Without a tradition of sharing and reporting, there is little opportunity to learn from slips, lapses and mistakes. This is in sharp contrast to the aviation community, which is predicated on the assumption that people make errors and things will go wrong.
Slide 3: Paradox 2 – on the other hand

Health care contains many latent risks: a wide variety of tasks and equipment; a 'craft' activity with limited safety barriers; uncertainty and incomplete knowledge; vulnerable and dependent patients; adverse events investigated where they occurred; one-to-one or few-to-one care.

While it is appropriate to learn how aviation manages error, we must appreciate just how different aviation and health care are. Pilots, for the most part, fly just two kinds of airplane: those made by Boeing and those made by Airbus. Pilot behaviour is highly standardized and heavily proceduralised. Nor do pilots touch the aircraft very much: on a 13-hour flight they might handle the controls for about half an hour. The mechanics and electronics of aviation are well understood. Passengers, unlike patients, are not especially vulnerable and rarely come to any harm. Aviation, like other hazardous industries, has a few-to-many relationship with its customers. And all accidents are minutely investigated by professional investigators, and the results are widely disseminated. Pilots are protected by many barriers, warnings and safeguards—defences-in-depth. Most health carers have only their skills to protect the patient and themselves. And it's all close and personal.
Slide 4: The paradoxes in a nutshell

Health care is, by its nature, a high-risk industry – yet its representatives condemn errors, and staff have little or no training in risk management or in detecting errors.

Here it is then: the people whose work provides arguably the greatest number of error opportunities know the least about error. This is primarily a cultural problem. Culture change is the sine qua non of improving patient safety.
Slide 5: Error is the most common risk – two views of error

The person model and the system model. Each view has its own explanatory model and its own idea of how errors should be remedied.

Human error can be seen from many perspectives. Two of the most dominant models or approaches are shown on the slide. Each has its own view of how and why errors occur, and each has its own remedial measures.
Slide 6: The person model

Errors are seen as the product of wayward mental processes: forgetfulness, distraction, inattention, carelessness, and so on. Countermeasures are aimed mainly at the individual who made the mistake – naming, warning, blaming, shaming, retraining, writing new protocols, and the like.

The dominant 'person' view of human error is both longstanding and intuitively appealing. It focuses on the unsafe acts—errors and procedural violations—of people at the sharp end of the healthcare system: nurses, physicians, surgeons, anaesthetists, pharmacists, and the like. These unsafe acts are seen as having their origins in wayward psychological processes such as forgetfulness, inattention, poor motivation, carelessness and undesirable practice. Logically enough, the associated remedial measures are targeted primarily at the erring individuals at the 'sharp end'. These countermeasures include 'fear appeal' poster campaigns, writing new protocols or adding to existing ones, sticks and carrots (mostly the former), threats of litigation, suspension and retraining, and naming, blaming and shaming. The person model has some substance: people do respond to motivators, but usually only in workplaces that are personally hazardous, and even then the impact is limited.
Slide 7: The system model

Health care staff are human, and all humans err sooner or later. This is not a question of morality. Adverse events have their roots in latent risks within the system; those working at the front line are more likely to inherit risks in the system than to cause them. Preventive measures aim to build barriers that catch mistakes and to remove the error traps.

The basic premise here is that humans are fallible and that errors are to be expected, even in the best organisations. Errors are seen as consequences rather than causes, having their origins not so much in the perversity of human nature as in 'upstream' systemic factors. These include recurrent error traps in the workplace and the organisational processes that give rise to them. The associated counter-measures are based on the assumption that while we cannot change the human condition, we can change the conditions under which humans work. A central idea is that of system defences: all hazardous technologies possess barriers and safeguards. When an adverse event occurs, the important issue is not who blundered, but how and why the defences failed.
Slide 8: The system model as an explanation of failed safety

[Diagram: the 'Swiss cheese' model – several successive layers of defences, barriers and safeguards stand between hazards and losses; some holes in the layers are caused by active failures, others by latent conditions.]

This slide shows the 'Swiss cheese' model of accident causation. In an ideal world, all the defences would be intact. In reality, they all have holes or weaknesses. These holes arise for two reasons: (1) active failures—the errors and violations of people at the sharp end—which can open usually quite short-lived gaps in the defensive layers; and (2) latent conditions (like resident pathogens in the body) that arise during the design or building stages, or as a consequence of high-level management decisions. The effects of latent conditions can be quite long-lasting. However, an accident can only occur when all the holes in the cheese slices line up, permitting an accident trajectory in which hazards come into damaging contact with people or assets. I should point out that, unlike Swiss cheese, these holes are in constant flux: opening and shutting and moving from place to place.
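To make the 'holes lining up' idea concrete, here is a minimal Python sketch (not part of the original module) that simulates a hazard trying to pass through several defensive layers. The layer count, the per-layer hole probabilities and the function name trajectory_penetrates are illustrative assumptions; the point is only that an accident requires a simultaneous gap in every layer, so even individually leaky defences multiply into a very low overall accident rate.

import random

# Monte Carlo sketch of the Swiss cheese model: an accident occurs only
# when a hole happens to be open in every defensive layer at once.
# The per-layer hole probabilities below are illustrative, not empirical.
LAYER_HOLE_PROB = [0.05, 0.10, 0.08, 0.05]   # four defensive layers

def trajectory_penetrates(hole_probs, rng):
    """Return True if the hazard finds an open hole in every layer."""
    return all(rng.random() < p for p in hole_probs)

rng = random.Random(42)
trials = 1_000_000
accidents = sum(trajectory_penetrates(LAYER_HOLE_PROB, rng) for _ in range(trials))
print(f"accident rate: {accidents / trials:.6f}")
# The expected rate is the product 0.05 * 0.10 * 0.08 * 0.05 = 2e-05:
# no single layer is reliable, yet together they stop almost every trajectory.

Under these assumed numbers, each layer fails between 5% and 10% of the time, yet the simulated accident rate is on the order of one in fifty thousand: the multiplicative effect is the whole argument for defences-in-depth.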
Slide 9: Where did the defences fail?

[Diagram: an investigation chain running from harm back to causes – unsafe acts (WHAT?), local workplace factors (HOW?), organisational factors and latent conditions (WHY?) – with broad arrows for both causation and investigation.]

Earlier it was stated that, from a systems perspective, the key questions are: How did the defences fail? Why did they fail? This slide summarises the basic argument of the system approach to accident causation. For each failed defence, we need to ask: Was the failure due to an unsafe act? If so, what were the local workplace factors that provoked it? And, still further upstream, what were the organisational factors that brought about the error-provoking conditions in the workplace? Of course, unsafe acts are not always involved. Sometimes—as in the case of the Challenger accident or the King's Cross Underground fire—defences fail through a combination of local workplace and organisational factors without any significant unsafe acts. This possibility is indicated on the slide by the arrow labelled 'latent condition pathways'. Notice that the broad arrows point both ways: the upward direction indicates the pathway of causality, while the downward direction indicates the steps to be taken during an accident investigation.
Slide 10: The system view is endorsed by...

Reports from the US Institute of Medicine (IOM): To Err Is Human (2000), Crossing the Quality Chasm (2001), Keeping Patients Safe (2004). Reports from the UK Department of Health: An Organisation with a Memory (2000), Building a Safer NHS for Patients (2001). Plus comparable reports from Australia, New Zealand and Canada.

Most, if not all, of these high-level reports regard the system approach as the proper basis for improving the patient safety situation. But as we shall see in later slides, both the person and the system models have drawbacks if taken to extremes.
Slide 11: Striking a balance between the views

[Diagram: the person model and the system model at opposite extremes – blaming, denial and isolating the individual at one end, learned helplessness at the other.] Both extremes have drawbacks.

By focusing on the error-makers rather than the context in which the errors occur, we isolate the errant person and fail to recognise the recurrent error traps that exist in workplaces. And if we accept only a system perspective, we condemn health care professionals to learned helplessness: 'What can I do? It's the system's fault.'
Slide 12: Working at the front line...

Front-line staff (nurses, doctors) have little opportunity to improve the system at large. They need to become more risk-aware and alert to error – a knowledge and mental skill that involves recognising risky situations and an improved ability to detect and correct mistakes already made.

Very often health carers could have thwarted an adverse event sequence had they been more sensitive to impending dangers. What would make them more error-wise and risk-aware? Sometimes it is simply a sixth sense, or the hair rising on the back of your neck. But it can also be an easily trainable mental skill.
Slide 13: The three-bucket model for assessing risky situations

[Diagram: three buckets labelled SELF, CONTEXT and TASK, each partly filled with 'brown stuff'.]

People are very good at making quick, intuitive rating-scale judgements. The three-bucket model is a simple tool for directing attention at those aspects of a situation that are likely to determine error likelihood. Self: how are you feeling? Are there physical or emotional factors that have brought about a decline in well-being? Context: this relates to the nature of the situation. Are there many distractions, interruptions, changes, poor team interactions and the like? Task: task steps vary widely in their ability to provoke error; we will give an example of this later. The brown stuff within the buckets (a universal coding) represents the stuff that is likely to hit the fan: the more there is, the greater the likelihood of error.
Slide 14: How the model works

In any given situation, the probability of an unsafe act being committed corresponds to the summed amount of brown stuff in the three buckets. Full buckets do not guarantee an unsafe act, and empty ones do not guarantee safety: the model expresses probabilities, not certainties. But with the model in mind, we can judge the levels in a given situation and act accordingly.
Slide 15: How the bucket model can be used for a rapid risk assessment

9 – Severe risk: avoid acting if at all possible.
7 – Moderate to severe risk: proceed with extreme caution.
5 – Normal to moderate risk: proceed, but take care.
3 – The low end of the scale.
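As an illustration only (not from the original module), here is a minimal Python sketch of this scoring idea. It assumes each bucket is rated on a 1–3 scale, so the summed score runs from 3 to 9 as on the slide; the function name three_bucket_score and the exact band boundaries are assumptions made for the example.

def three_bucket_score(self_level: int, context_level: int, task_level: int) -> str:
    """Sum three intuitive bucket ratings into a risk band.

    Sketch of the three-bucket check: the bands mirror the slide's
    9/7/5/3 scale; the 1-3 per-bucket rating is an assumption.
    """
    for level in (self_level, context_level, task_level):
        if not 1 <= level <= 3:
            raise ValueError("each bucket rating must be between 1 and 3")
    total = self_level + context_level + task_level
    if total >= 9:
        return f"{total}: severe risk - avoid acting if at all possible"
    if total >= 7:
        return f"{total}: moderate to severe risk - proceed with extreme caution"
    if total >= 5:
        return f"{total}: normal to moderate risk - proceed, but take care"
    # Low scores are no guarantee of safety: the model expresses probability.
    return f"{total}: low risk - but empty buckets never guarantee safety"

# Example: a tired clinician (self=3) on a busy ward (context=2)
# about to perform an error-prone task step (task=2).
print(three_bucket_score(3, 2, 2))   # "7: moderate to severe risk - ..."

The design point is that the tool stays crude on purpose: three quick intuitive ratings and a sum, something a nurse or junior doctor can run in their head before acting, not a formal risk calculation.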
Slide 16: Task-related factors

Although people have a fair sense of their own limitations and of the context, they are probably less aware that different steps in a process carry very different risks of error. It is this knowledge that equips front-line staff to judge the risk in a given situation.
Slide 17: A common pitfall – what happens most often?

A simple desktop photocopier provides a good illustration of how certain task steps are much more likely to be performed wrongly. What is the most likely omission when photocopying a short document? The answer: departing with the completed copy while leaving the last page of the original on the platen under the lid. The last page of the original gets left behind! The reasons why are listed on the next slide.
Slide 18: Why? Four factors that make you forget the last page

1. Nearly done: steps near the end of a process are more easily forgotten than those at the beginning.
2. No reminder: earlier in the process, each page was removed in order to put the next one in place; for the last page there is no next page to act as a cue.
3. The goal (making the copies) is achieved before the whole process is complete.
4. Out of sight, out of mind: the last sheet is hidden by the lid.

These four factors combine to make the omission error almost inevitable.
Slide 19: Conclusion... Errors can be predicted and managed. Don't forget the last page!

Omissions can be reduced by providing reminders. Reminders 'wear out' and become part of the wallpaper, but they can and do work most of the time. The moral of this story: predictable errors can be avoided.
Slide 20: Learning to live with error

Accept that fallibility is the norm. Errors cannot be eliminated, but they can be managed. Errors are both consequences and causes. Errors are opportunities for learning. Naming and shaming someone who has made a mistake does not improve safety. Design health care systems for human beings – people of flesh and blood.

These are some of the cultural and cognitive changes needed to combat the patient safety problem. We will expand on them in later modules.