Teamwork has long been promoted as the panacea for medical errors and suboptimal care processes in medicine. For about two decades now, healthcare institutions have invested heavily in team training of various sorts, with a particular emphasis on communication and “speaking up” for patient safety. There is also a current trend toward “flattening the hierarchy” – reducing the levels of rank, authority, and power that exist, explicitly or implicitly, and that are typically informed by discipline, age or seniority, educational background, and title.
Today, I have the privilege of speaking to the Academy of Educators at the University of North Carolina at Chapel Hill – an elite group of academic educators dedicated to innovation, research, and scholarship in pursuit of excellence in teaching. As I am often asked to do, I will speak about non-rational decision influencers and models of human cognition, and how those tendencies of human nature conflict with the concept of “evidence-based medicine” and the acquisition of domain expertise. We will also discuss the unintended consequences of an emphasis on teamwork. Here are a few examples:
1. Social Shirking: when a supposed “double check” is actually missed entirely, with both parties assuming the other person has done it. This is especially likely to occur when a hierarchical gradient is present: a medical student is likely to assume that the senior surgeon has already checked, so the student’s check is not needed. (Moreover, the student is unlikely to “speak up” to “correct” the senior surgeon. As we’ve discussed before, there are many reasons people do not speak up, and at the heart of most of them is the fact that, almost always, the individual has the most to gain by remaining silent and the most to lose by speaking up, even if the team ultimately makes the wrong decision.)
2. Informational Cascades: emphasis is generally given to the data that first becomes apparent to the entire team, without equal consideration for data that appears later or is appreciated by only a small subset of the team. This is sometimes compounded by the incorrect assumption that all data available to one party is also known to the other team members and, perhaps more importantly, interpreted to have the same significance by all of them. This shared awareness and interpretation of data is called “situation awareness.”
3. Framing Effects: when a team is included in decision making (whether responding to an emergency or deliberating in a joint conference such as “tumor board”), data is often packaged and presented in a way that makes sense to the person presenting it, and that cohesive narrative will generally be accepted by the other team members. In this way, rather than preventing individual errors, groups actually increase reliance on heuristics and often display more overconfidence than individuals alone.
I’m looking forward to an engaging discussion of these issues, and to hearing creative proposals for solutions from this tremendous group of thinkers! What’s your take on the unintended consequences of teamwork – have you ever seen it go wrong?