Starting a database of diagnostic errors can help clinicians learn from their mistakes and improve patient care, Divvy Upadhyay, MD, MPH, of Geisinger Medical Center in Danville, Pennsylvania, said.
“Developing such a database is a critical part of an organization invested in learning and exploration of diagnostic excellence,” Upadhyay said Monday at the virtual annual meeting of the Society to Improve Diagnosis in Medicine.
Healthcare organizations find it challenging to analyze and address diagnostic errors, often because easily available, well-catalogued databases or registries are lacking, he said. “While quality improvement registries are impacting other fields, there is little similar traction for addressing diagnostic errors through a good database or registry.”
To better understand how healthcare organizations can systematically identify and learn from diagnostic errors, Upadhyay and colleagues on Geisinger’s Committee to Improve Clinical Diagnosis developed an on-site diagnostic error registry built around a “Diagnostic Opportunities Intake Form.” The form enables structured data collection for every case and addresses the organization’s operational, quality improvement, research, and education needs. It captures the objective details of each case, descriptive details and lessons learned, and how feedback and dissemination were addressed.
Some of the issues the form seeks to capture include diagnostic test misinterpretations, inappropriate follow-up of abnormal test results, and breakdowns in the referral process. “These categories help us understand the nature of the diagnostic error,” Upadhyay said. Incidents are also rated on a 3-point harm scale (“No harm,” “Harm,” or “Death”) to identify opportunities to reduce morbidity and mortality. The form also tries to capture some operational details, but “we leave a lot to the department leaders … so they can have a constructive, supportive discussion with the providers involved, and they can further disseminate the lessons across the division or identify system issues we can facilitate,” he said.
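For readers building something similar, the form described above can be sketched as a simple data structure. This is a hypothetical illustration, not Geisinger’s actual intake form; every field and category name below is an assumption based only on the details reported in this article.

```python
from dataclasses import dataclass
from enum import Enum


class HarmLevel(Enum):
    """The 3-point harm scale described in the article."""
    NO_HARM = "No harm"
    HARM = "Harm"
    DEATH = "Death"


class BreakdownCategory(Enum):
    """Example breakdown categories mentioned in the article (names assumed)."""
    TEST_MISINTERPRETATION = "Diagnostic test misinterpretation"
    FOLLOW_UP = "Inappropriate follow-up of abnormal results"
    REFERRAL = "Breakdown in referral process"


@dataclass
class DiagnosticOpportunityRecord:
    """Hypothetical sketch of one intake-form entry."""
    case_id: str
    department: str
    breakdown_categories: list   # one or more BreakdownCategory values
    harm: HarmLevel
    lessons_learned: str = ""    # descriptive details and lessons learned
    feedback_actions: str = ""   # how feedback and dissemination were addressed
```

A registry along these lines would let a committee filter cases by department or harm level when looking for local patterns, which is the kind of analysis the article describes.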
Since physicians are very busy, “we invited doctors to share cases in three ways — a short message to our medical record, an anonymous voicemail to our hotline, or a page on the pager system — but we request them to keep it brief,” explained Upadhyay. Then, “when we analyze the cases at the committee and follow it to the department — that’s where the opportunity to dig deeper arises, and that’s where further nuances are obtained … It’s brief reporting at first and a deeper analysis at the department level, so we’re not over-prescriptive, we’re not seen as the police, and we give the department as well as the providers the freedom to discuss it in a collegiate manner.”
Of 578 cases collected since September 2017, 223 had been entered into the database as of June 2020, according to Upadhyay. Departments with opportunities to improve their diagnostic processes included surgery (17 cases), laboratory medicine (24), hospital medicine (40), primary care (46), radiology (49), and the emergency department (77). (Some cases may involve opportunities for more than one department.) Drilling down further, the committee found that diagnostic process breakdowns in the emergency department most often involved the clinical encounter (67% of the time), followed by test interpretation (21%), follow-up (10%), referral (1%), and patient issues (1%).
Missed diagnostic opportunities most often involved cancer, followed in declining order by stroke, fracture, sepsis, subdural hematoma, myocardial infarction, septic joint, appendicitis, epidural abscess, aortic dissection, pneumothorax, and pneumonia. Within the cancer category, missed diagnostic opportunities included abnormal test results not followed up in 30% of cases, clinical missed opportunities in 24%, pathology missed opportunities in 18%, radiology missed opportunities in 16%, specimen handling missed opportunities in 6%, and referral breakdowns in 6%.
“The database provides actionable information for improvement, including identifying local patterns in missed opportunities,” Upadhyay said. Geisinger is acting on the information by working with its clinical informatics team to learn how to flag certain abnormal results, and supporting education programs around strokes and epidural abscesses.
Because a significant subset of cases were associated with inadequate or inappropriate follow-up of abnormal results (for example, an incidental lung nodule or mass found on imaging, or a positive blood culture result), “we are currently working with our informatics, radiology, ED, hospital medicine and primary care leadership to understand how best to build redundancies in communication and follow up of radiology results to the patients as well as to ordering providers and also to primary care physicians,” Upadhyay said in an email to MedPage Today.
What about clinicians who are afraid to share information about diagnostic errors due to concerns about liability? “Certainly, culturally, clinicians prefer not to share errors for the fear of liability or just to prevent complications with fellow providers,” he said in the email. “But here we are trying to influence the culture of learning and culture of safety by highlighting the importance of learning widely from every diagnostic opportunity — that’s the goal of the Committee to Improve Clinical Diagnosis. A network of champions often helps in encouraging providers to share cases for wider learning.”
The committee has learned valuable lessons from the database, according to Upadhyay, whose group published a paper on some of its results in the Joint Commission Journal on Quality and Patient Safety. “Categorizing diagnostic errors needs a standardized nomenclature and approach,” he said. “Developing a diagnostic error database registry is an evolving, resource-intensive and time-intensive process but it definitely helps you get some actionable intelligence for improvement efforts.”