Top Drivers of Risk: Diagnostic Errors

In this blog series, Lori Atkinson discusses the top drivers of risk and malpractice claims and explores strategies to mitigate risk and enhance patient safety.
In my first blog of this series, Top Drivers of Risk: Surgical Allegations, I discussed the top malpractice allegations across all healthcare settings and highlighted one organization's strategy to mitigate surgical adverse events using a new surgeon onboarding program.
In this blog, I'd like to focus on diagnostic error allegations. At Curi, we analyze our malpractice claims to identify and examine the top allegations to help clinicians and their organizations implement strategies to reduce adverse events and claims.
Diagnostic Errors
Most of us know or love someone who’s been involved in a delayed or missed diagnosis. I’ve read estimates that diagnostic errors affect nearly 12 million people annually in the U.S. In almost half of Curi diagnostic error claims analyzed, high-severity harm, such as death and long-term disability, resulted from the delayed or missed diagnosis.
In the subset of Curi malpractice claims reviewed, diagnostic error allegations ranked third in frequency and second in cost across all healthcare settings. The largest share of these claims (59%) originated from care provided in the medical office setting, followed by the inpatient hospital setting (26%); the remaining 15% involved emergency department (ED) care.
We took a deeper dive into the diagnostic error claims occurring in the medical office and the ED and found that most involved problems in the initial diagnostic assessment stage. This can happen when symptoms are not thoroughly evaluated, differential diagnoses are not considered, or tests are not ordered to help determine an accurate, timely diagnosis.
Diagnostic errors in the assessment stage can arise from cognitive biases; time constraints; complex, unusual, or vague presentations; clerical overload; and inadequate access to patient data.
Using Artificial Intelligence to Mitigate Diagnostic Errors in the Assessment Stage
No doubt you’ve read recent headlines warning of the potential dangers that artificial intelligence (AI) poses to safety and security. However, I’ve also been reading about the positive aspects of using AI to help clinicians and their teams in the initial diagnostic assessment stage, including:
- Clinical decision-support systems: AI-powered clinical decision-support systems (CDSS) integrated into an electronic health record can help clinicians identify potential diagnoses and rule out serious ones. By analyzing patient history and risk factors, these systems can flag potential high-risk conditions such as stroke, heart disease, or sepsis, enabling earlier intervention. They can also provide real-time recommendations, suggest differential diagnoses, help clinicians avoid diagnostic pitfalls, and support adherence to evidence-based guidelines. (A simple illustrative sketch of this kind of risk flagging follows this list.)
- AI scribes: AI scribes can help clinicians by reducing clerical burden, improving workflow efficiency, and supporting complete documentation. AI scribe software listens to clinician-patient interactions and generates clinical notes in real time, producing accurate, structured, and comprehensive documentation. This benefits other clinicians, such as specialists, who rely on that documentation to arrive at an accurate diagnosis. Clinicians using AI scribes spend less time on documentation and more on the "clinical thinking" part of diagnosing. Some AI scribes can also analyze clinical encounters and suggest diagnoses, treatment plans, or follow-up care reminders.
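To make the CDSS concept concrete, here is a minimal, hypothetical sketch of the kind of rules-based check a decision-support tool might run to flag a high-risk condition such as sepsis for clinician review. The thresholds below follow the widely used SIRS criteria; the data structure and function names are illustrative only, not any vendor's actual implementation, and real AI-powered systems draw on far richer models and data.

```python
# Illustrative sketch only: a simplified, rules-based risk flag a CDSS might surface.
# The SIRS thresholds are standard clinical criteria; everything else is hypothetical.
from dataclasses import dataclass

@dataclass
class Vitals:
    temp_c: float         # body temperature in degrees Celsius
    heart_rate: int       # beats per minute
    resp_rate: int        # breaths per minute
    wbc_k_per_ul: float   # white blood cell count, thousands per microliter

def sirs_criteria_met(v: Vitals) -> int:
    """Count how many SIRS (systemic inflammatory response syndrome) criteria are met."""
    return sum([
        v.temp_c > 38.0 or v.temp_c < 36.0,
        v.heart_rate > 90,
        v.resp_rate > 20,
        v.wbc_k_per_ul > 12.0 or v.wbc_k_per_ul < 4.0,
    ])

def flag_possible_sepsis(v: Vitals) -> bool:
    """Flag the encounter for clinician review when two or more SIRS criteria are met."""
    return sirs_criteria_met(v) >= 2

if __name__ == "__main__":
    patient = Vitals(temp_c=38.6, heart_rate=112, resp_rate=22, wbc_k_per_ul=13.5)
    if flag_possible_sepsis(patient):
        # Decision support, not a diagnosis: the clinician still evaluates the patient.
        print("Possible sepsis: alert clinician for review")
```

The point of the sketch is that such a flag is decision support, not a diagnosis; the clinician still examines the patient and works through the differential.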
Risks of Using AI to Augment the Diagnostic Process
- AI models may inherit biases from the training data, and wrong or inaccurate data input can create erroneous output.
- Clinicians may rubber-stamp an AI scribe-generated encounter note without recognizing that the AI has generated false information.
- Clinicians may become overreliant on AI tools and neglect the critical thinking skills necessary for the differential diagnosis process.
Risk Reduction Strategies
- Stay abreast of ongoing changes in federal and state laws on the use of AI in healthcare.
- Develop policies and procedures covering acceptable use of AI, including when and how to use it and which tools are permitted. Involve your legal counsel to help guide policy development.
- Train clinicians on AI use and its limitations so they maintain their clinical judgment skills.
- Regularly validate AI outputs with clinician oversight.
- When using tools such as AI scribes during medical encounters, utilize the Curi Informed Consent Form - Use of Artificial Intelligence (AI) Scribe During Medical Encounters and the Curi Artificial Intelligence (AI) Scribe Checklist.
Curi clients, sign in to our Risk Solutions Resource Catalog to take our Diagnostic Error Risk Assessment, read our guidance on Mitigating Artificial Intelligence (AI) Risks in Healthcare Settings, and watch our on-demand video, Diagnostic Error: Awareness and Mitigation.
If you have questions about this topic, please call 800-328-5532 to speak with one of Curi Advisory's Risk Solutions Consultants.
About the Author
Through research, Lori identifies client needs and emerging healthcare risks, and develops education (articles, online resources, in-person education, and webinars) with a focus on using data to help solve issues proactively rather than reactively.
Lori is a frequent author and lecturer for physician, medical office, hospital, and senior living administrators and staff audiences.