Misuse of AI chatbots tops annual list of health technology hazards

Report also sounds the alarm on insufficient planning for systems outages, substandard medical products, missed recalls of home diabetes management devices, and more

WILLOW GROVE, Pa., Jan. 21, 2026 /CNW/ -- The misuse of artificial intelligence (AI) chatbots in healthcare tops the 2026 list of the most significant health technology hazards, according to a report prepared annually by ECRI, an independent, nonpartisan patient safety organization.

Chatbots that rely on large language models (LLMs), such as ChatGPT, Claude, Copilot, Gemini, and Grok, produce human-like, expert-sounding responses to users' questions. The tools are neither regulated as medical devices nor validated for healthcare purposes, yet they are increasingly used by clinicians, patients, and healthcare personnel. More than 40 million people turn to ChatGPT for health information every day, according to a recent analysis from OpenAI.

ECRI says that chatbots can provide valuable assistance, but they can also deliver false or misleading information that could result in significant patient harm. ECRI therefore advises caution whenever a chatbot is used for information that can affect patient care. Rather than truly understanding context or meaning, AI systems generate responses by predicting sequences of words based on patterns learned from their training data. They are built to sound confident and to always offer an answer that satisfies the user, even when that answer isn't reliable.
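To make that point concrete, the following is a deliberately simplified sketch, not anything from ECRI's report: a toy bigram model in Python that shows what "predicting the next word from training patterns" means. The tiny corpus is hypothetical, and the program always produces a fluent-looking continuation with no notion of whether it is true.

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus standing in for a model's training data.
corpus = ("place the return electrode over a large muscle mass "
          "place the return electrode over well perfused tissue").split()

# Count bigrams: for each word, how often each next word follows it.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def next_word(word: str) -> str:
    """Return the statistically most likely continuation.

    Like an LLM, this picks whatever pattern was most frequent in
    training; there is no built-in notion of 'I don't know'.
    """
    candidates = bigrams.get(word)
    if not candidates:
        return "<no data, though a real chatbot would still answer>"
    return candidates.most_common(1)[0][0]

print(next_word("electrode"))  # -> 'over' (a learned pattern, not understanding)
```

Real LLMs use vastly larger models and data, but the core mechanism is the same: the output reflects statistical regularities in the training text, which is why confident-sounding answers can still be wrong.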

"Medicine is a fundamentally human endeavor. While chatbots are powerful tools, the algorithms cannot replace the expertise, education, and experience of medical professionals," said Marcus Schabacker, MD, PhD, president and chief executive officer of ECRI. "Realizing AI's promise while protecting people requires disciplined oversight, detailed guidelines, and a clear-eyed understanding of AI's limitations."

ECRI experts say that chatbots have suggested incorrect diagnoses, recommended unnecessary testing, promoted subpar medical supplies, and even invented body parts in response to medical questions, all while sounding like a trusted expert. For example, one chatbot gave dangerous advice when ECRI asked whether it would be acceptable to place an electrosurgical return electrode over a patient's shoulder blade. The chatbot incorrectly stated that the placement was appropriate, advice that, if followed, would put the patient at risk of burns.

The risks of using chatbots for healthcare decisions could become an even greater concern as rising healthcare costs and hospital and clinic closures reduce access to care, leading more patients to rely on chatbots as a substitute for professional medical advice. ECRI's patient safety experts will discuss the hidden dangers of AI chatbots in healthcare in a live webcast on January 28.

Chatbots can also exacerbate existing health disparities, according to ECRI's experts. Any biases embedded in the data used to train chatbots can distort how the models interpret information, leading to responses that reinforce stereotypes and inequities.

"AI models reflect the knowledge and beliefs on which they are trained, biases and all," said Dr. Schabacker. "If healthcare stakeholders are not careful, AI could further entrench the disparities that many have worked for decades to eliminate from health systems."

ECRI's report offers recommendations for using chatbots more wisely. Patients, clinicians, and other chatbot users can reduce risk by educating themselves on the tools' limitations and always verifying information obtained from a chatbot with a knowledgeable source. For their part, health systems can promote responsible use of AI tools by establishing AI governance committees, providing clinicians with AI training, and regularly auditing AI tools' performance.

The Top 10 Health Technology Hazards for 2026, in ranked order, are:

  1. Misuse of AI chatbots in healthcare
  2. Unpreparedness for a "digital darkness" event, or a sudden loss of access to electronic systems and patient information
  3. Substandard and falsified medical products
  4. Recall communication failures for home diabetes management technologies
  5. Misconnections of syringes or tubing to patient lines, particularly amid slow ENFit and NRFit adoption
  6. Underutilizing medication safety technologies in perioperative settings
  7. Inadequate device cleaning instructions
  8. Cybersecurity risks from legacy medical devices
  9. Health technology implementations that prompt unsafe clinical workflows
  10. Poor water quality during instrument sterilization

Now in its 18th year, ECRI's Top 10 Health Technology Hazards report identifies critical healthcare technology issues. ECRI follows a rigorous review process to select topics, drawing insight from incident investigations, reporting databases, and independent medical device testing. Since its creation in 2008, the report has supported hospitals, health systems, ambulatory surgery centers, and manufacturers in mitigating risks.

An executive brief of the Top 10 Health Technology Hazards report is available for download. The full report is accessible to ECRI members and includes detailed steps that organizations and industry can take to reduce risk and improve patient safety. To learn more, visit www.ECRI.org.

About ECRI
ECRI is an independent, nonprofit organization improving the safety, quality, and cost-effectiveness of care across all healthcare settings. With a focus on technology evaluation and safety, ECRI is respected and trusted by healthcare leaders and agencies worldwide. Over the past six decades, ECRI has built its reputation on integrity and disciplined rigor, with an unwavering commitment to independence and strict conflict-of-interest rules. ECRI is the only organization worldwide to conduct independent medical device evaluations, with labs located in North America and Asia Pacific. ECRI is designated an Evidence-based Practice Center by the U.S. Agency for Healthcare Research and Quality, and the ECRI and Institute for Safe Medication Practices PSO is a Patient Safety Organization (PSO) federally certified by the U.S. Department of Health and Human Services. ECRI acquired The Institute for Safe Medication Practices (ISMP) in 2020 to address medication errors, one of the most prolific causes of preventable harm in healthcare, and acquired The Just Culture Company in 2024 to transform healthcare workplace cultures, creating one of the largest healthcare quality and safety entities in the world. Visit www.ecri.org.

View original content to download multimedia: https://www.prnewswire.com/news-releases/misuse-of-ai-chatbots-tops-annual-list-of-health-technology-hazards-302666948.html

SOURCE ECRI