Voice for Veterans

As VA expands AI in health care, OIG raises concerns over veteran safety protections


BALTIMORE — Veterans across Maryland depend on the Department of Veterans Affairs for critical health care, but a new national watchdog report is raising concerns about how artificial intelligence is being used inside the VA's medical system and whether patient safety protections are keeping pace.

A preliminary report from the VA Office of Inspector General warns that the Veterans Health Administration does not have a formal process to report, track, or respond to patient safety risks associated with generative artificial intelligence tools.

Those tools include internal AI chat systems such as VA GPT and Microsoft Copilot, which are authorized for use with patient health information. According to the report, the tools can assist with clinical care and documentation and support medical decision-making, with some AI-generated content copied directly into veterans’ electronic health records.

However, the Inspector General found that while AI has been used in parts of the VA system since 2023, its deployment occurred without coordination with the VA’s National Center for Patient Safety, which is the office responsible for identifying and preventing harm to patients.

In an interview with WMAR-2 News Voice for Veterans reporter Cyera Williams, Inspector General Cheryl Mason said the concern is not the existence of AI itself, but the lack of standardized oversight when problems occur.

The report highlights that generative AI systems can produce inaccurate or misleading information, a phenomenon Mason referred to as "hallucination." Without a consistent way to document and evaluate those errors, the Inspector General warns that risks to veterans' health may go unnoticed.

“The concern we have, and that this preliminary report is advising, is that there aren’t any real protocols. Each VA medical professional is kind of on their own regarding how they report it, where they report it, and the oversight on it,” said Mason.

According to Mason, existing VA patient safety reporting systems were not formally adapted to account for AI-related risks. As a result, individual clinicians were left to determine how, or whether, issues involving AI-generated information should be reported.

The preliminary report was issued to raise awareness within the VA and prompt action, rather than to issue formal recommendations. The Inspector General says a final report with additional analysis is still underway.

WMAR-2 News reached out to the VA Maryland Health Care System with several questions following the report.

We asked how significant the patient safety risk is when there is no formal process to track AI-related concerns, what safeguards are currently in place to prevent inaccurate AI-generated information from affecting diagnosis or treatment, and why the rollout of AI tools was not coordinated with the National Center for Patient Safety.

The VA Maryland Health Care System responded with a brief statement saying:

“VA clinicians only use AI as a support tool, and decisions about patient care are always made by the appropriate VA staff.”

We followed up with additional questions, including how VA facilities in Maryland ensure AI-generated information does not negatively impact patient care, whether clinicians are required to independently verify AI-generated content before it is used in medical records or clinical decision-making, and how local VA leadership coordinates with national patient safety offices when new technologies are introduced.

Those follow-up questions were not answered. The VA referred WMAR-2 News back to its earlier statement.

Inspector General Mason emphasized that human oversight remains critical as the VA modernizes its systems. She says the goal of the watchdog’s work is not to slow innovation, but to ensure change is managed responsibly, with veterans’ health and safety coming first.