Health NZ (HNZ) says staff have been caught using free AI tools such as ChatGPT and Gemini to write clinical notes, a practice it warns could result in formal disciplinary action.
A memo seen by RNZ was sent this week from a senior manager to all Mental Health and Addiction Services staff in the Rotorua Lakes district, reminding them not to use tools like ChatGPT, Claude or Gemini in their work.
"It has come to my attention that there has been instances where it appears that AI (artificial intelligence) drafting tools have been used to prepare clinical notes," it says.
"The use of free AI tools (e.g. ChatGPT, Claude, Gemini) for clinical purposes is strictly prohibited due to data security, privacy and accountability concerns. You are also not allowed to use AI tools to draft notes and then transcribing it to handwritten or typed notes, even if you anonymise the patient information."
Doing so could result in formal disciplinary action, it said.
According to the HNZ-wide AI policy, any AI tools must be registered with the Health NZ National Artificial Intelligence and Algorithm Expert Advisory Group (NAIAEAG) - this would include Heidi, an AI scribe tool being rolled out across emergency departments.
Sonny Taite, HNZ director of digital innovation and AI, said free AI tools presented risks to data security, privacy and accountability, and "any possible exemptions are assessed case by case".
"As with any new process in healthcare, we are working with our clinicians on new ways of working and this is an ongoing process."