LmCast :: Stay tuned in

Health NZ staff told to stop using ChatGPT to write clinical notes

Recorded: March 26, 2026, 4:02 a.m.



10:31 am today



Kate Green, Health Correspondent

kate.green@rnz.co.nz


Health NZ (HNZ) says staff have been caught using free AI tools like ChatGPT and Gemini to write clinical notes, a move it says could result in formal disciplinary action.
A memo seen by RNZ was sent this week from a senior manager to all Mental Health and Addiction Services staff in the Rotorua Lakes district, reminding them not to use tools like ChatGPT, Claude or Gemini in their work.
"It has come to my attention that there has been instances where it appears that AI (artificial intelligence) drafting tools have been used to prepare clinical notes," it says.
"The use of free AI tools (e.g. ChatGPT, Claude, Gemini) for clinical purposes is strictly prohibited due to data security, privacy and accountability concerns. You are also not allowed to use AI tools to draft notes and then transcribing it to handwritten or typed notes, even if you anonymise the patient information."
Doing so could result in formal disciplinary action, it said.
According to the HNZ-wide AI policy, any AI tools must be registered with the Health NZ National Artificial Intelligence and Algorithm Expert Advisory Group (NAIAEAG) - this would include Heidi, an AI scribe tool being rolled out across EDs.
Sonny Taite, HNZ director of digital innovation and AI, said free AI tools presented risks to data security, privacy and accountability, and "any possible exemptions are assessed case by case".
"As with any new process in healthcare, we are working with our clinicians on new ways of working and this is an ongoing process."
HNZ did not answer questions about how many instances there had been of staff using unapproved AI software, or whether anyone had been disciplined.
Staff turning to AI tools under 'enormous pressure' - union
Fleur Fitzsimons, national secretary for the Public Service Association, which represents many health and addiction service workers, said clinical staff were turning to AI tools because of the "enormous pressure" they were under.
A memo which opened by threatening formal disciplinary action was the wrong approach, she said.
"It's a warning shot that will make staff afraid to ask questions or seek help."
HNZ should be investing in proper training and approved tools, she said.
"Let's not forget that HNZ has been cutting the very teams responsible for digital systems and IT support. If staff are improvising with free tools, HNZ needs to examine why that is the case, not simply threatening staff with a breach of the Code of Conduct."


Copyright © 2026, Radio New Zealand



Summary

Health New Zealand (HNZ) has prohibited staff from using free artificial intelligence (AI) tools, after finding that clinicians had used platforms such as ChatGPT, Gemini and Claude to draft clinical notes. The directive, sent in a memo to Mental Health and Addiction Services staff in the Rotorua Lakes district, cites concerns about data security, patient privacy and accountability. The memo states that using these tools breaches established protocols and could trigger formal disciplinary action.

The prohibition sits within HNZ's broader AI policy, which requires all AI tools to be registered with the Health NZ National Artificial Intelligence and Algorithm Expert Advisory Group (NAIAEAG). Registered tools include Heidi, an AI scribe being rolled out across emergency departments. Sonny Taite, HNZ's director of digital innovation and AI, said free AI tools present risks to data security, privacy and accountability, and that any possible exemptions are assessed case by case.

Despite the prohibition on free tools, HNZ is actively deploying approved AI: the Heidi scribe was trialled in Hawke's Bay Hospital's emergency department before the government announced a rollout to all hospitals. Emergency doctors estimate Heidi saves up to ten minutes per patient by streamlining documentation. The extent of unauthorised AI use within HNZ remains unclear, however, and the organisation has not said whether anyone has been disciplined.

Public Service Association (PSA) national secretary Fleur Fitzsimons criticised the memo's approach, calling it a "warning shot" that will make staff afraid to ask questions or seek help. She argues that clinicians are turning to AI tools because of the enormous pressure they are under, and that HNZ should invest in proper training and approved tools rather than threaten discipline. Fitzsimons also notes that HNZ has been cutting the very teams responsible for digital systems and IT support, and says the agency should examine why staff are improvising with free tools rather than simply alleging breaches of the Code of Conduct.

The situation reflects a wider national conversation about integrating AI into healthcare while safeguarding data security and patient privacy. The government's rollout of an approved scribe such as Heidi, alongside the ban on free AI tools, shows HNZ attempting to balance innovation against regulatory requirements and risk.