Artificial Intelligence (AI) tools like ChatGPT are becoming increasingly popular in workplaces. They're fast, accessible, and appear to produce professional-sounding material at the click of a button. It's no surprise that both employers and employees are experimenting with AI for drafting employment letters, disciplinary responses, personal grievances, redundancy documents, and even for investigating workplace issues. It is also no surprise that an experienced HR expert on the receiving end can see right through these documents. It's a case of you don't know what you don't know: an AI-generated letter or other document might read well and appear to include everything needed to the untrained eye, but the gaps make it easily attackable by people like us.
In our experience, AI can get it wrong, and frequently does. When these tools are used without proper understanding or legal oversight, they create serious risk for employers, undermine fair process and good faith, and can quickly escalate disputes.
What’s the problem with using AI in an employment context?
- Inaccurate or Misleading Legal References
We see that AI often provides advice based on overseas legal frameworks, which do not apply in New Zealand; if you are not familiar with the legislation, this is easily missed. It frequently misrepresents the Employment Relations Act 2000, the Holidays Act 2003, and the Privacy Act 2020. Sometimes AI even fabricates legislation or quotes sections that do not exist (those examples are always fun). These mistakes jump out to those who work in this space, and in a dispute, such inaccuracies, along with key information unintentionally left out, undermine credibility. We have a few recent examples, but one sticks out most: AI advised an employer that they could "terminate immediately for gross misconduct without process". This is unlawful in New Zealand, and we were certainly glad they contacted us before taking any steps that would have left them in hot water and in breach of their obligations.
- Lack of Context and Human Judgment
Employment matters are rarely black and white; as employment experts, we like to think we operate in the grey (we are dealing with humans, after all). AI cannot interpret a company's history and culture, interpersonal conflict, previous warnings, cultural considerations, or the balance of fairness required under s 4 of the Employment Relations Act 2000 (the duty of good faith).
- Poor Quality HR Documents
AI-generated letters and reports may look professional, but often miss critical legal elements. The latest examples to come across our desk include:
* Disciplinary letters missing the evidence, the allegations, or even the potential breaches.
* Restructure consultation letters with no genuine business justification and no real understanding of consultation.
* Investigation reports lacking analysis or a clear findings framework.
These types of documents are easily challenged, which can be a very emotionally and financially taxing mistake from both an employee and an employer position. Never forget that either party can be awarded costs against the other. Are we prepared to see more of this? Yes. Earlier this year, in a case before the Authority, LMN vs STC, we understand the employee was self-representing (with the assistance of generative AI) and cited a fictitious case: that's right, completely made up. That faux pas did not work in the applicant's favour.
- Privacy and Confidentiality Risks
We don't claim to be AI geniuses at PeopleHQ, but what we do know is that uploading employee information, payroll records, or investigation notes into AI platforms can create privacy risks. Many AI tools store and reuse data for machine learning, meaning sensitive information may be retained outside New Zealand.
Our thoughts on how AI can be used safely, and when it can’t
We would suggest using AI only as a support tool. We recommend:
Safe uses:
- Brainstorming ideas or tone
- Basic formatting (be careful of confidentiality obligations though)
- Rewriting for clarity
Unsafe uses:
- Drafting restructure or termination letters
- Managing disciplinaries or investigations
- Responding to personal grievances
- Giving legal advice
If you're unsure whether your documentation or process meets legal and/or procedural standards, it probably doesn't. Seek support before acting.
PeopleHQ is here to help.