2. Confidentiality and suppression obligations must be upheld. Do not enter into an AI chatbot any information that is not already in the public domain, and in particular do not enter private information that is subject to suppression. Significant details are suppressed under the Family Court Act. Take advice from a lawyer or other competent professional about what details are suppressed and what risks you run in using AI.
3. The court may ask you to disclose whether you have been using AI.
4. You need to be aware of any ethical issues, including any biases, in the generative AI chatbot.
5. You need to check that any information generated is accurate. Reported cases in the Family Court, both in New Zealand and overseas, have shown that generative AI chatbots make up fake cases or refer to articles or texts that do not exist. Relying on fabricated material not only undermines any legal argument you may have but runs the risk that your case is struck out, or that an adverse costs award is made against you.
Just recently, Microsoft Copilot’s terms of use have been in the spotlight after users identified terms that said that Copilot was “for entertainment purposes only” and: “It can make mistakes and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk”. Microsoft is now updating its terms, but this highlights how fluid the situation is.
My main comment is that the Family Court focuses on the relevant facts of each case. The most important thing in the Family Court is the chronology of key issues in each case, which you will already have without needing to use AI.
We have found that generative AI may help to summarise a chronology or other features of a case. But be aware of its limitations: it provides only some initial tools and insights, and is just one part of preparing your case.
Generative AI can be very useful in testing your thinking. Although it can be sycophantic and overly willing to please, if you ask it to assess the strengths and weaknesses of your position, and of the other party’s, it may help you to more objectively assess your position without emotion.
You can also ask it to help you to summarise and understand information that has been provided by other parties.
It may then help you to determine your strategy based on realistic options – some of which you may not have thought of yourself – and clearly structured logic, which could assist you, the court and the other party in achieving a better overall outcome, more quickly and at lower cost. Generative AI may help to point you in the right direction and open the door to opportunities for further research.
When making written submissions, some people use it to refine their drafts, make points more clearly and succinctly, and improve spelling and grammar – all of which can strengthen your argument and the readability of any written submissions.
Summary
The parties that succeed in a case are those who have prepared well and have all the essential background information and documents in order.
A generative AI chatbot cannot always distinguish fact from opinion, or reliably identify the source of its information. Check any information it provides against another source to confirm it is correct.
Generative AI can be a really useful tool in the toolbox, but it should not be the only tool. Use it as a starting point and a springboard for research, and supplement it with human judgment and reliable legal research.