
Using artificial intelligence to draft fiduciary documents comes with risks, by Appleton Managing Director, Lauren Hean
Warwick has chosen Appleton to provide Will and estate administration services to its clients. This month, Appleton Managing Director, Lauren Hean, discusses the dangers of using AI to draft Wills.
At Appleton, we increasingly come across questions regarding the use of artificial intelligence in document drafting. In this excellent article, FISA Member and Councillor in Cape Town, Chris Murphy, pinpoints some areas of concern.
Fiduciary practitioners who use AI without fully understanding the legal implications of their work may expose clients to legal, financial, and ethical dangers. Here is why using AI in estate planning without proper legal oversight is highly risky.
Lack of customisation and context awareness
Estate planning is not a one-size-fits-all process. Each client’s situation is unique, involving complex financial, familial and personal considerations. AI-generated documents often follow generic templates that fail to account for nuanced client needs. Without proper legal expertise, fiduciary practitioners may not recognise when an AI-generated document is insufficient or requires customisation.
For example, blended families, minor beneficiaries, business ownership, special needs dependants, and tax implications all require careful planning. A generic AI-generated Will may not include provisions for a disabled beneficiary’s trust or may fail to address estate tax strategies that could save the estate significant sums. Without the ability to analyse and adapt AI-generated content, practitioners risk providing clients with inadequate or problematic documents.
Ambiguities and drafting errors
Drafting documents in this field requires precision to avoid ambiguities and misinterpretations. AI tools, while capable of generating legally sound language, lack the human judgment needed to identify inconsistencies or potential points of dispute. Estate planning documents must be clear and precise to prevent litigation and ensure that a client’s wishes are carried out accurately.
For example, if an AI-generated Will states, ‘I leave my estate equally to my children,’ but does not define what happens if one child predeceases the testator, this omission could lead to legal disputes among surviving heirs. A well-drafted Will would include contingency clauses, specifying whether the deceased child’s share passes to their descendants or is redistributed among surviving children.
Furthermore, AI may generate contradictory clauses within a single document. Without legal expertise, practitioners using AI may not detect such errors, leading to confusion and potential court intervention.
Failure to address tax implications
Effective estate planning requires careful consideration of tax law, including estate duty (death duty) and capital gains tax. AI tools may not fully understand the tax implications of asset distribution, leading to unintended tax burdens for heirs.
For example, an AI-generated trust deed might not consider the impact of capital gains tax on asset transfer or fail to include provisions that minimise estate tax liability. A formally trained tax professional would assess these factors and implement tax-saving strategies, such as setting up charitable trusts or utilising tax exemptions. Fiduciary practitioners without this expertise using AI may unknowingly create estate plans that expose clients to unnecessary taxation.
Ethical and professional liability concerns
Fiduciary practitioners are held to high ethical and professional standards. Clients trust them to act in their best interests and provide accurate, legally sound advice. Relying on AI without proper legal oversight could be considered negligence, particularly if the resulting documents lead to legal disputes or financial losses.
For example, if an AI-generated estate planning report contains incorrect legal advice, the fiduciary practitioner could face liability for professional misconduct. Clients who suffer financial harm due to faulty documents may seek legal action against the practitioner, leading to reputational damage and potential disciplinary consequences.
Additionally, AI lacks the ability to apply ethical considerations to estate planning. Certain decisions – such as disinheriting a family member or structuring a trust to prevent financial abuse – require human judgment and professional guidance. AI-generated documents cannot replace the ethical responsibility of fiduciary practitioners to advise clients appropriately.
Risk of data security and confidentiality breaches
Estate planning involves highly sensitive personal and financial information. When using AI tools, practitioners must consider data security risks. Many AI platforms store or process data in cloud-based environments, raising concerns about confidentiality and compliance with data protection laws.
If an AI tool is not compliant with privacy regulations (e.g., Protection of Personal Information Act 4 of 2013), clients’ sensitive information may be exposed to security breaches or unauthorised access. Fiduciary practitioners must ensure that any AI tool they use adheres to strict confidentiality standards to protect client data.
Conclusion
Fiduciary practitioners who rely on AI without a deep understanding of estate law, recent court cases, tax implications, and ethical considerations put their clients at significant risk. AI-generated documents may be invalid, outdated, ambiguous, or legally deficient, leading to disputes, tax liabilities, and unintended asset distributions.
To protect clients and uphold professional integrity, estate planning should always involve human legal review. AI can assist with drafting and automation, but final documents must be scrutinised, customised, and approved by a qualified legal professional. By maintaining a balance between technological advancement and professional oversight, fiduciary practitioners can ensure that estate planning remains accurate, compliant, and in the best interests of their clients.
Chris Murphy FPSA® TEP BProc (Unisa) is a FISA member and FISA Councillor in Cape Town. This article was first published in De Rebus in 2025 (April) DR 7.
Disclaimer: The information, opinions and recommendations contained herein are and must be construed solely as statements of opinion and not statements of fact. No warranty, expressed or implied, as to the accuracy, timeliness, completeness, merchantability or fitness for any particular purpose of any such recommendation or information is given or made by Warwick Wealth (Pty) Ltd in any form or manner whatsoever. Each recommendation or opinion must be weighed solely as one factor in any investment or other decision made by or on behalf of any user of the information contained herein, and such user must accordingly make its own study and evaluation of each strategy/security that it may consider purchasing, holding or selling and should approach its own financial advisers to assist the user in reaching any decision. This document is for information only and does not constitute advice or a solicitation for funds. Investors should note that the value of an investment is dependent on numerous factors, which may include, but are not limited to, share price fluctuations, interest and exchange rates and other economic factors. Performance is further affected by uncertainties such as changes in government policy, taxation and other legal or regulatory developments. Past performance provides no guarantee of future performance.
Warwick Wealth (Pty) Ltd (Registration number 2012/223370/07). An authorised financial services provider (FSP 44731)