AI in notetaking: Legal and ethical considerations for health practitioners

07 November 2025

The use of artificial intelligence (AI) is rapidly expanding across the healthcare sector, offering new ways to streamline administrative tasks such as clinical notetaking and record-keeping. While AI can enhance efficiency and accuracy, it also introduces legal and ethical risks that health practitioners must carefully manage.

A version of this article was originally published in the September 2025 edition of the Australian Podiatry Association’s Stride Magazine.

AI and law: A recent case

The use of AI tools, and their potential pitfalls, is an emerging area of law. While there have been few court cases concerning health professionals' use of AI, cases are beginning to emerge in the legal profession.

The recent matter of Murray on behalf of the Wamba Wemba Native Title Claim Group v State of Victoria [2025] FCA 731 involved a legal practitioner who provided an inaccurate AI-generated document to the Federal Court. The Court held that material generated by practitioners using AI is still subject to the same professional obligations – as if the practitioner had created the document.

We anticipate that courts will take a similar position in relation to health practitioners - that is, the practitioner is ultimately responsible for the accuracy of any health records generated using AI.

Professional obligations

The Shared Code of Conduct imposes several obligations on health practitioners to maintain clear and accurate health records so that patients receive good care. These obligations are also reflected in the respective codes and guidelines applicable to practitioners not subject to the Shared Code of Conduct.

Appropriate records are written clearly and concisely, and are organised and well-structured. The information they contain should be sufficient to enable a different practitioner to assume the patient’s care.

Practitioner notes should include:

  • any relevant clinical history
  • clinical findings and investigations
  • information given to the patient, and
  • the practitioner’s management of the patient.

AI usage and potential risk

The use of AI for record keeping can offer many benefits, including improved accuracy, efficiency, organisation, accessibility, and cost savings. However, there are risks and limitations which health practitioners should consider.

These include, but are not limited to:

  • the implementation of the AI software
  • informed consent
  • patient privacy, and
  • the accuracy of the information generated.

Implementation of AI

The use of AI requires proper software integration into the practitioner’s and/or clinic’s system.

Some essential integration elements include:

  • Training staff in use of the AI system: Training should ensure staff understand how the AI system works and how to use it, while still meeting their professional obligations.
  • Policies and procedures for the use of AI: Policies should provide guidelines to ensure that software is used appropriately, that only approved software is used, and to clarify what requirements must be met to ensure safe use.

Informed consent

When using AI, practitioners are required to inform their patients about its use and gain their consent, and that consent must be documented in the patient’s records.

To ensure a practitioner has gained informed consent, they should:

  • inform their patient about the use of AI in the practice and during the consultation,
  • provide sufficient information to allow their patient to make an informed decision about its use in their consultation. At a minimum, this should include details of which AI tool is being used, what it does, and where the information it produces will be collected and stored,
  • record their consent, and
  • gain consent from each patient at every consultation where the AI tool will be used.

Where a patient does not consent to the use of AI to record their information, the practitioner is required to manually record patient information.

Health practitioners are encouraged to review their informed consent obligations under the Shared Code of Conduct and the Australian Health Practitioner Regulation Agency resources, with respect to meeting their professional obligations when using AI in healthcare.

Patient privacy

Practitioners are also required to understand whether the use of AI in their practice may result in their patients’ personal and health information being stored or used by the AI tool provider, and to otherwise ensure their use of AI complies with privacy legislation.

Practitioners are encouraged to carefully review the terms of use for AI tools, and consider their professional and legal obligations regarding patient information. This may require revision to clinic privacy policies, consent processes, and other procedures.

The Office of the Australian Information Commissioner has provided a guide to health privacy which contains useful information regarding compliance with privacy obligations.

Reviewing AI notes

To continue meeting their professional obligations, practitioners remain responsible for reviewing the records produced by any AI software they use and confirming that the contents are accurate. Practitioners should only sign or approve AI-generated records once they have reviewed them and verified them as complete and accurate.

Practitioners must keep in mind that patients have the right to access and correct their AI-generated health records as they do with any manually generated records.

Notetaking and insurance

Accurate and comprehensive clinical notes are not only essential for continuity of care; they also play a critical role in defending against professional negligence claims. In the event of a complaint or legal action, a practitioner’s records may be relied upon as evidence of the care provided, often long after the consultation took place.

Implications for you

As AI tools become more integrated into clinical practice, health practitioners must remain vigilant about their legal and professional responsibilities. Courts are likely to treat AI-generated records as if they were created by the practitioner themselves, meaning the same standards of accuracy, consent, and privacy apply.

