The NHS Says You're Still Responsible (Even When AI Writes Your Notes)
Ali Vatan

NHS England has published guidance on AI scribes in healthcare. The message is clear: the technology can help, but accountability stays with you.
NHS England published guidance on AI-enabled ambient scribing products in April 2025, and if you haven’t read it yet, you should. These are the tools that listen to your clinical conversations and generate structured notes automatically, and the central message from NHS England is straightforward: the technology can help, but accountability stays with you. I think that’s entirely reasonable.
What the guidance covers
The NHS England guidance sets out a framework for responsible adoption of AI scribes across health and care settings. The key points:

- Clinicians remain accountable for their clinical records, full stop. If an AI scribe generates your notes, those notes are still your notes, and you're responsible for their accuracy, completeness, and clinical appropriateness.
- Ambient scribing products that use generative AI for summarisation, rather than simple transcription, are likely to qualify as medical devices and must meet the relevant regulatory standards.
- All NHS organisations must ensure any ambient voice technology meets specified NHS standards. Non-compliant solutions, including those obtained through free trials, are not permitted.
- Organisations must complete a clinical safety risk assessment and a Data Protection Impact Assessment (DPIA) before deployment.
- Patient data from clinical sessions should be automatically deleted unless legally or operationally required.
The National Chief Clinical Information Officer also issued a priority notification warning about implementations not meeting clinical safety standards. The message was unambiguous: get this right or don’t do it at all.
I think this is the right approach
Some people in the AI space have called this guidance overly cautious, but I disagree. Of course clinicians should be accountable for their records; that’s not a new principle, it’s a foundational one. The fact that a machine wrote the first draft doesn’t change who’s responsible for the final product. If you sign off on notes, they’re yours, and that was true when your dental nurse wrote them just as it’s true when an AI writes them.
The guidance doesn’t say you can’t use AI scribes. It says you need to use them responsibly, with proper governance, risk assessment, and human oversight, and that’s not anti-innovation. That’s good clinical practice.
The detail problem, and why training matters
AI-generated clinical notes are only as good as the model producing them, and not all models are equal. The risk with ambient scribing is that the AI misses something: a detail mentioned in passing, a subtle clinical finding, a piece of patient history that matters. If you sign off without catching the omission, that gap becomes part of the official record.
But certain large language models are very good at maintaining detail across long context windows, and if the model is designed to be comprehensive rather than concise, trained specifically not to leave details out, the risks can be effectively mitigated. When it's trained properly, this technology can produce notes that are not just adequate but thorough.
The problem isn’t AI scribes as a concept. The problem is poorly implemented ones: systems that prioritise brevity over completeness, that haven’t been validated in clinical settings, that haven’t been trained on the specific demands of healthcare documentation. The BDJ’s analysis raised an important point here, noting that clinicians are now in what some have termed a “liability sink” for AI tools. You’re accountable for the output, but you didn’t produce the output, and that’s an uncomfortable position that demands you actually understand what the tool is doing and verify its work.
Human in the loop, always
You can’t take responsibility for something you haven’t grasped. If an AI scribe generates notes and you rubber-stamp them without reading them properly, you haven’t fulfilled your duty of care; you’ve just automated your accountability away while keeping the liability. Being “in the loop” doesn’t mean glancing at the output and clicking approve. It means reading the notes critically, checking them against your clinical memory of the consultation, and correcting anything wrong or missing.
This is work, and it takes time, but it’s considerably less time than writing everything from scratch. Done properly, it can give you a better record than most clinicians produce manually, because the AI captures things you might have forgotten to document.
What this means for dental practices
For dentists specifically, AI scribes offer real potential. Clinical note-taking is one of the most time-consuming administrative tasks in practice, and anything that reduces that burden without compromising quality is welcome. But the NHS guidance applies to all health and care settings, and dental practices, whether NHS or private, need to take it seriously.
If you're using or evaluating an AI scribe:

- Check that the product meets NHS standards if you're an NHS practice; even private practices should use these standards as a benchmark.
- Read the notes every single time. The convenience of automation is not an excuse for skipping review.
- Ask your vendor how the AI handles clinical detail: does it summarise aggressively, does it preserve nuance, has it been validated in dental settings?
- Complete your clinical safety risk assessment and DPIA properly; they force you to think through the real risks.
- Keep patient consent front and centre, so patients know their consultation is being recorded and processed by AI.
My take
AI scribes are coming to dentistry and the efficiency gains are too significant to ignore, but the NHS guidance is a timely reminder that efficiency without accountability is dangerous. I think the clinicians who get this right will be the ones who treat AI scribes as powerful assistants rather than autonomous systems: reviewing the output, challenging it when something doesn’t look right, and never forgetting that their name is on those notes.
References
- NHS England. “Guidance on the use of AI-enabled ambient scribing products in health and care settings.” April 2025. england.nhs.uk
- NHS England. “AI-enabled ambient scribing products in health and care settings.” england.nhs.uk
- British Dental Journal. “NHS England guidance on AI scribes.” Nature, 2025. nature.com/articles/s41415-025-9061-0
- BDJ Team. “AI and record-keeping.” Nature, 2025. nature.com/articles/s41407-025-3071-2
- Chronicle Law. “NHS and AI Scribes.” January 2026. chroniclelaw.co.uk