AI Just Outperformed Periodontists (But That's Not the Real Story)

By Ali Vatan

A major study shows AI surpassing specialists in detecting periodontitis. But detection was never the real problem.
A multicentre study in npj Digital Medicine reports that an AI model called HC-Net+ can outperform dental specialists, including periodontists, in detecting Stage II to IV periodontitis on radiographs. The model showed superior accuracy and stability compared with clinicians of varying experience levels, and the authors suggest it could function as a stand-alone diagnostic tool.
I read this and my first reaction was not “we’ve solved periodontics.” It was: can it make my patients actually turn up for their three-monthly hygiene visits?
Detection was never the bottleneck
Any competent clinician can diagnose periodontitis. You probe, you measure, you look at the radiographs, you assess clinical attachment loss. The Basic Periodontal Examination (BPE) that every GDP in the UK performs at every check-up is specifically designed to flag periodontal disease, and good clinicians catch perio. They always have.
The problem is what happens after detection. Periodontal disease affects over one billion people globally, according to the WHO. Teeth with advanced periodontitis are lost not because we fail to diagnose, but because monitoring is inconsistent, patient compliance is unreliable, and managing a chronic inflammatory condition over years is exhausting for everyone involved. The patient doesn’t floss, they miss their maintenance appointments, the disease progresses. I’ve seen it more times than I can count.
Where AI does help: consistency
I want to be fair here, because there is a real clinical issue AI can address. Not every clinician is equally thorough, not every practice screens with the same rigour, and missed diagnoses are more common than any of us would like to admit, particularly in busy NHS practices where time pressures are relentless.
The HC-Net+ study matters because it demonstrates consistent, reproducible diagnostic performance, something human clinicians by definition cannot guarantee. An AI doesn't get tired on a Friday afternoon. It doesn't rush through a full-mouth series because the next patient is already waiting. It analyses every radiograph with the same attention, every time, and that consistency has real value as a safety net. If AI catches the early bone loss a stressed GDP misses, or the progression that goes unnoticed between annual radiographs, it is doing something useful.
More detection means more referrals, not fewer
If AI diagnoses periodontitis better than specialists, and GDPs already treat the majority of periodontal disease, what’s left for the periodontist? The answer is actually reassuring. If AI catches more cases at an earlier stage, that means more referrals, not fewer. Complex surgical cases, regenerative procedures, management of patients with systemic comorbidities: none of that gets replaced by an algorithm reading radiographs.
What changes is the funnel. More cases detected means more patients entering the treatment pathway, and AI-assisted diagnosis also means better documentation, more standardised staging, and clearer communication between referrer and specialist. I see that as a net positive.
The bigger question
AI can think faster than us. It analyses more data, stores more information, and performs pattern recognition at a scale no human brain can match. This study is one data point in a much larger trend: AI outperforming radiologists, dermatologists, pathologists, and now periodontists. The next wave, I believe, will be robotics, and Perceptive has already completed the first fully automated dental procedure on a human. The trajectory is clear.
So where does that leave the human clinician? It leaves us doing what machines cannot. Building relationships, earning trust, sitting with a nervous patient and explaining why they need treatment, holding the hand of someone who hasn’t seen a dentist in ten years because they’re terrified. AI can diagnose periodontitis better than a specialist, fine, but it cannot look a patient in the eye and help them understand why their oral health matters, why they need to change their habits, why showing up for that maintenance appointment could save their teeth. That’s our job, and I don’t see that changing.
What I take from this
The HC-Net+ research is good science, demonstrating that AI has reached diagnostic capability matching and exceeding specialist performance in a specific, well-defined task. But periodontitis is a chronic disease. Diagnosing it is step one, and managing it over years, through setbacks, through patient non-compliance, through the realities of how people actually live: that’s the hard part. Until AI can solve the human element, we’ll still be needed.
References
- npj Digital Medicine. "A novel AI-powered radiographic analysis surpasses specialists in stage II-IV periodontitis detection: a multicenter diagnostic study." 2025. https://www.nature.com/articles/s41746-025-02077-0
- Frontiers in Medical Technology. "Artificial intelligence-powered innovations in periodontal diagnosis: a new era in dental healthcare." 2024. https://www.frontiersin.org/journals/medical-technology/articles/10.3389/fmedt.2024.1469852/full
- World Health Organisation. "Oral health fact sheet." https://www.who.int/news-room/fact-sheets/detail/oral-health
- Global Burden of Disease Study 2021. "Global, regional, and national burden of periodontal diseases from 1990 to 2021 and predictions to 2040." https://pmc.ncbi.nlm.nih.gov/articles/PMC12332980/