
Can AI Fix a Blurry X-Ray?

Dr Ali Vatan

Overjet just got FDA clearance for AI that enhances dental radiographs in real time. It sounds brilliant — but I have concerns.


Overjet has received FDA 510(k) clearance for IRIS, an AI system that enhances dental radiographs in real time by sharpening blurry images, reducing noise, and improving clarity. It’s the first and only dental AI platform cleared by the FDA for X-ray enhancement. They were also named one of TIME’s Best Inventions of 2024 (TIME, 2024).

Impressive credentials. But I have genuine concerns about this, and I think they’re worth airing.

What IRIS does

To be fair to the technology: IRIS uses a proprietary machine-learning architecture to remove visual noise from dental radiographs while, according to Overjet, preserving important clinical details such as the subtle markers of decay. In their clinical study, dentists rated AI-enhanced images roughly 25 per cent higher in quality than the originals (Overjet, 2024).

I can see the appeal. We’ve all taken radiographs that aren’t quite what we wanted, whether the patient moved, the sensor wasn’t positioned perfectly, or the exposure was off. An AI that can rescue a suboptimal image rather than requiring a retake is genuinely attractive, especially when you’re trying to minimise radiation exposure.

Overjet has built an impressive portfolio overall: nine FDA-cleared modules covering caries detection, calculus detection, periapical radiolucency identification, and automated charting. Their paediatric product was what TIME recognised, and in clinical trials 100 per cent of dentists were more accurate in detecting cavities when assisted by the AI (Overjet, 2024).

So why am I sceptical?

AI enhancement is prediction, not revelation

When an AI “enhances” an image, it’s not simply making what’s there clearer. It’s making predictions about what should be there based on patterns learned from training data. That’s a fundamentally different thing. The AI cannot add patient-specific information that wasn’t captured by the sensor. What it can do is create images that look better (sharper, smoother, more detailed) while potentially introducing subtle visual elements that weren’t in the original (European Medical Journal, 2024).
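Overjet hasn't published IRIS's internals, so as a purely hypothetical illustration of the general point: even a classical, non-learned sharpening filter (unsharp masking, sketched below on a one-dimensional "edge" between two tissue densities) produces pixel values that were never in the captured signal. A learned enhancer operates on the same principle, only its invented detail is shaped by training data rather than by a fixed kernel, which makes it far harder to spot.

```python
import numpy as np

# A 1-D stand-in for a radiograph: a clean step edge between two densities.
x = np.zeros(64)
x[32:] = 1.0

# Simulate a slightly blurry capture with a moving-average blur.
kernel = np.ones(5) / 5
blurry = np.convolve(x, kernel, mode="same")

# Classical "enhancement": unsharp masking. Even this simple filter
# invents values (overshoot/ringing near the edge) that were never
# in the captured signal.
smoothed = np.convolve(blurry, kernel, mode="same")
enhanced = blurry + 2.0 * (blurry - smoothed)

print(blurry.min(), blurry.max())      # capture stays within [0, 1]
print(enhanced.min(), enhanced.max())  # "enhanced" overshoots both ends
```

The enhanced signal looks crisper, yet its brightest and darkest values exist nowhere in the capture. That is the structural worry with any enhancement step: sharpness is manufactured, not recovered.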

The radiology literature has been warning about this for years. AI-based reconstructions may introduce artefacts, distortions, and what researchers call “hallucinations” into medical images. One study documented false positive findings of cartilage defects in MRI images reconstructed using neural networks; the defects looked real but weren’t present in the unenhanced originals (AJR, 2024).

I’m not saying Overjet’s system necessarily does this. Their FDA clearance process would have required them to demonstrate clinical safety. But the fundamental concern remains: when you look at an enhanced image, you’re not looking at raw clinical data anymore. You’re looking at an AI’s interpretation. That distinction matters enormously in clinical decision-making.

Good technique already gets us most of the way

A properly positioned digital sensor with correct exposure settings produces a diagnostic-quality image the vast majority of the time. If a clinician is consistently producing blurry or noisy radiographs, the solution isn’t an AI filter; it’s better technique, better equipment, or better training.

I worry that tools like IRIS could create a false sense of security. If clinicians know AI will “fix” their suboptimal images, does that reduce the incentive to get it right in the first place? Does it subtly lower the standard of what we accept as adequate image capture? I hope not, but the risk is real.

Fame is not safety

Overjet has nine FDA-cleared modules and a TIME Best Inventions award. That’s impressive, and I respect it. But a prestigious award is not a substitute for rigorous, independent clinical validation in real-world dental settings.

The fact that dentists rated enhanced images as higher quality doesn’t tell us whether those images led to better clinical outcomes. Looking better and being more diagnostically useful are not the same thing.

I want to see long-term studies. I want data on whether AI-enhanced radiographs lead to different diagnostic conclusions compared to originals, and whether those different conclusions are correct. I want false positive and false negative rates in enhanced versus standard radiographs. Until that evidence exists, caution is appropriate.

Overjet’s diagnostic tools are a different story

I’m not dismissing Overjet as a company. Their broader AI diagnostic tools have genuine clinical value. The caries detection modules, the automated charting: these flag things for human review rather than altering the underlying clinical data. That’s a meaningful distinction.

Image enhancement is different. It changes what you’re looking at. When that image is the basis for deciding whether to drill, extract, or refer, the stakes are too high for uncritical adoption.

If your radiograph isn’t diagnostic quality, take another one. Don’t let an algorithm convince you that a blurry image has become reliable. The patient deserves better than that.
