AI won't replace radiologists just yet; human radiologists will be in greater demand than ever before

In radiology, AI has been able to detect pneumonia more accurately than humans since 2017, and it has long been expected to replace radiologists. In reality, however, human radiologists are still very much needed, as Deena Mousa, who works for a medical charity, explains.
AI isn't replacing radiologists
CheXNet, an AI model that debuted in 2017, takes a chest X-ray image as input and estimates the likelihood of pneumonia, and it is widely known for assessing the condition more accurately and quickly than human radiologists.
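As a rough illustration of how such a classifier is wired up, the sketch below follows the setup described for CheXNet (a DenseNet-121 backbone with one sigmoid output per finding). The weights, label ordering, image path, and `pneumonia_index` are placeholders for illustration, not the published model.

```python
# A minimal sketch of CheXNet-style inference, assuming a DenseNet-121 backbone
# with one sigmoid output per finding. Weights, label order, and the image path
# are placeholders, not the published model.
import torch
import torch.nn as nn
from PIL import Image
from torchvision import models, transforms

def build_model(num_findings: int = 14) -> nn.Module:
    model = models.densenet121(weights=None)                   # untrained backbone, for illustration only
    model.classifier = nn.Linear(model.classifier.in_features, num_findings)
    return model

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

def pneumonia_probability(model: nn.Module, image_path: str, pneumonia_index: int = 0) -> float:
    """Estimate pneumonia probability for one chest X-ray.

    pneumonia_index is a placeholder; the real position depends on the
    label order used when the model was trained.
    """
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)                      # shape (1, 3, 224, 224)
    model.eval()
    with torch.no_grad():
        probs = torch.sigmoid(model(batch))[0]                  # independent probability per finding
    return probs[pneumonia_index].item()
```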
Since CheXNet appeared, many similar models have been developed, including models that can detect hundreds of diseases from a single scan, models that prioritize radiologists' worklists, and models approved to operate without a doctor reviewing the images at all. The US Food and Drug Administration alone has approved more than 700 such models.
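One of those use cases, worklist prioritization, amounts to reordering the reading queue by a model's urgency score. The toy sketch below shows the idea; the `Study` fields, the scores, and the `prioritize` function are hypothetical stand-ins, not any vendor's actual interface.

```python
# Toy illustration of AI-based worklist prioritization: studies with a higher
# model-estimated urgency are read first. All values are invented.
from dataclasses import dataclass

@dataclass
class Study:
    accession: str       # study identifier
    arrival_order: int    # position in the default first-come-first-served queue
    ai_urgency: float     # hypothetical model output in [0, 1]

def prioritize(worklist: list[Study]) -> list[Study]:
    # Sort by descending AI urgency; ties fall back to arrival order.
    return sorted(worklist, key=lambda s: (-s.ai_urgency, s.arrival_order))

worklist = [
    Study("A-001", 1, 0.05),
    Study("A-002", 2, 0.91),   # suspected critical finding jumps the queue
    Study("A-003", 3, 0.40),
]
for study in prioritize(worklist):
    print(study.accession, study.ai_urgency)
```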
However, even though pattern recognition is AI's greatest strength, AI is not being fully utilized in radiology, and human intervention is still required. The field also faces a labor shortage, and as of 2025, radiology is expected to be the second highest-paying medical specialty in the United States.

Mousa explains that three factors lie behind this.
First, while AI models can outperform humans on benchmark tests, they do not perform at the same level in real-world clinical settings. Most tools can diagnose abnormalities that appear frequently in their training data, but their accuracy drops for abnormalities that appear only rarely in that data.
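A simple way to check this effect on one's own test set is to compute sensitivity per finding and set it beside how often that finding appeared in training. The sketch below does that; all arrays and prevalence figures are invented for illustration.

```python
# Compare per-finding sensitivity (true positive rate) against training
# prevalence. The labels, predictions, and prevalence figures are made up.
import numpy as np

def sensitivity(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Fraction of truly positive cases that the model flags."""
    positives = y_true == 1
    return float((y_pred[positives] == 1).mean()) if positives.any() else float("nan")

# finding: (ground truth, model prediction, fraction of training studies with the finding)
test_results = {
    "effusion":     (np.array([1, 1, 1, 1, 0, 0, 0, 0]), np.array([1, 1, 1, 0, 0, 0, 0, 0]), 0.12),
    "rare_finding": (np.array([1, 1, 1, 1, 0, 0, 0, 0]), np.array([1, 0, 0, 0, 0, 1, 0, 0]), 0.001),
}

for finding, (y_true, y_pred, train_prev) in test_results.items():
    print(f"{finding:12s} train prevalence={train_prev:.3f}  sensitivity={sensitivity(y_true, y_pred):.2f}")
```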
Second, there are legal barriers to expanding the use of AI in clinical practice. No matter how powerful an AI is, its output can be wrong, and AI has the further drawback of being largely unable to verify for itself whether its output is wrong.
As a result, regulators and health insurers approve AI that operates with human oversight, but they are reluctant to approve, or pay for, fully autonomous radiology diagnostic models. And no matter how fast an AI reads an image, a human still has to check the result, so the technology may ultimately yield no time savings.
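As a back-of-the-envelope illustration of that last point, the sketch below compares total reading time with and without an AI pre-read under invented per-study timings; if verification takes nearly as long as an unassisted read, the saving largely disappears.

```python
# Invented numbers, purely to illustrate why mandatory human review can erase
# the speed advantage of an AI pre-read.
STUDIES_PER_DAY = 80
MINUTES_UNASSISTED = 4.0       # hypothetical time to read one study from scratch
MINUTES_TO_VERIFY_AI = 3.5     # hypothetical time to check an AI pre-read of the same study

without_ai = STUDIES_PER_DAY * MINUTES_UNASSISTED
with_ai = STUDIES_PER_DAY * MINUTES_TO_VERIFY_AI
print(f"radiologist minutes/day: {without_ai:.0f} unassisted vs {with_ai:.0f} with AI pre-reads")
# The saving hinges entirely on how much faster verification is than a full
# read, which in practice can be small.
```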
Third, even if it could accurately diagnose, AI would only be able to replace a small portion of a radiologist's work, as human radiologists spend very little time on diagnosis and most of their time on other activities, such as interacting with patients and fellow clinicians.

Those are the main reasons, but there are others as well.
One is AI's limited versatility. Many models used in radiology can detect only a single finding or pathology, so reviewing a study for multiple conditions means switching between dozens of models and putting the right question to each one. In actual clinical settings, many models are restricted to specific use cases, and some conditions cannot be adequately trained for because there is too little data.
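The workflow cost of that narrowness can be sketched as follows: each model answers one question, so fully reading one scan means dispatching it to many separate models and collating the results. The registry, the `fake_model` helper, and the threshold below are hypothetical stand-ins, not a real product API.

```python
# Sketch of the overhead of orchestrating many single-finding models for one scan.
from typing import Callable, Dict
import numpy as np

# Hypothetical single-finding models: each maps pixel data to one probability.
SingleFindingModel = Callable[[np.ndarray], float]

def fake_model(score: float) -> SingleFindingModel:
    # Stand-in for a vendor model; a real one would run a trained network.
    return lambda pixels: score

registry: Dict[str, SingleFindingModel] = {
    "pneumothorax": fake_model(0.02),
    "rib_fracture": fake_model(0.71),
    "nodule": fake_model(0.15),
    # ...in practice this can run to dozens of separate, separately approved models
}

def read_study(pixels: np.ndarray, threshold: float = 0.5) -> Dict[str, float]:
    """Run every single-finding model over the same scan and flag positives."""
    scores = {finding: model(pixels) for finding, model in registry.items()}
    return {finding: s for finding, s in scores.items() if s >= threshold}

scan = np.zeros((224, 224))    # placeholder pixel data
print(read_study(scan))        # -> {'rib_fracture': 0.71}
```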
Mousa notes that 'these hurdles could be overcome with stronger evidence and improved performance,' but other requirements hinder widespread adoption. For example, retraining a model requires reapproval, even if the previous version was already approved by regulators. In addition, medical malpractice claims are so costly that few insurers are willing to cover a technology that is still relatively unproven, which is another barrier to adoption by medical institutions.
'The promise of AI in radiology is probably overestimated when judged on benchmark results alone,' says Mousa. 'AI can certainly improve productivity, but using it still takes human effort. For now, the paradox is that the more powerful the machines become, the busier radiologists get.'
in Software, Posted by log1p_kr