Examples of correctly and incorrectly prioritized radiographs. (a) Radiograph was reported as showing large right pleural effusion (arrow). This was correctly prioritized as urgent. (b) Radiograph reported as showing “lucency at the left apex suspicious for pneumothorax.” This was prioritized as normal. On review by three independent radiologists, the radiograph was unanimously considered to be normal. (c) Radiograph reported as showing consolidation projected behind heart (arrow). The finding was missed by the artificial intelligence system, and the study was incorrectly prioritized as normal. (Credit: RSNA)

An artificial intelligence (AI) system can interpret and prioritize abnormal chest x-rays with critical findings, potentially reducing the backlog of exams and bringing urgently needed care to patients more quickly, according to a recent study. The AI system distinguished abnormal from normal chest x-rays with high accuracy.

In the UK, there are an estimated 330,000 x-rays at any given time that have been waiting more than 30 days for a report. Deep learning has been proposed as an automated means to reduce this backlog and identify exams that merit immediate attention, particularly in publicly funded health care systems.

For the study, researchers used 470,388 adult chest x-rays to develop an AI system that could identify key findings. The images had been stripped of any identifying information to protect patient privacy. Because training the system required, for each x-ray, a list of labels indicating which specific abnormalities were visible on the image, the accompanying free-text radiology reports were preprocessed with an in-house natural language processing (NLP) system to extract those labels.
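The article does not describe the labelling pipeline itself, so the snippet below is only a rough sketch of what a report-to-label step can look like: simple keyword matching with a crude negation check. The finding vocabulary and the `extract_labels` helper are hypothetical illustrations, not the researchers' actual NLP system, which would handle negation, uncertainty and synonyms far more thoroughly.

```python
import re

# Hypothetical finding vocabulary for illustration; the real system
# recognised many more abnormalities.
FINDING_PATTERNS = {
    "pleural_effusion": r"\bpleural effusion\b",
    "pneumothorax":     r"\bpneumothorax\b",
    "consolidation":    r"\bconsolidation\b",
    "cardiomegaly":     r"\bcardiomegaly\b",
}

def extract_labels(report_text: str) -> list[str]:
    """Return abnormality labels mentioned in a free-text report.

    A minimal sketch: case-insensitive keyword matching plus a crude
    "no <finding>" negation check, standing in for the study's
    in-house NLP system.
    """
    text = report_text.lower()
    labels = []
    for label, pattern in FINDING_PATTERNS.items():
        match = re.search(pattern, text)
        if match and ("no " + match.group(0)) not in text:
            labels.append(label)
    return labels

print(extract_labels("Large right pleural effusion. No pneumothorax."))
# ['pleural_effusion']
```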

The NLP system analyzed each radiology report to assign the corresponding image a priority of critical, urgent, non-urgent or normal. A computer vision model was then trained on the labeled x-rays to predict that clinical priority from the image appearance alone. The researchers tested the system's prioritization performance in a simulation using an independent set of 15,887 images.
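The article names the four priority classes but not the rule that assigns them. As a hedged illustration only, per-finding labels could be collapsed into a single priority per exam with a lookup table like the one below; the finding-to-priority mapping here is invented, not the study's clinical rules, which were defined by radiologists.

```python
# Hypothetical finding-to-priority table; illustration only.
PRIORITY_OF_FINDING = {
    "pneumothorax":     "critical",
    "pleural_effusion": "urgent",
    "consolidation":    "urgent",
    "cardiomegaly":     "non-urgent",
}

# Order used to pick the most severe class present on an exam.
SEVERITY = ["critical", "urgent", "non-urgent", "normal"]

def exam_priority(labels: list[str]) -> str:
    """Collapse per-finding labels into one priority class per exam."""
    priorities = {PRIORITY_OF_FINDING.get(l, "non-urgent") for l in labels}
    for level in SEVERITY:
        if level in priorities:
            return level
    return "normal"

print(exam_priority(["pleural_effusion", "consolidation"]))  # urgent
print(exam_priority([]))                                     # normal
```

The vision model would then be trained on image-priority pairs produced this way; the article does not detail the architecture used.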

In testing, the AI system distinguished abnormal from normal chest x-rays with high accuracy. In the simulation, critical findings received an expert radiologist opinion in an average of 2.7 days with the AI approach, significantly sooner than the 11.2-day average seen in actual practice.
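The 2.7-day versus 11.2-day figures come from the study's own simulation. Purely to illustrate the general idea of such a comparison, the toy model below reorders a backlog by AI-predicted priority and measures how long critical exams wait, using made-up prevalence and reporting rates; it is not the researchers' simulation code.

```python
import random

SEVERITY_RANK = {"critical": 0, "urgent": 1, "non-urgent": 2, "normal": 3}

def simulate(n_exams=10_000, reports_per_day=400, prioritized=True, seed=0):
    """Average reporting delay (days) for critical exams.

    Toy model: all exams sit in a backlog and radiologists clear a fixed
    number per day, either first-come-first-served or in order of
    AI-predicted priority. All rates are invented for illustration.
    """
    rng = random.Random(seed)
    classes = ["critical", "urgent", "non-urgent", "normal"]
    weights = [0.05, 0.15, 0.30, 0.50]          # made-up prevalence
    exams = [(i, rng.choices(classes, weights)[0]) for i in range(n_exams)]

    if prioritized:
        queue = sorted(exams, key=lambda e: (SEVERITY_RANK[e[1]], e[0]))
    else:
        queue = exams                            # first come, first served

    delays = [pos // reports_per_day for pos, (_, c) in enumerate(queue)
              if c == "critical"]
    return sum(delays) / len(delays)

print("prioritized:", simulate(prioritized=True))
print("FIFO       :", simulate(prioritized=False))
```

Even in this crude setup, moving critical exams to the front of the queue cuts their average wait from roughly the middle of the backlog to a fraction of a day, which is the effect the study quantifies with real data.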

The researchers plan to expand the work to a much larger sample size and to deploy more complex algorithms for better performance. Future goals include a multi-center study to prospectively assess the performance of the triaging software.
