AI algorithm IDs wrist fractures with 98% sensitivity

Researchers have trained an AI algorithm to identify and locate radius and ulna fractures on wrist radiographs with 98 percent sensitivity and 73 percent specificity on a per-study basis. Their findings were published in the Radiological Society of North America’s new journal Radiology: Artificial Intelligence.

“Interpretation errors on radiographs are contributed by human and environmental factors, such as clinician inexperience, fatigue, distractions, poor viewing conditions and time pressures,” Yee Liang Thian, MBBS, FRCR, of the National University of Singapore, and colleagues wrote. “Automated analysis of radiographs by computers, which are consistent and indefatigable, would be invaluable to augment the work of emergency physicians and radiologists.” 

Thian et al. assessed a total of 7,356 wrist radiographic studies, specifically those with radius and ulna fractures, extracted from the hospital’s picture archiving and communication system (PACS). About 90 percent of the studies were used for training and 10 percent for validation. A Faster R-CNN network with an Inception-ResNet v2 backbone was used as the deep learning model.
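The 90/10 split described above can be sketched in a few lines. This is an illustrative example only: the study IDs, shuffle seed, and helper name are assumptions, not details reported in the paper.

```python
import random

def split_studies(study_ids, train_frac=0.9, seed=42):
    """Shuffle studies and split them into training and validation sets.

    Splitting at the study level (rather than the image level) keeps all
    views from one patient encounter on the same side of the split.
    """
    ids = list(study_ids)
    rng = random.Random(seed)  # fixed seed for a reproducible split
    rng.shuffle(ids)
    cut = int(len(ids) * train_frac)
    return ids[:cut], ids[cut:]

# With 7,356 studies, a 90/10 split yields 6,620 training and 736 validation studies:
train, val = split_studies(range(7356))
print(len(train), len(val))  # 6620 736
```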

The model was then tested on an unseen set of 524 emergency department wrist radiographic studies, with two radiologists serving as the reference standard.

The model detected and correctly identified 91 percent of the fractures on frontal views and 97 percent of all fractures seen on lateral views. Researchers found:

  • The per-study sensitivity, specificity and AUC were 98 percent, 73 percent and 0.895, respectively.
  • For the frontal view, the per-image sensitivity, specificity and AUC were 96 percent, 83 percent and 0.918, respectively. 
  • For the lateral view, the per-image sensitivity, specificity and AUC were 97 percent, 86 percent and 0.933, respectively.
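The metrics in the list above can be computed directly from binary labels and model scores. The sketch below uses toy values, not the study’s data, and implements AUC via the rank-based (Mann-Whitney) formulation rather than any particular library:

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

def auc(y_true, scores):
    """AUC as the probability a random positive outscores a random negative
    (ties count as half a win)."""
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: 1 = fracture present, 0 = no fracture.
y_true = [1, 1, 1, 0, 0]
y_pred = [1, 1, 0, 0, 1]          # thresholded predictions
scores = [0.9, 0.8, 0.4, 0.3, 0.6]  # raw model confidences

sens, spec = sensitivity_specificity(y_true, y_pred)
print(sens, spec, auc(y_true, scores))
```

Because sensitivity and specificity depend on the chosen score threshold while AUC summarizes all thresholds, the paper’s 98 percent sensitivity / 73 percent specificity pair reflects one operating point on the curve whose area is 0.895.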

They found no significant difference in the AI’s performance between pediatric and adult wrist radiographs, or between radiographs with and without casts.

“The ability to predict location information of abnormality with deep neural networks is an important step toward developing clinically useful artificial intelligence tools to augment radiologist reporting,” the researchers concluded.