In a landmark study published in Gut, Mukherjee et al developed and validated the Radiomics-based Early Detection Model (REDMOD), an automated artificial intelligence (AI) framework that identifies subtle, preclinical imaging signatures of pancreatic ductal adenocarcinoma on routine computed tomography (CT) scans. The model targets “visually occult” disease—changes undetectable to radiologists—addressing a major barrier to early detection in a cancer for which more than 85% of cases are diagnosed at an advanced stage.
Study Details
The REDMOD framework was trained and validated on a large, multi-institutional data set reflecting real-world clinical conditions and the low prevalence of early pancreatic ductal adenocarcinoma. The study included 1,462 CT scans: 219 prediagnostic scans from patients later diagnosed with pancreatic ductal adenocarcinoma and 1,243 control scans from individuals without cancer, all confirmed with at least 3 years of follow-up. These were divided into a training cohort (n = 969) and an independent test cohort (n = 493).
The fully automated pipeline integrates deep learning–based pancreas segmentation with radiomic feature extraction, initially generating 968 quantitative imaging features per scan. These were reduced to 40 key features using minimum redundancy maximum relevance selection and incorporated into a heterogeneous ensemble model combining logistic regression and gradient boosting algorithms.
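The pipeline described above can be sketched schematically. The following is an illustrative reconstruction, not the authors' code: segmentation and radiomic extraction are assumed to happen upstream, the starting point here is a feature matrix with 968 features per scan, and a univariate mutual-information ranking stands in for the study's minimum redundancy maximum relevance selection. All data, estimator choices, and hyperparameters are placeholders.

```python
# Hedged sketch of a REDMOD-style classification stage (assumed workflow):
# 968 radiomic features per scan -> selection of 40 features ->
# heterogeneous ensemble of logistic regression and gradient boosting.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, VotingClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 968))   # placeholder radiomic feature matrix
y = rng.integers(0, 2, size=200)  # placeholder case/control labels

# Stand-in for mRMR: rank features by mutual information, keep 40.
selector = SelectKBest(mutual_info_classif, k=40)

# Heterogeneous ensemble: the two base learners named in the study,
# combined here by soft-voting (the exact combination rule is assumed).
ensemble = VotingClassifier(
    estimators=[
        ("lr", make_pipeline(StandardScaler(),
                             LogisticRegression(max_iter=1000))),
        ("gb", GradientBoostingClassifier(random_state=0)),
    ],
    voting="soft",
)

model = make_pipeline(selector, ensemble)
model.fit(X, y)
probs = model.predict_proba(X)[:, 1]  # per-scan probability of cancer
```

In practice the selection and ensemble steps would be fit inside cross-validation on the training cohort and evaluated once on the held-out test cohort, as the study design implies.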
Key Results
Radiomic analysis revealed that 90% of the selected features were derived from multiscale, wavelet-filtered images, which significantly outperformed unfiltered data (area under the curve [AUC] = 0.82 vs 0.74; P = .007). The model was further evaluated for robustness across institutions and imaging platforms, as well as for longitudinal stability on repeat imaging.
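To illustrate what "wavelet-filtered" features mean in a radiomics context, the sketch below applies one level of a 2-D Haar wavelet transform to a CT slice and computes simple first-order statistics on each subband. This is an assumed, minimal example of the general technique; the study's actual wavelet filters, feature definitions, and 3-D implementation are not specified here, and the image is a placeholder.

```python
# Minimal sketch of wavelet-derived radiomic features (assumed workflow).
# A 2-D Haar transform splits an image into one approximation (LL) and
# three detail (LH, HL, HH) subbands; statistics on each subband become
# multiscale texture features.
import numpy as np

def haar_dwt2(img):
    """One level of a 2-D Haar transform via pairwise averages/differences."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0  # rows: low-pass
    d = (img[0::2, :] - img[1::2, :]) / 2.0  # rows: high-pass
    LL = (a[:, 0::2] + a[:, 1::2]) / 2.0     # low/low (approximation)
    LH = (a[:, 0::2] - a[:, 1::2]) / 2.0     # low/high detail
    HL = (d[:, 0::2] + d[:, 1::2]) / 2.0     # high/low detail
    HH = (d[:, 0::2] - d[:, 1::2]) / 2.0     # high/high detail
    return {"LL": LL, "LH": LH, "HL": HL, "HH": HH}

rng = np.random.default_rng(0)
ct_slice = rng.normal(size=(64, 64))  # placeholder CT slice

features = {}
for name, band in haar_dwt2(ct_slice).items():
    features[f"wavelet_{name}_mean"] = float(band.mean())
    features[f"wavelet_{name}_std"] = float(band.std())
```

Repeating the transform on the LL subband yields coarser scales, which is what makes such features "multiscale."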
In the independent test cohort, REDMOD achieved an AUC of 0.82 (95% confidence interval [CI] = 0.81–0.83), with a sensitivity of 73.0% (95% CI = 60.0%–78.7%) and specificity of 81.1% (95% CI = 75.2%–93.1%). This performance significantly exceeded that of radiologists, whose pooled sensitivity was 38.9% (P < .001); the AI model nearly doubled the detection rate. The advantage increased with longer lead times: more than 24 months before diagnosis, sensitivity was 68.0% for REDMOD vs 23.0% for radiologists.
Across prediagnostic intervals, sensitivity remained 75.0% in both the 3- to 12-month and the 12- to 24-month windows before diagnosis. The model detected cancers at a median lead time of 475 days and maintained consistent performance across internal and external validation cohorts, with specificities of 81.3% and 87.5%, respectively. REDMOD also demonstrated strong longitudinal stability, with 90% to 92% concordance on repeat imaging.
The authors concluded: “REDMOD is an automated, mechanistically grounded, longitudinally stable, externally validated AI that surpasses radiologists for pancreatic ductal adenocarcinoma detection at its visually occult pre-diagnostic stage. These attributes position it for prospective validation in high-risk cohorts, a necessary step towards shifting the paradigm from late-stage symptomatic diagnosis to proactive preclinical interception.”
Sovanlal Mukherjee, PhD, of the Department of Radiology, Mayo Clinic, Rochester, Minnesota, is the corresponding author for the Gut article.
DISCLOSURE: The study was funded by the National Institutes of Health, Mayo Clinic Comprehensive Cancer Center, and others. For full disclosures of the study authors, visit bmj.com.

