A comparison between 2D and 3D GAN as a supporting tool for rectum segmentation on 0.35 T MR images
PD-0335
Abstract
Authors: Marica Vagni1, Huong Elena Tran1, Angela Romano1, Luca Boldrini1, Giuditta Chiloiro1, Guillaume Landry2, Christopher Kurz2, Stefanie Corradini2, Maria Kawula2, Elia Lombardo2, Maria Antonietta Gambacorta1, Luca Indovina1, Claus Belka2,3, Vincenzo Valentini1, Lorenzo Placidi1, Davide Cusumano1
1Fondazione Policlinico Universitario "A. Gemelli" IRCCS, Department of Radiation Oncology, Rome, Italy; 2LMU Munich, Department of Radiation Oncology, Munich, Germany; 3German Cancer Consortium (DKTK), Department of Radiation Oncology, Munich, Germany
Purpose or Objective
Manual recontouring of targets and organs at risk (OARs) is a particularly time-consuming and operator-dependent task, which currently represents a limiting factor in the online MR-guided radiotherapy (MRgRT) workflow. Rectum contouring in particular may be challenging because of its morphology and the presence of adjacent structures with similar intensities, making delineation at the rectosigmoid and anorectal junctions difficult. In this study, we explored the potential of two supporting neural networks that automatically segment the rectum once its apical and caudal anatomical limits have been indicated by the clinician.
Material and Methods
0.35 T 3D simulation MR scans from 72 prostate cancer patients treated on an MR-Linac were collected. The rectum delineation used in clinical practice and validated by two radiation oncologists represented the ground truth. Patient volumes were resampled to the same spatial resolution (1.5 mm³) and corrected for bias field artefacts through a dedicated image pre-processing pipeline. A 3D Generative Adversarial Network (GAN), composed of a UNet generator and a PatchGAN discriminator, and its modified 2D version were trained on 53 patients, validated on 10 patients, and tested on the remaining 9 cases. Island removal was applied as image post-processing. The Dice similarity coefficient (DSC) and the 95th percentile Hausdorff distance (HD95th) were calculated against the clinical delineations within the ground truth rectum extension. The generation time of the segmented volume was also recorded for each patient.
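The pre-processing described above (resampling and bias field correction) can be sketched as follows with SimpleITK; this is a minimal illustration assuming isotropic 1.5 mm resampling and N4 bias field correction, not the authors' actual pipeline, and all names are illustrative.

```python
# Illustrative pre-processing sketch: isotropic resampling and N4 bias
# field correction with SimpleITK (assumed implementation, not the authors').
import SimpleITK as sitk


def preprocess(path: str, spacing: float = 1.5) -> sitk.Image:
    image = sitk.ReadImage(path, sitk.sitkFloat32)

    # Resample to isotropic voxels (assumed 1.5 mm spacing).
    old_spacing = image.GetSpacing()
    old_size = image.GetSize()
    new_size = [int(round(sz * sp / spacing)) for sz, sp in zip(old_size, old_spacing)]

    resampler = sitk.ResampleImageFilter()
    resampler.SetOutputSpacing((spacing, spacing, spacing))
    resampler.SetSize(new_size)
    resampler.SetOutputOrigin(image.GetOrigin())
    resampler.SetOutputDirection(image.GetDirection())
    resampler.SetInterpolator(sitk.sitkLinear)
    image = resampler.Execute(image)

    # N4 bias field correction restricted to a coarse foreground mask.
    mask = sitk.OtsuThreshold(image, 0, 1, 200)
    corrector = sitk.N4BiasFieldCorrectionImageFilter()
    return corrector.Execute(image, mask)
```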
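Similarly, the island-removal post-processing and the two reported metrics can be expressed with NumPy/SciPy; this is a hedged sketch of standard DSC and HD95 definitions on binary masks (the exact implementation used by the authors is not specified), with illustrative names and an assumed 1.5 mm voxel spacing.

```python
# Illustrative post-processing (largest connected component) and metrics
# (DSC, HD95) on binary masks; assumed implementation, names are illustrative.
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt, label


def keep_largest_island(mask: np.ndarray) -> np.ndarray:
    labels, n = label(mask)
    if n == 0:
        return mask.astype(bool)
    sizes = np.bincount(labels.ravel())[1:]  # skip background label 0
    return labels == (np.argmax(sizes) + 1)


def dice(pred: np.ndarray, gt: np.ndarray) -> float:
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum())


def hd95(pred: np.ndarray, gt: np.ndarray, spacing=(1.5, 1.5, 1.5)) -> float:
    pred, gt = pred.astype(bool), gt.astype(bool)
    # Surface voxels of each mask.
    pred_surf = pred & ~binary_erosion(pred)
    gt_surf = gt & ~binary_erosion(gt)
    # Distance from each surface voxel to the other surface (in mm).
    d_to_gt = distance_transform_edt(~gt_surf, sampling=spacing)[pred_surf]
    d_to_pred = distance_transform_edt(~pred_surf, sampling=spacing)[gt_surf]
    return float(np.percentile(np.hstack([d_to_gt, d_to_pred]), 95))
```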
Results
Table 1 reports the DSC and HD95th across the test patients for the two investigated networks. The mean ± standard deviation DSC was 0.84 ± 0.04 for the 2D and 0.82 ± 0.04 for the 3D architecture, while the HD95th was 8.85 ± 4.13 and 10.36 ± 5.35, respectively. Figure 1 shows an example of the contours generated by both the 2D and 3D GANs. Regarding delineation time, the mean value was 1.03 s for the 3D GAN and 3.17 s for the 2D GAN, compared with an average of up to 1 minute for manual expert contouring.
Conclusion
To the best of the authors' knowledge, this is the first attempt to apply a 3D GAN to 0.35 T MR images for auto-contouring purposes, with a comparison against the corresponding 2D architecture. Quantitative results showed that the performances of the two networks are comparable within the ground truth extension of the rectum, providing a useful tool for clinicians to speed up delineation during online adaptive MRgRT. Further developments will extend these AI-based approaches to other OARs and tumors.