Overview: This software performs oncological PET segmentation using an estimation-based approach, which estimates the fractional volume occupied by the tumor within each voxel of a PET image.

Software description: Tumor segmentation in oncological PET is challenging, a major reason being partial-volume effects (PVEs) that arise from low system resolution and finite voxel size. The latter gives rise to tissue-fraction effects (TFEs), i.e., voxels containing a mixture of tissue classes. Conventional segmentation methods are typically designed to assign each image voxel to a single tissue class, and are thus inherently limited in accounting for TFEs. To address the challenge of compensating for PVEs, and in particular TFEs, we propose a Bayesian approach to tissue-fraction estimation for oncological PET segmentation. Specifically, the segmentation problem is posed as the task of estimating the fractional volume that the tumor occupies within each voxel of the PET image. This strategy explicitly models TFEs while performing segmentation. Further, we theoretically demonstrate that the proposed method, by minimizing an objective function based on the binary cross-entropy loss, yields the posterior-mean estimate of the tumor-fraction volume within each voxel of the PET image. A brief sketch of this result is given below.
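
As a brief sketch of this result (the notation here is ours and may differ from the paper's): let $a_k \in [0, 1]$ denote the true tumor-fraction volume within voxel $k$ and $\hat{a}_k$ its estimate from the PET image $\mathbf{f}$. The binary cross-entropy-based objective is

$$
\ell(\mathbf{a}, \hat{\mathbf{a}}) \;=\; -\sum_k \Big[\, a_k \log \hat{a}_k \;+\; (1 - a_k) \log\!\big(1 - \hat{a}_k\big) \,\Big].
$$

Setting the derivative of the conditional expected loss $\mathbb{E}[\ell \mid \mathbf{f}]$ with respect to each $\hat{a}_k$ to zero gives

$$
-\frac{\mathbb{E}[a_k \mid \mathbf{f}]}{\hat{a}_k} \;+\; \frac{1 - \mathbb{E}[a_k \mid \mathbf{f}]}{1 - \hat{a}_k} \;=\; 0
\quad\Longrightarrow\quad
\hat{a}_k \;=\; \mathbb{E}[a_k \mid \mathbf{f}],
$$

i.e., the minimizer is the posterior mean of the tumor-fraction volume.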

The proposed method was implemented and evaluated using a supervised deep-learning-based technique, on a simplified per-slice basis. As shown in the figure below, during the training phase, the network is provided with a population of 2-D PET images and the corresponding ground-truth tumor-fraction area map. The network, by minimizing the designed objective function over this population of images, becomes trained to yield a posterior-mean estimate of fractional areas given the input PET image.
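
To illustrate this training setup, here is a minimal PyTorch-style sketch. The network, layer sizes, and data below are placeholders, not the architecture or data used in the study; the key point is that the binary cross-entropy loss is applied to continuous fractional targets in [0, 1] rather than to binary labels.

```python
import torch
import torch.nn as nn

# Toy stand-in for the network: the study uses a deep encoder-decoder;
# this small convolutional stack only illustrates the input/output
# contract (2-D PET slice in, fractional-area map out).
class FractionNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 1, kernel_size=3, padding=1),
        )

    def forward(self, x):
        # Sigmoid constrains each output to [0, 1], so it can be read as
        # a tumor-fraction area rather than a hard class label.
        return torch.sigmoid(self.body(x))

net = FractionNet()
loss_fn = nn.BCELoss()  # binary cross-entropy; accepts continuous targets
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

# Placeholder batch: 4 single-channel 64x64 PET slices with matching
# ground-truth tumor-fraction area maps (values in [0, 1]).
pet = torch.rand(4, 1, 64, 64)
frac_truth = torch.rand(4, 1, 64, 64)

optimizer.zero_grad()
pred = net(pet)
loss = loss_fn(pred, frac_truth)  # BCE against *fractional* targets
loss.backward()
optimizer.step()
```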

Validation: The proposed method was first evaluated using clinically realistic 2-D simulation studies with known ground truth, in the context of segmenting the primary tumor in PET images of patients with lung cancer. These studies demonstrated that the method accurately estimated the tumor-fraction areas and significantly outperformed widely used conventional PET segmentation methods, as well as a U-net-based method, on the task of segmenting the tumor. In addition, the proposed method was relatively insensitive to PVEs and yielded reliable tumor segmentation across different clinical-scanner configurations.

The method was then evaluated using clinical images of patients with stage IIB/III non-small cell lung cancer from the ACRIN 6668/RTOG 0235 multi-center clinical trial. Here, the results showed that the proposed method significantly outperformed all other considered methods and yielded accurate tumor segmentation on patient images, with a high Dice similarity coefficient of 0.82. In particular, the method accurately segmented relatively small tumors.
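
For reference, one common way to score agreement between fractional maps is the "soft" Dice coefficient sketched below. This is a generic formulation, not necessarily the exact variant used in the study, and the function name and inputs are ours.

```python
import numpy as np

def soft_dice(truth, pred, eps=1e-8):
    """Soft Dice between two fractional maps with values in [0, 1].

    A generic generalization of the Dice similarity coefficient to
    continuous-valued segmentations; the study's exact variant may differ.
    """
    truth = np.asarray(truth, dtype=float)
    pred = np.asarray(pred, dtype=float)
    overlap = np.sum(truth * pred)
    return (2.0 * overlap + eps) / (np.sum(truth) + np.sum(pred) + eps)

# Identical maps give a Dice near 1.0; disjoint maps give a Dice near 0.0.
a = np.array([[0.0, 0.3], [0.7, 1.0]])
print(soft_dice(a, a))  # ~1.0
```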

Overall, this study demonstrates the efficacy of the proposed method in accurately segmenting tumors in PET images.

Download: The manuscript and software package can be accessed at this link.

Reference: Please cite the following references when using this software.

PET simulation:

  1. Liu, Ziping, Richard Laforest, Joyce Mhlanga, Tyler J. Fraum, Malak Itani, Farrokh Dehdashti, Barry A. Siegel, and Abhinav K. Jha. “Observer study-based evaluation of a stochastic and physics-based method to generate oncological PET images.” In Medical Imaging 2021: Image Perception, Observer Performance, and Technology Assessment, vol. 11599, p. 1159905. International Society for Optics and Photonics, 2021.

Segmentation:

  1. Liu, Ziping, Joyce C. Mhlanga, Richard Laforest, Paul-Robert Derenoncourt, Barry A. Siegel, and Abhinav K. Jha. “A Bayesian approach to tissue-fraction estimation for oncological PET segmentation.” Physics in Medicine & Biology 66, no. 12 (2021): 124002.
  2. Liu, Ziping, Richard Laforest, Hae Sol Moon, Joyce Mhlanga, Tyler Fraum, Malak Itani, Aaron Mintz, Farrokh Dehdashti, Barry Siegel, and Abhinav Jha. “An estimation-based segmentation method to delineate tumors in PET images.” (2020): 447.