A Novel Deep Learning Pipeline for Vertebra Labeling and Segmentation of Spinal Computed Tomography Images

Document Type

Article

Publication Title

IEEE Access

Abstract

Automatic segmentation of vertebrae from computed tomography (CT) scans plays an important role in the clinical interpretation and treatment of spinal co-morbidities. Labelling and segmentation of vertebrae are labour-intensive and challenging, due to varying fields of view and fuzzy boundaries in CT scans. Successful labelling and segmentation are therefore highly dependent on the level of expertise of the radiologist. In this paper, we propose a three-step, fully automated end-to-end pipeline for vertebra labelling and segmentation of spinal CT images. A novel deep learning architecture, Unbalanced-UNet, is proposed for extracting the region proposals for spine detection. A modified SpatialConfiguration-Net, 3D SCN, is used for labelling of vertebrae and centroid extraction. Finally, a 3D U-Net is employed for the segmentation of each vertebra. The models were validated on the VERSE'19 public dataset. Identification rates of 90.20% and 91.47% were obtained for the first and second test sets of the VERSE'19 dataset, respectively. Mean localization distances of 4.97 mm and 5.32 mm were obtained for the first and second test sets, respectively. The final segmentation stage shows a Dice score and Hausdorff surface distance of 93.07% and 5.36 mm, respectively, for the first test set, and 92.01% and 5.63 mm, respectively, for the second test set. The results show that the proposed approach outperforms state-of-the-art models for segmentation of vertebrae. The proposed Unbalanced-UNet architecture increased the accuracy of acquiring the region proposals for spine detection. The proposed fully automated pipeline has potential clinical applications in the treatment and surgical planning of spinal deformities.

First Page

15330

Last Page

15347

DOI

10.1109/ACCESS.2024.3358874

Publication Date

1-1-2024

