Depth to Anatomy: Organ Localization from Depth Images for Automated Patient Table Positioning in Radiology Workflow
Eytan Kats, Kai Geissler, Daniel Mensing, Julien Senegas, Jochen G. Hirsch, Stefan Heldman, Mattias P. Heinrich
Abstract
Automated patient positioning can improve radiology workflow efficiency by reducing the time required for manual table adjustments and scout-based scan planning. We propose a learning-based framework that predicts 3D organ locations and shapes for 41 anatomical structures, including both bones and soft tissues, directly from a single 2D depth image of the body surface. Leveraging 10,020 whole-body MRI scans from the German National Cohort (NAKO) dataset, we synthetically generate depth images paired with anatomical segmentations to train a convolutional neural network for volumetric organ prediction. Our method achieves a mean Dice similarity coefficient of 0.44 ± 0.2 and a symmetric average surface distance of 7.69 ± 5.68 mm across all structures. Furthermore, the model derives organ bounding boxes with a mean absolute detection offset of 10.99 ± 5.54 mm. Qualitative results on real-world depth images confirm the model's ability to generalize to practical clinical settings. These findings suggest that depth-only organ localization can support automated patient positioning, reducing setup time, minimizing operator variability, and improving patient comfort.
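As a minimal illustration of the evaluation quantities named in the abstract, the sketch below shows how a Dice similarity coefficient and an axis-aligned bounding box can be computed from binary 3D masks. The function names and toy masks are our own illustration, not code from the paper, and the actual evaluation pipeline (e.g. per-structure averaging, surface-distance computation) is not reproduced here.

```python
import numpy as np


def dice(pred, gt):
    """Dice similarity coefficient between two binary 3D masks."""
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum())


def mask_bbox(mask):
    """Axis-aligned bounding box of a binary 3D mask: (min, max) voxel index per axis."""
    coords = np.argwhere(mask)
    return coords.min(axis=0), coords.max(axis=0)


# Toy example: a predicted and a ground-truth "organ" as overlapping cubes on a voxel grid.
pred = np.zeros((20, 20, 20), dtype=bool)
gt = np.zeros((20, 20, 20), dtype=bool)
pred[5:15, 5:15, 5:15] = True  # 10^3 voxels
gt[6:16, 6:16, 6:16] = True    # 10^3 voxels, intersection 9^3

score = dice(pred, gt)          # 2 * 729 / 2000 = 0.729
bbox_min, bbox_max = mask_bbox(pred)
```

A bounding-box detection offset, as reported in the paper, could then be obtained by comparing predicted and ground-truth box corners (or centers) in millimeters using the scan's voxel spacing.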