ESTRO 2024 - Abstract Book
Physics - Inter-fraction motion management and offline adaptive radiotherapy
898
Poster Discussion
Validation of Affordable Depth Camera System for Monitoring Deep Inspiratory Breath Hold
Mandeep Kaur, Marian Axente
Emory University School of Medicine, Radiation Oncology, Atlanta, USA
Purpose/Objective:
Breast cancer is the most commonly diagnosed cancer worldwide [1]. Breast radiotherapy during deep inspiration breath hold (DIBH) is a proven technique for minimizing cardiac dose and the associated late cardiac toxicity, a particular concern for left-sided breast cancer patients [2–7]. Optical surface imaging systems provide accurate motion monitoring for DIBH [8–11]. While these products enable superior quality of care for left-breast patients, their adoption is limited by cost and complexity. Our objective is to introduce and validate an innovative and affordable surface imaging platform for DIBH monitoring in breast radiotherapy. This validation study compares the proposed system against state-of-the-art commercial surface imaging equipment and international standards. The proposed platform can have a significant healthcare equity impact [12], enabling centers without access to respiratory motion management to deliver optimal treatment to breast cancer patients at risk of cardiac toxicity.
Material/Methods:
A real-time monitoring system was developed around the Intel® RealSense™ depth camera (D455) and its corresponding open-source software (Intel® RealSense™ SDK 2.0). This affordable off-the-shelf module leverages a stereoscopic camera, providing concomitant optical and depth data similar to current surface guidance systems used in radiotherapy. The optical feed is processed by a Python (v3.8.0) implementation of the Mask R-CNN (region-based convolutional neural network) algorithm [13], which was trained to identify and segment surface-affixed simple shapes during live data acquisition. The system then tracks the segmented object in the camera field of view, providing the three-dimensional displacement of the object centroid in real time. The system was tested for measurement accuracy, precision, drift, and overall latency against a synchronized laser displacement sensor (accuracy: ≤0.45 mm ± 0.4 mm; acquisition rate: 1 kHz). For all dynamic tests, the CIRS Dynamic Thorax phantom (Model 008A) was used to produce nominal motion patterns. A 16 cm foam cube was mounted on the phantom motion actuator, perpendicular to the system camera axis. Superior-inferior (SI) displacements (IEC 61217; patient axes) were measured using the camera depth sensor data, while anterior-posterior (AP) and lateral (LR) displacements were measured using the camera optical data. The same moving target was monitored simultaneously with the Varian Identify™ surface guidance system (v2.3). Various motion patterns, including recorded irregular human breathing containing multiple breath holds, were used to compare the proposed system against Identify™. The validation hypothesis was that there were no significant differences between the "ground truth" and the tested system. Ground truth was defined either as input
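To make the acquisition-and-tracking step concrete, the sketch below shows one plausible Python implementation built on the pyrealsense2 bindings of the Intel® RealSense™ SDK 2.0: aligned color and depth frames are pulled from the D455, a binary target mask (here a hypothetical segment_target() placeholder standing in for the trained Mask R-CNN) gives the in-plane centroid, and the centroid pixel is deprojected to a 3D point so a displacement relative to a reference position can be reported. The stream settings, mask model, and axis handling are assumptions for illustration, not the authors' implementation.

```python
# Hedged sketch: 3D centroid displacement from a RealSense D455 stream.
# The segmentation step (Mask R-CNN) is abstracted as segment_target(),
# a hypothetical placeholder; all stream parameters are assumed values.
import numpy as np
import pyrealsense2 as rs  # Intel RealSense SDK 2.0 Python bindings


def segment_target(color_image: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for Mask R-CNN inference: returns a boolean
    mask of the surface-affixed shape in the color frame."""
    raise NotImplementedError("replace with the trained segmentation model")


pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
profile = pipeline.start(config)
depth_scale = profile.get_device().first_depth_sensor().get_depth_scale()
align = rs.align(rs.stream.color)   # map depth pixels onto the color frame

reference = None                    # first tracked centroid = zero displacement
try:
    while True:
        frames = align.process(pipeline.wait_for_frames())
        depth_frame = frames.get_depth_frame()
        color_frame = frames.get_color_frame()
        if not depth_frame or not color_frame:
            continue

        color = np.asanyarray(color_frame.get_data())
        mask = segment_target(color)
        if not mask.any():
            continue

        # Pixel centroid of the segmented shape (drives the in-plane axes).
        ys, xs = np.nonzero(mask)
        u, v = int(xs.mean()), int(ys.mean())

        # Robust depth inside the mask (drives the camera-axis component).
        depth_image = np.asanyarray(depth_frame.get_data())
        d = depth_image[mask]
        d = d[d > 0]                # discard invalid (zero) depth readings
        if d.size == 0:
            continue
        z = float(np.median(d)) * depth_scale   # metres

        # Deproject the centroid pixel to a 3D point in camera coordinates.
        intr = color_frame.profile.as_video_stream_profile().intrinsics
        point = np.array(rs.rs2_deproject_pixel_to_point(intr, [u, v], z))

        if reference is None:
            reference = point
        displacement_mm = (point - reference) * 1000.0   # camera x, y, z in mm
        print("displacement (mm):", displacement_mm.round(2))
finally:
    pipeline.stop()
```

Mapping the camera-space axes to the IEC 61217 patient axes (depth channel to SI, image plane to AP and LR, as described above) would depend on how the camera is mounted relative to the couch; that mapping is not shown here.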