ESTRO 2024 - Abstract Book
Physics - Autosegmentation
Table 2
Conclusion:
With the increasing clinical translation of DL models in healthcare, it is important to test performance across demographic subgroups to ensure there is no bias by race or other factors. Our work has demonstrated for the first time that prostate DL autocontouring models can exhibit race bias when trained on data that are highly imbalanced by race. Enriching the training data of such models with more diverse data is therefore crucial, so that all patient subgroups benefit equitably from the introduction of DL techniques into clinical workflows.
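The subgroup performance test described above can be sketched as follows. This is a minimal illustrative example, not the authors' evaluation code: the masks, group labels, and the `dice_by_group` helper are hypothetical, and a real fairness audit would compare per-subgroup contouring accuracy (e.g. Dice) with appropriate statistical testing.

```python
# Hypothetical sketch: comparing segmentation accuracy between demographic
# subgroups to detect potential bias. All data here are synthetic.
import numpy as np

def dice(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient for two binary masks."""
    inter = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    return 2.0 * inter / total if total > 0 else 1.0

def dice_by_group(preds, truths, groups):
    """Mean Dice score per demographic subgroup label."""
    scores = {}
    for g in sorted(set(groups)):
        idx = [i for i, gi in enumerate(groups) if gi == g]
        scores[g] = float(np.mean([dice(preds[i], truths[i]) for i in idx]))
    return scores

# Toy example: two subgroups "A" and "B" with synthetic 8x8 masks;
# one case in group "B" is deliberately degraded to mimic model bias.
rng = np.random.default_rng(0)
truths = [rng.integers(0, 2, (8, 8)).astype(bool) for _ in range(4)]
preds = [t.copy() for t in truths]
preds[2][0, :] = ~preds[2][0, :]  # corrupt one prediction in group "B"
groups = ["A", "A", "B", "B"]
print(dice_by_group(preds, truths, groups))  # group "B" scores lower
```

A gap between subgroup means like the one produced here is the kind of signal that would prompt enriching the training data, as the conclusion recommends.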
Keywords: Race bias, Prostate, MRI