Fr. 82.00

Fairness of AI in Medical Imaging - Third International Workshop, FAIMI 2025, Held in Conjunction with MICCAI 2025, Daejeon, South Korea, September 23, 2025, Proceedings

English · Paperback / Softback

Shipping usually within 6 to 7 weeks

Description


This book constitutes the refereed proceedings of the Third International Workshop on Fairness of AI in Medical Imaging, FAIMI 2025, held in conjunction with MICCAI 2025 in Daejeon, South Korea, on September 23, 2025.
The 21 full papers presented in this book were carefully reviewed and selected from 29 submissions.
FAIMI aimed to raise awareness about potential fairness issues in machine learning within the context of biomedical image analysis.
 

List of contents

.- LTCXNet: Tackling Long-Tailed Multi-Label Classification and Racial Bias in Chest X-Ray Analysis.
.- Fairness and Robustness of CLIP-Based Models for Chest X-rays.
.- ShortCXR: Benchmarking Self-Supervised Learning Methods for Shortcut Mitigation in Chest X-Ray Interpretation.
.- How Fair Are Foundation Models? Exploring the Role of Covariate Bias in Histopathology.
.- The Cervix in Context: Bias Assessment in Preterm Birth Prediction.
.- Identifying Gender-Specific Visual Bias Signals in Skin Lesion Classification.
.- Fairness-Aware Data Augmentation for Cardiac MRI using Text-Conditioned Diffusion Models.
.- Exploring the interplay of label bias with subgroup size and separability: A case study in mammographic density classification.
.- Does a Rising Tide Lift All Boats? Bias Mitigation for AI-based CMR Segmentation.
.- MIMM-X: Disentangling Spurious Correlations for Medical Image Analysis.
.- Predicting Patient Self-reported Race From Skin Histological Images with Deep Learning.
.- Robustness and sex differences in skin cancer detection: logistic regression vs CNNs.
.- Sex-based Bias Inherent in the Dice Similarity Coefficient: A Model Independent Analysis for Multiple Anatomical Structures.
.- The Impact of Skin Tone Label Granularity on the Performance and Fairness of AI Based Dermatology Image Classification Models.
.- Causal Representation Learning with Observational Grouping for CXR Classification.
.- Invisible Attributes, Visible Biases: Exploring Demographic Shortcuts in MRI-based Alzheimer's Disease Classification.
.- Fair Dermatological Disease Diagnosis through Auto-weighted Federated Learning and Performance-aware Personalization.
.- Assessing Annotator and Clinician Biases in an Open-Source-Based Tool Used to Generate Head CT Segmentations for Deep Learning Training.
.- meval: A Statistical Toolbox for Fine-Grained Model Performance Analysis.
.- Revisiting the Evaluation Bias Introduced by Frame Sampling Strategies in Surgical Video Segmentation Using SAM2.
.- Disentanglement and Assessment of Shortcuts in Ophthalmological Retinal Imaging Exams.

Product details

Assisted by Veronika Cheplygina (Editor), Aasa Feragen (Editor), Enzo Ferrante (Editor), Melanie Ganz-Benjaminsen (Editor), Ben Glocker (Editor), Andrew King (Editor), Heisook Lee (Editor), Eike Petersen (Editor), Esther Puyol-Antón (Editor)
Publisher Springer, Berlin
 
Languages English
Product format Paperback / Softback
Released 19.10.2025
 
EAN 9783032058690
ISBN 978-3-032-05869-0
No. of pages 220
Dimensions 155 mm x 13 mm x 235 mm
Weight 359 g
Illustrations XI, 220 p. 62 illus., 59 illus. in color.
Series Lecture Notes in Computer Science
Subjects Natural sciences, medicine, IT, technology > IT, data processing > Application software

Artificial Intelligence, Machine Learning, Network Hardware, Computer Vision, Computer Communication Networks, Medical Imaging, Computer Application in Administrative Data Processing, Bias Mitigation, Algorithmic Fairness, Fairness in Medical Imaging, Legal Bias, Ethical Bias

