Fr. 102.00

Medical Image Understanding and Analysis - 29th Annual Conference, MIUA 2025, Leeds, UK, July 15-17, 2025, Proceedings, Part II.

English · Paperback

Published on 15.08.2025

Description


The three-volume set LNCS 15916, 15917 & 15918 constitutes the refereed proceedings of the 29th Annual Conference on Medical Image Understanding and Analysis, MIUA 2025, held in Leeds, UK, during July 15–17, 2025.
The 67 revised full papers presented in these proceedings were carefully reviewed and selected from 99 submissions. The papers are organized in the following topical sections:
Part I: Frontiers in Computational Pathology; and Image Synthesis and Generative Artificial Intelligence.
Part II: Image-guided Diagnosis; and Image-guided Intervention.
Part III: Medical Image Segmentation; and Retinal and Vascular Image Analysis.

Table of Contents

.- Image-guided Diagnosis.
.- FD-SSD: Semi-Supervised Detection of Bone Fenestration and Dehiscence in Intraoral Images.
.- Interpretable Prediction of Lymph Node Metastasis in Rectal Cancer MRI Using Variational Autoencoders.
.- Self-Guided SwinTransformer Improves Breast Cancer Detection Through Iterative Attention-Based Zooming.
.- Can AI Be Faster, Accurate, and Explainable? SpikeNet Makes It Happen.
.- A Novel Feature-Prioritized Loss Function for Enhanced Pneumonia Segmentation in Chest X-rays.
.- Bridging Accuracy and Explainability: A SHAP-Enhanced CNN for Skin Cancer Diagnosis.
.- Multi-Scale WSI Analysis: A Cascade Framework for Efficient Breast Cancer Metastasis Detection.
.- Learning to Harmonize Cross-vendor X-ray Images by Non-linear Image Dynamics Correction.
.- Modified CBAM: Sub-Block Pooling for Improved Channel and Spatial Attention.
.- WSI-AL: A Novel Active Learning Framework for Whole Slide Image Selection.
.- A Deep-learning Approach for Diagnosing and Grading Ankylosing Spondylitis Sacroiliitis by X-ray Images.
.- Towards Breast Tumor Aggressiveness Classification in Digital Mammograms Using Boundary-Aware Segmentation and Feature Analysis.
.- Image-guided Intervention.
.- Joint Dento-Facial Shape Model.
.- Out-of-Distribution Detection in Gastrointestinal Vision by Estimating Nearest Centroid Distance Deficit.
.- Deep Learning-Driven Pipeline for Automated Wound Measurement of Chronic Wounds.
.- Midline-constrained Loss in the Anatomical Landmark Segmentation of 3D Liver Models.
.- DepthClassNet: A Multitask Framework for Monocular Depth Estimation and Texture Classification in Endoscopic Imaging.
.- Assessing the Generalization Performance of SAM for Ureteroscopy Scene Segmentation and Understanding.
.- Modelling Uncertainty in Graph Convolutional Networks for Edge Detection in Mammograms.
.- Classification of Gastroscopy Images Under Extreme Class Imbalance: A Deep Learning Pipeline.
.- Temporally Consistent Smoke Removal from Endoscopic Video Images.
.- Toward Patient-specific Partial Point Cloud to Surface Completion for Pre- to Intra-operative Registration in Image-guided Liver Interventions.
.- EfficientDet with Knowledge Distillation and Instance Whitening for Real-time and Generalisable Polyp Detection.

