About this book
How can we train powerful machine learning models together, across smartphones, hospitals, or financial institutions, without ever sharing raw data? This book delivers a compelling answer through the lens of federated learning (FL), a cutting-edge paradigm for decentralized, privacy-preserving machine learning. Designed for students, engineers, and researchers, it offers a principled yet practical roadmap to building secure, scalable, and trustworthy FL systems from scratch.
At the heart of this book is a unifying framework that treats FL as a network-regularized optimization problem. This elegant formulation allows readers to seamlessly address personalization, robustness, and fairness, challenges that are often tackled in isolation. You'll learn how to structure FL networks based on task similarity, leverage graph-based methods, and apply distributed optimization techniques to implement FL systems. Detailed pseudocode, intuitive explanations, and implementation-ready algorithms ensure you not only understand the theory but can apply it in real-world systems.
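The network-regularized view can be sketched in a few lines of NumPy: each client takes gradient steps on its own local loss while a penalty on edges of the FL network pulls neighboring models toward each other. Everything below (the chain graph, the toy data, the coupling strength `lam`, and the step size `lr`) is an illustrative assumption for this sketch, not the book's actual algorithm.

```python
import numpy as np

# Toy setup: three clients on a chain graph, each holding its own
# local linear-regression dataset drawn from a similar underlying task.
rng = np.random.default_rng(0)
edges = [(0, 1), (1, 2)]            # the FL network: which clients are linked
n_clients, dim = 3, 2
w_star = np.array([1.0, -1.0])      # shared underlying parameter vector
X = [rng.normal(size=(20, dim)) for _ in range(n_clients)]
y = [X[i] @ w_star + 0.1 * rng.normal(size=20) for i in range(n_clients)]

lam, lr = 0.5, 0.05                 # edge-coupling strength, gradient step size
W = np.zeros((n_clients, dim))      # one personalized model per client

for _ in range(1000):
    grad = np.zeros_like(W)
    # local least-squares loss gradient at each client
    for i in range(n_clients):
        grad[i] = X[i].T @ (X[i] @ W[i] - y[i]) / len(y[i])
    # network regularizer: penalize disagreement across graph edges
    for i, j in edges:
        diff = W[i] - W[j]
        grad[i] += lam * diff
        grad[j] -= lam * diff
    W -= lr * grad
```

Because the clients' tasks are similar, the edge penalty lets them pool statistical strength without ever exchanging raw data; setting `lam = 0` would recover fully independent local training, while a very large `lam` forces a single consensus model.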
Topics such as privacy leakage analysis, model heterogeneity, and adversarial resilience are treated with both mathematical rigor and accessibility. Whether you're building decentralized AI for regulated industries or deploying it in settings where data, users, or system conditions change over time, this book equips you to design FL systems that are both performant and trustworthy.
Table of Contents
Chapter 1. Introduction to Federated Learning
Chapter 2. Machine Learning Foundations for FL
Chapter 3. A Design Principle for FL
Chapter 4. Gradient Methods for Federated Optimization
Chapter 5. FL Algorithms
Chapter 6. Key Variants of Federated Learning
Chapter 7. Graph Learning for FL Networks
Chapter 8. Trustworthy FL
Chapter 9. Privacy Protection in FL
Chapter 10. Cybersecurity in FL: Attacks and Defenses
About the Author
Alexander Jung is Associate Professor of Machine Learning at Aalto University in Finland, where he combines cutting-edge research with a deep passion for teaching. He has supervised over 120 master's theses and was honored with the Teacher of the Year Award by the Department of Computer Science. His research focuses on trustworthy federated learning, decentralized optimization, and signal processing, and he is the author of Machine Learning: The Basics.
He earned his PhD from TU Vienna sub auspiciis Praesidentis rei publicae, the highest academic distinction in Austria, awarded by the Federal President. When not explaining fixed-point iterations or debugging LaTeX macros, he enjoys cycling Austria's vineyard valleys and Finland's coastlines.