Graph Neural Networks (GNNs) have revolutionized the way we learn representations from graph-structured data, becoming a cornerstone for applications in social networks, recommendation systems, biology, and beyond. However, mainstream GNNs rely heavily on message passing, an iterative process of propagating information between connected nodes. While powerful, this method often incurs significant computational costs, making efficient training a growing challenge as graph sizes scale up.
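To make the idea concrete, here is a minimal, illustrative sketch (not taken from the book) of one round of mean-aggregation message passing on a tiny graph, using NumPy; the function name and weight shapes are assumptions for the example:

```python
# Illustrative sketch: one message-passing step in which each node averages
# its neighbors' features and combines them with its own via learned weights.
import numpy as np

def message_passing_step(adj, feats, w_self, w_neigh):
    """h_v' = relu(h_v @ W_self + mean_{u in N(v)} h_u @ W_neigh)."""
    deg = adj.sum(axis=1, keepdims=True)   # node degrees
    deg[deg == 0] = 1                      # guard isolated nodes against div-by-zero
    neigh_mean = (adj @ feats) / deg       # mean of neighbor features per node
    return np.maximum(0, feats @ w_self + neigh_mean @ w_neigh)  # ReLU

# Tiny 3-node path graph: 0 - 1 - 2, with one-hot input features.
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
feats = np.eye(3)
rng = np.random.default_rng(0)
h = message_passing_step(adj, feats, rng.normal(size=(3, 4)), rng.normal(size=(3, 4)))
print(h.shape)  # (3, 4): three nodes, four hidden features
```

Stacking such steps lets information flow along paths of increasing length, which is also why the cost grows quickly with graph size and depth.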
This book addresses these challenges by offering a comprehensive exploration of efficient GNN training through the lens of data management. It highlights how innovative techniques, rooted in decades of graph processing research, can optimize the entire training process without compromising performance. By focusing on system-level enhancements and practical solutions, it provides actionable strategies to overcome efficiency bottlenecks in large-scale GNN training.
Readers will gain a deeper understanding of the graph data lifecycle in GNN training, with examples that demonstrate how data management techniques can significantly enhance scalability and performance. The book is designed for a broad audience, including students, researchers, and professionals, offering clear explanations and practical insights for anyone looking to master efficient GNN training.
About the Authors
Yanyan Shen is an associate professor at Shanghai Jiao Tong University, specializing in machine learning and data management systems. Her research focuses on developing scalable and efficient algorithms for large-scale data processing, with a strong emphasis on practical applications and system-level optimizations. She has authored numerous papers in leading journals and conferences, contributing significantly to the intersection of AI and data management.
Lei Chen is a Chair Professor in the Data Science and Analytics Thrust at HKUST (GZ), a Fellow of IEEE, and a Distinguished Member of ACM. His research spans diverse areas, including data-driven AI, knowledge graphs, blockchain, data privacy, crowdsourcing, spatial and temporal databases, and query optimization for large graphs and probabilistic databases. Prof. Chen has received numerous accolades, such as the SIGMOD Test-of-Time Award (2015), the Best Research Paper Award at VLDB (2022), and the Excellent Demonstration Award at VLDB (2014). He served as the PC Co-Chair for VLDB 2019 and is currently the Editor-in-Chief of IEEE Transactions on Knowledge and Data Engineering, as well as an executive member of the VLDB Endowment.