GDMA 2025 Program
(Monday 26 2025)
9:00–10:30am Keynote: Dr. Ming Hu. Optimization Algorithm and Framework Design for Federated Learning
11:00–11:45am Lixing Zhang, Guanhua Ye, Hongzheng Li, Shigang Li and Yingxia Shao. ParamSpMM: Adaptive and Efficient Sparse Matrix-Matrix Multiplication on GPUs for GNNs
11:45am–12:30pm Ran Liu, Zhongzhou Liu, Xiaoli Li, Hao Wu and Yuan Fang. Diversified and Adaptive Negative Sampling on Knowledge Graphs
Keynote speaker bio
Dr. Ming Hu
He is a research scientist at Singapore Management University. Previously, he was a research fellow at Nanyang Technological University (NTU), Singapore. His research interests include the design and construction of AI software and systems, federated learning, and trustworthy AI. He has published more than 30 research papers in top conferences and journals, including RTSS, DAC, ICDE, KDD, ICSE, FSE, NeurIPS, TC, and TCAD. He won the FSE 2024 Distinguished Paper Award.
Title:
Optimization Algorithm and Framework Design for Federated Learning
Abstract:
Federated Learning (FL) has emerged as a promising paradigm for distributed machine learning, enabling model training across distributed devices while preserving data privacy. However, challenges such as heterogeneous data distributions, resource-constrained devices, and uncertain physical environments hinder its widespread adoption. This talk explores how to design optimization algorithms and frameworks for federated learning that address these challenges. Specifically, it will show how to mitigate data heterogeneity using heuristic search strategies from the perspective of the loss landscape. In addition, it will introduce how to design federated learning frameworks that use asynchronous training and model-heterogeneity strategies to address the performance degradation caused by limited device resources and uncertain environments.