

Expert Lectures in Computer Science and Technology (IX): Yipeng Zhou

Published: 2024-06-11

Talk Title: Enhancing Federated Learning by Sparsifying Transmitted Model Updates



Speaker: Yipeng Zhou


Dr. Yipeng Zhou is a Senior Lecturer with the School of Computing, Faculty of Science and Engineering, Macquarie University. Before joining Macquarie University, he was a research fellow with the University of South Australia and a lecturer with Shenzhen University. He received his Ph.D. and M.Phil. degrees from The Chinese University of Hong Kong and his B.S. degree from the University of Science and Technology of China. He received the 2023 Macquarie University Vice-Chancellor's Research Excellence Award for Innovative Technology and the 2023 IEEE Open Journal of the Communications Society Best Editor Award, and was the recipient of a 2018 Australian Research Council Discovery Early Career Researcher Award (DECRA). His research interests lie in federated learning, data privacy preservation, and networking. He has published 100+ papers in top venues, including IEEE INFOCOM, ICML, IJCAI, ICNP, IWQoS, IEEE ToN, JSAC, TPDS, TMC, and TMM.


Federated learning enables the collaborative training of a machine learning model among geographically dispersed clients by exchanging model updates with a central server over the Internet. However, transmitting these updates between the server and numerous decentralized clients consumes considerable bandwidth and is susceptible to malicious attacks. This presentation showcases our contributions toward improving communication efficiency and preserving privacy in federated learning. Our focus lies in sparsifying the transmission of model updates between the server and clients. By evaluating the learning value, communication cost, and privacy cost of transmitting each individual model update, we suppress the exposure of low-value updates and thereby minimize communication and privacy costs. Extensive experiments on real datasets demonstrate that our algorithms significantly reduce communication costs and bolster privacy protection compared to state-of-the-art federated learning baselines.
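The talk's value- and cost-aware selection algorithms are not spelled out in this announcement. As a rough, hypothetical illustration of the general idea of sparsifying transmitted updates, the sketch below applies a plain magnitude-based top-k sparsifier to a client's model update before transmission; the function name and the `keep_ratio` parameter are illustrative assumptions, not part of the speaker's method.

```python
def sparsify_update(update, keep_ratio=0.1):
    """Zero out all but the k largest-magnitude coordinates of an update.

    NOTE: this is a generic top-k sparsifier, not the value/communication/
    privacy-cost-aware selection described in the talk; it only illustrates
    that most coordinates of an update can be dropped before transmission.
    """
    k = max(1, int(keep_ratio * len(update)))
    # Indices of the k largest-magnitude entries.
    top = sorted(range(len(update)), key=lambda i: abs(update[i]), reverse=True)[:k]
    keep = set(top)
    # Coordinates outside the top-k are zeroed; only (index, value) pairs
    # for the survivors would need to be sent to the server.
    return [v if i in keep else 0.0 for i, v in enumerate(update)]

# Example: keep 30% of a 10-dimensional update.
update = [0.5, -2.0, 0.1, 3.0, -0.2, 0.05, 1.5, -0.3, 0.02, 0.9]
sparse = sparsify_update(update, keep_ratio=0.3)
print(sparse)  # [0.0, -2.0, 0.0, 3.0, 0.0, 0.0, 1.5, 0.0, 0.0, 0.0]
```

Sending only the surviving (index, value) pairs reduces uplink traffic roughly by a factor of `1/keep_ratio`, at the cost of discarding the information carried by the zeroed coordinates.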