Can We Theoretically Quantify the Impacts of Local Updates on the Generalization Performance of Federated Learning?

Peizhong Ju, Haibo Yang, Jia Liu, Yingbin Liang, Ness Shroff

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Federated Learning (FL) has gained significant popularity due to its effectiveness in training machine learning models across diverse sites without requiring direct data sharing. While various algorithms along with their optimization analyses have shown that FL with local updates is a communication-efficient distributed learning framework, the generalization performance of FL with local updates has received comparatively less attention. This lack of investigation can be attributed to the complex interplay between data heterogeneity and infrequent communication due to the local updates within the FL framework. This motivates us to investigate a fundamental question in FL: Can we quantify the impact of data heterogeneity and local updates on the generalization performance for FL as the learning process evolves? To this end, we conduct a comprehensive theoretical study of FL’s generalization performance using a linear model as the first step, where the data heterogeneity is considered for both the stationary and online/non-stationary cases. By providing closed-form expressions of the model error, we rigorously quantify the impact of the number of local updates (denoted as K) under three settings (K = 1, K < ∞, and K = ∞) and show how the generalization performance evolves with the number of rounds t. Our investigation also provides a comprehensive understanding of how different configurations (including the number of model parameters p and the number of training samples n) contribute to the overall generalization performance, thus shedding new insights (such as benign overfitting) for implementing FL over networks.
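To make the setting concrete, below is a minimal simulation sketch of FedAvg-style local updates on a linear regression model. The symbols K (local updates per round), p (model parameters), n (training samples per client), and T (communication rounds, the abstract's t) mirror the abstract's notation; the client count, learning rate, noise level, and mean-shift form of data heterogeneity are illustrative assumptions, not the paper's exact setup or closed-form analysis.

    # Minimal sketch: FedAvg-style FL with K local updates on a linear model.
    # K, p, n, T mirror the abstract's notation; all other choices (client
    # count, learning rate, mean-shift heterogeneity) are illustrative.
    import numpy as np

    rng = np.random.default_rng(0)
    p, n, K, T, num_clients, lr = 50, 20, 5, 100, 4, 0.01

    w_true = rng.normal(size=p) / np.sqrt(p)       # ground-truth linear model
    clients = []
    for c in range(num_clients):
        # Data heterogeneity modeled as a per-client mean shift of the features.
        X = rng.normal(loc=0.1 * c, size=(n, p))
        y = X @ w_true + 0.1 * rng.normal(size=n)  # noisy labels
        clients.append((X, y))

    w = np.zeros(p)                                # global model
    for t in range(T):
        local_models = []
        for X, y in clients:
            w_local = w.copy()
            for _ in range(K):                     # K local gradient steps
                grad = X.T @ (X @ w_local - y) / n
                w_local -= lr * grad
            local_models.append(w_local)
        w = np.mean(local_models, axis=0)          # server averages local models

    # Model error ||w - w*||^2 as a proxy for generalization performance;
    # note p > n per client, i.e., the overparameterized regime where
    # benign overfitting can arise.
    print("model error:", np.sum((w - w_true) ** 2))

Varying K in this sketch (K = 1 versus larger K) gives an empirical feel for the trade-off between communication frequency and generalization that the paper quantifies analytically.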

Original language: English
Title of host publication: MobiHoc 2024 - Proceedings of the 2024 International Symposium on Theory, Algorithmic Foundations, and Protocol Design for Mobile Networks and Mobile Computing
Pages: 141-150
Number of pages: 10
ISBN (Electronic): 9798400705212
DOIs
State: Published - Oct 14 2024
Event: 2024 International Symposium on Theory, Algorithmic Foundations, and Protocol Design for Mobile Networks and Mobile Computing, MobiHoc 2024 - Athens, Greece
Duration: Oct 14 2024 – Oct 17 2024

Publication series

Name: Proceedings of the International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc)

Conference

Conference: 2024 International Symposium on Theory, Algorithmic Foundations, and Protocol Design for Mobile Networks and Mobile Computing, MobiHoc 2024
Country/Territory: Greece
City: Athens
Period: 10/14/24 – 10/17/24

Bibliographical note

Publisher Copyright:
© 2024 Copyright held by the owner/author(s).

Keywords

  • Federated Learning
  • Generalization Performance
  • Overfitting

ASJC Scopus subject areas

  • Hardware and Architecture
  • Computer Networks and Communications
  • Software
