Abstract
Federated Learning (FL) enables multiple resource-constrained, heterogeneous edge devices to collaboratively train a global model. However, devices with limited capacity can create bottlenecks and slow down model convergence. One effective way to address this issue is efficient feature selection, which reduces overall resource demands by minimizing communication and computation costs, thereby mitigating the impact of straggler nodes. Existing federated feature selection (FFS) methods either treat feature selection as a step separate from FL or rely on a third party. Both approaches increase computation and communication overhead, making them impractical for real-world high-dimensional datasets. To address this, we present Dynamic Sparse Federated Feature Selection (DSFFS), the first embedded FFS method that is efficient in both communication and computation. In the proposed method, feature selection occurs simultaneously with model training: input-layer neurons, their connections, and hidden-layer connections are dynamically pruned and regrown during training, eliminating uninformative features. This process enhances computational efficiency on devices, improves network communication efficiency, and boosts global model performance. Experiments are conducted on nine real-world datasets of varying dimensionality from diverse domains, including biology, image, speech, and text. The results under a realistic non-iid data distribution setting show that our approach achieves a better trade-off between accuracy, computation, and communication costs by selecting more informative features than other state-of-the-art FFS methods.
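The prune-and-regrow mechanism described in the abstract can be sketched as a generic dynamic-sparse-training (SET-style) update on the input layer: the weakest active connections are pruned by magnitude and the same number are regrown at random empty positions, so input features whose connections never survive end up with zero importance. This is a minimal illustration under assumed conventions; the function and parameter names (`prune_and_regrow`, `prune_frac`, `feature_scores`) are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def prune_and_regrow(w, prune_frac=0.3):
    """One dynamic-sparse update on an input-layer weight matrix.

    Prunes the smallest-magnitude active weights and regrows the same
    number of connections at randomly chosen zero positions, keeping
    the layer's sparsity level constant across training rounds.
    """
    active = np.flatnonzero(w)                 # indices of existing connections
    n_prune = int(len(active) * prune_frac)
    if n_prune == 0:
        return w
    # Drop the n_prune weakest connections by magnitude.
    weakest = active[np.argsort(np.abs(w.flat[active]))[:n_prune]]
    w.flat[weakest] = 0.0
    # Regrow the same number of connections at random empty positions.
    empty = np.flatnonzero(w == 0)
    grown = rng.choice(empty, size=n_prune, replace=False)
    w.flat[grown] = rng.normal(scale=0.01, size=n_prune)
    return w

def feature_scores(w):
    """Rank input features by the total magnitude of their surviving
    outgoing connections; a zero score marks an uninformative feature."""
    return np.abs(w).sum(axis=1)
```

In an embedded FFS setting, a client would apply such an update between local training steps and report only the surviving connections, which is where the communication savings come from.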
| Original language | English |
|---|---|
| Title of host publication | International Joint Conference on Neural Networks, IJCNN 2025 - Proceedings |
| ISBN (Electronic) | 9798331510428 |
| State | Published - 2025 |
| Event | 2025 International Joint Conference on Neural Networks, IJCNN 2025 - Rome, Italy (Jun 30 2025 → Jul 5 2025) |
Publication series
| Name | Proceedings of the International Joint Conference on Neural Networks |
|---|---|
| ISSN (Print) | 2161-4393 |
| ISSN (Electronic) | 2161-4407 |
Conference
| Conference | 2025 International Joint Conference on Neural Networks, IJCNN 2025 |
|---|---|
| Country/Territory | Italy |
| City | Rome |
| Period | 6/30/25 → 7/5/25 |
Bibliographical note
Publisher Copyright: © 2025 IEEE.
Funding
This work is funded by a CAREER grant from the National Science Foundation (NSF) under grant number 2340075.
| Funders | Funder number |
|---|---|
| National Science Foundation Arctic Social Science Program | 2340075 |
Keywords
- Dynamic sparse training
- Feature selection
- Federated learning
ASJC Scopus subject areas
- Software
- Artificial Intelligence