[2602.13024] FedHENet: A Frugal Federated Learning Framework for Heterogeneous Environments
Summary
FedHENet introduces a frugal federated learning framework that enhances energy efficiency and stability in heterogeneous environments while maintaining competitive accuracy.
Why It Matters
This research addresses the challenges of privacy and resource consumption in federated learning, making it particularly relevant for industries handling sensitive data. By minimizing the need for local fine-tuning and hyperparameter tuning, FedHENet offers a more sustainable approach to machine learning, which is crucial in today's eco-conscious landscape.
Key Takeaways
- FedHENet uses a fixed feature extractor to reduce computational costs.
- The framework achieves competitive accuracy while improving energy efficiency by up to 70%.
- It eliminates hyperparameter tuning, reducing the carbon footprint of model training.
- FedHENet is designed for heterogeneous environments, enhancing its applicability.
- The method employs homomorphic encryption for secure client knowledge aggregation.
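The "secure client knowledge aggregation" in the last takeaway relies on additively homomorphic encryption: the server can sum encrypted client contributions without ever decrypting them. As a hedged illustration (not the paper's actual cryptographic stack), here is a toy textbook Paillier scheme with deliberately small primes; client values, key sizes, and variable names are all illustrative, and real deployments would use a vetted library with large keys.

```python
import math
import random

def L(x, n):
    # Paillier's L function: L(x) = (x - 1) / n
    return (x - 1) // n

def keygen(p, q):
    # Toy key generation; p and q must be primes with gcd(pq, (p-1)(q-1)) = 1
    n = p * q
    n2 = n * n
    g = n + 1
    lam = math.lcm(p - 1, q - 1)
    mu = pow(L(pow(g, lam, n2), n), -1, n)
    return (n, g), (lam, mu, n)

def encrypt(pk, m):
    n, g = pk
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(sk, c):
    lam, mu, n = sk
    n2 = n * n
    return (L(pow(c, lam, n2), n) * mu) % n

# Clients encrypt their local contributions; the server multiplies
# ciphertexts, which corresponds to adding the plaintexts.
pk, sk = keygen(1_000_003, 1_000_033)
client_values = [12, 30, 7]          # illustrative per-client statistics
ciphers = [encrypt(pk, v) for v in client_values]

n2 = pk[0] ** 2
agg = 1
for c in ciphers:
    agg = (agg * c) % n2             # homomorphic addition

print(decrypt(sk, agg))              # 49 = 12 + 30 + 7
```

The key property for federated aggregation is that the server only ever sees ciphertexts and the encrypted sum; decryption of the aggregate happens at the key holder.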
Computer Science > Computer Vision and Pattern Recognition
arXiv:2602.13024 (cs) [Submitted on 13 Feb 2026]
Title: FedHENet: A Frugal Federated Learning Framework for Heterogeneous Environments
Authors: Alejandro Dopico-Castro, Oscar Fontenla-Romero, Bertha Guijarro-Berdiñas, Amparo Alonso-Betanzos, Iván Pérez Digón
Abstract: Federated Learning (FL) enables collaborative training without centralizing data, which is essential for privacy compliance in real-world scenarios involving sensitive visual information. Most FL approaches rely on expensive, iterative deep-network optimization, which still risks privacy via shared gradients. In this work, we propose FedHENet, extending the FedHEONN framework to image classification. By using a fixed, pre-trained feature extractor and learning only a single output layer, we avoid costly local fine-tuning. This layer is learned by analytically aggregating client knowledge in a single round of communication using homomorphic encryption (HE). Experiments show that FedHENet achieves competitive accuracy compared to iterative FL baselines while demonstrating superior stability and up to 70% better energy efficiency. Crucially, our method is hyperparameter-free, removing the carbon footprint associated with hyperparameter tuning in standard FL. Code availa...
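The abstract's central idea, learning a single output layer by analytically aggregating client statistics in one communication round, can be sketched with a ridge-regression-style closed form over frozen features. This is a minimal illustration under assumed dimensions and regularization, not the authors' FedHEONN procedure (which additionally performs the aggregation under homomorphic encryption).

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, lam = 16, 3, 1e-2   # frozen-feature dim, classes, ridge term (all illustrative)

def client_stats(X, Y):
    # Each client shares only summary statistics of its frozen-extractor
    # features, never raw images or gradients.
    return X.T @ X, X.T @ Y

# Simulate three clients, each holding features from the fixed extractor
# and one-hot labels.
clients = [
    (rng.normal(size=(50, d)), np.eye(k)[rng.integers(0, k, 50)])
    for _ in range(3)
]

# Server side: sum the statistics (in FedHENet these sums would be
# computed over encrypted values) and solve once, analytically.
stats = [client_stats(X, Y) for X, Y in clients]
G = sum(s[0] for s in stats) + lam * np.eye(d)   # regularized Gram matrix
B = sum(s[1] for s in stats)
W = np.linalg.solve(G, B)                        # output-layer weights

print(W.shape)  # (16, 3)
```

Because the global solution depends only on the summed statistics, one round of communication suffices and no local fine-tuning or learning-rate schedule is needed, which is what makes the approach hyperparameter-free.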