FEDCOM: FEDerated Communication-Efficient Object Detection Model / Amato, F.; Bruna, C. D.; Savoia, M.; Prezioso, E.; Piccialli, F. - In: IEEE WIRELESS COMMUNICATIONS. - ISSN 1536-1284. - (2026), pp. 1-9. [10.1109/MWC.2025.3645721]
FEDCOM: FEDerated Communication-Efficient Object Detection Model
Amato F.; Savoia M.; Prezioso E.; Piccialli F.
2026
Abstract
The emergence of 6G edge intelligence demands highly efficient, adaptive Machine Learning (ML) solutions that operate under severe resource constraints. In this paper, we introduce FEDCOM, a scalable framework that supports Continual Learning (CL) in federated edge environments through selective communication and adaptive model updates. FEDCOM builds on Federated Continual Learning (FCL), which combines Federated Learning with CL, and adds low-power optimization strategies to minimize energy usage and bandwidth consumption during server-client communication. The core idea is to reduce the environmental and computational costs of distributed training by transmitting only unfrozen (trainable) layers and by scheduling client participation according to a data-novelty metric. This approach avoids unnecessary computation and redundant communication, aligning with the goals of sustainable, low-latency 6G edge intelligence. We evaluate FEDCOM in a precision agriculture setting on an object detection (OD) task using the YOLOv12s model, leveraging edge datasets across multiple domains and measuring both energy consumption and OD performance in realistic scenarios. Experimental results show that FEDCOM reduces computational cost and carbon footprint compared to full retraining and standard FCL while preserving detection performance: energy consumption drops by over 60% relative to the baseline, especially in later domains, thanks to layer freezing and client selection. FEDCOM demonstrates how FCL, combined with scheduling and communication-efficient strategies, can power real-world 6G TinyML applications, extending the operational lifespan of edge devices and broadening the applicability of FCL to resource-constrained environments.
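The two mechanisms the abstract describes — transmitting only unfrozen layers and scheduling clients by a data-novelty score — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the novelty threshold, and the FedAvg-style aggregation are all assumptions introduced here for clarity.

```python
# Hedged sketch of FEDCOM-style selective communication.
# All names and the threshold value are illustrative assumptions,
# not taken from the paper.

from typing import Dict, List

NOVELTY_THRESHOLD = 0.5  # assumed cutoff for client participation


def select_clients(novelty: Dict[str, float]) -> List[str]:
    """Schedule only clients whose local data looks sufficiently novel."""
    return [cid for cid, score in novelty.items() if score >= NOVELTY_THRESHOLD]


def pack_update(weights: Dict[str, List[float]],
                frozen: Dict[str, bool]) -> Dict[str, List[float]]:
    """Transmit only unfrozen (trainable) layers, cutting bandwidth."""
    return {name: w for name, w in weights.items() if not frozen[name]}


def aggregate(updates: List[Dict[str, List[float]]]) -> Dict[str, List[float]]:
    """FedAvg-style elementwise mean over the layers clients actually sent."""
    return {
        name: [sum(vals) / len(vals)
               for vals in zip(*(u[name] for u in updates))]
        for name in updates[0]
    }
```

Under this sketch, a round would call `select_clients` on the server, have each scheduled client return `pack_update` of its local weights, and merge the partial updates with `aggregate`; frozen layers never leave the device, which is where the bandwidth and energy savings come from.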


