[2603.20296] Collaborative Adaptive Curriculum for Progressive Knowledge Distillation
Computer Science > Machine Learning
arXiv:2603.20296 (cs)
[Submitted on 19 Mar 2026]

Title: Collaborative Adaptive Curriculum for Progressive Knowledge Distillation
Authors: Jing Liu, Zhenchao Ma, Han Yu, Bobo Ju, Wenliang Yang, Chengfang Li, Bo Hu, Liang Song

Abstract: Recent advances in collaborative knowledge distillation have demonstrated cutting-edge performance in resource-constrained distributed multimedia learning scenarios. However, achieving such competitiveness requires addressing a fundamental mismatch: high-dimensional teacher knowledge complexity versus heterogeneous client learning capacities, which currently prohibits deployment in edge-based visual analytics systems. Drawing inspiration from curriculum learning principles, we introduce Federated Adaptive Progressive Distillation (FAPD), a consensus-driven framework that orchestrates adaptive knowledge transfer. FAPD hierarchically decomposes teacher features via PCA-based structuring, extracting principal components ordered by variance contribution to establish a natural visual knowledge hierarchy. Clients progressively receive knowledge of increasing complexity through dimension-adaptive projection matrices. Meanwhile, the server monitors network-wide learning stability by tracking global accuracy fluctuations across a temporal consensus window...
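The PCA-based structuring described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; the function names, shapes, and the use of SVD to obtain variance-ordered principal directions are assumptions for illustration only.

```python
import numpy as np

def pca_hierarchy(features: np.ndarray):
    """Decompose teacher features into principal directions ordered by
    variance contribution (illustrative stand-in for the paper's
    PCA-based structuring step)."""
    centered = features - features.mean(axis=0, keepdims=True)
    # Right singular vectors come out ordered by singular value,
    # i.e. by explained variance.
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    explained = (s ** 2) / (s ** 2).sum()
    return vt, explained  # vt[k] is the k-th principal direction

def project_for_client(features: np.ndarray, vt: np.ndarray, k: int):
    """Dimension-adaptive projection: a client with capacity k receives
    only the top-k components of the teacher's feature space."""
    centered = features - features.mean(axis=0, keepdims=True)
    return centered @ vt[:k].T  # shape [n_samples, k]

# Hypothetical teacher features: 128 samples, 16-dimensional.
rng = np.random.default_rng(0)
X = rng.normal(size=(128, 16))
vt, explained = pca_hierarchy(X)
Z = project_for_client(X, vt, k=4)
print(Z.shape)  # (128, 4)
```

A curriculum would then grow `k` over training rounds, so each client starts from the highest-variance (coarsest) components and progressively receives finer ones.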
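The server-side stability tracking over a temporal consensus window can likewise be sketched. The class name, window size, and fluctuation threshold below are hypothetical; the abstract only states that the server tracks global accuracy fluctuations across a window.

```python
from collections import deque

class StabilityMonitor:
    """Illustrative server-side monitor: declares the network stable when
    global accuracy fluctuation within a sliding window drops below a
    tolerance, which could trigger the next curriculum stage."""

    def __init__(self, window: int = 5, tol: float = 0.01):
        self.history = deque(maxlen=window)  # recent global accuracies
        self.tol = tol

    def update(self, global_accuracy: float) -> bool:
        """Record one round's global accuracy; return True once the
        window is full and its max-min spread is below tol."""
        self.history.append(global_accuracy)
        if len(self.history) < self.history.maxlen:
            return False
        fluctuation = max(self.history) - min(self.history)
        return fluctuation < self.tol
```

In use, the server would call `update` after each aggregation round and advance clients to higher-dimensional projections whenever it returns `True`; the reset-on-advance logic is omitted here.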