[2603.29520] TrafficMoE: Heterogeneity-aware Mixture of Experts for Encrypted Traffic Classification
Computer Science > Cryptography and Security

arXiv:2603.29520 (cs)

[Submitted on 31 Mar 2026]

Title: TrafficMoE: Heterogeneity-aware Mixture of Experts for Encrypted Traffic Classification

Authors: Qing He, Xiaowei Fu, Lei Zhang

Abstract: Encrypted traffic classification is a critical task for network security. While deep learning has advanced this field, the occlusion of payload semantics by encryption severely challenges standard modeling approaches. Most existing frameworks rely on static, homogeneous pipelines that apply uniform parameter sharing and fixed fusion strategies across all inputs. This one-size-fits-all design is inherently flawed: by forcing structured headers and randomized payloads through a unified processing pipeline, it inevitably entangles raw protocol signals with stochastic encryption noise, degrading fine-grained discriminative features. In this paper, we propose TrafficMoE, a framework that breaks through the bottleneck of static modeling by establishing a Disentangle-Filter-Aggregate (DFA) paradigm. Specifically, to resolve the structural conflict between components, the architecture disentangles headers and payloads using dual-branch sparse Mixture-of-Experts (MoE), enabling modality-specific modeling. To mitigate the impact of stochastic noise, a...
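
The dual-branch sparse MoE disentanglement described in the abstract lends itself to a compact illustration. Below is a minimal sketch of that idea, not the authors' implementation: header bytes and payload bytes are embedded and routed through separate expert pools with top-k gating, so each modality gets its own parameters. All class names, dimensions, expert counts, byte lengths, and the top-k routing rule here are illustrative assumptions.

```python
# Minimal sketch (not the paper's code) of dual-branch sparse MoE routing:
# headers and payloads get disjoint expert pools, so protocol structure and
# encrypted payload bytes are modeled with modality-specific parameters.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoE(nn.Module):
    """Top-k gated mixture of feed-forward experts over token representations."""

    def __init__(self, dim: int, num_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(dim, num_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 2 * dim), nn.GELU(), nn.Linear(2 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, tokens, dim)
        logits = self.gate(x)                             # (B, T, num_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)    # route each token to k experts
        weights = F.softmax(weights, dim=-1)              # normalize over the chosen k
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., slot] == e                # tokens assigned to expert e
                if mask.any():
                    out[mask] += weights[..., slot][mask].unsqueeze(-1) * expert(x[mask])
        return out


class DualBranchMoE(nn.Module):
    """Disentangle headers and payloads into separate sparse-MoE branches."""

    def __init__(self, dim: int = 128):
        super().__init__()
        self.header_embed = nn.Embedding(256, dim)   # raw bytes -> vectors
        self.payload_embed = nn.Embedding(256, dim)
        self.header_moe = SparseMoE(dim)
        self.payload_moe = SparseMoE(dim)

    def forward(self, header_bytes: torch.Tensor, payload_bytes: torch.Tensor):
        h = self.header_moe(self.header_embed(header_bytes))
        p = self.payload_moe(self.payload_embed(payload_bytes))
        return h, p  # modality-specific features for downstream filtering/aggregation


if __name__ == "__main__":
    model = DualBranchMoE()
    hdr = torch.randint(0, 256, (2, 40))    # e.g. 40 header bytes per packet
    pay = torch.randint(0, 256, (2, 256))   # e.g. 256 payload bytes
    h, p = model(hdr, pay)
    print(h.shape, p.shape)                 # (2, 40, 128) and (2, 256, 128)
```

Keeping the two expert pools disjoint is the point of the disentanglement step: no expert ever sees both structured header fields and encrypted payload bytes, avoiding the uniform parameter sharing the abstract identifies as the flaw of static pipelines.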