[2603.29520] TrafficMoE: Heterogeneity-aware Mixture of Experts for Encrypted Traffic Classification


Computer Science > Cryptography and Security
arXiv:2603.29520 (cs)
[Submitted on 31 Mar 2026]

Title: TrafficMoE: Heterogeneity-aware Mixture of Experts for Encrypted Traffic Classification
Authors: Qing He, Xiaowei Fu, Lei Zhang

Abstract: Encrypted traffic classification is a critical task for network security. While deep learning has advanced the field, the occlusion of payload semantics by encryption severely challenges standard modeling approaches. Most existing frameworks rely on static, homogeneous pipelines that apply uniform parameter sharing and static fusion strategies to all inputs. This one-size-fits-all design is inherently flawed: by forcing structured headers and randomized payloads through a unified processing pipeline, it entangles raw protocol signals with stochastic encryption noise, degrading the fine-grained discriminative features. In this paper, we propose TrafficMoE, a framework that breaks through the bottleneck of static modeling by establishing a Disentangle-Filter-Aggregate (DFA) paradigm. Specifically, to resolve the structural conflict between components, the architecture disentangles headers and payloads using a dual-branch sparse Mixture-of-Experts (MoE), enabling modality-specific modeling. To mitigate the impact of stochastic noise, a...
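The abstract describes a dual-branch sparse Mixture-of-Experts, where headers and payloads are routed through separate expert pools so that structured protocol fields never share parameters with high-entropy encrypted bytes. The paper's own architecture and code are not shown here; the following is only a minimal NumPy sketch of the general top-k sparse-MoE routing idea applied to two branches. All names, dimensions, and expert counts are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class SparseMoE:
    """Top-k sparse mixture of experts over feature vectors.

    Each expert is a plain linear map; a learned gate scores all
    experts per input, the top-k are selected, and their outputs
    are mixed with gate weights renormalized over the selected set.
    This is a generic sketch, not the paper's implementation.
    """
    def __init__(self, dim, n_experts=4, k=2, seed=0):
        rng = np.random.default_rng(seed)
        self.experts = [rng.standard_normal((dim, dim)) / np.sqrt(dim)
                        for _ in range(n_experts)]
        self.gate = rng.standard_normal((dim, n_experts)) / np.sqrt(dim)
        self.k = k

    def __call__(self, x):
        # x: (n_tokens, dim)
        logits = x @ self.gate                            # (n, n_experts)
        topk = np.argsort(logits, axis=-1)[:, -self.k:]   # top-k expert ids per token
        out = np.zeros_like(x)
        for i, ids in enumerate(topk):
            w = softmax(logits[i, ids])                   # renormalize over chosen experts
            for weight, e in zip(w, ids):
                out[i] += weight * (x[i] @ self.experts[e])
        return out

# Dual-branch disentanglement: separate expert pools for headers
# and payloads, so the two modalities are modeled independently
# before any later fusion step.
header_moe = SparseMoE(dim=16, n_experts=4, k=2, seed=1)
payload_moe = SparseMoE(dim=16, n_experts=4, k=2, seed=2)

headers = np.ones((3, 16))
payloads = np.random.default_rng(0).standard_normal((3, 16))
h_out, p_out = header_moe(headers), payload_moe(payloads)
```

Because only `k` of the experts fire per token, compute stays roughly constant as the expert pool grows, which is the usual motivation for sparse (rather than dense) MoE layers.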

Originally published on April 01, 2026. Curated by AI News.
