[2603.29537] Mean Masked Autoencoder with Flow-Mixing for Encrypted Traffic Classification
Computer Science > Cryptography and Security

arXiv:2603.29537 (cs)

[Submitted on 31 Mar 2026]

Title: Mean Masked Autoencoder with Flow-Mixing for Encrypted Traffic Classification

Authors: Xiao Liu, Xiaowei Fu, Fuxiang Huang, Lei Zhang

Abstract: Network traffic classification using self-supervised pre-training models based on Masked Autoencoders (MAE) has shown great potential. However, existing methods are confined to isolated byte-level reconstruction of individual flows and lack adequate perception of the multi-granularity contextual relationships in traffic. To address this limitation, we propose Mean MAE (MMAE), a teacher-student MAE paradigm with a flow-mixing strategy for building an encrypted-traffic pre-training model. MMAE employs a self-distillation mechanism for teacher-student interaction, in which the teacher provides unmasked flow-level semantic supervision to advance the student from local byte reconstruction to multi-granularity comprehension. To break the information bottleneck of individual flows, we introduce a dynamic Flow Mixing (FlowMix) strategy that replaces the traditional random masking mechanism. By constructing challenging cross-flow mixed samples with interferences, it compels the model to learn discriminative representations from distorted tokens. Furthermore, we design a Packet-importa...
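The abstract only sketches FlowMix at a high level: instead of masking tokens of a single flow, some positions in one flow are overwritten with tokens from another flow, and the model must still recover discriminative structure from the distorted sequence. A minimal illustrative sketch of that idea (all names and the interface are hypothetical, not the paper's code) might look like:

```python
import random

def flow_mix(flow_a, flow_b, mix_ratio=0.3, seed=None):
    """Replace a random subset of flow_a's tokens with flow_b's tokens.

    Returns the mixed token sequence plus a boolean mask marking which
    positions now hold "interference" tokens from the other flow; the
    mask plays the role that the random mask plays in a vanilla MAE.
    """
    assert len(flow_a) == len(flow_b), "flows must be token-aligned"
    rng = random.Random(seed)
    n = len(flow_a)
    k = max(1, round(n * mix_ratio))          # number of positions to corrupt
    swapped = set(rng.sample(range(n), k))    # positions taken from flow_b
    mixed = [flow_b[i] if i in swapped else flow_a[i] for i in range(n)]
    mask = [i in swapped for i in range(n)]
    return mixed, mask

# Usage: mix 25% of the byte tokens of flow A with bytes from flow B.
flow_a = [0x45, 0x00, 0x00, 0x3C, 0x1C, 0x46, 0x40, 0x00]
flow_b = [0x60, 0x0D, 0x0A, 0x00, 0x00, 0x28, 0x06, 0x40]
mixed, mask = flow_mix(flow_a, flow_b, mix_ratio=0.25, seed=0)
```

In a real pre-training pipeline the mixed sequence would be fed to the student encoder, with the mask indicating which tokens are cross-flow interference rather than simply hidden.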
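The abstract does not state MMAE's exact teacher update, but in "mean" teacher-student self-distillation schemes of this kind the teacher is commonly maintained as an exponential moving average (EMA) of the student's weights, so it supplies a smoothed, unmasked view as supervision. A generic sketch of such an update, with plain floats standing in for weight tensors and all names hypothetical:

```python
def ema_update(teacher_params, student_params, momentum=0.999):
    """One EMA step: teacher <- momentum * teacher + (1 - momentum) * student.

    A high momentum keeps the teacher a slowly moving average of the
    student, which is what makes its flow-level targets stable.
    """
    return [momentum * t + (1.0 - momentum) * s
            for t, s in zip(teacher_params, student_params)]

# Usage: after each student optimization step, refresh the teacher.
teacher = [1.0, 0.0]
student = [0.0, 1.0]
teacher = ema_update(teacher, student, momentum=0.9)
```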