LETFORMER: LIGHTWEIGHT TRANSFORMER PRE-TRAINING WITH SHARPNESS-AWARE OPTIMIZATION FOR EFFICIENT ENCRYPTED TRAFFIC ANALYSIS

Reliable encrypted traffic classification is fundamental to advancing cyber-security and to managing exponentially growing data streams. The success of large language models in fields such as natural language processing demonstrates the feasibility of learning general paradigms from extensive corpora, making pre-trained encrypted traffic classification methods a preferred choice. However, attention-based pre-trained classification methods face two key constraints: their large parameter counts make them unsuitable for low-computation environments such as mobile devices and real-time classification scenarios, and they tend to fall into sharp local minima, leading to overfitting. We develop a shallow, lightweight Transformer model named LETformer. We apply sharpness-aware optimization during pre-training to avoid sharp local minima, capture temporal features with relative positional embeddings, and optimize the classifier to maintain classification accuracy on downstream tasks. We evaluate our method on four datasets: USTC-TFC2016, ISCX-VPN2016, ISCX-Tor2016, and CICIoT2022. Despite having only 17.6 million parameters, LETformer achieves classification metrics comparable to those of methods with ten times as many parameters.
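Sharpness-aware optimization replaces the usual single gradient step with a two-pass update: first perturb the weights toward the local worst case within a small L2 ball of radius rho, then descend along the gradient computed at that perturbed point, which biases training toward flat minima. Below is a minimal PyTorch sketch of one such step; the names (sam_step, model, loss_fn, base_optimizer, rho) are illustrative assumptions, not the paper's actual implementation.

```python
import torch

def sam_step(model, loss_fn, batch, base_optimizer, rho=0.05):
    """One sharpness-aware minimization step: ascend to the local
    worst-case weights, then descend with the gradient found there."""
    inputs, targets = batch

    # 1) First forward/backward pass: gradient at the current weights.
    loss = loss_fn(model(inputs), targets)
    loss.backward()

    # 2) Perturb the weights to the worst case: w + rho * g / ||g||.
    grads = [p.grad for p in model.parameters() if p.grad is not None]
    grad_norm = torch.norm(torch.stack([g.norm(p=2) for g in grads]), p=2)
    scale = rho / (grad_norm + 1e-12)
    eps = []
    with torch.no_grad():
        for p in model.parameters():
            if p.grad is None:
                eps.append(None)
                continue
            e = p.grad * scale
            p.add_(e)          # climb toward the sharpest direction
            eps.append(e)

    # 3) Second forward/backward pass at the perturbed weights.
    model.zero_grad()
    loss_fn(model(inputs), targets).backward()

    # 4) Restore the original weights, then step with the
    #    sharpness-aware gradient.
    with torch.no_grad():
        for p, e in zip(model.parameters(), eps):
            if e is not None:
                p.sub_(e)
    base_optimizer.step()
    base_optimizer.zero_grad()
    return loss.item()
```

In a pre-training loop, base_optimizer would be a standard optimizer over the same model.parameters() (e.g., AdamW), and sam_step replaces the usual backward/step pair at the cost of one extra forward/backward pass per batch.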

Encrypted traffic classification; LETformer; Deep learning; Sparse relative position embedding; Sharpness-aware optimization

ZHIYAN MENG, DAN LIU, JINTAO MENG

Research Institute of Electronic Science and Technology, University of Electronic Science and Technology of China, No. 2006, Xiyuan Avenue, West Hi-Tech Zone, Chengdu 611731, P. R. China

National Key Laboratory of Security Communication, No. 35, Huangjing Road, Shuangliu County, Chengdu 610041, P. R. China

2025

International Journal of Innovative Computing, Information and Control