
Threat intelligence ATT&CK extraction based on the attention transformer hierarchical recurrent neural network

With the rapid growth of cyberattacks worldwide, Tactics, Techniques & Procedures (TTPs) have become the most prevalent advanced indicators of a particular attack in the cybersecurity community. However, extracting TTPs from unstructured cyber threat intelligence (CTI) can be arduous due to the large volume of the intelligence database. Although recent efforts on automatically extracting structured TTPs from unstructured intelligence have achieved promising results, they employ only simple statistical methods for TTP extraction and neglect the dependencies within the hierarchical structure of TTPs. To address these limitations, we propose a novel attention-based method, the Attention-based Transformer Hierarchical Recurrent Neural Network (ATHRNN), to extract TTPs from unstructured CTI. First, a Transformer Embedding Architecture (TEA) is designed to obtain high-level semantic representations of the CTI and of the ATT&CK taxonomy. Subsequently, an Attention Recurrent Structure (ARS) is developed to model the dependencies between the tactic and technique labels in ATT&CK. Finally, a joint Hierarchical Classification (HC) module predicts the final TTPs. Experiments on the collected dataset are encouraging: TTP extraction accuracy improves by 6.5% and 8.2% in terms of Macro-F and Micro-F scores, respectively. © 2022 Elsevier B.V. All rights reserved.
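The abstract outlines three cooperating components: a Transformer encoder (TEA), an attention-equipped recurrent step linking tactic and technique labels (ARS), and a hierarchical classifier (HC). The sketch below is only an illustration of how such a pipeline could plausibly be wired in PyTorch; the layer sizes, label counts, and the exact attention/recurrent connections are assumptions for demonstration, not the authors' implementation.

# Hypothetical ATHRNN-style pipeline sketch; all hyperparameters and wiring are
# assumptions, not taken from the paper.
import torch
import torch.nn as nn

class ATHRNNSketch(nn.Module):
    def __init__(self, vocab_size=30522, d_model=256, n_tactics=14, n_techniques=188):
        super().__init__()
        # TEA: encode the CTI report text with a Transformer encoder.
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        # ARS: a GRU cell carries tactic predictions into the technique step,
        # with attention back over the token representations.
        self.attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
        self.gru = nn.GRUCell(n_tactics, d_model)
        # HC: tactic head first, then a technique head conditioned on the
        # tactic-aware hidden state.
        self.tactic_head = nn.Linear(d_model, n_tactics)
        self.technique_head = nn.Linear(2 * d_model, n_techniques)

    def forward(self, token_ids):
        h = self.encoder(self.embed(token_ids))               # (B, T, d_model)
        doc = h.mean(dim=1)                                   # pooled report representation
        tactic_logits = self.tactic_head(doc)                 # multi-label tactic scores
        state = self.gru(torch.sigmoid(tactic_logits), doc)   # fold tactic predictions back in
        ctx, _ = self.attn(state.unsqueeze(1), h, h)          # attend to tokens given tactic state
        technique_logits = self.technique_head(torch.cat([state, ctx.squeeze(1)], dim=-1))
        return tactic_logits, technique_logits

# Usage: multi-label scores for both hierarchy levels from a tokenized CTI report.
model = ATHRNNSketch()
tactic_scores, technique_scores = model(torch.randint(0, 30522, (2, 128)))

In this sketch the technique classifier sees the tactic predictions through the recurrent state, which is one simple way to encode the tactic-to-technique dependency the abstract emphasizes.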

Keywords: ATT&CK; Cybersecurity; Cyber threat intelligence; Threat report analysis; Multi-label classification; Transformer

Liu, Chenjing; Wang, Junfeng; Chen, Xiangru


Sichuan University

2022

Applied Soft Computing

Indexed in: EI, SCI
ISSN: 1568-4946
Year, Volume (Issue): 2022, 122