Predicting Future Security Events through Attention Neural Network


As attackers grow more sophisticated in their techniques, computer attacks are becoming increasingly complex. In such cases, defenders usually have very little time to determine the exact nature of an attack and the next steps the attacker may take. To get ahead of intruders, we can design an AI-powered system that predicts future intrusions based on current and previous log requests, helping to anticipate and mitigate future network intrusions.

The deep learning model used in this project is the Hierarchical Neural Attention Encoder. The encoder takes all previous network attack events into consideration so that it can learn from them and predict accordingly. The baseline models propose LSTM to solve this problem. While LSTM has proven successful for sequential input data, it has several limitations. One of its biggest drawbacks is that it can only handle on the order of a hundred input steps in a sequence at a time. Also, since it is based on an RNN, its complex structure consumes a lot of system resources. Therefore, this project proposes a Hierarchical Neural Attention Encoder, which provides an advantage over traditional LSTM models.

The Hierarchical Neural Attention Encoder consists of a bidirectional RNN coupled with an attention network. The bidirectional RNN determines the meaning of a sequence of words, and the attention network returns a weight for each word, identifying only the important features and filtering out the noisy features that are not needed in the analysis. It also has higher computational efficiency, processing around 10,000 input vectors at a time. The model will initially be trained on large data sets gathered from different sources, such as Acunetix logs and logs generated by a Python script. Compared to the LSTM baseline, this attention-based model gives better precision and accuracy on attack prediction.
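To make the architecture concrete, below is a minimal illustrative sketch of a hierarchical attention encoder in PyTorch: a bidirectional RNN over the tokens of each log event, attention pooling to produce an event vector, then a second bidirectional RNN plus attention over the sequence of events. This is not the project's actual code; the class names, layer sizes, and input shapes (e.g. AttentionPooling, HierarchicalAttentionEncoder, embed_dim, hidden_dim) are hypothetical choices made for the example.

```python
import torch
import torch.nn as nn

class AttentionPooling(nn.Module):
    """Scores each time step, softmaxes the scores into weights,
    and returns the weighted sum of the hidden states."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, states):  # states: (batch, seq_len, hidden_dim)
        weights = torch.softmax(self.score(states).squeeze(-1), dim=1)  # (batch, seq_len)
        context = torch.bmm(weights.unsqueeze(1), states).squeeze(1)    # (batch, hidden_dim)
        return context, weights

class HierarchicalAttentionEncoder(nn.Module):
    """Token-level BiGRU + attention -> one vector per log event,
    then event-level BiGRU + attention -> one vector per session,
    followed by a classifier that predicts the next attack class."""
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=64, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.token_rnn = nn.GRU(embed_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.token_attn = AttentionPooling(2 * hidden_dim)
        self.event_rnn = nn.GRU(2 * hidden_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.event_attn = AttentionPooling(2 * hidden_dim)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, x):  # x: (batch, num_events, tokens_per_event) of token ids
        batch, num_events, num_tokens = x.shape
        tokens = self.embed(x.view(batch * num_events, num_tokens))
        token_states, _ = self.token_rnn(tokens)
        event_vecs, _ = self.token_attn(token_states)        # one vector per event
        event_vecs = event_vecs.view(batch, num_events, -1)
        event_states, _ = self.event_rnn(event_vecs)
        session_vec, event_weights = self.event_attn(event_states)
        return self.classifier(session_vec), event_weights

# Example usage with random token ids: 8 sessions, 20 events each, 30 tokens per event.
model = HierarchicalAttentionEncoder(vocab_size=5000)
logits, attn_weights = model(torch.randint(0, 5000, (8, 20, 30)))
```

The returned attention weights indicate which past events the model considered most relevant to its prediction, which is the interpretability benefit the attention network provides over a plain LSTM.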
