A cross-layered cluster embedding learning network with regularization for multivariate time series anomaly detection
The Journal of Supercomputing (2023)
Abstract
Devices deployed across diverse industrial scenarios generate large volumes of time-related network traffic, and irregular system operation can have severe consequences. Anomaly detection technologies for identifying potentially abnormal behaviour are therefore essential; moreover, multivariate time series exhibit complex inter-variable dependencies in addition to temporal correlation. However, most previous methods consider only the temporal and inter-variable correlations of time series data and neglect distance metrics among subsequences, which limits their anomaly detection ability. We propose a multivariate time series anomaly detection model based on the encoder–decoder architecture (CCER-ED). The model incorporates a similarity measure between temporal subsequences and a multi-scale feature embedding module to exploit richer interrelated properties. Moreover, the interrelations among sensors are explicitly learned through a graph structure with manifold regularization. On this basis, an improved data fusion approach based on a multi-head self-attention mechanism is designed to capture a global feature representation and effectively integrate the different sources of information. Evaluations and performance analysis on the real-world SWAT and WADI datasets show that the proposed approach improves recall and F1-score over the baselines by up to 9.3% and 8.5%, respectively, outperforming existing methods.
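As a rough illustration of the attention-based fusion described in the abstract, the following PyTorch sketch embeds each sensor's subsequence, fuses the per-sensor embeddings with multi-head self-attention, and scores anomalies by reconstruction error. This is a minimal sketch under assumed module names, dimensions, and scoring rule; it is not the authors' CCER-ED implementation and omits the multi-scale embedding and manifold-regularized graph components.

```python
# Minimal sketch (assumptions, not the authors' CCER-ED): an encoder-decoder
# reconstruction model whose sensor embeddings are fused with multi-head
# self-attention before decoding.
import torch
import torch.nn as nn

class AttentionFusionAE(nn.Module):
    def __init__(self, n_sensors: int, window: int, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        # Per-sensor encoder: embed each sensor's subsequence into d_model features.
        self.encoder = nn.Linear(window, d_model)
        # Multi-head self-attention over the sensor dimension fuses
        # information across sensors into a global representation.
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Decoder reconstructs each sensor's subsequence from the fused features.
        self.decoder = nn.Linear(d_model, window)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_sensors, window)
        h = self.encoder(x)                     # (batch, n_sensors, d_model)
        fused, _ = self.attn(h, h, h)           # self-attention across sensors
        return self.decoder(fused)              # reconstructed windows

# Anomaly score: mean reconstruction error per window; large errors flag anomalies.
model = AttentionFusionAE(n_sensors=51, window=30)
x = torch.randn(8, 51, 30)                      # dummy batch of sliding windows
score = (model(x) - x).pow(2).mean(dim=(1, 2))  # one score per window
```

In practice, a detection threshold on this score would be chosen from validation data; the dimensions above (51 sensors, window of 30) are placeholders rather than values reported in the paper.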
Key words
Anomaly detection, Multivariate time series, Graph structure, Cluster embedding, Attention mechanism