GADformer: A Transparent Transformer Model for Group Anomaly Detection on Trajectories
2024 International Joint Conference on Neural Networks (IJCNN)
Abstract
Group Anomaly Detection (GAD) identifies unusual patterns in groups where individual members might not be anomalous. This task is of major importance across multiple disciplines, in which sequences such as trajectories can also be treated as groups. As groups grow in heterogeneity and size, detecting group anomalies becomes challenging, especially without supervision. Although Recurrent Neural Networks are well-established deep sequence models, their performance can degrade as sequence length increases. Hence, this paper introduces GADformer, a BERT-based model for attention-driven GAD on trajectories in unsupervised and semi-supervised settings. We demonstrate how group anomalies can be detected by attention-based GAD. We also introduce the Block-Attention-anomaly-Score (BAS), which enhances model transparency by scoring attention patterns. In addition, synthetic trajectory generation enables various ablation studies. In extensive experiments, we compare our approach with related work regarding robustness to trajectory noise and novelties on synthetic data and three real-world datasets.
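The abstract describes scoring attention patterns to flag anomalous groups. The sketch below is a hypothetical, minimal illustration of that idea (not the paper's actual BAS formula, which is not given here): a group's attention matrix is compared against a reference pattern, and a larger deviation yields a higher anomaly score. The function name and the use of a mean absolute deviation are assumptions for illustration only.

```python
import numpy as np

def attention_anomaly_score(attn: np.ndarray, reference: np.ndarray) -> float:
    """Toy anomaly score: mean absolute deviation of a group's
    attention matrix from a reference attention pattern.
    (Illustrative stand-in, not the paper's BAS definition.)"""
    return float(np.abs(attn - reference).mean())

rng = np.random.default_rng(0)
# Reference: uniform attention over a 4-token group.
reference = np.full((4, 4), 0.25)
# A "normal" group attends almost uniformly; an "anomalous" one
# has sharply peaked (diagonal) attention.
normal = reference + rng.normal(0.0, 0.01, size=(4, 4))
anomalous = np.eye(4)

print(attention_anomaly_score(normal, reference))     # small deviation
print(attention_anomaly_score(anomalous, reference))  # large deviation
```

Under this toy setup, thresholding the score separates the two patterns; the paper's BAS presumably operates on the transformer's block-level attention maps instead of a fixed reference.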
Key words
Group Anomaly Detection, BERT, Model Inspection, Trajectories, Deep Learning, Artificial Intelligence