Score distillation for anomaly detection

Jeongmin Hong, Seokho Kang

Knowledge-Based Systems (2024)

Abstract
Recently, significant performance improvements have been achieved in deep learning-based anomaly detection methods by introducing large neural network architectures and complex anomaly scoring functions. However, the computational cost and memory usage required in the inference phase have also increased significantly, thereby limiting their use in real-time applications. In this paper, we propose a score distillation method that adopts the concept of knowledge distillation. An existing high-performance anomaly detection method is used as the teacher. A small neural network is then trained as the student to mimic the scoring function of the teacher. In the inference phase, the anomaly score for a query instance is obtained by a single forward pass through the student network without requiring any complicated computation processes. We demonstrate that the proposed method makes anomaly detection faster and more efficient while maintaining high performance.
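The distillation setup described above can be sketched in a few lines. This is a minimal illustrative example, not the paper's actual method: as a stand-in teacher we assume a k-nearest-neighbor distance score (a common anomaly scorer whose inference cost grows with the training-set size), and the student is a small hand-rolled MLP trained with MSE to mimic the teacher's scores. All names and hyperparameters here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unlabeled "normal" training instances (2-D for illustration).
X = rng.normal(0.0, 1.0, size=(500, 2))

def teacher_score(queries, train, k=5):
    """Stand-in teacher: mean distance to the k nearest training points.
    Expensive at inference time (cost scales with the training set)."""
    d = np.linalg.norm(queries[:, None, :] - train[None, :, :], axis=-1)
    return np.sort(d, axis=1)[:, :k].mean(axis=1)

# Distillation targets: the teacher's anomaly scores on the training data.
y = teacher_score(X, X)

# Student: a tiny one-hidden-layer MLP, trained by full-batch gradient
# descent to regress the teacher's scores (MSE loss).
h = 16
W1 = rng.normal(0, 0.5, (2, h)); b1 = np.zeros(h)
W2 = rng.normal(0, 0.5, (h, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(3000):
    a = np.tanh(X @ W1 + b1)             # hidden activations
    pred = (a @ W2 + b2).ravel()         # student scores
    g_pred = 2 * (pred - y)[:, None] / len(X)   # dMSE/dpred
    gW2 = a.T @ g_pred; gb2 = g_pred.sum(0)
    g_z = (g_pred @ W2.T) * (1 - a ** 2)        # backprop through tanh
    gW1 = X.T @ g_z;     gb1 = g_z.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

def student_score(queries):
    """Inference: one forward pass, independent of the training-set size."""
    return (np.tanh(queries @ W1 + b1) @ W2 + b2).ravel()
```

At inference the student replaces the teacher entirely: `student_score` touches only the small weight matrices, so its cost is constant per query, whereas `teacher_score` must compare each query against all stored training instances.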
Keywords
Knowledge distillation, Score distillation, Anomaly detection, Unsupervised learning