Augur: A Step Towards Realistic Drift Detection in Production ML Systems

2022 IEEE/ACM 1st International Workshop on Software Engineering for Responsible Artificial Intelligence (SE4RAI)

Abstract
The inference quality of deployed machine learning (ML) models degrades over time due to differences between training and production data, typically referred to as drift. While large organizations rely on periodic retraining to evade drift, the reality is that not all organizations have the data and the resources required to do so. We propose a process for drift behavior analysis at model development time that determines the set of metrics and thresholds to monitor for runtime drift detection. Better understanding of how models will react to drift before they are deployed, combined with a mechanism for how to detect this drift in production, is an important aspect of Responsible AI. The toolset and experiments reported in this paper provide an initial demonstration of (1) drift behavior analysis as a part of the model development process, (2) metrics and thresholds that need to be monitored for drift detection in production, and (3) libraries for drift detection that can be embedded in production monitoring infrastructures.

CCS Concepts
• Software and its engineering → Software design engineering; • Computing methodologies → Machine learning.
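The abstract describes the general pattern of fixing drift metrics and thresholds at development time and checking them at runtime. The following is a minimal sketch of that pattern, not the Augur toolset itself: the metric (a two-sample Kolmogorov–Smirnov statistic), the bootstrap-based threshold calibration, and the function names `calibrate_threshold` and `check_drift` are all assumptions made for illustration.

```python
# Illustrative sketch only; NOT the Augur toolset or the paper's exact method.
# Metric choice (KS statistic) and bootstrap thresholding are assumptions.
import numpy as np
from scipy.stats import ks_2samp

def calibrate_threshold(train_feature, n_trials=200, quantile=0.99, seed=None):
    """Development-time step: estimate how large the KS statistic gets
    between resamples of the training data, and use a high quantile of
    that distribution as the drift-alert threshold."""
    rng = np.random.default_rng(seed)
    stats = []
    for _ in range(n_trials):
        a = rng.choice(train_feature, size=len(train_feature), replace=True)
        b = rng.choice(train_feature, size=len(train_feature), replace=True)
        stats.append(ks_2samp(a, b).statistic)
    return float(np.quantile(stats, quantile))

def check_drift(train_feature, production_feature, threshold):
    """Runtime step: flag drift when the KS statistic between training and
    production data exceeds the development-time threshold."""
    stat = ks_2samp(train_feature, production_feature).statistic
    return stat > threshold, stat

# Usage: calibrate on training data, then monitor a production window.
train = np.random.default_rng(0).normal(0.0, 1.0, 5_000)
threshold = calibrate_threshold(train, seed=1)
drifted, stat = check_drift(train, train + 0.5, threshold)  # shifted window
print(f"drift={drifted}, ks={stat:.3f}, threshold={threshold:.3f}")
```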
Keywords
software engineering,machine learning,drift detection,model monitoring,responsible AI