MTLFormer: Multi-Task Learning Guided Transformer Network for Business Process Prediction

IEEE Access (2023)

Abstract
Predictive business process monitoring focuses on forecasting properties of ongoing business process executions, i.e., predicting the next activity, the execution time of the next activity, and the remaining time of an ongoing process instance, based on knowledge gained from historical event logs. Although these three tasks are closely related, recent research has trained separate prediction models for each task, resulting in high cost and time complexity. Moreover, existing techniques are limited in their ability to capture long-range dependencies in process instances, further impeding prediction performance. To address these issues, this paper proposes the MTLFormer approach, which leverages the self-attention mechanism of the Transformer network and trains multiple tasks in parallel through a feature representation shared across tasks. The approach reduces the time complexity of model training while simultaneously improving prediction performance. We extensively evaluate the approach on four real-life event logs, demonstrating its capability to perform multi-task online real-time prediction and to effectively improve prediction accuracy.
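The abstract describes a shared self-attention encoder whose output representation feeds three task heads (next activity, next-event execution time, remaining time). The sketch below illustrates that multi-task structure with plain NumPy; the dimensions, pooling step, and linear heads are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Scaled dot-product self-attention over the event prefix:
    # every event attends to every other, so long-range
    # dependencies in the trace are captured in one step.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    return softmax(scores, axis=-1) @ V

rng = np.random.default_rng(0)
seq_len, d_model = 5, 8                  # assumed: 5 events seen so far, 8-dim embeddings
X = rng.normal(size=(seq_len, d_model))  # embedded prefix of the running case

Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
H = self_attention(X, Wq, Wk, Wv)        # shared representation for all tasks
shared = H.mean(axis=0)                  # pool the prefix into one vector (assumption)

# Three task-specific heads read the same shared features,
# so one forward pass serves all three predictions.
n_activities = 4                         # hypothetical activity vocabulary size
W_act = rng.normal(size=(d_model, n_activities))
w_time = rng.normal(size=d_model)
w_rem = rng.normal(size=d_model)

next_activity = softmax(shared @ W_act)  # task 1: next-activity distribution
next_event_time = shared @ w_time        # task 2: execution time of next activity
remaining_time = shared @ w_rem          # task 3: remaining cycle time
```

In training, the three task losses would be combined (e.g., summed) and backpropagated through the shared encoder, which is what lets one model replace three separately trained ones.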
Keywords
Multi-task learning, predictive business process monitoring, self-attention, Transformer