PERT-GNN: Latency Prediction for Microservice-Based Cloud-Native Applications via Graph Neural Networks

KDD 2023 (2023)

Abstract
Cloud-native applications built on microservice architectures are rapidly replacing traditional monolithic applications. To meet end-to-end QoS guarantees and enhance user experience, each component microservice must be provisioned with sufficient resources to handle incoming API calls. Accurately predicting the latency of microservice-based applications is therefore critical for optimizing resource allocation, yet it is extremely challenging due to the complex dependencies between microservices and their inherent stochasticity. To tackle this problem, various predictors have been designed based on the Microservice Call Graph. However, Microservice Call Graphs do not take API-specific information into account, cannot capture important temporal dependencies, and cannot scale to large applications. In this paper, we propose PERT-GNN, a generic graph neural network-based framework for predicting the end-to-end latency of microservice applications. PERT-GNN characterizes the interactions and dependencies of component microservices observed in prior execution traces of the application using the Program Evaluation and Review Technique (PERT). We then construct a graph neural network over the generated PERT graphs and formulate latency prediction as a supervised graph regression problem solved with a graph transformer. PERT-GNN captures the complex temporal causality across different microservice traces, thereby producing more accurate latency predictions for a variety of applications. Evaluations on datasets generated from common benchmarks and large-scale Alibaba microservice traces show that PERT-GNN outperforms other models by a large margin. In particular, PERT-GNN predicts the latency of microservice applications with less than 12% mean absolute percentage error.
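To make the graph-regression formulation concrete, below is a minimal sketch of how a per-trace PERT graph could be fed to a graph-transformer regressor using PyTorch Geometric. The node features, the two-layer TransformerConv architecture, and the toy trace are illustrative assumptions for exposition, not the authors' released implementation.

```python
# Minimal sketch (assumes PyTorch Geometric; feature layout and architecture
# are illustrative, not the paper's code).
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import TransformerConv, global_mean_pool


class LatencyRegressor(torch.nn.Module):
    """Graph-transformer regressor: PERT graph in, scalar end-to-end latency out."""

    def __init__(self, in_dim, hidden_dim=64, heads=4):
        super().__init__()
        self.conv1 = TransformerConv(in_dim, hidden_dim, heads=heads)
        self.conv2 = TransformerConv(hidden_dim * heads, hidden_dim, heads=1)
        self.out = torch.nn.Linear(hidden_dim, 1)

    def forward(self, data):
        x, edge_index = data.x, data.edge_index
        batch = getattr(data, "batch", None)
        if batch is None:  # single graph, no DataLoader batching
            batch = torch.zeros(x.size(0), dtype=torch.long)
        x = F.relu(self.conv1(x, edge_index))
        x = F.relu(self.conv2(x, edge_index))
        x = global_mean_pool(x, batch)      # one embedding per trace graph
        return self.out(x).squeeze(-1)      # predicted end-to-end latency


# Hypothetical example: a 3-node PERT graph built from one execution trace,
# with per-span features (e.g. encoded API id, observed span duration).
graph = Data(
    x=torch.randn(3, 8),                        # 3 microservice spans, 8 features each
    edge_index=torch.tensor([[0, 1], [1, 2]]),  # precedence edges span0->span1->span2
    y=torch.tensor([12.5]),                     # measured end-to-end latency (ms)
)
model = LatencyRegressor(in_dim=8)
pred = model(graph)                             # regression target: graph.y
```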
Keywords
delay prediction,microservices,cloud computing,graph neural networks,graph transformers,machine learning