OptMark: A Toolkit for Benchmarking Query Optimizers

ACM International Conference on Information and Knowledge Management (2016)

Abstract
Query optimizers have long been considered among the most complex components of a database engine, yet assessing an optimizer's quality remains a challenging task. Indeed, existing performance benchmarks for database engines (such as the TPC benchmarks) assess the query runtime system rather than the query optimizer itself. To address this challenge, this paper introduces OptMark, a benchmark for evaluating the quality of a query optimizer. OptMark is designed to offer a number of desirable properties. First, it decouples the quality of an optimizer from the quality of its underlying execution engine. Second, it independently evaluates both an optimizer's effectiveness (i.e., the quality of the plans it chooses) and its efficiency (i.e., its optimization time). OptMark also includes a generic benchmarking toolkit that is minimally invasive to the DBMS under test: any DBMS can provide a system-specific implementation of a simple API that allows OptMark to run and generate benchmark scores for that system. We have implemented OptMark on the open-source MySQL engine as well as on two commercial database systems. This paper discusses the benchmark's design, the toolkit's functionality and API, and its implementation on these DBMSs. Using these implementations, we assessed the quality of the optimizers of all three systems on the TPC-DS benchmark queries.
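The abstract describes a simple per-system API that any DBMS implements so the toolkit can drive it, plus a separation between plan quality (effectiveness) and optimization time (efficiency). The sketch below illustrates what such an adapter interface and an effectiveness score *could* look like; all class, method, and metric names here are assumptions for illustration, not OptMark's actual API.

```python
# Hypothetical sketch of an OptMark-style adapter API; every name below
# (OptMarkAdapter, optimize, run_plan, effectiveness) is an assumption.
from abc import ABC, abstractmethod
from dataclasses import dataclass
import time


@dataclass
class PlanResult:
    plan: str                   # textual plan chosen by the optimizer
    optimization_time_s: float  # efficiency: time spent optimizing


class OptMarkAdapter(ABC):
    """System-specific hooks a DBMS would implement (names hypothetical)."""

    @abstractmethod
    def optimize(self, sql: str) -> PlanResult:
        """Ask the optimizer for its chosen plan, timing the call."""

    @abstractmethod
    def run_plan(self, plan: str) -> float:
        """Execute a given plan and return its runtime in seconds."""


def effectiveness(adapter: OptMarkAdapter, sql: str, alternatives: list) -> float:
    """Score the chosen plan against alternative plans: 1.0 means the
    optimizer picked the fastest plan; values below 1.0 mean a better
    alternative existed. Because all plans run on the same execution
    engine, the score reflects the optimizer, not raw engine speed."""
    chosen = adapter.optimize(sql)
    chosen_rt = adapter.run_plan(chosen.plan)
    best_rt = min([chosen_rt] + [adapter.run_plan(p) for p in alternatives])
    return best_rt / chosen_rt


# Toy in-memory adapter used only to exercise the interface.
class ToyAdapter(OptMarkAdapter):
    def optimize(self, sql: str) -> PlanResult:
        start = time.perf_counter()
        plan = f"SEQ_SCAN[{sql}]"  # trivially "choose" a sequential scan
        return PlanResult(plan, time.perf_counter() - start)

    def run_plan(self, plan: str) -> float:
        # Pretend runtime grows with plan "size".
        return 0.001 * len(plan)


score = effectiveness(ToyAdapter(), "SELECT 1", ["IDX[SELECT 1]"])
```

In this toy run the shorter alternative plan is "faster", so the score falls below 1.0, signaling that the mock optimizer missed a better plan; the separately recorded `optimization_time_s` would feed the efficiency score.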
Keywords
database systems,query optimizers,benchmarking