Dli: Deep Learning Inference Benchmark

Supercomputing (RuSCDays 2019), 2019

Abstract
We examine the problem of performance evaluation for deep neural networks. We develop software that, unlike existing tools, focuses on evaluating the inference performance of deep models on CPUs, integrated graphics, and embedded devices. The implementation is open source and freely available on GitHub: https://github.com/itlab-vision/openvino-dl-benchmark. The software is verified using the well-known classification model ResNet-152 and the Inference Engine component of the OpenVINO toolkit, which is distributed by Intel. The primary advantage of the OpenVINO toolkit is the absence of restrictions on the choice of a library for model training, since the toolkit contains a utility for converting models into its own intermediate format. We analyze the performance of ResNet-152 in synchronous and asynchronous inference modes on Intel CPUs and Intel Processor Graphics, and we provide recommendations on selecting optimal execution parameters. Inference performance results for more than 20 well-known deep models on the available hardware are posted on the project web page: http://hpc-education.unn.ru/dli.
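The distinction between the synchronous (latency-oriented) and asynchronous (throughput-oriented) modes the abstract measures can be sketched in plain Python. This is a minimal illustration of the measurement methodology only, not the OpenVINO Inference Engine API: `infer` is a hypothetical stand-in for a single inference request, and the thread pool mimics keeping several infer requests in flight.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def infer(batch):
    # Hypothetical stand-in for one inference request;
    # the sleep mimics per-request device latency.
    time.sleep(0.005)
    return len(batch)

def benchmark_sync(batches):
    """Synchronous mode: requests issued one at a time;
    the per-request latency dominates total time."""
    start = time.perf_counter()
    results = [infer(b) for b in batches]
    return results, time.perf_counter() - start

def benchmark_async(batches, num_requests=4):
    """Asynchronous mode: several requests kept in flight
    concurrently, trading latency for throughput."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=num_requests) as pool:
        results = list(pool.map(infer, batches))
    return results, time.perf_counter() - start
```

With this stand-in, the asynchronous run finishes in roughly `1/num_requests` of the synchronous time, which is the effect the choice of execution parameters (number of parallel requests, streams) exploits on real hardware.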
Keywords
Deep learning, Inference engine, Performance evaluation