A deep learning model for noise prediction on near-term quantum devices

arXiv (2020)

Abstract
We present an approach for a deep-learning compiler of quantum circuits, designed to reduce the output noise of circuits run on a specific device. We train a convolutional neural network on experimental data from a quantum device to learn a hardware-specific noise model. A compiler then uses the trained network as a noise predictor and inserts sequences of gates in circuits so as to minimize expected noise. We tested this approach on the IBM 5-qubit devices and observed a reduction in output noise of 12.3% (95% CI [11.5%, 13.0%]) compared to the circuits obtained by the Qiskit compiler. Moreover, the trained noise model is hardware-specific: applying a noise model trained on one device to another device yields a noise reduction of only 5.2% (95% CI [4.9%, 5.6%]). These results suggest that device-specific compilers using machine learning may yield higher-fidelity operations and provide insights for the design of noise models.
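To make the pipeline in the abstract concrete, the sketch below shows one plausible shape it could take; it is not the authors' code. Assumptions: circuits are encoded as one-hot (gate-type × qubit × time-step) tensors, a small PyTorch CNN maps an encoding to a scalar noise estimate, and a greedy compiler pass inserts identity-equivalent gate pairs wherever that lowers the predicted noise. The encoding scheme, network architecture, gate alphabet, and all function names are illustrative choices, not taken from the paper.

```python
# Hedged sketch of the abstract's pipeline: CNN noise predictor + greedy
# gate-inserting compiler pass. All design choices here are assumptions.
import itertools
import torch
import torch.nn as nn

N_QUBITS, N_STEPS = 5, 16          # 5-qubit device, fixed circuit depth
GATES = ["I", "X", "Z", "H"]       # toy single-qubit gate alphabet
IDENTITY_PAIRS = [("X", "X"), ("Z", "Z"), ("H", "H")]  # U.U = I for these

def encode(circuit):
    """circuit: list of N_STEPS layers, each a list of N_QUBITS gate names.
    Returns a (1, |GATES|, N_QUBITS, N_STEPS) one-hot tensor."""
    t = torch.zeros(1, len(GATES), N_QUBITS, N_STEPS)
    for s, layer in enumerate(circuit):
        for q, g in enumerate(layer):
            t[0, GATES.index(g), q, s] = 1.0
    return t

class NoiseCNN(nn.Module):
    """Small convolutional noise predictor over the circuit 'image'."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(len(GATES), 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(16 * N_QUBITS * N_STEPS, 1),
        )
    def forward(self, x):
        return self.net(x).squeeze(-1)  # one predicted-noise scalar per circuit

def compile_with_predictor(circuit, model):
    """Greedy pass: try filling each pair of adjacent idle ('I') slots on a
    qubit with an identity-equivalent gate pair; keep any change that lowers
    the predicted noise. In the paper the predictor would be trained on
    experimental data from the target device; here it is untrained and serves
    only to show the control flow."""
    model.eval()
    with torch.no_grad():
        best = model(encode(circuit)).item()
        for q, s in itertools.product(range(N_QUBITS), range(N_STEPS - 1)):
            if circuit[s][q] == "I" and circuit[s + 1][q] == "I":
                for g1, g2 in IDENTITY_PAIRS:
                    circuit[s][q], circuit[s + 1][q] = g1, g2
                    if model(encode(circuit)).item() < best:
                        best = model(encode(circuit)).item()  # keep insertion
                        break
                    circuit[s][q], circuit[s + 1][q] = "I", "I"  # revert
    return circuit, best

# Usage: run an all-idle toy circuit through the (untrained) predictor.
circuit = [["I"] * N_QUBITS for _ in range(N_STEPS)]
compiled, predicted = compile_with_predictor(circuit, NoiseCNN())
print(f"predicted noise after greedy pass: {predicted:.4f}")
```

The key design point this illustrates is that the compiler never needs an analytic noise model: it only queries the learned predictor as a black box, which is what makes the approach hardware-specific once the network is trained on a given device's data.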
Keywords
noise prediction, quantum, deep learning, deep learning model, near-term