Distributed Coordinate Descent Algorithm for Variational Quantum Classification

2023 IEEE International Conference on Quantum Computing and Engineering (QCE)(2023)

Abstract
Quantum Machine Learning (QML) is one of the promising applications of near-term quantum computing. Two popular approaches in QML are kernel methods and variational methods. Variational methods, which use parametrized quantum circuits (PQCs) to encode data and define classifiers, are faster in theory (i.e., O(N) to learn from N training examples) than kernel methods, which use quantum circuits to compute the O(N^2) elements of a kernel matrix. In practice, however, when N is large, variational methods must be sped up because quantum gates are slow. In this work, we propose a parallelization of the training of variational quantum classifiers that exploits the availability of many quantum devices with dozens of qubits. In contrast to existing parallelizations of variational methods based on gradient-based algorithms, we develop a novel distributed coordinate descent algorithm to optimize the parametrized gates of variational quantum circuits. Several gradient-free methods for optimizing PQCs have been shown to converge faster. Focusing on the so-called Free-axis selection (Fraxis) method, we show how such gradient-free methods can be parallelized, and demonstrate their efficacy by running the algorithm on both simulators and IBM Quantum devices. We confirm that the proposed algorithm achieves high classification accuracy and attains an almost-linear speedup with the degree of parallelization.
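The abstract's key idea, block-parallel coordinate descent over the gates of a PQC, can be sketched in a few lines. This is a hedged illustration, not the paper's implementation: the classifier's training loss is replaced by a hypothetical separable toy cost over gate angles, and Fraxis's closed-form single-gate optimum is replaced by a simple grid sweep over one coordinate. All names (`loss`, `optimize_coordinate`, `parallel_sweep`) are invented for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
target = rng.uniform(0.0, 2.0 * np.pi, size=8)  # hypothetical optimal angles

def loss(theta):
    # Toy stand-in for the training loss of a variational quantum classifier.
    return float(np.sum(1.0 - np.cos(theta - target)))

def optimize_coordinate(theta, j, grid=64):
    # One coordinate-descent step: optimize a single "gate" while the rest of
    # the circuit is fixed (a crude proxy for the closed-form one-gate update
    # that gradient-free methods such as Fraxis compute exactly).
    trial = theta.copy()
    best_v, best_l = theta[j], loss(theta)
    for v in np.linspace(0.0, 2.0 * np.pi, grid, endpoint=False):
        trial[j] = v
        l = loss(trial)
        if l < best_l:
            best_v, best_l = v, l
    out = theta.copy()
    out[j] = best_v
    return out

def parallel_sweep(theta, n_workers=4):
    # Distributed flavor: partition the gates across workers (in the paper,
    # separate quantum devices); each worker optimizes its block starting from
    # the same snapshot, and the updated blocks are merged afterwards.
    blocks = np.array_split(np.arange(len(theta)), n_workers)
    merged = theta.copy()
    for block in blocks:  # each iteration would run concurrently on a device
        local = theta.copy()
        for j in block:
            local = optimize_coordinate(local, j)
        merged[block] = local[block]
    return merged

theta = rng.uniform(0.0, 2.0 * np.pi, size=8)
initial_loss = loss(theta)
for _ in range(3):  # a few sweeps suffice for this toy cost
    theta = parallel_sweep(theta)
final_loss = loss(theta)
print(initial_loss, final_loss)
```

Because the toy cost is separable, the merged block updates never conflict, which mirrors the almost-linear speedup the abstract reports; for a real entangled PQC the blocks interact and the merge step is where the algorithmic care lies.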
Keywords
Quantum Applications, Machine Learning, Distributed Algorithm, Variational Quantum Algorithm, Classification