QiML Algorithm Architecture R&D; New Utilities, Higher Accuracies PDF + Discussion 11/2/23.
Despite all of the funding and attention directed toward quantum computers, Quantum-Inspired Machine Learning (QiML) will likely soon become the primary application of quantum computing. The standard QiML setup requires a CPU, a quantum library such as PennyLane or Qiskit, and additional RAM on a newer laptop. Dedicated GPU- or CPU-based 'quantum simulators' can also be used to gain additional compute power and memory. Quantum circuit parameters are then updated alongside any classical parameters and processes.
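The setup described above can be illustrated without any quantum library at all. The sketch below is a hypothetical NumPy example, not PennyLane or Qiskit code: it simulates a one-qubit parameterized circuit as a statevector on the CPU and computes an exact, noise-free expectation value, which is what a classical 'quantum simulator' does under the hood.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation gate as a 2x2 unitary matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def expectation_z(theta):
    """Prepare |0>, apply RY(theta), return the exact <Z> expectation.

    Because the statevector is held exactly in classical memory, there is
    no sampling or hardware noise: <Z> = cos(theta) for this circuit.
    """
    state = ry(theta) @ np.array([1.0, 0.0])   # statevector on the CPU
    z = np.array([[1.0, 0.0], [0.0, -1.0]])    # Pauli-Z observable
    return float(state.conj() @ z @ state)

print(expectation_z(0.0))      # 1.0
print(expectation_z(np.pi))    # -1.0 (up to floating-point error)
```

Libraries like PennyLane wrap exactly this kind of simulation behind a circuit API and hook the parameters into autodiff frameworks, which is what makes joint quantum-classical training practical.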
QiML has been readily available for 4+ years in models using CPUs or GPUs. Efficient differentiation methods, exact expectation values, and pure quantum states free of hardware noise allow for continual QiML R&D. Next steps should include a simulator-specific quantum circuit architecture for significantly better performance versus classical ML, and the founding of new utilities through classical bits regulated by quantum mechanics. (Slide 3) QiML will likely favor smaller-qubit circuits for tasks such as quantum feature mapping and data re-uploading. Available 'circuit cutting' techniques can also be applied to larger-qubit circuits.
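Data re-uploading on a small simulated circuit can be sketched as below. The single-qubit layer structure, squared-error loss, target label, and learning rate are illustrative assumptions; the parameter-shift rule, however, is a standard exact-differentiation method for such rotation gates and is one of the "efficient differentiation methods" available to simulators.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def circuit_expectation(x, weights):
    """Data re-uploading: re-encode the input x before each trainable layer."""
    state = np.array([1.0, 0.0])
    for w in weights:
        state = ry(x) @ state   # re-upload the classical feature
        state = ry(w) @ state   # trainable rotation
    z = np.array([[1.0, 0.0], [0.0, -1.0]])
    return float(state.conj() @ z @ state)

def parameter_shift_grad(x, weights, i):
    """Exact d<Z>/d(weights[i]) via the parameter-shift rule."""
    shift = np.pi / 2
    plus, minus = weights.copy(), weights.copy()
    plus[i] += shift
    minus[i] -= shift
    return 0.5 * (circuit_expectation(x, plus) - circuit_expectation(x, minus))

# Gradient descent pushing <Z> toward a target label of -1
# with a squared-error loss (illustrative training setup).
x, weights, lr = 0.4, np.array([0.1, -0.2, 0.3]), 0.1
for step in range(100):
    pred = circuit_expectation(x, weights)
    grad = np.array([2 * (pred + 1.0) * parameter_shift_grad(x, weights, i)
                     for i in range(len(weights))])
    weights -= lr * grad
```

Because every expectation value is exact, the parameter-shift gradient here is exact as well; on real hardware the same rule would have to be estimated from noisy shot samples.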
Quantum computers may assist with large problems in QiML workflows in the future, but likely not with the main training, validation, or inference steps. In addition: NVIDIA ran 4,000+ GPUs for quantum state vectors; ACM estimates that a quantum computer with 10,000 error-corrected logical qubits, needed to match a single A100 GPU in performance, is several decades away; QiML experiments are being scaled up by several industry leaders, including Brookhaven Lab with NERSC; and The Wall Street Journal claims that GPUs have now arrived for quantum. (Slide 10) Machine learning accounted for close to $20B in 2022, and as a closely related technology, QiML will likely experience growing adoption rates. 1, 2
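Back-of-the-envelope arithmetic shows why full statevector simulation demands GPU clusters at scale: an n-qubit state holds 2^n complex amplitudes (16 bytes each in complex128), so memory doubles with every added qubit. The helper below is an illustrative sketch of that calculation, not code from the slides.

```python
# Memory needed to hold a full n-qubit statevector in complex128:
# 2**n amplitudes x 16 bytes each. Memory doubles per added qubit,
# which is why multi-GPU clusters are required for large simulations.
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (30, 34, 40):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits: {gib:,.0f} GiB")
# 30 qubits already need 16 GiB; 40 qubits need 16,384 GiB (16 TiB),
# far beyond any single accelerator's memory.
```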