I am working with quantum circuits and am curious about how autograd works in this setting.
I know other libraries like PennyLane offer differentiation of quantum circuits via autograd, but I am trying to understand how to do the same with CUDA Quantum.
See the code below and the error I am receiving:
!pip install cuda-quantum
import cudaq
from cudaq import spin
One can think of qnn as a neural network where the aim is to optimise thetas[0], just like we optimise the weights and biases of a NN. Moreover, rx is just a 2×2 rotation matrix, so it should be differentiable.
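To make the claim concrete, here is a small NumPy stand-in (not CUDA Quantum code, just a sanity check I wrote for illustration): the RX gate really is a smooth 2×2 matrix in theta, and the expectation of Z after applying RX(theta) to |0⟩ works out to cos(theta), which is clearly differentiable.

```python
import numpy as np

def rx(theta):
    # RX(theta) = [[cos(t/2), -i*sin(t/2)], [-i*sin(t/2), cos(t/2)]]
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

def expectation_z(theta):
    # <Z> for the state RX(theta)|0>
    state = rx(theta) @ np.array([1.0, 0.0])
    z = np.array([[1.0, 0.0], [0.0, -1.0]])
    return np.real(state.conj() @ z @ state)

theta = 2.4
print(expectation_z(theta))  # equals cos(2.4), about -0.737
print(np.cos(theta))
```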
from autograd import grad
gradf = grad(qnn)
gradf(2.4)
---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
Cell In[54], line 3
1 gradf = grad(qnn)
----> 3 gradf(2.4)
File ~/.local/lib/python3.10/site-packages/autograd/wrap_util.py:20, in unary_to_nary.<locals>.nary_operator.<locals>.nary_f(*args, **kwargs)
18 else:
19 x = tuple(args[i] for i in argnum)
---> 20 return unary_operator(unary_f, x, *nary_op_args, **nary_op_kwargs)
File ~/.local/lib/python3.10/site-packages/autograd/differential_operators.py:28, in grad(fun, x)
21 @unary_to_nary
22 def grad(fun, x):
23 """
24 Returns a function which computes the gradient of `fun` with respect to
25 positional argument number `argnum`. The returned function takes the same
26 arguments as `fun`, but returns the gradient instead. The function `fun`
27 should be scalar-valued. The gradient has the same type as the argument."""
---> 28 vjp, ans = _make_vjp(fun, x)
29 if not vspace(ans).size == 1:
30 raise TypeError("Grad only applies to real scalar-output functions. "
31 "Try jacobian, elementwise_grad or holomorphic_grad.")
File ~/.local/lib/python3.10/site-packages/autograd/core.py:10, in make_vjp(fun, x)
...
7 kernel.rx(thetas[0], qubits[0])
9 hamiltonian = spin.z(0)
---> 11 return cudaq.observe(kernel, hamiltonian, theta_val).expectation()
RuntimeError: Invalid list-like argument to Kernel.__call__()
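My current guess is that autograd cannot trace through cudaq.observe, since it is an opaque call into the C++ simulator rather than a chain of NumPy operations, so the autograd box it passes in is rejected. For what it's worth, libraries like PennyLane often fall back on the parameter-shift rule instead, which only needs two extra circuit evaluations per parameter. Here is a sketch of that rule, with the cudaq.observe call swapped for its analytic value cos(theta) purely so the snippet is self-contained:

```python
import numpy as np

def expval(theta):
    # stand-in for cudaq.observe(kernel, hamiltonian, theta).expectation();
    # for RX(theta) on |0> measured in Z, this is cos(theta)
    return np.cos(theta)

def parameter_shift_grad(f, theta, shift=np.pi / 2):
    # exact gradient for gates generated by a Pauli operator:
    # df/dtheta = (f(theta + pi/2) - f(theta - pi/2)) / 2
    return (f(theta + shift) - f(theta - shift)) / 2

theta = 2.4
print(parameter_shift_grad(expval, theta))  # equals -sin(2.4)
```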
Any insight would be much appreciated. Thanks