
How to Build backward for Sparse tensor dot Multiplication #703

Open
LuCheng2000 opened this issue Jun 11, 2024 · 0 comments

Comments

@LuCheng2000

I attempted to implement element-wise (dot) multiplication on sparse tensors, matching values by the spatial information in their indices. The network trained successfully, but the results were very poor, so I inspected the gradients to locate the problem and found that all gradients before the dot multiplication operation were None, which means something is wrong with the backward pass. I do not know how backward is implemented in spconv. Is grad_output passed back as an ordinary (dense) tensor? If so, how can I perform the corresponding spatially aligned operation without the indices of grad_output?

import pdb

from torch.autograd import Function


class z_sparse_dot_F(Function):
    @staticmethod
    def forward(ctx, x, y):
        # save both inputs so they are available in backward
        ctx.save_for_backward(x, y)
        return z_sparse_dot_f(x, y)

    @staticmethod
    def backward(ctx, grad_output):
        pdb.set_trace()  # inspect what grad_output actually contains here
        x, y = ctx.saved_tensors
        # d(x * y)/dx = y and d(x * y)/dy = x, applied with the same
        # index-matched multiplication as the forward pass
        grad_x = z_sparse_dot_f(grad_output, y)
        grad_y = z_sparse_dot_f(grad_output, x)
        return grad_x, grad_y
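For reference, a minimal sketch of one way this is commonly handled (assuming the "dot multiplication" is an element-wise product of features at matching voxels, and assuming spconv v2.x's SparseConvTensor API with features, indices, spatial_shape and batch_size): do the index matching outside the autograd Function, hand only the aligned dense feature tensors to it, and rebuild the SparseConvTensor afterwards. grad_output in backward is then an ordinary dense tensor with the same row order as the saved features, so no indices are needed inside backward. (If the Function is instead given SparseConvTensor objects rather than torch.Tensors, autograd cannot track them, which is one possible reason the upstream gradients end up None.)

    import spconv.pytorch as spconv  # assumes spconv v2.x
    from torch.autograd import Function


    class FeatureDot(Function):
        """Element-wise product of two index-aligned dense feature tensors."""

        @staticmethod
        def forward(ctx, feats_x, feats_y):
            # both inputs are plain dense tensors of shape [N, C]
            ctx.save_for_backward(feats_x, feats_y)
            return feats_x * feats_y

        @staticmethod
        def backward(ctx, grad_output):
            # grad_output is a dense tensor with the same [N, C] shape and
            # row order as the saved features, so no indices are required
            feats_x, feats_y = ctx.saved_tensors
            return grad_output * feats_y, grad_output * feats_x


    def sparse_dot(x, y):
        # hypothetical wrapper: assumes x and y are SparseConvTensors whose
        # indices are identical and in the same order; if they are not, the
        # matching/gather step must happen here (it is not differentiable anyway)
        out_feats = FeatureDot.apply(x.features, y.features)
        return spconv.SparseConvTensor(out_feats, x.indices,
                                       x.spatial_shape, x.batch_size)

Because the index alignment happens outside the Function, autograd only ever sees dense feature tensors, and gradients flow back to whatever produced x.features and y.features.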