Use Grad in Div.
samuelburbulla committed Feb 15, 2024
1 parent 5234cc5 commit 1b98f06
1 changed file: src/continuity/pde/grad.py (2 additions, 9 deletions)
```diff
@@ -84,15 +84,8 @@ def forward(self, x: Tensor, u: Tensor, y: Optional[Tensor] = None) -> Tensor:

         assert x.requires_grad, "x must require gradients for divergence operator"

-        # Compute gradients
-        gradients = torch.autograd.grad(
-            u,
-            x,
-            grad_outputs=torch.ones_like(u),
-            create_graph=True,
-            retain_graph=True,
-        )[0]
-
+        # Compute divergence
+        gradients = Grad()(x, u)
         return torch.sum(gradients, dim=-1, keepdim=True)
```
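The commit delegates the gradient computation inside the divergence operator to the existing `Grad` operator instead of calling `torch.autograd.grad` directly. A minimal standalone sketch of what the two operators compute, using plain functions `grad` and `div` as hypothetical stand-ins for the repository's module classes (the tensor shapes and the identity test field are illustrative assumptions, not taken from the source):

```python
import torch

def grad(x, u):
    # Gradient of (the sum of the components of) u with respect to x,
    # computed via autograd; mirrors what the removed inline code did.
    return torch.autograd.grad(
        u,
        x,
        grad_outputs=torch.ones_like(u),
        create_graph=True,
        retain_graph=True,
    )[0]

def div(x, u):
    # Divergence: sum the gradient components over the last (spatial) axis,
    # as in the Div.forward shown in the diff.
    return torch.sum(grad(x, u), dim=-1, keepdim=True)

# Identity vector field u(x) = x in 3 spatial dimensions:
# its divergence is exactly 3 at every point.
x = torch.rand(4, 3, requires_grad=True)
u = 1.0 * x
print(div(x, u))  # shape (4, 1), all entries 3.0
```

Factoring the autograd call into `grad` keeps a single place where `create_graph`/`retain_graph` are set, so higher-order derivatives built on top of the divergence remain differentiable.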


