make style
ylacombe committed Jul 26, 2023
1 parent 5ebcb6f commit 29f31b8
Showing 1 changed file with 1 addition and 1 deletion.
optimum/bettertransformer/models/attention.py (1 addition, 1 deletion)
@@ -107,7 +107,7 @@ def bark_wrapped_scaled_dot_product(
     is_causal = self.is_causal and query.shape[2] != 1

     sdpa_result = torch.nn.functional.scaled_dot_product_attention(
-        query, key, value, attn_mask=None, dropout_p=self.dropout if self.training else 0., is_causal=is_causal
+        query, key, value, attn_mask=None, dropout_p=self.dropout if self.training else 0.0, is_causal=is_causal
     )

     return sdpa_result, None
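For context, the line touched by this style fix is a call to PyTorch's `torch.nn.functional.scaled_dot_product_attention`, with dropout applied only in training mode and causal masking skipped for single-token (decode-step) queries. The following is a minimal, hedged sketch of that pattern in a standalone module; `TinyAttention`, its `dropout`/`is_causal` attributes, and the tensor shapes are illustrative assumptions, not the actual Bark attention class from optimum.

```python
# Illustrative sketch (not the optimum/Bark implementation): a module that
# calls scaled_dot_product_attention the same way as the diffed line above.
import torch
import torch.nn.functional as F


class TinyAttention(torch.nn.Module):
    def __init__(self, dropout: float = 0.1, is_causal: bool = True):
        super().__init__()
        self.dropout = dropout      # assumed attribute, mirroring self.dropout
        self.is_causal = is_causal  # assumed attribute, mirroring self.is_causal

    def forward(self, query, key, value):
        # Apply the causal mask only when the query holds more than one
        # position (query.shape[2] != 1), i.e. not a single-token decode step.
        is_causal = self.is_causal and query.shape[2] != 1
        return F.scaled_dot_product_attention(
            query,
            key,
            value,
            attn_mask=None,
            # Dropout is active only in training mode; in eval it is 0.0
            # (the commit merely reformats the literal 0. to 0.0).
            dropout_p=self.dropout if self.training else 0.0,
            is_causal=is_causal,
        )


attn = TinyAttention().eval()         # eval mode, so dropout_p is 0.0
q = k = v = torch.randn(1, 2, 4, 8)   # (batch, heads, seq_len, head_dim)
out = attn(q, k, v)                   # same shape as the query
```

Note that the change itself is purely cosmetic: `0.` and `0.0` are the same float literal in Python, so the commit (consistent with its "make style" message) only normalizes formatting.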
