Commit

fix GQA error message
Signed-off-by: Charlene Yang <[email protected]>
cyanguwa committed Nov 12, 2024
1 parent 237b493 commit 062a7d0
Showing 1 changed file with 1 addition and 1 deletion.
transformer_engine/pytorch/attention.py (1 addition, 1 deletion)

@@ -7951,7 +7951,7 @@ def forward(
         assert (
             key_layer.shape[-2] == self.num_gqa_groups_per_partition
             and value_layer.shape[-2] == self.num_gqa_groups_per_partition
-        ), f"Keys and values must have num_gqa_group = {self.num_gqa_groups} heads!"
+        ), f"Keys and values must have num_gqa_group = {self.num_gqa_groups_per_partition} heads!"
         assert qkv_format in [
             "sbhd",
             "bshd",
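The fix makes the f-string report the same value the condition actually tests: under tensor parallelism the per-rank KV head count is `num_gqa_groups_per_partition` (the global `num_gqa_groups` divided across ranks), so the old message could print a misleading number when the assertion fired. A minimal sketch of the corrected check follows; the class and method names are illustrative stand-ins, not Transformer Engine's real API:

```python
class GQAAttentionStub:
    """Simplified stand-in illustrating the corrected assertion."""

    def __init__(self, num_gqa_groups, tp_size=1):
        self.num_gqa_groups = num_gqa_groups                           # global KV head count
        self.num_gqa_groups_per_partition = num_gqa_groups // tp_size  # per-TP-rank count

    def check_kv_heads(self, key_shape, value_shape):
        # shape[-2] is the KV-head dimension in both "sbhd" and "bshd" layouts.
        # After the fix, the message reports the per-partition count the
        # condition compares against, not the global count.
        assert (
            key_shape[-2] == self.num_gqa_groups_per_partition
            and value_shape[-2] == self.num_gqa_groups_per_partition
        ), f"Keys and values must have num_gqa_group = {self.num_gqa_groups_per_partition} heads!"


# 8 global KV heads split across 2 tensor-parallel ranks -> 4 heads per rank,
# so tensors shaped [seq, batch, kv_heads, head_dim] must carry 4 KV heads here.
attn = GQAAttentionStub(num_gqa_groups=8, tp_size=2)
attn.check_kv_heads((128, 2, 4, 64), (128, 2, 4, 64))  # passes
```

Passing tensors with the global head count (8) on a 2-way partition now raises an `AssertionError` whose message names the correct per-partition expectation (4), which is the point of the one-line change.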
