Commit 64e2f03: update error message
dakinggg committed Oct 10, 2023
1 parent: cbbbf1d
Showing 1 changed file with 1 addition and 1 deletion.
llmfoundry/models/layers/attention.py (2 changes: 1 addition & 1 deletion)

@@ -213,7 +213,7 @@ def flash_attn_fn(
     try:
         from flash_attn import bert_padding, flash_attn_interface  # type: ignore # yapf: disable # isort: skip
     except:
-        raise RuntimeError('Please install flash-attn==1.0.9')
+        raise RuntimeError('Please install flash-attn==1.0.9 or flash-attn==2.3.2')

     check_valid_inputs(query, key, value)
