precommit
dakinggg committed Oct 10, 2023
commit 3f2f11a (1 parent: faa562d)
Showing 1 changed file with 2 additions and 1 deletion: llmfoundry/models/layers/attention.py
@@ -213,7 +213,8 @@ def flash_attn_fn(
     try:
         from flash_attn import bert_padding, flash_attn_interface  # type: ignore # yapf: disable # isort: skip
     except:
-        raise RuntimeError('Please install flash-attn==1.0.9 or flash-attn==2.3.2')
+        raise RuntimeError(
+            'Please install flash-attn==1.0.9 or flash-attn==2.3.2')
 
     check_valid_inputs(query, key, value)
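For context, the code being reformatted is a lazy, guarded import of flash-attn. Below is a minimal standalone sketch of that pattern; the module name, imported symbols, and version pins come from the diff, while the wrapper function `_import_flash_attn` is purely illustrative:

    def _import_flash_attn():
        """Lazily import flash-attn, converting any import failure into an
        actionable install hint (the pattern whose formatting this commit fixes)."""
        try:
            from flash_attn import bert_padding, flash_attn_interface  # type: ignore
        except:  # noqa: E722, mirrors the bare except in the original
            raise RuntimeError(
                'Please install flash-attn==1.0.9 or flash-attn==2.3.2')
        return bert_padding, flash_attn_interface

Note that because the `except` is bare, any failure inside flash-attn's own import machinery, not just a missing package, surfaces as this install hint.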
