
Allow flash attention 2 and upgrade to transformers 4.34.1 #2577

Triggered via pull request October 14, 2023 21:15
@dakinggg
synchronize #672
Status: Failure
Total duration: 22m 37s

pr-gpu.yaml

on: pull_request_target
Matrix: pytest-gpu

Annotations

1 error
gpu-2.1.0 / pytest-gpu: Process completed with exit code 1.
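For context on the change under test: in transformers 4.34.x, flash attention 2 is opted into at model load time. A minimal sketch, assuming flash-attn is installed, the GPU supports it, and the model architecture has FA2 support (the model id below is a placeholder, not taken from this PR):

import torch
from transformers import AutoModelForCausalLM

# Minimal sketch: enable flash attention 2 when loading a model with
# transformers 4.34.x. The model id is illustrative, not from this PR.
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",   # placeholder model id
    torch_dtype=torch.bfloat16,   # FA2 requires fp16 or bf16 weights
    use_flash_attention_2=True,   # flag introduced in transformers 4.34
)

If flash-attn is missing or the GPU is unsupported, from_pretrained raises an error rather than silently falling back, which is one plausible way a pytest-gpu job like the one above could exit with code 1.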