Allow flash attention 2 and upgrade to transformers 4.34.1 #2571

Triggered via pull request October 14, 2023 02:41
@dakinggg
synchronize #672
Status: Cancelled
Total duration: 3m 45s
Artifacts

pr-gpu.yaml

on: pull_request_target
Matrix: pytest-gpu
Annotations

6 errors
gpu-latest / pytest-gpu
Process completed with exit code 1.
gpu-2.1.0 / pytest-gpu
Process completed with exit code 1.
gpu-2.0.1 / pytest-gpu
FailFast: cancelling since parallel instance has failed
gpu-2.0.1 / pytest-gpu
Process completed with exit code 1.
gpu-2.1.0-flash2 / pytest-gpu
FailFast: cancelling since parallel instance has failed
gpu-2.1.0-flash2 / pytest-gpu
The operation was canceled.