Allow flash attention 2 and upgrade to transformers 4.34.1 #2573

Triggered via pull request (synchronize on #672) by @dakinggg, October 14, 2023 04:25
Status: Cancelled
Total duration: 7m 42s

Workflow: pr-gpu.yaml
on: pull_request_target
Matrix: pytest-gpu
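
The pytest-gpu matrix fans the same test job out over several GPU images, and the cancellation pattern below is what a fail-fast matrix produces. The following is a minimal sketch of what such a workflow could look like; the matrix names are taken from this run's job list, but everything else (runner label, steps, the real contents of pr-gpu.yaml) is assumed:

```yaml
# Hypothetical sketch of a pr-gpu.yaml-style workflow. Only the matrix
# entry names come from this run page; the rest is assumed.
name: pr-gpu
on: pull_request_target

jobs:
  pytest-gpu:
    runs-on: ubuntu-latest  # assumption: the real jobs run on GPU runners
    strategy:
      # fail-fast defaults to true: the first failing matrix leg cancels
      # the in-flight siblings, which is what the annotations below show.
      fail-fast: true
      matrix:
        name: [gpu-latest, gpu-2.0.1, gpu-2.1.0, gpu-2.1.0-flash2]
    steps:
      - uses: actions/checkout@v3
      - name: Run GPU tests  # placeholder command, not the real test step
        run: pytest -v tests/
```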

Annotations: 7 errors

gpu-latest / pytest-gpu: Process completed with exit code 1.
gpu-2.0.1 / pytest-gpu: FailFast: cancelling since parallel instance has failed
gpu-2.0.1 / pytest-gpu: The operation was canceled.
gpu-2.1.0 / pytest-gpu: FailFast: cancelling since parallel instance has failed
gpu-2.1.0 / pytest-gpu: The operation was canceled.
gpu-2.1.0-flash2 / pytest-gpu: FailFast: cancelling since parallel instance has failed
gpu-2.1.0-flash2 / pytest-gpu: The operation was canceled.
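
Read together, only the first annotation (gpu-latest, exit code 1) is an actual test failure; the other six are fall-out from fail-fast cancellation, with each of the gpu-2.0.1, gpu-2.1.0, and gpu-2.1.0-flash2 legs reporting both the FailFast cancellation and the resulting "The operation was canceled." error. If results from every PyTorch/flash-attn combination were wanted despite one leg failing, setting `fail-fast: false` under `strategy` in the sketch above would let the remaining legs run to completion.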