Allow flash attention 2 and upgrade to transformers 4.34.1 #2444

Triggered via pull request October 14, 2023 21:25
Status Success
Total duration 21m 51s
Artifacts 3

pr-cpu.yaml

on: pull_request
Matrix: pytest-cpu
Coverage Results / coverage
11s
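The run metadata above (workflow `pr-cpu.yaml`, trigger `pull_request`, matrix job `pytest-cpu`, a downstream `Coverage Results / coverage` job, and three per-version coverage artifacts) is consistent with a workflow file along these lines. This is a hypothetical reconstruction: the matrix axis, step commands, and action versions are assumptions, not taken from the repository.

```yaml
# pr-cpu.yaml — hypothetical sketch matching the run metadata above
name: pr-cpu
on: pull_request
jobs:
  pytest-cpu:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # The artifact suffixes (cpu-2.0.1, cpu-2.1.0, cpu-latest) suggest
        # a matrix over framework versions — an assumption.
        version: ["2.0.1", "2.1.0", "latest"]
    steps:
      - uses: actions/checkout@v4
      # Exact test invocation is an assumption.
      - run: pytest --cov .
      - uses: actions/upload-artifact@v3
        with:
          # Matches the observed naming: coverage-<sha>-cpu-<version>
          name: coverage-${{ github.sha }}-cpu-${{ matrix.version }}
          path: .coverage
  coverage:
    name: Coverage Results / coverage
    needs: pytest-cpu
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v3
      # Combining per-matrix coverage data is an assumption.
      - run: coverage combine && coverage report
```

The `needs: pytest-cpu` dependency would explain why the coverage job runs after the matrix completes and takes only 11s: it merely aggregates the uploaded artifacts.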

Artifacts

Produced during runtime
Name                                                                   Size    Status
coverage-2a06881686756370adde3c9943d31ca51b0fa58e-cpu-2.0.1            236 KB  Expired
coverage-2a06881686756370adde3c9943d31ca51b0fa58e-cpu-2.1.0            236 KB  Expired
coverage-2a06881686756370adde3c9943d31ca51b0fa58e-cpu-latest           236 KB  Expired