Allow flash attention 2 and upgrade to transformers 4.34.1 #2439

Triggered via pull request: October 14, 2023 04:25
Status: Success
Total duration: 19m 25s
Artifacts: 3

pr-cpu.yaml

on: pull_request
Matrix: pytest-cpu
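
Based on the matrix name and the per-version artifact names below, here is a minimal sketch of how a `pr-cpu.yaml` workflow like this is commonly wired up. Everything in it (job layout, the `torch` matrix axis, step names, pinned action versions) is an assumption inferred from this run page, not the repository's actual workflow file.

```yaml
# Hypothetical reconstruction of pr-cpu.yaml -- not the repository's actual file.
name: pr-cpu
on: pull_request

jobs:
  pytest-cpu:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # Assumed axis, inferred from the artifact suffixes below
        # (cpu-2.0.1, cpu-2.1.0, cpu-latest).
        torch: ["2.0.1", "2.1.0", "latest"]
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: "3.10"
      - name: Install test dependencies
        # Pinning torch to ${{ matrix.torch }} is omitted here for brevity;
        # "latest" would need special-casing in a real workflow.
        run: pip install coverage pytest
      - name: Run tests under coverage
        run: coverage run -m pytest
      - name: Upload coverage data
        uses: actions/upload-artifact@v3
        with:
          # Mirrors the observed naming pattern coverage-<commit-sha>-cpu-<version>.
          name: coverage-${{ github.sha }}-cpu-${{ matrix.torch }}
          path: .coverage
```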
Coverage Results / coverage (17s)
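
The separate Coverage Results / coverage job plausibly aggregates the per-matrix coverage data into one report. A hedged sketch of such a downstream job follows, continuing the `jobs:` block of the sketch above; the `needs: pytest-cpu` dependency and the combine step are guesses based on the artifact layout, not the repository's actual workflow.

```yaml
  # Hypothetical downstream job; the real "coverage" job may differ.
  coverage:
    needs: pytest-cpu
    runs-on: ubuntu-latest
    steps:
      # With no `name:` input, download-artifact fetches every artifact,
      # each into a directory named after the artifact.
      - uses: actions/download-artifact@v3
      - uses: actions/setup-python@v4
        with:
          python-version: "3.10"
      - name: Combine per-version coverage and report
        run: |
          pip install coverage
          # Merge the per-matrix-entry .coverage data files into one report.
          coverage combine coverage-*-cpu-*/.coverage
          coverage report
```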

Artifacts

Produced during runtime
Name                                                          Size    Status
coverage-e0be8dc34a1128aa007784b56f3847eb362a3903-cpu-2.0.1   236 KB  Expired
coverage-e0be8dc34a1128aa007784b56f3847eb362a3903-cpu-2.1.0   236 KB  Expired
coverage-e0be8dc34a1128aa007784b56f3847eb362a3903-cpu-latest  236 KB  Expired