
Allow flash attention 2 and upgrade to transformers 4.34.1 #1645

Triggered via pull request October 14, 2023 21:25
Status: Success
Total duration: 4m 1s
codeql-analysis.yml

on: pull_request
Matrix: Analyze

Annotations

1 warning
Analyze (python)
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
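This warning is usually cleared by bumping `actions/checkout` to a newer major version in `codeql-analysis.yml`. The fragment below is a minimal sketch, not the repository's actual workflow: the job name, runner, and step layout are assumptions, and only the `uses:` version bump is the fix the warning calls for.

```yaml
# Sketch of the relevant step in codeql-analysis.yml (job layout assumed).
# Bumping the checkout action off node12-era releases silences the
# deprecation warning reported by the Analyze (python) job.
jobs:
  analyze:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4   # was actions/checkout@v2
```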