Bumping flash attention version to 2.6.3 and adding option for softcap in attention and lm_head logits. #4452

Triggered via: pull request, September 22, 2024, 18:05
Status: Success
Total duration: 1m 48s
Billable time: 4m
Artifacts: none listed
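
The PR pairs the version bump with the softcap option because flash-attn 2.6 is the first release whose kernels support tanh softcapping of attention scores. Below is a minimal sketch of how both caps are typically wired up, assuming the Gemma-2-style `cap * tanh(x / cap)` form and the `softcap` argument that flash-attn >= 2.6 exposes on `flash_attn_func`; the option names and cap values in this PR's actual diff may differ.

```python
import torch
from flash_attn import flash_attn_func

def softcap(logits: torch.Tensor, cap: float) -> torch.Tensor:
    # Smoothly squashes values into (-cap, cap) instead of hard-clipping,
    # so gradients taper off rather than vanishing at the boundary.
    return cap * torch.tanh(logits / cap)

# Attention scores: flash-attn >= 2.6 applies the cap to the pre-softmax
# scores inside the fused kernel when softcap > 0 (0.0 leaves it disabled).
q = k = v = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")
attn_out = flash_attn_func(q, k, v, causal=True, softcap=50.0)

# lm_head logits: cap the final projection before loss or sampling.
# The 50.0 / 30.0 values mirror Gemma 2's defaults and are assumptions here,
# as are the hidden and vocab sizes.
hidden = torch.randn(1, 128, 512, dtype=torch.float16, device="cuda")
lm_head = torch.nn.Linear(512, 32000, bias=False,
                          dtype=torch.float16, device="cuda")
final_logits = softcap(lm_head(hidden), cap=30.0)
```

Because tanh saturates gradually, capped scores keep useful gradients where a hard clamp would zero them out, which is the usual motivation for softcapping both attention and output logits.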

smoketest.yaml

on: pull_request
Matrix: smoketest

Annotations

2 warnings
smoketest (3.10)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-python@v4. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
smoketest (3.9)
Same warning as smoketest (3.10).
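
Both warnings have the same remediation: bumping the pinned steps in smoketest.yaml to actions/checkout@v4 and actions/setup-python@v5 (or later), which already run on node20, so the runner no longer has to force-migrate them.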