Commit

Add new base images and remove fa1 images (#970)
dakinggg committed Feb 12, 2024
1 parent 122f965 commit 78cbe08
Showing 2 changed files with 8 additions and 16 deletions.
16 changes: 5 additions & 11 deletions .github/workflows/docker.yaml

@@ -12,22 +12,16 @@ on:
   workflow_dispatch: {}
 jobs:
   docker-build:
-    runs-on: ubuntu-latest
+    runs-on: mosaic-4wide
     if: github.repository_owner == 'mosaicml'
     strategy:
       matrix:
         include:
-          - name: "2.1.0_cu121"
-            base_image: mosaicml/pytorch:2.1.0_cu121-python3.10-ubuntu20.04
-            dep_groups: "[gpu]"
-          - name: "2.1.0_cu121_flash2"
-            base_image: mosaicml/pytorch:2.1.0_cu121-python3.10-ubuntu20.04
+          - name: "2.1.2_cu121_flash2"
+            base_image: mosaicml/pytorch:2.1.2_cu121-python3.10-ubuntu20.04
             dep_groups: "[gpu-flash2]"
-          - name: "2.1.0_cu121_aws"
-            base_image: mosaicml/pytorch:2.1.0_cu121-python3.10-ubuntu20.04-aws
-            dep_groups: "[gpu]"
-          - name: "2.1.0_cu121_flash2_aws"
-            base_image: mosaicml/pytorch:2.1.0_cu121-python3.10-ubuntu20.04-aws
+          - name: "2.1.2_cu121_flash2_aws"
+            base_image: mosaicml/pytorch:2.1.2_cu121-python3.10-ubuntu20.04-aws
             dep_groups: "[gpu-flash2]"
     steps:
       - name: Maximize Build Space on Worker
8 changes: 3 additions & 5 deletions README.md

@@ -113,11 +113,9 @@ You can select a specific commit hash such as `mosaicml/llm-foundry:1.13.1_cu117
 
 | Docker Image | Torch Version | Cuda Version | LLM Foundry dependencies installed? |
 | ------------------------------------------------------ | ------------- | ----------------- | ----------------------------------- |
-| `mosaicml/pytorch:2.1.0_cu121-python3.10-ubuntu20.04` | 2.1.0 | 12.1 (Infiniband) | No |
-| `mosaicml/llm-foundry:2.1.0_cu121-latest` | 2.1.0 | 12.1 (Infiniband) | Yes (flash attention v1. Warning: Support for flash attention v1 has been deprecated.) |
-| `mosaicml/llm-foundry:2.1.0_cu121_flash2-latest` | 2.1.0 | 12.1 (Infiniband) | Yes (flash attention v2. Note: We recommend using flash attention v2.) |
-| `mosaicml/llm-foundry:2.1.0_cu121_aws-latest` | 2.1.0 | 12.1 (EFA) | Yes (flash attention v1. Warning: Support for flash attention v1 has been deprecated.) |
-| `mosaicml/llm-foundry:2.1.0_cu121_flash2_aws-latest` | 2.1.0 | 12.1 (EFA) | Yes (flash attention v2. Note: We recommend using flash attention v2.) |
+| `mosaicml/pytorch:2.1.2_cu121-python3.10-ubuntu20.04` | 2.1.2 | 12.1 (Infiniband) | No |
+| `mosaicml/llm-foundry:2.1.2_cu121_flash2-latest` | 2.1.2 | 12.1 (Infiniband) | Yes |
+| `mosaicml/llm-foundry:2.1.2_cu121_flash2_aws-latest` | 2.1.2 | 12.1 (EFA) | Yes |
 
 
 # Installation
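The retained image tags in the README table above are published Docker tags and can be used like any other image. A minimal usage sketch, assuming Docker and the NVIDIA container toolkit are installed and the tag is still published:

```shell
# Pull the flash-attention-2 image that replaces the removed fa1 images
docker pull mosaicml/llm-foundry:2.1.2_cu121_flash2-latest

# Start an interactive shell with GPU access
docker run --gpus all -it --rm mosaicml/llm-foundry:2.1.2_cu121_flash2-latest bash
```

On AWS clusters with EFA networking, the `2.1.2_cu121_flash2_aws-latest` tag from the same table would be the corresponding choice.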
