Update config_moe_args.py (#1112)
#1111 needed to revert #1104 because the #1104 PR caused issues. Removing the TODO and marking the Jira ticket as won't-do.
vchiley committed Apr 12, 2024
1 parent b58d68c commit 6257e5b
Showing 1 changed file with 0 additions and 6 deletions.
llmfoundry/models/utils/config_moe_args.py (0 additions, 6 deletions)

@@ -18,12 +18,6 @@ def create_process_group_ranks(ranks: tuple[int]):
     Used in create_set_process_group and create_mod_process_group methods below.

     This function is an alternative to `distributed.new_group(ranks)`.
-    When working with FSDP in torch1.13.1, using `distributed.new_group(ranks)`
-    resulted in an error but this method worked.
-
-    TODO(GRT-2416): When composer no longer has support for torch1.13.1, we should
-    consider using `distributed.new_group(ranks)` here and in composer's FSDP
-    custom process group init.

     Args:
         ranks (tuple[int]): Tuple of ranks of group members.
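For context, the docstring touched by this diff describes `create_process_group_ranks` as an alternative to calling `torch.distributed.new_group(ranks)` directly. Below is a minimal sketch of that kind of alternative, assuming (this diff does not show the function body) that each rank all-gathers its subgroup membership and the subgroups are then created by enumeration; the name `create_process_group_ranks_sketch` and the exact steps are illustrative, not the actual llm-foundry implementation.

# Hedged sketch of an alternative to `torch.distributed.new_group(ranks)`,
# in the spirit of the `create_process_group_ranks` docstring above.
import torch.distributed as dist


def create_process_group_ranks_sketch(ranks: tuple[int, ...]):
    """Create a distributed subgroup containing `ranks` without `new_group`.

    Every rank calls this with the tuple of ranks for the subgroup it belongs
    to. The tuples are all-gathered so every rank sees the full partition,
    then the subgroups are created by enumeration.
    """
    world_size = dist.get_world_size()

    # Gather each rank's subgroup membership tuple onto every rank.
    ranks_gather_list = [None for _ in range(world_size)]
    dist.all_gather_object(ranks_gather_list, ranks)

    # Deduplicate to get the distinct subgroups, then create them all at once.
    ranks_per_subgroup = [list(r) for r in set(ranks_gather_list)]
    group, _ = dist.new_subgroups_by_enumeration(ranks_per_subgroup)
    return group

In this sketch, ranks 0 and 1 would each call the function with (0, 1) while ranks 2 and 3 call it with (2, 3); every rank then receives a handle to its own subgroup, which is the usage pattern the docstring's Args section implies.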
