Copyright (c) Meta Platforms, Inc. and affiliates. All rights reserved.
Optimizers is a GitHub repository of PyTorch optimization algorithms, designed for external collaboration and development.
It currently includes the following optimizers:
- Distributed Shampoo
See the CONTRIBUTING file for how to help out.
Optimizers is BSD licensed, as found in the LICENSE file.
This code requires `python>=3.10` and `torch>=2.2.0`.
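To confirm that an existing environment meets these requirements, a quick check (a minimal sketch; the version bounds are the ones stated above) is:

```python
# Minimal environment check; the bounds come from the requirements above.
import sys
import torch

assert sys.version_info >= (3, 10), "python>=3.10 is required"
major, minor = (int(x) for x in torch.__version__.split(".")[:2])
assert (major, minor) >= (2, 2), "torch>=2.2.0 is required"
```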
Install `distributed_shampoo` with all of its dependencies:

```bash
git clone [email protected]:facebookresearch/optimizers.git
cd optimizers
pip install .
```

If you also want to try the examples, replace the last line with `pip install ".[examples]"`.
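To verify the install, you can import the optimizer from Python (a minimal check; the module path is the one used in the usage example below):

```python
# Minimal post-install check; the module path matches the usage example below.
from distributed_shampoo.distributed_shampoo import DistributedShampoo

print(DistributedShampoo.__name__)  # prints: DistributedShampoo
```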
After installation, basic usage looks like:

```python
import torch
from distributed_shampoo.distributed_shampoo import DistributedShampoo
from distributed_shampoo.shampoo_types import AdamGraftingConfig

model = ...  # Instantiate model here.

optim = DistributedShampoo(
    model.parameters(),
    lr=1e-3,
    betas=(0.9, 0.999),
    epsilon=1e-8,
    grafting_config=AdamGraftingConfig(
        beta2=0.999,
        epsilon=1e-8,
    ),
)
```
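Once constructed, the optimizer can drive an ordinary training step. The loop below is a minimal sketch, assuming `DistributedShampoo` exposes the standard `torch.optim.Optimizer` interface; the model and data are placeholders, not part of the library:

```python
# A minimal training-step sketch, assuming the standard torch.optim.Optimizer
# interface (zero_grad / backward / step); model and data are placeholders.
import torch

from distributed_shampoo.distributed_shampoo import DistributedShampoo
from distributed_shampoo.shampoo_types import AdamGraftingConfig

model = torch.nn.Linear(10, 1)
optim = DistributedShampoo(
    model.parameters(),
    lr=1e-3,
    betas=(0.9, 0.999),
    epsilon=1e-8,
    grafting_config=AdamGraftingConfig(beta2=0.999, epsilon=1e-8),
)

for _ in range(3):  # a few dummy steps on random data
    inputs = torch.randn(8, 10)
    targets = torch.randn(8, 1)
    loss = torch.nn.functional.mse_loss(model(inputs), targets)

    optim.zero_grad()
    loss.backward()
    optim.step()
```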
For more, please see the additional documentation, especially its How to Use section.