Optimizers

Copyright (c) Meta Platforms, Inc. and affiliates. All rights reserved.

Description

Optimizers is a GitHub repository of PyTorch optimization algorithms, designed for external collaboration and development.

It currently includes the following optimizers:

  • Distributed Shampoo

See the CONTRIBUTING file for how to help out.

License

Optimizers is BSD licensed, as found in the LICENSE file.

Installation and Dependencies

This code requires python>=3.10 and torch>=2.2.0. Install distributed_shampoo with all dependencies:

git clone [email protected]:facebookresearch/optimizers.git
cd optimizers
pip install .

If you also want to try the examples, replace the last line with pip install ".[examples]".
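
A quick way to confirm the install succeeded is to check that the package imports cleanly (this uses the same import path as the usage example below):

python -c "from distributed_shampoo.distributed_shampoo import DistributedShampoo"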

Usage

After installation, basic usage looks like:

import torch
from distributed_shampoo.distributed_shampoo import DistributedShampoo
from distributed_shampoo.shampoo_types import AdamGraftingConfig

model = ...  # Instantiate model

optim = DistributedShampoo(
    model.parameters(),
    lr=1e-3,                  # learning rate
    betas=(0.9, 0.999),       # moving-average coefficients, analogous to Adam's
    epsilon=1e-8,             # term added for numerical stability
    grafting_config=AdamGraftingConfig(  # graft Adam's step-size behavior onto Shampoo's search direction
        beta2=0.999,
        epsilon=1e-8,
    ),
)
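
As with any torch.optim.Optimizer, DistributedShampoo is then driven by the standard zero_grad/backward/step cycle. Below is a minimal single-process sketch; the toy model, data, and loss are illustrative assumptions, not part of the library:

import torch
import torch.nn.functional as F

from distributed_shampoo.distributed_shampoo import DistributedShampoo
from distributed_shampoo.shampoo_types import AdamGraftingConfig

model = torch.nn.Linear(10, 1)  # toy model (illustrative)
optim = DistributedShampoo(
    model.parameters(),
    lr=1e-3,
    betas=(0.9, 0.999),
    epsilon=1e-8,
    grafting_config=AdamGraftingConfig(beta2=0.999, epsilon=1e-8),
)

for _ in range(100):
    inputs = torch.randn(32, 10)   # toy batch (illustrative)
    targets = torch.randn(32, 1)
    loss = F.mse_loss(model(inputs), targets)

    optim.zero_grad()  # clear gradients from the previous step
    loss.backward()    # compute new gradients
    optim.step()       # apply the Shampoo update with Adam grafting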

For more, please see the additional Distributed Shampoo documentation, especially its How to Use section.