# ring-attention

Ring Attention leverages blockwise computation of self-attention across multiple GPUs, enabling training and inference on sequences that would be too long for a single device.
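The blockwise idea can be illustrated on a single machine: each simulated "device" holds one query block, and key/value blocks rotate around the ring while a running (log-sum-exp-style) softmax is accumulated, so no device ever needs the full sequence. This is a minimal numpy sketch for illustration only, not the implementation used in this repository; the function name and block layout are our own assumptions.

```python
import numpy as np

def ring_attention(q_blocks, k_blocks, v_blocks):
    """Single-machine simulation of ring attention (illustrative sketch).

    Each entry of q_blocks plays the role of one device's query shard;
    K/V blocks "rotate" around the ring one step at a time while each
    device updates a numerically stable blockwise softmax.
    """
    n_dev = len(q_blocks)
    outputs = []
    for i in range(n_dev):
        q = q_blocks[i]                        # this device's query block
        m = np.full(q.shape[0], -np.inf)       # running row-wise max
        l = np.zeros(q.shape[0])               # running softmax denominator
        acc = np.zeros_like(q)                 # unnormalized output accumulator
        for step in range(n_dev):
            j = (i + step) % n_dev             # K/V block arriving this step
            s = q @ k_blocks[j].T / np.sqrt(q.shape[-1])
            m_new = np.maximum(m, s.max(axis=-1))
            p = np.exp(s - m_new[:, None])     # block-local softmax numerator
            scale = np.exp(m - m_new)          # rescale previous partial sums
            l = l * scale + p.sum(axis=-1)
            acc = acc * scale[:, None] + p @ v_blocks[j]
            m = m_new
        outputs.append(acc / l[:, None])       # finalize softmax per row
    return np.concatenate(outputs)
```

Because the accumulators are rescaled with the running maximum at each step, the result matches ordinary full-sequence attention exactly, while each step only touches one K/V block.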

This repository contains notebooks, experiments, and a collection of links to papers and other material related to Ring Attention.

## Research / Material

## Notebooks

## Development References

## How to contribute

Contact us on the GPU MODE Discord server: https://discord.gg/gpumode. PRs are welcome (please create an issue first).