Initial PEFT support #612

Merged
merged 13 commits into main from peft
Jun 7, 2024

Conversation

@michaelbenayoun (Member) commented on May 28, 2024

What does this PR do?

Adds support for PEFT, tested for LoRA. For now, support is only for DDP. Support for TP will come in another PR.
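
As a quick orientation, here is a hedged sketch of how a LoRA adapter is typically attached with the peft library before training. The checkpoint name and LoRA hyperparameters are placeholders, and the snippet shows plain peft usage rather than this PR's own entry points; the Neuron-specific part of the PR is making such a model train under DDP on Neuron devices.

# Illustrative only: standard `peft` LoRA setup; the Neuron-specific wiring is
# what this PR adds on top of it.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")  # example checkpoint
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],  # attention projections, a common LoRA target
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
# From here, training proceeds under DDP; tensor-parallel support is left to a follow-up PR.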

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@michaelbenayoun changed the title from Peft to Initial PEFT support on May 29, 2024
@michaelbenayoun marked this pull request as ready for review on May 29, 2024 15:14
# Reference PEFT model saved with the vanilla `peft` implementation.
orig_peft_model.save_pretrained(orig_model_path.as_posix())

# PEFT model saved using `NeuronPeftModel`.
seed_patcher = create_static_seed_patcher(LlamaForCausalLM, 42)
Collaborator

Do you really need to recreate this?

Member Author

Yes, otherwise it fails (I tried that). We could make the patcher reusable, since recreating it is a bit annoying, but I did not want to spend too much time on that.
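
To make the exchange above concrete, here is a hedged sketch of the pattern the test presumably follows. Using the patcher as a context manager and the surrounding model-construction calls are assumptions for illustration, not the PR's exact code.

# Assumption: the patcher is applied around model creation so that both models
# receive identical random initialization.
seed_patcher = create_static_seed_patcher(LlamaForCausalLM, 42)
with seed_patcher:
    orig_peft_model = get_peft_model(orig_model, peft_config)  # placeholder names

# A fresh patcher is created for the second model; per the discussion above,
# reusing the first instance fails.
seed_patcher = create_static_seed_patcher(LlamaForCausalLM, 42)
with seed_patcher:
    neuron_peft_model = get_peft_model(neuron_model, peft_config)  # placeholder names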


class NeuronPeftModel(PeftModel):
@requires_neuronx_distributed
def save_pretrained(
Collaborator

So this is the only method that needs to be implemented differently for Neuron models, right?
Is it because everything else is already handled by the generic code in accelerator.prepare_model?

Member Author

Yes. For now it is only about saving things in the most efficient way; accelerator.prepare_model is indeed more general and already covers preparation. We might need to change other methods in the future, but for now this is all that is needed.
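
For readers skimming the thread, here is a minimal sketch of the structure being discussed: only save_pretrained is specialized, and the rest is inherited from PeftModel. The method body below (rank-0 write plus a rendezvous) is illustrative, not the actual implementation from this PR.

import torch_xla.core.xla_model as xm
from peft import PeftModel


class NeuronPeftModel(PeftModel):
    # Sketch: only checkpoint saving is specialized; preparation is already
    # handled generically by accelerator.prepare_model.
    def save_pretrained(self, save_directory, **kwargs):
        # Materialize any pending XLA computation before touching the weights.
        xm.mark_step()
        # Only the master worker writes the adapter checkpoint to disk.
        if xm.is_master_ordinal():
            super().save_pretrained(save_directory, **kwargs)
        # Keep all workers in sync until the checkpoint has been written.
        xm.rendezvous("neuron_peft_save_pretrained")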

@dacorvo (Collaborator) left a comment

Thank you for the pull-request. It looks good to me; however, unless I am mistaken, there is no test verifying that training works, only tests verifying that the model can be peft-ed, prepared, and saved correctly (but maybe I missed something).

@michaelbenayoun (Member Author) commented on Jun 4, 2024

I can add that now!

@JingyaHuang (Collaborator) left a comment

LGTM! As David mentioned, a training test would be necessary, and if possible maybe move all PEFT tests under a tests/peft folder? Then I could put the SD PEFT-related tests under it as well.

@michaelbenayoun (Member Author)

I moved the tests to tests/peft.

I wrote a "basic" test that runs training. It does not check the loss or anything, because it is not trivial to overfit a randomly initialized model by fine-tuning only the LoRA adapters. For now, this test guarantees that the code at least runs.
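
For context, such a smoke test can be as small as the hedged sketch below. The tiny model configuration, the step count, and the absence of any Neuron-specific launch utilities are simplifications for illustration; the PR's actual test lives under tests/peft.

import torch
from peft import LoraConfig, get_peft_model
from transformers import LlamaConfig, LlamaForCausalLM


def test_lora_training_runs():
    # Tiny random model: the goal is only to verify that training steps execute,
    # not that the LoRA adapters learn anything.
    config = LlamaConfig(
        vocab_size=256,
        hidden_size=64,
        intermediate_size=128,
        num_hidden_layers=2,
        num_attention_heads=2,
    )
    model = get_peft_model(LlamaForCausalLM(config), LoraConfig(r=8, task_type="CAUSAL_LM"))
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)

    for _ in range(3):
        input_ids = torch.randint(0, config.vocab_size, (2, 16))
        loss = model(input_ids=input_ids, labels=input_ids).loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()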

@michaelbenayoun (Member Author)

Merging; the TGI issue is going to be handled in #620.

@michaelbenayoun merged commit da9d261 into main on Jun 7, 2024
10 of 11 checks passed
@michaelbenayoun deleted the peft branch on June 7, 2024 at 13:17