Change bf16 to amp_bf16 #322

Open
Srini-98 opened this issue May 10, 2023 · 3 comments · May be fixed by #443

Comments


Srini-98 commented May 10, 2023

Hi,

I tried replicating the BERT pretraining script, and when I ran it with the provided YAML I got the following error: `Value bf16 is not available in Precision`. I traced the code and changed `bf16` to `amp_bf16` to get it to work. Not sure if I am using outdated code, but flagging it here.

https://github.com/mosaicml/examples/blob/main/examples/bert/yamls/main/hf-bert-base-uncased.yaml - line 82
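For context, the fix is a one-line change to the precision setting in that YAML (a minimal sketch; surrounding keys omitted, and the key name `precision` is assumed from the linked file):

```yaml
# Composer only accepts the amp_-prefixed precision names here,
# so plain `bf16` raises "Value bf16 is not available in Precision".
# precision: bf16      # old value, fails
precision: amp_bf16    # bfloat16 automatic mixed precision
```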

jacobfulano (Contributor) commented May 14, 2023

Thanks for catching this! You are correct in using the amp_bf16 flag. We are updating the YAMLs accordingly.

dakinggg (Collaborator) commented:

@jacobfulano was this fixed?

Taytay added a commit to Taytay/examples that referenced this issue Jan 8, 2024
Taytay linked a pull request Jan 8, 2024 that will close this issue

Taytay commented Jan 8, 2024

@jacobfulano: I submitted a PR. I just got bitten by this myself.
