
The SNR DMD loss in PixArt-alpha 512x512 #138

Open
icelighting opened this issue Jul 24, 2024 · 1 comment

Comments

@icelighting

Thanks for your great work. When I use the DMD distillation code, I find that the SNR loss does not use an MSE loss but instead coeff * latents, which is not the grad and may be negative. Is this related to the way the model learns when snr_gamma is used?

@Feynman1999

> Thanks for your great work. When I use the DMD distillation code, I find that the SNR loss does not use an MSE loss but instead coeff * latents, which is not the grad and may be negative. Is this related to the way the model learns when snr_gamma is used?

Shouldn't the default args.snr_gamma be None? I am also puzzled about the difference between these two losses; which one should be used?
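For context, here is a minimal numpy sketch of the two ideas being discussed. The function names are illustrative and not taken from the repository. It assumes (a) snr_gamma triggers min-SNR-style loss weighting (clip the SNR at gamma, then divide by the raw SNR), and (b) the DMD-style term is a surrogate of the form sum(latents * coeff), whose value is meaningless (and can be negative) because only its gradient with respect to the latents matters:

```python
import numpy as np

def min_snr_weight(snr, gamma=5.0):
    # Min-SNR-gamma weighting: clip the per-timestep SNR at gamma, then
    # normalize by the raw SNR so low-noise (high-SNR) timesteps are
    # down-weighted. For snr <= gamma the weight is 1.
    return np.minimum(snr, gamma) / snr

def dmd_surrogate_loss(latents, coeff):
    # DMD-style surrogate: a scalar whose gradient w.r.t. `latents` is
    # exactly `coeff`. Its *value* carries no meaning for monitoring and
    # can be negative; in a framework like PyTorch, `coeff` would be
    # detached so that only the generator receives this gradient.
    return float(np.sum(latents * coeff))

# Low-SNR timestep: weight stays 1; high-SNR timestep: weight = gamma / snr.
w = min_snr_weight(np.array([1.0, 10.0]), gamma=5.0)

# The surrogate is negative whenever latents and coeff are anti-correlated.
loss = dmd_surrogate_loss(np.ones((2, 2)), -np.ones((2, 2)))
```

Under this reading, a negative logged value is expected for the surrogate form and is not by itself a sign of a bug; it is the MSE-weighted path (active when snr_gamma is set) that produces a non-negative, interpretable loss value.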
