
WGAN-gp loss keeps going large #54

Open
haonanhe opened this issue May 18, 2020 · 4 comments
Comments
haonanhe commented May 18, 2020

Hello, I've run your code on my own dataset. However, the d_loss decreases from 10 (which equals lambda) to a very large negative number (around -10000), the Wasserstein distance grows to the order of millions, and the gradient penalty goes from 10 down to 0 and then blows up to the order of thousands. I've worked on this problem for several days but I still can't solve it. Can anyone help me with this?
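For reference, this is a minimal sketch of the standard WGAN-GP gradient penalty term (Gulrajani et al., 2017) that this repo implements; the function name and the 2-D input shape are illustrative assumptions, not taken from the repo. The penalty pushes the critic's gradient norm on random interpolates toward 1; if the penalty collapses to 0 and later explodes as described above, the critic is typically escaping the 1-Lipschitz region faster than the penalty can pull it back (common causes: a bug in the interpolation/`autograd.grad` wiring, a learning rate that is too high, or too few critic steps per generator step).

```python
# Sketch of the WGAN-GP gradient penalty, assuming a critic D: R^n -> R
# that takes a (batch, features) tensor and returns a (batch, 1) score.
import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # Sample interpolation points uniformly between real and fake batches.
    alpha = torch.rand(real.size(0), 1).expand_as(real)  # for images, use (N, 1, 1, 1)
    interp = (alpha * real + (1 - alpha) * fake).requires_grad_(True)

    d_interp = critic(interp)
    # Gradient of the critic output w.r.t. the interpolates;
    # create_graph=True so the penalty itself is differentiable.
    grads = torch.autograd.grad(
        outputs=d_interp, inputs=interp,
        grad_outputs=torch.ones_like(d_interp),
        create_graph=True, retain_graph=True)[0]

    grad_norm = grads.view(grads.size(0), -1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1) ** 2).mean()
```

With a well-behaved critic this term should hover near a small positive value during training, not pin at 0 or diverge.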
@caogang


yallien commented Sep 28, 2020

Hello, I've run into the same problem. Did you find out where the problem is?

@yifanjiang19

@yallien @supermarian Are you using PyTorch version >= 1.4?

@mengxiangxiang414

@haonanhe @yallien Hello, did you solve this problem?

@hangexuexi

What about the Wasserstein distance at the start of training — is it much smaller than lambda?
