Dropout application #188

Open
ArlindKadra opened this issue Apr 27, 2021 · 2 comments

Comments

@ArlindKadra

I have a question regarding dropout and how it is applied. In the previous, non-refactored version of AutoPyTorch there was a dropout_shape hyperparameter, which does not seem to exist anymore. Is dropout now applied with the same rate at every layer, and if not, does it follow the network shape?

@ravinkohli
Contributor

Hey, as far as I can tell, dropout_shape was not a hyperparameter but a parameter that depended on the network shape in the master branch. It has exactly the same application in 'refactor_development': you can find it here, and for reference it is implemented using this.
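
For illustration, here is a minimal sketch of how per-layer dropout rates could be derived from a network shape and attached to an MLP. The `shaped_dropout_rates` helper, the "funnel" shape handling, and all sizes below are assumptions for the example, not AutoPyTorch's actual implementation.

```python
import torch.nn as nn

def shaped_dropout_rates(shape: str, max_dropout: float, num_layers: int) -> list:
    """Hypothetical helper: per-layer dropout rates that follow a network shape."""
    if shape == "funnel":
        # rates shrink linearly from max_dropout towards 0 across the layers
        return [max_dropout * (num_layers - i) / num_layers for i in range(num_layers)]
    # fallback: the same rate at every layer
    return [max_dropout] * num_layers

# Build a small MLP where each hidden layer gets its own dropout rate.
layer_widths = [128, 64, 32]   # assumed hidden layer sizes
rates = shaped_dropout_rates("funnel", max_dropout=0.5, num_layers=len(layer_widths))

layers, in_features = [], 20   # assumed input dimensionality
for width, p in zip(layer_widths, rates):
    layers += [nn.Linear(in_features, width), nn.ReLU(), nn.Dropout(p=p)]
    in_features = width
layers.append(nn.Linear(in_features, 2))  # assumed 2-class output head

model = nn.Sequential(*layers)
print(model)
```

With a constant shape the loop above reduces to the same `nn.Dropout` rate at every layer, which is the distinction the question is asking about.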

@ravinkohli
Contributor

This will also be fixed in #358
