
Potential enhancements for DropBlock #188

Open
1 of 2 tasks
theabhirath opened this issue Jul 26, 2022 · 0 comments
Labels
enhancement New feature or request layers Related to the Layers module - generic layers for reuse by models

theabhirath commented Jul 26, 2022

DropBlock is a regularisation technique proposed as a replacement for dropout. The original paper finds it works best with a drop rate scaled linearly across the blocks of a model, as implemented in #174. However, timm adds several experimental features on top of this, including:

  • a scaling value applied to the calculated gamma (already included in Overhaul of ResNet API #174).
  • a configuration of DropBlock rates chosen by experimentation to give better results on ResNets. These, however, extend only to four-stage ResNets, i.e. those similar to the models in the original ResNet paper.
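To make the two ideas above concrete, here is a minimal illustrative sketch (not Metalhead's or timm's actual implementation; both function names and the `gamma_scale` parameter are hypothetical stand-ins for the behaviour described): the DropBlock paper's gamma formula with an extra multiplicative scale, plus a linear drop-rate schedule across a model's blocks.

```python
def dropblock_gamma(drop_prob: float, block_size: int, feat_size: int,
                    gamma_scale: float = 1.0) -> float:
    """Seeding probability for block centres, per the DropBlock paper:
    gamma = drop_prob / block_size^2 * feat_size^2 / (feat_size - block_size + 1)^2,
    multiplied here by a tunable gamma_scale (the experimental knob timm adds)."""
    valid = feat_size - block_size + 1  # positions where a full block fits
    gamma = (drop_prob / block_size**2) * (feat_size**2 / valid**2)
    return gamma_scale * gamma

def linear_drop_rates(max_rate: float, num_blocks: int) -> list[float]:
    """Linearly scale the drop rate from shallow to deep blocks,
    reaching max_rate at the final block."""
    return [max_rate * (i + 1) / num_blocks for i in range(num_blocks)]
```

For example, with `block_size=1` the formula reduces to plain dropout's rate, and increasing `block_size` at a fixed feature-map size lowers gamma, since each seeded centre now zeroes out a whole block.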
@theabhirath theabhirath added enhancement New feature or request layers Related to the Layers module - generic layers for reuse by models labels Jul 26, 2022