The FLOP count of swin_t is 4.5G, but I get 3.13G here. Am I using it the wrong way?
```python
import torchvision.models as models
from ptflops import get_model_complexity_info

net = models.swin_t(num_classes=1000)
macs, params = get_model_complexity_info(net, (3, 224, 224), as_strings=True,
                                         print_per_layer_stat=False)
print('{:<30} {:<8}'.format('Computational complexity: ', macs))
print('{:<30} {:<8}'.format('Number of parameters: ', params))
```
```
Computational complexity:      3.13 GMac
Number of parameters:          28.29 M
```
Transformer support is incomplete in the torch backend; to fix that, you can switch to the aten backend:
```python
import torchvision.models as models
from ptflops import get_model_complexity_info

net = models.swin_t(num_classes=1000)
macs, params = get_model_complexity_info(net, (3, 224, 224), as_strings=True,
                                         print_per_layer_stat=False,
                                         backend='aten')
print('{:<30} {:<8}'.format('Computational complexity: ', macs))
print('{:<30} {:<8}'.format('Number of parameters: ', params))
```
```
Computational complexity:      4.5 GMac
Number of parameters:          28.29 M
```
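A plausible explanation for the gap (an assumption on my part, not verified against ptflops internals): torchvision's Swin blocks invoke the attention qkv and output projections through functional calls rather than module `forward`s, so a module-hook-based counter never sees them. A rough back-of-envelope of just those projection MACs for swin_t, using the standard Swin-T configuration:

```python
# Back-of-envelope: MACs of the attention qkv + output projections
# that a module-hook-based counter could miss in swin_t (assumption).
# Swin-T: 224x224 input, stem downsamples by 4, then by 2 per stage.
dims = [96, 192, 384, 768]      # channel dim per stage
depths = [2, 2, 6, 2]           # transformer blocks per stage
resolutions = [56, 28, 14, 7]   # feature-map side length per stage

missed = 0
for C, d, r in zip(dims, depths, resolutions):
    tokens = r * r
    # qkv projection: C -> 3C (3*C^2 MACs/token),
    # output projection: C -> C (C^2 MACs/token) => 4*C^2 per token
    missed += d * tokens * 4 * C * C

print(f'Projection MACs potentially missed: {missed / 1e9:.2f} GMac')
```

This lands near 1.39 GMac, which is in the same ballpark as the 4.5 − 3.13 ≈ 1.4 GMac discrepancy reported above, so the aten backend (which traces actual tensor ops rather than module calls) plausibly recovers exactly these operations.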
Thanks a lot, it works now.