I'm getting 4.12B FLOPs using your code, whereas almost all research papers report 4.09B FLOPs for this configuration (the default PyTorch pretrained model with 76.15% test accuracy).
Could you please modify the code, or explain the reason for the extra 0.03B FLOPs?
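For reference, a minimal sketch of how such a measurement is typically taken with ptflops; the issue does not name the model, so torchvision's pretrained resnet50 is assumed here:

```python
import torchvision.models as models
from ptflops import get_model_complexity_info

# Assumption: the model under discussion is torchvision's pretrained resnet50.
model = models.resnet50(pretrained=True)

macs, params = get_model_complexity_info(
    model, (3, 224, 224), as_strings=True, print_per_layer_stat=False
)
print(macs, params)  # ptflops reports roughly 4.12 GMac for this model
```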
fvcore will report 4.09G, but it will also print:

    Skipped operation aten::batch_norm 53 time(s)
    Skipped operation aten::max_pool2d 1 time(s)
    Skipped operation aten::add_ 16 time(s)
    Skipped operation aten::adaptive_avg_pool2d 1 time(s)

Perhaps those papers ignore the computation of some of these operators.
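For comparison, a minimal sketch of the fvcore measurement described above, again assuming torchvision's pretrained resnet50:

```python
import torch
import torchvision.models as models
from fvcore.nn import FlopCountAnalysis

model = models.resnet50(pretrained=True).eval()
inputs = torch.randn(1, 3, 224, 224)

# fvcore counts one multiply-add as one FLOP and logs warnings for the
# operators it skips (batch_norm, pooling, in-place add, ...).
flops = FlopCountAnalysis(model, inputs)
print(flops.total() / 1e9)  # roughly 4.09
```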
@jkhu29 you're right! ptflops also counts batch norm and pooling layers as non-zero ops, which is why it outputs slightly higher numbers than expected.
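A rough way to verify this is to sum the per-element work of the batch norm layers that fvcore skips. The sketch below assumes an inference-time batch norm costs about two FLOPs per output element (one multiply and one add after folding); the pooling and residual-add ops account for most of the rest of the 0.03B gap:

```python
import torch
import torch.nn as nn
import torchvision.models as models

model = models.resnet50(pretrained=True).eval()
bn_flops = []

def count_bn(module, inputs, output):
    # Assumption: folded inference-time batch norm is ~2 FLOPs per element.
    bn_flops.append(2 * output.numel())

handles = [m.register_forward_hook(count_bn)
           for m in model.modules() if isinstance(m, nn.BatchNorm2d)]

with torch.no_grad():
    model(torch.randn(1, 3, 224, 224))

for h in handles:
    h.remove()

print(sum(bn_flops) / 1e9)  # batch norm's approximate share of the difference
```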