flops are counted multiple times if a module is shared by other modules #106

Open
CocytusDuo opened this issue Mar 30, 2023 · 4 comments
Labels: bug (Something isn't working), wontfix (This will not be worked on)

Comments

@CocytusDuo

If a module is passed to a sub-module, for example:

import torch.nn as nn
import ptflops

class Block(nn.Module):
    def __init__(self, linear_layer) -> None:
        super().__init__()
        # Reuses the nn.Linear instance created by the parent model.
        self.linear_layer = linear_layer

    def forward(self, x):
        return self.linear_layer(x)

class Test_model(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.linear_layer = nn.Linear(1000, 1000)
        # The same layer instance is shared with the sub-module.
        self.block = Block(self.linear_layer)

    def forward(self, x):
        out = self.linear_layer(x)
        out = self.block(out)
        return out

net = Test_model()
print(ptflops.get_model_complexity_info(net, (20, 1000)))

then the flops of the nn.Linear(1000, 1000) module are counted twice, once under Test_model and once under Block:

Warning: variables __flops__ or __params__ are already defined for the moduleLinear ptflops can affect your code!
Test_model(
  2.0 M, 200.000% Params, 80.0 MMac, 100.000% MACs, 
  (linear_layer): Linear(1.0 M, 100.000% Params, 40.0 MMac, 50.000% MACs, in_features=1000, out_features=1000, bias=True)
  (block): Block(
    1.0 M, 100.000% Params, 40.0 MMac, 50.000% MACs, 
    (linear_layer): Linear(1.0 M, 100.000% Params, 40.0 MMac, 50.000% MACs, in_features=1000, out_features=1000, bias=True)
  )
)
('80.0 MMac', '1.0 M')
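
As a quick sanity check (plain PyTorch, not ptflops), one can confirm that the model really holds only 1.0 M unique parameters, even though the per-module breakdown above sums to 2.0 M:

# nn.Module.parameters() deduplicates shared tensors, so each weight is
# counted exactly once here.
unique_params = sum(p.numel() for p in net.parameters())
print(unique_params)  # 1001000, i.e. the "1.0 M" returned as the total above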
CocytusDuo changed the title from "flops are counted multiple times if a module is passed to a sub-module" to "flops are counted multiple times if a module is shared by other modules" on Mar 30, 2023
sovrasov added the bug (Something isn't working) label on Apr 17, 2023
@sovrasov (Owner) commented Apr 17, 2023

Hi! Yeah, that's a drawback of the tracing process. Since it doesn't affect the output value, I'd consider it a minor one. I'll have a look at whether a simple fix is possible.

@sovrasov (Owner) commented Apr 17, 2023

I added a more meaningful warning for that case in #109.
Manipulating module totals will also lead to contradictory results.

sovrasov added the wontfix (This will not be worked on) label on Apr 17, 2023
@erwangccc commented

“Since it doesn't affect the output value, I'd consider it a minor one”

Hi @sovrasov,
Why do you say it doesn't affect the output value? If the linear_layer is only used once, its MACs are still counted twice, which I think is wrong. Please correct me if I've misunderstood.

Thanks!

@sovrasov (Owner) commented Jun 20, 2023

By the output value I mean the return value of get_model_complexity_info, and it is not affected. For the extended statistics shown when printing the model, I use a different mechanism, which cannot handle shared submodules and therefore counts their parameters several times.
At the same time, inference of those shared submodules is performed several times, so they contribute to the final MACs several times as well.
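
To illustrate the MACs point with the example above, here is a small sketch (a plain forward hook, not ptflops internals) showing that the shared layer really executes twice per forward pass:

import torch

# `net` is the Test_model instance from the example at the top of this issue.
call_count = 0

def count_calls(module, inputs, output):
    global call_count
    call_count += 1

handle = net.linear_layer.register_forward_hook(count_calls)
net(torch.randn(20, 1000))
handle.remove()
print(call_count)  # 2: once directly in Test_model.forward and once via Block

Since net.linear_layer and net.block.linear_layer are the same object, the hook is registered only once but fires for both call sites.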
