Unexpected behavior with object.__setattr__ in canonizers.MergeBatchNorm #208

Open
karray opened this issue Jul 19, 2024 · 1 comment · May be fixed by #209

karray commented Jul 19, 2024

Description

Commit c30e3cc breaks saliency map generation. Setting the bias parameter in canonizers.MergeBatchNorm via object.__setattr__ behaves differently from direct assignment or setattr(), because object.__setattr__ bypasses the custom nn.Module.__setattr__. Since the module's bias is None (i.e. not yet registered as a parameter), the merged bias never ends up in _parameters, which leads to inconsistencies in parameter registration.

(Image comparison: Input | Result | Expected result)

Reproducible Example

import torch
import torch.nn as nn

class CustomModule(nn.Module):
    def __init__(self):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(1))

    def __setattr__(self, name, value):
        super().__setattr__(name, value)

module = CustomModule()

# Method 1: Direct assignment
module.bias1 = nn.Parameter(torch.zeros(1))

# Method 2: Using setattr
setattr(module, 'bias2', nn.Parameter(torch.zeros(1)))

# Method 3: Using object.__setattr__; bias3 won't be registered as a parameter
object.__setattr__(module, 'bias3', nn.Parameter(torch.zeros(1)))

print("Registered parameters:", module._parameters.keys()) # => ['weight', 'bias1', 'bias2']

karray pushed a commit to karray/zennit that referenced this issue Jul 19, 2024
karray linked a pull request (#209) Jul 19, 2024 that will close this issue

chr5tphr (Owner) commented Aug 1, 2024

Hey @karray,

thanks a lot for spotting this, and for the PR. I guess the issue is that we modify the bias, but it remains unused by the module unless it is explicitly registered as a Parameter. As we are using a Canonizer, I think just adding the Parameter is the right call. I will leave you a small comment on the PR.
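
For reference, explicitly registering the parameter could look roughly like the sketch below. This only illustrates the idea discussed above, not the code from #209; set_merged_bias and its arguments are hypothetical names:

import torch
import torch.nn as nn

def set_merged_bias(module: nn.Module, merged_bias: torch.Tensor) -> None:
    # Hypothetical helper; ``module`` is assumed to be a conv/linear layer that
    # already exposes a ``bias`` attribute (possibly None).
    if module.bias is None:
        # register_parameter goes through nn.Module's bookkeeping, so the new
        # bias shows up in parameters(), state_dict(), and device moves.
        module.register_parameter('bias', nn.Parameter(merged_bias))
    else:
        # An existing bias can simply be overwritten in place.
        module.bias.data = merged_bias

A real canonizer would presumably also have to remember that the bias was newly added, so that it can be removed again when the canonizer is reverted.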
