
I want to create a class that builds a simple network with X fully connected layers, where X is an input given by the user. I tried doing this with setattr/getattr, but for some reason it is not working.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MLP(nn.Module):
    def __init__(self, in_size, out_size, n_layers, hidden_size):
        super(MLP, self).__init__()
        self.n_layers = n_layers

        for i in range(n_layers):
            if i == 0:
                layer_in_size = in_size
            else:
                layer_in_size = hidden_size

            if i == (n_layers - 1):
                layer_out_size = out_size
            else:
                layer_out_size = hidden_size

            setattr(self, 'dense_{}'.format(i), nn.Linear(layer_in_size, layer_out_size))

    # forward must be a method of the class, not nested inside __init__
    def forward(self, x):
        out = x
        for i in range(self.n_layers):
            # getattr(self, name, out) treats out as a *default* value and
            # never calls the layer; fetch the layer first, then call it
            layer = getattr(self, 'dense_{}'.format(i))
            if i == (self.n_layers - 1):
                out = layer(out)
            else:
                out = F.relu(layer(out))

        return out
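To sanity-check the behaviour, here is a condensed, runnable copy of the class with the two fixes applied (forward defined at class level, and each layer fetched with getattr and then called). The layer sizes and batch size are arbitrary, chosen only for illustration:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MLP(nn.Module):
    # Condensed copy of the class above, with the fixes applied.
    def __init__(self, in_size, out_size, n_layers, hidden_size):
        super(MLP, self).__init__()
        self.n_layers = n_layers
        for i in range(n_layers):
            layer_in = in_size if i == 0 else hidden_size
            layer_out = out_size if i == n_layers - 1 else hidden_size
            # setattr with an nn.Module value registers the layer as a submodule
            setattr(self, 'dense_{}'.format(i), nn.Linear(layer_in, layer_out))

    def forward(self, x):
        out = x
        for i in range(self.n_layers):
            layer = getattr(self, 'dense_{}'.format(i))  # fetch, then call
            out = layer(out) if i == self.n_layers - 1 else F.relu(layer(out))
        return out

net = MLP(in_size=10, out_size=2, n_layers=3, hidden_size=16)
y = net(torch.randn(4, 10))
print(y.shape)  # a batch of 4 inputs of size 10 maps to 4 outputs of size 2
```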

This is the error I'm getting when trying a forward pass with the net:

[screenshot of the error traceback]

Some insight into what the issue is would be helpful.

1 Answer


This seems like a problem in the forward implementation involving the mod2 function. Try the PyTorch functions (torch.fmod and torch.remainder), or, if you don't need backprop through that operation, call .detach() before applying mod2.
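For illustration, the two functions differ only in how they handle negative inputs; a minimal sketch (the values here are arbitrary, and I'm assuming the mod2 operation reduces to an elementwise x mod 2):

```python
import torch

a = torch.tensor([5.0, -5.0, 7.5])

# torch.fmod follows C's fmod: the result takes the sign of the dividend
print(torch.fmod(a, 2))       # 1., -1., 1.5

# torch.remainder follows Python's %: the result takes the sign of the divisor
print(torch.remainder(a, 2))  # 1., 1., 1.5

# If gradients through the mod are not needed, detach first:
b = a.detach()
print(torch.fmod(b, 2))
```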

