
Changing the number of hidden layers in my NN results in an error


As the title says, if I change the hidden-layer width (N_HIDDEN) in my PyTorch neural network to anything different from the number of input nodes, it returns the error below.

RuntimeError: mat1 and mat2 shapes cannot be multiplied (380x10 and 2x10)
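To show what the two shapes in the message correspond to, here is a minimal snippet that reproduces the same error, assuming the activations after the first layer have shape (380, 10):

import torch
import torch.nn as nn

x = torch.randn(380, 10)   # activations coming out of the first Linear(2, 10) + Tanh
layer = nn.Linear(2, 10)   # next layer still expects 2 input features (weight shape (10, 2))
y = layer(x)               # RuntimeError: mat1 and mat2 shapes cannot be multiplied (380x10 and 2x10)

So mat1 is the (380, 10) batch of activations and mat2 is the transposed (2, 10) weight of the next layer.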

I think the architecture is coded incorrectly, but I am relatively new to PyTorch and neural networks, so I can't spot the mistake. Any help is greatly appreciated; I've included the code below.

import torch
import torch.nn as nn

class FCN(nn.Module):
    def __init__(self, N_INPUT, N_OUTPUT, N_HIDDEN, N_LAYERS):
        super().__init__()
        activation = nn.Tanh
        self.fcs = nn.Sequential(*[
            nn.Linear(N_INPUT, N_HIDDEN),
            activation()])
        self.fch = nn.Sequential(*[
            nn.Sequential(*[
                nn.Linear(N_INPUT, N_HIDDEN),
                activation()]) for _ in range(N_LAYERS - 1)])
        self.fce = nn.Linear(N_INPUT, N_HIDDEN)

    def forward(self, x):
        x = self.fcs(x)
        x = self.fch(x)
        x = self.fce(x)
        return x

torch.manual_seed(123)
pinn = FCN(2, 2, 10, 8)

If the pinn architecture is defined as pinn = FCN(2, 2, 2, 8), no errors are returned, but the neural network does not perform well.
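For reference, here is a sketch of how the dimensions of a fully-connected stack are usually chained so the hidden width can differ from the input size (the class name ChainedFCN is hypothetical, not the code I am running):

import torch
import torch.nn as nn

class ChainedFCN(nn.Module):
    # N_INPUT -> N_HIDDEN, then N_HIDDEN -> N_HIDDEN repeated, then N_HIDDEN -> N_OUTPUT
    def __init__(self, N_INPUT, N_OUTPUT, N_HIDDEN, N_LAYERS):
        super().__init__()
        activation = nn.Tanh
        # first layer maps the input features to the hidden width
        self.fcs = nn.Sequential(nn.Linear(N_INPUT, N_HIDDEN), activation())
        # hidden layers map hidden width to hidden width
        self.fch = nn.Sequential(*[
            nn.Sequential(nn.Linear(N_HIDDEN, N_HIDDEN), activation())
            for _ in range(N_LAYERS - 1)])
        # final layer maps the hidden width to the output size
        self.fce = nn.Linear(N_HIDDEN, N_OUTPUT)

    def forward(self, x):
        x = self.fcs(x)
        x = self.fch(x)
        return self.fce(x)

torch.manual_seed(123)
model = ChainedFCN(2, 2, 10, 8)
out = model(torch.randn(380, 2))   # runs without error, output shape (380, 2)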

Other information:

  • the input is a 2-D tensor with a batch size of 380

Please let me know if you need any more information, and thank you!

