PyTorch: extract the learned weights correctly

I am trying to extract the weights from a linear layer, but they do not appear to change, even though the loss decreases monotonically (i.e. training is happening). Printing the sum of the weights gives the same value every time:

np.sum(model.fc2.weight.data.numpy())

Here are the code snippets:

def train(epochs):
    model.train()
    for epoch in range(1, epochs+1):
        # Train on train set
        print(np.sum(model.fc2.weight.data.numpy()))
        for batch_idx, (data, target) in enumerate(train_loader):
            data, target = Variable(data), Variable(data)
            optimizer.zero_grad()
            output = model(data)
            loss = criterion(output, target)
            loss.backward()
            optimizer.step()


and

# Define model
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        # an affine operation: y = Wx + b
        self.fc1 = nn.Linear(100, 80, bias=False)
        init.normal(self.fc1.weight, mean=0, std=1)
        self.fc2 = nn.Linear(80, 87)
        self.fc3 = nn.Linear(87, 94)
        self.fc4 = nn.Linear(94, 100)

    def forward(self, x):
        x = self.fc1(x)
        x = F.relu(self.fc2(x))
        x = F.relu(self.fc3(x))
        x = F.relu(self.fc4(x))
        return x
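
For completeness, the criterion and optimizer used in train() are created along these lines (a sketch; the exact loss function, optimizer, and learning rate are placeholders):

import torch.nn as nn
import torch.optim as optim

model = Net()
criterion = nn.MSELoss()                             # placeholder loss
optimizer = optim.SGD(model.parameters(), lr=0.01)   # placeholder optimizer and learning rate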


Maybe I am looking at the wrong parameter, although I did check the docs. Thanks for your help!





1 answer


Use model.parameters() to get the trainable weights of any model or layer. Remember to wrap it in list(), otherwise printing it will only show the generator object rather than the values.

Here is a minimal example:



>>> import torch
>>> import torch.nn as nn
>>> l = nn.Linear(3,5)
>>> w = list(l.parameters())
>>> w
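
To confirm that the weights really change during training, one simple check is to clone a layer's weight before an optimizer step and compare it afterwards. A minimal sketch (toy layer, random data, current PyTorch API without Variable; none of these values are taken from the question):

import torch
import torch.nn as nn
import torch.optim as optim

layer = nn.Linear(3, 5)                              # toy layer, just for the demonstration
optimizer = optim.SGD(layer.parameters(), lr=0.1)

before = layer.weight.detach().clone()               # snapshot of the weight tensor

x = torch.randn(8, 3)                                # random input batch
target = torch.randn(8, 5)                           # random target
loss = nn.MSELoss()(layer(x), target)

optimizer.zero_grad()
loss.backward()
optimizer.step()

print(torch.equal(before, layer.weight))             # False: the step changed the weights
print((layer.weight - before).abs().sum().item())    # total absolute change, non-zero

If this kind of check shows no change for fc2, the layer's parameters are either not registered with the optimizer or not receiving useful gradients; if it does show a change, the weights are updating and only the way they are read out needs adjusting.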










