How to set a different learning rate for one layer in PyTorch

I'm fine-tuning a ResNet-50 in PyTorch and want to set the learning rate of the last fully connected layer to 1e-3 and the learning rate of all the other layers to 1e-6. I know I can follow the approach shown in the documentation:

optim.SGD([{'params': model.base.parameters()},
           {'params': model.classifier.parameters(), 'lr': 1e-3}], 
          lr=1e-2, momentum=0.9)

But is there a way to do this without having to list the parameters layer by layer?
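For reference, one way to sketch what I'm doing now: collect the final layer's parameters (for torchvision's ResNet-50 that layer is named `fc`), and give every remaining parameter the smaller rate. The tiny model below is a stand-in so the snippet is self-contained; with torchvision you would use `models.resnet50()` and `model.fc` instead.

```python
import torch
import torch.nn as nn

# Stand-in for resnet50: a backbone plus a final `fc` layer,
# mirroring torchvision's attribute name.
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Linear(8, 8)  # stands in for the conv backbone
        self.fc = nn.Linear(8, 2)        # stands in for model.fc

model = TinyNet()

# Separate the fc parameters from everything else by identity,
# so the other layers never have to be named individually.
fc_params = list(model.fc.parameters())
fc_ids = {id(p) for p in fc_params}
base_params = [p for p in model.parameters() if id(p) not in fc_ids]

optimizer = torch.optim.SGD(
    [{'params': base_params, 'lr': 1e-6},  # all layers except fc
     {'params': fc_params, 'lr': 1e-3}],   # last fully connected layer
    momentum=0.9,
)
```

This still builds two parameter groups, which is why I'm asking whether PyTorch offers something more direct.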
