PyTorch – PINN – Loss function not getting updated

I am trying to train a simple PINN, but for some reason the loss is not getting updated.

import torch
import torch.nn as nn

x = torch.linspace(-1, 1, 50).view(-1, 1)
x.requires_grad = True
y = 2 * x * x   # target function y = 2x^2, so dy/dx = 4x

class net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(1, 10)
        self.fc2 = nn.Linear(10, 1)
        self.tanh = nn.Tanh()
        
    def forward(self, x):
        x = self.fc1(x)
        x = self.tanh(x)
        x = self.fc2(x)
        return x
    
model = net()

optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

for epoch in range(5001):
    optimizer.zero_grad()
    pred = model(x)
    
    # loss = criterion(pred, y)
    dydx = torch.autograd.grad(pred, x, torch.ones_like(x), retain_graph=True)[0]

    loss = 0.01 * torch.sum(dydx - 4 * x)
    
    loss.backward(retain_graph=True)
    optimizer.step()
    
    if epoch % 100 == 0:
        print(f'Epoch : {epoch}\tLoss: {loss.item()}')
        print(torch.sum(dydx - 4 * x).item())

I tried varying the coefficient on the loss term, but the loss printed every 100 epochs never changes from its initial value.
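
To narrow this down, one can print the parameter gradients right after loss.backward() (a minimal debugging sketch, not part of the original code):

for name, p in model.named_parameters():
    # .grad stays None if backward() never deposits a gradient here
    print(name, None if p.grad is None else p.grad.abs().sum().item())

In the loop above, this should print None for every parameter, i.e. backward() is not propagating anything into the weights.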

  • What do you mean by the loss not getting updated? What exactly happens?

  • After each epoch, when I print out the loss, it is the same as the initial one. The weights are not getting updated.

  • Can you check whether optimizer.step() changes the parameters?

  • The model parameters are not getting updated.

  • Oh, I think you need to calculate the gradient of the loss and not just the gradient of the model; see the sketch below.

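For reference, here is a minimal sketch of the fix the last comment points toward, assuming the intended equation is dy/dx = 4x (the derivative of the target y = 2x^2). Two things change: autograd.grad gets create_graph=True, so dydx stays connected to the network weights and loss.backward() can reach them, and the residual is squared, because the signed sum torch.sum(dydx - 4*x) can be driven arbitrarily negative without the equation being satisfied:

for epoch in range(5001):
    optimizer.zero_grad()
    pred = model(x)

    # create_graph=True keeps dydx differentiable w.r.t. the weights;
    # without it, dydx is a plain tensor detached from the parameters.
    dydx = torch.autograd.grad(pred, x, torch.ones_like(pred),
                               create_graph=True)[0]

    # Mean squared residual of the ODE dy/dx - 4x = 0; zero residual is
    # now the actual minimum of the loss.
    loss = torch.mean((dydx - 4 * x) ** 2)

    loss.backward()
    optimizer.step()

    if epoch % 100 == 0:
        print(f'Epoch: {epoch}\tLoss: {loss.item()}')

Note that grad_outputs here is torch.ones_like(pred) rather than torch.ones_like(x); the two happen to have the same shape in this example, but the vector must match the shape of the network output.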
