w.grad is of NoneType when used inside a function

    import torch

    def linear_regression_func(inputs, targets, iteration, learning_rate):
        inputs = torch.from_numpy(inputs)
        targets = torch.from_numpy(targets)
        # initialize weights and bias
        w = torch.randn(2, 3, requires_grad=True)
        b = torch.randn(2, requires_grad=True)
        for i in range(iteration):
            predicts = model(inputs)
            loss = mse(targets, predicts)
            loss.backward()
            with torch.no_grad():
                w -= w.grad * learning_rate
                b -= b.grad * learning_rate
                w.grad.zero_()
                b.grad.zero_()
        return w, b

Does anyone know why w.grad is None when this code runs inside a function? The exact same code runs as expected when it is not wrapped in a function.

error message:

         16           with torch.no_grad():
    ---> 17             w -= w.grad * learning_rate
         18             b -= b.grad * learning_rate
         19             w.grad.zero_()

    TypeError: unsupported operand type(s) for *: 'NoneType' and 'float'

You wrapped this in a torch.no_grad() block. This means that any tensor computed inside it won't have a grad (it will be None). So w.grad or b.grad is None, which you then try to multiply by learning_rate, which is probably a float. That's why you get an error saying you can't multiply None by a float.
Just noticed that you call backward() outside this block. Damn, I'm a moron lately :stuck_out_tongue:
More importantly, I think you never involve the w and b variables in your calculations (they should be used somehow inside the model function). If they aren't involved in computing the predicts tensor, they won't have a grad, since they don't contribute to that result.
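
For example, here is a minimal sketch of how the function could define model and mse locally so that predicts actually depends on w and b. The model and mse definitions below are my assumption, since they aren't shown in your snippet:

    import torch
    import numpy as np

    def linear_regression_func(inputs, targets, iteration, learning_rate):
        # cast to float32 so dtypes match the randn-initialized parameters
        inputs = torch.from_numpy(inputs).float()
        targets = torch.from_numpy(targets).float()
        w = torch.randn(2, 3, requires_grad=True)
        b = torch.randn(2, requires_grad=True)

        def model(x):
            # w and b participate in the forward pass,
            # so they are recorded in the autograd graph
            return x @ w.t() + b

        def mse(t, p):
            return ((t - p) ** 2).mean()

        for i in range(iteration):
            predicts = model(inputs)
            loss = mse(targets, predicts)
            loss.backward()  # now actually populates w.grad and b.grad
            with torch.no_grad():
                w -= w.grad * learning_rate
                b -= b.grad * learning_rate
                w.grad.zero_()
                b.grad.zero_()
        return w, b

    # e.g. linear_regression_func(np.random.randn(5, 3), np.random.randn(5, 2), 100, 1e-3)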

Also, note that you aren’t invoking any grad() method or function; grad is just an attribute of the tensor object.
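
A quick standalone illustration of that attribute (my own example, not from the original code):

    import torch

    x = torch.randn(3, requires_grad=True)
    print(x.grad)    # None -- nothing has called backward() yet
    loss = (x ** 2).sum()
    loss.backward()  # autograd fills in x.grad
    print(x.grad)    # tensor with d(loss)/dx = 2 * x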


Thank you for taking the time to explain this. :smiley: That is exactly what I was looking for.


I can’t see the relation between the loss and the weights and bias. If you make the loss actually depend on w and b, your problem should be resolved.
