Linear Regression & Gradient Descent

Hi,

In the Linear Regression & Gradient Descent lesson, why do we always subtract the gradient in the update step?

What if w.grad is negative? Shouldn't we write it as w += w.grad * 1e-5 in that case?

Code snippet:

with torch.no_grad():
    w -= w.grad * 1e-5   # step the weights against the gradient
    b -= b.grad * 1e-5   # step the bias the same way
    w.grad.zero_()       # reset the gradients so they don't accumulate
    b.grad.zero_()
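
For context, here is a minimal self-contained version of the kind of loop I am asking about. The toy data, learning rate, and epoch count below are my own, just for illustration:

import torch

# Toy data: y = 3x + 2 plus a little noise (made-up numbers, for illustration only)
x = torch.randn(100, 1)
y = 3 * x + 2 + 0.1 * torch.randn(100, 1)

# Parameters with gradient tracking enabled
w = torch.randn(1, requires_grad=True)
b = torch.randn(1, requires_grad=True)

for epoch in range(500):
    preds = x * w + b                 # linear model
    loss = ((preds - y) ** 2).mean()  # mean squared error
    loss.backward()                   # fills in w.grad and b.grad

    with torch.no_grad():
        w -= w.grad * 1e-2            # update step (larger rate than 1e-5, since this toy data is small)
        b -= b.grad * 1e-2
        w.grad.zero_()                # clear gradients before the next iteration
        b.grad.zero_()

print(w.item(), b.item())  # should end up near 3 and 2

Even here the update is always w -= w.grad * ..., whatever the sign of w.grad turns out to be, which is the part I am confused about.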

Thanks in advance