This is a mirror of the fastai notebooks on Jovian for our DSNet meetup. Original repo: https://github.com/fastai/course-v3

import jovian                  # used to commit and share this notebook on Jovian
%matplotlib inline
from fastai.basics import *    # core fastai/PyTorch utilities (tensors, plotting helpers)

In this part of the lecture we explain Stochastic Gradient Descent (SGD), an optimization method commonly used to train neural networks. We will illustrate the concepts with concrete examples.
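As a rough preview (not the lecture's own code), the sketch below shows what one run of plain gradient descent on a toy linear model looks like in PyTorch. The data, parameter names (x, y, a), learning rate, and number of iterations are all illustrative assumptions.

import torch

# Toy data: targets generated from y = 3*x + 2 plus a little noise (assumed values)
n = 100
x = torch.ones(n, 2)
x[:, 0].uniform_(-1., 1.)            # random inputs in the first column, bias column of ones in the second
true_params = torch.tensor([3., 2.])
y = x @ true_params + torch.randn(n) * 0.1

# Parameters to learn, tracked by autograd
a = torch.zeros(2, requires_grad=True)
lr = 0.1                             # assumed learning rate

for epoch in range(100):
    y_hat = x @ a                    # predictions of the linear model
    loss = ((y_hat - y) ** 2).mean() # mean squared error
    loss.backward()                  # gradients of the loss with respect to a
    with torch.no_grad():
        a -= lr * a.grad             # gradient descent update
        a.grad.zero_()               # reset gradients for the next iteration

print(a)                             # should end up close to the true parameters [3., 2.]

The lesson itself builds this up step by step on a linear regression problem, which is introduced next.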

Linear Regression problem