!pip install torch --upgrade
Collecting torch
Successfully installed torch-1.5.1
!pip install jovian --upgrade
Collecting jovian
Collecting uuid
Successfully installed jovian-0.2.15 uuid-1.30

Introduction to Deep Learning with PyTorch

In this notebook, you'll get introduced to PyTorch, a framework for building and training neural networks. In many ways, PyTorch tensors behave like the NumPy arrays you already know; NumPy arrays are, after all, just tensors. PyTorch takes these tensors and makes it simple to move them to GPUs for the faster processing needed when training neural networks. It also provides a module that automatically calculates gradients (for backpropagation!) and another module specifically for building neural networks. Altogether, PyTorch fits more coherently with Python and the NumPy/SciPy stack than TensorFlow and many other frameworks.
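As a quick sketch of that NumPy connection, PyTorch can wrap an existing NumPy array with `torch.from_numpy` and convert back with `.numpy()`. Note that both conversions share memory with the original array, so in-place tensor operations are visible on the NumPy side as well:

```python
import numpy as np
import torch

a = np.random.rand(4, 3)
b = torch.from_numpy(a)   # tensor view sharing memory with the NumPy array
c = b.numpy()             # convert back; still the same underlying memory

b.mul_(2)                 # in-place multiply on the tensor...
print(np.allclose(a, c))  # ...is reflected in both NumPy views: True
```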

Neural Networks

Deep learning is based on artificial neural networks, which have been around in some form since the late 1950s. The networks are built from individual parts approximating neurons, typically called units or simply "neurons." Each unit has some number of weighted inputs. These weighted inputs are summed together (a linear combination) and then passed through an activation function to get the unit's output.


Mathematically this looks like:

$$
\begin{align}
y &= f(w_1 x_1 + w_2 x_2 + b) \\
y &= f\left(\sum_i w_i x_i + b \right)
\end{align}
$$

With vectors this is the dot/inner product of two vectors:

$$
h = \begin{bmatrix} x_1 \, x_2 \cdots x_n \end{bmatrix} \cdot \begin{bmatrix} w_1 \\ w_2 \\ \vdots \\ w_n \end{bmatrix}
$$
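The two formulations above (an elementwise multiply-and-sum, and a dot product) can be checked against each other in PyTorch. Here is a minimal sketch of a single unit, using a sigmoid activation as an illustrative choice of `f` (the shapes and seed are assumptions for the example, not anything fixed by the math):

```python
import torch

torch.manual_seed(7)  # reproducible random values for the example

features = torch.randn((1, 5))          # a 1 x 5 input vector x
weights = torch.randn_like(features)    # weights w, same shape as x
bias = torch.randn((1, 1))              # bias term b

# y = f(sum_i w_i x_i + b), written as an elementwise product then a sum
y = torch.sigmoid(torch.sum(features * weights) + bias)

# Same computation as a dot product: h = x . w, via matrix multiplication
h = torch.matmul(features, weights.T) + bias
y_mm = torch.sigmoid(h)

print(torch.allclose(y, y_mm))  # True: both forms give the same output
```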

Tensors

It turns out neural network computations are just a bunch of linear algebra operations on tensors, a generalization of matrices. A vector is a 1-dimensional tensor, a matrix is a 2-dimensional tensor, and an array with three indices is a 3-dimensional tensor (RGB color images, for example). Tensors are the fundamental data structure of neural networks, and PyTorch (like pretty much every other deep learning framework) is built around them.
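For instance, the three cases just listed look like this as PyTorch tensors (the specific shapes here are just illustrative):

```python
import torch

v = torch.tensor([1., 2., 3.])   # 1-D tensor: a vector
m = torch.ones(2, 3)             # 2-D tensor: a matrix
img = torch.zeros(3, 28, 28)     # 3-D tensor: e.g. an RGB image (channels, height, width)

print(v.ndim, m.ndim, img.ndim)  # number of dimensions: 1 2 3
print(m.shape)                   # torch.Size([2, 3])
```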


With the basics covered, it's time to explore how we can use PyTorch to build a simple neural network.