
Jovian is a platform that helps data scientists and ML engineers

- track & reproduce data science projects
- collaborate easily with friends/colleagues, and
- automate repetitive tasks in their day-to-day workflow.

It's really easy to get started with Jovian!


To get started, install the `jovian` Python library. You can do this from the terminal, or directly within a Jupyter notebook.

Part 1 of "PyTorch: Zero to GANs". This post is the first in a series of tutorials on building deep learning models with PyTorch, an open-source neural network library developed and maintained by Facebook. Check out the full series:

- PyTorch Basics: Tensors & Gradients
- Linear Regression & Gradient Descent
- Image Classification using Logistic Regression
- Training Deep Neural Networks on a GPU
- Coming soon: CNNs, RNNs, GANs, etc.

This series attempts to make PyTorch a bit more approachable for people starting out with deep learning and neural networks. In this post, we’ll cover the basic building blocks of PyTorch models: tensors and gradients.

In [2]:

`!pip install jovian -q --upgrade`

In [3]:

`import jovian`


After writing some code, running some experiments, training some models and plotting some charts, you can save and commit your Jupyter notebook.

In [4]:

`jovian.commit()`

```
[jovian] Saving notebook..
```

```
[jovian] Please enter your API key ( from https://jovian.ml/ ):
API Key:········
[jovian] Updating notebook "50fa86388c8b42dcbc4b12ee6a8d372e" on https://jovian.ml/
[jovian] Uploading notebook..
[jovian] Capturing environment..
[jovian] Committed successfully! https://jovian.ml/sammarhashmi/jovian-demo-50fa8
```

Here's what `jovian.commit` does:

- It saves and uploads the Jupyter notebook to your Jovian account.
- It captures and uploads the Python virtual environment containing the list of libraries required to run your notebook.
- It returns a link that you can use to view and share your notebook with friends or colleagues.

**NOTE**: When you run `jovian.commit` for the first time, you'll be asked to provide an API key, which you can find on your Jovian account.

Once a notebook is uploaded to Jovian, anyone (including you) can download the notebook and its Python dependencies by running the `jovian clone <notebook_id>` command on the Linux/Mac terminal or Windows Command Prompt. Try clicking the 'Clone' button at the top of this page to copy the command (including the notebook ID) to the clipboard.

```
pip install jovian --upgrade
jovian clone 903a04b17036436b843d70443ef5d7ad
```

Once cloned, you can enter the directory and set up the virtual environment using `jovian install`.

```
cd jovian-demo
jovian install
```

Jovian uses conda internally, so make sure you have it installed before running the above commands. Once the libraries are installed, you can activate the environment and start Jupyter in the usual way:

```
conda activate jovian-demo
jupyter notebook
```

In this way, Jovian seamlessly ensures the end-to-end reproducibility of your Jupyter notebooks.

Updating existing notebooks is really easy too! Just run `jovian.commit` once again, and Jovian will automatically identify and update the current notebook on your Jovian account.

In [5]:

```
# Updating the notebook
jovian.commit()
```

```
[jovian] Saving notebook..
```

```
[jovian] Updating notebook "50fa86388c8b42dcbc4b12ee6a8d372e" on https://jovian.ml/
[jovian] Uploading notebook..
[jovian] Capturing environment..
[jovian] Committed successfully! https://jovian.ml/sammarhashmi/jovian-demo-50fa8
```

Jovian keeps track of existing notebooks using a `.jovianrc` file next to your notebook. If you don't want to update the current notebook, but create a new notebook instead, simply delete the `.jovianrc` file. Note that if you rename your notebook, Jovian will upload a new notebook when you commit, instead of updating the old one.

If you run into issues with updating a notebook, or want to replace a notebook in your account with a new/renamed notebook, you can provide the `notebook_id` argument to `jovian.commit`.

In [6]:

`jovian.commit(notebook_id="903a04b17036436b843d70443ef5d7ad")`

```
[jovian] Saving notebook..
```

```
[jovian] Creating a new notebook on https://jovian.ml/
[jovian] Uploading notebook..
[jovian] Capturing environment..
[jovian] Committed successfully! https://jovian.ml/sammarhashmi/jovian-demo-6d5d6
```

Once a notebook has been updated, the new changes can be retrieved at any cloned location using the `jovian pull` command.

```
cd jovian-demo # Enter cloned directory
jovian pull # Pull the latest changes
```

You can also include additional files like Python scripts and output files while committing a notebook to Jovian, using the `files` and `artifacts` arguments.

- Use `files` to include Python scripts, input CSVs, and anything else you need to execute your notebook.
- Use `artifacts` to include the outputs of your notebook (trained models, output images, CSVs, etc.)

Let's look at an example. I'm going to use a function called `sigmoid` imported from a Python script called `utils.py`.

In [ ]:

```
import numpy as np
from utils import sigmoid
inputs = np.array([1, 2, 3, 4, 5, 6, 7, 8])
outputs = sigmoid(inputs)
print(outputs)
np.savetxt("outputs.csv", outputs, delimiter=",")
```

In [ ]:

`!cat utils.py`

In [ ]:

`!cat outputs.csv`

In [10]:

`jovian.commit(files=['utils.py'], artifacts=['outputs.csv'])`

```
[jovian] Saving notebook..
```

```
[jovian] Updating notebook "6d5d65f285104b8986b92d6a3a63d3a9" on https://jovian.ml/
[jovian] Uploading notebook..
[jovian] Capturing environment..
[jovian] Uploading additional files..
[jovian] Uploading artifacts..
[jovian] Committed successfully! https://jovian.ml/sammarhashmi/jovian-demo-6d5d6
```

In [11]:

`import torch`

In [12]:

```
# Number
t1 = torch.tensor(4.)
```

In [13]:

```
t1
```

Out[13]:

`tensor(4.)`

In [14]:

`t1.dtype`

Out[14]:

`torch.float32`

In [15]:

```
# Vector
t2 = torch.tensor([1., 2, 3, 4])
t2
```

Out[15]:

`tensor([1., 2., 3., 4.])`

In [16]:

```
# Matrix
t3 = torch.tensor([[5., 6], [7, 8], [9, 10]])
t3
```

Out[16]:

```
tensor([[ 5.,  6.],
        [ 7.,  8.],
        [ 9., 10.]])
```

In [17]:

```
# 3-dimensional array
t4 = torch.tensor([[[11, 12, 13], [13, 14, 15]], [[15, 16, 17], [17, 18, 19]]])
t4
```

Out[17]:

```
tensor([[[11, 12, 13],
         [13, 14, 15]],

        [[15, 16, 17],
         [17, 18, 19]]])
```

In [18]:

`t1.shape`

Out[18]:

`torch.Size([])`

In [19]:

`t2.shape`

Out[19]:

`torch.Size([4])`

In [20]:

`t3.shape`

Out[20]:

`torch.Size([3, 2])`

In [21]:

`t4.shape`

Out[21]:

`torch.Size([2, 2, 3])`
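The pattern above is worth internalizing: a scalar has an empty shape, a vector has one dimension, a matrix two, and so on. As a quick hedged sketch (not part of the original notebook), `torch.reshape` lets you reinterpret the same elements with a different shape, as long as the total number of elements matches:

```python
import torch

# A vector of 6 elements can be viewed as a 2x3 or a 3x2 matrix
t = torch.tensor([1., 2., 3., 4., 5., 6.])
print(t.shape)            # torch.Size([6])

m = t.reshape(2, 3)
print(m.shape)            # torch.Size([2, 3])

# -1 asks PyTorch to infer the remaining dimension
n = t.reshape(3, -1)
print(n.shape)            # torch.Size([3, 2])
```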

In [22]:

```
# Create tensors
x = torch.tensor(3.)
w = torch.tensor(4., requires_grad=True)
b = torch.tensor(5., requires_grad=True)
```

In [23]:

```
# Arithmetic operations
y = w * x + b
y
```

Out[23]:

`tensor(17., grad_fn=<AddBackward0>)`

In [24]:

```
# Compute derivatives
y.backward()
```

In [25]:

```
# Display gradients
print('dy/dx:', x.grad)
print('dy/dw:', w.grad)
print('dy/db:', b.grad)
```

```
dy/dx: None
dy/dw: tensor(3.)
dy/db: tensor(1.)
```
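Note that `dy/dx` is `None` because `x` was created without `requires_grad=True`, so PyTorch doesn't track gradients for it. As a small sketch (not in the original notebook), enabling gradient tracking on `x` as well makes its gradient available after `backward()`:

```python
import torch

# Same computation as above, but x is also tracked
x = torch.tensor(3., requires_grad=True)
w = torch.tensor(4., requires_grad=True)
b = torch.tensor(5., requires_grad=True)

y = w * x + b
y.backward()

print('dy/dx:', x.grad)   # tensor(4.), since dy/dx = w
print('dy/dw:', w.grad)   # tensor(3.), since dy/dw = x
print('dy/db:', b.grad)   # tensor(1.)
```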

Numpy is a popular open source library used for mathematical and scientific computing in Python. It enables efficient operations on large multi-dimensional arrays, and has a large ecosystem of supporting libraries:

- Matplotlib for plotting and visualization
- OpenCV for image and video processing
- Pandas for file I/O and data analysis

Instead of reinventing the wheel, PyTorch interoperates really well with Numpy to leverage its existing ecosystem of tools and libraries.

In [26]:

```
import numpy as np
x = np.array([[1, 2], [3, 4.]])
x
```

Out[26]:

```
array([[1., 2.],
[3., 4.]])
```

We can convert a Numpy array to a PyTorch tensor using `torch.from_numpy`.

In [27]:

```
# Convert the numpy array to a torch tensor.
y = torch.from_numpy(x)
y
```

Out[27]:

```
tensor([[1., 2.],
[3., 4.]], dtype=torch.float64)
```

Let's verify that the Numpy array and the torch tensor have matching data types.

In [28]:

`x.dtype, y.dtype`

Out[28]:

`(dtype('float64'), torch.float64)`

We can convert a PyTorch tensor to a Numpy array using the `.numpy` method of a tensor.

In [29]:

```
# Convert a torch tensor to a numpy array
z = y.numpy()
z
```

Out[29]:

```
array([[1., 2.],
[3., 4.]])
```

The interoperability between PyTorch and Numpy is really important because most datasets you'll work with will likely be read and preprocessed as Numpy arrays.
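One subtlety worth knowing (a hedged aside, not covered in the original notebook): `torch.from_numpy` shares memory with the source array, while `torch.tensor` makes an independent copy. Modifying the Numpy array after conversion therefore affects the shared tensor but not the copied one:

```python
import numpy as np
import torch

a = np.array([1., 2., 3.])

shared = torch.from_numpy(a)  # shares memory with `a`
copied = torch.tensor(a)      # makes an independent copy

a[0] = 100.
print(shared)  # tensor([100., 2., 3.], dtype=torch.float64)
print(copied)  # tensor([1., 2., 3.], dtype=torch.float64)
```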

As a final step, we can save and commit our work using the `jovian` library.

In [30]:

`import jovian`

In [31]:

`jovian.commit()`

```
[jovian] Saving notebook..
```

```
[jovian] Updating notebook "6d5d65f285104b8986b92d6a3a63d3a9" on https://jovian.ml/
[jovian] Uploading notebook..
[jovian] Capturing environment..
[jovian] Committed successfully! https://jovian.ml/sammarhashmi/jovian-demo-6d5d6
```

Tensors in PyTorch support a variety of operations, and what we've covered here is by no means exhaustive. You can learn more about tensors and tensor operations here: https://pytorch.org/docs/stable/tensors.html
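To give a flavor of that variety, here is a short sketch (not from the original notebook) of a few common operations on the kinds of matrices used above:

```python
import torch

a = torch.tensor([[1., 2.], [3., 4.]])
b = torch.tensor([[5., 6.], [7., 8.]])

print(a + b)      # elementwise addition
print(a * b)      # elementwise multiplication
print(a @ b)      # matrix multiplication
print(a.t())      # transpose
print(a.sum())    # sum of all elements -> tensor(10.)
```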

You can take advantage of the interactive Jupyter environment to experiment with tensors and try different combinations of operations discussed above. Here are some things to try out:

- What if one or more of `x`, `w` or `b` were matrices instead of numbers in the above example? What would the result `y` and the gradients `w.grad` and `b.grad` look like in this case?

- What if `y` was a matrix created using `torch.tensor`, with each element of the matrix expressed as a combination of numeric tensors `x`, `w` and `b`?

- What if we had a chain of operations instead of just one, i.e. `y = x * w + b`, `z = l * y + m`, `e = c * z + d` and so on? What would calling `e.backward()` do?

If you're interested, you can learn more about matrix derivatives on Wikipedia (although it's not necessary for following along with this series of tutorials): https://en.wikipedia.org/wiki/Matrix_calculus#Derivatives_with_matrices

In [ ]:

`jovian.commit()`

```
[jovian] Saving notebook..
```
