
Implementing the Gradient Descent Algorithm

In this lab, we'll implement the basic functions of the gradient descent algorithm to find the decision boundary for a small dataset. First, we'll define some helper functions for plotting and visualizing the data.

import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

#Some helper functions for plotting and drawing lines

def plot_points(X, y):
    # Split the points by label and scatter-plot each class
    admitted = X[y == 1]
    rejected = X[y == 0]
    plt.scatter(rejected[:, 0], rejected[:, 1], s=25, color='blue', edgecolor='k')
    plt.scatter(admitted[:, 0], admitted[:, 1], s=25, color='red', edgecolor='k')

def display(m, b, color='g--'):
    # Draw the line y = m*x + b over the plotting window
    plt.xlim(-0.05, 1.05)
    plt.ylim(-0.05, 1.05)
    x = np.arange(-10, 10, 0.1)
    plt.plot(x, m*x + b, color)

Reading and plotting the data

data = pd.read_csv('data.csv', header=None)
X = np.array(data[[0,1]])  # first two columns are the input coordinates
y = np.array(data[2])      # third column is the label (1 = admitted, 0 = rejected)
plot_points(X, y)
plt.show()
data.head()

TODO: Implementing the basic functions

Now it's your turn to shine. Implement the following formulas, as explained in the text. If you'd like to check your work, a sketch of one possible implementation follows the list.

  • Sigmoid activation function

$$\sigma(x) = \frac{1}{1+e^{-x}}$$

  • Output (prediction) formula

$$\hat{y} = \sigma(w_1 x_1 + w_2 x_2 + b)$$

  • Error function

$$Error(y, \hat{y}) = -y \log(\hat{y}) - (1-y) \log(1-\hat{y})$$

  • The function that updates the weights

$$w_i \longrightarrow w_i + \alpha (y - \hat{y}) x_i$$

$$b \longrightarrow b + \alpha (y - \hat{y})$$
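
Here is a minimal sketch of how these four formulas could translate into NumPy. The function names (sigmoid, output_formula, error_formula, update_weights) and the learnrate parameter are naming choices made for this sketch, not prescribed above; treat it as one possible implementation to compare against, not the official solution.

import numpy as np

def sigmoid(x):
    # Sigmoid activation: maps any real x into (0, 1)
    return 1 / (1 + np.exp(-x))

def output_formula(features, weights, bias):
    # Prediction: y_hat = sigmoid(w1*x1 + w2*x2 + b)
    return sigmoid(np.dot(features, weights) + bias)

def error_formula(y, output):
    # Log-loss error for a single prediction
    return -y * np.log(output) - (1 - y) * np.log(1 - output)

def update_weights(x, y, weights, bias, learnrate):
    # One gradient descent step for a single point (x, y)
    output = output_formula(x, weights, bias)
    d_error = y - output
    weights = weights + learnrate * d_error * x
    bias = bias + learnrate * d_error
    return weights, bias

In a training loop, update_weights would typically be applied to every point in X once per epoch, with a small learning rate such as 0.01.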