
10 Monkey Species Classification using Logistic Regression in PyTorch

The dataset can be downloaded from Kaggle: https://www.kaggle.com/slothkong/10-monkey-species

The dataset contains images of the following 10 monkey species:

  • n0 --> alouattapalliata
  • n1 --> erythrocebuspatas
  • n2 --> cacajaocalvus
  • n3 --> macacafuscata
  • n4 --> cebuellapygmea
  • n5 --> cebuscapucinus
  • n6 --> micoargentatus
  • n7 --> saimirisciureus
  • n8 --> aotusnigriceps
  • n9 --> trachypithecusjohnii

The dataset is split into a training folder and a validation folder. Each contains 10 subfolders labelled n0-n9, one per species as listed above.

Each image is at least 400 x 300 px (JPEG format).

  • Total number of images in training folder: 1096
  • Total number of images in validation folder: 272
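To fetch the dataset from inside the notebook, one option is the opendatasets package (a sketch, assuming opendatasets is installed and a Kaggle API key is available; the extracted folder layout may differ from the paths used below):

import opendatasets as od

# Downloads and extracts the dataset (prompts for Kaggle credentials if needed)
od.download('https://www.kaggle.com/slothkong/10-monkey-species')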
In [1]:
# Import relevant libraries

import torch
import jovian
import torchvision
import torchvision.transforms as transforms
import torch.nn as nn
import pandas as pd
import numpy as np
import matplotlib
import matplotlib.pyplot as plt
import seaborn as sns
import torch.nn.functional as F
from torchvision.datasets.utils import download_url
from torch.utils.data import DataLoader, TensorDataset, random_split

from PIL import Image
import glob
In [2]:
project_name='10-Monkey-Species-Classification' # will be used by jovian.commit
In [3]:
jovian.commit(project=project_name)
[jovian] Attempting to save notebook.. [jovian] Updating notebook "karthicksothivelr/10-monkey-species-classification" on https://jovian.ml/ [jovian] Uploading notebook.. [jovian] Capturing environment.. [jovian] Committed successfully! https://jovian.ml/karthicksothivelr/10-monkey-species-classification
In [4]:
# Hyperparameters
batch_size = 16
learning_rate = 1e-3

jovian.reset()
jovian.log_hyperparams(batch_size=batch_size, learning_rate=learning_rate)
[jovian] Hyperparams logged.

Step 1: Load and Explore the data

In [5]:
# Load each image and convert it to a normalized tensor paired with its class index
def image_to_array(images_folder):
    dataset = []
    for i in range(10):
        for filename in glob.glob(images_folder + "/n{}/*.jpg".format(i)):
            im = Image.open(filename)
            im = im.resize((400, 300))             # PIL resize takes (width, height)
            pixels = np.asarray(im).astype('float32')
            pixels /= 255.0                        # scale pixel values to [0, 1]
            pixels = torch.from_numpy(pixels)      # shape: [300, 400, 3] (H, W, C)
            dataset.append((pixels, i))
    return dataset
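Loading every image into memory up front is fine at this scale, but for reference, torchvision can do the same job lazily with ImageFolder. A rough equivalent sketch (note that ToTensor produces channels-first [3, 300, 400] tensors rather than the [300, 400, 3] layout above, so plotting such an image would need .permute(1, 2, 0)):

from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.Resize((300, 400)),   # (height, width), matching the resize above
    transforms.ToTensor(),           # float tensor scaled to [0, 1]
])

# Subfolder names n0-n9 are mapped to class indices 0-9 automatically
train_folder = datasets.ImageFolder("monkey_species/training/training", transform=transform)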
In [6]:
# Load Training Data
train_dataset = image_to_array("monkey_species/training/training")

# Load Test Data
test_dataset = image_to_array("monkey_species/validation/validation")

In [7]:
# View a sample Image
img_tensor, label = train_dataset[0]
print(img_tensor.shape)

plt.imshow(img_tensor)
print('Label:', label)
torch.Size([300, 400, 3]) Label: 0
Notebook Image
In [8]:
# View a sample Image
img_tensor, label = test_dataset[100]
print(img_tensor.shape)

plt.imshow(img_tensor)
print('Label:', label)
torch.Size([300, 400, 3]) Label: 3
Notebook Image
In [9]:
print("Total number of Training Data: ", len(train_dataset), "\nTotal number of Test Data: ", len(test_dataset))
Total number of Training Data: 1096 Total number of Test Data: 272
In [10]:
label_dict = {0:"alouattapalliata", 1:"erythrocebuspatas", 2:"cacajaocalvus", 3:"macacafuscata",
             4:"cebuellapygmea", 5:"cebuscapucinus", 6:"micoargentatus", 7:"saimirisciureus",
             8:"aotusnigriceps", 9:"trachypithecusjohnii"}
In [11]:
jovian.commit(project=project_name)
[jovian] Attempting to save notebook.. [jovian] Updating notebook "karthicksothivelr/10-monkey-species-classification" on https://jovian.ml/ [jovian] Uploading notebook.. [jovian] Capturing environment.. [jovian] Attaching records (metrics, hyperparameters, dataset etc.) [jovian] Committed successfully! https://jovian.ml/karthicksothivelr/10-monkey-species-classification

Prepare the dataset for training

In [12]:
# Hold out 20% of the training data for validation
val_size = round(0.2 * len(train_dataset))
train_size = len(train_dataset) - val_size
train_ds, val_ds = random_split(train_dataset, [train_size, val_size])

# Dataloaders
train_loader = DataLoader(train_ds, batch_size=batch_size, shuffle=True)
val_loader = DataLoader(val_ds, batch_size=batch_size)
test_loader = DataLoader(test_dataset, batch_size=batch_size)
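Note that random_split draws a fresh random split on every run, so the validation metrics below will vary slightly between runs. For a reproducible split, a seeded generator can be passed (a sketch, assuming a PyTorch version that accepts the generator argument):

train_ds, val_ds = random_split(train_dataset, [train_size, val_size],
                                generator=torch.Generator().manual_seed(42))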
In [13]:
# Verify batch
for xb, yb in train_loader:
    print("inputs:", xb)
    print("targets:", yb)
    break
inputs: tensor([[[[0.0078, 0.0196, 0.0000], ...]]])  (full 16 x 300 x 400 x 3 pixel dump truncated)
targets: tensor([1, 8, 3, 6, 4, 4, 9, 3, 2, 1, 1, 8, 4, 5, 5, 2])
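Printing the raw tensors is hard to read; a lighter-weight sanity check along these lines would show only the batch shape:

for xb, yb in train_loader:
    print("inputs shape:", xb.shape)   # expected: torch.Size([16, 300, 400, 3])
    print("targets:", yb)
    break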
In [14]:
jovian.commit(project=project_name)
[jovian] Attempting to save notebook.. [jovian] Updating notebook "karthicksothivelr/10-monkey-species-classification" on https://jovian.ml/ [jovian] Uploading notebook.. [jovian] Capturing environment.. [jovian] Attaching records (metrics, hyperparameters, dataset etc.) [jovian] Committed successfully! https://jovian.ml/karthicksothivelr/10-monkey-species-classification

Create Logistic Regression Model

In [15]:
input_size = 300*400*3
num_classes = len(label_dict)
print("Input Size: ", input_size, "\nNumber of Classes: ", num_classes)
Input Size: 360000 Number of Classes: 10
In [16]:
class MonkeyClassificationModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(input_size, num_classes)
        
    def forward(self, xb):
        xb = xb.reshape(-1, input_size)
        out = self.linear(xb)
        return out
    
    def training_step(self, batch):
        images, labels = batch 
        out = self(images)                  # Generate predictions
        loss = F.cross_entropy(out, labels) # Calculate loss
        return loss
    
    def validation_step(self, batch):
        images, labels = batch 
        out = self(images)                    # Generate predictions
        loss = F.cross_entropy(out, labels)   # Calculate loss
        acc = accuracy(out, labels)           # Calculate accuracy (helper defined below)
        return {'val_loss': loss.detach(), 'val_acc': acc.detach()}
        
    def validation_epoch_end(self, outputs):
        batch_losses = [x['val_loss'] for x in outputs]
        epoch_loss = torch.stack(batch_losses).mean()   # Combine losses
        batch_accs = [x['val_acc'] for x in outputs]
        epoch_acc = torch.stack(batch_accs).mean()      # Combine accuracies
        return {'val_loss': epoch_loss.item(), 'val_acc': epoch_acc.item()}
    
    def epoch_end(self, epoch, result):
        print("Epoch [{}], val_loss: {:.4f}, val_acc: {:.4f}".format(epoch, result['val_loss'], result['val_acc']))
    
model = MonkeyClassificationModel()
list(model.parameters())
Out[16]:
[Parameter containing:
 tensor([[-1.1180e-03,  1.3643e-04,  8.6210e-04,  ...,  1.3941e-03,
          -5.1687e-04,  1.5668e-03],
         [ 8.7920e-04,  5.3320e-04, -1.4293e-03,  ..., -1.2301e-03,
          -5.1245e-04,  4.4275e-04],
         [ 7.4490e-04,  5.3269e-04,  4.8095e-04,  ..., -8.6895e-04,
           8.3951e-05,  5.7988e-04],
         ...,
         [-6.2974e-04,  4.3498e-04,  8.9195e-04,  ...,  1.5886e-03,
          -9.1369e-04, -1.0547e-03],
         [ 1.0745e-03,  1.5523e-04, -1.2529e-03,  ..., -2.4892e-04,
           1.7336e-04,  3.8140e-04],
         [ 1.3334e-03, -7.6073e-04, -1.1721e-03,  ...,  5.7532e-04,
          -9.2551e-04,  8.2609e-04]], requires_grad=True),
 Parameter containing:
 tensor([-9.0110e-04,  1.6071e-03, -9.8587e-04,  1.1254e-03,  1.3491e-03,
         -2.8277e-05,  5.0024e-04, -4.8811e-04,  2.1660e-04,  1.5862e-03],
        requires_grad=True)]
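Before training, it is worth confirming that the untrained model produces logits of the right shape and that softmax turns them into a valid probability distribution. A minimal sketch using the loader defined above:

for images, labels in train_loader:
    outputs = model(images)             # raw logits, shape [batch_size, 10]
    probs = F.softmax(outputs, dim=1)   # convert logits to class probabilities
    print("probs shape:", probs.shape, "| first row sums to:", probs[0].sum().item())
    break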
In [17]:
jovian.commit(project=project_name)
[jovian] Attempting to save notebook.. [jovian] Updating notebook "karthicksothivelr/10-monkey-species-classification" on https://jovian.ml/ [jovian] Uploading notebook.. [jovian] Capturing environment.. [jovian] Attaching records (metrics, hyperparameters, dataset etc.) [jovian] Committed successfully! https://jovian.ml/karthicksothivelr/10-monkey-species-classification

Train the model

In [18]:
def accuracy(outputs, labels):
    _, preds = torch.max(outputs, dim=1)
    return torch.tensor(torch.sum(preds == labels).item() / len(preds))

def evaluate(model, val_loader):
    outputs = [model.validation_step(batch) for batch in val_loader]
    return model.validation_epoch_end(outputs)

def fit(epochs, lr, model, train_loader, val_loader, opt_func=torch.optim.SGD):
    history = []
    optimizer = opt_func(model.parameters(), lr)
    for epoch in range(epochs):
        # Training Phase 
        for batch in train_loader:
            loss = model.training_step(batch)
            loss.backward()
            optimizer.step()
            optimizer.zero_grad()
        # Validation phase
        result = evaluate(model, val_loader)
        model.epoch_end(epoch, result)
        history.append(result)
    return history
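One caveat: evaluate as written runs the validation forward passes with gradient tracking enabled. A variant wrapped in torch.no_grad() (a sketch, not the version used for the results below) avoids building the autograd graph and saves memory:

@torch.no_grad()
def evaluate(model, val_loader):
    # Identical to the version above, but no gradients are tracked
    outputs = [model.validation_step(batch) for batch in val_loader]
    return model.validation_epoch_end(outputs)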
In [19]:
history1 = fit(100, learning_rate, model, train_loader, val_loader)
history2 = fit(100, learning_rate/10, model, train_loader, val_loader)
history3 = fit(100, learning_rate/10, model, train_loader, val_loader)
history4 = fit(100, learning_rate/100, model, train_loader, val_loader)
history5 = fit(100, learning_rate/1000, model, train_loader, val_loader)

history = history1 + history2 + history3 + history4 + history5

accuracies = [r['val_acc'] for r in history]
plt.plot(accuracies, '-x')
plt.xlabel('epoch')
plt.ylabel('accuracy')
plt.title('Accuracy vs. No. of epochs')
Epoch [0], val_loss: 25.2372, val_acc: 0.1205
Epoch [1], val_loss: 13.3127, val_acc: 0.1538
Epoch [2], val_loss: 6.6860, val_acc: 0.3300
... (per-epoch logs truncated; final epoch of each of the five fit runs shown) ...
Epoch [99], val_loss: 2.2194, val_acc: 0.5483   (run 1, lr = 1e-3)
Epoch [99], val_loss: 2.2180, val_acc: 0.5438   (run 2, lr = 1e-4)
Epoch [99], val_loss: 2.2222, val_acc: 0.5483   (run 3, lr = 1e-4)
Epoch [99], val_loss: 2.2226, val_acc: 0.5483   (run 4, lr = 1e-5)
Epoch [99], val_loss: 2.2227, val_acc: 0.5483   (run 5, lr = 1e-6)
Out[19]:
Text(0.5, 1.0, 'Accuracy vs. No. of epochs')
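The validation losses stored in history can be plotted the same way, which makes the effect of each learning-rate drop easier to see:

losses = [r['val_loss'] for r in history]
plt.plot(losses, '-x')
plt.xlabel('epoch')
plt.ylabel('loss')
plt.title('Loss vs. No. of epochs')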
In [20]:
# Evaluate on test dataset
result = evaluate(model, test_loader)
result
Out[20]:
{'val_loss': 2.2965147495269775, 'val_acc': 0.5661764740943909}
Notebook Image
In [21]:
jovian.log_metrics(test_acc=result['val_acc'], test_loss=result['val_loss'])
[jovian] Metrics logged.
In [22]:
jovian.commit(project=project_name)
[jovian] Attempting to save notebook.. [jovian] Updating notebook "karthicksothivelr/10-monkey-species-classification" on https://jovian.ml/ [jovian] Uploading notebook.. [jovian] Capturing environment.. [jovian] Attaching records (metrics, hyperparameters, dataset etc.) [jovian] Committed successfully! https://jovian.ml/karthicksothivelr/10-monkey-species-classification

Make predictions using the trained model

In [23]:
def predict_image(input_img, model):
    inputs = input_img.unsqueeze(0)           # add a batch dimension: [300, 400, 3] -> [1, 300, 400, 3]
    predictions = model(inputs)               # logits for each of the 10 classes
    _, preds = torch.max(predictions, dim=1)  # index of the highest-scoring class
    return preds[0].item()
In [24]:
# Test 1
img, label = test_dataset[200]
plt.imshow(img)
print('Label:', label_dict[label], ', Predicted:', label_dict[predict_image(img, model)])
Label: saimirisciureus , Predicted: saimirisciureus
Notebook Image
In [25]:
# Test 2
img, label = test_dataset[100]
plt.imshow(img)
print('Label:', label_dict[label], ', Predicted:', label_dict[predict_image(img, model)])
Label: macacafuscata , Predicted: macacafuscata
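To see which species the model confuses most, a per-class accuracy tally over the test set can be computed with the helpers defined above (a rough sketch):

from collections import Counter

correct, total = Counter(), Counter()
for img, label in test_dataset:
    total[label] += 1
    correct[label] += int(predict_image(img, model) == label)

for i in range(10):
    print("{:>22}: {:.2f}".format(label_dict[i], correct[i] / total[i]))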