In my CNN model, I observe that the mini-batch accuracy reaches 100% after a few epochs and the loss keeps decreasing. Is it correct to get such results? What is the reason behind these values?

Hi… I am using a CNN for image classification. From the simulation results, I observe that the mini-batch accuracy reaches 100% after a few epochs and the mini-batch loss keeps decreasing, as shown below:

| Epoch | Iteration | Time Elapsed | Mini-batch Accuracy | Mini-batch Loss | Base Learning Rate |
|-------|-----------|--------------|---------------------|-----------------|--------------------|
| 1 | 1 | 00:00:10 | 43.75% | 8.9676 | 1.0000e-04 |
| 1 | 9 | 00:00:34 | 43.75% | 2.3432 | 1.0000e-04 |
| 2 | 18 | 00:00:57 | 68.75% | 0.4471 | 1.0000e-04 |
| 3 | 27 | 00:01:19 | 56.25% | 0.8530 | 1.0000e-04 |
| 4 | 36 | 00:01:43 | 81.25% | 0.4184 | 1.0000e-04 |
| 5 | 45 | 00:02:08 | 93.75% | 0.3022 | 1.0000e-04 |
| 6 | 54 | 00:02:33 | 81.25% | 0.2594 | 1.0000e-04 |
| 7 | 63 | 00:02:59 | 87.50% | 0.5467 | 1.0000e-04 |
| 8 | 72 | 00:03:23 | 93.75% | 0.1394 | 1.0000e-04 |
| 9 | 81 | 00:03:50 | 100.00% | 0.0409 | 1.0000e-04 |
| 10 | 90 | 00:04:14 | 100.00% | 0.0920 | 1.0000e-04 |
| 11 | 99 | 00:04:38 | 100.00% | 0.0318 | 1.0000e-04 |
| 12 | 108 | 00:05:02 | 93.75% | 0.1280 | 1.0000e-04 |
| 13 | 117 | 00:05:26 | 93.75% | 0.1724 | 1.0000e-04 |
| 14 | 126 | 00:05:50 | 87.50% | 0.2529 | 1.0000e-04 |
| 15 | 135 | 00:06:14 | 100.00% | 0.0252 | 1.0000e-04 |
| 16 | 144 | 00:06:38 | 100.00% | 0.0362 | 1.0000e-04 |
| 17 | 153 | 00:07:04 | 100.00% | 0.0345 | 1.0000e-04 |
| 18 | 162 | 00:07:28 | 100.00% | 0.0400 | 1.0000e-04 |
| 19 | 171 | 00:07:53 | 100.00% | 0.0552 | 1.0000e-04 |
| 20 | 180 | 00:08:17 | 100.00% | 0.0704 | 1.0000e-04 |

Is it correct to get such results, or am I doing something wrong?
What is the reason behind these values?

Hey @navneetbrar5, welcome to the forum.
It's completely fine to get such results. You can keep training as long as the loss is decreasing. You are most likely getting 100% accuracy because you are using a dataset that has already been cleaned; it is much harder to reach 100% accuracy on real-world or very limited data. In short, you benefited from clean data, a sufficient amount of data, and well-chosen hyperparameters. Keep in mind that 100% mini-batch accuracy does not guarantee the model will classify every image of a similar class correctly; on images that differ from your dataset, you may still see errors.
Also, I am not sure whether the loss you mention is the validation loss or the training loss. You can continue training until the validation loss plateaus or starts to increase; if the validation loss is flat or rising while the training loss is still decreasing, your model is likely overfitting.
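For reference, here is a minimal sketch of how to track validation loss alongside training loss and stop automatically once the validation loss stops improving. It uses Keras with placeholder dummy data and an arbitrary small architecture; the layer sizes, image shape, class count, and data are assumptions, so substitute your own model and dataset:

```python
import numpy as np
import tensorflow as tf

# Placeholder data: 320 RGB images of size 64x64 with 4 classes.
# Replace with your own training images and labels.
x = np.random.rand(320, 64, 64, 3).astype("float32")
y = np.random.randint(0, 4, size=320)

# A small CNN, roughly comparable to a typical image classifier.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(64, 64, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(4, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Hold out 20% of the data as a validation set and stop training
# once the validation loss has not improved for 3 epochs.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=3, restore_best_weights=True)

history = model.fit(x, y,
                    batch_size=16,
                    epochs=20,
                    validation_split=0.2,
                    callbacks=[early_stop])

# Compare the two curves: training loss falling while validation
# loss rises is the classic sign of overfitting.
print(history.history["loss"])
print(history.history["val_loss"])
```

The key point is the `validation_split`/`val_loss` part: the validation curve, not the training curve, tells you when to stop.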

Hey, thanks for the explanation.
The loss mentioned in the table is the training loss.

Is this network overfitting or not? How can I verify whether the model is overfitting? Can a CNN be applied to small image datasets?

Overfit → high training accuracy, low validation accuracy. Evaluate the model on a held-out validation or test set and compare that accuracy with the training accuracy; a large gap between the two is the sign of overfitting.
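As a concrete check, here is a short sketch continuing the Keras example from the earlier reply (`x`, `y`, and `model` are the placeholder names defined there; the split indices and 10% threshold are arbitrary assumptions, not a fixed rule):

```python
# Split the placeholder data into a training part and a held-out part.
x_train, y_train = x[:256], y[:256]
x_val, y_val = x[256:], y[256:]

# Accuracy on data the model was trained on vs. data it has never seen.
train_loss, train_acc = model.evaluate(x_train, y_train, verbose=0)
val_loss, val_acc = model.evaluate(x_val, y_val, verbose=0)

print(f"training accuracy:   {train_acc:.2%}")
print(f"validation accuracy: {val_acc:.2%}")

# A large gap (e.g. ~100% training accuracy but much lower validation
# accuracy) means the model has memorised the training set.
if train_acc - val_acc > 0.10:
    print("Likely overfitting: training accuracy far exceeds validation accuracy.")
```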