Share Your Work - Assignment 3 - Feed Forward Neural Networks

Please share your work from Assignment 3 on this thread.

Share your Jupyter notebooks, blog posts, demo videos, etc. to get feedback on your work from the entire community, and you will also get to learn from all the other participants.

Reply to this thread to share what you’re working on. You can share the following:

  • Jupyter notebooks hosted on Jovian. Be sure to add a nice title and a helpful description of your work.
  • Blog posts or tutorials you have written as part of the assignments and course project (yes, you’ll be writing blog posts!)
  • Video demos or animations of how your model is performing
  • Anything else you have created as part of this course, or otherwise.

Note: While commenting on others’ work, please be courteous, supportive, and give constructive feedback to make this a positive learning environment for everyone.

Hey all, check out my Assignment 3: https://jovian.ai/rajibdasbhagat/03-cifar10-feedforward

Is there any scope for improvement in terms of loss and accuracy?

From the previous run of the course I remember that participants got similar scores. You'll probably improve on this in the next lecture by using convolutional layers, but of course nothing stops you from doing it now - it's just that this assignment assumes you use linear layers, so you can understand why conv layers are important in image processing.

As for what you can do now (without leaving the scope of the assignment): increase the number of linear layers, apply different activation functions, or change the loss function. A rough sketch of the first two ideas is below.
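For example, a minimal sketch of a deeper model that stays within the assignment's linear-layer constraint (the layer sizes and the LeakyReLU choice are purely illustrative, not what the assignment requires):

```python
import torch.nn as nn
import torch.nn.functional as F

class DeeperFeedForward(nn.Module):
    """CIFAR-10 classifier with three hidden linear layers."""
    def __init__(self, in_size=3 * 32 * 32, num_classes=10):
        super().__init__()
        self.fc1 = nn.Linear(in_size, 512)
        self.fc2 = nn.Linear(512, 256)
        self.fc3 = nn.Linear(256, 128)
        self.fc4 = nn.Linear(128, num_classes)

    def forward(self, xb):
        out = xb.view(xb.size(0), -1)       # flatten the 3x32x32 images
        out = F.leaky_relu(self.fc1(out))   # swap in other activations here
        out = F.leaky_relu(self.fc2(out))
        out = F.leaky_relu(self.fc3(out))
        return self.fc4(out)                # raw logits, for F.cross_entropy
```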

As for what you can do now (outside of the assignment): convolutional layers, a custom or unusual activation function, or arranging the modules into residual blocks or an inception module; a minimal residual block is sketched below.
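Purely as an out-of-scope illustration, a residual block can be as small as this (the channel count is arbitrary; this is a sketch, not a tuned architecture):

```python
import torch.nn as nn
import torch.nn.functional as F

class ResidualBlock(nn.Module):
    """Two conv layers whose output is added back to the input (skip connection)."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + x)              # the skip connection
```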

Consider changing the optimizer; that worked for me!
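In case it helps anyone, here is a rough sketch of what that swap looks like; `model` and `train_loader` stand for whatever you already have in your notebook:

```python
import torch
import torch.nn.functional as F

# Pick one optimizer; the rest of the training loop stays the same.
opt = torch.optim.SGD(model.parameters(), lr=0.01)       # baseline
# opt = torch.optim.ASGD(model.parameters(), lr=0.01)    # averaged SGD
# opt = torch.optim.Adam(model.parameters(), lr=1e-3)    # adaptive learning rates

for images, labels in train_loader:
    loss = F.cross_entropy(model(images), labels)
    loss.backward()
    opt.step()
    opt.zero_grad()
```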

Deep Learning with PyTorch Assignment 3

Hi,
Assignment 3

My assignment

I have completed Assignment 3 of the deep learning course; have a look…

Here is my Assignment 3…

Hi there! This is my submission for Assignment 3.

I started at 37% accuracy and was able to reach 53% after trying different learning rates in each training phase, decreasing the batch size, adding another layer, replacing the ReLU activation with ReLU6, and replacing the SGD optimizer with ASGD.

I suspect I reached a local optimum and could not improve further.

See the full log below. Any feedback will be very much appreciated.

[screenshot of the training log]

I think it is SGD (Stochastic Gradient Descent). You can also try using the Adam optimizer.

Thanks Birajs! Yes, ASGD is in the same family as plain SGD. Now that you mention Adam, I will give it a try.


My assignment

Here is the link to my Assignment 3

I used ReLU6 and also implemented gradient checking to see whether the model is heading in the right direction.
Assignment 3: Feed Forward Neural Networks
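I'm not sure which form of gradient checking you used, but for anyone else who wants a cheap sanity check, logging the total gradient norm after each backward pass is one option (sketch below; `model` is your own module). PyTorch also ships torch.autograd.gradcheck for numerically verifying custom autograd functions.

```python
import torch

def grad_norm(model):
    """Total L2 norm of all parameter gradients; call after loss.backward()."""
    total = 0.0
    for p in model.parameters():
        if p.grad is not None:
            total += p.grad.detach().norm(2).item() ** 2
    return total ** 0.5

# Usage inside the training loop, right after loss.backward():
# print(f"grad norm = {grad_norm(model):.4f}")
```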

Here is mine! If anything is wrong, please let me know!

assignment #3

activation functions: F.relu and torch.sigmoid
arch = "3 layers (128,64,32)"
lrs = [0.5,0.3,0.20,0.20]
epochs = [10,10,10,5]
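For anyone curious how a schedule like that is driven, here is a minimal sketch; it assumes a `fit(epochs, lr, model, train_loader, val_loader)` helper like the one used in the course notebooks, returning a list of per-epoch results:

```python
# Run training in phases, one (lr, epochs) pair at a time.
lrs = [0.5, 0.3, 0.20, 0.20]
epochs = [10, 10, 10, 5]

history = []
for lr, n_epochs in zip(lrs, epochs):
    history += fit(n_epochs, lr, model, train_loader, val_loader)
```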

Here is what I have done in Assignment 3; I'd love to hear from you all.

Managed to achieve ~54% accuracy on the test dataset with a model architecture of [265, 128, 64, 10].

Note: in the notebook there is a "cifar10 beta" class; if you can figure out what is wrong with it, please reply and tell me what I can do to make it run correctly. Many thanks!


arch = “3 layers (256,64,10)”
lrs = [0.1,0.05,0.01,0.0075,0.001,0.0001]
epochs = [5,5,5,5,5,5]

Hello everyone, here’s my final assignment. Achieved 52.5% accuracy on the test set.
Please check and give feedback.