Here are three links you should check out today:
- [Cheatsheet] Machine Learning Super VIP Cheatsheet: This repository sums up, in one place, all the important notions covered in Stanford’s CS 229 Machine Learning course, including refreshers, cheatsheets, and an ultimate compilation of concepts.
- [Paper] Annotated Deep Learning Research Papers: Check out this repository by Akash Nain if you find research papers intimidating and are looking for annotated papers in ML & DL that are much easier to understand. https://github.com/AakashKumarNain/annotated_research_papers
- [Tutorial] CNN Model Compression with Keras: Knowledge Distillation is a procedure for model compression, in which a small (student) model is trained to match a large pre-trained (teacher) model. Learn to implement knowledge distillation using Keras.
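The core of knowledge distillation is the loss that pushes the student toward the teacher's "soft" predictions. As a minimal, framework-free sketch of that idea (the temperature value and function names here are illustrative assumptions, not the tutorial's actual code):

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Soften the logits by dividing by a temperature before normalizing;
    # higher temperature spreads probability mass over more classes.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=3.0):
    # KL divergence between the teacher's softened distribution (soft targets)
    # and the student's softened predictions, scaled by T^2 so gradient
    # magnitudes stay comparable as the temperature changes.
    p = softmax(teacher_logits, temperature)  # teacher soft targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return (temperature ** 2) * kl.mean()
```

In practice this distillation term is combined with the ordinary cross-entropy on the true labels; when the student's logits match the teacher's exactly, the distillation loss goes to zero.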
We’d love to hear from you - just reply on this thread.
- Working on a course or a project? Let us know.
- Came across an interesting article or tutorial? Share it here.
- Are you stuck with something? Ask a question.
Want to get this daily newsletter in your email inbox? Just click the bell icon and select “Watching First Post” on the newsletter homepage.