Hey fam, I was out for a while; sad to see no one picked up the slack. We had a good thing going. Anyway, I've moved on quite a bit since the last "study week", but a dedicated study group, however small, did wonders not only for motivating me but also for clearing the fog when trying to learn new concepts or explain them.

For the sake of continuity, this study week will start from where we left off.

  • Read pages 140 - 165 of the Deep Learning book. Here we'll be introduced to basic supervised and unsupervised learning algos and, most importantly, the workhorse of modern deep learning advances: the Stochastic Gradient Descent algo.
    Make sure to go over it thoroughly.
  • Leave at least one comment below, e.g. a question about something you don't understand or a link to a brilliant resource. It must somehow relate to pages 140 - 165.
  • Go over Andrew Trask's brilliant A Neural Network in 11 lines of Python (Part 1) and
    A Neural Network in 13 lines of Python (Part 2 - Gradient Descent) posts to solidify your concepts for this week and to help prepare you for the next.
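If the SGD pages feel abstract, here's a tiny taste of the idea before you dive in: fit a line to toy data by updating the parameters from one randomly picked example at a time, instead of the whole dataset. This is just my own minimal sketch (not from the book or Trask's posts, and all the variable names are mine); the real thing on pages 140 - 165 covers the why and the caveats.

```python
import random

random.seed(0)

# Toy data: y = 3.0 * x + 0.5, no noise, so SGD should recover w and b.
X = [random.uniform(-1, 1) for _ in range(100)]
y = [3.0 * x + 0.5 for x in X]

w, b = 0.0, 0.0   # parameters we want to learn
lr = 0.1          # learning rate (step size)

for step in range(2000):
    i = random.randrange(len(X))   # the "stochastic" part: one random example
    err = (w * X[i] + b) - y[i]    # prediction error on that single example
    # Gradient of 0.5 * err**2 w.r.t. w and b, via the chain rule:
    w -= lr * err * X[i]
    b -= lr * err

print(w, b)   # should land very close to 3.0 and 0.5
```

The only difference from plain (batch) gradient descent is that each update uses one example's gradient rather than the average over all examples, which is exactly what makes it cheap enough for huge datasets.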

Use your extra time to help answer the questions others leave 🙏; it can go a long way toward improving your own understanding of many concepts, too.
As always, if you're feeling lost mention it in the comments and ask for help!

Enjoy 🍻, see you next week.