Got it! That explanation really helps. Thanks!
I personally didn't understand the figure. What did he mean by "two datapoints shown in blue"? If that means the two datapoints coincide, shouldn't the mean of the MLE be at that point itself?
The examples about over-fitting were really clear, but towards the end of the 28 pages I found the explanation of the Bayesian interpretation and maximum likelihood a bit difficult to follow. I still need to read up on the Bayesian interpretation somewhere else, but as for maximum likelihood, I found the examples section of the corresponding Wikipedia article very illustrative. The rest of the material was pretty much standard for me, since I'm taking a probability course at my college right now. I think I'm missing some takeaway lessons in the maximum likelihood part because I didn't really get the Bayesian part, so I'll read up on that now!
Yeah, I too only recently came to know about Cyc, in the context of real-life examples of Lisp programs. The idea seems really weird (hard-coding thought processes and information), but it's cool that someone actually carried it far enough to be useful to some extent.
The idea of learning computer programs themselves (neural Turing machines) is really exciting! I really liked the motivation for the need for non-linearity in neural networks (the XOR example). The rationale given for the failure of small networks is also illuminating: "In retrospect, it is not particularly surprising that neural networks with fewer neurons than a leech were unable to solve sophisticated artificial intelligence problems".
By the way, I have been hearing about Keras / Tensorflow / Caffe etc. for quite a while, but I'm reluctant to just use them as black boxes. Is it recommended to write a (maybe small) neural network on your own before moving on to these tools? I mean, is it feasible to write something in a short period of time and have it work on some dataset (maybe Iris) to some extent?
Count me in as well.
Great. Count me in as well.