I have been following some material on machine learning and realized that, over time, I have forgotten some of the calculus I learned in college.
I mostly remembered single-variable differentiation, integration, maxima/minima, etc., but could not recollect the precise meaning of limits, continuity, and series convergence, nor the Taylor series and the multivariable calculus topics (partial derivatives, optimization of multivariable functions, Lagrange multipliers, etc.).
This note is the shortcut that I took to quickly recap all of the above.
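Before the reading list, here are the two definitions I found hardest to recollect, written out for quick reference (this is my own summary, not quoted from any of the sources below): the epsilon-delta definition of a limit and the Taylor series of f about a point a.

\lim_{x \to a} f(x) = L \quad \iff \quad \forall \epsilon > 0 \;\exists \delta > 0 : 0 < |x - a| < \delta \implies |f(x) - L| < \epsilon

f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}\,(x - a)^n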
First, I read the following (relevant) chapters and sections from the Calculus textbook by Professor Gilbert Strang.
Section 2.6 (Limits) and Section 2.7 (Continuous Functions)
Chapter 10 (Infinite Series)
Chapter 11 (Vectors and Matrices)
Chapter 13 (Partial Derivatives)
I must say that the book is great: it covers a lot of breadth while maintaining enough depth. However, I could not fully grasp the contents of Chapter 13 (Partial Derivatives), probably because of the 3D diagrams on 2D paper (or maybe I just could not pay enough attention).
To understand this chapter, I followed all the video lectures on Partial Derivatives (Lecture 8 to Lecture 14) from the MIT Multivariable Calculus course. The video lectures are easier to follow, and the diagrams are done in applets, which makes them much easier to understand. Also, the lecture notes give a very good summary of the material covered in the lecture videos.
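For my own future reference, here is a one-line recap of the two key ideas from that chapter (again my summary, not taken from the lectures): the partial derivative differentiates in one variable while holding the others fixed, and constrained extrema of f subject to g = c satisfy the Lagrange multiplier condition.

\frac{\partial f}{\partial x}(x_0, y_0) = \lim_{h \to 0} \frac{f(x_0 + h,\, y_0) - f(x_0, y_0)}{h}

\nabla f = \lambda\, \nabla g \quad \text{at a constrained extremum of } f \text{ subject to } g = c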