My takeaways from the 4th lecture of the Stanford machine learning course.
This lecture starts by explaining Newton's method, as noted in the last post. It then introduces exponential family distributions and shows that the Bernoulli and Gaussian distributions belong to the exponential family. In fact many more distributions, like the Poisson, gamma, beta, multinomial etc., belong to the exponential family as well.
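For reference, in the course's notation a distribution belongs to the exponential family if it can be written as

$$p(y;\eta) = b(y)\,\exp\big(\eta^T T(y) - a(\eta)\big).$$

The Bernoulli case works out like this: for a Bernoulli distribution with mean $\phi$,

$$p(y;\phi) = \phi^y(1-\phi)^{1-y} = \exp\left( y \log\frac{\phi}{1-\phi} + \log(1-\phi) \right),$$

which matches the exponential family form with $T(y) = y$, natural parameter $\eta = \log\frac{\phi}{1-\phi}$, $a(\eta) = \log(1+e^\eta)$ and $b(y) = 1$.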
Then the GLM (Generalized Linear Model) is introduced, which can be used to solve any regression or classification problem as long as P(y|x;$\theta$) has a distribution belonging to the exponential family and certain conditions (listed below) are fulfilled.
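Concretely, the three assumptions the lecture makes to construct a GLM are:

1. $y \mid x;\theta \sim \text{ExponentialFamily}(\eta)$, i.e. the response variable follows some exponential family distribution.
2. The hypothesis predicts the expected value of the response: $h_\theta(x) = E[T(y) \mid x]$ (usually $T(y) = y$).
3. The natural parameter is linear in the inputs: $\eta = \theta^T x$.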
Then the lecturer shows that linear and logistic regression are just two special cases of GLM. He derives the hypothesis functions for linear and logistic regression using GLM: for linear regression P(y|x;$\theta$) is Gaussian, and for logistic regression it is Bernoulli, both of which are exponential family distributions.
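As a quick check of the logistic case: the mean of a Bernoulli distribution is $\phi$, and inverting its natural parameter $\eta = \log\frac{\phi}{1-\phi}$ gives

$$h_\theta(x) = E[y \mid x;\theta] = \phi = \frac{1}{1+e^{-\eta}} = \frac{1}{1+e^{-\theta^T x}},$$

which is exactly the sigmoid hypothesis of logistic regression. Similarly, for the Gaussian case $E[y \mid x;\theta] = \mu = \eta = \theta^T x$, recovering the linear regression hypothesis.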
Then softmax regression is explained and worked out, which is yet another special case of GLM where the response variable has a multinomial distribution. Softmax regression can be used for classification problems where the output/response variable can take more than two values and logistic regression is not appropriate; a small sketch follows.
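To make the softmax hypothesis concrete, here is a minimal NumPy sketch of my own (not from the lecture). Note that it uses one parameter vector per class, a slightly overparameterized variant of the lecture's derivation, which fixes the last class's parameters at zero; `softmax_probs` and the `theta`/`x` values are made up for illustration:

```python
import numpy as np

def softmax_probs(theta, x):
    """Class probabilities under a softmax (multinomial GLM) hypothesis.

    theta: (k, n) parameter matrix, one row of parameters per class.
    x:     (n,)   feature vector.
    Returns a length-k vector of probabilities summing to 1.
    """
    logits = theta @ x           # eta_i = theta_i^T x for each class i
    logits -= logits.max()       # shift logits for numerical stability
    exp_logits = np.exp(logits)
    return exp_logits / exp_logits.sum()

# Hypothetical example: 3 classes, 2 features
theta = np.array([[ 1.0, -0.5],
                  [ 0.2,  0.8],
                  [-1.0,  0.3]])
x = np.array([0.5, 2.0])
print(softmax_probs(theta, x))   # ~ [0.08, 0.76, 0.15]
```

Each entry of the output is $\frac{e^{\theta_i^T x}}{\sum_j e^{\theta_j^T x}}$, the probability the GLM assigns to class $i$, so predicting the most likely class is just an argmax over this vector.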