I'm looking for a brief and concise introductory paper or book that explains the main and most widely used optimization techniques in machine learning. Something equivalent to "Gibbs Sampling for the Uninitiated" and "Bayesian Inference with Tears", but for optimization.
Just read the OpenOpt docs: http://openopt.org/Problems
How about "Optimization for Machine Learning" by Sra, Nowozin, and Wright? There is also "Foundations of Machine Learning", which apparently also covers optimization; I haven't read it yet, though. The Sra, Nowozin, and Wright book is great for someone interested in researching optimization for machine learning who wants a peek at some deep places where interesting ideas lie. It is not really an introduction, and it assumes a lot of familiarity with the topic.
— Alexandre Passos ♦ (Sep 27 '12 at 09:30)
Mark Schmidt wrote a very nice paper listing the most common methods used in machine learning and the links between them. I strongly encourage you, and anyone else, to read it.
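Since the question is about the most used techniques: nearly all of the methods in these references build on plain gradient descent, so here is a minimal sketch of it (the quadratic objective and step size below are just illustrative assumptions, not from any of the cited texts):

```python
# Minimal gradient-descent sketch: minimize f(w) = (w - 3)^2,
# whose gradient is f'(w) = 2 * (w - 3). The minimizer is w = 3.

def gradient_descent(grad, w0, lr=0.1, steps=100):
    """Run `steps` iterations of w <- w - lr * grad(w) from w0."""
    w = w0
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

w_star = gradient_descent(lambda w: 2.0 * (w - 3.0), w0=0.0)
print(round(w_star, 4))  # converges toward the minimizer w = 3
```

The more sophisticated methods surveyed in these books (stochastic gradient, conjugate gradient, quasi-Newton, etc.) mostly refine how the step direction and step size are chosen.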
Complementing Nick's answer, it might help to hear these algorithms explained out loud, and for that I strongly recommend Stephen Wright's NIPS tutorial on optimization in ML.
Smola's machine learning book has a good self-contained chapter on optimization.
I'm going to answer my own question and add some useful resources that I found over the last day.