This course gives a detailed introduction to learning theory, with a focus on the classification problem. It will be shown how to obtain (probabilistic) bounds on the generalization error for certain types of algorithms. The main themes will be:

* probabilistic inequalities and concentration inequalities
* union bounds and chaining
* measuring the size of a function class: Vapnik–Chervonenkis dimension, shattering dimension, and Rademacher averages
* classification with real-valued functions

Some knowledge of probability theory is helpful but not required, since the main tools will be introduced.
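As a small illustration of the concentration-inequality theme, the sketch below compares the probability that an empirical mean of Bernoulli samples deviates from its true mean against the bound given by Hoeffding's inequality. The sample size, bias, and threshold are illustrative choices, not values from the course:

```python
import random
import math

def hoeffding_bound(n, eps):
    # Hoeffding's inequality for [0, 1]-valued i.i.d. samples:
    # P(|empirical mean - true mean| >= eps) <= 2 * exp(-2 * n * eps^2)
    return 2 * math.exp(-2 * n * eps * eps)

def empirical_deviation(n, p, eps, trials=20000, seed=0):
    # Monte Carlo estimate of P(|empirical mean - p| >= eps)
    # for n i.i.d. Bernoulli(p) samples.
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.random() < p for _ in range(n)) / n
        if abs(mean - p) >= eps:
            hits += 1
    return hits / trials

n, p, eps = 100, 0.5, 0.1  # illustrative parameters
print(empirical_deviation(n, p, eps))  # observed deviation frequency
print(hoeffding_bound(n, eps))         # Hoeffding upper bound, 2e^{-2} ~ 0.271
```

The observed frequency sits well below the bound, which is expected: Hoeffding's inequality holds for every distribution on [0, 1] and is therefore loose for any particular one.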