This course gives a detailed introduction to learning theory, with a focus on the classification problem. It shows how to obtain (probabilistic) bounds on the generalization error for certain types of algorithms. The main themes are:

* probabilistic inequalities and concentration inequalities
* union bounds and chaining
* measuring the size of a function class: Vapnik-Chervonenkis dimension, shattering dimension, and Rademacher averages
* classification with real-valued functions

Some knowledge of probability theory is helpful but not required, since the main tools will be introduced.
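As a minimal illustration of how concentration inequalities and union bounds combine into a generalization-error bound (the first two themes above), the following Python sketch computes the classical Hoeffding-plus-union-bound guarantee for a finite hypothesis class. The function name and parameters are hypothetical, chosen for this example; they are not part of the course materials.

```python
import math

def hoeffding_generalization_bound(empirical_error, n, num_hypotheses, delta):
    """Uniform bound for a finite class H: with probability at least
    1 - delta, every h in H satisfies
        true_error(h) <= empirical_error(h) + sqrt(log(|H|/delta) / (2n)).
    Hoeffding's inequality bounds the deviation for one fixed h; a union
    bound over the |H| hypotheses makes the guarantee hold uniformly."""
    slack = math.sqrt(math.log(num_hypotheses / delta) / (2 * n))
    return empirical_error + slack

# Example: 1000 samples, a class of 50 classifiers, confidence 95%.
bound = hoeffding_generalization_bound(0.08, n=1000, num_hypotheses=50, delta=0.05)
```

Note how the slack term grows only logarithmically in the class size but shrinks as the square root of the sample size, which is the basic shape of the bounds studied in the course.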
M. Brennan and G. Bresler (2020). arXiv:2005.08099. Comment: 175 pages; subsumes preliminary draft arXiv:1908.06130; accepted for presentation at the Conference on Learning Theory (COLT) 2020.
S. Mukhopadhyay and K. Wang (2020). arXiv:2004.09588. Comment: We'd love to hear your feedback. Email us. (We thank those who have already sent us their comments.)