This course will give a detailed introduction to learning theory with a focus on the classification problem. It will be shown how to obtain (probabilistic) bounds on the generalization error for certain types of algorithms. The main themes will be:

* probabilistic inequalities and concentration inequalities
* union bounds, chaining
* measuring the size of a function class: Vapnik-Chervonenkis dimension, shattering dimension, and Rademacher averages
* classification with real-valued functions

Some knowledge of probability theory would be helpful but is not required, since the main tools will be introduced.
A. Gasnikov, P. Dvurechensky, E. Gorbunov, E. Vorontsova, D. Selikhanovych, and C. Uribe. (2018). cite arxiv:1809.00382. Comment: In the current version we present a translation into English of the main derivations, which first appeared on September 2, 2018 in Russian, extend the analysis from the case of strongly convex objectives to the case of uniformly convex objectives, and add the numerical analysis of our results.
D. Selsam, P. Liang, and D. Dill. (2017). cite arxiv:1706.08605. Comment: To appear at the Thirty-fourth International Conference on Machine Learning (ICML) 2017.
B. Lake, T. Ullman, J. Tenenbaum, and S. Gershman. (2016). cite arxiv:1604.00289. Comment: In press at Behavioral and Brain Sciences. Open call for commentary proposals (until Nov. 22, 2016). https://www.cambridge.org/core/journals/behavioral-and-brain-sciences/information/calls-for-commentary/open-calls-for-commentary.