This course will give a detailed introduction to learning theory, with a focus on the classification problem. It will be shown how to obtain (probabilistic) bounds on the generalization error for certain types of algorithms. The main themes will be:
* probabilistic inequalities and concentration inequalities
* union bounds and chaining
* measuring the size of a function class: Vapnik-Chervonenkis dimension, shattering dimension, and Rademacher averages
* classification with real-valued functions
Some knowledge of probability theory would be helpful but is not required, since the main tools will be introduced.
V. Papyan, J. Sulam, and M. Elad. (2017). cite arxiv:1707.06066. Comment: This is the journal version of arXiv:1607.02005 and arXiv:1607.02009, accepted to IEEE Transactions on Signal Processing.
B. Ghojogh, M. Sikaroudi, H. Tizhoosh, F. Karray, and M. Crowley. (2020). cite arxiv:2004.01857. Comment: Accepted (to appear) in International Conference on Image Analysis and Recognition (ICIAR) 2020, Springer.
C. Blundell, J. Cornebise, K. Kavukcuoglu, and D. Wierstra. Proceedings of the 32nd International Conference on Machine Learning, volume 37 of Proceedings of Machine Learning Research, pages 1613--1622. Lille, France, PMLR, (07--09 Jul 2015).
A. Fisher and E. Kennedy. (2018). cite arxiv:1810.03260. Comment: This manuscript version includes 2 additional supplemental figures to further aid intuition. In total: 4 figures, 36 pages (double spaced).
H. Li, Z. Xu, G. Taylor, C. Studer, and T. Goldstein. (2017). cite arxiv:1712.09913. Comment: NIPS 2018 (extended version, 10.5 pages), code is available at https://github.com/tomgoldstein/loss-landscape.