This course will give a detailed introduction to learning theory, with a focus on the classification problem. It will be shown how to obtain (probabilistic) bounds on the generalization error for certain types of algorithms. The main themes will be:

* probabilistic inequalities and concentration inequalities
* union bounds and chaining
* measuring the size of a function class: Vapnik-Chervonenkis dimension, shattering dimension, and Rademacher averages
* classification with real-valued functions

Some knowledge of probability theory is helpful but not required, since the main tools will be introduced.
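As a small illustration of the Rademacher averages mentioned in the outline above, the empirical Rademacher complexity of a function class on a fixed sample can be estimated by Monte Carlo: draw random signs, take the supremum of the signed empirical average over the class, and average over draws. The sketch below uses a hypothetical sample and the class of one-dimensional threshold classifiers (chosen here for illustration; it is not part of the course text):

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_rademacher(values, n_trials=2000, rng=rng):
    """Monte Carlo estimate of the empirical Rademacher complexity
    R_S(F) = E_sigma[ sup_{f in F} (1/n) * sum_i sigma_i f(x_i) ],
    where values[j, i] = f_j(x_i) for a finite class of functions."""
    _, n = values.shape
    total = 0.0
    for _ in range(n_trials):
        sigma = rng.choice([-1.0, 1.0], size=n)  # Rademacher signs
        total += np.max(values @ sigma) / n      # sup over the (finite) class
    return total / n_trials

# Hypothetical sample of n points, and the class of threshold
# classifiers x -> sign(x - t); on a fixed sample it suffices to
# consider one threshold between each pair of consecutive points.
x = np.sort(rng.normal(size=50))
thresholds = np.concatenate(([-np.inf], (x[:-1] + x[1:]) / 2, [np.inf]))
values = np.sign(x[None, :] - thresholds[:, None])
values[values == 0] = 1.0

print(empirical_rademacher(values))
```

Since the threshold class has VC dimension 1, the estimate shrinks roughly like a constant times sqrt(log(n)/n) as the sample size n grows, in line with the VC-dimension bounds covered in the course.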
S. Chen, E. Dobriban, and J. Lee (2019). arXiv:1907.10905. Comment: Changed title. Added results on overparametrized 2-layer nets. Added error bars to experiments. Numerous other minor improvements.

J. Negrea, M. Haghifam, G. Dziugaite, A. Khisti, and D. Roy (2019). arXiv:1911.02151. Comment: 23 pages, 1 figure. To appear in Advances in Neural Information Processing Systems (33), 2019.

J. Frankle, D. Schwab, and A. Morcos (2020). arXiv:2002.10365. Comment: ICLR 2020 camera ready. Available on OpenReview at https://openreview.net/forum?id=Hkl1iRNFwS.
K. Lee, K. Lee, J. Shin, and H. Lee (2019). arXiv:1910.05396. Comment: Accepted at ICLR 2020 and the NeurIPS Workshop on Deep RL 2019. The first two authors contributed equally.
G. Dziugaite and D. Roy (2017). arXiv:1703.11008. Comment: 14 pages, 1 table, 2 figures. Corresponds with the UAI camera ready and supplement. Includes additional references and related experiments.

S. Mei and A. Montanari (2019). arXiv:1908.05355. Comment: We added two sections in version 3. One section provides the precise asymptotics of the training error. The other section describes a Gaussian covariate model, which gives the same asymptotic test error as the random features model.

J. Frankle, G. Dziugaite, D. Roy, and M. Carbin (2019). arXiv:1912.05671. Comment: This submission subsumes 1903.01611 ("Stabilizing the Lottery Ticket Hypothesis" and "The Lottery Ticket Hypothesis at Scale").