This course will give a detailed introduction to learning theory with a focus on the classification problem. It will be shown how to obtain (probabilistic) bounds on the generalization error for certain types of algorithms. The main themes will be:

* probabilistic inequalities and concentration inequalities
* union bounds, chaining
* measuring the size of a function class: Vapnik-Chervonenkis dimension, shattering dimension, and Rademacher averages
* classification with real-valued functions

Some knowledge of probability theory would be helpful but is not required, since the main tools will be introduced.
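As an illustrative sketch of the kind of guarantee these tools yield (a standard textbook result, not taken from the course materials): combining Hoeffding's inequality with a union bound over a finite function class \mathcal{F} shows that, with probability at least 1 - \delta, every classifier f \in \mathcal{F} simultaneously satisfies

\[
R(f) \;\le\; \widehat{R}_n(f) + \sqrt{\frac{\log|\mathcal{F}| + \log(1/\delta)}{2n}},
\]

where R(f) is the true generalization error and \widehat{R}_n(f) is the empirical error on n i.i.d. training samples. The Vapnik-Chervonenkis dimension and Rademacher averages listed above extend this argument to infinite function classes.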
J. Huggins, M. Kasprzak, T. Campbell, and T. Broderick. (2019). arXiv:1910.04102. Comment: A Python package for carrying out our validated variational inference workflow -- including doing black-box variational inference and computing the bounds we develop in this paper -- is available at https://github.com/jhuggins/viabel. The same repository also contains code for reproducing all of our experiments.
D. Diochnos, S. Mahloujifar, and M. Mahmoody. (2018). arXiv:1810.12272. Comment: Full version of a work with the same title that will appear in NIPS 2018; 31 pages containing 5 figures, 1 table, and 2 algorithms.
J. Negrea, M. Haghifam, G. Dziugaite, A. Khisti, and D. Roy. (2019). arXiv:1911.02151. Comment: 23 pages, 1 figure. To appear in Advances in Neural Information Processing Systems (33), 2019.