This course will give a detailed introduction to learning theory, with a focus on the classification problem. It will be shown how to obtain (probabilistic) bounds on the generalization error for certain types of algorithms. The main themes will be:

* probabilistic inequalities and concentration inequalities
* union bounds and chaining
* measuring the size of a function class: Vapnik–Chervonenkis dimension, shattering dimension, and Rademacher averages
* classification with real-valued functions

Some knowledge of probability theory would be helpful but is not required, since the main tools will be introduced.
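As a minimal illustration of the first theme, the sketch below empirically checks Hoeffding's inequality, a basic concentration inequality: for n i.i.d. [0,1]-valued samples, P(|empirical mean − true mean| ≥ t) ≤ 2·exp(−2nt²). The function names and the choice of Bernoulli(1/2) samples are illustrative, not from the course materials.

```python
import math
import random

def hoeffding_bound(n, t):
    """Hoeffding's upper bound on P(|mean - p| >= t) for n i.i.d. [0,1]-valued samples."""
    return 2 * math.exp(-2 * n * t * t)

def empirical_deviation_rate(n, t, p=0.5, trials=2000, seed=0):
    """Fraction of trials in which the empirical mean of n Bernoulli(p)
    draws deviates from p by at least t."""
    rng = random.Random(seed)
    bad = 0
    for _ in range(trials):
        mean = sum(rng.random() < p for _ in range(n)) / n
        if abs(mean - p) >= t:
            bad += 1
    return bad / trials

n, t = 100, 0.1
# The observed deviation frequency should lie below the Hoeffding bound
# (here 2*exp(-2) ~ 0.27), typically by a wide margin.
print(empirical_deviation_rate(n, t), "<=", hoeffding_bound(n, t))
```

The gap between the simulated frequency and the bound reflects that Hoeffding holds for every distribution on [0,1], so it is loose for any particular one.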
On April 25, 1903, Soviet mathematician Andrey Nikolaevich Kolmogorov was born. He was one of the most important mathematicians of the 20th century, who advanced various scientific fields, among them probability theory, topology, intuitionistic logic, turbulence, classical mechanics, algorithmic information theory and computational complexity.
The EDRL research group is organized around a theoretical strand (embodied cognition), a methodological line (design-based research), and a disciplinary emphasis (mathematics). The laboratory thus hosts the full cycle of design-research projects geared to contribute to the theory and practice of multimodal mathematical learning and reasoning, as well as to design theory.
M. Raginsky and I. Sason (2012). cite arxiv:1212.4663. Comment: Foundations and Trends in Communications and Information Theory, vol. 10, no. 1-2, pp. 1-248, 2013. A second edition was published in October 2014. ISBN of the printed book: 978-1-60198-906-2.
A. Gorban and I. Tyukin (2018). cite arxiv:1801.03421. Comment: Accepted for publication in Philosophical Transactions of the Royal Society A, 2018. Comprises 17 pages and 4 figures.
K. Kawaguchi, L. Kaelbling, and Y. Bengio (2017). cite arxiv:1710.05468. Comment: To appear in Mathematics of Deep Learning, Cambridge University Press. All previous results remain unchanged.