This course will give a detailed introduction to learning theory, with a focus on the classification problem. It will be shown how to obtain (probabilistic) bounds on the generalization error for certain types of algorithms. The main themes will be:

* probabilistic inequalities and concentration inequalities
* union bounds, chaining
* measuring the size of a function class: Vapnik-Chervonenkis dimension, shattering dimension, and Rademacher averages
* classification with real-valued functions

Some knowledge of probability theory would be helpful but is not required, since the main tools will be introduced.
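As a small illustration of the first two themes, a concentration inequality (Hoeffding's) combined with a union bound yields a uniform generalization bound for a finite hypothesis class: with probability at least 1 − δ, every one of N hypotheses has |empirical risk − true risk| ≤ √(ln(2N/δ) / (2n)) on n i.i.d. samples. The sketch below is illustrative and not part of the course materials; the function name and parameters are chosen for this example.

```python
import math
import random

def hoeffding_union_bound(n, num_hypotheses, delta):
    """Uniform deviation bound for a finite hypothesis class.

    With probability >= 1 - delta over n i.i.d. samples, every one of
    num_hypotheses hypotheses satisfies
        |empirical risk - true risk| <= returned value,
    by Hoeffding's inequality plus a union bound over the class.
    """
    return math.sqrt(math.log(2 * num_hypotheses / delta) / (2 * n))

# Monte Carlo sanity check: 50 hypotheses, each with a fixed true error
# rate; the maximal empirical-vs-true gap should exceed the bound in at
# most (roughly) a delta fraction of independent trials.
random.seed(0)
n, m, delta = 1000, 50, 0.05
true_errors = [random.random() for _ in range(m)]
bound = hoeffding_union_bound(n, m, delta)

trials, violations = 100, 0
for _ in range(trials):
    max_gap = max(
        abs(sum(random.random() < p for _ in range(n)) / n - p)
        for p in true_errors
    )
    if max_gap > bound:
        violations += 1
```

Note that the bound shrinks at the rate 1/√n in the sample size but only logarithmically in the size of the class; replacing that logarithm for infinite classes is exactly what VC dimension and Rademacher averages are for.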
The EDRL research group is organized around a theoretical strand (embodied cognition), a methodological line (design-based research), and a disciplinary emphasis (mathematics). The laboratory thus hosts the full cycle of design-research projects geared to contribute to the theory and practice of multimodal mathematical learning and reasoning, as well as to design theory.
K. Kawaguchi, L. Kaelbling, and Y. Bengio (2017). arXiv:1710.05468. Comment: To appear in Mathematics of Deep Learning, Cambridge University Press. All previous results remain unchanged.