This course will give a detailed introduction to learning theory with a focus on the classification problem. It will be shown how to obtain (probabilistic) bounds on the generalization error for certain types of algorithms. The main themes will be:

* probabilistic inequalities and concentration inequalities
* union bounds, chaining
* measuring the size of a function class: Vapnik–Chervonenkis dimension, shattering dimension, and Rademacher averages
* classification with real-valued functions

Some knowledge of probability theory is helpful but not required, since the main tools will be introduced.
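As a small taste of the first two themes above, here is a minimal sketch of the classic finite-class generalization bound obtained by combining Hoeffding's concentration inequality with a union bound: with probability at least 1 − δ, every hypothesis h in a finite class H satisfies |R(h) − R̂(h)| ≤ √(ln(2|H|/δ) / (2n)). The function name and the example numbers are illustrative, not from the course.

```python
import math

def hoeffding_union_bound(num_hypotheses: int, n: int, delta: float) -> float:
    """Uniform deviation bound for a finite hypothesis class H.

    With probability >= 1 - delta over an i.i.d. sample of size n,
    every h in H satisfies
        |R(h) - R_hat(h)| <= sqrt(ln(2*|H|/delta) / (2*n)).
    Hoeffding bounds each hypothesis individually; the union bound
    over the |H| hypotheses contributes the ln(|H|) term.
    """
    return math.sqrt(math.log(2 * num_hypotheses / delta) / (2 * n))

# Illustrative numbers: 1000 hypotheses, 10,000 samples, 95% confidence.
eps = hoeffding_union_bound(1000, 10_000, 0.05)
```

Note how weakly the bound depends on |H| (logarithmically) versus the sample size n (inverse square root); the VC dimension and Rademacher averages in the later themes replace ln(|H|) for infinite classes.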
K. Kawaguchi, L. Kaelbling, and Y. Bengio (2017). arXiv:1710.05468. Comment: To appear in Mathematics of Deep Learning, Cambridge University Press. All previous results remain unchanged.
L. Lee. Approaches to algebra: perspectives for research and teaching. Kluwer Academic Publishers, p. 102:
… it is much of a challenge to demonstrate that functions, modelling, and problem solving are all types of generalizing activities, that algebra and indeed all of mathematics is about generalizing patterns.
p. 103:
The history of the science of algebra is the story of the growth of a technique for representing finite patterns.
The notion of the importance of pattern is as old as civilization. Every art is founded on the study of patterns.
Mathematics is the most powerful technique for the understanding of pattern, and for the analysis of the relationships of patterns. (1996)