This course will give a detailed introduction to learning theory, with a focus on the classification problem. It will be shown how to obtain (probabilistic) bounds on the generalization error for certain types of algorithms. The main themes will be:

* probabilistic inequalities and concentration inequalities
* union bounds and chaining
* measuring the size of a function class: Vapnik-Chervonenkis dimension, shattering dimension, and Rademacher averages
* classification with real-valued functions

Some knowledge of probability theory would be helpful but is not required, since the main tools will be introduced.
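As a small taste of the first theme, the sketch below empirically checks a basic concentration inequality, Hoeffding's bound, which states that for the mean of n i.i.d. [0,1]-valued variables, P(|mean − p| ≥ ε) ≤ 2·exp(−2nε²). This is an illustrative example, not material from the course itself; the parameter values are arbitrary.

```python
import math
import random

def hoeffding_bound(n, eps):
    """Two-sided Hoeffding bound for the mean of n i.i.d. [0,1]-valued variables."""
    return 2 * math.exp(-2 * n * eps * eps)

def empirical_deviation(p, n, eps, trials=20000, seed=0):
    """Fraction of trials in which the sample mean of n Bernoulli(p)
    draws deviates from p by at least eps."""
    rng = random.Random(seed)
    bad = 0
    for _ in range(trials):
        mean = sum(rng.random() < p for _ in range(n)) / n
        if abs(mean - p) >= eps:
            bad += 1
    return bad / trials

if __name__ == "__main__":
    n, eps, p = 100, 0.1, 0.5
    emp = empirical_deviation(p, n, eps)
    bound = hoeffding_bound(n, eps)
    # The empirical deviation frequency should stay below the bound.
    print(f"empirical: {emp:.4f}  Hoeffding bound: {bound:.4f}")
```

Bounds of this form, combined with a union bound over a (suitably measured) function class, are the basic mechanism behind the generalization-error bounds treated in the course.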