This course gives a detailed introduction to learning theory, with a focus on the classification problem. It shows how to obtain (probabilistic) bounds on the generalization error for certain types of algorithms. The main themes are:

* probabilistic inequalities and concentration inequalities
* union bounds and chaining
* measuring the size of a function class: Vapnik-Chervonenkis dimension, shattering dimension, and Rademacher averages
* classification with real-valued functions

Some knowledge of probability theory is helpful but not required, since the main tools will be introduced.
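The first two themes can be made concrete with a small sketch. Hoeffding's inequality says that for a single fixed classifier evaluated on n i.i.d. samples (with loss in [0, 1]), its true error deviates from its empirical error by more than sqrt(ln(2/δ)/(2n)) with probability at most δ; a union bound extends this uniformly over a finite class of M classifiers by replacing δ with δ/M. The function name below is illustrative, not part of the course material:

```python
import math

def hoeffding_bound(n, delta, num_hypotheses=1):
    """Deviation bound from Hoeffding's inequality plus a union bound.

    With probability at least 1 - delta, every one of `num_hypotheses`
    fixed classifiers satisfies
        |true error - empirical error| <= returned value,
    given n i.i.d. samples and losses bounded in [0, 1].
    """
    return math.sqrt(math.log(2 * num_hypotheses / delta) / (2 * n))

# One classifier, 1000 samples, 95% confidence:
eps_single = hoeffding_bound(n=1000, delta=0.05)

# Uniformly over a finite class of 100 classifiers (union bound):
eps_class = hoeffding_bound(n=1000, delta=0.05, num_hypotheses=100)

print(f"single: {eps_single:.4f}, class of 100: {eps_class:.4f}")
```

Note how the class size M enters only logarithmically: this is the basic observation that the VC dimension and Rademacher averages generalize to infinite function classes.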
What is Modeling Instruction? Instead of relying on lectures and textbooks, the Modeling Instruction program emphasizes active student construction of conceptual and mathematical models in an interactive learning community. Students work through simple scenarios to learn to model the physical world. Modeling Instruction in the…
While machine learning has a rich history dating back to 1959, the field is evolving at an unprecedented rate. In a recent article, I discussed why the broader artificial intelligence field is…
M. Carlson, M. Oehrtman, and P. Thompson. Making the Connection: Research and Practice in Undergraduate Mathematics, Mathematical Association of America, Washington, DC, (2008)
A. Waraich. ITiCSE '04: Proceedings of the 9th Annual SIGCSE Conference on Innovation and Technology in Computer Science Education, pages 97-101. New York, NY, USA, ACM, (2004)
C. Bergsten. Proceedings of the 30th Conference of the International Group for the Psychology of Mathematics Education, 2, pages 153-160. Prague, Czech Republic, July 16-21, (2006)