This course gives a detailed introduction to learning theory, with a focus on the classification problem. It will be shown how to obtain (probabilistic) bounds on the generalization error for certain types of algorithms. The main themes will be:

* probabilistic inequalities and concentration inequalities
* union bounds and chaining
* measuring the size of a function class: Vapnik–Chervonenkis dimension, shattering dimension, and Rademacher averages
* classification with real-valued functions

Some knowledge of probability theory is helpful but not required, since the main tools will be introduced.
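To give a concrete feel for one of the themes above, here is a minimal sketch of estimating the *empirical* Rademacher average of a small finite function class by Monte Carlo. The class of threshold classifiers and the sample points are hypothetical choices made for illustration; the estimator itself is just the definition, with the expectation over random signs replaced by an average over random draws.

```python
import random

def empirical_rademacher(function_outputs, n_trials=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher average of a
    finite function class, given each function's outputs (+1/-1) on a
    fixed sample of size n:
        R_hat = E_sigma [ sup_f (1/n) * sum_i sigma_i * f(x_i) ]
    """
    rng = random.Random(seed)
    n = len(function_outputs[0])
    total = 0.0
    for _ in range(n_trials):
        # draw i.i.d. Rademacher signs sigma_i in {-1, +1}
        sigma = [rng.choice((-1, 1)) for _ in range(n)]
        # take the sup over the (finite) class of the empirical correlation
        total += max(
            sum(s * o for s, o in zip(sigma, outs)) / n
            for outs in function_outputs
        )
    return total / n_trials

# Hypothetical example: threshold classifiers f_t(x) = sign(x - t),
# evaluated on the sample x = 1..8, with thresholds t = 0..8.
xs = list(range(1, 9))
classes = [[1 if x > t else -1 for x in xs] for t in range(9)]
print(empirical_rademacher(classes))
```

Since there are only 9 distinct output vectors here, Massart's finite-class lemma bounds the average by sqrt(2 ln 9 / 8) ≈ 0.74, which the Monte Carlo estimate respects.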
Five years ago, around Christmas 2012, I wrote an article about Cynefin, the sensemaking framework. I focused on software development, since that was the main industry I worked in, and in particular on using the framework to work out which of our requirements were complex, so that we could embrace uncertainty and risk, and avoid…
This study looks at the teaching of science and the important role language plays in it. It shows how students build a taxonomy of language, and explores how students then used language as a resource to unlock meaning and to decide upon an appropriate answer. The conclusion focuses on the implications for teaching science.