It never bodes well to dive into the unknown without preparation. To define, design, and enable learning analytics, it's essential to have a clear strategy in place. Prepare yourself with these evaluation questions before you dive in.
Game Learning Analytics (GLA) is the process of applying Learning Analytics techniques to Serious Games in order to gain insight into how a game is being used and to improve the educational experience.
For about 10 years, from 2005 to 2015, much of the discussion about tracking eLearning revolved around the Shareable Content Object Reference Model (SCORM) and learning management systems (LMS).
As Massive Open Online Courses (MOOCs) generate a huge amount of learning activity data through their thousands of users, this data offers great potential for understanding and optimizing the learning experience and its outcomes.
I'll start this article by making one simple statement: Feedback loops work. Why? That’s the way we human beings learn, as feedback provides us with a sense of where we stand and an evaluation of our progress.
An interesting question arose at a recent xAPI Camp hosted by The eLearning Guild: “What happened to objectives in xAPI?” We should be able to use xAPI to document successful completion of eLearning, but without statements of learning objectives in the content, this is not possible.
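One way to keep objectives visible in tracked data is to encode them directly in the statement itself. The sketch below is a minimal, hypothetical xAPI statement recording successful completion; the actor, activity IRI, and names are placeholder assumptions, not identifiers from any real LRS.

```python
import json

# Minimal xAPI statement recording successful completion of an activity.
# The actor and activity IRI below are hypothetical placeholders.
statement = {
    "actor": {
        "mbox": "mailto:learner@example.com",
        "name": "Example Learner",
    },
    "verb": {
        # "completed" is a standard verb from the ADL vocabulary.
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/activities/module-1",
        "definition": {
            "name": {"en-US": "Module 1"},
            # Typing the activity as an objective is one way to keep
            # learning objectives explicit in the tracked content.
            "type": "http://adlnet.gov/expapi/activities/objective",
        },
    },
    "result": {"success": True, "completion": True},
}

print(json.dumps(statement, indent=2))
```

Without such an explicit link between the statement and a stated objective, the LRS records only that *something* was completed, which is the gap the camp discussion identified.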
Now that “the only constant is change” in society, our capacity to engage with novel challenges is of first-order importance. What are the personal dispositions that authentic learning needs to cultivate, and can we make these assessable and visible to learners and educators?
Recommender systems provide users with content they might be interested in. Conventionally, recommender systems are evaluated mostly with prediction accuracy metrics. But the ultimate goal of a recommender system is to increase user satisfaction.
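As a toy illustration of the conventional approach, here is a sketch computing RMSE, a typical prediction accuracy metric, on hypothetical made-up ratings. A low RMSE says nothing directly about whether users are satisfied, which is the gap the abstract points to.

```python
import math

# Hypothetical actual vs. predicted ratings for five user-item pairs.
actual = [4.0, 3.0, 5.0, 2.0, 4.5]
predicted = [3.8, 3.4, 4.6, 2.5, 4.0]

# Root-mean-square error: the classic prediction accuracy metric.
rmse = math.sqrt(
    sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)
)
print(f"RMSE: {rmse:.3f}")
```

Evaluating satisfaction instead requires signals beyond rating error, such as user studies or behavioral measures, which accuracy metrics like the one above cannot capture.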
Seldom has a law been as dysfunctional as the ancillary copyright for press publishers (Leistungsschutzrecht). The German federal government refuses to admit this, because it wants to introduce the law across the entire EU.
Most institutions say they value teaching. But how they assess it tells a different story. The University of Southern California has stopped using student evaluations of teaching in promotion decisions in favor of a peer-review model. Oregon seeks to replace quantitative evaluations of teaching with a holistic model.
Presentation used by Tinne De Laet, KU Leuven, for a keynote during an event organised by Leiden University, Erasmus University Rotterdam, and Delft University of Technology.
The presentation presents the results of two case studies from the Erasmus+ projects ABLE and STELA, and provides nine recommendations regarding learning analytics.
Wondering why interpreting learning analytics is vital to eLearning? Find out why it matters when you design or refine eLearning.
The e-Design Assessment Tool (eDAT) is a tool to help tutors represent and evaluate effective blended or distance learning designs. The eDAT combines a simple analysis of the learning activities with reflections on the teaching and learning perspective that underpins the design.
R. Sebastiani, P. Giorgini, and J. Mylopoulos. 16th International Conference on Advanced Information Systems Engineering, volume 3084 of Lecture Notes in Computer Science, pages 20--35. Riga, Latvia, (June 2004)
T. Joachims. Proceedings of the 17th International Conference on Machine Learning (ICML 2000), June 29-July 2, 2000, Stanford, CA, USA, pages 431--438. Morgan-Kaufman Publishers, San Francisco, CA, USA, (2000)
D. Lewis. Proceedings of Speech and Natural Language Workshop, Feb 1991, Asilomar, CA, USA, pages 312--318. Morgan-Kaufman Publishers, San Francisco, CA, USA, (1991)
R. Mihalcea, T. Chklovski, and A. Kilgarriff. Proceedings of SENSEVAL-3: Third International Workshop on the Evaluation of Systems for the Semantic Analysis of Text CD-ROM, pages 25--28. (2004)
L. Albert. Proceedings of the Ninth Conference on Foundations of Software Technology and Theoretical Computer Science, pages 223--241. London, UK, Springer-Verlag, (1989)
Y. Yang and X. Liu. SIGIR '99: Proceedings of the 22nd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, pages 42--49. New York, NY, USA, ACM Press, (1999)