The Experience API (xAPI) allows us to collect data about any type of learning experience or activity, but does that mean we should? Should we generate massive amounts of xAPI data for every possible type of interaction and then expect to make sense of it all later? This approach can be costly, both in data storage and in your time.
Sunday Blake dives into the latest in learning analytics and engagement data, and asks how universities can act upon it to make our interactions with students more human.
In this follow-up episode on learning analytics, Marius Wehner and Lynn Schmodde from the Faculty of Business Administration and Economics at Heinrich Heine University Düsseldorf explain the collaborative project Fair Enough. On the fairness of learning analytics systems, they present empirical evaluation results from various stakeholder groups and offer an outlook on future developments. The interviewer in episode 12 of the DINItus podcast is Erik Reidt from the ZIM/Multimediazentrum at HHU Düsseldorf.
Artificial intelligence in higher education isn't without its risks. Here are three possible trouble spots for the use of AI. Elana Zeide is Associate Professor of Law at the University of Nebraska.
Natercia Valle tells a cautionary tale about the use of learning analytics dashboards to increase student motivation, and the challenges of translating theory into design solutions.
March 20, 2022
How can product developers use data analytics to improve products, prove their effectiveness, and increase the fidelity of implementation? Learn more in the latest Nexus story by Rachel Schechter.
According to The Kirkpatrick Model, Level 3: Behavior is the degree to which participants apply what they learned during training when they are back on the job. The prevailing belief that a Level 3 plan for post-training support and accountability is difficult, expensive and out of training’s purview is untrue. Here are the deceptively simple steps to create learning experiences with true value.
The Experience API (xAPI for short) is far more than just an update to SCORM, the popular standard for tracking data from a learning management system. xAPI opens up a whole new world of possibilities for learning analytics. Examples of what real organizations are doing with it in real-life situations make it easier to grasp the scale of this advance and apply the learnings to your own situation.
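At its core, an xAPI record is a JSON "statement" built around an actor–verb–object triple, sent to a Learning Record Store. As a rough sketch (the learner, email, and course URL below are hypothetical; the verb id is from the ADL verb registry), a minimal statement could be assembled like this:

```python
import json

def make_statement(actor_name, actor_email, verb_id, verb_display,
                   object_id, object_name):
    """Build a minimal xAPI statement: who did what to which activity."""
    return {
        "actor": {
            "objectType": "Agent",
            "name": actor_name,
            "mbox": f"mailto:{actor_email}",
        },
        "verb": {
            "id": verb_id,
            "display": {"en-US": verb_display},
        },
        "object": {
            "objectType": "Activity",
            "id": object_id,
            "definition": {"name": {"en-US": object_name}},
        },
    }

# Hypothetical example: "Sally Learner completed Safety 101"
stmt = make_statement(
    "Sally Learner", "sally@example.com",
    "http://adlnet.gov/expapi/verbs/completed", "completed",
    "https://example.com/courses/safety-101", "Safety 101",
)
print(json.dumps(stmt, indent=2))
```

Because the statement is plain JSON, any activity (a simulation step, a job aid lookup, an on-the-job observation) can be tracked the same way, which is what takes xAPI beyond LMS-bound SCORM tracking.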
There’s no question that the shift to remote and flexible learning has highlighted the importance of technology in education, but at the same time, this shift has also complicated some key aspects of a teacher’s job.
Jisc has been supporting seven research projects in learning analytics at UK universities over the past year. These have been in the areas of curriculum analytics, mental health and wellbeing, and the evaluation of institutional learning analytics projects. Join us to hear the projects present their interesting findings.
It never bodes well to dive into the unknown without preparation. To define, design and enable learning analytics, it’s essential to have a clear strategy in place. Prep yourself with these evaluation questions before you dive into learning analytics.
Game Learning Analytics (GLA) is the process of applying learning analytics techniques to serious games in order to gain insight into how a game is being used and to improve the educational experience.
For about 10 years, from 2005 – 2015, much of the discussion about tracking eLearning revolved around the Shareable Content Object Reference Model (SCORM) and learning management systems (LMS).
As Massive Open Online Courses (MOOCs) generate a huge amount of learning activity data from their thousands of users, this data offers great potential for understanding and optimizing the learning experience and outcomes.
I'll start this article by making one simple statement: Feedback loops work. Why? That’s the way we human beings learn, as feedback provides us with a sense of where we stand and an evaluation of our progress.
An interesting question arose at a recent xAPI Camp hosted by The eLearning Guild: “What happened to objectives in xAPI?” We should be able to use xAPI to document successful completion of eLearning, but without statements of learning objectives in the content, this is not possible.
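One way objectives can surface in xAPI is through the statement's optional "result" field, which records success, completion, and score alongside the objective activity. A hedged illustration (the objective id and score are hypothetical; the verb and the objective activity type are from the ADL registries):

```python
import json

# Sketch: recording the outcome of a learning objective in an xAPI statement,
# using the "result" field to capture success, completion, and a scaled score.
statement = {
    "actor": {"objectType": "Agent", "mbox": "mailto:learner@example.com"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/passed",
        "display": {"en-US": "passed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://example.com/objectives/identify-hazards",
        "definition": {
            # ADL activity type marking this activity as an objective
            "type": "http://adlnet.gov/expapi/activities/objective",
            "name": {"en-US": "Identify workplace hazards"},
        },
    },
    "result": {
        "success": True,
        "completion": True,
        "score": {"scaled": 0.9},
    },
}
print(json.dumps(statement))
```

The catch raised at the camp stands: if the content never issues statements against objective activities like this, there is nothing for a report to roll up.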
Now that "the only constant is change" in society, our capacity to engage with novel challenges is of first-order importance. What are the personal dispositions that authentic learning needs to cultivate, and can we make these assessable and visible to learners and educators?
Recommender systems provide users with content they might be interested in. Conventionally, recommender systems are evaluated mostly with prediction accuracy metrics. But the ultimate goal of a recommender system is to increase user satisfaction.
Presentation used by Tinne De Laet, KU Leuven, for a keynote during an event organised by Leiden University, Erasmus University Rotterdam, and Delft University of Technology.
The presentation presents the results of two case studies from the Erasmus+ projects ABLE and STELA, and provides nine recommendations regarding learning analytics.
Wondering why interpreting learning analytics is vital to eLearning? Discover why it matters when you design or refine eLearning.
This presentation explores shortcomings of learning analytics that hinder wide adoption in educational organisations. It is NOT about ethics and privacy; rather, it focuses on the shortcomings of learning analytics for teachers and students in the classroom (micro-level).
Not all learning analytics are the same. Discover how proactive learning analytics help you influence and improve ongoing learning processes by predicting the future and creating recommendations for action. Identify the 4 key elements that will determine the success of your analytics journey.