Published in

Springer Verlag, Lecture Notes in Computer Science, pp. 183-197

DOI: 10.1007/978-3-319-42706-5_14

Knowledge acquisition for learning analytics: comparing teacher-derived, algorithm-derived, and hybrid models in the Moodle Engagement Analytics Plugin

This paper was not found in any repository, but could be made available legally by the author.

Full text: Unavailable

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving forbidden
Data provided by SHERPA/RoMEO

Abstract

One of the promises of big data in higher education (learning analytics) is being able to accurately identify and assist students who may not be engaging as expected. These expectations, distilled into parameters for learning analytics tools, can be determined by human teacher experts or by algorithms themselves. However, little work has been done to compare the power of knowledge models acquired from teachers and from algorithms. In the context of an open source learning analytics tool, the Moodle Engagement Analytics Plugin, we examined the ability of teacher-derived models to accurately predict student engagement and performance, compared to models derived from algorithms, as well as hybrid models. Our preliminary findings, reported here, provided evidence for the fallibility and strength of teacher- and algorithm-derived models, respectively, and highlighted the benefits of a hybrid approach to model- and knowledge-generation for learning analytics. A human-in-the-loop solution is therefore suggested as a possible optimal approach.
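
As a rough illustration of the kind of hybrid model the abstract describes (not the plugin's actual implementation), the sketch below blends hypothetical teacher-assigned indicator weights with weights fitted from data into a single risk score. All indicator names, weight values, and the blending parameter are assumptions made for illustration only.

# Hypothetical sketch of a hybrid engagement-risk model: teacher-set indicator
# weights are blended with weights estimated from historical data. Indicator
# names, weights, and the blending factor are illustrative assumptions, not
# the Moodle Engagement Analytics Plugin's actual configuration.

INDICATORS = ["login_frequency", "forum_activity", "assessment_submission"]

# Teacher-derived weights (expert judgement), normalised to sum to 1.
teacher_weights = {
    "login_frequency": 0.5,
    "forum_activity": 0.2,
    "assessment_submission": 0.3,
}

# Algorithm-derived weights (e.g. fitted by regression on past cohorts).
algorithm_weights = {
    "login_frequency": 0.25,
    "forum_activity": 0.35,
    "assessment_submission": 0.4,
}

def hybrid_risk(indicator_risks, alpha=0.5):
    """Blend teacher and algorithm weights into one risk score in [0, 1].

    indicator_risks: per-indicator risk ratings in [0, 1] (1 = highest risk).
    alpha: weight given to the teacher-derived model (assumed value).
    """
    score = 0.0
    for name in INDICATORS:
        w = alpha * teacher_weights[name] + (1 - alpha) * algorithm_weights[name]
        score += w * indicator_risks[name]
    return score

# Example: a student with infrequent logins, moderate forum use, and missing submissions.
student = {"login_frequency": 0.8, "forum_activity": 0.4, "assessment_submission": 0.9}
print(f"Hybrid risk score: {hybrid_risk(student):.2f}")

With alpha = 0.5 this student scores 0.72, closer to the high-risk end of the scale; raising or lowering alpha shifts the score toward the teacher-derived or algorithm-derived view, which is one simple way a human-in-the-loop trade-off could be expressed.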