The potential of massive open online learning environments is vast, but the accompanying learning analytics are still in their infancy. TU Delft researchers Guanliang Chen and Dan Davis defended their PhD theses on this subject on May 6th and May 7th respectively.
Massive Open Online Courses (MOOCs) have become a popular option for people to learn outside of a formal education setting. Researcher Dan Davis, who defended his PhD thesis on Tuesday May 7th, concludes that the potential of massive open online learning environments, with their low barrier to accessing high-quality educational materials, is vast, but that learning analytics (analysing data about learners to improve learning materials, the learning environment, and so on) is still in its infancy as a field.
‘The scalable nature of online learning platforms now allows researchers to conduct studies of unprecedented scale in the learning context. The clickstream data from online learning platforms offers an unprecedented level of granularity in our ability to gauge and track a learner’s path through a learning experience. Prior to the widespread adoption of online learning environments, research in this domain was limited to an analysis of the instructor’s syllabus, or to resource-intensive manual observation of learners’ trajectories through tasks or activities to uncover how knowledge is formed in the learning context. But now, researchers have access to every action a learner takes within a platform, from opening a new page, to changing the playback speed of a video, to scrolling through their peers’ comments on the course discussion forum.’
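To give a sense of what analysing such clickstream data involves, the sketch below aggregates per-learner action counts from a handful of events. The event records and field names here are illustrative assumptions, not the actual edX log format.

```python
from collections import Counter

# Hypothetical clickstream events mirroring the actions described above:
# page views, video playback-speed changes, forum scrolling.
events = [
    {"learner": "u1", "action": "page_view", "target": "week1/intro"},
    {"learner": "u1", "action": "video_speed_change", "target": "lecture1"},
    {"learner": "u2", "action": "forum_scroll", "target": "thread42"},
    {"learner": "u1", "action": "page_view", "target": "week1/quiz"},
]

def actions_per_learner(events):
    """Count how often each learner performed each type of action."""
    counts = {}
    for event in events:
        counts.setdefault(event["learner"], Counter())[event["action"]] += 1
    return counts

print(actions_per_learner(events))
```

From aggregates like these, researchers can reconstruct each learner's path through a course at the level of individual clicks.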
Goal: personalised learning
‘We must, however, continue to identify cases where what works in a traditional learning environment does not work effectively online. It has become clear that interventions and techniques known to work in the traditional classroom cannot simply work ‘out of the box’ in the online setting. They must be adapted to the context. The goal is to arrive at a set of best practices that will give the most learners the greatest chance of succeeding, with the ultimate goal of personalised learning: identifying each learner’s individual needs at a given moment and acting upon them accordingly.’
Interventions: social comparison and study planning
Among other interventions, Davis created one for MOOCs that provides learners with a dashboard-style visualisation tool, enabling them to track their own learning behaviour and compare it to that of a previously successful learner in the same course. This social comparison intervention significantly increased passing rates in every course examined, with increases ranging from 3% to 6%.
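The core of such a dashboard is a side-by-side comparison of activity metrics. The sketch below is a minimal illustration of that idea, not Davis's actual implementation; the metric names and values are made up.

```python
# Illustrative weekly activity trace of a previously successful learner.
successful = {"videos_watched": 12, "quizzes_done": 4, "forum_posts": 3}

def compare(current, benchmark):
    """Pair each benchmark metric with the current learner's value
    (defaulting to 0 for activities the learner has not done yet)."""
    return {
        metric: {"you": current.get(metric, 0), "successful_learner": value}
        for metric, value in benchmark.items()
    }

me = {"videos_watched": 7, "quizzes_done": 4}
for metric, vals in compare(me, successful).items():
    print(f"{metric}: you={vals['you']} vs successful={vals['successful_learner']}")
```

Presenting this gap visually is what turns raw activity logs into a social comparison nudge.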
Davis also designed and deployed a study planning intervention in a MOOC. ‘We created an interface which prompted learners at the beginning of each week to state their goals for the week and their plans on how to achieve them. And at the end of each course week, the system prompted learners to reflect on their goals and plans and write down how well they had achieved them. While this type of intervention had previously been found to be highly effective in traditional learning environments, we observed high levels of noncompliance: learners largely ignored the intervention. This is one of the examples that clearly shows that more care must be taken when translating interventions to the online context.’
MOOC learners on the Social Web
‘To better support MOOC learners, much research has investigated MOOC learning in the past decade’, says Davis’ colleague Guanliang Chen, who defended his thesis on Monday May 6th. Unexplored avenues remain, though. ‘For learner modelling, we proposed that we can better understand learners by moving beyond the MOOC platforms and exploring other data sources on the wider Web, especially the Social Web. We first investigated whether MOOC learners are active on the Social Web. We considered over 320,000 learners from eighteen MOOCs on edX and attempted to identify their accounts across five popular Social Web platforms: Gravatar, Twitter, LinkedIn, StackExchange and GitHub.’
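One concrete way to link a MOOC account to a Social Web presence is via Gravatar, which identifies profiles by the MD5 hash of a lowercased, trimmed email address. The sketch below shows that lookup principle; the hash table stands in for real platform data, and this is an assumption about one possible matching route, not a description of Chen's full method.

```python
import hashlib

def gravatar_hash(email):
    """Gravatar's profile identifier: MD5 of the trimmed, lowercased email."""
    return hashlib.md5(email.strip().lower().encode("utf-8")).hexdigest()

# Stand-in for hashes observed on the Social Web platform.
known_hashes = {gravatar_hash("learner@example.com"): "some-gravatar-profile"}

def match_learner(email):
    """Return the matched Social Web profile, or None if not found."""
    return known_hashes.get(gravatar_hash(email))

# Normalisation makes differently-cased registrations match.
print(match_learner(" Learner@Example.com "))
```

Matching on other platforms (Twitter, LinkedIn, GitHub) would require different signals, such as usernames or profile names, which is what makes cross-platform identification at this scale nontrivial.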
Platforms such as GitHub allow researchers to track learners’ progress not only during a MOOC but also before and after it, via their public coding traces. For a Functional Programming MOOC, an analysis of the extent to which learners actually employ their acquired knowledge in practice revealed that only about 8% of engaged learners transferred some of that knowledge into practice.
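A simple version of such a transfer analysis checks whether a learner's public commits after the course use the taught language. The sketch below is illustrative: the commit records, dates, and the choice of Haskell as the taught language are assumptions, not details from the study.

```python
from datetime import date

course_end = date(2016, 1, 1)  # hypothetical course end date

# Made-up public commit metadata for two learners.
commits = [
    {"learner": "u1", "language": "Haskell", "date": date(2016, 3, 2)},
    {"learner": "u2", "language": "Python", "date": date(2016, 2, 10)},
    {"learner": "u1", "language": "Python", "date": date(2015, 11, 5)},
]

def transferred(commits, language, after):
    """Learners with at least one post-course commit in the taught language."""
    return {c["learner"] for c in commits
            if c["language"] == language and c["date"] > after}

print(transferred(commits, "Haskell", course_end))  # → {'u1'}
```

Dividing the size of this set by the number of engaged learners yields a transfer rate of the kind reported above.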
MOOC learners as real-world task solvers
Chen further investigated whether learners could apply their acquired knowledge to solve real-world tasks: paid tasks retrieved from online marketplaces that can be solved using the knowledge taught in a course. ‘For this purpose, we considered a MOOC on edX teaching data analysis and manually selected a set of paid tasks from Upwork, one of the most popular freelancing marketplaces in the world. We presented the selected tasks to learners and observed how they interacted with these real-world tasks. We observed that MOOC learners could solve these tasks with high accuracy and quality.’
Thesis Dan Davis in TU Delft Repository:
Large-Scale Learning Analytics: Modeling Learner Behavior & Improving Learning Outcomes in Massive Open Online Courses