16 SES 14 B, Online Learning / 1:1 policies
The use of technology-based training and the adoption of Learning Management Systems have more than doubled in the past decade (Brown, Charlier, & Pierotti, 2012). Higher education has seen a paradigm shift from traditional classrooms and tangible learning resources to asynchronous e-learning environments, fundamentally changing how learners are engaged. Technology-enhanced learning presents many challenges, one of which is learner engagement. Learner engagement is critical to the success of both online education and professional learning. Bruner (2013) asserted that “engagement is the ultimate test” (p. 34) of successful e-learning adoption.
As economies of the world continue to evolve, there is a continuum of educational and training needs for adult learners in lifelong learning. According to OECD statistics, 57% of the population aged 25-64 in Singapore participated in formal and/or non-formal education in 2015, and an adult education survey shows a similar upward trend in the EU-28 from 2011 to 2015. Increasingly, course developers and instructional designers have to grapple with the learning needs of this growing group of adult learners (Kankaraš, Montt, Paccagnella, Quintini, & Thorn, 2016). Formal learning increasingly takes place in networked environments over e-learning platforms and learning management systems, in order to cater to the changing demographics of learners. This shift towards digital learning environments makes it possible to capture, store, manage and retrieve increasingly large amounts of data over the cloud, providing an unprecedented opportunity to capture data related to learner engagement. Aided by data mining methods, the analysis and sense-making of the interaction data between the learner, the learning environment and learning activities has become less cumbersome, and this can support a better understanding of the engagement process (Gašević, Dawson & Siemens, 2015). Understanding these interactions, and what might increase their effectiveness in online education, is paramount for meaningful learning. In particular, this research focuses on discovering meaningful patterns of engagement and disengagement in learning activities from traces of adult learners’ online behavioural data.
Learner engagement has been defined in several different ways; it comprises the interactions that a learner has with the instructor, the course content, and other learners. According to Fredricks and McColskey (2012), “researchers, educators, and policymakers are increasingly focused on student engagement as the key to address problems of low achievement, high levels of student boredom, alienation, and high dropout rates” (p. 763). The definition and measurement of learner engagement become more complex in online learning environments.
Measuring learner engagement and its influence on learning is challenging. Online learning invariably leaves behind rich data trails from the learners’ interactions with learning resources (e.g. study notes, course content, quizzes, recorded lectures and readings), peers and instructors (e.g. discussions). In an online environment, learners’ learning behaviours can be observed in a timely manner by accessing log data. While the definition of learner engagement should stay consistent with that used in more traditional learning environments, its measurement should be tailored to the data available from the online learning environment. Identifying proxies of online learner engagement can provide a degree of measurability that can be used to inform and improve existing teaching and learning practices. Hence, the purpose of this study is twofold: (1) to explore the potential of reconstructing a variation of RFM analysis (a marketing segmentation technique based on customers’ recency, frequency and monetary purchasing behaviour) as a framework to codify and quantify adult learners’ online engagement; and (2) to explore the online engagement patterns of adult learners using data mining techniques.
The data from an online learning activity in a six-week undergraduate course offered primarily for adult learners was used for scoring. The dataset contained behavioural data logged from 419 adult learners, comprising more than 100,000 points of online access. The course is a university core module, offered across multi-disciplinary bachelor-level programmes and typically taken during the first year of enrolment. The course was delivered predominantly online using interactive study materials; learners received guidance and support largely from an embedded technology-enhanced learning environment, supplemented with some face-to-face sessions. A data mining approach is used to analyse the unique characteristics of the adult learners’ engagement from online behavioural data. In this study, the RFM (Recency, Frequency, Monetary) Model is used as a framework to derive online engagement metrics. The RFM Model is widely used in marketing research and practice to quantify customer loyalty and purchase behaviour. In the context of online learning, however, two measures are replaced: Recency with Immediacy, and Monetary with Duration (of access, or what is commonly referred to as “time-on-task”). Accordingly, proxy measures are derived to quantify the three measures of online engagement behaviour: Immediacy, Frequency and Duration (IFD). Immediacy (I) typifies the learner’s sense of urgency or excitement to learn: it is the time interval between when a learning activity is made available and the learner’s first access to it. Frequency (F) characterises the learner’s utility and intensity of learning: it is the total number of episodes of online access within a given period (i.e. learning interval). Duration (D) measures the learner’s extent of involvement or sustained cognitive effort on a learning activity: it is the total access time (or time-on-task) summed over each episode of access. Following the RFM approach, every learner has individual I, F and D values. In this study, we performed clustering using a two-stage approach. First, the data is passed through a TwoStep clustering algorithm (available in IBM SPSS Modeler Version 18.0) to determine an optimal number of clusters, k. Then, k-means is applied to discover the groupings. The application of RFM segmentation (re-adapted here as IFD) results in 27 possible groupings, from a 3 × 3 × 3 RFM matrix.
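To make the IFD derivation concrete, the following sketch computes each learner’s Immediacy, Frequency and Duration from raw access episodes and scores them into the 3 × 3 × 3 segmentation matrix. The log records, release time and tertile cut-offs are illustrative assumptions, not values from the study’s dataset (the study itself used IBM SPSS Modeler):

```python
from datetime import datetime

# Hypothetical access log for one learning activity: (learner_id, start, end).
RELEASE_TIME = datetime(2017, 1, 9, 9, 0)   # when the activity was made available (assumed)

logs = [
    ("s01", datetime(2017, 1, 9, 10, 0),  datetime(2017, 1, 9, 10, 45)),
    ("s01", datetime(2017, 1, 11, 20, 0), datetime(2017, 1, 11, 20, 30)),
    ("s02", datetime(2017, 1, 14, 8, 0),  datetime(2017, 1, 14, 8, 10)),
]

def ifd_metrics(logs, release_time):
    """Return {learner: (immediacy_hours, frequency, duration_minutes)}."""
    episodes = {}
    for learner, start, end in logs:
        episodes.setdefault(learner, []).append((start, end))
    metrics = {}
    for learner, eps in episodes.items():
        first_access = min(start for start, _ in eps)
        # Immediacy: delay between release and first access, in hours
        immediacy = (first_access - release_time).total_seconds() / 3600
        # Frequency: number of access episodes in the learning interval
        frequency = len(eps)
        # Duration: total time-on-task across all episodes, in minutes
        duration = sum((end - start).total_seconds() for start, end in eps) / 60
        metrics[learner] = (immediacy, frequency, duration)
    return metrics

def tertile(value, cuts, reverse=False):
    """Score a value 1-3 against two cut points (reverse: lower is better)."""
    score = 1 + sum(value > c for c in cuts)
    return 4 - score if reverse else score

metrics = ifd_metrics(logs, RELEASE_TIME)
# Each learner maps to one cell of the 3 x 3 x 3 = 27-cell IFD matrix;
# Immediacy is reversed because a shorter delay signals stronger engagement.
segments = {
    learner: (tertile(i, (24, 72), reverse=True),  # cut-offs in hours (assumed)
              tertile(f, (2, 5)),                  # cut-offs in episodes (assumed)
              tertile(d, (30, 90)))                # cut-offs in minutes (assumed)
    for learner, (i, f, d) in metrics.items()
}
print(segments)
```

The resulting per-learner (I, F, D) vectors are what would then feed the TwoStep/k-means clustering stage described above; in practice the tertile cut points would be derived from the empirical distribution rather than fixed by hand.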
The limitations of the study affect both the findings and the directions for future research. Some of these limitations stem from an overarching one: how learner engagement is defined in an online learning environment. While the primary focus of our study is on discovering useful and meaningful data features of engagement and disengagement in learning activities from traces of learners’ online behavioural data, it is important to bear in mind that these measures are only surrogate indicators of online engagement and may not represent an all-encompassing measure of its extent. This is further constrained by the accessibility and availability of LMS data, and by the specimen data sample extracted for this study. There were also limitations in the process of converting continuous learner behaviours into items for measure development. More importantly, the methodology is closely linked to the current state of our understanding of the data features. For example, some information loss in the variables may be accepted in exchange for greater ease of interpretation and descriptive rigour. And although k-means is simple and efficient, and can be used for a wide variety of data types, it has trouble clustering data that contains outliers. Thus, with further exploration and a more intimate understanding of the data, the methodology can become more compact and robust over time. Future research can derive new online engagement metrics to aid education research and practice in technology-enhanced learning environments, and can examine and test the reliability, validity and utility of the IFD approach and its metrics. In line with this, it would be useful to examine other teaching and learning attributes associated with different patterns of online engagement.
References
Brown, K. G., Charlier, S. D., & Pierotti, A. (2012). E-learning at work: Contributions of past research and suggestions for the future. In G. P. Hodgkinson & J. K. Ford (Eds.), International review of industrial and organizational psychology (Vol. 27, pp. 89-114). Oxford, UK: Wiley-Blackwell.
Bruner, E. (2013). Mobility: It’s about user adoption. Chief Learning Officer, 13(3), 34-37.
Fredricks, J. A., & McColskey, W. (2012). The measurement of student engagement: A comparative analysis of various methods and student self-report instruments. In Handbook of research on student engagement (pp. 763-782). Springer US.
Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64-71.
Kankaraš, M., Montt, G., Paccagnella, M., Quintini, G., & Thorn, W. (2016). Skills matter: Further results from the Survey of Adult Skills. OECD Skills Studies. OECD Publishing.