Session Information
03 SES 02 A, Curriculum and Student Performance Measurement
Paper Session
Contribution
The mass storage of raw student information has caught the attention of academics, researchers and Higher Education (HE) management (Richards, Bilgin & Marrone 2013, p68). This novel research area is widely referred to as Learning Analytics (LA). The purpose of LA is to collect, explore, study and analyse data to reveal a coherent meaning (Phillips et al., 2011, p997). In other words, LA can measure, store, analyse and report data about learners and their direct behaviour. This can be used to enhance both the pedagogic experience of the learner and their learning environment.
This new field of research has provided opportunities to improve the student learning experience by informing changes in course material, structure and assessment (Siemens & Long 2011, p444; Diaz & Brown 2012, p4). Further studies have also suggested that intervention mechanisms can be triggered using LA (Beer, Tickner & Jones 2014, p247). However, Pardo and Siemens (2014, p444) assert that LA can easily become immorally intrusive into a student’s personal behaviour, raising issues of privacy and ethics. More worrying still, Beer, Jones & Clark (2012, p82) suggest that analysing such large sets of data at a macro level can lead to unsound conclusions: abstracting general patterns can ignore underlying complexity that is not apparent in a broad linear relationship analysis.
Learning management systems (LMS) automatically generate and store a large amount of log data about students’ clickstream activities (Jones, Beer & Clark 2013, p2). They primarily record macro-level data, such as access times for digital resources (e.g. web pages, videos and assessments). Where the LMS supports such functionality, they can also record student test scores and the number of attempts made.
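To illustrate the kind of macro-level record involved, the sketch below parses a single access-log line into a (user, resource, timestamp) tuple. The log format, field names and sample line are assumptions for illustration only; the actual LMS or web-server log schema used in the study is not specified here.

```python
import re
from datetime import datetime

# Hypothetical Apache-style log line; the real server format may
# differ, so this pattern is an assumption used only for illustration.
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3})'
)

def parse_access_line(line):
    """Extract (user, resource path, timestamp) from one log line."""
    m = LOG_PATTERN.match(line)
    if m is None:
        return None  # line does not match the assumed format
    ts = datetime.strptime(m.group("time"), "%d/%b/%Y:%H:%M:%S %z")
    return m.group("user"), m.group("path"), ts

sample = ('10.0.0.5 - s1234 [14/Jan/2016:09:30:00 +0000] '
          '"GET /unit/video1.mp4 HTTP/1.1" 200 512')
print(parse_access_line(sample))
```

Records of this shape (who accessed which resource, and when) are the raw material from which access-time analyses are built.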
The study uses retrospective data from a second-year undergraduate cohort studying web software development. The data was aggregated from the web server that students used to develop their web sites, which thus served as a development web server. Our work is similar to a study by Tian, Rudraraju & Li (2004, p756), which used such data to evaluate the reliability of web code. This study will instead evaluate the development cycle of students’ web development files and code. The exploratory data analysis will attempt to create a predictive analytics model: an analytical approach to uncover patterns and relationships that could predict student performance. A limitation of studying only the records within the development environment is that they do not explain why a particular behaviour occurs.
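A predictive model of this kind needs per-student features derived from the raw server records. The sketch below shows one plausible feature-building step: aggregating hypothetical development-server events into counts (total edits, distinct files, active days) that could serve as model inputs. The event schema and feature names are assumptions, not the study's actual design.

```python
from collections import defaultdict
from datetime import date

# Hypothetical event records: (student_id, resource_path, event_date).
# The real study's log schema is not specified, so this shape is an
# assumption used only to illustrate the feature-building step.
events = [
    ("s1", "/dev/index.html", date(2016, 1, 4)),
    ("s1", "/dev/index.html", date(2016, 1, 5)),
    ("s1", "/dev/style.css",  date(2016, 1, 5)),
    ("s2", "/dev/index.html", date(2016, 1, 4)),
]

def build_features(events):
    """Aggregate raw events into per-student counts a model could use."""
    acc = defaultdict(lambda: {"edits": 0, "files": set(), "days": set()})
    for student, path, day in events:
        a = acc[student]
        a["edits"] += 1          # total development activity
        a["files"].add(path)     # breadth of files touched
        a["days"].add(day)       # regularity of engagement
    return {s: {"edits": a["edits"],
                "distinct_files": len(a["files"]),
                "active_days": len(a["days"])}
            for s, a in acc.items()}

print(build_features(events))
```

Features like these could then be correlated with assessment outcomes, though, as noted above, they record what students did without explaining why.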
Method
Expected Outcomes
References
Beer, C., Jones, D., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. In Proceedings of the 29th Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education (ASCILITE 2012) (pp. 78-87). Australasian Society for Computers in Learning in Tertiary Education (ASCILITE).
Diaz, V., & Brown, M. (2012). Learning analytics: A report on the ELI focus session. In Educause (Ed.), Educause Learning Initiative (Paper 2, 2012 ed., Vol. ELI Paper 2: 2012, pp. 18). Educause. Available from: http://www.educause.edu/library/resources/learning-analytics-report-eli-focus-session Last accessed: 14/01/2016.
Jones, D., Beer, C., & Clark, D. (2013). The IRAC framework: Locating the performance zone for learning analytics. In H. Carter, M. Gosper & J. Hedberg (Eds.), Electric Dreams. Proceedings ascilite 2013 Sydney (pp. 446-450).
Pardo, A., & Siemens, G. (2014). Ethical and privacy principles for learning analytics. British Journal of Educational Technology, 45(3), 438-450.
Phillips, R., Maor, D., Cumming-Potvin, Roberts, P., Herrington, J., Preston, G., & Moore, E. (2011). Learning analytics and study behaviour: A pilot study. In G. Williams, P. Statham, N. Brown & B. Cleland (Eds.), Changing Demands, Changing Directions. Proceedings ascilite Hobart 2011 (pp. 997-1007).
Romero, C., Ventura, S., & García, E. (2008). Data mining in course management systems: Moodle case study and tutorial. Computers & Education, 51(1), 368-384.
Siemens, G., & Long, P. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE Review, 46(5), 30.
Tian, J., Rudraraju, S., & Li, Z. (2004). Evaluating web software reliability based on workload and failure data extracted from server logs. IEEE Transactions on Software Engineering, 30(11), 754-769.
Wohlers, S. D., & Jamieson, J. (2014). “What in me is Dark, Illumine”: Developing a semantic URL learning analytics solution for Moodle. In B. Hegarty, J. McDonald, & S.-K. Loke (Eds.), Rhetoric and Reality: Critical perspectives on educational technology. Proceedings ascilite Dunedin 2014 (pp. 110-119).