Session Information
16 SES 16 B, ICT and Learning Achievements / Assessment
Paper Session
Contribution
The literature contains various studies examining factors related to the usability and acceptance of electronic exams. Most students hold favorable opinions about the execution of electronic exams but report some difficulties. For example, Dermo (2009) found that students expressed positive attitudes while also voicing concerns about exam fairness, since random question selection results in questions of varying complexity for different students. Al-Mashaqbeh and Al Hamad (2010) revealed that learners hold positive attitudes toward adopting online exams. Sorensen (2013) analyzed learners' perceptions of e-assessment and found that e-assessment added value to their learning and provided immediate feedback. Hillier (2014) reported generally positive student attitudes, yet also identified worries that students from technology departments adapt to electronic exams more easily than students from other majors, along with concerns about technical failures and the possibility of cheating. Laine et al. (2016) found that students were satisfied with the exam arrangements and the suitability of the questions; however, students reported that the system was inadequate for entering mathematical calculations and that its built-in calculator was not well suited to use. Cabı (2016) examined learners' perceptions of various e-assessment methods. According to her study, students preferred e-exams for their immediate feedback and their support for study motivation and self-assessment; their concerns involved the possibility of cheating, technical problems, and the lack of exam sessions. In a more recent study, Alsadoon (2017) investigated students' attitudes toward e-assessment at Saudi Electronic University and reported that students particularly valued the immediate feedback, unbiased grading, and enhanced self-learning features of electronic exams.
The Distance Education Center of Ankara University manages distance programs at the vocational college, faculty, and graduate levels, as well as certificate programs. Of these, the vocational college level programs have recently started to use electronic exams. Moodle was chosen as the learning management system (LMS) for implementing the electronic exams of online vocational college students. A pilot study of electronic exams was conducted in 2015, with the aim of offering electronic exams as makeup exams. In the following year, electronic exams also began to be employed as midterm exams. Yet the usability of the e-Exam System had not been investigated until this year.
This study aims to examine the usability of the e-Exam System, which was developed within Moodle and is dedicated to the execution of electronic exams. Specifically, the research objectives of the study are:
- To find the System Usability Score of the e-Exam System,
- To examine the usability of the e-Exam System,
- To identify usability problems of the e-Exam System.
To accomplish these research objectives, we employed two different usability assessments. During the electronic exams, students were supported with a live chat tool through which they could report problems they had using the system. By analyzing learners' messages, we aimed to identify usability problems of the e-Exam System. After the electronic exam sessions ended, learners were given an online survey consisting of the questions of the System Usability Scale (Brooke, 1996) and an open-ended question about the problems they encountered. The results of these usability assessments are reported in this paper.
Method
During the 2017-2018 fall term, learners in the online vocational college level programs took their exams through the e-Exam System and submitted their answers. In total, 962 students entered the e-Exam System, corresponding to an 85% participation rate. While the exam sessions were taking place, learners reported their difficulties through the live chat tool integrated into the e-Exam System, so that technical support could be provided. When the exam sessions ended, e-mails were sent to learners asking them to assess the usability of the system and share their ideas about the e-Exam System. The online survey consists of the questions of the System Usability Scale (Brooke, 1996) and an open-ended question about the problems learners experienced:

System Usability Scale
1. I think that I would like to use this system frequently
2. I found the system unnecessarily complex
3. I thought the system was easy to use
4. I think that I would need the support of a technical person to be able to use this system
5. I found the various functions in this system were well integrated
6. I thought there was too much inconsistency in this system
7. I would imagine that most people would learn to use this system very quickly
8. I found the system very cumbersome to use
9. I felt very confident using the system
10. I needed to learn a lot of things before I could get going with this system
11. What are your problems regarding use of the e-Exam system?

The participants of the study are 93 volunteer learners from the vocational college level programs. The standard scoring procedure was applied to the System Usability Scale. For each item, a score contribution ranging from 0 to 4 was calculated. For items 1, 3, 5, 7 and 9, the score contribution is the scale position minus 1. For items 2, 4, 6, 8 and 10, the contribution is 5 minus the scale position. The final System Usability score is the sum of the score contributions multiplied by 2.5, and it ranges between 0 and 100. To analyze learners' live chat messages and their answers to the open-ended question, we used a content analysis approach, which takes a qualitative perspective and reveals the difficulties that learners faced while using the system.
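As a concrete illustration of this scoring procedure, the following sketch computes a SUS score from one respondent's ten answers (each a scale position from 1 to 5). The function name and the example response vector are hypothetical illustrations, not study data; the arithmetic follows Brooke's (1996) procedure as described above.

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten answers.

    `responses` holds the scale positions (1-5) for SUS items 1-10, in
    order. Odd-numbered items (1, 3, 5, 7, 9) contribute (position - 1);
    even-numbered items (2, 4, 6, 8, 10) contribute (5 - position). The
    summed contributions are multiplied by 2.5, giving a score in 0-100.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten scale positions in the range 1-5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-based index: even i is an odd item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# Hypothetical respondent's answers, for illustration only.
print(sus_score([4, 2, 4, 1, 4, 2, 5, 2, 4, 2]))  # 80.0
```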
Expected Outcomes
We calculated the System Usability Score of the e-Exam System as 73.11, which demonstrates that the e-Exam System has passable usability according to Bangor et al. (2008). Learners' chat messages were analyzed to reveal the difficulties they experienced during e-exams. The analysis showed that 43 questions were received from students. Their problems fall into the following categories:
• Termination of exams before student submission
• Login problems
• Registration problems
• Problems with the content of questions
• Lack of response from the system
• Technical problems

The answers to the open-ended question showed that only 7.5% of students faced problems while using the system. These problems can be examined under the following headings:

Lack of exam time: Each exam requires the completion of 30 questions within a 30-minute interval. Some students reported this duration as insufficient, especially for questions requiring mathematical calculations.

Improper questions: A small number of questions in the exams lacked a correct choice, contained more than one correct choice, or had missing explanations.

Problems related to the user interface: Some students reported that the interface of the e-Exam System is very similar to the interface of the learning management system, so they had trouble recognizing which system to enter for their exams.

Technical problems: One student reported that the system became unresponsive at one point, two students expressed dissatisfaction that the questions were spread across different pages, and one student stated that he could not get immediate feedback from the live chat.
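As a rough aid to interpretation, the sketch below maps a SUS score onto acceptability labels in the spirit of Bangor et al. (2008). The cut-off values used here (50 and 70) are our approximate reading of that paper, not figures given in this abstract.

```python
def acceptability(score):
    """Rough acceptability label for a SUS score, after Bangor et al. (2008).

    The cut-offs are approximations of the ranges reported in that paper:
    below ~50 'not acceptable', ~50-70 'marginal', above ~70 'acceptable'.
    """
    if score < 50:
        return "not acceptable"
    if score < 70:
        return "marginal"
    return "acceptable"

print(acceptability(73.11))  # the e-Exam System's score -> "acceptable"
```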
References
Al-Mashaqbeh, I. F., & Al Hamad, A. (2010, May). Student's perception of an online exam within the Decision Support System Course at Al al Bayt University. In Computer Research and Development, 2010 Second International Conference on (pp. 131-135). IEEE.
Alsadoon, H. (2017). Students' perceptions of e-assessment at Saudi Electronic University. Turkish Online Journal of Educational Technology-TOJET, 16(1), 147-153.
Bangor, A., Kortum, P., & Miller, J. (2008). An empirical evaluation of the System Usability Scale. International Journal of Human-Computer Interaction, 24(6), 574-594.
Brooke, J. (1996). SUS: A quick and dirty usability scale. Usability Evaluation in Industry, 189(194), 4-7.
Cabı, E. (2016). Uzaktan eğitimde e-değerlendirme üzerine öğrenci algıları [Student perceptions of e-assessment in distance education]. Journal of Higher Education & Science / Yükseköğretim ve Bilim Dergisi, 6(1).
Dermo, J. (2009). e-Assessment and the student learning experience: A survey of student perceptions of e-assessment. British Journal of Educational Technology, 40(2), 203-214.
Hillier, M. (2014). The very idea of e-Exams: Student (pre)conceptions. Proceedings ASCILITE 2014, 77-88.
Laine, K., Sipilä, E., Anderson, M., & Sydänheimo, L. (2016). Electronic exam in electronics studies.
Sorensen, E. (2013). Implementation and student perceptions of e-assessment in a Chemical Engineering module. European Journal of Engineering Education, 38(2), 172-185.