Realities In Rhetoric Of Online Testing: A Higher Education Case Study
Author(s):
Ken Brown (presenting / submitting), Jarkko Hurme, Victor Lally
Conference:
ECER 2016
Format:
Paper

Session Information

24 SES 01, STEM

Paper Session

Time:
2016-08-23
13:15-14:45
Room:
Vet-Theatre 114
Chair:
Javier Diez-Palomar

Contribution

Introduction

 

A review of STEM provision in Ireland at primary and post-primary level (McCraith, 2015) raised issues concerning transition to third level, the use of ICT, and international performance and comparison. Research provides evidence that similar issues are pertinent in Finland (Kinnari, 2010; Rinneheimo, 2010), suggesting that PISA results differ from teachers' experiences of students.

 

The importance of the role of assessment and ICT is well documented in the literature. An extensive review of e-assessment by Jordan (2013), focusing on online computer-marked quizzes, highlighted the increasing role of eAssessment technologies within the learning environment and how this environment may be optimised beyond simple quizzing (Johnson, Adams Becker, Cummins, Estrada & Freeman, 2015).

 

This paper presents the findings from a recent research project that gathered information on the conceptions and expectations that first-year engineering mathematics students have in relation to online assessment, and on their reflections immediately following online assessment. Analysis of anecdotal observations gathered over several years, from formal and informal feedback media, suggests that many students may inadvertently experience negative behavioural attributes in advance of, or following, online assessment. Recent research (Gallimore & Stewart, 2014; Tempel & Neumann, 2014; Gill, Mac an Bhaird & Ní Fhloinn, 2010) suggests that these negative attributes may be more deeply embedded, resulting in the need to introduce additional mathematical support at third level.

 

 

The outputs from the research will provide an evidence-based data source for discussion in the design of new programmes as institutions expand their online provision. The evidence will help designers frame their understanding of the effects of the technology on the learning process, examining pedagogical barriers and supports, and how these relate to levels of interaction and engagement online.

 

The project was designed within a socio-cognitive theoretical framework of self-efficacy (Bandura, 1977) to help the researchers understand the experiences and perceptions that learners bring to their transition to third-level engineering mathematics. The main thrust of self-efficacy theory is that the actions of the learner, and their subsequent reactions, are influenced by their observations and experiences. Within this framework, the research focused on pre-existing attributes, perceived barriers and self-confidence, and awareness of existing support mechanisms for learners.

Method

Methodology

The research reported in this paper is part of a joint study developed to examine, within the boundaries of first-year engineering mathematics, whether the anecdotal observations were accurate. Prior to the research, the curricula of the participating higher education institutions were analysed to determine levels of similarity in first-year engineering mathematics. Levels of similarity in programme content, assessment methods and student cohort were considered sufficiently close to allow comparisons to be made.

Learner Group

First-year BEng Engineering ordinary degree (or equivalent) learners from Ireland (n=67) and Finland (n=60) participated in the study. The setting was the natural class environment, maintaining a structured, non-a-priori, contextual setting and leading to a case study with phenomenological output that establishes a baseline of the status of online assessment of first-year learners in engineering mathematics. A mixed-methods approach was taken to determine indicative outputs, with qualitative and quantitative approaches operating simultaneously. All participants completed a questionnaire containing open and closed questions within a short timescale to ensure synchronic reliability. The questionnaire was tested to ensure that issues of language were not problematic between the two countries. Sampling for focus group activity was non-self-selecting and based on convenience, as determined by the availability of learners to the researcher. Focus group activity utilised a semi-structured, standardised open-question approach, timed to occur shortly after questionnaire operationalisation and immediately after the first online assessment exercise. All topics and issues to be covered were specified in advance; all interviewees were asked the same basic questions to ensure comparability of responses.

Lecturer Group

The study involved lecturers from Ireland (n=3); interviews with a similar number of lecturers in Finland are planned. Each lecturer engaged, with consent, in a semi-structured video interview and was asked the same questions to allow comparisons to be made. The questions were formed around the following thematic areas: training/preparation for online assessment, perceptions of student confidence for online assessment, and perceptions/knowledge of barriers to optimal online assessment. A coding schema was developed from the questionnaire, focus groups and interviews. The unit of analysis selected for open-question responses was the complete response/phrase rather than individual words, and in order to reduce complexity only major thematic responses were included. Prior to commencement of the project, all institutions involved engaged in a process of extensive ethical scrutiny.

Expected Outcomes

Results and Discussion

The learners engaged in this study are not considered to be from higher academic tracks. The learner group from Ireland resides in the top 67% of the student base when ability is based on Central Applications Office (CAO) points. Approximately 15% of the Irish learner group entered third level through a non-CAO route such as mature access. A similar pattern describes the learner group from Finland. The researchers are cognizant that this process is subjective; the focus of the research is to establish a baseline from which to develop meaningful assessment processes.

Preliminary analysis of the questionnaires (n=127) reveals that many learners struggle to engage online with abstract mathematical concepts and consider a loss of reward to be a negative attribute of the assessment process. The major thematic outputs are in the areas of self-efficacy relating to self-esteem, confidence and self; for example, 37% indicated negative experiences of eAssessment.

Higher-level STEM cognitive assessments include calculations and the determination of expressions or equations. It is suggested that the current mechanics of assessment are inadequate to fully address the needs of the educator in their endeavour to provide prompt, accurate, objective feedback. Deep, knowledge-based questioning is problematic to assess automatically, and research has been conducted to explore this area (Ashton et al., 2006; Sangwin et al., 2013). The eAssessment of learners exists in many programmes of study, offering a myriad of mechanisms for exploring learning at an individual and group level. We conclude that the justice executed by eAssessment of the learner needs to be enhanced further in order to provide teachers with a more sophisticated profile of the learner. The outcomes of this research will guide a second stage in the application of eAssessment in both institutions, in order to develop this sophistication.
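A minimal sketch can illustrate why computer-marked assessment of expressions is harder than simple quizzing: two algebraically equivalent answers rarely match as strings. The function below is a hypothetical helper for illustration only, not part of the marking systems used in this study; it accepts a student's expression when it numerically agrees with the model answer at random sample points. Production CAA engines of the kind discussed by Ashton et al. (2006) and Sangwin et al. (2013) use computer algebra rather than numeric sampling.

```python
import math
import random

def expressions_match(model: str, student: str, var: str = "x",
                      samples: int = 20, tol: float = 1e-9) -> bool:
    """Heuristically test algebraic equivalence of two expressions
    by evaluating both at random sample points of the variable."""
    # Restricted evaluation environment: no builtins, a few math functions.
    env = {"__builtins__": {}, "sin": math.sin, "cos": math.cos,
           "exp": math.exp, "log": math.log, "sqrt": math.sqrt}
    rng = random.Random(42)  # fixed seed so marking is reproducible
    for _ in range(samples):
        x = rng.uniform(0.1, 5.0)
        try:
            m = eval(model, env, {var: x})
            s = eval(student, env, {var: x})
        except Exception:
            return False  # malformed expression: no credit
        if abs(m - s) > tol * max(1.0, abs(m)):
            return False  # values disagree at this sample point
    return True

# Equivalent forms earn the mark even though the strings differ:
print(expressions_match("(x + 1)**2", "x**2 + 2*x + 1"))  # True
print(expressions_match("(x + 1)**2", "x**2 + 1"))        # False
```

Numeric sampling is cheap but can misjudge expressions that coincide on the sampled interval, which is one reason real systems prefer symbolic comparison.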

References

Ashton, H.S., Beevers, C.E., Korabinski, A.A. & Youngson, M.A., 2006, Incorporating partial credit in computer-aided assessment of Mathematics in secondary education, British Journal of Educational Technology, 37(1), pp. 93-119.
Bandura, A., 1977, Self-efficacy: Toward a unifying theory of behavioral change, Psychological Review, 84(2), pp. 191-215.
Gallimore, M. & Stewart, J., 2014, Increasing the impact of mathematics support on aiding student transition in higher education, Teaching Mathematics and its Applications, 33(2), pp. 98-109.
Gill, O., Mac an Bhaird, C. & Ní Fhloinn, E., 2010, The origins, development and evaluation of mathematics support services, Irish Mathematical Society Bulletin, 66, pp. 51-63.
Johnson, L., Adams Becker, S., Cummins, M., Estrada, V. & Freeman, A., 2015, NMC Technology Outlook for Higher Education in Ireland: A Horizon Project Regional Report, Austin, Texas: The New Media Consortium.
Jordan, S., 2013, E-assessment: Past, present and future, New Directions, 9(1), pp. 87-106.
Kinnari, H., 2010, A study of the mathematics proficiency, 1st International Workshop on Maths and ICT: Education, Research and Applications, Bucharest, pp. 35-39.
McCraith, B., 2015, Average is no longer good enough – it's time for a step change in STEM education in Ireland, in Education Matters Yearbook 2015-2016, pp. 13-18.
Rinneheimo, K., 2010, Methods for teaching mathematics: case Tampere University of Applied Sciences, 1st International Workshop on Maths and ICT: Education, Research and Applications, Bucharest, pp. 48-55.
Sangwin, C., Hunt, T. & Butcher, P., 2013, Embedding and enhancing eAssessment in the leading open source VLE, The Higher Education Academy.
Tempel, T. & Neumann, R., 2014, Stereotype threat, test anxiety, and mathematics performance, Social Psychology of Education, 17(3), pp. 491-501.

Author Information

Ken Brown (presenting / submitting)
Letterkenny Institute of Technology
Engineering
Letterkenny
Oulu University of Applied Sciences, Finland
University of Glasgow, Scotland
