The Effects of an Assessment in Higher Education Training Programme on Lecturers' Perceived Importance, Competence and Use of Assessment.
Conference:
ECER 2014
Format:
Paper

Session Information

22 SES 02 C, Academic Work and Professional Development

Paper Session

Time:
2014-09-02
15:15-16:45
Room:
B022 Anfiteatro
Chair:
Didi M.E. Griffioen

Contribution

Since the conceptualisation of assessment for learning (Gipps 1994), various frameworks acknowledging assessment as a core aspect of the teaching and learning process have been proposed.

The influence of assessment on students’ learning has been widely recognised (Boud & Molloy 2012; Jessop, El Yakim & Gibbs 2013). Assessment in higher education is understood as a core aspect of the teaching and learning process, and it is considered an opportunity for significant learning and competence development. In this process, the implementation of assessment for learning is essential.

To improve assessments implemented at universities, lecturers’ assessment competence and skills must be enhanced (Quesada, Rodríguez & Ibarra in evaluation).

Learning-oriented e-assessment (e-LOA), a term initially coined by Rodríguez et al. (2009), aims to enhance learning through assessment in a technology-mediated context. e-LOA is defined as a technology-mediated learning process whose aim is to help university students develop competences and skills that are useful for their current academic practices and professional future (Gómez, Rodríguez & Ibarra 2013). From the lecturers’ perspective, the main goal of e-LOA is to promote students’ strategic learning and self-regulation.

Three main aspects of the e-LOA framework must be considered:

  • e-assessment tasks as learning tasks. e-assessment tasks must be meaningful, must represent real-life situations and must promote in-depth student learning.
  • e-Feedback as e-feedforward. Feedback is essential for students to monitor their own learning (Nicol 2010). Web 2.0 facilitates the option of a continuous and faster (Williams et al. 2012) form of feedback that can be accessed rapidly and reviewed at any time.
  • Student participation in e-assessment. Participative strategies are crucial to developing learning-to-learn skills, durable learning (Brew 2003) and sustainable assessment (Boud 2000).

 

In this context, the current study, which is part of the Re-Evalúa research project [Ref. P08-SEJ-03502], aimed to evaluate the effect of a training programme based on assessment on lecturers’ perceived importance, competence and use of assessment.

Method

To analyse and evaluate the effect of e-LOA on lecturers’ perceived importance, competence and use of assessment, a pre-test/post-test quasi-experimental design with both control and experimental groups was employed. The ActEval questionnaire (for the analysis and evaluation of university teacher assessment activity) (Quesada, Rodríguez & Ibarra 2013) was used to measure the dependent variables. The questionnaire includes 31 items, with responses ranging from 1 (never/not) to 6 (always/totally), grouped into four categories: (1) ‘Assessment planning and design’, (2) ‘Monitoring of student learning’, (3) ‘Participation of students in the assessment process’ and (4) ‘Improvement and adjustment of the assessment practice’.

The sample for the pre-test/post-test design consisted of 70 lecturers. A total of 148 lecturers participated in the training course and workshop, and 37 of them participated in the follow-up phase of guidance and implementation. In the control group, 104 lecturers completed the pre-test questionnaire and 56 completed the post-test; of these, 33 completed both the pre-test and post-test measures. The quasi-experimental group therefore comprised 37 lecturers and the control group 33.

A paired-samples t-test was conducted to compare the pre-test and post-test means, separately for the experimental group and for the control group, to determine whether there was a significant difference within each matched-pair group. An independent-samples t-test was used to compare the means of the experimental and control groups and to determine whether the two groups differed in the pre-test and post-test results.
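As an illustration only, the two analyses described above can be sketched with SciPy. The data below are synthetic 1–6 ratings for a single ActEval category; the group sizes (37 experimental, 33 control) come from the study, but everything else (variable names, simulated score distributions) is an assumption for demonstration purposes, not the study's actual data.

```python
# Sketch of the paired- and independent-samples t-tests described above,
# using synthetic data (the real study data are not available here).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical 1-6 ratings for one ActEval category.
# Experimental group: n=37, with a simulated post-test improvement.
exp_pre = rng.integers(1, 7, size=37).astype(float)
exp_post = np.clip(exp_pre + rng.normal(0.5, 1.0, size=37), 1, 6)

# Control group: n=33, with no simulated change.
ctl_pre = rng.integers(1, 7, size=33).astype(float)
ctl_post = np.clip(ctl_pre + rng.normal(0.0, 1.0, size=33), 1, 6)

# Paired-samples t-tests: pre vs post within each group.
t_exp, p_exp = stats.ttest_rel(exp_pre, exp_post)
t_ctl, p_ctl = stats.ttest_rel(ctl_pre, ctl_post)

# Independent-samples t-test: experimental vs control at post-test.
t_ind, p_ind = stats.ttest_ind(exp_post, ctl_post)

print(f"experimental paired: t={t_exp:.2f}, p={p_exp:.3f}")
print(f"control paired:      t={t_ctl:.2f}, p={p_ctl:.3f}")
print(f"independent post:    t={t_ind:.2f}, p={p_ind:.3f}")
```

`ttest_rel` handles the matched pre/post pairs within a group, while `ttest_ind` compares the two independent groups at a single time point, mirroring the two procedures reported in the method.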

Expected Outcomes

Overall, the findings of the study were encouraging, indicating that the training and guidance programme had a noticeable effect. The lecturers in the experimental group reported increased competence in two categories, ‘Assessment planning and design’ and ‘Student participation in the assessment process’, as well as increased use of tasks related to the ‘Assessment planning and design’, ‘Student participation in the assessment process’ and ‘Improvement and adjustment of the assessment practice’ categories. Regarding the differences between the experimental and control groups, the post-test scores differed significantly in three categories: (1) ‘Assessment planning and design’ importance (p=.030) and use (p=.033); (3) ‘Student participation in the assessment process’ importance (p=.001), competence (p=.040) and use (p=.002); and (4) the importance given to ‘Improvement and adjustment of the assessment practice’ (p=.011). The results obtained are being used to design and implement an improved training programme to enhance assessment in higher education within the DevalSimWeb research project [Ref. DCI-ALA/19.09.01/11/21526/264-773/ALFAIII-(2011)-10].

References

Boud, D. 2000. Sustainable assessment: rethinking assessment for the learning society. Studies in Continuing Education, 22(2), 151-167.
Boud, D. & Molloy, E. 2012. Feedback in Higher and Professional Education. London: Routledge.
Brew, A. 2003. La autoevaluación y la evaluación por los compañeros. In S. Brown & A. Glasner (Eds.), Evaluar en la Universidad. Problemas y nuevos enfoques, pp. 79-189. Madrid: Narcea.
Gipps, C. 1994. Beyond Testing. London: The Falmer Press.
Gómez, M.A., Rodríguez, G. & Ibarra, M.S. 2013. Development of basic competences of students in Higher Education through Learning Oriented e-Assessment. RELIEVE, 19(1), art. 1. DOI: 10.7203/relieve.19.1.2457
Jessop, T., El Yakim, Y. & Gibbs, G. 2013. The whole is greater than the sum of its parts: a large-scale study of students’ learning in response to different programme assessment patterns. Assessment & Evaluation in Higher Education. Published online 29 April 2013. http://dx.doi.org/10.1080/02602938.2013.792108
Nicol, D. 2010. From monologue to dialogue: improving written feedback processes in mass higher education. Assessment & Evaluation in Higher Education, 35(5), 501-517.
Quesada-Serra, V., Rodríguez-Gómez, G. & Ibarra-Sáiz, M.S. 2013. ActEval: a questionnaire for the analysis and consideration of university teachers’ assessment activity. Revista de Educación, 362, 69-104. DOI: 10.4438/1988-592X-RE-2011-362-153
Quesada-Serra, V., Rodríguez-Gómez, G. & Ibarra-Sáiz, M.S. (In evaluation). What are we missing? Spanish lecturers’ perceptions of their assessment practices.
Rodríguez-Gómez, G., Ibarra-Sáiz, M.S., Dodero-Beardo, J.M., Gómez-Ruiz, M.A., Gallego-Noche, B., Cabeza-Sánchez, D. & Quesada-Serra, V. 2009. Developing the e-Learning-oriented e-Assessment. International Conference on Multimedia and Information and Communication Technologies in Education. Lisbon.
Williams, B., Brown, T. & Benson, R. 2012. Feedback in the digital environment. 125-139.

Author Information

Victoria Quesada-Serra (presenting / submitting)
Universidad de Cádiz
Puerto Real. Cádiz
Universidad de Cádiz, Spain
