Contribution
In the last three decades, assessment practices in education have made a transition from a testing culture, characterized by standardized tests focusing on measuring knowledge, to an assessment culture with competence-based assessments (CBAs) that focus on assessing a student's capability to integrate and apply relevant knowledge, skills and attitudes to perform professional tasks. This shift resulted in the development of various forms of CBAs that did not fit the traditional way of evaluating test quality. CBAs necessitated new ways of thinking about evaluating the quality of these more subjective, less standardized and performance-based assessments (Dierick & Dochy, 2001). This paper argues that evaluating the quality of CBAs requires (1) clear and contextualized descriptions of the characteristics of different CBAs, (2) new quality criteria and a clear operationalisation for applying them to assessments (Baartman et al., 2006), and (3) involvement of all stakeholders (Guba & Lincoln, 1989). In line with the hermeneutic approach to assessment of Guba and Lincoln (1989), we argue that the quality of a CBA depends on the agreement between stakeholder groups and their trust in the assessment as an appropriate instrument for evaluating a student's capability to perform in the labor market (Finucane et al., 2002).

The research questions in this study are: (1) How do students, teachers, practitioners (from the work field) and external assessment experts perceive the quality of CBAs on several crucial quality criteria, and do these groups differ or agree? (2) Can the assessment quality and differences in the perceived quality be explained by differences in the assessment scenarios?

Based on previous research (e.g., Baartman et al., 2006), twelve quality criteria for CBAs were selected: acceptance, authenticity, cognitive complexity, comparability, fairness, fitness for competence-based educational purposes, transparency, reproducibility of results, educational consequences, alignment to instruction, meaningfulness, and efficiency. A 5-point Likert scale quality questionnaire was developed in which every quality criterion was operationalised in three to five questions. Students, teachers and practitioners from two vocational education and training institutes and one higher vocational education school participated in this study. The schools represented three forms of CBA, which were described in detail on the basis of several sources of input, in order to highlight differences and similarities between the assessments and to allow two external assessment experts to rate the quality of the practices on the quality questionnaire. Stakeholder perceptions of the quality of the assessments were examined via the quality questionnaire and via semi-structured focus group and individual interviews. Cross-case analyses were conducted to analyze whether or not quality experiences could be attributed to specific assessment characteristics.

The results, described in a full paper, report the perceived quality per stakeholder group per assessment, compare the experienced quality of the three assessments, explore generalizations with respect to the quality of CBAs, and display relations between certain assessment scenarios and their quality. The discussion will elaborate on the usefulness of the twelve quality criteria and the surplus value of involving multiple stakeholders in the evaluation of the quality of new modes of assessment.

We intend to submit the full paper as an article to an international peer-reviewed scientific journal.

References
Baartman, L. K. J., Bastiaens, T. J., Kirschner, P. A., & Van der Vleuten, C. P. M. (2006). The wheel of competency assessment: Presenting quality criteria for competency assessment programmes. Studies in Educational Evaluation, 32, 153-170.
Dierick, S., & Dochy, F. (2001). New lines in edumetrics: New forms of assessment lead to new assessment criteria. Studies in Educational Evaluation, 27(4), 307-329.
Finucane, P. M., Baron, S. R., Davies, H. A., Hadfield-Jones, R. S., & Kaigas, T. M. (2002). Towards an acceptance of performance assessment. Medical Education, 36, 959-964.
Guba, E. G., & Lincoln, Y. S. (1989). Fourth generation evaluation. London: Sage.