Session Information
02 SES 12 A, Recent Developments in Assessment in VET
Paper Session
Contribution
There have been recurrent debates about assessment design in vocational qualifications, both in the UK and internationally, dividing opinion on which methods of assessment should be used (e.g., portfolios) and who should mark and set them. In England, the education system is currently undergoing governmental reform. The Department for Education (DfE) now requires vocational qualifications (taken by 14-19 year olds) to contain a certain amount of external assessment and to involve employers in the delivery of the course and/or the assessment (DfE, 2015a, 2015b).
Employers are key stakeholders in vocational qualifications; their opinions affect a candidate's entry into, and progression in, the labour market (Wolf, 2011). To ensure that qualifications are valued by employers, it is important to explore the views of current employers on assessment design and to examine the extent to which those views align with current practices, reforms and theoretical perspectives. It is also useful to understand the extent to which their opinions converge with assessment practices internationally, as this would help gauge the feasibility of adopting preferred approaches.
A wide variety of assessment methods has been used in vocational qualifications in the UK. Methods have often been chosen because of specific theoretical perspectives on the construct of vocational understanding and on pedagogy. For example, performance assessment has been a dominant method in work-based qualifications (Johnson, 2008). These qualifications are rooted in a performance-based view of vocational competence, which argues that a person's competence can be assessed exclusively through their performance on occupational tasks (e.g., Jessup, 1991). This approach to vocational qualifications has been criticised as reductionist in its view of vocational competence and assessment methods. For example, Hodkinson (1992) argues for an interactive view of vocationally related understanding that is assessed with a range of methods, including tests of performance as well as learning logs, journals and written tests. Accordingly, VET courses in many European countries, as well as in the Americas and Australia, include a combination of written, practical and oral assessments, although the specific tasks vary across countries and qualifications.
Opinions on assessment design may also be influenced by the distinction between skills and knowledge. Bathmaker (2013) notes that vocational education in England has been focused on skills but "there is now a growing interest amongst researchers in the question of 'knowledge' in vocational education" (p. 88). This shift in emphasis may have implications for the kinds of assessment that are valued by employers.
Less research attention has been paid to who should mark and set VET assessments. Currently, many vocational courses in the UK are teacher-assessed. The UK government has recently specified that a certain amount of assessment should be set and marked externally; that is, it should not be set or marked by the centre that delivers the course to the students. International VET systems employ a diverse range of markers and setters. Some countries use a single type of marker and setter for all of their assessment methods (i.e., all assessment is either external or internal), whereas in other countries externality varies across assessment methods.
The aim of this study was to conduct a preliminary exploration of employers' opinions on different assessment methods, markers and setters. Employers' perceptions of assessment may influence their opinions about the qualifications, in turn affecting the value of those qualifications in the labour market. The findings should help stimulate discussion about assessment design and about how to ensure that employers value the qualifications offered to students. Methodologically, this preliminary study also provided insight into how feasible it is to engage employers in research on vocational qualifications.
Method
Expected Outcomes
References
Bathmaker, A.-M. (2013). Defining 'knowledge' in vocational education qualifications in England: An analysis of key stakeholders and their constructions of knowledge, purposes and content. Journal of Vocational Education & Training, 65(1), 87-107.
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77-101.
DfE. (2015a). Technical awards for 14 to 16 year olds: 2017 and 2018 performance tables: Technical guidance for awarding organisations. England, United Kingdom: Department for Education.
DfE. (2015b). Vocational qualifications for 16 to 19 year olds: 2017 and 2018 performance tables: Technical guidance for awarding organisations. England, United Kingdom: Department for Education.
Hodkinson, P. (1992). Alternative models of competence in vocational education and training. Journal of Further and Higher Education, 16(2), 30-39.
Jessup, G. (1991). Outcomes: NVQs and the emerging model of education and training. London: Falmer Press.
Johnson, M. (2008). Assessing at the borderline: Judging a vocationally related portfolio holistically. Issues in Educational Research, 18(1), 26-43.
Wolf, A. (2011). Review of vocational education: The Wolf report. Department for Education and Department for Business, Innovation & Skills.