How to Assess Expertise and Expert Learning in Work-Based Higher Education?
Author(s):
Conference:
ECER 2017
Format:
Paper

Session Information

Paper Session

Time:
2017-08-23
09:00-10:30
Room:
K5.19
Chair:
Georgeta Ion

Contribution

In today’s world, rapid changes in working life and society, the fast development of technology and changes in the organisation of work call for recognising the value of learning throughout the working career. The European Union has long promoted lifelong learning and the necessity of continuously developing knowledge and skills (Collin, Van der Heijden & Lewis, 2012; Tynjälä, 2013). In Finland, a new type of professional postgraduate education - specialisation education - has recently been created. Specialisation education is designed to support the professional development and specialisation of the participants and is intended for higher education graduates who already have work experience. It is provided by universities and universities of applied sciences, and the studies amount to a minimum of 30 ECTS. The studies build on the research and development competencies of the universities, and they aim to meet the needs of working life and to develop its practices.

The purpose of this study is to examine the assessment of expertise and expert learning, and to gain insight into how assessment would best be organised in work-based higher education. Miller’s pyramid (1990) is a well-known and widely used framework for the assessment of professional competence. The model consists of four consecutive layers that denote increasing competence, together with the assessment tools best suited for each layer. The bottom layer deals with factual knowledge (knows) and the next layer with knowing how to apply that knowledge (knows how). The third layer assesses performance in a simulated or laboratory environment (shows how), and the top layer deals with performance in authentic circumstances (does) (Tigelaar & van der Vleuten, 2014). Single assessment methods are not considered sufficient to capture the complex nature of competence. Thus, Miller’s pyramid calls for a programmatic approach to assessment, combining multiple assessment methods within a programme (Baartman, Bastiaens, Kirschner & van der Vleuten, 2006; Kaslow et al., 2007; van der Vleuten & Schuwirth, 2005). Assessment is also context dependent, and professional competence should therefore preferably be assessed in authentic professional practice settings (Tigelaar & van der Vleuten, 2014).

Fenwick (2014, 1278) states that when determining the basis for assessment, attention should also be paid to how learning is perceived. Thus, central to the assessment of expertise is understanding how experts learn, how expertise develops and which elements constitute the basis of expertise. Research has shown that connective learning environments, boundary crossing and the integration of theoretical and practical knowledge support the development of expertise (Guile & Griffiths, 2001; Griffiths & Guile, 2003; Tynjälä, 2008). According to Tynjälä (2003; 2008; 2013), professional expertise can be described as consisting of four elements: theoretical or conceptual knowledge, practical or experiential knowledge, self-regulative knowledge and sociocultural knowledge. Tynjälä proposes that the key to expertise development is the integration of these components of expert knowledge. This can be achieved through pedagogical approaches such as work-related project-based learning, problem-based learning, or reflecting on work experience with the help of theoretical tools.

Method

The data for this study was collected using a combination of the Delphi method and empathy-based stories. The Delphi method is a collaborative forecasting and problem-solving method that relies on a panel of experts. During the Delphi process, research data is collected from expert panelists who anonymously answer questionnaires in two or more rounds. As the process progresses, the panel is expected to converge towards consensus. However, instead of seeking consensus, the focus can also be placed on finding many alternative and well-grounded ideas for future development (Landeta, 2006). Empathy-based stories are short stories written by the respondents on the basis of the images evoked by a frame story. The idea is to vary one element of the frame story and to observe how the respondents’ stories change as the frame story is varied. The Delphi panel for this study consisted of 36 experts from five interest groups: 1) researchers of workplace learning and expertise (n=8), 2) designers of specialisation education (n=6), 3) experts of higher education policies (n=6), 4) the working life sector (n=11), and 5) students of long postgraduate programmes (n=5). The study included two Delphi rounds, with the empathy-based stories forming part of the first round. In the empathy-based stories, the respondents wrote short stories describing what assessment was like in a specialisation education in the year 2025 when the participant was a) satisfied or b) dissatisfied with the assessment during the studies. The Delphi data was collected using web-based Delphi software, eDelfoi. The panelists responded to 10 statements in the first round and 7 statements in the second. All of the statements concerned specialisation education and/or assessment in the studies and were set in the year 2025 so that the panelists could break loose from the present state of the studies.
For each statement, the panelists first rated on a Likert scale how probable/not probable and how preferable/not preferable it was for the statement to be realised. The panelists were also asked to write down arguments and reasoning to support their answers, to anonymously comment on each other’s answers, and to revise their own answers based on the comments of fellow panelists.

Expected Outcomes

The analysis is ongoing, but the preliminary results indicate that assessment in work-based higher education should take place hand in hand with working life. Representatives of working life should be engaged in the whole assessment process: in the planning phase, in setting goals for learning, and closely in the assessment itself both during the studies and in the final evaluation. The top level of Miller’s pyramid suggests that performance should also be assessed under authentic circumstances, and van der Vleuten et al. (2010) note that assessment at the top level is predominantly assessment in the workplace. According to this study, assessment in work-based higher education is best carried out via work-related development projects in which participants use the theoretical and conceptual tools gained through their studies to address authentic problems related to their own work. The integration of theoretical and practical knowledge is also crucial to expertise development (Tynjälä, 2008). The data further suggests that assessment in work-based higher education should be viewed as a continuous process during which expertise is assessed on a regular basis, from multiple points of view and by several assessors (including the participants themselves). As part of assessment, close attention should also be paid to the recognition of the participants’ prior learning: competencies should be acknowledged even when they have not been acquired through formal education. According to this study, effective assessment should be reliable (multiple assessment points and several assessors), and the assessment process should always promote the learning of the participant and provide tools for professional development.

References

Baartman, L.K.J., Bastiaens, T.J., Kirschner, P. & van der Vleuten, C. (2006). The wheel of competency assessment: Presenting quality criteria for competency assessment programs. Studies in Educational Evaluation, 32, 153–170.
Collin, K., Van der Heijden, B. & Lewis, P. (2012). Continuing professional development. International Journal of Training and Development, 16(3), 155–163.
Fenwick, T.J. (2014). Assessment of Professionals’ Continuous Learning in Practice. In S. Billett, C. Harteis & H. Gruber (Eds.), International Handbook of Research in Professional and Practice-based Learning (pp. 1271–1297). Dordrecht: Springer.
Griffiths, T. & Guile, D. (2003). A connective model of learning: the implications for work process knowledge. European Educational Research Journal, 2(1), 56–73.
Guile, D. & Griffiths, T. (2001). Learning through work experience. Journal of Education and Work, 14(1), 113–131.
Kaslow, N., Bebeau, M., Lichtenberg, J., Portnoy, S., Rubin, N., Leigh, I., Nelson, P. & Smith, L. (2007). Guiding Principles and Recommendations for the Assessment of Competence. Professional Psychology: Research and Practice, 38, 441–451.
Landeta, J. (2006). Current validity of the Delphi method in social sciences. Technological Forecasting & Social Change, 73, 467–482.
Miller, G. E. (1990). The assessment of clinical skills/competence/performance. Academic Medicine, 65(9), 63–67.
Tigelaar, D. & van der Vleuten, C. (2014). Assessment of Professional Competence. In S. Billett, C. Harteis & H. Gruber (Eds.), International Handbook of Research in Professional and Practice-based Learning (pp. 1237–1270). Dordrecht: Springer.
Tynjälä, P., Välimaa, J. & Sarja, A. (2003). Higher Education, 46(2), 147–166.
Tynjälä, P. (2008). Perspectives into learning at the workplace. Educational Research Review, 3, 130–154.
Tynjälä, P. (2013). Toward a 3-P Model of Workplace Learning: a Literature Review. Vocations and Learning, 6, 11–36.
van der Vleuten, C. & Schuwirth, L. (2005). Assessing professional competence: from methods to programmes. Medical Education, 39, 309–317.
van der Vleuten, C., Schuwirth, L., Scheele, F., Driessen, E. & Hodges, B. (2010). The assessment of professional competence: building blocks for theory development. Best Practice & Research Clinical Obstetrics and Gynaecology, 24, 703–719.

Author Information

Susanna Mikkonen (presenting / submitting)
University of Tampere
Faculty of Education
Tampere
Petri Nokelainen (presenting)
Tampere University of Technology, Finland
University of Turku, Finland
