Session Information
09 SES 16 B, Assessment in Teacher Education
Paper Session
Contribution
Higher education institutions employ different methods for admitting suitable candidates. Candidates may be selected before they begin their studies, which is the common approach, or during their initial years of study (Owen, Quirk, & Rodolfa, 2014).
The major selection tools employed in admitting candidates for general undergraduate study are high school grades and, in some countries, the candidate’s score on a psychometric entrance exam. Using both tools increases the prediction of academic success (Oren, Kennet-Cohen, Turvall, & Allalouf, 2014).
In addition to these two criteria, other selection tools are customarily used, primarily for the applied professions. Among the most prominent of these is the admission interview, which is extremely common in candidate selection for programs in medicine, social work, and teacher training.
Many studies of candidate selection focus on the validity and reliability of matriculation scores and psychometric test scores and indicate the high validity of these tools in predicting academic success (e.g., Kennet-Cohen, 2016; Oren et al., 2014). These predictors have nevertheless elicited substantial debate and criticism, particularly with regard to their ability to predict degree completion and dropout (Ben-David & Shaor, 2012). A large-scale study found that the relationship between these two predictors, matriculation scores and psychometric test scores, and the probability of degree completion is weak, because numerous variables unrelated to academic ability, such as motivation, socioeconomic background, and family status, influence students’ chances of completing a degree or dropping out (Haimovitch & Ben Shahar, 2004).
As for the interview, studies have demonstrated relatively low predictive validity with regard to academic performance, but a capacity to predict non-academic parameters such as interpersonal communication skills, empathy, and maturity (Wilson, Roberts, Flynn, & Griffin, 2012; Pau, Chen, Lee, Sow, & De-Alwis, 2016). For teacher-training candidates in particular, research reports that admission interviews have relatively high predictive capacity with regard to students’ performance in practical work (Byrnes, Kiger, & Schechtman, 2000; Orland-Belleli & Mazor, 1996).
During the 2013/14 academic year, a major teacher-training college in Israel resolved to formulate a new interview selection tool. The tool was constructed in light of recommendations in the literature (Levy-Feldman & Nevo, 2013) and with the participation of the faculty deans and the heads of the college’s departments. It combines individual and group components and was administered to teaching candidates for undergraduate programs in three of the college’s four major faculties: the Faculty of Education, the Teaching Science Faculty, and the Teaching Humanities and Social Sciences Faculty. Owing to the unique selection procedure implemented by the Faculty of Arts, that faculty did not use the tool.
The aim of this study was to examine the quality of the new interview selection tool, primarily in order to determine whether it has added value over the two tools already in use (matriculation and psychometric scores) that would justify the effort and investment involved in conducting it.
The quality of the new interview selection tool was examined in three ways: (1) by assessing the ability of the interviewers using the new tool to distinguish between candidates of different academic status; (2) by considering the tool’s added value in comparison to other admission tools; and (3) by examining the tool’s capability to predict student grades.
The study was longitudinal and was conducted over a three-year period, from the 2013/14 through the 2015/16 academic years.
Method
The study population for the first and second research questions consisted of all of the college’s undergraduate students who were candidates for the teaching profession in the three major faculties, across three cohorts spanning the 2014/15 through 2016/17 academic years. The study population for the third research question consisted of students in the Faculty of Education. The interview tool has core and optional components so that it can be adapted to the college’s diverse array of departments. The core components are: (i) Motivation and Sense of Educational Mission; (ii) Oral Expression; (iii) Written Expression; (iv) Interpersonal Communication; (v) Presence; and (vi) Reading Skills. The optional components are: (i) Maturity; (ii) Curiosity and Desire to Learn; (iii) Creativity; (iv) Reflective Ability; (v) Planning, Discretion, and Flexible Thinking; and (vi) Demeanor. The score awarded for each component ranges from 0 to 100, and the candidate’s final score is the average of the components specified for that candidate. Most of the college’s departments in the three faculties conduct an interview of 1.5 to 3.5 hours that includes both the individual and the group parts. Interviews are conducted by at least two staff members with a maximum of 12 candidates, and take place after the candidates’ scores on the other selection tools have been sent to the college’s admission office. The interviewers are senior staff members and department heads experienced in conducting both group and individual interviews.
Variables: The following background information was collected for each candidate: national ID number, gender, age, registered faculty and department/specialization, average matriculation score (range: 70-120), psychometric exam score (range: 200-800), score on each of the interview components, and final interview score.
Additionally, information was collected on each candidate’s enrollment status: whether the candidate was accepted, rejected, or accepted with 'special status' (conditional acceptance for borderline candidates), and whether accepted candidates eventually enrolled in the college. Information on student grades was collected only from the Faculty of Education, as it is the largest faculty in the college and the information was available to the researchers. These grades include the average grade for the year, field-practice grades, the grade in pedagogical instruction, and grades in various courses taught in all of the faculty’s departments.
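The scoring scheme described above, component scores from 0 to 100 averaged over whichever core and optional components a department specifies for a candidate, can be sketched as follows. This is a minimal illustration assuming equal weighting of components, as the description implies; the function name and example scores are hypothetical:

```python
def final_interview_score(component_scores):
    """Average the interview component scores specified for a candidate.

    component_scores: dict mapping component name -> score (0-100).
    Assumes equal weighting of all specified components, per the
    tool's description; any other weighting scheme is not documented.
    """
    if not component_scores:
        raise ValueError("at least one component score is required")
    for name, score in component_scores.items():
        if not 0 <= score <= 100:
            raise ValueError(f"score for {name!r} out of range 0-100")
    return sum(component_scores.values()) / len(component_scores)

# Hypothetical candidate assessed on four core components and one optional:
scores = {
    "Motivation and Sense of Educational Mission": 85,
    "Oral Expression": 78,
    "Written Expression": 90,
    "Interpersonal Communication": 82,
    "Creativity": 70,  # optional component
}
print(final_interview_score(scores))  # 81.0
```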
Expected Outcomes
The main findings indicate that use of the new interview tool improved the selection of qualified candidates, especially borderline candidates who would have been rejected had only the matriculation and psychometric admission tools been used, and who could be accepted only in light of their non-academic qualities. A logistic regression model revealed the new interview tool to be the sole statistically significant predictor of enrollment in the college: candidates who passed the admission interview had 1.34 times the odds of being accepted. The model explained almost 20% of the variance and correctly classified approximately 90% of those examined. In addition, the interview score was the sole statistically significant predictor in the model forecasting student field-practice grades, particularly in the first year of study, explaining 20% of the variance in those grades. It is hoped that, over time, interviewers will develop a deeper understanding of the tool’s strengths, particularly its capacity to evaluate non-academic aspects of candidates that cannot be assessed by the common selection tools, and will accumulate experience in assessing candidates with it, so that the percentage of explained variance will increase. Even so, admission interviews in general, and the tool considered in this study in particular, have been found to have a statistically significant advantage in predicting student field-practice grades, which reflect non-academic abilities and skills that have been found important in the study of many applied professions.
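To make the reported odds ratio concrete: in a logistic regression, an odds ratio of 1.34 multiplies a candidate's odds of the outcome by 1.34. The sketch below is illustrative only; the study's actual model, coefficients, and baseline rate are not reproduced here, and the baseline probability used is an assumed value:

```python
def shift_probability(p_baseline, odds_ratio):
    """Apply an odds ratio to a baseline probability.

    Converts the probability to odds, multiplies by the odds ratio,
    and converts back. Illustrates how a logistic-regression odds
    ratio (e.g. the 1.34 reported for the interview tool) changes a
    predicted probability; the baseline is an assumption, not a
    figure from the study.
    """
    odds = p_baseline / (1 - p_baseline)
    new_odds = odds * odds_ratio
    return new_odds / (1 + new_odds)

# Assumed baseline probability of 0.50 (odds of 1.0):
p = shift_probability(0.50, 1.34)
print(round(p, 3))  # odds rise to 1.34, probability rises to about 0.573
```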
This finding further corroborates the conclusion that the new interview admission tool adds value in comparison to the other tools and allows a more in-depth, well-founded discussion of exceptions, that is, candidates who do not meet the strict grade-based admission requirements, and therefore justifies the effort invested in it.
References
1. Ben-David, N., & Shaor, T. (2012). Would an increase in the psychometric entrance threshold score improve success in the social sciences at the B.A. level? Economic Quarterly, 59(1-2), 51-77 (in Hebrew).
2. Byrnes, D. A., Kiger, G., & Schechtman, Z. (2000). Evaluating the use of group interviews to select students into teacher-education programs. Paper presented at the Annual Meeting of the American Educational Research Association, New Orleans, LA, April 24-28, 2000. http://journals.sagepub.com/doi/pdf/10.1177/0022487102250310
3. Haimovitch, T., & Ben Shahar, G. (2004). The matriculation exam and the psychometric entrance test as predictors of degree completion and dropout. Megamot, 43(3), 446-470 (in Hebrew).
4. Kennet-Cohen, T. (2016). Reliability of the psychometric exam and its ability to predict successful academic studies. Report of the National Center for Testing and Evaluation. https://nite.org.il/files/rel_val.pdf Accessed 12 April 2017 (in Hebrew).
5. Levy-Feldman, I., & Nevo, D. (2013). Perception regarding the accomplished teacher among teacher educators in “research oriented” and “teaching oriented” institutes in Israel. Studies in Educational Evaluation, 39(3), 153-160.
6. Pau, A., Chen, Y. S., Lee, V. K. M., Sow, C. F., & De-Alwis, R. (2016). What does the multiple mini-interview have to offer over the panel interview? Medical Education Online, 21. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4752591/ Accessed 27 March 2018.
7. Oren, C., Kennet-Cohen, T., Turvall, E., & Allalouf, A. (2014). Demonstrating the validity of three general scores of PET in predicting higher education achievement in Israel. Psicothema, 26(1), 117-126.
8. Orland-Belleli, I., & Mazor, E. (1996). Can we foresee? Gauging students’ future success or failure. Trends, 5, 34-43.
9. Owen, J., Quirk, K., & Rodolfa, E. (2014). Selecting graduate students: Doctoral programs and internship admission. In N. J. Kaslow & W. B. Johnson (Eds.), The Oxford Handbook of Education and Training in Professional Psychology (p. 237). Oxford: Oxford University Press. http://www.oxfordhandbooks.com/view/10.1093/oxfordhb/9780199874019.001.0001/oxfordhb-9780199874019
10. Wilson, I. G., Roberts, C., Flynn, E. M., & Griffin, B. (2012). Only the best: Medical student selection in Australia. Medical Education, 196(5), 1-5.