09 SES 04.5 PS, General Poster Session
Admission committees of selective university programmes need information on the accuracy of admission tools and on the selection outcomes that result from them. Secondary school grade point average (SSGPA) appears to be the most logical predictor of academic achievement, but it is far from accurate. Research into other predictors is still a work in progress, and a gold standard has not yet been established (Richardson, Abraham, & Bond, 2012). In Europe, there are no standardised procedures comparable to the SAT (Scholastic Aptitude Test) or ACT (American College Testing) in North America. Depending on how access to university programmes is regulated in a given country for a given programme, programmes design their own admission procedures and tools. Assessing programme-specific competencies and knowledge, part of which is acquired during the admission procedure itself, is becoming increasingly common (Cortes, 2013; Sternberg, Bonney, Gabora, & Merrifield, 2012).
In signal detection theory, sensitivity and specificity together are used to calculate the accuracy of admission tools in discerning successful from unsuccessful students, which offers more information than the conventional percentage of explained variance (Van Ooijen-van der Linden, Van der Smagt, Woertman, & te Pas, 2016). In addition, this approach provides information on the cut-off score, or criterion, and allows analyses at the individual level, provided that accuracy is found to be stable across cohorts. The main question we investigate here is to what extent different admission tools predict academic success for individual cases.
Signal detection theory (Green & Swets, 1966; Macmillan & Creelman, 2005; Stanislaw & Todorov, 1999) distinguishes the two possible incorrect decisions: admitting applicants who will fail (‘false alarms’) and rejecting applicants who would have been successful (‘misses’). The two possible correct decisions are admitting students who will be successful (‘hits’) and rejecting those who will fail (‘correct rejections’). Calculating the hit rate and the false-alarm rate for a set of tools yields the accuracy of that set as a whole and thus allows the comparison of different sets of tools.
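The four decision outcomes and the two rates can be sketched in a few lines of code. This is a minimal illustration with synthetic data; the function name, the example lists, and the resulting numbers are our own assumptions, not data or tooling from the study.

```python
# Sketch of the four signal-detection outcomes for a single admission tool.
# 'admitted' = would the tool admit this applicant at the chosen criterion?
# 'successful' = did (or would) the applicant succeed in the programme?

def sdt_rates(admitted, successful):
    """Return (hit rate, false-alarm rate) from parallel boolean lists."""
    hits = sum(a and s for a, s in zip(admitted, successful))
    misses = sum((not a) and s for a, s in zip(admitted, successful))
    false_alarms = sum(a and (not s) for a, s in zip(admitted, successful))
    correct_rejections = sum((not a) and (not s) for a, s in zip(admitted, successful))

    hit_rate = hits / (hits + misses)                  # sensitivity
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    return hit_rate, fa_rate                           # specificity = 1 - fa_rate

# Illustrative cohort of six applicants (invented values).
admitted   = [True, True, True, False, False, False]
successful = [True, True, False, True, False, False]
hr, far = sdt_rates(admitted, successful)
print(hr, far)   # hit rate 2/3, false-alarm rate 1/3
```

Comparing two candidate tool sets then reduces to comparing their (hit rate, false-alarm rate) pairs on the same cohort.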
Equal sensitivity (hit rate) and specificity (1 − false-alarm rate) of different admission tools does not necessarily mean that these tools admit the same individuals. An individual case might be a ‘hit’ using SSGPA but a ‘miss’ using an admission test score; for another individual, it might be the other way around. Each admission tool may therefore yield unique hits and unique false alarms. Given the percentages of successful and unsuccessful students in a cohort, a set of admission tools can be compared on the percentages of unique hits and unique false alarms each tool within the set yields. Successful students who would not have been admitted by any of the tools at the set criteria are ‘total misses’, and unsuccessful students who would not have been admitted by any of the tools at the set criteria are ‘total correct rejections’. These detailed comparisons of tools and criteria are informative for any selective programme. We investigated the applicability of different admission tools to the successful selection of individual students enrolled in a Dutch general psychology bachelor programme.
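The individual-level comparison described above can be sketched as follows. All scores, cut-offs, and counts are invented for illustration; the study's actual tools and criteria are not reproduced here.

```python
# Illustrative comparison of two admission tools (SSGPA and a test score)
# at the individual level, using synthetic data and arbitrary cut-offs.

students = [
    # (ssgpa, test_score, successful)
    (7.5, 60, True),    # admitted only via SSGPA -> unique hit for SSGPA
    (6.0, 85, True),    # admitted only via test  -> unique hit for the test
    (8.0, 40, False),   # admitted via SSGPA, fails -> false alarm
    (5.5, 90, False),   # admitted via test, fails  -> false alarm
    (5.0, 50, True),    # admitted by neither, succeeds -> total miss
    (6.5, 70, False),   # admitted by neither, fails -> total correct rejection
]
ssgpa_cutoff, test_cutoff = 7.0, 75

unique_hits_ssgpa = unique_hits_test = 0
total_misses = total_correct_rejections = 0
for ssgpa, test, success in students:
    by_ssgpa = ssgpa >= ssgpa_cutoff
    by_test = test >= test_cutoff
    if success and by_ssgpa and not by_test:
        unique_hits_ssgpa += 1          # only SSGPA admits this successful student
    if success and by_test and not by_ssgpa:
        unique_hits_test += 1           # only the test admits this successful student
    if success and not (by_ssgpa or by_test):
        total_misses += 1               # no tool in the set admits them
    if (not success) and not (by_ssgpa or by_test):
        total_correct_rejections += 1   # no tool admits them, correctly

print(unique_hits_ssgpa, unique_hits_test,
      total_misses, total_correct_rejections)   # 1 1 1 1
```

Counting unique hits and unique false alarms per tool in this way shows what each tool contributes beyond the rest of the set, even when the tools' overall sensitivity and specificity are equal.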
Cortes, C. M. (2013). Profile in Action: Linking Admission and Retention. New Directions for Higher Education, (161), 59–69. http://doi.org/10.1002/he.20046
Sternberg, R. J., Bonney, C. R., Gabora, L., & Merrifield, M. (2012). WICS: A model for college and university admissions. Educational Psychologist, 47(1), 30–41. http://doi.org/10.1080/00461520.2011.638882
Van Ooijen-van der Linden, L., Van der Smagt, M. J., Woertman, L., & te Pas, S. F. (2016). Signal detection theory as a tool for successful student selection. Assessment & Evaluation in Higher Education, 1–15. http://doi.org/10.1080/02602938.2016.1241860