11 SES 01, Beginning the Process as Educators
Value-added (VA) modelling aims to quantify the effect of pedagogical actions on students’ achievement, independent of students’ backgrounds (e.g. Braun, 2005); in other words, VA strives to model the value added by teaching. The conceptual idea is to identify the amount of “value” that teachers, schools, or other pedagogical actors have “added” to the evolution of students’ performance. VA is typically used for teacher and/or school accountability (e.g. Sanders, 2000). Although VA models have gained popularity in recent years—publications have increased substantially over the last decade—there is no consensus on how to calculate VA, nor on whether and which covariates should be included in the statistical models (e.g. Newton, Darling-Hammond, Haertel, & Thomas, 2010).
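To make the conceptual idea concrete, the following minimal sketch illustrates one common covariate-adjustment approach to VA: regress post-test scores on prior achievement and treat each class’s mean residual as its VA estimate. All class sizes, effect sizes, and score scales here are invented for illustration and do not come from ÉpStan or any cited study.

```python
import numpy as np

# Toy simulation: 4 classes of 100 students; "true" added value is invented.
rng = np.random.default_rng(0)
n_per_class, n_classes = 100, 4
true_va = np.array([-6.0, -2.0, 2.0, 6.0])            # hypothetical class effects

class_id = np.repeat(np.arange(n_classes), n_per_class)
prior = rng.normal(500, 100, n_per_class * n_classes)  # prior achievement
post = 0.7 * prior + true_va[class_id] + rng.normal(0, 20, prior.size)

# Step 1: regress post-test scores on prior achievement (OLS with intercept).
X = np.column_stack([np.ones(prior.size), prior])
beta, *_ = np.linalg.lstsq(X, post, rcond=None)

# Step 2: a class's VA estimate is its mean residual, i.e. how much better
# (or worse) its students scored than their prior achievement predicts.
residuals = post - X @ beta
va = np.array([residuals[class_id == c].mean() for c in range(n_classes)])
print(np.round(va, 1))
```

Note that this is only one of many VA specifications discussed in the literature; multilevel models and models with additional covariates (as compared in this contribution) differ in exactly the choices this sketch makes explicit.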
Levelling out the influence of background factors is of particular interest in diverse settings. The Luxembourgish school system is a useful example, as it is highly heterogeneous and multilingual: 63% of children in primary school and 52% of students in secondary school do not speak Luxembourgish at home (Ministry of National Education, Children and Youth, 2017). Additionally, the language of instruction in the Luxembourgish curriculum switches from Luxembourgish in preschool to German in primary school and to French for some subjects in secondary school. This represents a challenge for teachers, schools, and pupils. One aim of such a diverse school system should be to help students improve regardless of their background. It is therefore worthwhile to calculate and compare VA models in the Luxembourgish school setting.
This contribution has two purposes: In the first part, results from an exhaustive literature review will be presented. The objective of this literature review is to analyse the current status of the use of VA models in practice and research and to find similarities and differences. Concretely, the research questions are: (1) Which statistical models have been used? (2) Which variables have been included and which statistical adjustments (e.g. for measurement error or missing data) have been made? (3) Which statistical parameters (e.g. explained variance) have been reported?
In the second part of our contribution, we will present results from a quantitative analysis of VA models. It consists of calculations and comparisons of different VA models, using large-scale longitudinal data from the Luxembourg School Monitoring Programme “Épreuves Standardisées” (ÉpStan). In these calculations, we will consider the results of the literature review and empirically investigate the different model types we found. The objective is, on the one hand, to compare the quality of different statistical models and, on the other hand, to apply them in a multilingual and diverse setting.
For the literature review, we conducted a systematic literature search using the ERIC, Scopus, PsycINFO, and Psyndex databases. The search terms were “value added” and “added value”. Studies written in English, German, or French were considered. We rigorously classified 674 publications from 32 countries, using over 30 categories. Double coding was used to calculate interrater reliability. For the calculation of different VA models, we will rely on longitudinal large-scale data from the Luxembourg School Monitoring Programme ÉpStan. The ÉpStan assesses students’ academic competencies, language(s) spoken at home, learning motivation, and attitudes towards school at the beginning of each learning cycle of compulsory education (i.e., at the beginning of grade levels 1, 3, 5, 7, and 9). Each year, the entire student population in each of the concerned grade levels participates in the ÉpStan. The rich ÉpStan database provides an ideal basis for the empirical investigation of different VA modelling techniques (e.g. linear regression models and multilevel models; with or without prior achievement, etc.).
In the literature review, we found that half of the studies investigated VA models at the teacher level; the remainder looked at the school or principal level. 370 studies used empirical data to calculate VA models. Most of these studies described their covariates, but approximately 15% did not specify the model. Most studies used prior achievement as a covariate, but cognitive and/or motivational student data were almost never taken into consideration. Moreover, most of the studies did not adjust for methodological issues such as missing data or measurement error. To conclude, given the high relevance of VA—it is primarily used for high-stakes decisions—more transparency, rigour, and consensus are needed, especially concerning methodological details.

For the quantitative part, we expect to answer some of the questions raised by the literature review. We aim to identify differences between the model types and between the sets of included variables. We expect to find that prior achievement alone is not sufficient as a covariate and that background, language, and/or motivational variables will explain a significant additional amount of variance in the models. Especially in such a heterogeneous setting, it is important to consider student variables that teachers and/or schools cannot influence, such as the language(s) spoken at home. VA models could then help to identify effective pedagogical strategies in a diverse and multilingual context, while accounting for students’ language and background information.
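The expectation that background variables explain variance beyond prior achievement can be sketched as a simple model comparison: fit one model with prior achievement only and one that adds a language-at-home indicator, then compare their explained variance. This is a toy simulation with invented effect sizes and sample sizes, not ÉpStan data.

```python
import numpy as np

# Toy data: post-test depends on prior achievement AND on whether the
# instruction language is spoken at home (all coefficients invented).
rng = np.random.default_rng(1)
n = 1000
prior = rng.normal(500, 100, n)
home_lang = rng.integers(0, 2, n)          # 1 = instruction language at home
post = 0.6 * prior + 15 * home_lang + rng.normal(0, 25, n)

def r_squared(X, y):
    """R^2 of an OLS fit of y on X (intercept added internally)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_prior = r_squared(prior[:, None], post)                   # prior only
r2_full = r_squared(np.column_stack([prior, home_lang]), post)  # + language
print(round(r2_full - r2_prior, 3))  # variance added by the language covariate
```

In this simulated setup the gap in explained variance quantifies what the language covariate contributes over prior achievement alone; with real data, such comparisons are one way to decide which covariates a VA model should include.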
Braun, H. I. (2005). Using student progress to evaluate teachers: A primer on value-added models. Princeton, NJ: Educational Testing Service.
Ministry of National Education, Children and Youth. (2017). The key figures of the national education. Statistics and indicators 2015/2016. Luxembourg. Retrieved from http://www.men.public.lu/catalogue-publications/themes-transversaux/statistiques-analyses/chiffres-cles/2015-2016/15-16en.pdf
Newton, X., Darling-Hammond, L., Haertel, E., & Thomas, E. (2010). Value-added modeling of teacher effectiveness: An exploration of stability across models and contexts. Education Policy Analysis Archives, 18(23), 1–24. https://doi.org/10.14507/epaa.v18n23.2010
Sanders, W. L. (2000). Value-added assessment from student achievement data: Opportunities and hurdles. Journal of Personnel Evaluation in Education, 14(4), 329–339. https://doi.org/10.1023/A:1013008006096