Session Information
09 SES 05.5 A, General Poster Session
Contribution
Survey data from International Large-Scale Assessments (ILSAs) provide valuable information for governments, institutions, and the general public. High response rates are an important indicator of the reliability and quality of a survey; conversely, low response rates in ILSAs can threaten the inferential value of the survey method. ILSA data are highly valued by the Ministries of Education of participating countries as a guide to inform policy-making.
An important ILSA is the Programme for International Student Assessment (PISA), which assesses the performance of 15-year-old students in reading, mathematics, and science. First administered in 2000, PISA has been implemented every three years since. Meeting the response rate thresholds specified for a low-stakes test such as PISA has often proven to be a challenge for many participating countries (Ferraro et al., 2009). In the PISA 2022 cycle, an elevated number of countries were required to undertake a Non-Response Bias Analysis (NRBA) due to low response rates (OECD, 2023). Ireland has participated in PISA since the first cycle in 2000 and had consistently met the response rate standards at both student and school level until 2022, when it failed to meet the student response rate standard. This leads us to the main research question: ‘Why was there a change in the student response rate between PISA 2018 and 2022 in Ireland?’
Two major differences were observed between 2018 and 2022: a move from spring to autumn testing, and the COVID-19 pandemic. In Ireland, the PISA 2018 Main Study took place in spring (March/April) and was followed by a Feasibility Study in autumn (October/November). The purpose of the Feasibility Study was to evaluate the possibility of moving testing in Ireland to autumn, and in 2022, for the first time in PISA, testing took place in the autumn. Secondly, while school restrictions were no longer in place in Ireland during testing in 2022, there was still a level of disruption in the school environment associated with the COVID-19 pandemic.
Various theories have been proposed to understand response rates and why some people participate in surveys and others do not. For example, the theory of cognitive dissonance, which according to Festinger (cited in Miller, Clark, & Jehle, 2015), suggests that reducing the lack of agreement between a person’s attitudes and actions is an important factor in whether that person will respond to a survey. Alternatively, the theory of commitment or involvement suggests that the nature of the first request in the ‘foot in the door’ technique may have a significant effect on participation (Freedman & Fraser, 1966). However, Self-Determination Theory (SDT) may provide a theoretical framework for examining the role that motivation (extrinsic/intrinsic) may play in determining response rates. According to Deci and Ryan (1985), three factors that support motivation are competence, autonomy, and relatedness; these are seen as essential psychological needs that guide behaviour. Wenemark et al. (2011) used SDT to redesign a health-related survey in an effort to improve response rates. In a similar vein, this study will use SDT to examine the change in the student response rate between PISA 2018 and 2022 in Ireland.
While the focus of this poster is on the changing response rates in PISA in Ireland, the implications of the findings will be relevant to other countries participating in similar ILSAs. With the number of countries experiencing lower response rates in PISA 2022 at an unprecedented level, it is of urgent importance that countries begin to understand and address the complex reasons behind falling response rates in order to maintain the reliability and quality of these kinds of studies.
Method
A case study of Ireland’s procedures in administering ILSAs such as PISA will be undertaken to examine the research question ‘Why was there a change in the student response rate between PISA 2018 and 2022 in Ireland?’ The research will draw on Ireland’s participation in four separate ILSA administrations: the PISA 2018 Main Study (spring), the PISA 2018 Feasibility Study (autumn), and PISA 2022 (autumn), with reference to the Trends in International Mathematics and Science Study (TIMSS) 2023. The inclusion of TIMSS 2023 allows us to consider a second post-COVID reference point. The adoption of a case study as a research strategy allows for several techniques of data collection, such as the study of documents used (e.g. letters/manuals/webinars), logs of procedures and communications from the initial contact with schools to the day of testing, as well as conversations with ILSA project managers. The case study will be descriptive (in describing the processes employed) and explanatory (in attempting to explain why there was a change in response rates). The analysis will be two-fold. The first step will consider operational issues such as the changed circumstances brought about by the COVID-19 school closures, the introduction of data protection legislation, and the switch to autumn testing. Changes in procedures and processes between the four ILSA administrations will be recorded, categorised, and then evaluated. In the second step, the recorded and categorised processes will be analysed in relation to motivational theory. The various constructs of motivational theory, such as extrinsic/intrinsic motivation, will be applied, and the factors that influence motivation (competence, autonomy, and relatedness) will also be considered. This two-fold process will give rise to insights not only on important operational changes (in the first step), but will also shed light on the motivations of students, school staff, and test administrators (in the second step).
Ultimately, conducting the analysis in this manner will assist in understanding possible links between motivation and participation. Furthermore, this methodology may allow for the development of useful strategies that could assist future administrations of ILSAs in meeting the specified response rates.
Expected Outcomes
The initial results highlight a number of differences between the four ILSA administrations at the empirical level. In PISA 2022, a higher rate of absence was recorded amongst students, more test dates needed to be rescheduled due to scheduling conflicts within schools, and a higher rate of parental refusal was observed. These observations will be further examined using motivational theory. Examining processes and procedures using motivation theory has already gone some way towards explaining the change in response rates between 2018 and 2022. For example, a theme identified in a thematic analysis of semi-structured interviews with principals in the PISA 2018 autumn study indicated that if there were more ‘buy in’ from teachers, students, and parents, there would not be an issue with response rates. The ‘buy in’ is an indication of a person being motivated to take on a task, in this case participating in an ILSA. On foot of this initial analysis, we consider the change in response rates to be attributable to a combination of logistical and motivational factors. We consider motivation theory to be a valuable tool in the analysis of participation, given that ILSAs are low-stakes tests at the student level (though the stakes are higher at the system level). In an effort to maintain response rates at the required levels, project managers could consider employing strategies that not only address logistical factors, but that also give due consideration to the part that motivational factors may play in response rates. These strategies may ultimately provide a useful tool for project managers in administering ILSAs.
References
Deci, E. L., & Ryan, R. M. (1985). Intrinsic motivation and self-determination in human behavior. New York: Springer.
Ferraro, D., Kali, J., & Williams, T. (2009). Program for International Student Assessment (PISA) 2003: U.S. nonresponse bias analysis (NCES 2009-088). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education. Retrieved from: https://nces.ed.gov/pubs2009/2009088.pdf
Freedman, J. L., & Fraser, S. C. (1966). Compliance without pressure: The foot-in-the-door technique. Journal of Personality and Social Psychology, 4(2), 195-202.
Miller, M. K., Clark, J. D., & Jehle, A. (2015). Cognitive dissonance theory (Festinger). The Blackwell Encyclopedia of Sociology, 1, 543-549.
OECD (2023). PISA 2022 technical report. Paris: OECD Publishing. https://www.oecd.org/pisa/data/pisa2022technicalreport/
Wenemark, M., Persson, A., Brage, H. N., Svensson, T., & Kristenson, M. (2011). Applying motivation theory to achieve increased response rates, respondent satisfaction and data quality. Journal of Official Statistics, 27(2), 393-414.