Session Information
09 SES 07 B, Using the International Large-Scale Student Assessments’ Databases for Secondary Analysis (Part 2)
Research Workshop continues from 09 SES 06 B
Contribution
Please note that this is Part 2 of the workshop (1.5 hours), preceded by Part 1 (another 1.5 hours).
International large-scale student assessments have resulted in published datasets containing a wealth of information for secondary analysts. In addition to achievement data from students, these studies collect contextual data from students, their teachers, school principals and parents, in the hope that this information will contribute to a collective understanding of education systems and how to improve them. Unfortunately, much of this assessment data remains underutilized by policy makers and researchers alike, in part because proper analysis requires statistical methods that are unfamiliar to many researchers and data analysts (e.g. plausible values, sampling and replicate weights). In addition, there is currently a wide variety of studies covering diverse aspects of the school experience, some of them unknown to researchers and policy makers. The intent of this workshop is to introduce participants to the current and largest international large-scale student assessments in education, such as the Trends in International Mathematics and Science Study (TIMSS) (Martin, Mullis, Foy, & Stanco, 2012; Mullis, Martin, Foy, & Arora, 2012), the Programme for International Student Assessment (PISA) (OECD, 2013), the International Civic and Citizenship Education Study (ICCS) (Schulz, Ainley, Fraillon, Kerr, & Losito, 2010), the International Computer and Information Literacy Study (ICILS) (Fraillon, Schulz, & Ainley, 2013), and the Progress in International Reading Literacy Study (PIRLS) (Mullis, Martin, Foy, & Drucker, 2012), and to provide them with a set of tools to assist in unraveling the analytical complexities of these studies.
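To illustrate why these methods matter: even a seemingly simple statistic such as a mean achievement score must be computed once for each plausible value using the total sampling weight, and its standard error must combine a replicate-weight (jackknife) sampling variance with the variance across plausible values via Rubin's combination rule. The sketch below shows one way this could be done by hand. It is purely illustrative: the column names are hypothetical placeholders, it assumes a TIMSS/PIRLS-style JK2 jackknife (PISA's Fay-adjusted replication applies a different scaling factor to the replicate sum), and it is not the IDB Analyzer's own implementation, which automates these steps.

import numpy as np

def weighted_mean(values, weights):
    # Weighted mean of the scores for one plausible value.
    return np.sum(values * weights) / np.sum(weights)

def pv_mean_and_se(data, pv_cols, total_weight_col, rep_weight_cols):
    # data: dict of equal-length 1-D numpy arrays keyed by (hypothetical)
    # column names, e.g. pv_cols=["pv1", ..., "pv5"], total_weight_col="totwgt",
    # rep_weight_cols=["rwgt1", ..., "rwgt75"].
    w = data[total_weight_col]
    estimates = []       # one point estimate per plausible value
    sampling_vars = []   # one jackknife sampling variance per plausible value

    for pv in pv_cols:
        est = weighted_mean(data[pv], w)
        # Re-estimate with every replicate weight; the sum of squared
        # deviations from the full-sample estimate is the JK2-style
        # sampling variance (other studies scale this sum differently).
        rep_est = np.array([weighted_mean(data[pv], data[rw]) for rw in rep_weight_cols])
        sampling_vars.append(np.sum((rep_est - est) ** 2))
        estimates.append(est)

    m = len(pv_cols)
    point = np.mean(estimates)                   # combined point estimate
    within = np.mean(sampling_vars)              # average sampling variance
    between = np.var(estimates, ddof=1)          # variance between plausible values
    total_var = within + (1 + 1 / m) * between   # Rubin's combination rule
    return point, np.sqrt(total_var)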
This workshop has three main objectives:
The first objective is to provide an overview of the aims and theoretical frameworks of large-scale student assessments such as TIMSS, PIRLS, and PISA. The first part of the workshop will briefly review the available documentation for these studies.
The second objective is to review the studies’ methodological complexities, such as their sampling and assessment designs, and the implications of these for analysis.
The third objective is to train participants in analyzing data from these assessments while taking the studies’ complexities and design features into account. To this end, the course will provide hands-on training in analyzing data from these studies using software (provided free of charge by the course organizers) that handles all issues related to the analysis of large-scale assessment data. A new version of this software, the IDB Analyzer, to be released in early 2015, will be presented, introducing both improvements and new functionalities. Among these is logistic regression, with and without categorical independent variables, as well as interaction terms.
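As a rough indication of what such a logistic regression involves behind the scenes, the sketch below fits one weighted model per plausible value (achievement dichotomised against a benchmark cut score) and averages the coefficients. All names (sex, books_home, totwgt, the cut score of 475) are hypothetical placeholders, the code is not the IDB Analyzer's implementation, and design-correct standard errors would additionally require the replicate weights, which the IDB Analyzer handles automatically.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

def pooled_weighted_logit(df, pv_cols, cut_score, weight_col):
    # Fit one weighted logistic regression per plausible value and pool the
    # coefficients; column names are hypothetical placeholders.
    coef_sets = []
    for pv in pv_cols:
        work = df.copy()
        # Dichotomise achievement against a benchmark cut score.
        work["reached"] = (work[pv] >= cut_score).astype(int)
        model = smf.glm(
            # C(...) declares a categorical predictor; '*' adds the
            # interaction term as well as both main effects.
            "reached ~ C(sex) * C(books_home)",
            data=work,
            family=sm.families.Binomial(),
            var_weights=np.asarray(work[weight_col]),
        )
        coef_sets.append(model.fit().params)
    # Pooled point estimates: the average of the per-plausible-value coefficients.
    return pd.concat(coef_sets, axis=1).mean(axis=1)

# Example call (hypothetical data frame and column names):
# pooled_weighted_logit(df, ["pv1", "pv2", "pv3", "pv4", "pv5"],
#                       cut_score=475, weight_col="totwgt")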
This course is intended for graduate students, emerging researchers, and researchers at all levels who are interested in issues concerning international large-scale student assessments. Participants are required to bring a Windows laptop with SPSS 16 or higher installed and to have a basic working knowledge of inferential statistics.
The workshop is planned for three hours (two sessions, 1.5 hours each). The syllabus/planned workshop activities are as follows:
Hour 1 – Introduction to the student-based international large-scale assessments, their questionnaires, and data files
Hour 2 – Introduction to statistical complexities of using international large-scale student assessment data and overview of the IEA IDB Analyzer (IEA, 2014)
Hour 3 – Working through a guided practice exercise, analysis, and Q&A
References
Fraillon, J., Schulz, W., & Ainley, J. (2013). International Computer and Information Literacy Study: Assessment Framework. Amsterdam: IEA.
IEA. (2014). IEA IDB Analyzer (version 3.0) [computer software]. Hamburg: IEA Data Processing and Research Center.
Martin, M. O., Mullis, I. V. S., Foy, P., & Stanco, G. M. (2012). TIMSS 2011 International Results in Science. Chestnut Hill, MA: Lynch School of Education, Boston College.
Mullis, I. V. S., Martin, M. O., Foy, P., & Arora, A. (2012). TIMSS 2011 International Results in Mathematics. Chestnut Hill, MA: Lynch School of Education, Boston College.
Mullis, I. V. S., Martin, M. O., Foy, P., & Drucker, K. T. (2012). PIRLS 2011 International Results in Reading. Chestnut Hill, MA: TIMSS & PIRLS International Study Center, Boston College.
OECD. (2013). PISA 2012 Results: What Students Know and Can Do. Student Performance in Mathematics, Reading and Science (Vol. 1). Paris: OECD.
Schulz, W., Ainley, J., Fraillon, J., Kerr, D., & Losito, B. (Eds.). (2010). ICCS 2009 International Report: Civic Knowledge, Attitudes, and Engagement Among Lower Secondary School Students in 38 Countries. Amsterdam: International Association for the Evaluation of Educational Achievement.