Opportunity To Learn And Student Outcomes On PISA: The Moderating Role Of Test-Based Teacher Evaluation Policies
Author(s):
Leslie Hawley
Conference:
ECER 2015
Format:
Paper

Session Information

09 SES 06 A, Relating Assessment Policies and Performance Interpretations to School and Student Variables

Paper Session

Time:
2015-09-09
15:30-17:00
Room:
326. [Main]
Chair:
Monica Rosén

Contribution

Objective

School climate can influence students’ motivation to learn and achieve (Deal & Peterson, 2009). Yet, a school’s accountability climate may lead teachers to alter their teaching practices, influencing student-level outcomes. In previous work using the 2009 Programme for International Student Assessment (PISA) data, we found a significant negative moderating effect of cumulative school accountability policies on the relationship between teacher practices and student outcomes (reading motivation and achievement). The proposed study builds on our previous findings and utilizes new variables from the 2012 PISA administration, which measure students’ opportunity to learn in the classroom. The goal of the proposed study is to better understand the interaction between students’ opportunity to learn in the classroom, test-based teacher evaluation policies, and student-level outcomes (achievement and self-efficacy) in an international sample of students.

 

Theoretical Framework

School Climate and Accountability

Students’ motivation and achievement have been shown to be positively related to teacher instructional practices (Wenglinsky, 2002). Yet, the positive influence of teacher classroom practices may be altered by school climate (Deal & Peterson, 2009). In particular, the evaluation practices used within a school have the potential to influence both teachers’ and students’ motivation by leading teachers to alter their practices, which may subsequently influence student outcomes.

Reliance on test-based accountability systems is becoming increasingly popular in educational reform efforts (Hamilton, 2003). Test-based accountability policies are especially prevalent within the United States, but are also supported in other countries such as the United Kingdom and Australia (Rustique-Forrester, 2005). Test-based accountability in education presents a complex inter-relationship between the pressures of reform efforts and unintended consequences for students, teachers, and schools. When there is a strong emphasis on accountability and on measuring performance via student achievement results, teachers may feel pressured and their motivation may become extrinsic as they act upon fear or threat of consequences (Santiago & Benavides, 2009; Cruz & Brown, 2010). Research from across the globe has demonstrated negative outcomes including, but not limited to: a narrowing of the curriculum; test-centered rather than student-centered environments; heightened stress; and a marginalization of low-performing students (e.g., Jaeger, Merki, Oer, & Holmeier, 2012; Pedulla, Abrams, Madaus, Russell, Ramos, & Miao, 2003; Polesel, Rice, & Dulfer, 2014; Rustique-Forrester, 2005).

Current Study

The current study extends our previous analyses of the 2009 PISA data to the most recent administration of the assessment (2012), which added expanded questions measuring teaching practices via opportunity to learn. The purpose of the current study is to evaluate the moderating effect of test-based teacher accountability policies on the relationship between opportunity to learn and student outcomes (mathematics achievement and self-efficacy) using data from the 2012 PISA.

Method

Data

PISA 2012 data were collected from over 500,000 15-year-old students enrolled in grade seven or higher (OECD, 2014). For our study, we will select students from 34 OECD countries with no missing values on the outcome variables and predictors. The use of complete data is recommended by the OECD (2009) in order to accurately create a normalized student weight.

Analysis

The HPMIXED procedure in SAS version 9.3 will be used to estimate a three-level hierarchical model (i.e., L1: student; L2: school; L3: country). All variables will be group-mean centered at the mean of the clustering unit within which they are nested.

Variables

Dependent variables. Two dependent variables will be used in our analyses: 1) mathematics self-efficacy (MATHEFF scale) and 2) mathematics performance assessment score, incorporating plausible values methodology. MATHEFF is an 8-item scale that measures students’ confidence with certain mathematics tasks (median scale reliability in OECD countries α = 0.95).

Predictor and moderator variables. Analyses will include predictors of opportunity to learn and a measure of test-based teacher accountability practices as the moderator. Predictors will include the teacher support for mathematics classes (TEACHSUP) scale, a 5-item measure of supportive teacher practices (median reliability in OECD countries α = 0.85), and cognitive activation in mathematics lessons (COGACT), a 9-item measure of teacher instructional practices (median reliability in OECD countries α = 0.83). The moderator variable is a single item asking school administrators whether or not mathematics teachers were evaluated on the basis of student achievement results (SC30Q01).

Control variables. Control variables will include the PISA index of economic, social and cultural status (ESCS), gender, and home language (native or non-native).
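
To make the modeling plan concrete, the sketch below shows how such a three-level model could be specified in SAS. It is illustrative only: the dataset and derived variable names (pisa, pisa_c, teachsup_c, cogact_c, native) are assumptions; W_FSTUWT is the final PISA student weight, shown here before the normalization step the OECD (2009) manual recommends; and the achievement model would be fit once per plausible value (PV1MATH through PV5MATH) rather than once as shown.

/* Group-mean center the L1 predictors at the school mean
   (dataset and derived variable names are illustrative). */
proc sql;
  create table pisa_c as
  select *,
         teachsup - mean(teachsup) as teachsup_c,
         cogact   - mean(cogact)   as cogact_c
  from pisa
  group by cnt, schoolid;
quit;

/* Three-level model: students (L1) in schools (L2) in countries (L3),
   with cross-level interactions between the opportunity-to-learn
   measures and the test-based evaluation item (SC30Q01). */
proc hpmixed data=pisa_c;
  class cnt schoolid sc30q01 st04q01 native;
  weight w_fstuwt;                           /* student weight; normalization assumed */
  model matheff = teachsup_c cogact_c sc30q01
                  teachsup_c*sc30q01 cogact_c*sc30q01
                  escs st04q01 native / solution;
  random intercept / subject=cnt;            /* L3: country */
  random intercept / subject=schoolid(cnt);  /* L2: school within country */
run;

Because SC30Q01 is a school-level (L2) response, the TEACHSUP*SC30Q01 and COGACT*SC30Q01 terms are the cross-level interactions of interest.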

Expected Outcomes

Proposed Analyses

Our proposal represents a work in progress. All data have been acquired and are in the process of being prepared for analysis. We have selected our variables (detailed above) for inclusion in the models, but we are in the early stages of data management for subsequent analysis. Given our experience with these data from previous analyses and the public availability of the data, there is sufficient time to complete our proposed analyses in advance of the fall conference.

Expected Outcomes

Based on our previous research on a similar topic using the 2009 PISA data, we expect to find that our interaction of interest (L2 teacher evaluation policies × L1 opportunity to learn) will have a significant negative effect for both of our dependent variables (mathematics self-efficacy and achievement).
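
Because PISA 2012 reports mathematics achievement as five plausible values, the test of this interaction in the achievement model will come from pooling the five per-plausible-value estimates. The standard combining rules (as described in the OECD, 2009, analysis manual) are, in LaTeX notation:

\[
\bar{\theta} = \frac{1}{M}\sum_{m=1}^{M}\hat{\theta}_m, \qquad
V = \bar{U} + \left(1 + \frac{1}{M}\right)B, \qquad
B = \frac{1}{M-1}\sum_{m=1}^{M}\left(\hat{\theta}_m - \bar{\theta}\right)^2,
\]

where M = 5, \(\hat{\theta}_m\) is the interaction estimate from the m-th plausible value, \(\bar{U}\) is the average of the M sampling variances, and V is the total variance used for significance testing.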

References

Deal, T. E., & Peterson, K. D. (2009). Shaping school culture: Pitfalls, paradoxes, and promises. San Francisco: Jossey-Bass.

Hamilton, L. (2003). Assessment as a policy tool. Review of Research in Education, 27, 25-68.

Jaeger, D. J., Merki, K. M., Oer, B., & Holmeier, M. (2012). Statewide low-stakes tests and a teaching to the test effect? An analysis of teacher survey data from two German states. Assessment in Education: Principles, Policy & Practice, 19, 451-467.

Leithwood, K., & Jantzi, D. (2006). Transformational school leadership for large-scale reform: Effects on students, teachers and their classroom practices. School Effectiveness and School Improvement, 17, 201-227.

OECD (2009). PISA data analysis manual (SAS, 2nd ed.). Retrieved from http://browse.oecdbookshop.org/oecd/pdfs/free/9809021e.pdf

OECD (2014). PISA 2012 technical report. Paris: OECD Publishing.

Pedulla, J. J., Abrams, L. M., Madaus, G., Russell, M. K., Ramos, M. A., & Miao, J. (2003). Perceived effects of state-mandated testing programs on teaching and learning: Findings from a national survey of teachers. National Board on Educational Testing and Public Policy. Retrieved from http://www.bc.edu/research/nbetpp/statements/nbr2.pdf

Polesel, J., Rice, S., & Dulfer, N. (2014). The impact of high-stakes testing on curriculum and pedagogy: A teacher perspective from Australia. Journal of Education Policy, 29, 640-657.

Rustique-Forrester, E. (2005). Accountability and the pressures to exclude: A cautionary tale from England. Education Policy Analysis Archives, 13(26). Retrieved from http://epaa.asu.edu/epaa/v13n26/

Wenglinsky, H. (2002). How schools matter: The link between teacher classroom practices and student academic performance. Education Policy Analysis Archives, 10(12). Retrieved from http://epaa.asu.edu/epaa/v10n12/

Author Information

Leslie Hawley (presenting / submitting)
University of Nebraska-Lincoln, Lincoln, United States of America
