Impact Evaluation of Curriculum Reform Policy: Lessons Learned from an Exploratory Study in Taiwan
Author(s):
Conference:
ECER 2009
Format:
Paper

Session Information

23 SES 08 E, Politics of Curriculum

Paper Session

Time:
2009-09-30
08:30-10:00
Room:
HG, HS 45
Chair:
Palle Rasmussen

Contribution

Educational reform occurs frequently in countries across the world, yet few evidence-based studies assess the impact of such reforms. In other words, little is known about the impact of reform on students, teachers, and schools, and even less about the lasting benefits of a reform policy. Although most educational policy-makers and practitioners acknowledge the importance of monitoring policy impact, systematic evaluation is not common. Because impact evaluation often involves long-term monitoring and tracking, it is so costly that many governments are reluctant to undertake it, quite apart from the challenges and problems of methodological design (Owen & Rogers, 1999; Kreber & Brook, 2001). However, because impact evaluation is a learning tool for clarifying the value, actual outcomes, and effects of an educational policy or programme (McKay & Treffgarne, 1999), it is important to apply it to collect evidence for the revision and improvement of a policy, including finding better ways to implement it. Since no impact evaluation has ever been conducted to assess the real value and actual effects of educational policies or programmes in Taiwan, this exploratory study set out to conduct an impact evaluation of the recent curriculum reform policy in Taiwan, with particular attention to how the current curriculum affects students from disadvantaged rural schools. The research purposes were:

1. To explore the approaches and methods for impact evaluation of curriculum policy;
2. To develop the baselines and an operational model for assessing the impact of curriculum policy;
3. To understand the problems of, and coping strategies for, implementing the new curriculum guidelines at rural schools;
4. To assess the impact of the new curriculum policy on curricular and pedagogical practices at rural schools;
5. To assess the impact of the new curriculum policy on students' learning experiences and educational opportunities at rural schools;
6. To investigate the cultural capital and resources needed for rural schools to implement the new curriculum policy.

Method

This study employed a two-year research design. In the first year, the focus was on defining the indicators, baseline conditions, and an operational model for impact assessment; in the second year, the research shifted to field study of rural schools to examine the impact of the new curriculum policy on various stakeholders, particularly students. The methods applied in this study included (1) document analysis of official educational statistics, (2) focus group interviews with school administrators, teachers, and students, (3) a questionnaire survey, and (4) meta-analysis of Taiwan Educational Panel Survey (TEPS) results.

Expected Outcomes

Impact evaluation has been surrounded by controversy and contradictory ideas concerning its definition and the implementation of an evaluation system within a bureaucratic structure. Many studies have identified impact as the "effects of dissemination," while others have defined it as the "consequences," positive or negative, of a program or policy (Barnard, 1981; Rosenthal, 2000). In this paper the author will not only report the findings of the study but also critically reflect on the methodological issues in the research design of a so-called "impact evaluation." It is hoped that the findings and discussion in this paper may provide implications for future impact evaluations of educational policies and programmes.

References

Barnard, W. S. (1981). Impact assessment of research and development program improvement efforts. Paper presented at the Annual Convention of the American Vocational Association. ERIC # ED211666.

Kreber, C., & Brook, P. (2001). Impact evaluation of educational development programmes. The International Journal for Academic Development, 6(2), 96-108.

McKay, V., & Treffgarne, C. (Eds.). (1999). Evaluating impact. Education research paper. London: Education Department, Department for International Development.

Owen, J. M., & Rogers, P. J. (1999). Program evaluation: Forms and approaches. London: Sage Publications.

Rosenthal, B. (2000). Impact analyses: Concepts and methods. AIR 2000 Annual Forum Paper. ERIC # ED446503.

Author Information

National Taiwan Normal University
Education
Taipei
215
