Session Information
09 SES 06 C, Discussing Social Impact in Education Research and Assessment Related Education Policy and Research
Paper Session
Contribution
The structure and nature of assessment are widely recognised to have an impact on teaching and learning (Abrahams, Reiss, & Sharpe, 2013). International drivers such as PISA are leading many national systems to reform their curricula and assessments. Any reform process should consider the interaction between the curriculum, the form of assessment, and the educational context, and should carefully investigate whether unintended consequences could emerge as a result of the reform. This paper explores the introduction of a reformed model of practical science assessment in high-stakes assessment at upper secondary school level during its first years of implementation.
Internationally, science is considered a core subject at secondary level, as reflected in its inclusion in major international programmes such as PISA and TIMSS; this reflects the economic and cultural benefits of a scientifically educated population, both as producers and as consumers of science (Millar, 2006). Practical science is considered by many to underpin science education, supporting a wide range of curricular aims including, for example, conceptual development, understanding of the scientific process, data handling skills, and expertise in experimental procedures (Kirschner & Meester, 1988; SCORE, 2008). In this paper we follow Abrahams and Reiss (2012), who define practical work as “an overarching term that refers to any type of science teaching and learning activity in which students, either working individually or in small groups, are involved in manipulating and/or observing real objects and materials” (p. 1036).
Practical skills are assessed using many different methods in different countries. In Ireland, for example, practical work is assessed using a written paper; CIE’s IGCSE, used around the world, offers the option of a practical exam or a written alternative to practical; and the Netherlands uses coursework assessment (Watts, 2013). Abrahams et al. (2013) classify practical assessment into two categories: Direct Assessment of Practical Skills (DAPS), where students are observed conducting practical work (e.g. a practical exam), and Indirect Assessment of Practical Skills (IAPS), where, for example, the write-up of a practical activity is assessed rather than the practical activity itself (e.g. a project).
England is currently undergoing a major reform of upper secondary qualifications, including A levels, which are commonly taken at age 18 as preparation for university entry, with the new courses first taught from 2015 and first assessed in 2017. The pre-reform assessment of practical science skills used teacher-assessed tasks that consisted mainly of IAPS, with little DAPS. These assessments were criticised for having a negative impact on teaching and learning, because their high-stakes nature meant that practical work was often limited to a narrow range of activities. Additionally, the assessments were vulnerable to malpractice and did not discriminate well between students.
The reformed model of assessment requires a minimum of 12 practical activities, assessed by teachers using a competency framework but not contributing to the final grade; instead, outcomes are reported separately as a pass/fail endorsement. Practical skills are also assessed indirectly through questions on the written examinations. This model aims to improve the teaching and learning of practical skills by encouraging a wider range of practical activities, which teachers can implement more flexibly (Wade & Abrahams, 2015).
This paper presents the first stage of a study which investigates the impact of the reform on teaching and learning over time. The study addresses three main research questions:
- What do teachers believe the purpose of practical science to be?
- What types of practical science activity do they undertake?
- What is the impact of a reform of practical science assessment on teaching and learning?
Method
Expected Outcomes
References
Abrahams, I., & Reiss, M. J. (2012). Practical work: Its effectiveness in primary and secondary schools in England. Journal of Research in Science Teaching, 49(8), 1035-1055. doi: 10.1002/tea.21036
Abrahams, I., Reiss, M. J., & Sharpe, R. M. (2013). The assessment of practical work in school science. Studies in Science Education, 49(2), 209-251. doi: 10.1080/03057267.2013.858496
Kirschner, P. A., & Meester, M. A. M. (1988). The laboratory in higher science education: Problems, premises and objectives. Higher Education, 17(1), 81-98. doi: 10.1007/bf00130901
Millar, R. (2006). Twenty First Century Science: Insights from the design and implementation of a scientific literacy approach in school science. International Journal of Science Education, 28(13), 1499-1521. doi: 10.1080/09500690600718344
SCORE. (2008). Practical work in science: A report and proposal for a strategic framework. Retrieved 08/04/2013, from http://www.score-education.org/media/3668/report.pdf
Wade, N., & Abrahams, I. (2015). Validity issues in the reform of a practical science assessment: An English case study. Retrieved 12/01/2015, from http://www.iaea.info/documents/paper_3fc736a61.pdf
Watts, A. (2013). The assessment of practical science: a literature review. Retrieved 10/02/2014, from http://www.cambridgeassessment.org.uk/Images/135793-the-assessment-of-practical-science-a-literature-review.pdf