Reform of Practical Science Assessment in England: Impact on Teaching and Learning.
Author(s):
Frances Wilson (presenting / submitting), Neil Wade, Steve Evans
Conference:
ECER 2017
Format:
Paper

Session Information

09 SES 06 C, Discussing Social Impact in Education Research and Assessment Related Education Policy and Research

Paper Session

Time:
2017-08-23
15:30-17:00
Room:
W5.17
Chair:
Julia Gerick

Contribution

The structure and nature of assessment are widely recognised to have an impact on teaching and learning (Abrahams, Reiss, & Sharpe, 2013). International drivers such as PISA are leading many national systems to reform their curricula and assessments. Any reform process should consider the interaction of the curriculum, the form of assessment, and the educational context, and should carefully investigate whether unintended consequences could emerge as a result of the reform. This paper explores the introduction of a reformed model for the high-stakes assessment of practical science at upper secondary school level during its first years of implementation.

Internationally, science is considered a core subject at secondary level, as reflected in its inclusion in major international programmes such as PISA and TIMSS; this reflects the economic and cultural benefits of a scientifically educated population, both as producers and consumers of science (Millar, 2006). Practical science is considered by many to underpin science education, supporting a wide range of curricular aims, including, for example, conceptual development, understanding of the scientific process, data handling skills, and expertise in experimental procedures (Kirschner & Meester, 1988; SCORE, 2008). In this paper we follow Abrahams and Reiss (2012), who define practical work as “an overarching term that refers to any type of science teaching and learning activity in which students, either working individually or in small groups, are involved in manipulating and/or observing real objects and materials” (p. 1036).

Practical skills are assessed using many different methods in different countries. For example, in Ireland practical work is assessed using a written paper; CIE’s IGCSE, used around the world, offers the option of a practical exam or a written alternative to practical; and the Netherlands uses coursework assessment (Watts, 2013). Abrahams et al. (2013) classify practical assessment into two categories: Direct Assessment of Practical Skills (DAPS), where students are observed conducting practical work (e.g. a practical exam), and Indirect Assessment of Practical Skills (IAPS), where, for example, the write-up of a practical activity is assessed rather than the practical activity itself (e.g. a project).

England is currently undergoing a major reform of upper secondary qualifications, including A levels, commonly taken at age 18 as preparation for university entry, with the new courses first taught from 2015 and first assessed in 2017. The pre-reform assessment of practical science skills used teacher-assessed tasks comprising mainly IAPS and little DAPS. The assessments were criticised for having a negative impact on teaching and learning, because their high-stakes nature meant that practical work was often limited to a narrow range of activities. Additionally, the assessments were vulnerable to malpractice and did not discriminate well between students.

The reformed model of assessment requires a minimum of 12 practical activities, assessed by teachers using a competency framework but not contributing to the final grade; instead, outcomes are reported separately as a pass/fail endorsement. Practical skills are also assessed indirectly through questions on the written examinations. This model aims to improve the teaching and learning of practical skills by encouraging a wider range of practical activities, which teachers can implement more flexibly (Wade & Abrahams, 2015).

This paper presents the first stage of a study which investigates the impact of the reform on teaching and learning over time. The study addresses three main research questions:

  • What do teachers believe the purpose of practical science to be?
  • What types of practical science activity do they undertake?
  • What is the impact of a reform of practical science assessment on teaching and learning?


Method

This study used an online survey, enabling a breadth of responses from a range of teachers and types of institution. The survey asked teachers for their views on practical science at GCSE (taken at age 16) and A level (taken at age 18), and was run from January to July in 2015 and 2016. The reformed A level qualifications were first taught in September 2015, so the 2015 questionnaire obtained data from the final year before any reformed qualifications were taught and serves as a baseline. Although data were collected about GCSE, the reform of GCSE was implemented a year later, so this paper focuses on the data related to A level. The questionnaire will be run annually until 2020; this paper reports data from the 2015 and 2016 questionnaires, enabling a comparison of the impact of the reform to assessment at A level against the background of a continuing GCSE programme.

A literature review identified key issues in practical science at upper secondary level. In addition to information about the science subject(s) and qualification(s) taught and the type of institution at which respondents worked, the questionnaire sought teachers’ views in four key areas:

  • the purpose of practical science;
  • the type of practical science activities undertaken (teacher demonstrations, student practicals, pre-determined outcome or not);
  • the challenges associated with practical work (including content areas and the impact of assessment);
  • the impact of the assessment model on practical work.

The questionnaire was piloted by six science teachers and modified according to their comments. In 2016, the questionnaire was updated to clarify that teachers should respond about the reformed A level but the legacy GCSE. Participants were recruited through an assessment organisation’s webpage, social media, and an email to secondary institutions inviting participation in the study.
The questionnaire was open from January to July in each year; participants were also invited to take part by email in June. Participation was rewarded with entry to a prize draw for a £100 Amazon voucher. Respondents were excluded if they made fewer than 20 responses (ticking a box counted as a response) to the questions after question 1 (science courses taught) and question 2 (type of institution). In 2015, 619 people started the questionnaire; following data cleaning, 522 respondents remained and were included in the analysis. In 2016, 235 people started the questionnaire and 191 respondents remained and were included in the analysis.

Expected Outcomes

Teachers’ views on the purpose of practical work did not change following the introduction of the reform; the most important purposes included developing manipulative skills and techniques, data handling skills, accurate observation, and conceptual understanding. There were few changes in the type of practical work undertaken, with a similar proportion of teacher demonstrations relative to student practicals, and no change in the percentage of practical work which had a pre-determined outcome or which required students to discover a concept for themselves. The challenges of undertaking practical work typically related to structural factors, such as timetabling, class size, and funding limitations, though the need to prepare students for written examinations was also considered a challenge.

However, teachers indicated that the new model for practical assessment had an impact in its first year of implementation. In 2016, teachers were more likely than in 2015 to report that the new practical assessment model at A level was having a positive impact on the teaching and learning of theory and had encouraged them to undertake a wider range of practical activities. This was further supported by comments from individual teachers, who suggested that the new model allows more time and flexibility to undertake the practical work that they feel is most beneficial for their students. Where teachers expressed reservations about the reform, these largely related to the need to record evidence of practical work and to a perceived lack of equipment. This positive reporting on the revised A level assessment was balanced against a largely unchanged view of work at GCSE, which had not undergone reform in early 2016. Although the reform is relatively recent, this study indicates that it is having a positive impact on teaching and learning.

References

Abrahams, I., & Reiss, M. J. (2012). Practical work: Its effectiveness in primary and secondary schools in England. Journal of Research in Science Teaching, 49(8), 1035-1055. doi: 10.1002/tea.21036

Abrahams, I., Reiss, M. J., & Sharpe, R. M. (2013). The assessment of practical work in school science. Studies in Science Education, 49(2), 209-251. doi: 10.1080/03057267.2013.858496

Kirschner, P. A., & Meester, M. A. M. (1988). The laboratory in higher science education: Problems, premises and objectives. Higher Education, 17(1), 81-98. doi: 10.1007/bf00130901

Millar, R. (2006). Twenty First Century Science: Insights from the design and implementation of a scientific literacy approach in school science. International Journal of Science Education, 28(13), 1499-1521. doi: 10.1080/09500690600718344

SCORE. (2008). Practical work in science: A report and proposal for a strategic framework. Retrieved 08/04/2013, from http://www.score-education.org/media/3668/report.pdf

Wade, N., & Abrahams, I. (2015). Validity issues in the reform of a practical science assessment: An English case study. Retrieved 12/01/2015, from http://www.iaea.info/documents/paper_3fc736a61.pdf

Watts, A. (2013). The assessment of practical science: A literature review. Retrieved 10/02/2014, from http://www.cambridgeassessment.org.uk/Images/135793-the-assessment-of-practical-science-a-literature-review.pdf

Author Information

Frances Wilson (presenting / submitting)
OCR
Research and Technical Standards
Cambridge
OCR, United Kingdom
