Session Information
ERG SES D 09, Assessment in Education
Paper Session
Contribution
The transition from post-primary to higher education mathematics is problematic for many students and is internationally referred to as the ‘mathematics problem’ (1–3).
One approach to addressing the ‘mathematics problem’ is the provision of mathematics support services by higher education institutions across Ireland, the UK and Australia (4). Another is the provision of online resources through dedicated websites such as those developed by project teams in Europe and the UK (5,6). This multi-institution project, funded by the National Forum for the Enhancement of Teaching and Learning in Higher Education in Ireland, focuses on the development of interactive formative assessment techniques to improve the teaching and learning experience in first year undergraduate mathematics.
Two surveys were carried out in the first phase of the project. They aimed to identify the mathematical concepts and procedures that are problematic for first year undergraduate students, and the resources that students and lecturers find helpful. Initial results of these surveys have been reported at the CETL-MSOR conference in September 2015 (7).
The project team has developed a number of interactive tasks and formative assessment resources on topics such as logs, functions and limits as well as reusing or repackaging existing online resources on basic arithmetic skills.
The following resources are currently being trialled in a number of different institutions (academic year 2015/2016):
- A: Interactive GeoGebra and Numbas tasks in a first year Differential Calculus module for science undergraduates
- B: Khan Academy structured playlists for first year computing and business undergraduates
- C: Moodle Lessons in a mathematics module for first year computing students
Theoretical Framework
There is extensive literature on the difference between the nature of mathematics research and mathematics education research (8,9). Mathematics education research is ‘inquiry by carefully developed research methods’ that provides evidence about mathematics teaching and learning (8).
In order to evaluate the effectiveness of the developed resources, evidence must be gathered. Gathering evidence from multiple sources, a practice known as triangulation, makes findings more compelling and more robust (9).
In this project we are using questionnaires, focus group interviews and task-based interviews with written artefacts. Questionnaire data can be used to gather useful information on students’ ‘background experiences, attitudes and perceptions’ and on the usefulness of teaching and learning interventions (8). More in-depth probing of student attitudes can be achieved by interviewing students. McKnight et al. (8) suggest that task-based interviews can be used to ‘explore subjects’ approaches to and thinking about problem situations’. Written artefacts produced during task-based interviews provide evidence of students’ progress as they solve the tasks. Reliability and validity analysis of the research instruments will ensure that quality evidence is gathered (8).
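As an illustration of the reliability analysis of questionnaire instruments, internal consistency of Likert-scale items is commonly summarised with Cronbach's alpha. The sketch below is a minimal, self-contained calculation; the item scores are invented for illustration and are not project data.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal-consistency reliability.

    items: one list per questionnaire item, each holding the scores
    of the same respondents in the same order.
    """
    k = len(items)            # number of items
    n = len(items[0])         # number of respondents

    def var(xs):
        # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(var(it) for it in items)
    # total score per respondent across all items
    totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
    return (k / (k - 1)) * (1 - sum_item_vars / var(totals))

# Illustrative data: 4 items, 5 respondents, scores on a 1-5 scale
scores = [
    [3, 4, 4, 5, 2],
    [3, 5, 4, 4, 2],
    [2, 4, 5, 5, 3],
    [3, 4, 4, 5, 2],
]
alpha = cronbach_alpha(scores)  # values near 1 indicate high consistency
```

Values of alpha above roughly 0.7 are conventionally taken to indicate acceptable internal consistency for a questionnaire scale.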
The questionnaire has been developed with reference to similar evaluation studies such as MacGeorge et al. (10). The focus group interviews will be structured to elicit further information on students’ attitudes and to validate the questionnaire (10).
The interactive GeoGebra tasks are designed to encourage conceptual understanding (11), to elicit information on students’ knowledge and growth in knowledge, and to be new and unfamiliar (12). After initial evaluation by survey, think-aloud task-based interviews will be used to assess students’ cognitive processes (13). This task-based interview structure is based on the five principles of effective task design for interviews developed by Goldin (14): (i) accessibility, (ii) rich representational structure, (iii) free problem solving, (iv) explicit criteria for major contingencies and (v) interaction with the learning environment (such as written artefacts).
The research questions are:
- To what extent do our research instruments gather evidence that facilitates the evaluation of the developed resources?
- How can these research instruments be improved upon?
Method
Expected Outcomes
References
1. Gill, O. et al., 2010. Trends in performance of science and technology students (1997-2008) in Ireland. International Journal of Mathematical Education in Science and Technology, 41(3), pp.323–339.
2. Lawson, D., Croft, T. & Waller, D., 2012. Mathematics support past, present and future. Practice and Research in Engineering Education, 1, pp.1–9.
3. Loughlin, W.A. et al., 2015. Snapshot of mathematical background demographics of a broad cohort of first year chemistry science students. International Journal of Innovation in Science and Mathematics Education, 23(1), pp.21–36.
4. O’Sullivan, C. et al., 2015. An Irish Mathematics Learning Support Network (IMLSN) Report on Student Evaluation of Mathematics Learning Support: Insights from a large scale multi-institutional survey, Ireland.
5. FaSMEd, 2014. FaSMEd. Collaborative Project of the European Community. Available at: http://research.ncl.ac.uk/fasmed/ [Accessed January 20, 2016].
6. Sigma, 2003. mathcentre. Available at: http://www.mathcentre.ac.uk/ [Accessed January 20, 2016].
7. Ní Shé, C. et al., 2015. Identifying problematic mathematical topics and concepts for first year students. In Proceedings of CETL-MSOR 2015: Sustaining Excellence, 8–9 September 2015. In press.
8. McKnight, C. et al., 2000. Mathematics education research: A guide for the research mathematician. American Mathematical Society.
9. Schoenfeld, A.H., 2015. Summative and Formative Assessments in Mathematics Supporting the Goals of the Common Core Standards. Theory Into Practice, 54(3), pp.183–194.
10. MacGeorge, E.L. et al., 2008. Student evaluation of audience response technology in large lecture classes. Educational Technology Research and Development, 56(2), pp.125–145.
11. Breen, S. & O’Shea, A., 2012. Designing Tasks To Aid Understanding of Mathematical Functions. Proceedings of the National Academy’s Sixth Annual Conference and the Fourth Biennial Thresholds Concepts Conference, Dublin, June, pp.1–5.
12. Ericsson, A.K. & Simon, H.A., 1993. Protocol analysis: Verbal reports as data. Revised edition. Cambridge, Massachusetts / London, England: A Bradford Book, The MIT Press.
13. Nielsen, J., Clemmensen, T. & Yssing, C., 2002. Getting access to what goes on in people’s heads? Reflections on the think-aloud technique. In Proceedings of the Second Nordic Conference on Human-Computer Interaction. ACM, pp.101–110.
14. Goldin, G.A., 1997. Chapter 4: Observing Mathematical Problem Solving through Task-Based Interviews. Journal for Research in Mathematics Education, Monograph, pp.40–177.
15. Trenholm, S., Alcock, L. & Robinson, C., 2015. An investigation of assessment and feedback practices in fully asynchronous online undergraduate mathematics courses. International Journal of Mathematical Education in Science and Technology, 5211(September), pp.1–25.