Session Information
09 SES 04 B, Assessing and Investigating Achievement in STEM and Music Education
Paper/Ignite Talk Session
Contribution
Introduction
Measuring students' non-routine mathematics problem solving ability requires a reliable and valid test. In order to develop a non-routine mathematics problem solving test, it is first necessary to define explicitly what non-routine problems are. Several definitions of non-routine problems appear in the literature. Polya (1966) stated that "There are problems and problems… the difference which is the most important for the teacher is that between "routine" and "non-routine" problems. The nonroutine problem demands some degree of creativity and originality from the student, the routine problem does not." (pp. 126–127). Stanic and Kilpatrick (1988) stated that "nonroutine problem solving is characterized as a higher level skill to be acquired after skill at solving routine problems" (p. 15). Large scale assessments such as TIMSS also define non-routine problems under the reasoning cognitive domain; reasoning in TIMSS "goes beyond the solution of routine problems to encompass unfamiliar situations, complex contexts, and multistep problems" (Mullis & Martin, 2017, p. 22). The common feature of all these definitions is that they use routine problems to define non-routine problems. If non-routine problems are claimed to be different from routine problems, then it is necessary to define what makes a problem routine. The concepts of creativity, originality and unfamiliarity are also emphasized. However, originality and unfamiliarity can be student and context dependent. As non-routine problems become increasingly popular in international and national assessments, students can be trained to handle the originality and unfamiliarity issues in non-routine problems. As a result of such training, non-routine problems might become routine problems.
The dependency of the definition of non-routine problems on routine problems leads us to search for a new framework that could be used for non-routine problem assessment. The use of mathematics in science, technology and engineering related contexts and scenarios that require reasoning could be an alternative framework for measuring non-routine mathematical problem solving ability. Using this framework is also expected to contribute to the Science, Technology, Engineering and Mathematics (STEM) education field. As STEM encompasses science, technology, engineering and mathematics, measuring the mathematics ability needed to be successful in these interrelated fields could provide new sources of creativity, originality and unfamiliarity.
Very few studies measure STEM related abilities with a test designed specifically for STEM outcomes. Researchers generally use regular mathematics and science test or exam results to measure STEM education outcomes. Han, Capraro and Capraro (2015) used the Texas Assessment of Knowledge and Skills test to investigate the relationship between STEM education and mathematics achievement. Bicer, Capraro and Capraro (2017) reported that they constructed a higher order STEM assessment model. However, they used only the mathematics and science items of the Texas Assessment of Knowledge and Skills, reported a fit for their second-order model, and claimed that the correlation between mathematics and science is an indication of a STEM assessment model without taking the technology and engineering components into consideration. There is a need for a framework and an instrument to assess students' STEM related abilities (Saxton et al., 2014).
The need to measure non-routine mathematics problem solving ability and the need to measure STEM related achievement can be combined. With these perspectives in mind, this study aims to develop a non-routine mathematics problems test for 8th graders using a STEM framework. The research question of this study is: "Could the student responses provide evidence for the hypothesized structure for measuring non-routine mathematics problem solving ability on a STEM framework?"
Method
Participants
The data of the study will be obtained from 8th grade students in Turkey. The test administration will have three phases. In the first phase, two students will be interviewed about the test items. In the second phase, the test will be piloted in two schools, one private and one public, with around 100 students. In the last phase, the test will be administered to about 400 students. The ratio of public school to private school students will be 9 to 1, similar to the private school ratio in the country. The numbers of girls and boys in the samples are expected to be close to each other. All samples will be selected using convenience sampling, as permission is required from the Ministry of Education; the permissions have already been granted.
Measures
The measure of the study is the non-routine mathematics problem test, developed within the STEM framework for 8th grade students by three researchers. The test has three dimensions: mathematics ability, science related mathematics ability, and technology and engineering related mathematics ability, with a total of 10 sub-dimensions. In the mathematics ability dimension, the sub-dimensions are algorithmic thinking, concepts & principles, argumentation, and pattern recognition. In the science related mathematics ability dimension, the sub-dimensions are math involved scientific literacy in physics, chemistry, and biology. In the technology and engineering related mathematics ability dimension, the sub-dimensions are modelling, coding, and technology and engineering related problem solving. Three items were prepared for each sub-dimension. Routine mathematics problem items will also be used to investigate the relationship between the routine problem test and the non-routine problem test. There are a total of 30 items for the pilot administration.
The final form of the test will contain around 20 items. The test has both multiple choice and constructed response items.
Data Analysis
The test will be piloted, and IRT item analysis and confirmatory factor analysis (CFA) results will be used to create the final version of the test. The final form of the test will also be evaluated using IRT and CFA. Measurement invariance across gender groups will be tested, and proficiency level descriptors will be developed.
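To illustrate the kind of IRT item analysis planned above, the following is a minimal sketch of the two-parameter logistic (2PL) item characteristic curve, a standard choice for dichotomously scored items; the item parameter values shown are purely hypothetical and are not taken from the study.

```python
import numpy as np

def icc_2pl(theta, a, b):
    """Probability of a correct response under the 2PL IRT model.

    theta: examinee ability, a: item discrimination, b: item difficulty.
    """
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Hypothetical item: moderate discrimination, slightly above-average difficulty
theta = np.array([-2.0, 0.0, 2.0])
p = icc_2pl(theta, a=1.2, b=0.5)
```

At theta equal to the difficulty b, the model gives a response probability of exactly 0.5, which is one quick sanity check when inspecting estimated parameters.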
Expected Outcomes
As a major outcome, this study is expected to produce a reliable and valid test that measures non-routine mathematics problem solving ability using a STEM framework. Reliability will be assessed with Cronbach's alpha coefficient, and construct validity will be assessed with confirmatory factor analysis of the hypothesized model. As additional validity evidence, the relationships between the routine problem items and the test sub-dimensions will be reported. Because a STEM framework is used to develop the test, the test is expected to give information about students' STEM related abilities. Additionally, the test will provide performance level descriptors to students and will be used to give criterion referenced feedback.
Measuring STEM related abilities makes this test quite unique. Studies generally use ordinary mathematics and science tests and treat their scores as an indication of STEM achievement. This test is expected to be usable as a measure of students' STEM achievement. Adaptation of the test to other cultures is also planned as future work, so the test is expected to have versions in other languages; taking this test as a basis, the framework will be discussed, and new versions are expected to emerge in different cultures through collaboration. Providing sample items that measure algorithmic thinking, concepts & principles, argumentation, pattern recognition, math involved scientific literacy in physics, chemistry, and biology, modelling, coding, and technology and engineering related problem solving will help the discussion of how these specific dimensions can be measured for middle school students.
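The planned Cronbach's alpha reliability coefficient can be computed directly from an item-score matrix; the sketch below shows the standard formula, with a small made-up score matrix (not study data) for illustration.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_students, n_items) item-score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                       # number of items
    item_vars = scores.var(axis=0, ddof=1)    # per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Hypothetical dichotomous scores for 4 students on 2 items
alpha = cronbach_alpha([[1, 1], [0, 0], [1, 1], [0, 0]])
```

When the items agree perfectly, as in the toy matrix above, alpha reaches its maximum of 1; uncorrelated items drive it toward 0.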
References
Bicer, A., Capraro, R. M., & Capraro, M. M. (2017). Integrated STEM assessment model. EURASIA Journal of Mathematics Science and Technology Education, 13(7), 3959–3968.
Han, S., Capraro, R., & Capraro, M. M. (2015). How science, technology, engineering, and mathematics (STEM) project-based learning (PBL) affects high, middle, and low achievers differently: The impact of student factors on achievement. International Journal of Science and Mathematics Education, 13(5), 1089–1113.
Mullis, I. V. S., & Martin, M. O. (Eds.). (2017). TIMSS 2019 assessment frameworks. Retrieved from Boston College, TIMSS & PIRLS International Study Center website: http://timssandpirls.bc.edu/timss2019/frameworks/
Polya, G. (1966). On teaching problem-solving. In E. G. Begle (Ed.), The role of axiomatics and problem-solving in mathematics (pp. 123–129). Boston: Ginn.
Saxton, E., Burns, R., Holveck, S., Kelley, S., Prince, D., Rigelman, N., & Skinner, E. A. (2014). A common measurement system for K-12 STEM education: Adopting an educational evaluation methodology that elevates theoretical foundations and systems thinking. Studies in Educational Evaluation, 40, 18–35.
Stanic, G., & Kilpatrick, J. (1988). Historical perspectives on problem solving in the mathematics curriculum. In R. Charles & E. Silver (Eds.), The teaching and assessing of mathematical problem solving (pp. 1–22). Reston, VA: National Council of Teachers of Mathematics.