Session Information
09 SES 02 B, Investigating the Validity of TIMSS & PIRLS
Paper Session
Contribution
Higher order thinking skills require the simultaneous use of multiple related skills (Marzano & Heflebower, 2012) and the execution of critical, rational, reflective, and creative thinking processes. Individuals use their higher order thinking skills when they encounter situations, questions, or dilemmas that they have not experienced before. These skills emerge from the combination of associated lower-level mental skills and the individual's own knowledge (King, Goodson & Rohani, 1998). As one of the higher order thinking skills, reasoning is the process of drawing conclusions (Leighton, 2004). Reasoning, especially in relation to problem-solving, becomes crucial in the decision-making process about the accuracy of a proposition or the consistency between two or more components (Brookhart, 2010).
The most common method of measuring reasoning is to make inferences from an individual's performance in a given task. Reasoning is one of the focuses of TIMSS (Trends in International Mathematics and Science Study), which measures and evaluates fourth-grade students in science and mathematics at three hierarchical cognitive levels. Reasoning is the highest cognitive level and involves the most complex mental processes; therefore, such questions require better-structured approaches to answer (Mullis, Martin, Ruddock, O'Sullivan & Preuschoff, 2009; Mullis & Martin, 2013).
In a task related to reasoning, the answerer should base their answer on a given statement and enrich it with new information and similar examples (Haladyna & Rodriguez, 2013). This process is similar to the answering behavior in multiple-choice items. Answering behavior is the result of a mental process and consists of mental processing steps (Tokat, 2006); answers are mediated by cognitive processes. Similarly, Turgut and Baykul (2010) classify answering behaviors into four groups: (i) the answerer has acquired the target behavior and answers correctly; the answerer who has not acquired the target behavior (ii) gives a wrong answer or omits the test item, (iii) eliminates options to find the correct answer, or (iv) answers randomly and reaches the correct answer by chance. More detailed descriptions of answering behavior also exist (Dodeen, 2008; Koçak, 2013; Pehlivan & Kutlu, 2014).
Metacognitive awareness is another feature that may be related to reasoning and answering behavior. It is what individuals know about their own cognition and how they manage it (Flavell, Miller & Miller, 2002). It is therefore used to monitor and regulate cognitive processes such as learning, problem-solving, comprehension, and reasoning (Metcalfe & Shimamura, 1996). Individuals with high-level metacognitive awareness use their knowledge in the most strategic way to achieve the most effective performance (Gourgey, 2002). Cognition involves mental processes such as perception, understanding, and remembering, while metacognition involves thinking about these processes (Garner & Alexander, 1989); therefore, the two features may be related to each other. Accordingly, as individuals' metacognitive awareness levels increase, their performance on higher order thinking questions is expected to increase. In light of this, it is necessary to investigate whether students' answering behaviors vary depending on their levels of reasoning and metacognitive awareness. Based on this general purpose, the following questions are answered:
• How are the students’ answering behaviors distributed according to their level of metacognitive awareness?
• Are the students’ answering behaviors dependent on their level of metacognitive awareness?
• How are the students’ answering behaviors distributed according to their level of reasoning?
• Are the students’ answering behaviors dependent on their level of reasoning?
• How are the students’ answering behaviors distributed according to their performance in the test item?
• Are the students’ answering behaviors dependent on their performance in the test item?
Method
In this study, the change in students’ answering behavior according to their level of metacognitive awareness and their level of reasoning is examined; therefore, this is survey research. In survey research, it is determined how the members of a population are distributed in terms of one or more variables (Fraenkel, Wallen & Hyun, 2013). The sample is determined by the purposive sampling method, whose focus is identifying and selecting individuals or groups of individuals that are especially knowledgeable about or experienced with a phenomenon of interest (Creswell & Plano Clark, 2011). Accordingly, the students are sampled so as to cover a wide range of reasoning levels; based on this requirement, five 4th-grade classes are determined as the sample. In line with the purpose of the study, three instruments are applied. The first is Jr. MAI-Form A, developed by Sperling, Howard, Miller, and Murphy (2002) to measure 3rd- to 9th-grade children’s knowledge and regulation of cognition; it was adapted to Turkish by Karakelle and Saraç (2007) and has 12 items in three categories. The second instrument measures the students’ reasoning level: the items applied in TIMSS 2009, TIMSS 2012 and TIMSS 2015 are examined, eight questions at the reasoning level are selected, and the reasoning test is constructed from them. The last instrument is the answering behavior form. Given the students' age, a simpler form is necessary; therefore, a form with four answering-behavior options is constructed by the researcher after a literature review. The four options are placed in a second column next to each reasoning question in the test booklet, so that the two instruments are applied as one. The students first answer the reasoning question and then select the option corresponding to the answering behavior they exhibited.
At the end of this session, the students also answer the Turkish adaptation of Jr. MAI. After data collection, the data are analyzed: the students are grouped as high or low in terms of their scores on the reasoning test and Jr. MAI, the distribution of answering behaviors in these groups is examined, and chi-square analysis is executed.
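The planned chi-square analysis of independence can be sketched in a few lines. The contingency table below uses invented counts purely for illustration (they are not the study's data); rows are the high/low awareness groups and columns are the four answering behaviors described above.

```python
# Minimal sketch of a chi-square test of independence between
# awareness group (high/low) and answering behavior (four categories).
# All counts here are hypothetical, invented for illustration only.

# Rows: high / low metacognitive awareness.
# Columns: (i) correct via target behavior, (ii) wrong/omit,
# (iii) elimination, (iv) random guess.
observed = [
    [34, 10, 12, 4],
    [18, 22, 11, 9],
]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand_total = sum(row_totals)

# Expected count under independence: (row total * column total) / N.
chi2 = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        exp = row_totals[i] * col_totals[j] / grand_total
        chi2 += (obs - exp) ** 2 / exp

df = (len(observed) - 1) * (len(observed[0]) - 1)  # (2-1)*(4-1) = 3
print(f"chi-square = {chi2:.2f} with df = {df}")
# Compare chi2 against the critical value (7.81 for df = 3, alpha = .05)
# or compute a p-value with, e.g., scipy.stats.chi2_contingency.
```

A significant statistic would indicate that the distribution of answering behaviors depends on the awareness group, which is exactly what the research questions above ask.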
Expected Outcomes
As a result of the study, differences in the students’ answering behaviors in terms of their level of metacognition and their level of reasoning are expected. Multiple-choice items are frequently applied in the measurement of reasoning, one of the higher order thinking skills. Therefore, the students’ approach to these items, and the differences in their approaches according to their levels of metacognitive awareness and reasoning, should be highlighted in order to examine the applicability of such items. Metacognitive awareness level, as an indicator of awareness of one’s own thinking processes, and answering behavior, as an indicator of the approach to a test item, can be related in such a study. The results would therefore provide evidence on the effects of the structure of multiple-choice items. Similarly, Umay (1997) suggested, based on the findings of her study, that multiple-choice items are appropriate for measuring higher order thinking skills such as problem-solving. From another perspective, students with higher metacognitive awareness are expected to perform better; they are also expected to take a more strategic approach to the test items and to reflect this in their answering behavior. Studies in the literature report different results on answering behavior and students’ test performance. For example, according to Frary’s (1980) findings, students’ knowledge level affects their answering behavior; Pehlivan and Kutlu (2014) found that students’ answering behaviors change across the subscales of a test; and Umay (1998) states that students tend to omit a difficult question.
References
Brookhart, S.M. (2010). How to assess higher-order thinking skills in your classroom. USA: ASCD.
Creswell, J.W. & Plano Clark, V.L. (2011). Designing and conducting mixed methods research. California: Sage.
Dodeen, H. (2008). Assessing test-taking strategies of university students: developing a scale and estimating its psychometric indices. Assessment & Evaluation in Higher Education, 33(4), 409-419.
Flavell, J.H., Miller, P.H. & Miller, S.A. (2002). Cognitive development (fourth edition). New Jersey: Prentice Hall.
Fraenkel, J. R., Wallen, N. E., & Hyun, H. H. (2011). How to design and evaluate research in education. New York: McGraw-Hill.
Garner, R., & Alexander, P. A. (1989). Metacognition: answered and unanswered questions. Educational Psychologist, 24, 143–158.
Gourgey, A. F. (2002). Metacognition in basic skills instruction. In H. J. Hartman (Ed.), Metacognition in learning and instruction: Theory, research and practice (pp. 17-32). The Netherlands: Kluwer Academic Publishing.
Haladyna, T.M. & Rodriguez, M.C. (2013). Developing and validating test items. New York: Routledge.
Karakelle, S., & Saraç, S. (2007). Çocuklar için üst bilişsel farkındalık ölçeği (ÜBFÖ-Ç) A ve B formları: Geçerlik ve güvenirlik çalışması. Türk Psikoloji Yazıları, 10(20), 87-103.
King, F. J., Goodson, L., & Rohani, F. (1998). Higher order thinking skills: Definition, teaching strategies, assessment. Publication of the Educational Services Program, now known as the Center for Advancement of Learning and Assessment. Retrieved from www.cala.fsu.edu
Koçak, D. (2013). Farklı yönergelerle verilen çoktan seçmeli testlerde yanıtlama davranışlarının incelenmesi (Yayımlanmamış yüksek lisans tezi). Ankara Üniversitesi Eğitim Bilimleri Enstitüsü, Ankara.
Leighton, J. P. (2004). Avoiding misconception, misuse, and missed opportunities: The collection of verbal reports in educational achievement testing. Educational Measurement: Issues and Practice, 23(4), 6–15.
Marzano, R. J., & Heflebower, T. (2012). Teaching & assessing 21st century skills (The Classroom Strategies Series). USA: Marzano Research Laboratory.
Metcalfe, J. & Shimamura, A.P. (1996). Metacognition: Knowing about knowing. Cambridge, Mass: MIT Press.
Mullis, I.V.S., Martin, M.O., Ruddock, G.J., O’Sullivan, C.Y. & Preuschoff, C. (2009). TIMSS 2011 assessment frameworks. Boston: TIMSS & PIRLS International Study Center.
Mullis, I.V.S. & Martin, M.O. (2013). TIMSS 2015 assessment frameworks. Boston College: TIMSS & PIRLS International Study Center. Retrieved May 15, 2018, from http://timssandpirls.bc.edu/timss2015/frameworks.html
Pehlivan, E. B., & Kutlu, Ö. (2014). Türkçe test maddelerinde yanıtlama davranışlarının incelenmesi. Eğitimde ve Psikolojide Ölçme ve Değerlendirme Dergisi, 5(1).
Tokat, Y. N. (2006). Çoktan seçmeli testlerde yanıtlama davranışlarının belirlenmesi (Yayınlanmamış yüksek lisans tezi). Ankara Üniversitesi Eğitim Bilimleri Enstitüsü, Ankara.
Turgut, Y. & Baykul, Y. (2010). Eğitimde ölçme ve değerlendirme. Ankara: Pegem Akademi.