A Metacognitive Perspective to Open-ended Questions vs. Multiple Choice
Author(s):
Bengi Birgili (presenting / submitting), Ercan Kiraz
Conference:
ECER 2017
Format:
Paper

Session Information

ERG SES D 05, Professionalism and Education

Paper Session

Time:
2017-08-21
13:30-15:00
Room:
W2.10
Chair:
Jonas Almqvist

Contribution

The rise of the knowledge economy has pushed education higher up government agendas in recent years, and education has become an important policy area to govern. Whether education systems should be standardized according to national or local considerations, and whether curricula should contribute to uniformity, has been debated as a new item on the government agenda. Reforming education in all its aspects has come to play an ambivalent role in this policy effort, so educational research has become vital to sustaining constant change. The discussion of standardizing educational processes opens a new area of study, not only in teaching and learning but also in measurement and evaluation.

Measuring student achievement in education systems provides substantial input for the continuity and effectiveness of the educational process, allowing instructional quality to be evaluated to determine whether intended outcomes are achieved. In Turkey, the government, especially the Ministry of National Education (MoNE), has attempted changes and adaptations to the measurement system. Since 2013, it has been disputed whether the multiple-choice (MC) question format in large-scale assessments should be replaced with open-ended (OE) questions, as if one form were better than the other. However, each measurement technique requires different applications of metacognition, a form of higher-order thinking that includes active control over cognitive engagement in learning processes. Evaluating one's own progress toward completing a task is accepted as metacognitive behavior. Two important skills are indicators of metacognition: cognitive strategy and self-checking. Cognitive strategy is a goal-directed, consciously controllable process that facilitates or supports performance as learners develop internal procedures enabling them to perform desired skills. Self-checking is the self-monitoring of one's own performance while engaged in a task. Therefore, the aim of this paper is to analyze the multiple-choice and open-ended question formats in large-scale assessments in Turkey from the perspectives of students, teachers, and academicians. Guided by the research question "What is the difference between MC and OE questions in terms of students' metacognitive dimensions, cognitive strategy and self-checking?", we qualitatively investigated the participants' experiences with MC and OE formats. How middle school students' cognitive strategy and self-checking behaviours unfold during MC or OE exercises was analysed through the common experiences of eighth-grade students, teachers, and academicians.

Method

In this study, the cognitive strategy and self-checking dimensions of metacognition were explored. Participants were selected purposefully. The researchers used a phenomenological approach and conducted cognitive interviews with 32 participants for in-depth understanding; the primary data collection method was the in-depth interview. The interview questions were adapted from O'Neil and Brown's (1998) study to elicit students', teachers', and academicians' views on the contribution of MC and OE questions to middle school students' metacognition. For instance, to examine how students use cognitive strategy during MC or OE solving, questions such as the following were posed: "Do you reword a MC/OE question to understand it better? Why?" and "With what kinds of strategies and methods do you reread the question stem in MC/OE?". Alongside these questions on cognitive strategy, other questions probed self-checking skills: "How often do you check your solution to the problem?" and "Do you need to judge the correctness of your solution during MC/OE?". The eighth-grade students, academicians, and teachers underwent cognitive interviews on these two dimensions of metacognition in order to reveal their common experiences. To provide trustworthiness, the interview questions were reviewed by three experts, and the instruments were refined throughout the study. To establish credibility, the member-check technique was used: each participant was given the interview transcripts to read thoroughly for clarity and accuracy, providing additional insight and information when necessary. Importantly, the role of the researchers in this study was not to achieve replicability but to describe the setting from the viewpoints of the participants. Member checks also enhanced the dependability of this qualitative study.
To describe how students apply self-checking and cognitive strategy skills when solving MC or OE questions, detailed information from the participants is presented through their views, including direct quotations. The data gathered during the process were analyzed using the content analysis method. Codes, categories, and themes were developed from the interview data, reflecting the theoretical framework and data saturation. After the collected data had been coded, expressions with parallel patterns were combined into eleven categories, and two themes emerged corresponding to the self-checking and cognitive strategy dimensions of metacognition.

Expected Outcomes

The research question of the current study was "What is the difference between MC and OE questions in terms of students' metacognitive dimensions, cognitive strategy and self-checking?". To understand the metacognitive thinking process, MC and OE were compared with respect to eighth-grade students' cognitive strategies. Interviews were conducted with 10 eighth-grade students who were prospective TEOG candidates, 10 branch teachers, and 6 academicians. These interviews were analyzed to determine how participants activate cognitive strategy on MC and OE questions, one of the sub-dimensions of the metacognitive phenomenon. The meta-codes inferred from this part were solution strategy preference, cognitive strategies employed, rewording skill to activate cognitive strategy, time spent on understanding, and students' thinking about the meaning of a problem through rereading. The results revealed that, by rereading, the participants tried to reconsider the meaning of the question and examine its core meaning before jumping to choice selection. All of the students except one explained that they reread the question stem before solving. This suggests that the MC question format necessitated rereading for deep understanding. MC and OE were also compared on how they support eighth-grade students' self-checking ability across six categories, according to the views of the students, teachers, and academicians respectively. The meta-codes inferred for this dimension were checking one's work, going over the choices, judging the correctness of the solution process, asking how well one is doing during the solution process, correcting errors during the solution process, and asking questions to stay on track. Similarly, nearly half of the participants reported from their experience that OE may force children to activate self-checking strategies more than MC. The results were strongly enriched by direct quotations from all of the participants.

References

Berberoglu, G. (2009). CİTO Türkiye öğrenci izleme sistemi (ÖİS) öğrenci sosyal gelişim programı'na (ÖSGP) ilişkin ön bulgular [CITO Turkey student follow-up system (OIS) preliminary findings about the student social development program]. CITO Eğitim: Kuram ve Uygulama Dergisi, 32-42.

Berberoglu, G., & Is-Guzel, C. (2013, July-September). Eğitim sistemimizdeki ölçme ve değerlendirme nasıl olmalıdır? [How should educational measurement and evaluation practices be in an educational system?] CITO Eğitim: Kuram ve Uygulama.

Birenbaum, M., & Feldman, R. A. (2006). Relationship between learning patterns and attitudes towards two assessment formats. Educational Research, 40(1), 90-98. doi: 10.1080/0013188980400109

Bridgeman, B. (1992). A comparison of quantitative questions in open-ended and multiple-choice formats. Journal of Educational Measurement, 29(3), 253-271.

Efklides, A. (2006). Metacognition and affect: What can metacognitive experiences tell us about the learning process? Educational Research Review, 1, 3-14.

Efklides, A. (2011). Interactions of metacognition with motivation and affect in self-regulated learning: The MASRL model. Educational Psychologist, 46(1), 6-25. doi: 10.1080/00461520.2011.538645

Ericsson, K. A., & Simon, H. A. (1990). Verbal reports as data. Psychological Review, 87, 215-250.

Heck, J. L., & Stout, D. E. (1998). Multiple-choice vs. open-ended exam problems: Evidence of their impact on student performance in introductory finance. Financial Practice and Education, 8, 83-93.

Hong, E., & O'Neil, H. F. (2001). Construct validation of a trait self-regulation model. International Journal of Psychology, 36(3), 186-194.

Kapa, E. (2007). Transfer from structured to open-ended problem solving in a computerized metacognitive environment. Learning and Instruction, 17(6), 688-707.

Ko, M. H. (2010). A comparison of reading comprehension tests: Multiple-choice vs. open-ended. English Teaching, 65(1), 137-159.

Martinez, M. E. (2006). What is metacognition? Phi Delta Kappan, 696-699.

MEB (2013). Temel Eğitimden Ortaöğretime Geçiş [Transition from Primary to Secondary Education]. Retrieved June 25, 2014, from http://oges.meb.gov.tr/docs2104/sunum.pdf

O'Neil, H. F., & Brown, R. S. (1998). Differential effects of question formats in math assessment on metacognition and affect. Applied Measurement in Education, 11(4), 331-351.

Ozuru, Y., Briner, S., Kurby, C. A., & McNamara, D. S. (2013). Comparing comprehension measured by multiple-choice and open-ended questions. Canadian Journal of Experimental Psychology, 67(3), 215-227.

Patton, M. Q. (1990). Qualitative evaluation and research methods (2nd ed.). USA: Sage Publications.

TEDMEM (2013). Kazak Modeli Nedir? [What is the Kazakh Model?] Retrieved April 12, 2013, from http://www.tedmem.org/haberler/2013/05/28/kazak_modeli_nedir.html

Author Information

Bengi Birgili (presenting / submitting)
MEF University, Istanbul, Turkey; Middle East Technical University, Ankara, Turkey

Ercan Kiraz
Çanakkale Onsekiz Mart University, Educational Sciences Department, Curriculum and Instruction, Çanakkale
