Session Information
09 SES 01 B, Assessment in Higher Education
Paper Session
Contribution
The development of critical thinking is generally acknowledged as an important aim of higher education (Bok, 2006; Facione, 1990; Wood, Kitchener, & Jensen, 2002). Studies on critical thinking show that students' critical thinking skills differ between the start and the end of their undergraduate studies (Astin, 1993; Giancarlo & Facione, 2001; Miller, 1992), but that students often do not reach the highest levels of critical thinking. Furthermore, information about the development of critical thinking among Flemish students in higher education is lacking.
Despite widespread agreement on the importance of developing critical thinking in students, agreement on its precise meaning is lacking (Abrami et al., 2008). Not surprisingly, although several measurement instruments exist, a valid and widely accepted instrument to assess critical thinking is currently absent (Saiz & Rivas, 2008).
The current study compares Dutch translations of two existing English critical thinking tests. The study aims at identifying which instrument is most suitable to measure critical thinking in first-year bachelor students. The tests are compared on four aspects identified in the literature as important (Cook et al., 1996; Erwin, 2000): (1) the underlying definition of, or vision on, critical thinking; (2) psychometric features (reliability, correlation with other tests, and internal structure of the test, both as reported in the literature and in the data collected here); (3) feasibility (ease of administration and analysis); and (4) the attractiveness of the test for the envisaged respondents.
The two tests under investigation are the Cornell Critical Thinking Test Level Z (CCTT; Ennis, Millman, & Tomko, 1985) and the Halpern Critical Thinking Assessment Using Everyday Situations (HCTAES; Halpern, 2007). The CCTT is a domain-neutral multiple-choice test intended for strong students in upper secondary education, students in higher education, and adults. It claims to measure six aspects of critical thinking: induction, deduction, observation, evaluation (value judgment), credibility (of statements made by others), and assumption identification. The HCTAES consists of 25 descriptions of daily-life situations. Each situation is presented to the respondents twice: first followed by an open-ended question and then followed by closed questions. The test aims at measuring five categories of critical thinking: verbal reasoning, analysis of arguments, hypothesis testing, use of likelihood and uncertainty, and decision-making and problem-solving skills.
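To make comparison criterion (2) above concrete, the following minimal Python sketch illustrates how internal consistency (Cronbach's alpha) and the correlation between the two tests' total scores could be computed. It is not part of the reported study: the data are randomly generated, the CCTT item count and the HCTAES scoring range are assumptions made only for illustration.

import numpy as np

def cronbach_alpha(item_scores):
    # Cronbach's alpha for a (students x items) score matrix.
    item_scores = np.asarray(item_scores, dtype=float)
    n_items = item_scores.shape[1]
    sum_item_variances = item_scores.var(axis=0, ddof=1).sum()
    total_variance = item_scores.sum(axis=1).var(ddof=1)
    return (n_items / (n_items - 1)) * (1 - sum_item_variances / total_variance)

# Hypothetical data: 100 students; 52 dichotomously scored CCTT items (assumed
# item count) and 25 HCTAES situations scored 0-4 (assumed scoring range).
rng = np.random.default_rng(0)
cctt_items = rng.integers(0, 2, size=(100, 52))
hctaes_items = rng.integers(0, 5, size=(100, 25))

alpha_cctt = cronbach_alpha(cctt_items)
alpha_hctaes = cronbach_alpha(hctaes_items)
# Pearson correlation between the two total scores (criterion-related evidence).
r = np.corrcoef(cctt_items.sum(axis=1), hctaes_items.sum(axis=1))[0, 1]
print(f"alpha CCTT = {alpha_cctt:.2f}, alpha HCTAES = {alpha_hctaes:.2f}, r = {r:.2f}")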
Method
Expected Outcomes
References
Abrami, P. C., Bernard, R. M., Borokhovski, E., Wade, A., Surkes, M., Tamim, R., & Zhang, D. (2008). Instructional interventions affecting critical thinking skills and dispositions: A stage 1 meta-analysis. Review of Educational Research, 78, 1102-1134.
Cook, P., Johnson, R., Moore, P., Myers, P., Pauly, S., Pendarvis, F., et al. (1996). Critical thinking assessment: Measuring a moving target. Report & recommendations of the South Carolina Higher Education Assessment Network Critical Thinking Task Force. Rock Hill, SC: The South Carolina Higher Education Assessment Network.
Ennis, R. H., Millman, J., & Tomko, T. N. (1985). Cornell critical thinking test (3rd ed.). Pacific Grove, CA: Midwest Publications.
Erwin, T. D. (2000). The NPEC sourcebook on assessment, Volume 1: Definitions and assessment methods for critical thinking, problem solving and writing. Washington, DC: U.S. Government Printing Office.
Facione, P. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. Millbrae, CA: California Academic Press.
Giancarlo, C. A., & Facione, P. (2001). A look across four years at the disposition towards critical thinking among undergraduate students. The Journal of General Education, 50, 29-55.
Halpern, D. F. (2007). Halpern Critical Thinking Assessment Using Everyday Situations: Background and scoring standards. Claremont, CA: Claremont McKenna College.
Ku, K. (2009). Assessing students' critical thinking performance: Urging for measures using multi-response format. Thinking Skills and Creativity, 4, 70-76.
Maneesriwongul, W., & Dixon, J. K. (2004). Instrument translation process: A methods review. Journal of Advanced Nursing, 48, 175-186.
Wood, P. K., Kitchener, K. S., & Jensen, L. (2002). Considerations in the design and evaluation of a paper-and-pencil measure of epistemic cognition. In B. K. Hofer & P. R. Pintrich (Eds.), Personal epistemology: The psychology of beliefs about knowledge and knowing (pp. 277-294). Mahwah, NJ: Lawrence Erlbaum Associates.