Model Competence in Biology Education
Conference:
ECER 2010
Format:
Paper

Session Information

27 SES 08 A, Teaching and Learning in Pre-School and Primary School

Paper Session

Time:
2010-08-26
17:15-18:45
Room:
M.B. SALI 10, Päärakennus / Main Building
Chair:
Florence Ligozat

Contribution

 

Objectives

In line with a stronger focus on the outcomes of the German educational system, educational objectives are operationalized in terms of competences (Klieme et al. 2008). The German educational standards and curricula, which define learning outcomes for 16-year-olds, explicitly include model competence. Model competence functions as a “door-opener” to an elaborated understanding of the nature of science and is therefore a central part of scientific literacy (Gilbert & Boulter 2000). Yet various studies have shown that students are not aware of the role of models in the epistemological process and instead focus on descriptive aspects of models (e.g. Grosslight et al. 1991, Treagust et al. 2002).

A precondition for diagnosing model competence in biology education is the development of diagnostic instruments based on an empirically tested theoretical model of model competence suited to this domain – a gap that still needs to be filled. Drawing on various studies of students’ and teachers’ understanding of models and modelling (e.g. Grosslight et al. 1991, Justi & Gilbert 2003, Crawford & Cullin 2005), Krüger and Upmeier zu Belzen (2009) developed a theoretical model of model competence that identifies two cognitive dimensions, each differentiated into three levels of increasing reflection. This structure still has to be validated empirically.

 

 

Theoretical Framework

In our research, model competence follows Weinert’s (2001) general definition of competence: a specialized, domain-specific competence comprising the cognitive, motivational, and volitional prerequisites needed to cope with a specific range of situations. The model of model competence was developed on the basis of the studies of Grosslight et al. (1991), Justi & Gilbert (2003), and Crawford & Cullin (2005). It comprises two cognitive dimensions: knowledge about models, with the aspects nature of models and multiple models, and modelling, with the aspects purpose of models, testing models, and changing models. Each aspect is differentiated into three levels of increasing reflection (Krüger & Upmeier zu Belzen 2009).
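The structure described above can be made concrete as a small data sketch. The following is purely illustrative (the representation is ours, not the authors’): it lays out the two dimensions, five aspects, and three levels, and counts the aspect-by-level combinations a diagnostic instrument would have to cover.

```python
# Illustrative sketch of the model of model competence
# (Krüger & Upmeier zu Belzen 2009) as a nested structure:
# two cognitive dimensions, five aspects, three levels.
MODEL_COMPETENCE = {
    "knowledge about models": ["nature of models", "multiple models"],
    "modelling": ["purpose of models", "testing models", "changing models"],
}
LEVELS = ["I", "II", "III"]  # three levels of increasing reflection

def cells():
    """Each combination of aspect and level is one diagnostic cell."""
    return [(dimension, aspect, level)
            for dimension, aspects in MODEL_COMPETENCE.items()
            for aspect in aspects
            for level in LEVELS]

# 5 aspects x 3 levels = 15 combinations to be covered with items
print(len(cells()))  # 15
```

Seen this way, the empirical validation question is whether students’ responses actually separate along these cells.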

Aims and Research questions

The aims of four methodical approaches are the operationalization and the empirical validation of the model of model competence (Krüger & Upmeier zu Belzen 2009). As a product of these processes, a diagnostic instrument for students’ model competence based on this theoretical model will become available. This instrument contains open-ended items, multiple-choice items, closed forced-choice items, and hands-on test items. Each methodical approach is covered by one doctoral project.

 

The main resulting research questions associated with the operationalization are:

  • Is it possible to validate the structure of the theoretical model internally in the field of biology education using different types of items?

  • Are the multiple-choice items content-valid?

In addition to the information gained on the validity of the theoretical model, students’ answers will allow a diagnosis of their model competence.

  • Are there differences among students from grade 7 to grade 10 in their model competence?

Method

The theoretical model is operationalized in open-ended items, multiple-choice items, closed forced-choice items, and hands-on tasks. The multiple-choice items will be reduced in a pilot study and content-validated using “thinking aloud” protocols, calculating item difficulties, and conducting distractor analyses. The open-ended items will be categorized within the structure of the model; after quantification they will be analyzed like the multiple-choice items. The closed forced-choice items are currently being developed. The hands-on assessment is videotaped and analyzed by qualitative content analysis according to Mayring (2003). The internal validity (multi-matrix design) is evaluated using confirmatory factor analyses (Mplus; Muthén & Muthén 2007) and IRT models (ConQuest; Wu et al. 1997) in order to test whether the theoretical model consists of discrete dimensions and categories (Hartig 2008). Analyses of variance will provide information about the relations between the components and the qualities of model competence (multiple-choice and open-ended items).
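The classical item statistics mentioned above can be sketched briefly. The snippet below is a minimal, hedged illustration with hypothetical response data (not the project’s actual items): item difficulty as the proportion of correct answers, and a simple distractor analysis that flags options almost nobody chooses as candidates for revision.

```python
# Minimal sketch of classical item analysis: difficulty (p-value)
# and distractor choice frequencies. Data below are hypothetical.
from collections import Counter

def item_difficulty(responses, key):
    """Proportion of test-takers choosing the correct option."""
    return sum(r == key for r in responses) / len(responses)

def distractor_analysis(responses, key):
    """Relative frequency of each answer option; distractors chosen
    by (almost) nobody do not function and should be revised."""
    counts = Counter(responses)
    return {option: counts[option] / len(responses)
            for option in sorted(set(responses) | {key})}

# Hypothetical multiple-choice item with correct answer "B"
answers = ["B", "B", "A", "B", "C", "B", "D", "B", "B", "A"]
print(item_difficulty(answers, "B"))      # 0.6
print(distractor_analysis(answers, "B"))  # {'A': 0.2, 'B': 0.6, 'C': 0.1, 'D': 0.1}
```

The IRT and factor-analytic steps then go beyond these descriptive statistics by testing the dimensional structure itself.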

Expected Outcomes

Findings with open-ended items suggest that students’ answers can be assigned to the theoretical model. According to the results with multiple-choice items, the operationalization led to valid items of adequate difficulty; malfunctioning distractors will be revised. “Thinking aloud” protocols indicated that, while answering the multiple-choice items, students considered features of models, which supports the content validity of the items. These first outcomes suggest that the operationalization using multiple-choice items is successful. Analyses concerning the internal validation of the theoretical model and differences between groups could not yet be conducted because there were too few items for each combination of aspect and quality. The pretest data for the four approaches will be available in July 2010, first outcomes of the analyses in August 2010.

References

Crawford, B. A. & M. J. Cullin (2005): Dynamic Assessments of preservice teachers’ knowledge of models and modelling. In: Boersma, K., M. Goedhart, O. de Jong & H. Eijkelhof [Eds.]: Research and the Quality of Education. Springer, Dordrecht. 309-323.

Gilbert, J. K. & C. J. Boulter [Eds.] (2000): Developing Models in Science Education. Kluwer, Dordrecht.

Grosslight, L., C. Unger, E. Jay & C. L. Smith (1991): Understanding Models and their Use in Science: Conceptions of Middle and High School Students and Experts. JRST 28 (9), 799-822.

Hartig, J. (2008): Psychometric Models for the Assessment of Competencies. In: Hartig, J., E. Klieme & D. Leutner [Eds.]: Assessment of Competencies in Educational Contexts. Hogrefe & Huber, Toronto. 69-90.

Justi, R. S. & J. K. Gilbert (2003): Teachers’ views on the nature of models. Int. J. Sci. Educ. 25 (11), 1369-1386.

Klieme, E., J. Hartig & D. Rauch (2008): The Concept of Competence in Educational Contexts. In: Hartig, J., E. Klieme & D. Leutner [Eds.]: Assessment of Competencies in Educational Contexts. Hogrefe & Huber, Toronto. 69-90.

Krüger, D. & A. Upmeier zu Belzen (2009): Modellkompetenz im Kontext Biologieunterricht. Internationale Tagung der Fachsektion Didaktik der Biologie (FDdB) im VBIO, Universität Kiel, 21.-25.09.2009. 48-49.

Mayring, P. (2003): Qualitative Inhaltsanalyse. Beltz UTB, Weinheim.

Muthén, L. K. & B. O. Muthén (2007): Mplus User’s Guide. Muthén & Muthén, Los Angeles.

Treagust, D. F., G. Chittleborough & T. L. Mamiala (2002): Students’ understanding of the role of scientific models in learning science. Int. J. Sci. Educ. 24 (4), 357-368.

Weinert, F. E. (2001): Vergleichende Leistungsmessung in Schulen – eine umstrittene Selbstverständlichkeit. In: Weinert, F. E. [Ed.]: Leistungsmessungen in Schulen. Beltz, Weinheim. 17-31.

Wu, M. L., R. J. Adams & M. R. Wilson (1997): ConQuest – Generalised item response modelling software, Draft Release 2. Australian Council for Educational Research, Camberwell.

Author Information

Humboldt-Universität zu Berlin
Institute of Biology
Berlin

Freie Universität Berlin
Biology Education
Berlin

