Session Information
11 SES 03 B, The Study of Quality Improvement and Researcher’s Role
Paper Session
Contribution
Intervention research is an important field for establishing quality assurance procedures. Until now, the conceptualization and development of guidelines and activities for educational intervention research have rested largely with social research methodologists, who address criteria of empirical research, such as validity or design sensitivity, together with related assessment tools. However, improvements in educational intervention research could also come from considering expanded theoretical perspectives.
First, educational interventions need theory sophistication instead of early theory replacement: Of course, theories are used more or less substantially in intervention research to formulate hypotheses, to undertake construct validation, or to cope with alternative explanations or methodological problems, which should lead to enriched knowledge about a subject area (e.g., Nipedal, Nesdale, & Killen, 2010). However, this kind of progress through enrichment is endangered because, especially in educational research, researchers have repeatedly complained about theories that "seem to replace one another, rather than subsume, extend, or complement other theories" (DiSessa & Cobb, 2004, p. 79). There is probably less progress, and less confidence in nomological networks, than there could be, owing to resistance to integrating different theories or background variables that have been found to be effective in other studies (e.g., Lynch, 1983).
Second, educational interventions need to apply methods for theoretical problem solving: Undoubtedly, the design of interventions should be well balanced with respect to the state of the art in methodological and theoretical developments. However, there is far more literature on applying scientific methods for gathering, processing, and validating data than on constructing theories and integrating them into the design and evaluation of intervention research (e.g., Reynolds, 2007). One might conclude that, within the field of educational intervention research, methodological problem solving is much more firmly established than theoretical problem solving.
Third, not only measurements or assessments but also educational interventions need construct validation: There is a whole scientific "industry" dealing with the construction of tests for measuring personality characteristics and related variables (e.g., Embretson, 2007). However, within educational intervention research, theories have played a major role in the construct validation of dependent variables, but not of independent ones. Bredenkamp (1979) has pointed out that a construct validation of independent variables must ask whether an intervention is representative of all possible interventions, of related internal processes, or of all modalities of a variable.
All these activities showing how to use theories to improve intervention research have identified important goals, processes, and areas. However, they have not delivered tools that could assist designers and evaluators of educational interventions in a comprehensive and systematic way. Such a tool could consist of areas and related questions that guide the use of theories in step-by-step intervention design and evaluation. Valentine and Cooper (2008) have presented such a tool, namely a quality scale, but with a strong focus on the role of methods rather than theories.
It is the purpose of this paper to develop and apply such a tool, called Intervention Theory Questions (ITQ).
Method
Expected Outcomes
References
Astleitner, H. (2011). Theorieentwicklung für SozialwissenschaftlerInnen [Theory building methods for social scientists]. Wien: Böhlau, UTB.
Brewer, J. & Hunter, A. (2006). Foundations of multimethod research: Synthesizing styles. Thousand Oaks, CA: Sage.
DiSessa, A. A. & Cobb, P. (2004). Ontological innovation and the role of theory in design experiments. The Journal of the Learning Sciences, 13, 77-103.
Fuchs, L. S., Fuchs, D., & Speece, D. L. (2002). Treatment validity as a unifying construct for identifying learning disabilities. Learning Disability Quarterly, 25, 33-45.
Highhouse, S. (2009). Designing experiments that generalize. Organizational Research Methods, 12, 554-566.
Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston: Houghton Mifflin.
Valentine, J. C. & Cooper, H. (2008). A systematic and transparent approach for assessing the methodological quality of intervention effectiveness research: The study design and implementation assessment device (Study DIAD). Psychological Methods, 13, 130-149.
Whyte, J. (2006). Using treatment theories to refine the designs of brain injury rehabilitation treatment effectiveness studies. Journal of Head Trauma Rehabilitation, 21, 99-106.