20 SES 14, International Perspectives on Student Engagement: Using Innovative Smartphone Technology in Science Classrooms
External validity remains a particular concern when using the experience sampling method (ESM) (Hektner, Schmidt, & Csikszentmihalyi, 2007; Silvia et al., 2013, 2014; Courvoisier, Eid, & Lischetzke, 2012; Messiah, Grondin, & Encrenaz, 2011). The novelty of the method, which typically collects data via phones, pagers, or other signaling devices, prompts one to question how accurately it captures the experiences it purports to record. This has raised several important questions about the use and interpretation of ESM data, most notably how to understand validity, reliability, and generalizability in the context of an ESM study. These are three weighty concepts, about which much has been written over the last fifty years or more in education research alone (cf. Campbell & Stanley, 1963; Shadish, Cook, & Campbell, 2002). Less, however, has been written about these concepts as they relate to study designs that employ ESM (Hektner, Schmidt, & Csikszentmihalyi, 2007). In this paper, I address a more specific yet related question: how to measure rates of participation in ESM designs. To answer this question, I use data from a recent international research collaboration (Schneider et al., under review) that used ESM to measure student engagement and optimal learning in secondary school science classes in the U.S. and Finland. To make valid inferences about our study population, it is critical to understand and analyze rates of participation, particularly across subgroups in the U.S. and Finland. One possible way to represent participation in research is through response or volunteer rates. However, ESM complicates traditional notions of response rates by adding a longitudinal component that allows additional opportunity for varied participation in the study sample.
Hektner, Schmidt, and Csikszentmihalyi (2007) distinguish three important factors that can influence participation in ESM studies: 1) volunteer rate, 2) signal response rate, and 3) participant attrition. In this paper, I briefly define each of these concepts and then present results and analysis for signal response rate and attrition, applying both hierarchical generalized linear models (HGLMs) and discrete and continuous time series analysis. Results suggest that rates of participation in the ESM study, as measured by both signal response rate and attrition, vary significantly by both gender and country. These findings have implications for drawing generalizations from findings derived from ESM approaches and can help inform future designs that might minimize attrition and encourage higher rates of participation.
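The three participation measures above are simple ratios. A minimal sketch, using made-up numbers for a hypothetical ESM study (not data from this paper), illustrates how each would be computed:

```python
# Illustrative sketch only: the three participation measures that
# Hektner, Schmidt, and Csikszentmihalyi (2007) distinguish, applied
# to hypothetical numbers (not the U.S./Finland study data).

def volunteer_rate(n_volunteered, n_invited):
    """Share of invited students who agreed to take part."""
    return n_volunteered / n_invited

def signal_response_rate(n_answered, n_signals_sent):
    """Share of ESM signals (beeps) a participant answered."""
    return n_answered / n_signals_sent

def attrition_rate(n_started, n_completed):
    """Share of starting participants who dropped out before the end."""
    return (n_started - n_completed) / n_started

# Hypothetical example: 120 students invited, 96 volunteer, one
# participant answers 31 of 40 signals, and 84 students finish.
print(volunteer_rate(96, 120))        # 0.8
print(signal_response_rate(31, 40))   # 0.775
print(attrition_rate(96, 84))         # 0.125
```

The longitudinal complication noted above is visible in the second measure: unlike a one-shot survey response rate, the signal response rate is defined per participant per signal, so it can vary both across people and over the course of the study.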
Campbell, D. T., & Stanley, J. C. (1963). Experimental and quasi-experimental designs for research. Boston: Houghton Mifflin.
Courvoisier, D. S., Eid, M., & Lischetzke, T. (2012). Compliance to a cell phone-based ecological momentary assessment study: The effect of time and personality characteristics. Psychological Assessment, 24(3), 713.
Hektner, J. M., Schmidt, J. A., & Csikszentmihalyi, M. (2007). Experience sampling method: Measuring the quality of everyday life. Thousand Oaks, CA: Sage.
Messiah, A., Grondin, O., & Encrenaz, G. (2011). Factors associated with missing data in an experience sampling investigation of substance use determinants. Drug and Alcohol Dependence, 114(2), 153-158.
Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Wadsworth Cengage Learning.
Silvia, P. J., Kwapil, T. R., Eddington, K. M., & Brown, L. H. (2013). Missed beeps and missing data: Dispositional and situational predictors of nonresponse in experience sampling research. Social Science Computer Review, 0894439313479902.
Silvia, P. J., Kwapil, T. R., Walsh, M. A., & Myin-Germeys, I. (2014). Planned missing-data designs in experience-sampling research: Monte Carlo simulations of efficient designs for assessing within-person constructs. Behavior Research Methods, 46(1), 41-54.