20 SES 14, International Perspectives on Student Engagement: Using Innovative Smartphone Technology in Science Classrooms
External validity remains a particular concern when using the experience sampling method (ESM) (Hektner, Schmidt, & Csikszentmihalyi, 2007; Silvia et al., 2013, 2014; Courvoisier, Eid, & Lischetzke, 2012; Messiah, Grondin, & Encrenaz, 2011). The method's novelty, together with the fact that data collection now often relies on phones, pagers, or other devices, prompts the question of how accurately it captures the experiences it purports to record. This has raised several important questions about the use and interpretation of ESM data, most notably how to understand validity, reliability, and generalizability in the context of an ESM study. These are three weighty concepts about which much has been written over the last fifty years or more in education research alone (cf. Campbell & Stanley, 1963; Shadish, Cook, & Campbell, 2002). Less, however, has been written about these concepts as they relate to study designs that employ ESM (Hektner, Schmidt, & Csikszentmihalyi, 2007). In this paper, I address a more specific, yet related, question: how to measure rates of participation in ESM designs. To answer this question, I use data from a recent international research collaboration (Schneider et al., under review) that used ESM to measure student engagement and optimal learning in secondary school science classes in the U.S. and Finland. To make valid inferences about our study population, it is critical to understand and analyze rates of participation, particularly across subgroups in the U.S. and Finland. One common way to represent participation in research is through response or volunteer rates. ESM, however, complicates traditional notions of response rates by adding a longitudinal component that allows additional opportunity for varied participation across the study sample.
Hektner, Schmidt, and Csikszentmihalyi (2007) distinguish three important factors that can influence participation in ESM studies: 1) volunteer rate, 2) signal response rate, and 3) participant attrition. In this paper, I briefly define each of these concepts and then present results and analyses for signal response rate and attrition, applying both hierarchical generalized linear models (HGLMs) and discrete and continuous time series analysis. Results suggest that rates of participation in the ESM study, as measured by both signal response rate and attrition, vary significantly by both gender and country. These findings have implications for drawing generalizations from ESM-based findings and can help inform future designs that minimize attrition and encourage higher rates of participation.
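To make the second and third of these factors concrete, the sketch below (illustrative only, not the paper's actual analysis code or data) computes a per-participant signal response rate — answered signals divided by delivered signals — and a simple end-of-study attrition flag from hypothetical ESM signal logs; the participant identifiers and record layout are assumptions.

```python
# Illustrative sketch, assuming ESM logs recorded as
# (participant_id, study_day, responded) tuples. Not the study's data.
from collections import defaultdict

signals = [
    ("p1", 1, True), ("p1", 1, False), ("p1", 2, True),
    ("p2", 1, True), ("p2", 2, False), ("p2", 2, False),
]

def response_rates(signals):
    """Signal response rate = answered signals / delivered signals."""
    sent = defaultdict(int)
    answered = defaultdict(int)
    for pid, _day, responded in signals:
        sent[pid] += 1
        if responded:
            answered[pid] += 1
    return {pid: answered[pid] / sent[pid] for pid in sent}

def attrited(signals, last_day):
    """Crude attrition flag: no answered signals on the final study day."""
    responded_last = {pid for pid, day, r in signals if day == last_day and r}
    everyone = {pid for pid, _day, _r in signals}
    return everyone - responded_last

print(response_rates(signals))
print(attrited(signals, last_day=2))
```

In practice, the paper models these quantities with HGLMs and time series methods rather than raw proportions; the raw rates above are only the descriptive starting point.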
References
Campbell, D. T., & Stanley, J. C. (1963). Experimental and quasi-experimental designs for research. Boston: Houghton Mifflin.
Courvoisier, D. S., Eid, M., & Lischetzke, T. (2012). Compliance to a cell phone-based ecological momentary assessment study: The effect of time and personality characteristics. Psychological Assessment, 24(3), 713.
Hektner, J. M., Schmidt, J. A., & Csikszentmihalyi, M. (Eds.). (2007). Experience sampling method: Measuring the quality of everyday life. Sage.
Messiah, A., Grondin, O., & Encrenaz, G. (2011). Factors associated with missing data in an experience sampling investigation of substance use determinants. Drug and Alcohol Dependence, 114(2), 153-158.
Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Wadsworth Cengage Learning.
Silvia, P. J., Kwapil, T. R., Eddington, K. M., & Brown, L. H. (2013). Missed beeps and missing data: Dispositional and situational predictors of nonresponse in experience sampling research. Social Science Computer Review, 0894439313479902.
Silvia, P. J., Kwapil, T. R., Walsh, M. A., & Myin-Germeys, I. (2014). Planned missing-data designs in experience-sampling research: Monte Carlo simulations of efficient designs for assessing within-person constructs. Behavior Research Methods, 46(1), 41-54.