Session Information
08 SES 08, Policy Enactment and Implementation
Paper Session
Contribution
Health education programmes in various settings (workplace, sports club, school, hospital) are considered effective means of improving population health. Research has endeavoured to provide evidence of their success; however, results from programme implementation remain unclear and challenging to evaluate. Furthermore, demonstrating a positive and sustainable impact on health inequalities is difficult. The complexity of the factors affecting the effectiveness of prevention programmes has led many authors to treat evaluation results with caution. Beyond these difficulties in assessing the effectiveness of prevention programmes, the issues of scaling up and transferability are still rarely examined.
Transferability and scaling up of prevention interventions remain challenging
In the existing literature, tools and frameworks developed for programme evaluation are often (though not always) grounded in a linear programme-fidelity perspective. Implementation evaluation is thereby reduced to two binary questions: a) was the programme delivered as planned or not? and b) did it produce the expected outcomes or not? Implementation, however, is a complex process that defies such linear, one-dimensional thinking. Multiple, interwoven contextual factors are at play, relating not only to the nature of the intervention but also, and more importantly, to the different contexts of implementation. This complexity poses two challenges for the development of successful intervention programmes. The first concerns transferability: given such variability, and with limited options to control for it, streamlined outcomes are in practice difficult to predict. The second concerns the wider replication of interventions, which cannot be taken for granted because the determinants involved are numerous, variable and contextually influenced.
Implementation as a process of change
The implementation of a given programme cannot be reduced to a “plug and play” process. Research in the school setting, for example, has shown the many different types of mechanisms involved. These mechanisms are linked to the characteristics of staff members, the setting, the community and the programme(s) themselves. Depending on the context, the programme or the stage of the process, professionals cope with a multitude of stimuli, try to make the most of the situation, define the status of programmes, select what fits, customise what can be used and discard what does not suit; in brief, they often follow their own path.
The assumption that ‘one size fits all’, that contexts are homogeneous and will all respond similarly, is deeply problematic. This perspective, we argue, decreases effectiveness, limits community ‘buy-in’ and thus adversely impacts the sustainability of interventions. The temptation to judge contexts that fail to deliver pre-defined objectives in a pre-defined way reveals a lack of insight into the complexity inherent in those contexts. Intervention implementation needs to take the community or setting as its point of origin and adopt, as a matter of course, what is commonly referred to as working from ‘the ground up’.
The health education programme can be considered an added ingredient (among many others) in the existing context. It may act as a catalyst, a revealer, or sometimes even a constraint that prompts new solutions and innovations, contributing to a better fit between people and their surroundings. This occurs whether or not the expected impact on education or health is achieved, as it is the very interactions between the context and the newly introduced programme that initiate the expected changes.
Assuming that outputs are produced by interactions between the context and the programme implies assessing the process rather than its results alone. This does not mean that programme outputs are disregarded; on the contrary, it means that they are systematically considered in the light of the process that created them.
Method
Expected Outcomes