
Implementation evaluation of school-based obesity prevention programmes in youth: how, what and why?

Published online by Cambridge University Press:  10 December 2014

Femke van Nassau
Department of Public & Occupational Health and EMGO Institute for Health and Care Research, VU University Medical Center, Van der Boechorststraat 7, 1081 BT Amsterdam, The Netherlands
Amika S. Singh
Department of Public & Occupational Health and EMGO Institute for Health and Care Research, VU University Medical Center, Van der Boechorststraat 7, 1081 BT Amsterdam, The Netherlands
Willem van Mechelen
Department of Public & Occupational Health and EMGO Institute for Health and Care Research, VU University Medical Center, Van der Boechorststraat 7, 1081 BT Amsterdam, The Netherlands
Johannes Brug
Department of Epidemiology and Biostatistics and EMGO Institute for Health and Care Research, VU University Medical Center, Amsterdam, The Netherlands
Mai J.M. Chinapaw
Department of Public & Occupational Health and EMGO Institute for Health and Care Research, VU University Medical Center, Van der Boechorststraat 7, 1081 BT Amsterdam, The Netherlands


Invited Commentary
Copyright © The Authors 2015 

In an ideal world, one combats public health problems with theory- and evidence-based programmes. In the real world, evidence-based programmes are often lacking, and the programmes that are developed and implemented are mainly practice-based. In the last few decades, governmental and other funding agencies have prioritized the development and evaluation of evidence-based obesity prevention programmes to combat the major public health problem of childhood obesity(1). Consequently, a large variety of healthy nutrition and physical activity promotion programmes targeting youth have been developed and evaluated in more or less controlled and real-world settings(2–5). Unfortunately, the real-world effectiveness of many programmes is disappointing, especially in the long term(6,7).

This lack of effectiveness may arise because the programme was not effective in itself or because it was not implemented as intended(8). It is therefore of obvious importance to evaluate whether, and to what extent, a programme was implemented as intended. Implementation evaluation research can provide insight into the dynamic nature of implementation processes and into the key factors that are expected to be critical for achieving effectiveness of overweight and obesity prevention programmes during implementation.

Three important generic implementation research questions in the context of programme evaluation are: (i) how to promote implementation as intended; (ii) what happens during implementation; and (iii) why did my programme (not) work? In the present invited commentary we discuss these three questions, enriched by our experiences with the school-based obesity prevention programme DOiT (Dutch Obesity Intervention in Teenagers)(9,10).

How to promote implementation as intended?

Moving too quickly from science to the real world may result in implementation of programmes that are not yet ready for implementation. Moving too slowly may lead to implementation of interventions that are easy to implement but not evidence-based. Therefore, science and practice need to collaborate to develop feasible, theory- and evidence-based programmes that are ready for implementation.

Schools are regarded as a convenient and practical setting to implement programmes targeting children’s and adolescents’ health behaviour(11). In such programmes, teachers are often intermediaries delivering the programme. Implementation of such programmes requires that teachers change their daily routines. However, change often does not occur automatically or simultaneously among all teachers within a school(12). If one teacher is enthusiastic to implement a new programme, this does not mean that all teachers in that school are willing to work with the programme as well.

A growing body of evidence has identified a large variety of factors that may explain the transition of implementers from non-use to sufficient use through stages of innovation, i.e. adoption, implementation and continuation(13,14). Regarding school-based overweight and obesity prevention, primary determinants of behavioural change of teachers are: (i) contextual factors, such as the extent to which a programme fits the existing school health policy; (ii) organizational factors, such as the decision-making process in the school, available time and budget; (iii) individual factors, such as teachers’ knowledge, skills, self-efficacy and intention to implement the programme; (iv) characteristics of the programme, such as compatibility and flexibility of the programme; and (v) characteristics of the implementation strategy, such as programme training, feedback on implementation and implementation materials.

These factors can either facilitate or impede implementation. For instance, if teachers are not continuously supported to prepare, implement and evaluate the lessons, they might deliver only a small part of the programme or refuse to implement the programme at all. Barriers to implementation may lead to negative adaptations or even termination of the programme. Therefore, it is important to identify and then address these factors in order to ensure optimal implementation.

As these factors can change over time, an implementation plan, tailored to the potential implementers at both organizational and individual levels, should be developed to support implementers throughout the process of adoption, implementation and continuation. The first step towards such an implementation plan is to address the potential mismatch between a programme and its implementers by identifying facilitating factors and barriers for implementation(15). For example, intervention developers might think that a standardized multi-component programme is most effective, while teachers may prefer flexibility during implementation. Close collaboration with implementers and other practice stakeholders during the development of the programme and the implementation plan can provide insight into programme-specific factors that need to be addressed, such as duration, compatibility and flexibility of the programme.

The next step is to define essential elements and strategies for implementation. Examples of essential elements are a favourable school climate including a supportive programme coordinator, supportive colleagues and available time for implementation(16). Several programme delivery strategies have been shown to promote implementation; for example, providing materials and training(14,16), providing regular feedback on implementation behaviour(12,17) and offering technical assistance to support implementation(18–23). The most applicable strategies should then be selected together with the potential implementers. In DOiT, for example, we used a person-to-person approach by installing a ‘DOiT support office’, based on the advice of teachers and stakeholders. The contact person in this support office was available for support and advice for implementers of DOiT throughout the school year.

The final step is to merge this knowledge on essential elements and strategies into an implementation plan that supports the process of implementation.

What happens during implementation?

The next task in implementation evaluation is adequate and systematic measurement of the implementation process. In recent years, the number of programme evaluations including a process evaluation has increased(12) and several models and frameworks have been used. A few examples are the Diffusion of Innovations theory of Rogers (i.e. a theory that seeks to explain how, why and at what rate new ideas and technologies are diffused)(24), the model developed by Steckler and Linnan (i.e. a guide for the conduct of a process evaluation)(25), the Process Evaluation Plan of Saunders et al. (i.e. a comprehensive and systematic approach for developing a process-evaluation plan)(26) and the RE-AIM framework (i.e. a framework designed to enhance the quality, speed and public health impact of efforts to translate research)(27,28). By measuring process indicators, such as reach, fidelity and dosage, researchers can document whether the target population (e.g. youth at the schools) was reached, whether adaptations to the programme were made and what part of the programme was implemented(13,14).
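To make these indicators concrete, the sketch below summarizes dosage, fidelity and reach for one school from teacher lesson logs. The log structure, field names and the simple ratio definitions are illustrative assumptions for this commentary, not part of any of the cited frameworks:

```python
from dataclasses import dataclass

@dataclass
class LessonLog:
    """One teacher's record of one programme lesson (hypothetical structure)."""
    lesson_id: int
    delivered: bool        # was the lesson taught at all?
    minutes_spent: int     # time spent on delivery
    followed_manual: bool  # delivered according to the teacher manual?
    pupils_present: int    # pupils who received the lesson

def process_indicators(logs, n_lessons_planned, n_pupils_enrolled, planned_minutes):
    """Summarize dosage, fidelity and reach for one school as proportions (0-1)."""
    delivered = [log for log in logs if log.delivered]
    # Dosage: delivered teaching time relative to the planned total
    dosage = sum(log.minutes_spent for log in delivered) / (n_lessons_planned * planned_minutes)
    # Fidelity: share of delivered lessons taught according to the manual
    fidelity = sum(log.followed_manual for log in delivered) / len(delivered) if delivered else 0.0
    # Reach: largest share of enrolled pupils present at any delivered lesson
    reach = max(log.pupils_present for log in delivered) / n_pupils_enrolled if delivered else 0.0
    return {"dosage": round(dosage, 2), "fidelity": round(fidelity, 2), "reach": round(reach, 2)}
```

For example, a school that taught only two of three planned 45-minute lessons, one of them according to the manual, would score well below 1 on dosage and fidelity, flagging incomplete delivery before any effect analysis is attempted.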

However, the large array of impeding and facilitating factors that can influence implementation is seldom part of evaluations. Examples of such factors are teachers’ intention to implement the programme, available time for implementation and supervisors’ support. These influential factors should be measured throughout the whole implementation process, including the phase preceding implementation. Since there is limited knowledge about the mechanisms that underlie successful implementation, combining process indicators and influential factors measured on multiple occasions can help to gain insight into the dynamic nature of implementation processes and into key factors for successful implementation.

Why did my programme (not) work?

In this next step of implementation research, interpretation of the different implementation measures is needed to assess at what level implementation occurred (e.g. degree of implementation) and how implementation affected programme effectiveness (e.g. was the programme successful in changing youth’s energy balance-related behaviours and reducing overweight and obesity?). Youth cannot benefit from programmes they do not receive. Therefore, the first step is to define the degree of programme implementation. Some studies have reported the number of lessons taught as a single measure of the degree of implementation. Since implementation is a complex process, however, a combination of different process indicators plays a role, at both the programme and the support level(14): dosage (e.g. how much time was spent on programme delivery, how many lessons, how many core activities), fidelity (e.g. to what extent the programme was delivered according to the teacher manual and what adaptations were made) and quality of delivery (e.g. skills and motivation of teachers, and support within a school for implementation). Next, the association between the degree of implementation and programme outcomes can be explored; for example, did schools with a higher degree of implementation show larger effects?
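A minimal sketch of this exploratory step: combine the indicators into a composite degree-of-implementation score per school and correlate it with the observed outcome change. The unweighted mean and the per-school numbers below are illustrative assumptions; in practice, the choice and weighting of indicators is a substantive decision:

```python
import math

def degree_of_implementation(dosage, fidelity, quality):
    """Composite score per school: unweighted mean of three indicators (illustrative)."""
    return (dosage + fidelity + quality) / 3

def pearson_r(xs, ys):
    """Pearson correlation between implementation scores and outcome changes."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-school data: (dosage, fidelity, quality, outcome change)
schools = [(0.9, 0.8, 0.7, 1.0), (0.5, 0.6, 0.5, 0.5), (0.3, 0.4, 0.6, 0.1)]
scores = [degree_of_implementation(d, f, q) for d, f, q, _ in schools]
outcomes = [o for *_, o in schools]
```

A strongly positive correlation in such an exploration would suggest that schools with a higher degree of implementation showed larger effects; a formal analysis would of course require many more schools and multilevel modelling of pupils nested within schools.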

Translation into the real world

Finally, in order to maximize the public health impact and to successfully decrease childhood overweight and obesity prevalence rates, a blueprint for dissemination of the programme should be developed(17). Since most programmes use a broad range of programme components and strategies, implementers often make changes to the programme; for instance, teachers adapt lessons to fit their teaching preferences. However, to promote programme effectiveness in real-world settings, we need to distinguish which combination of programme components contributes most to the beneficial health effects(5). These effective components should be bundled into a so-called blueprint for replication of the programme. This blueprint should contain: (i) information on contextual conditions that are compulsory for implementation (e.g. support, available budget, available time); (ii) a description of core, or most essential, components of the programme; and (iii) a description of the most critical, core components of the implementation plan that need to be executed in order to achieve effectiveness of the programme. This blueprint can be used to effectively implement the programme more widely. If there is, for example, insufficient support, time and/or budget for implementation, schools should not adopt the programme.

In summary

Answering the three proposed generic research questions can lead to a better understanding of how to implement overweight and obesity prevention programmes effectively. It means that science has to collaborate with practice in order to develop an implementation plan. It also means that one should evaluate the process of implementation by measuring both process indicators and facilitating factors and barriers. Moreover, one should explore the key factors that are expected to be critical for achieving effectiveness during implementation. We therefore call for more implementation research in the overweight and obesity prevention field, not only to promote adequate implementation of effective programmes, but also to increase knowledge of effective implementation strategies.


Conflict of interest: None. Authorship: F.v.N. drafted the original idea for the commentary based on experience and conversations with A.S.S., M.J.M.C., W.v.M. and J.B. A.S.S., M.J.M.C., W.v.M. and J.B. critically reviewed the manuscript. All authors approved the final draft of the manuscript.


1. Brug, J, van Stralen, MM, te Velde, SJ et al. (2012) Differences in weight status and energy-balance related behaviors among schoolchildren across Europe: the ENERGY-project. PLoS One 7, e34742.
2. Doak, CM, Visscher, TL, Renders, CM et al. (2006) The prevention of overweight and obesity in children and adolescents: a review of interventions and programmes. Obes Rev 7, 111–136.
3. Karnik, S & Kanekar, A (2012) Childhood obesity: a global public health crisis. Int J Prev Med 3, 1–7.
4. Swinburn, B (2009) Obesity prevention in children and adolescents. Child Adolesc Psychiatr Clin N Am 18, 209–223.
5. Waters, E, de Silva-Sanigorski, A, Hall, BJ et al. (2011) Interventions for preventing obesity in children. Cochrane Database Syst Rev issue 12, CD001871.
6. Metcalf, B, Henley, W & Wilkin, T (2012) Effectiveness of intervention on physical activity of children: systematic review and meta-analysis of controlled trials with objectively measured outcomes (EarlyBird 54). BMJ 345, e5888.
7. Van Cauwenberghe, E, Maes, L, Spittaels, H et al. (2010) Effectiveness of school-based interventions in Europe to promote healthy nutrition in children and adolescents: systematic review of published and ‘grey’ literature. Br J Nutr 103, 781–797.
8. Durlak, JA & DuPre, EP (2008) Implementation matters: a review of research on the influence of implementation on program outcomes and the factors affecting implementation. Am J Community Psychol 41, 327–350.
9. Singh, AS, Chinapaw, MJM, Kremers, SP et al. (2006) Design of the Dutch Obesity Intervention in Teenagers (NRG-DOiT): systematic development, implementation and evaluation of a school-based intervention aimed at the prevention of excessive weight gain in adolescents. BMC Public Health 6, 304.
10. van Nassau, F, Singh, AS, van Mechelen, W et al. (2014) In preparation of the nationwide dissemination of the school-based obesity prevention program DOiT: stepwise development applying the Intervention Mapping protocol. J Sch Health 84, 481–491.
11. Katz, DL, O’Connell, M, Njike, VY et al. (2008) Strategies for the prevention and control of obesity in the school setting: systematic review and meta-analysis. Int J Obes (Lond) 32, 1780–1789.
12. Fixsen, DL, Naoom, SF, Blase, KA et al. (2005) Implementation Research: A Synthesis of the Literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network.
13. Fleuren, M, Wiefferink, K & Paulussen, TGWM (2004) Determinants of innovation within health care organizations: literature review and Delphi study. Int J Qual Health Care 16, 107–123.
14. Domitrovich, CE, Bradshaw, CP, Poduska, JM et al. (2008) Maximizing the implementation quality of evidence-based preventive interventions in schools: a conceptual framework. Adv Sch Based Ment Health Promot 1, 6–28.
15. Bartholomew, LK, Parcel, GS, Kok, G et al. (2001) Intervention Mapping: Designing Theory and Evidence-Based Health Promotion Programs. Mountain View, CA: Mayfield.
16. Bessems, KM (2011) The dissemination of the healthy diet programme Krachtvoer for Dutch prevocational schools. PhD Thesis, Maastricht University.
17. Spoth, R, Rohrbach, LA, Greenberg, M et al. (2013) Addressing core challenges for the next generation of type 2 translation research and systems: the translation science to population impact (TSci Impact) framework. Prev Sci 14, 319–351.
18. Bessems, KM, Van Assema, P, Paulussen, TGWM et al. (2011) Evaluation of an adoption strategy for a healthy diet programme for lower vocational schools. Health Educ Res 26, 89–105.
19. Brink, SG, Basen-Engquist, KM, O’Hara-Tompkins, NM et al. (1995) Diffusion of an effective tobacco prevention program. Part I: Evaluation of the dissemination phase. Health Educ Res 10, 283–295.
20. Cahill, HW (2007) Challenges in adopting evidence-based school drug education programmes. Drug Alcohol Rev 26, 673–679.
21. Hoelscher, DM, Kelder, SH, Murray, N et al. (2001) Dissemination and adoption of the Child and Adolescent Trial for Cardiovascular Health (CATCH): a case study in Texas. J Public Health Manag Pract 7, 90–100.
22. Johnstone, E, Knight, J, Gillham, K et al. (2006) System-wide adoption of health promotion practices by schools: evaluation of a telephone and mail-based dissemination strategy in Australia. Health Promot Int 21, 209–218.
23. Roberts-Gray, C, Solomon, T, Gottlieb, N et al. (1998) Heart Partners: a strategy for promoting effective diffusion of school health promotion programs. J Sch Health 68, 106–110.
24. Rogers, EM (1995) Diffusion of Innovations. New York: The Free Press.
25. Steckler, A & Linnan, L (2002) Process Evaluation for Public Health Interventions and Research. San Francisco, CA: Jossey-Bass.
26. Saunders, RP, Evans, MH & Joshi, P (2005) Developing a process-evaluation plan for assessing health promotion program implementation: a how-to guide. Health Promot Pract 6, 134–147.
27. Dzewaltowski, DA, Glasgow, RE, Klesges, LM et al. (2004) RE-AIM: evidence-based standards and a Web resource to improve translation of research into practice. Ann Behav Med 28, 75–80.
28. Glasgow, RE, Vogt, TM & Boles, SM (1999) Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health 89, 1322–1327.