Editorial
First published online June 14, 2021

Health Promotion Planning and an Interview With Dr. Lawrence Green

Abstract

“If we are to have more evidence-based practice, we need more practice-based evidence.” This quote has become something of a mantra for the health promotion profession’s most pre-eminent scholar, Dr. Lawrence W. Green. This editorial features an interview with Dr. Green and previews the forthcoming 5th edition of Green and Kreuter’s seminal health promotion planning textbook. The new title will be Health Program Planning, Implementation and Evaluation: Creating Behavioral, Environmental and Policy Change, with the Johns Hopkins University Press as the new publisher. Co-Editors for this new edition are Larry Green, Andrea Gielen, Judith Ottoson, Darleen Peterson, and Marshall Kreuter. This edition shows the vital progression from planning and implementation to evaluation and has further refined and simplified the visual representation of the planning model. The “enabling factors” that will spawn more practice-based evidence are discussed. To enable practice-based research will mean that end users of a service or intervention must be taught to be leaders and advocates for approaches that are responsive to their needs, preferences and values.
One element that separates a profession from an occupation is specialized knowledge, and in the health promotion field we’re taught that our approaches to improving well-being must be grounded in evidence. How is it then that one of the most pre-eminent scholars in our discipline feels we need “an antidote to the widespread imposition” of evidence-based practice? As one of countless Larry Green devotees, I’ve often reiterated his challenge that “if we are to have more evidence-based practice, we need more practice-based evidence.” His most widely cited article advancing this argument was published in the Annual Review of Public Health where he and his co-authors posit that if we are to be more adept at bridging science and practice we’ll need a more decentralized approach to dissemination, including “changes in the way we produce the science itself.”1
Given how often Green has returned to the concept of practice-based evidence in his research and writing,2-4 I’ve come to consider the premise as more than a challenge; it also seems an overt indictment of our profession’s research legacy and an admonition to our profession’s practitioners. Indeed, Green considers the production of practice-based evidence both a criterion for credible interventions and a “prerequisite for funding or approval of programs in varied populations.” Like most health promotion professionals trained in one of this nation’s Schools of Public Health, one of my required textbooks was Green and Kreuter’s “Health Promotion Planning: An Educational and Ecological Approach.” The kind of evidence that could and should be derived from practice, and used to inform research, has been codified in this book’s PRECEDE-PROCEED framework. It is undoubtedly the most studied, cited and replicated planning paradigm in our profession, with over 1,000 published studies that codify, affirm and show health improvement outcomes produced via use of the model.5
PRECEDE, developed in 1974, stands for Predisposing, Reinforcing and Enabling Constructs in Educational Diagnosis and Evaluation. PROCEED, added in 1991, stands for Policy, Regulatory, and Organizational Constructs in Educational and Environmental Development. That this planning framework was prescient in scope has been affirmed every year in the worksite health promotion sphere as I’ve watched this discipline’s slow but steady migration from a focus on individually targeted programs to more environmental and ecological approaches.
In the decades since the first edition, Green and Kreuter’s profuse tome remains our profession’s most cogent and comprehensive guide for how to collect and evaluate evidence to inform program planning and evaluation. I’m pleased, then, to report that the 5th edition of this seminal text is in production. I know this because I was honored to be a co-author, with Nico Pronk and Shelly Golden, of the chapter on program planning for occupational settings. I plan to profile and interview several of the chapter authors of this latest edition in my editorials in journal issues to come, beginning in this issue with an interview with Dr. Green. Stay tuned to these pages for an announcement of the book’s release date.
I was enthusiastic about the chance to review the PRECEDE-PROCEED framework in the context of worksite wellness programs because program planning hasn’t matured in this corner of the health promotion profession nearly as fully as it should. Worksite wellness scorecards are the tools most commonly used by companies intent on assessing whether their programs adhere to best-practice standards, and by worksite wellness practitioners interested in comparing their approaches and outcomes against comparable organizations or sectors. One commonly used scorecard is freely available from HERO (the Health Enhancement Research Organization), where I work as a Senior Fellow. Over 1,800 companies have completed the HERO Scorecard, developed in cooperation with Mercer.6 In one HERO study, 205 organizations responded to scorecard questions about strategic planning. Only 6% of companies that had no written objectives reported significant employee health risk improvement and 22% reported slight improvement. In contrast, 21% of organizations with written objectives reported significant improvement and 49% reported slight improvement.7 Given this clear connection between planning and improved outcomes, it is troubling that HERO research also shows planning to be lacking in most organizations. Specifically, a study of HERO Scorecard users found that 44% of companies reported having no written strategic plan and only 25% had a multi-year strategic plan. Also unfortunately, of those who had a written plan, fewer than half (47%) felt that their strategic planning had been effective.8
Underinvestment in rigorous assessment and planning is all the more surprising given the impressive investments being made in employee health improvement programs and services. Underplanning yet overdelivering isn’t at all responsive to Green’s call for practice-based evidence. Instead, it may reflect a headlong march using a “ready, fire, aim” approach that, though well-meaning, may be why some worksite wellness programs are felt to be paternalistic and intrusive rather than what wellness should be best at: personalized, relevant, fun and engaging.
Speaking of fun and engaging, the interview below was a repeat chance for me to interview Dr. Larry Green. In a prior interview with Larry, published on these pages, we explored many of the cultural influences and scientific conundrums that led to the development of the PRECEDE framework. I also offered a brief profile of Larry’s extraordinary career and inimitable impact on our profession.9 In the interview below we discuss the current issues and challenges that Green and his co-authors address in the 5th edition, and we return to some seminal moments for him and our profession that continue to drive his call for practice-based evidence. Green quips that his has been a “turnstile career,” and the sidebar containing his professional positions reads more like the journey of a cadre of audacious researchers and valiant public health leaders than the explorations of one unassuming, wry-witted scholar.

Lawrence W. Green, MPH, DrPH, ScD(Hon). Professional Positions Held, 1961-2016

Professor Emeritus, since 2016; Professor, Department of Epidemiology and Biostatistics, School of Medicine; Postdoctoral Fellows Program faculty for the Center for Tobacco Control Research and Education; the Helen Diller Comprehensive Cancer Center—Population Sciences and Disparities Programs, University of California at San Francisco; and Community Engagement Program of the Clinical Translational Science Institute of UCSF and of the Robert Wood Johnson Health and Society Scholars Program of UCSF and UC Berkeley, 2005-2016;
Health & Society Visiting Professor, University of Maryland, School of Public Health, College Park, Dec 2004-June 2005.
Visiting Professor, University of California at Berkeley, School of Public Health, 2004-2005.
Director, Office of Science & Extramural Research, Public Health Practice Program Office, Centers for Disease Control and Prevention (CDC), 2001-2004.
Visiting Professor, Department of Behavioral Sciences & Health Education, Rollins School of Public Health, Emory University, 2002-2004.
Acting Director, Office on Smoking and Health, National Center for Chronic Disease Prevention & Health Promotion, Centers for Disease Control and Prevention, 2001.
Distinguished Fellow/Visiting Scientist (1999-2004), Founding Co-Director, CDC-World Health Organization Collaborating Center on Global Tobacco Control, CDC, 1999-2001.
Director, Institute of Health Promotion Research, Faculty of Graduate Studies; Professor, Department of Health Care & Epidemiology, Faculty of Medicine, and Head, Division of Health Promotion and Preventive Medicine, University of British Columbia, 1991-1999.
Vice President and Director, National Health Promotion Program, Henry J. Kaiser Family Foundation, 1988-91.
Professor and (founding) Director, Center for Health Promotion & Prevention Research, University of Texas Health Science Center at Houston, 1982-88.
Visiting Lecturer, Center for Health Policy Research, and School of Public Health, Harvard University, 1981-82. Consultant to WHO, Geneva.
Director, Office of Health Information and Health Promotion (now Office of Disease Prevention & Health Promotion), U.S. Department of Health & Human Services, 1979-81.
Assistant Professor to Professor and (founding) Head, Division of Health Education, Johns Hopkins University School of Hygiene and Public Health, 1970-79.
Lecturer and Doctoral Program Coordinator, School of Public Health, University of California at Berkeley, 1968-70.
Ford Foundation Project Associate, University of California Family Planning Research and Development Project, Dhaka, East Pakistan (now Bangladesh), 1963-65.
Field training positions between BS & MPH programs at UC Berkeley: California State Health Department, Berkeley, 1962; Contra Costa County Health Department, 1963; USPHS Region IX Office, San Francisco, 1963.
Statistical Assistant, UC Berkeley Department of Physiology, 1961-62; Research Assistant, Navajo Health Education Project, School of Public Health, UC Berkeley.

An Interview with Dr. Lawrence Green

Dr. Paul Terry: Congratulations on nearing the completion of the 5th Edition of Health Promotion Planning (now the first edition with a new publisher, Johns Hopkins University Press). The new title will be “Health Program Planning, Implementation and Evaluation: Creating Behavioral, Environmental and Policy Change.” Your Co-Editors for this edition are Andrea Gielen, Judith Ottoson, Darleen Peterson, and Marshall Kreuter. This edition shows the clear progression from planning and implementation to evaluation and you’ve further refined and simplified the visual representation of the planning model. Can you describe the genesis of previous editions and some of the research derived from the PRECEDE model?
Dr. Lawrence Green: Our first edition presented an approach to planning and evaluating health education and health promotion programs that became the PRECEDE model. Formalizing the PROCEED model came in later editions. The model was built on research and field experience from my work in county, state and federal agencies in California during my field training as an MPH student at Berkeley, and then elaborated and tested over 2 years in Bangladesh and during my doctoral studies at Berkeley. During my first 10 years of teaching at the Johns Hopkins University School of Public Health, my publications from the Bangladesh experience provided a cross-cultural test of my assumptions about the generalizability of my beliefs about planning effective programs, and a test of their credibility among MPH students at Hopkins. The Hopkins teaching at that time of mostly physicians and nurses pursuing their MPH degrees also provided a rigorous test of the interdisciplinary relevance and credibility of the concepts. In the inner-city context of Johns Hopkins, the opportunity to conduct experimental studies with patient and family interventions on their control of hypertension and asthma with funding from NIH set in motion a decade of evidence-based validation of educational, behavioral and social-support interventions that produced cardiovascular improvements and mortality reductions in patients.
Some research I’ve reviewed in this editorial indicates that even modest planning is often lacking in health promotion. For example, pundits note that Biden’s pandemic plan was maddeningly obvious. Based on PRECEDE-PROCEED tenets, what elements of planning were most obviously missing in 2020 and what is your greatest wish for 2021 national planning?
The necessarily missing element is the engagement of local practitioners, citizens and/or patients in the planning process. Biden and his transition team could not have gone to each community to tailor a program of population surveillance, supply, vaccination, priority-setting among population groups, and other adjustments for geographic, demographic, epidemiological, language and educational specificity within what was necessarily a national plan. It is at these more local community levels that the PRECEDE-PROCEED model can be expected to disaggregate and fine-tune the more generic Biden plan for each community’s planning. The principle of participation is one of the “key concepts” presented in Chapter 1 of this new edition: “Overarching perspectives on population health promotion planning.”
As the pandemic has shown, many, if not most, organizations fall into reaction mode and do their best to adapt to a fast changing environment on the fly. Indeed, some innovators or disruptive industries pride themselves in building the plane as they are flying it. Is disciplined planning antithetical to winging it? Can planning and breakthrough innovation coincide?
No, not antithetical. Indeed, I love your term “breakthrough innovation,” and will substitute it for my weak terms of “tweaking the plan” or “adaptation of the plan.” If we view the breakdown of prior assumptions or of a presumed link in a program plan as a mistake rather than as a limitation of our understanding of that link and its own determinants, we will encounter more resistance to the proposed change in a plan. By calling the change a breakthrough innovation, we share the credit for its success with those who helped uncover the need for adaptation and who suggested remedies. That’s the essence of ongoing process evaluation and redesign. Our chapter on evaluation in the new edition gives prominence to the variety of breakdowns to be discovered so that remedies can be inserted in their place as innovations.
This is not the first or last time that ideological differences will foil planning. When you headed up a research section at CDC you were conflicted by differences over research and tobacco policies, among others. What did you learn about the need for many differing voices to be heard and the tension that lends to evidence-based decision making?
In my haste to implement a CDC grant program for community-based participatory research, I invited peer reviewers of proposals to come to CDC for a meeting with applicants. The exchange sensitized me and my staff to the reality that adherence to “evidence-based practices” from the scientific literature was not necessarily the only criterion that counted. What mattered most to the community project applicants was that they built on their experience and familiarity with community practice, composition, and histories. I concluded that CDC review process with the coining of a phrase that “If we want more evidence-based practices, we need more practice-based evidence.”
Another significant instance of clashing ideological/scientific differences came when, as Founding Co-Director, with Michael Eriksen (who had been one of my students at Johns Hopkins 30 years earlier), of the WHO-CDC Collaborating Center on Global Tobacco Control, I was appointed by the newly elected President Bush to the US delegation to the Global Tobacco Control Policy Convention in Geneva, representing CDC among other federal agencies, each with one representative. Our delegation of 6 professionals representing various agencies was shadowed by a federal political appointee of the Bush Administration who, during the proceedings, sat behind and above us in the WHO auditorium to monitor our verbal contributions, apparently to report to the White House on our fidelity to its new industry-friendly policies on tobacco. I resigned during the second meeting.
In hindsight, are there planning strategies you wish you had done differently when you returned to government as CDC’s Acting Director of the Office on Smoking and Health and then as founding Director of CDC’s Office of Science & Extramural Research?
What hadn’t been accomplished prior to my arrival in Washington in 1979, and still was inadequate when I left CDC in 2004, was a harmonization of the research-to-practice pipeline. My final departure from government culminated with my retirement from CDC in 2004. I tried for a while not to second-guess what I had and hadn’t done while in government. But I soon took up a Visiting Professorship to help with the creation of a Health & Society Program at the University of Maryland, as part of Bob Gold’s starting of a new School of Public Health, and then another at the UC Berkeley School of Public Health, both of which challenged some of my assumptions about the effective penetration of federal policy initiatives into the consciousness of public health students and faculty. The textbook on the PRECEDE-PROCEED model of program planning, which Marshall Kreuter and I revised in 2003 while translating our federal experience, and which was published as the 4th edition in 2005, seemed to have missed the mark in translating policies into public health practice that used evidence as strategically and effectively as we had imagined was possible.
Too soon after settling back in our San Francisco home to retire more honestly, I was persuaded to accept a part-time Adjunct Professorship at UCSF in the Department of Epidemiology and Biostatistics. Like the 2 Visiting Professorships, the academic settings raised questions about what had or hadn’t been accomplished in my stint in Washington as Director of the Office of Health Information and Health Promotion under Deputy Assistant Secretary of Health Michael McGinnis. Besides planning and supporting the implementation of policies and programs with states, my charge included the development of an adequate database for the evaluation of the policies that had been passed by Senator Edward Kennedy’s Senate Committee (where I had given testimony that was included in the Congressional Record). My position included planning, supporting implementation and evaluation of innovative programs in support of the 1990 Healthy People Objectives for the Nation. The closest I could come to that legislative mandate was the recruitment, development and deployment of a federal staff for the Office of Health Promotion, a federal information clearinghouse, a variety of guidelines and tools for state and local health departments’ health promotion, a national health information media campaign, and recommendations for adding items to the National Health Survey in line with evaluating progress on the Objectives for the Nation in Health Promotion.
In an interview we had in 2015 for this Journal, you described working with Senator Kennedy on what led to the creation of the Office of Health Information and Health Promotion. He guided you away from using the term “health education” for his legislative purposes because it would cause his Health Information and Health Promotion Bill to be referred to the Education Committee of the Senate which would have buried it in their more crucial school legislation and left it low among the high priority medical issues of NIH. What lessons from your collaboration with Kennedy stick with you now relative to piloting the planning process?
The pull from Kennedy’s Senate office came first while I was still at Hopkins, and the lesson I took from his notice of my research was that it had been referred to him by others among his constituency, as I had given lectures in Massachusetts and had been a consultant in the formation of the President’s Committee on Health Education, which he had championed. His interest and representation of my testimony before his Senate Committee led, in turn, to the interest of Michael McGinnis, then heading the new federal Office of Disease Prevention and Health Promotion, in recruiting me to head the still relatively new Office of Health Information and Health Promotion created under his Office by Kennedy’s legislation. These Offices were created under the Assistant Secretary of Health. Two years later, that Assistant Secretary, Julius Richmond, was leaving to return to his Professorship at Harvard, and he offered me a place in his new Harvard Center for Disease Prevention and Health Promotion. The unconscious lesson I must have learned in this sequence was the importance of leveraging your modest accomplishments, and the contacts that came with them, to take the next steps.
Political polarities were tame during Kennedy’s heyday compared to today’s ideological schisms. In what many worry is a ‘post-fact’ society, how should the diagnostic stage of planning factor in competing views or ‘alternative facts’?
The scientific communities, including NIH and CDC, had developed a convention of responding to “alternative facts” with a mantra that they and everyone else should base their planning on “evidence-based practice.” During my tenure at CDC, when leaving the Office on Smoking and Health, I became Director of the Office of Science & Extramural Research and Associate Director for Prevention Research and Academic Partnerships for the Public Health Practice Program Office. This put me at the interface of the scientific, practice and planning worlds, where practitioners and their state and local public health directors pushed back, recognizing that the science handed down from NIH and us at CDC was based on research that was rigorous in its clinical and other randomized trials, but that such rigor often failed, or made it impossible, to represent the diversity and uniqueness of many local circumstances. I, too, became skeptical of the one-way construction of scientific facts into universal guidelines. My best opportunity for addressing the need to better link science and practice came via an appointment to chair, and publish the findings of, an Institute of Medicine of the National Academies committee on “Linking Research and Public Health Practice.”10
I led the chapter on workplace-based health promotion planning for your new edition and I shared data about how having a plan with measurable accountabilities and monitoring of plans is associated with better organizational level outcomes. Yet, I also shared studies showing that a relatively small number of organizations adhere to planning best practices. What would you say to organizational leaders who look at PRECEDE-PROCEED and are daunted by the scope or who feel thorough planning is more trouble than it is worth?
One answer might be: if you’ve collected appropriate information in the justification of your organization’s purposes, in the hiring of staff, and in the matching of resources to those purposes and staff assignments, much of the data required by the PRECEDE planning process would be in hand, begging for re-purposing of organizational allocations and application to an emerging health need or problem. Another might be to gauge the data collection and organization proposed against the severity and reach of the problem and the duration of the commitment to maintain a program capable of controlling it.
After leaving CDC, my critiques of “evidence-based practice” and the need for more “practice-based evidence” attracted a cross-fire of science defenders vs practice- and specific population-based supporters who recognized the misfit of some “evidence-based practices” for their populations, begging for greater specificity or tailoring of evidence-based practices. The PRECEDE model had a sufficiently publication-based track record of over 1000 published applications that a sharing of that bibliography (now over 1200) on my website seemed to offer a quieting of the “where’s the evidence” complaints, as well as some of the “is it worth the time and effort” questions.
Some leaders delegate strategic planning, but when I held executive posts I considered leading an annual planning process a prime duty. I’ve always worked with skeptics of planning, including on my leadership teams, who argued that detailed plans end up on a shelf collecting dust. Why not hold people accountable for desired results without being so prescriptive about how they go about achieving them?
Experience would seem to favor early intervention rather than merely waiting for outcomes to justify OK results or to reveal disappointing or disastrous outcomes. The experience of some who cite plans that collect dust on shelves (i.e., were never implemented) has given rise to a new subspecialty, in the planning to evaluation spectrum, which NIH came to label “implementation science.” My wife, Judith Ottoson, did her doctoral dissertation at Harvard on that subject in the early 1980s, when it was still a fledgling subject, but it has gained respectability with the growing recognition that failures or disappointing results from presumably well-planned programs are implementation failures rather than planning failures, hence our added chapter on “Implementation” in this 5th edition and her emphasis in co-authoring the chapter on evaluation.
Planning is a team sport. You’ve had a marvelously productive collaboration with Marshall Kreuter over 5 decades and it no doubt tested your respective capacities to abide by some ambitious plans. What do you look for in planning teammates and what goes into robust and sustainable partnerships?
Yes, Marshall came to my attention when he was a professor of health education at the University of Utah, and he followed up our initial contacts with an expression of interest in my Postdoctoral Fellowship program at Hopkins. He was such a popular star among my colleagues, students and Fellows that the following year we invited him to join us as a co-author of the first edition, which was to reflect the NIH-funded studies we were conducting as early tests of the model in patient and community trials. What went into our robust and sustainable partnership was the good humor of all 4 of us: Marshall, me, and our wives, Judith and Martha Katz, who had worked with us at the Office of Health Information and Health Promotion and at CDC. That good humor carried into our continued friendship and collaborations on the second and subsequent editions as we came together at CDC and shared family vacations, and now into our respective semi- or pseudo-retirements.

Practice-Based Evidence That Informs Worksite Health Promotion Best Practices

What will it take to realize Green’s vision of more evidence-based practice that is derived from practice-based evidence? Just as Green has proposed that we need to change the way science is produced, it’s also likely that health promotion professionals will need to reimagine how we define and arrive at so-called “best practices.” If you review articles that espouse what “evidence-based” components need to be present in a “best practices program” in worksite health promotion, you’ll find that an eclectic blend of studies was used to support a list of components such as the need for leadership support, comprehensive communications, cultural supports, and program planning and evaluation.11-14 Some studies, especially relating to weight management and tobacco use, were conducted in experimental settings with all the advantages of well-funded, grant-supported interventions. Other studies, especially relating to motivation or health communications, emanate from lab-like settings with highly controlled variables and homogeneous subjects. Some research is based in “real world” settings but, most likely, was conducted in worksites with well-established programs long supported by their organizations.15-17
Researchers and practitioners alike would do well to embrace the saying: “Give a person a fish and you feed them for a day; teach a person to fish and you feed them for a lifetime.” That is, to use a PRECEDE model element, what are the “enabling factors” that will spawn more practice-based evidence? As the new edition of Health Program Planning emphasizes in nearly every chapter, planning, development and implementation of health promotion initiatives should be a partnership process in which researchers and practitioners learn from each other. Enabling factors that support a partnership approach will cast researchers as listeners and facilitators of planning. Enabling factors for practice-based research will set up practitioners as advisors and co-developers of wellness initiatives. Most importantly, those we are seeking to serve must be enabled to guide program plans that they find meaningful, relevant and responsive to their lived experiences. To enable practice-based research will mean that end users of a service or intervention must be taught to be leaders and advocates for a plan that is responsive to their preferences, needs and values.
Paul E. Terry, Ph.D.
Editor in Chief, American Journal of Health Promotion, Senior Fellow, HERO (The Health Enhancement Research Organization).

References

1. Green LW, Ottoson J, Garcia C, Hiatt R. Diffusion theory and knowledge dissemination, utilization, and integration in public health. Annu Rev Public Health. 2009;30:151–174. doi:10.1146/annurev.publhealth.031308.100049
2. Green LW. The Prevention Research Centers as models of practice-based evidence, two decades on. Am J Prev Med. 2007;33(1S):S6–S8. doi: 10.1016/j.amepre.2007.03.012
3. Green LW. Making research relevant: if it is an evidence-based practice, where’s the practice-based evidence? Fam Pract. 2008;25(suppl 1):i20–i24.
4. Green LW, Ottoson JM. From efficacy to effectiveness to community and back: evidence-based practice vs practice-based evidence. In: Green L, Hiss R, Glasgow R, et al, eds. From Clinical Trials to Community: The Science of Translating Diabetes and Obesity Research. National Institutes of Health; 2004:15–18.
5. Green LW. Website. A Resource for instructors, students, health practitioners and researchers using the PRECEDE-PROCEED Model. Accessed May 14, 2021. www.lgreen.net
7. HERO Case Studies: Resources and research studies, Health Enhancement Research Organization (HERO). Hunt, 2017. Accessed May 14, 2021. https://hero-health.org/resources/all-resources/
8. Grossmeier J, Terry PE, Anderson D. Broadening the metrics used to evaluate corporate wellness programs –the case for understanding the total value of the investment. In: Burke RJ, Richardsen AM, eds. Corporate Wellness Programs: Linking Employee and Organizational Health. Edward Elgar Publishing; 2015.
9. Green L, Terry PE. The past is prologue: views from Larry Green. Am J Health Promot. 2015;29(3):2–8. doi:10.4278/ajhp.29.3.tahp
10. Stoto MA, Green LW, Bailey LA, eds. Linking Research and Public Health Practice: A Review of CDC’s Program of Centers for Research and Demonstration of Health Promotion and Disease Prevention. National Academies Press; 1997. Accessed March 3, 2021. https://www.nap.edu/read/5564/chapter/1
11. Terry PE. Best practices in health promotion: the joy of chasing the uncatchable. Am J Health Promot. 2017;31(5):375–377.
12. Goetzel R, Terry PE. Current state and future directions for organizational health scorecards. Am J Health Promot. 2013;27(5):11–12. doi:10.4278/ajhp.27.5.tahp
13. Fonarow GC, Calitz C, Arena R, et al.; on behalf of the American Heart Association. Workplace wellness recognition for optimizing workplace health: a presidential advisory from the American Heart Association. Circulation. 2015;131(20):1–18. doi:10.1161/CIR.0000000000000206
14. Pronk NP. Best practice design principles of worksite health and wellness programs. ACSMs Health Fit J. 2014;18(1):42–46.
15. Terry P, Grossmeier J, Mangen D, Gingerich S. Analyzing best practices in employee health management: how age, sex, and program components relate to employee engagement and health outcomes. J Occup Environ Med. 2013;55(4):378–392.
16. Terry PE, Seaverson EL, Grossmeier J, Anderson DR. Association between nine quality components and superior worksite health management program results. J Occup Environ Med. 2008;50(6):633–641. doi:10.1097/JOM.0b013e31817e7c1c
17. Green LW, Cargo M. The changing context of health promotion in the workplace. In: O’Donnell M, ed. Health Promotion in the Workplace, 2nd ed. Delmar Publishers, Inc.; 1994:497–524. ISBN 0-8273-4940-8.
Published In

Article first published online: June 14, 2021
Issue published: July 2021

Keywords

  1. best practices
  2. planning
  3. health promotion
  4. social ecological framework

Rights and permissions

© The Author(s) 2021.
PubMed: 34120469

Authors

Affiliations

Paul E. Terry, Ph.D.
Editor in Chief
American Journal of Health Promotion, Senior Fellow, HERO (The Health Enhancement Research Organization)

This article was published in American Journal of Health Promotion.

