Evaluation is critical to the ongoing success and utility of the New Jersey Comprehensive Cancer Control Plan (the Plan). Charged by Executive Order 114 and mandated by P.L. 2005, c.280, the Task Force on Cancer Prevention, Early Detection and Treatment in New Jersey is responsible for reporting its progress to the Governor via biennial reports.1,2 Recognizing the importance of obtaining data on implementation progress over time for these biennial reports, the Task Force charged its Evaluation Committee, one of its standing committees, with development of this Evaluation Chapter for the 2008–2012 Plan.
An ad hoc committee had reviewed best practices in developing the 2003–2007 Plan. In particular, that committee was guided by the comprehensive cancer control model of the Centers for Disease Control and Prevention (CDC) and the successful state-level models developed by Kentucky, Maine, Michigan, and North Carolina. The Committee also considered recommendations from the Battelle Centers for Public Health Research and Evaluation, a consultant to the Task Force throughout the planning process.
A conceptual model developed by Battelle for CDC’s Division of Cancer Prevention and Control involves an outcomes-based planning and implementation process.3 The long-range goal of that process is to achieve significant reductions in the incidence, morbidity, and mortality of cancer. In this model, evaluation is one of the six key “building blocks” in comprehensive cancer control. Evaluation is needed not only to monitor progress and record results for accountability purposes, but also to identify problems and facilitate ongoing program improvement. Following this model, New Jersey built evaluation into its first comprehensive cancer control plan to assist Task Force members in assessing and documenting success over time. Thus, evaluation has been part of New Jersey’s planning process from the outset. An example of this ongoing commitment to evaluation is the fact that evaluation activities are regularly conducted after each Task Force, workgroup/committee, and county cancer Coalition meeting to benchmark participant satisfaction and to guide “continuous quality improvement” of processes.
Comprehensive cancer control is a highly complex and dynamic program, and many of its outcomes, such as improved working relationships among partners, are relatively intangible and difficult to measure.3 Assessing some health outcomes prematurely (such as changes in morbidity and mortality or in disparities) can be misleading; a long-term perspective must be maintained, as the anticipated health improvements are expected to take time to become evident and measurable. While improved health outcomes always remain in view as the ultimate goals, they will not be achieved for some years. Task Force efforts during the first Plan therefore concentrated on building an infrastructure able to implement the statewide cancer Plan that New Jersey cancer experts believe will lead to the desired health outcomes.
The Task Force delegated to the Evaluation Committee the preparation of an evaluation plan to be utilized during the 2003–2007 period of the Plan. This evaluation plan4 included an overview of the timeframe and reporting for activities related to evaluation of the Plan, a description of the evaluation components, and delineation of the primary data to be utilized in evaluation.
Evaluation materials were developed by the New Jersey Department of Health (NJDOH) Office of Cancer Control and Prevention (OCCP) and the Evaluation Team (based in the Department of Preventive Medicine and Community Health at the University of Medicine and Dentistry of New Jersey [UMDNJ] New Jersey Medical School), and these were reviewed and endorsed by the Evaluation Committee. Outside consultants were utilized at the discretion of the OCCP, the Evaluation Team, and/or the Evaluation Committee to enhance breadth of experience.
The 2004 biennial New Jersey Comprehensive Cancer Control: Status Report to the Governor (the 2004 Status Report) identified issues that were not addressed in the Plan. These included assessing cancers that were not among the original seven priority cancers (but may be on the rise or were identified as emerging trends) and addressing evolving matters concerning the priority cancers.5 The OCCP accepted responsibility for addressing these issues and distributed the recommendations to the appropriate workgroups for further consideration. The 2004 Status Report further documented many successes and described systems that were being established to measure long-term health outcomes.
As part of the initial implementation of the Plan, a thorough and structured comprehensive capacity and needs assessment (C/NA) process was conducted.5 This county-based effort established a well-documented baseline status that will eventually assist in measuring long-term health outcomes. A systematized compendium of recommendations from these C/NA reports was developed by the Evaluation Team.6
The cancer control community recognizes that availability of adequate evaluation data is critical for effective implementation of comprehensive cancer control plans, as well as for development of future plans.7 While the Evaluation Committee realizes that improvements in incidence and mortality are the critical long-term goals, measurement of the ongoing process to achieve those changes is also essential. Thus, one component of evaluation is assessment of the entire comprehensive cancer control program, as an ongoing process. Such program evaluation has the dual goals of showing that a program works and of further improving the program.8 Figure 1 below depicts CDC’s recommended framework for program evaluation in public health.9
Figure 1. Steps in CDC's Framework for Program Evaluation in Public Health
Understanding how the program functions, examining the internal and external factors that influence the program, and assessing the impact of the program on participants, organizations, and the community provide stakeholders with the information necessary to improve the program. The evaluation plan,4 effective through December 2007 for the first five-year implementation period, was developed using the framework recommended in the W.K. Kellogg Foundation Logic Model Development Guide.10 It uses the three-tiered evaluation design outlined below. Integration of these three different aspects of evaluation is critical to understanding how and why a program is working, monitoring the program, and developing recommendations for improvement as needed.4,11
Context evaluation describes how the program functions within its environment and can help identify strengths and weaknesses of the program and the effect of unanticipated and/or external influences on the program.
Implementation evaluation seeks to assess how well the program tasks are being performed relative to their specifications in the Plan.
Outcome evaluation addresses progress toward the desired change in individuals, organizations, communities, and/or systems as a result of the program. The effectiveness of the program’s activities is assessed.
Reducing the cancer burden is a long-term process. Changes in many types of outcomes generally take years or decades to observe. The tiered evaluation structure described above is well suited to early assessments that focus more heavily on process than on outcomes. Thus, this approach has been particularly appropriate for New Jersey, which is still in the first years of comprehensive cancer control planning. This type of logic model was utilized in developing the 2006 biennial New Jersey Comprehensive Cancer Control: Status Report to the Governor and Legislature (the 2006 Status Report).11 Figure 2, below, depicts the evaluation logic model used.12
Figure 2. Comprehensive Cancer Control Evaluation Logic Models
During the 2008–2012 period of this second Plan, continuing emphasis on process evaluation will be appropriate. In accordance with CDC recommendations, the Plan should continue to make explicit how its goals and objectives link to strategies and to outcomes. Evaluation of the Plan will also need to begin shifting toward greater inclusion of outcome assessments. Thus, as future evaluation plans are developed, data collection efforts and analysis plans, including specific measures and time frames for their assessment, will need to be continually delineated and refined.
CDC requires annual reports to include an evaluation component. For these reports, CDC anticipates that assessment of performance will include measurement of state- and local-level policy changes regarding important cancer control outcomes, including physical activity, nutrition, tobacco, screening, tanning, insurance coverage, and professional education. CDC recommends that data be gathered from such sources as state population-based central cancer registries, the Behavioral Risk Factor Surveillance System (BRFSS), the Youth Risk Behavior Surveillance System (YRBSS), and vital statistics. For efficiency and continuity, the reporting efforts for CDC should be coordinated with the state-mandated biennial status reports.
The Task Force and the OCCP recognize the continuing importance of utilizing an outside agency to develop and implement an Evaluation Plan, based on the success of this approach in evaluating New Jersey’s first Plan as well as the earlier experiences of the New Jersey Comprehensive Tobacco Control Program. CDC concurs that monitoring progress and measuring outcomes against plan goals, objectives, and strategies may require the services of an outside evaluator.3 CDC has enthusiastically commended New Jersey for its approach and its thorough evaluation efforts. Two comprehensive reports have been issued thus far,5,11 and these are available from the OCCP.
Goals, Objectives and Strategies
The goal, objective, and strategies developed by the Task Force’s Evaluation Committee to implement ongoing evaluation for New Jersey’s comprehensive cancer control process are presented below.
To evaluate the New Jersey Comprehensive Cancer Control Plan by:
- Assessing the implementation and effectiveness of its strategies
- Determining its impact on the knowledge and behavior of the citizens of New Jersey
- Measuring resultant changes in health outcomes
To develop and implement annual Evaluation Plans for the New Jersey Comprehensive Cancer Control Plan.
- Continue to recruit and retain members for the Evaluation Committee.
- Continue to identify and secure funding for evaluation of the Plan.
- Continue to contract with a New Jersey institution to develop and implement annual Evaluation Plans in partnership with the Task Force on Cancer Prevention, Early Detection and Treatment in New Jersey.
1. State of New Jersey, Executive Department. Executive Order 114. Governor Christine Todd Whitman. May 9, 2000.
2. State of New Jersey, Office of Legislative Services. Annual Summary of Enactments: 2005 Session.
3. Centers for Disease Control and Prevention, Battelle Centers for Public Health Research and Evaluation. Guidance for Comprehensive Cancer Control Planning. Atlanta, GA: Centers for Disease Control and Prevention, 2002.
4. Evaluation Committee of the Task Force on Cancer Prevention, Early Detection and Treatment in New Jersey. Evaluation Plan for the 2003–2007 New Jersey Comprehensive Cancer Control Plan (NJ‑CCCP). Trenton, NJ: Evaluation Committee of the Task Force on Cancer Prevention, Early Detection and Treatment in New Jersey, June 2006. (Copy on file with the OCCP.)
5. Weiss SH, Rosenblum DM, Kim JY. New Jersey Comprehensive Cancer Control: Status Report to the Governor from the Task Force on Cancer Prevention, Early Detection and Treatment in New Jersey. Trenton, NJ: New Jersey Department of Health, Office of Cancer Control and Prevention, December 2004.
6. Kim JY, Rosenblum DM, Weiss SH. A Summary of Recommendations from the Report Summaries of the County Cancer Capacity and Needs Assessment. Prepared on behalf of the Evaluation Team for the Evaluation Committee of The Governor’s Task Force on Cancer Prevention, Early Detection and Treatment in New Jersey, June 2006. (Unpublished; copy on file with the Office of Cancer Control and Prevention, New Jersey Department of Health. The underlying report summaries from each county are available from the OCCP website at http://www.state.nj.us/health/ccp.)
7. Advisory Committee on Cancer Coordination and Control. North Carolina Cancer Control Plan 1996–2001. Raleigh, NC: North Carolina Department of Environment, Health, and Natural Resources, 1996.
8. W.K. Kellogg Foundation. Evaluation Handbook. Battle Creek, MI: W.K. Kellogg Foundation, 1998.
9. Centers for Disease Control and Prevention. Framework for program evaluation in public health. MMWR 1999;48(No. RR–11).
10. W.K. Kellogg Foundation. Logic Model Development Guide. Battle Creek, MI: W.K. Kellogg Foundation, 2004.
11. Weiss SH, Kim JY, Rosenblum DM, Parikh P, Tasslimi A. New Jersey Comprehensive Cancer Control: 2006 Status Report to the Governor and Legislature from the Task Force on Cancer Prevention, Early Detection and Treatment in New Jersey. Edited by the New Jersey Department of Health. Trenton, NJ: New Jersey Department of Health, Office of Cancer Control and Prevention, December 2006.
12. Maine Center for Public Health. Maine Comprehensive Cancer Control Program Evaluation Report. Topsham, ME: Maine Center for Public Health, September 2005.