Description
The Getting To Outcomes 2004: Promoting Accountability through Methods and Tools for Planning, Implementation, and Evaluation technical report was designed to help practitioners improve the accountability and quality of their programs. The Getting to Outcomes (GTO) approach was developed to address the gap between research and practice by building capacity for effective practices at the individual practitioner and program levels (e.g., choosing evidence-based practices and planning, implementing, evaluating and sustaining effective practices). This summary statement presents the methods from the report (refer to the Tools for program planning, implementation and evaluation summary statement for the tools).
The Getting to Outcomes (GTO) approach is based on traditional evaluation methods, empowerment evaluation, results-based accountability and continuous quality improvement. While traditional evaluation typically relies on neutral, external evaluators who work at arm's length from practitioners, empowerment evaluation encourages collaborative relationships between evaluators and practitioners. Empowerment evaluation is based on the notion that program success is more likely when evaluators collaborate with practitioners and provide them with the tools and opportunities to plan, implement, evaluate and develop a continuous quality improvement system themselves.
Results-Based Accountability (RBA) focuses on the results of programs and what can be learned from program impacts and program effectiveness, rather than on process or output information. Continuous Quality Improvement (CQI) is a technique within Total Quality Management (TQM) and is based on principles of quality improvement, error and cost reduction, and increased client satisfaction.
The GTO approach consists of 10 accountability questions, each discussed in a chapter of this resource. There are six planning questions (steps 1-6), two evaluation questions (steps 7-8) and two questions that address using data to improve and sustain programs (steps 9-10). The accountability questions are as follows:
- Needs and Resources: what are the community's underlying needs and conditions?
- Goals: what are the goals, target populations and objectives (i.e., desired outcomes) that will address the needs and change the underlying conditions?
- Best Practice: which evidence-based models and best practice programs will you use to reach your goals?
- Fit: what actions do you need to take so that the selected program "fits" the community context?
- Capacities: what organizational capacities do you need to implement the program?
- Plan: what is the plan for this program?
- Process: how will you assess the quality of program implementation?
- Outcomes: how well did the program work?
- Continuous Quality Improvement: how will you incorporate continuous quality improvement strategies?
- Sustainability: how will you sustain effective programs?
There are three key features of the GTO system:
- The GTO approach can be used at any stage of program planning, implementation and evaluation. Practitioners can select which accountability questions are helpful at a particular point in time, depending on the development stage of their programs.
- The GTO approach is not linear. Although presented in a linear fashion, the GTO system can be used in an iterative way, where learning may feed back into earlier steps.
- The GTO approach promotes cultural competence in programming. Practitioners can incorporate the ethnic/cultural characteristics, experiences, norms and values of intended populations at each program development stage.
Steps for Using Method/Tool
The 10 sections of the resource provide useful methods to help in program planning, implementation and evaluation.
1. Needs and Resources: what are the community's underlying needs and conditions?
Conducting a needs and resources assessment (p.17-27); a brief sketch illustrating steps 6 and 7 follows this list
- Step 1: Set up an assessment committee or work group of members to collect data (include key stakeholders).
- Step 2: Examine what data are currently available to assess the risk and protective factors.
- Step 3: Determine what data still need to be collected by the coalition.
- Step 4: Determine the best methods to gather the data and develop a data collection plan.
- Step 5: Implement the data collection plan.
- Step 6: Analyze and interpret the data.
- Step 7: Select the priority risk and protective factors to be addressed.
- Step 8: Use those priority factors to develop goals and objectives and to select programs/strategies to implement.
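The GTO resource does not prescribe any particular software. As an illustration only of steps 6 and 7 above, the sketch below tabulates hypothetical indicator data and ranks candidate priority risk factors using an assumed weighting; the actual selection criteria would be set by the assessment committee or work group.

```python
# Illustrative sketch only (not part of the GTO resource): hypothetical
# indicator data used to support steps 6 and 7, i.e., analyzing the data
# and selecting priority risk factors. All factor names, values and the
# weighting are assumptions.

indicators = {
    "early initiation of substance use": {"prevalence": 18.0, "trend": +2.5},
    "low school attachment": {"prevalence": 26.0, "trend": +0.5},
    "availability of substances": {"prevalence": 34.0, "trend": -1.0},
}

def priority_score(data):
    # Assumed weighting: higher local prevalence and a worsening trend
    # both raise a factor's priority.
    return data["prevalence"] * 0.5 + data["trend"] * 10

ranked = sorted(indicators.items(), key=lambda kv: priority_score(kv[1]), reverse=True)

print("Candidate priority risk factors (highest score first):")
for name, data in ranked:
    print(f"  {name:<38} score = {priority_score(data):5.1f}")
```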
2. Goals: what are the goals, target populations, and objectives (i.e., desired outcomes) that will address the needs and change the underlying conditions?
Characteristics of useful objectives (p.35-36); a sketch of a structured objective record follows this list:
- They are specific and measurable.
- They specify what will change, for whom, by how much and by when.
- There can be more than one objective for each goal.
- Objective statements are logically linked to support the attainment of goals.
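As an illustration only (not drawn from the GTO resource), the characteristics above can be captured in a structured record so that each objective names what will change, for whom, by how much and by when, and stays linked to its goal. The field names and example values in the sketch below are assumptions.

```python
# Illustrative sketch only: a structured record for an objective, so that
# each objective states what will change, for whom, by how much and by
# when, and remains linked to its goal. Field names and example values
# are assumptions, not part of the GTO resource.
from dataclasses import dataclass
from datetime import date

@dataclass
class Objective:
    goal: str          # the goal this objective supports
    what_changes: str  # the outcome expected to change
    for_whom: str      # the intended population
    by_how_much: str   # the measurable amount of change
    by_when: date      # the target date

objectives = [
    Objective(
        goal="Reduce youth substance use in the community",
        what_changes="past-30-day alcohol use",
        for_whom="students in grades 7-9 at participating schools",
        by_how_much="a 10% relative reduction from baseline survey levels",
        by_when=date(2026, 6, 30),
    ),
    # More than one objective can support the same goal.
]

for obj in objectives:
    print(f"Goal: {obj.goal}")
    print(f"  Objective: change {obj.what_changes} for {obj.for_whom} "
          f"by {obj.by_how_much}, by {obj.by_when:%B %Y}")
```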
3. Best Practice: which evidence-based models and best practice programs can you use to reach your goals?
Criteria to assess evidence-based programs (p.44):
- the degree to which the program is based on a well-defined theory or model
- the degree to which the intended population received enough of the intervention (i.e., dosage)
- the quality and appropriateness of data collection and data analyses procedures
- the degree to which there is strong evidence of a cause and effect relationship
There are three levels of evidence for programs (p.45):
- Promising programs have been reasonably well evaluated and shown to have some positive outcomes. These programs cannot be assessed as effective programs since findings are not consistent enough or the evaluation design lacks sufficient rigour.
- Effective programs have rigorous evaluations that have consistently demonstrated positive outcomes.
- Model programs qualify as effective programs AND are available for dissemination from the program developers and other consultants.
Steps for balancing fidelity and adaptation when implementing an innovation (p. 47)
- Determine what aspects of the program can be adapted and the rationale for adapting those specific program elements.
- Understand the core components of the program and its theory. Removing core elements can seriously affect outcomes.
- Estimate the human, financial, technical and structural/linkage resources that will be needed for tracking and managing fidelity and potential changes needed for implementation.
Steps for choosing an evidence-based or best practice program (p.52-53):
- Examine what evidence-based and best practice resources are available in your content area.
- Determine how the results of the evidence-based/best practice program fit with program goals and objectives.
- Select the program.
4. Fit: what actions do you need to take so that the selected program "fits" the community context?
Program fit is the degree to which a selected best practice program fits within the existing program and community context.
Steps for determining program fit (p.59-61):
Consider the following:
- the cultural context and 'readiness' of the intended population for the program through a cultural and community readiness analysis
- how the selected program 'fits' with the intended population
- how the selected program 'fits' with other local programs offered to your target population
- the philosophy and values of your organization, and whether the program is compatible with them
5. Capacities: what organizational capacities do you need to implement the program?
Delineating key capacities for program planning, implementation and evaluation (p.64-70)
- Staff capacities: qualifications, training, staffing level
- Technical (expertise) capacities: access to program materials, access to personnel with appropriate evaluation skills
- Fiscal capacities: adequate funding
- Structural/formal linkage capacities: sharing resources and strengthening links among organizations through collaboration
Community readiness (p.71-72)
The PREVENT model to assess and increase community readiness (a simple scoring sketch follows this list):
- Problem defined by needs assessment
- Recognition of problem by community
- Existence of funding
- Vision
- Energy to mobilize and sustain
- Networking with stakeholders
- Talent/leadership
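As a minimal sketch (not part of the GTO resource), the PREVENT dimensions can be rated and reviewed to flag where readiness-building is needed before implementation. The 0-3 scale, the threshold and the example ratings below are assumptions.

```python
# Illustrative sketch only: recording a rating for each PREVENT dimension
# and flagging where readiness-building is needed. The 0-3 scale, the
# threshold and the example ratings are assumptions, not part of GTO.
prevent_ratings = {
    "Problem defined by needs assessment": 3,
    "Recognition of problem by community": 1,
    "Existence of funding": 2,
    "Vision": 2,
    "Energy to mobilize and sustain": 1,
    "Networking with stakeholders": 3,
    "Talent/leadership": 2,
}

THRESHOLD = 2  # assumed cut-off below which a dimension needs attention

for dimension, rating in prevent_ratings.items():
    flag = "  <- build readiness here" if rating < THRESHOLD else ""
    print(f"{dimension:<38} {rating}/3{flag}")
```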
6. Plan: what is the plan for this program?
Planning culturally appropriate programs (p.85-86)
- Are staff representative of the intended population?
- Are the program materials relevant to the intended population?
- Have experts and/or members of the intended population examined the program plan and materials?
- Does the program consider the language, socioeconomic, cultural and historical contexts of the intended population?
- Are staff culturally competent to work with the intended population?
7. Process: how will you assess the quality of program implementation?
Process Evaluation Matrix (p. 95)
The Process Evaluation Planning Tool pairs several process evaluation questions with their corresponding data collection activities. The questions are as follows (a sketch of how such a matrix might be recorded follows the list):
- Did the program follow the basic plan for service delivery?
- What are the program's characteristics?
- What are participants' characteristics?
- What is the participants' satisfaction with the program?
- What is the staff's perception of the program?
- What were the individual participants' dosages of the program?
- What were the program components' levels of quality (fidelity monitoring)?
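As an illustration, a process evaluation matrix of this kind can be kept as a simple mapping from each question to its planned data source, with routine tallies such as participant dosage computed from the underlying records. The data collection activities, participant IDs and attendance figures in the sketch below are hypothetical.

```python
# Illustrative sketch only: the process evaluation questions above paired
# with hypothetical data collection activities, plus a small dosage tally
# for one of the questions. Activities, participant IDs and attendance
# figures are invented examples.
process_matrix = {
    "Did the program follow the basic plan for service delivery?":
        "implementation logs reviewed monthly",
    "What are the program's characteristics?":
        "program documentation review",
    "What are participants' characteristics?":
        "enrolment forms",
    "What is the participants' satisfaction with the program?":
        "end-of-session satisfaction surveys",
    "What is the staff's perception of the program?":
        "staff debrief interviews",
    "What were the individual participants' dosages of the program?":
        "session attendance records",
    "What were the program components' levels of quality (fidelity monitoring)?":
        "observer fidelity checklists",
}

for question, activity in process_matrix.items():
    print(f"- {question}\n    planned data source: {activity}")

# Dosage example: sessions attended out of sessions offered, per participant.
sessions_offered = 10
attendance = {"P01": 9, "P02": 4, "P03": 10}
for participant, attended in attendance.items():
    print(f"{participant}: dosage = {attended}/{sessions_offered} "
          f"({attended / sessions_offered:.0%})")
```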
8. Outcomes: how well did the program work?
Conducting an outcome evaluation (p.115-133); a minimal analysis sketch follows these steps
Determine or complete the following:
- What will be measured?
- What are the best types of outcomes to measure?
- What are some resources to find outcome measures?
- What criteria will you use to choose outcomes?
- Select an appropriate evaluation design to fit your program and evaluation questions.
- Choose methods for measurement.
- Determine whom you will assess with your measurement methods.
- Determine when you will conduct the assessment.
- Gather the data.
- Seek informed consent and assent.
- Ensure confidentiality and anonymity of the data.
- Analyze and interpret the data.
- Assess program outcomes against benchmarks.
- Weigh results against the cost of the program.
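As a minimal sketch of the analysis and benchmarking steps above, the example below computes the average pre/post change on a single outcome measure and checks it against a benchmark. The scores and benchmark value are invented; a real outcome evaluation would follow the chosen evaluation design and analysis methods.

```python
# Illustrative sketch only: average pre/post change on one outcome measure,
# checked against a benchmark, as one small piece of the analysis and
# benchmarking steps above. Scores and the benchmark value are invented;
# a real evaluation would follow the chosen design and appropriate methods.
from statistics import mean

pre_scores = [12, 15, 9, 14, 11, 13]    # hypothetical baseline scores
post_scores = [15, 18, 11, 17, 12, 16]  # same participants after the program

changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
avg_change = mean(changes)

BENCHMARK = 2.0  # assumed target: an average improvement of 2 points

print(f"Average change: {avg_change:.1f} points (benchmark: {BENCHMARK:.1f})")
print("Benchmark met" if avg_change >= BENCHMARK else "Benchmark not met")
```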
9. Continuous Quality Improvement: how will you incorporate continuous quality improvement strategies?
Steps for implementing a CQI strategy (p.138-140)
- Examine changes in the program context.
- Use information from the planning, implementation and evaluation processes.
10. Sustainability: how will you sustain effective programs?
Key elements for sustaining effective programs (p.144-146)
- Program effectiveness
- Program sustainability plan
- Program champions
- Program negotiation process
- Program financing
- Training
- Institutional strength
- Integration with existing programs/services
- Fit with host organization or community
Evaluation
Dr. Chinman and colleagues have been evaluating and refining the Getting to Outcomes (GTO) approach for over 10 years. Chinman, Hunter, Ebener, Paddock, et al. (2008) conducted a quasi-experimental study of 10 GTO and non-GTO drug prevention programs that compared practitioners' capacity (i.e., knowledge, attitudes, skills) to select effective practices and to plan, implement, evaluate and sustain those practices. The GTO intervention involved distributing GTO manuals, delivering training and providing on-site technical assistance to program staff. Staff in GTO programs improved their capacity more than staff in the comparison programs, and the change in practitioner capacity from baseline to two years was related to the number of technical assistance hours delivered. Using GTO, several programs were able to document improvements in outcomes.
In Chinman et al. (2009), a larger-scale formative evaluation of the GTO process was conducted in the state drug prevention systems of Tennessee and Missouri. In Tennessee, 54 drug prevention programs were randomly assigned to receive either the GTO intervention or standard practice (an RCT design). In Missouri, 36 programs that received GTO were compared with 9 similar programs that did not (a quasi-experimental design). At one year, analyses found that GTO programs improved their capacity to implement effective practices more than non-GTO programs, despite challenges with implementation.
These summaries are written by the NCCMT to condense and to provide an overview of the resources listed in the Registry of Methods and Tools and to give suggestions for their use in a public health context. For more information on individual methods and tools included in the review, please consult the authors/developers of the original resources.
We have provided the resources and links as a convenience and for informational purposes only; they do not constitute an endorsement or an approval by McMaster University of any of the products, services or opinions of the external organizations, nor have the external organizations endorsed their resources and links as provided by McMaster University. McMaster University bears no responsibility for the accuracy, legality or content of the external sites.