Description
The Stetler model of research utilization helps practitioners assess how research findings and other relevant evidence can be applied in practice. The model examines how to use evidence to create formal change within organizations, as well as how individual practitioners can use research informally as part of critical thinking and reflective practice. It links research use, as a first step, with evidence-informed practice, providing a way to think about the relationship between these two distinct concepts; integrating both enhances the overall application of research.
Research use, in light of available supplemental evidence to aid decision making, provides the first step in research-related actions that result in evidence-informed practice. Research use occurs in three forms (Stetler, 1994):
- Instrumental use refers to the concrete, direct application of knowledge.
- Conceptual use occurs when research changes one's understanding of, or way of thinking about, an issue.
- Symbolic (political/strategic) use happens when information is used to justify or legitimate a policy or decision, or otherwise influence the thinking and behaviour of others.
These different kinds of research use can occur together and can be influenced by multiple factors at the individual level. The Stetler model of research use is based on the notion that the user's internal characteristics, as well as external environmental factors, influence the use of knowledge.
The Stetler model of evidence-based practice is based on the following assumptions (Stetler, 2001, p. 274; Stetler, 2010, pp. 59-60):
- The formal organization may or may not be involved in an individual's use of research or other evidence.
- Use may be instrumental, conceptual and/or symbolic/strategic.
- Other types of evidence and/or non-research-related information are likely to be combined with research findings to facilitate decision making or problem solving.
- Internal or external factors can influence an individual's or group's review and use of evidence.
- Research and evaluation provide probabilistic information, not absolutes.
Lack of knowledge and skills pertaining to research use and evidence-informed practice can inhibit appropriate and effective use. Key elements needed to support evidence-informed practice at the organizational level include (Stetler, 2003):
- leadership's support for an evidence-informed practice culture;
- capacity to engage in evidence-informed practice, including an effective implementation framework;
- infrastructure to support and maintain a culture of evidence-informed practice and related activities.
The Stetler model of evidence-based practice outlines criteria to determine the desirability and feasibility of applying a study or studies to address an issue. These criteria are:
- substantiating evidence;
- current practice (relates to the extent of need for change);
- fit of the substantiated evidence for the user group and settings; and
- feasibility of implementing the research findings (risk/benefit assessment, availability of resources, stakeholder readiness).
This model consists of five phases (Stetler, 2001, p. 276):
- Phase I: Preparation
- Phase II: Validation
- Phase III: Comparative Evaluation/Decision Making
- Phase IV: Translation/Application
- Phase V: Evaluation
Steps for Using Method/Tool
The Stetler model of evidence-based practice consists of five phases (Stetler, 1994; Stetler, 2001; Stetler, 2010). Each phase is designed to:
- facilitate critical thinking about the practical application of research findings;
- result in the use of evidence in the context of daily practice; and
- mitigate some of the human errors made in decision making.
Phase I: Preparation—Purpose, Context and Sources of Research Evidence
- Identify the purpose of consulting evidence (such as the need to solve a problem or to revise an existing policy) and relevant related sources.
- Recognize the need to consider important contextual factors that could influence implementation.
- Note that the reasons for using evidence also define the measurable outcomes for Phase V (Evaluation).
Phase II: Validation—Credibility of Findings and Potential for/Detailed Qualifiers of Application
- Assess each source of evidence for its overall level of credibility, applicability and operational details, with the assumption that a methodologically weak study may still provide useful information in light of additional evidence.
- Determine whether a given source lacks credibility or fit, and thus whether to accept or reject it for synthesis with other evidence (rather than simply rating the evidence as weak or strong).
- Summarize relevant details regarding each source in an 'applicable statement of findings' to examine the implications for practice in Phase III. A summary of findings should:
  - reflect the meaning of the study findings for the issue at hand; and
  - reflect studied variables or relationships in ways that could be practically used (e.g. the actual operational nature of interventions and the potential qualifiers or conditions of application that may be key to future use).
Phase III: Comparative Evaluation/Decision Making—Synthesis and Decisions/Recommendations per Criteria of Applicability
- Logically organize and display the summarized findings from across all validated sources in terms of their similarities and differences.
- Determine whether it is desirable and feasible to apply the summarized findings in practice, based on the applicability criteria: substantiating evidence (the overall strength of the accumulated findings); current practice; fit to the targeted setting; and feasibility ("r, r, r" = evaluation of risk, need for resources, and readiness of others involved).
- Based on the comparative evaluation, the user makes one of four choices:
  - Use the research findings, putting knowledge into effect and moving forward with the appropriate types of use (instrumental, conceptual, symbolic).
  - Consider use, gathering additional internal information before acting broadly on the evidence.
  - Delay use, as more research is required, which you may decide to conduct based on local need (no further action is taken with the information available at this point).
  - Reject or do not use (no further consideration).
Phase IV: Translation/Application—Operational Definition of Use/Actions for Change
- Write generalizations that logically recast the research findings in action terms (using the summary statements from Phases II and III). Specifically, articulate the how-tos of implementing the synthesized findings, identifying the practice implications that answer the overall question, "So what?".
- Identify the type of research use (instrumental, conceptual, symbolic) and the method of use (informal/formal, direct/indirect).
- Identify the level of use (individual, group, organization).
- Assess whether translation or use goes beyond actual findings/evidence.
- Consider the need for appropriate, reasoned variation in certain cases.
- Plan formal dissemination and change strategies.
Phase V: Evaluation
- Clarify expected outcomes relative to the purpose of seeking evidence, and whether the evaluation relates to a "use" or "consider use" decision.
- Differentiate formal and informal evaluation of applying findings in practice.
- Consider the cost-benefit of various evaluation efforts.
- Use research utilization as a process (Stetler, 2001) to enhance the credibility of evaluation data. Include two types of evaluation data: formative and outcome.
Refer to Figure 38 on p. 277 of the primary 2001 document and pp. 53-55 of the 2009 version for an overview of the steps of the Stetler model.
These summaries are written by the NCCMT to condense and provide an overview of the resources listed in the Registry of Methods and Tools, and to give suggestions for their use in a public health context. For more information on individual methods and tools included in the review, please consult the authors/developers of the original resources.
We have provided the resources and links as a convenience and for informational purposes only; they do not constitute an endorsement or an approval by McMaster University of any of the products, services or opinions of the external organizations, nor have the external organizations endorsed their resources and links as provided by McMaster University. McMaster University bears no responsibility for the accuracy, legality or content of the external sites.