Darrell A. Dromgoole, Associate Professor and Extension Specialist, Texas A&M AgriLife Extension Service.
Scott Cummings, Associate Department Head and Program Leader; Professor and Extension Specialist, Texas A&M AgriLife Extension Service.
In recent years, Extension educators have experienced increased pressure from federal, state, and local governments; funding entities; and land-grant university administrators for greater program effectiveness and accountability (Dunifon, Duttweiler, Pillemer, Tobias, & Trochim, 2004; Mincemoyer et al., 2008). This increased demand for educational program accountability has fostered a greater need for, and understanding of, evidence-based programming in Extension (Fetsch, MacPhee, & Boyer, 2012).
Despite some inconsistencies in how evidence-based programming is defined in the research literature, most researchers and Extension educators agree that such initiatives are built upon a sound theoretical and/or empirical base and that their effectiveness has been demonstrated through high-quality outcome evaluations (Catalano, Berglund, Ryan, Lonczak, & Hawkins, 2004; Clearinghouse for Military Family Readiness, n.d.; Elliott & Mihalic, 2004; Flay et al., 2005; Olson, 2010; Small, Cooney, & O’Connor, 2009). According to Small, Cooney, and O’Connor (2009, p. 1), evidence-based programs “are well-defined programs that have demonstrated their efficacy through rigorous, peer-reviewed evaluations and have been endorsed by government agencies and well-respected research organizations. Evidence-based programming is not simply characterized by known effectiveness; it is also well documented so that it is more easily disseminated.” This type of programming is rooted in the evidence-based medicine movement that began to emerge in the 1990s (Claridge & Fabian, 2005) and has recently extended to a variety of program areas, including the child-, youth-, and family-focused programs commonly implemented by Extension educators.
Researchers and Extension education scholars have articulated that using evidence-based programs has a variety of benefits for Extension educators (Dunifon et al., 2004; Fetsch et al., 2012; Hill & Parker, 2005):
In addition, the increasing availability of established evidence-based programs encourages more efficient programming by eliminating the need to “reinvent the wheel” when developing new programs (Fetsch et al., 2012; Olson, 2010). Because of these benefits, a variety of education scholars have called on Extension administrators, faculty, and staff to increase their commitment to implementing evidence-based programs (Dunifon et al., 2004; Fetsch et al., 2012; Hill & Parker, 2005).
Historically, Extension faculty have conducted research with the expectation that the knowledge generated would be disseminated through local offices to address the issues and problems of communities (Dunifon et al., 2004). This practice of research synthesis, translation, and dissemination in the Texas A&M AgriLife Extension Service is consistent with the recent advancement of “evidence-based” research, which includes a thorough scientific review of the research literature, identification of the most effective interventions or strategies, and a commitment to translating the results of this process into guidelines for best educational practice.
The following features generally characterize evidence-based programming approaches (Dunifon et al., 2004):
What makes the evidence-based program movement unique, and sets it apart from other systems for moving science to practice, is its emphasis on statistical analyses of qualified existing studies and on guidelines developed through a rigorous process of analysis and review, all set within a framework that views the science-to-practice continuum as a formal system for the diffusion of research (Rogers, 2003). Figure 1 illustrates the science-to-practice relationship for evidence-based programs:
Figure 1. Science to Practice Relationship for Evidence-Based Programs (Rogers, 2003).
According to Flay, Biglan, Boruch, Castro, Gottfredson, Kellam, and Ji (2005), for an educational intervention to be considered effective, it should:
Programs that are ready for broad dissemination must also provide evidence of the ability to “go to scale,” cost information, and monitoring and evaluation tools so that adopting agencies can assess how well the intervention works in their settings. Fundamentally, then, “evidence-based” entails more than collecting data on client satisfaction or baseline and post-test measures from a select group of participants (Dunifon et al., 2004). Procedural rigor and in-depth documentation of program processes are essential features of evidence-based programs (Dunifon et al., 2004).
There are several persuasive arguments in favor of evidence-based programs in Extension (Dunifon et al., 2004). The use of formal methods and reliance on panels of scientists to review results encourage a more thorough and rigorous review of research than lone-investigator literature reviews tend to produce (Dunifon et al., 2004). Additionally, formal recommendations, best practices, and guidelines help to ensure a higher degree of consonance between the system of science-based knowledge generation and the world of practice (Dunifon et al., 2004).
In an age when information overload is a significant concern, it is often difficult for Extension educators to distinguish legitimate scientific claims from pseudoscience (Dunifon et al., 2004). Additionally, Extension educators can become entrenched in “program tradition,” in which program decisions are based on anecdotal evidence and intuition rather than on the most recent science (Dunifon et al., 2004).
Evidence-based programming offers a systematic approach for summarizing the best that current science has to offer in an area and for packaging the programs and interventions that have been tested in a way that is accessible to the Extension educator. Proponents maintain that evidence-based programming strategies have been transformative, have improved practice, and have produced a paradigm shift in the education of Extension educators (Davidoff, 1999; Hoge, Jacobs, Belitsky, & Migdole, 2002).
The challenge for Extension educators in the future will be to rigorously examine the empirical research base of the programs they deliver and to design new programs modeled on those that have been proven effective through evidence-based reviews (Dunifon et al., 2004).
The bottom line is that Extension should enthusiastically embrace the development of more evidence-based programs to link scientific evidence and practice. Evidence-based practice entails a thorough scientific review of the research literature, the identification of the most effective interventions or strategies, and a commitment to translating the results of this process into guidelines for practice (Dunifon et al., 2004).
Catalano, R. F., Berglund, M. L., Ryan, J. A. M., Lonczak, H. S., & Hawkins, J. D. (2004). Positive youth development in the United States: Research findings on evaluations of positive youth development programs. The Annals of the American Academy of Political and Social Science, 591, 98-124.
Claridge, J. A., & Fabian, T. C. (2005). History and development of evidence-based medicine. World Journal of Surgery, 29(5), 547-553.
Clearinghouse for Military Family Readiness. (n.d.). Programs. Retrieved from http://www.militaryfamilies.psu.edu/programs
Davidoff, F. (1999). In the teeth of the evidence: The curious case of evidence- based medicine. Mount Sinai Journal of Medicine, 66(2), 75-83.
Dunifon, R., Duttweiler, M., Pillemer, K., Tobias, D., & Trochim, W. M. K. (2004). Evidence-based Extension. Journal of Extension [On-line], 42(2), Article 2FEA2. Available at http://www.joe.org/joe/2004april/a2.php
Elliott, D. S., & Mihalic, S. (2004). Issues in disseminating and replicating effective prevention programs. Prevention Science, 5, 47–53.
Fetsch, R. J., MacPhee, D., & Boyer, L. K. (2012). Evidence-based programming: What is a process an Extension agent can use to evaluate a program’s effectiveness? Journal of Extension [On-line], 50(5), Article 5FEA2. Available at http://www.joe.org/joe/2012october/a2.php
Flay, B. R., Biglan, A., Boruch, R. F., Castro, F. G., Gottfredson, D., Kellam, S., & Ji, P. (2005). Standards of evidence: Criteria for efficacy, effectiveness and dissemination. Prevention Science, 6, 151-175.
Hill, L. G., & Parker, L. A. (2005). Extension as a delivery system for prevention programming: Capacity, barriers, and opportunities. Journal of Extension [On-line], 43(1), Article 1FEA1. Available at http://www.joe.org/joe/2005february/a1.php
Hoge, M. A., Jacobs, S., Belitsky, R., & Migdole, S. (2002). Graduate education and training for contemporary behavioral health practice. Administration and Policy in Mental Health, 29(4-5), 335-357.
Mincemoyer, C., Perkins, D., Ang, P., Greenberg, M. T., Spoth, R. L., Redmond, C., & Feinberg, M. (2008). Improving the reputation of Cooperative Extension as a source of prevention education for youth and families: The effects of the PROSPER model. Journal of Extension [On-line], 46(1), Article 1FEA6. Available at https://www.joe.org/joe/2008february/a6.php
Olson, J., Welsh, J., & Perkins, D. (2015). Evidence-based programming within Cooperative Extension: How can we maintain program fidelity while adapting to meet local needs? Journal of Extension [On-line], 53(3), Article 3FEA3. Available at https://www.joe.org/joe/2015june/pdf/JOE_v53_3a3.pdf
Olson, J. R. (2010). Choosing effective youth-focused prevention strategies: A practical guide for applied family professionals. Family Relations, 59, 207-220.
Rogers, E. M. (2003). Diffusion of Innovations (5th ed.). New York, NY: Free Press.
Small, S. A., Cooney, S. M., & O’Connor, C. (2009). Evidence-informed program improvement: Using principles of effectiveness to enhance the quality and impact of family-based prevention programs. Family Relations, 58, 1-13.