Darrell A. Dromgoole, Associate Professor and Extension Specialist, Texas A&M AgriLife Extension Service.
Scott Cummings, Associate Department Head and Program Leader; Professor and Extension Specialist, Texas A&M AgriLife Extension Service.
Linking program design, program objectives, and evaluation is critically important when designing Extension programs during the plan phase of the PIE Program Change Model. Yet a framework for making such a connection has not been presented; previous frameworks have tended to focus on either educational programs or evaluation designs. Franz and Townson (2008) provided a framework for classifying educational programs through a quadrant analysis. By plotting content on the x-axis and process (delivery methods) on the y-axis and then overlaying program design domains onto the resulting four quadrants, they identified four distinct domains of educational programming: service, facilitation, content transmission, and transformative education (see Figure 1). These four program domains mirror components of key evaluation models (Bennett, 1975; Bennett & Rockwell, 1995; Kirkpatrick, 1996) often used in Extension.
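The quadrant logic described above can be sketched as a simple classification. This is a minimal illustration, not part of Franz and Townson's framework itself: the normalized 0-1 scores and the 0.5 cutoff are hypothetical, chosen only to show how the two axes jointly determine the domain.

```python
def classify_program(content: float, process: float, threshold: float = 0.5) -> str:
    """Classify an educational program into one of the four domains
    by its content intensity (x-axis) and process/delivery intensity
    (y-axis). Scores are assumed normalized to 0-1; the 0.5 cutoff
    is illustrative, not from the source framework."""
    if content >= threshold and process >= threshold:
        return "transformative education"  # high content, high process
    if content >= threshold:
        return "content transmission"      # high content, low process
    if process >= threshold:
        return "facilitation"              # low content, high process
    return "service"                       # low content, low process
```

Only a program scoring high on both axes lands in the transformative quadrant, which is the domain the PIE Program Change Model emphasizes.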
Figure 1. Four Domains of Educational Programming (Franz and Townson, 2008).
When executing the PIE Program Change Model, Extension educators should focus on the transformative domain (outlined in Figure 1), which requires high levels of both process and content and aims to change behavior and help clientele adopt best practices or new technology (Franz & Townson, 2008). For example, the work of an Extension educator who develops, delivers, and evaluates a comprehensive beef cattle nutrition program for county beef cattle producers to achieve long-term impact would be considered transformative education. Such programs can be evaluated through quasi-experimental or true experimental designs involving follow-up studies and pre- and posttests used to assess, for example, the percentage increase in producers adopting best practices or new technology (Radhakrishna, Chaudhary, & Tobin, 2019). Transformative education theory provides the theoretical framework for designing programs that result in clientele change at a higher level. Figure 2 illustrates an Extension educational model for transformational education.
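As a minimal sketch of the pre-/posttest comparison mentioned above, the percentage-point change in adoption might be computed from survey counts. The producer numbers here are hypothetical, used only to illustrate the arithmetic:

```python
def adoption_change(pre_adopters: int, post_adopters: int, respondents: int) -> float:
    """Percentage-point change in the share of producers adopting a
    recommended practice, comparing a pretest with a follow-up posttest
    of the same group of respondents."""
    pre_rate = 100.0 * pre_adopters / respondents
    post_rate = 100.0 * post_adopters / respondents
    return post_rate - pre_rate

# Hypothetical example: 12 of 60 producers followed a recommended
# nutrition practice before the program, 33 of 60 at follow-up.
change = adoption_change(12, 33, 60)  # 35.0 percentage points
```

A follow-up study conducted months after the program, as the quasi-experimental designs cited above suggest, helps distinguish lasting adoption from short-term intent.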
Figure 2. Texas A&M AgriLife Extension Service Extension Educational Continuum.
Transformational education builds upon Extension’s long history of providing quality educational experiences for clientele. Teaching specific disciplines and transferring research-based information has been, and remains, a hallmark of Extension since its inception. AgriLife Extension historically emphasized a variety of approaches to traditional information transfer. Since the 1980s, however, AgriLife Extension programs have focused not just on discipline- or information-oriented needs but have shifted toward issue-based needs that require a more multidisciplinary approach. Extension is operating in a very competitive environment: clientele have many options for accessing educational information from competing educational enterprises, agriculture manufacturing companies, private consultants, the internet, health care providers, and other outreach sources (Blewett, Keim, Leser, & Jones, 2008). However, AgriLife Extension is uniquely positioned to deploy transformational Extension education because of its extensive educational network of county Extension agents and specialists. If transformational education is an approach that can deliver the most value to communities, it is essential to design educational programs more consistently to lay the foundation for transformational learning and action in communities.
When designing transformative educational programs, it is imperative that Extension educators take a deliberate approach that connects program design, objectives, and evaluation during the plan phase of the PIE Program Change Model. Figure 3 illustrates how program objectives are linked to educational design/teaching points, evaluation methods, and outcome indicators.
Figure 3. Linking Program Objectives with Program Design/Teaching Points, Evaluation Method and Outcome Indicators.
To demonstrate how to link program design and program objectives with evaluation design, Table 1 provides an example in the transformative domain, indicating the program objective(s), educational program design/teaching points, evaluation method(s), and outcome(s) (Radhakrishna et al., 2019).
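The linkage illustrated in Table 1 can be represented as a simple record tying each objective to its design, evaluation method, and outcome indicator. The beef cattle nutrition values below are hypothetical placeholders, not taken from the actual table:

```python
from dataclasses import dataclass

@dataclass
class ProgramLink:
    """One row of a design-to-evaluation linkage table: a program
    objective tied to its teaching points, evaluation method, and
    outcome indicator."""
    objective: str
    teaching_points: list
    evaluation_method: str
    outcome_indicator: str

# Hypothetical row for the beef cattle nutrition example.
nutrition_program = ProgramLink(
    objective="Producers adopt improved beef cattle nutrition practices",
    teaching_points=["forage testing", "supplemental feeding",
                     "body condition scoring"],
    evaluation_method="pre/post-test with follow-up study",
    outcome_indicator="% of producers adopting a recommended practice",
)
```

Writing each objective as a row like this during the plan phase makes it harder to leave an objective without a matching evaluation method or outcome indicator.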
Table 1. Linking Program Design with Objectives and Evaluation for Transformative Education.
Implementing the new AgriLife Extension PIE Program Change Model requires a new mindset that moves clientele from obtaining knowledge to taking action in order to change behavior, adopt new practices, or adopt new technology that results in economic, social, or environmental impacts. In describing transformative outcomes, O’Sullivan, Morrell, and O’Connor (2002) reported that transformative learning involves experiencing a deep, structural shift in the basic premises of thought, feelings, and actions. The basic premise of transformative Extension education is that learning, or knowledge, is a precursor to action or change.
It should be emphasized that allowing time between learning experiences is imperative to providing a more optimal educational environment for clientele (Gray et al., 2016). Time between educational experiences gives clientele the opportunity to reflect on and synthesize information while determining how it applies to their specific situations.
Bennett, C. (1975). Up the hierarchy. Journal of Extension, 13(2). Available at: https://www.joe.org/joe/1975march/1975-2-a1.pdf
Bennett, C., & Rockwell, K. (1995, December). Targeting outcomes of programs (TOP): An integrated approach to planning and evaluation. Unpublished manuscript. Lincoln, NE: University of Nebraska.
Blewett, T., Keim, A., Leser, J., & Jones, J. (2008). Designing a transformational education model for the engaged university. Journal of Extension. Retrieved from https://www.joe.org/joe/2008june/comm1.php
Campbell, D. T., & Stanley, J. (1963). Experimental and quasi-experimental designs for research. Chicago, IL: Rand-McNally.
Franz, N., & Archibald, L. (2018). Four approaches to building Extension program evaluation capacity. Journal of Extension, 56(4), Article 4TOT5. Available at: https://www.joe.org/joe/2018august/tt4.php
Franz, N., & Townson, L. (2008). The nature of complex organizations: The case of Cooperative Extension. In M. T. Braverman, M. Engle, M. E. Arnold, & R. A. Rennekamp (Eds.), Program evaluation in a complex organizational system: Lessons from Cooperative Extension (pp. 5–14). New Directions for Evaluation, 120. San Francisco, CA: Jossey-Bass. doi:10.1002/ev.272
Gray, D., Sewell, A., Hartnett, M., Wood, B., Kemp, P., Blair, H., Kenyon, P., & Morris, S. (2016). Improved extension practices for sheep and beef farmers. Hill Country – Grassland Research and Practice Series, 16.
Kirkpatrick, D. (1996). Great ideas revisited: Techniques for evaluating training programs. Training and Development, 50, 54–59.
O’Sullivan, E., Morrell, A., & O’Connor, M. (2002). Expanding the boundaries of transformative learning: Essays on theory and praxis. New York, NY: Palgrave.
Radhakrishna, R., Chaudhary, A., & Tobin, D. (2019). Linking Extension program design with evaluation design for improved evaluation. Journal of Extension, 57(4), Article 4TOT1. Available at: https://www.joe.org/joe/2019august/tt1.php
Radhakrishna, R. B., & Relado, R. Z. (2009). A framework to link evaluation questions to program outcomes. Journal of Extension, 47(3), Article 3TOT2. Available at: https://www.joe.org/joe/2009june/tt2.php