Darrell A. Dromgoole, Associate Professor and Extension Specialist, Texas A&M AgriLife Extension Service.
Scott Cummings, Associate Department Head and Program Leader; Professor and Extension Specialist, Texas A&M AgriLife Extension Service.
Much of our effort as Extension educators and specialists focuses on providing people with credible, research-based information that ultimately results in behavioral change. Although we spend considerable time developing curricula and learning objectives, and sometimes even ways of measuring those objectives, actually finding out whether people have moved from knowledge to action is often perceived as labor intensive.
When executing the PIE Program Change Model, Extension educators should focus on transformative educational approaches that require high levels of both process and content, with the goal of changing behavior and having clientele adopt best practices or new technology (Franz & Townson, 2008). For example, the work of an Extension educator who develops, delivers, and evaluates a comprehensive beef cattle nutrition program for county beef cattle producers to achieve long-term impact would be considered transformative education. Transformative education theory provides the theoretical framework for designing programs that result in higher-level change among clientele. Figure 1 illustrates an Extension educational model for transformational education.
Figure 1. Texas A&M AgriLife Extension Service Extension Educational Continuum.
This article provides suggestions for following up with program participants and measuring behavior change several months after program completion. The context for these recommendations stems from Extension’s community development programming, but the strategies could easily be applied to other content areas (Chazdon, Horntvedt, & Templin, 2016). The particular method discussed in this article is called the “action items method” (Chazdon, Horntvedt, & Templin, 2016). Unlike other approaches for measuring behavior change, this method requires program participants to define their own action plans as part of the program and then asks them, several months after the program ends, whether they have completed those goals (Chazdon, Horntvedt, & Templin, 2016).
Before describing the action items method, it is worth noting the context for behavior change measurement in Extension education. In 1975, Claude Bennett documented in the Journal of Extension that behavior change was among the highest levels of evidence for evaluation of Extension education (Bennett, 1975). Since then, Extension has made progress in measuring behavior change (Chazdon, Horntvedt, & Templin, 2016). Many examples of effective behavior change evaluation have been published in the Journal of Extension (Clements, 1999; Garst & Bruce, 2003; Garton et al., 2003; Jayaratne, Harrison, & Bales, 2009; Koszewski, Sehi, Behrends, & Tuttle, 2011). Workman and Scheer’s (2012) meta-analysis of evaluation articles published in the Journal of Extension found that about 27% of articles focused on behavior change. Yet, as Workman and Scheer noted, “Too often, Extension personnel fail to document impact of programs by collecting real evidence of behavior change or greater end results that benefit society” (Problem Statement, Purpose, and Objectives section, para. 1).
Most important, perhaps, is that the National Institute of Food and Agriculture continues to push for impacts that affect conditions rather than simply knowledge changes (Chazdon, Horntvedt, & Templin, 2016). Its effort to collect impacts from across the country encourages evaluation specialists to look beyond knowledge change (National Institute of Food and Agriculture, 2015).
The University of Minnesota’s Extension Center for Community Vitality developed an action items method to measure behavior change in several of its leadership workshops and in one leadership cohort program (Chazdon, Horntvedt, & Templin, 2016). To clarify the action items approach, several core components of the method were identified; these are shown in Table 1, which describes how the method is conducted in two distinct program contexts (Chazdon, Horntvedt, & Templin, 2016). The components of the method are program delivery enhancements, program evaluation enhancements, post-program enhancements, and a feedback loop (Chazdon, Horntvedt, & Templin, 2016):
Table 1. Components of the Action Items Method for a Workshop and a Cohort Program (Chazdon, Horntvedt, & Templin, 2016)
Program contexts:
· One-time workshop example: eMarketing workshops (target audience = small, locally owned retail and service businesses)
· Cohort program example: Red River Valley Emerging Leadership Program (target audience = 30- to 45-year-olds who meet for four sessions during a 5-month period)

Program delivery enhancements
One-time workshop:
· Prior to 2015: Evaluations focused on knowledge gain. Participants were asked to identify action items, but there was limited focus in the curriculum on behavior change or action goals.
· Beginning in 2015: Prompted by training on brain research and adult education theory, the team examined how curricula could better identify action steps for workshop participants.
Cohort program:
· Prior to 2013: Participants were encouraged to set personal goals as they completed the program.
· Beginning in 2013: Staff began collecting “action items” at the end of the program; participants were asked to list 3–5 items.
· In 2014–15: A 2-hr workshop was added to encourage participants to reflect on leadership learning.

Program evaluation enhancements
One-time workshop:
· A form, prepared in duplicate, is completed at the end of the meeting. The participant tears off a copy to take away and turns in a copy to Extension staff.
· At the end of the event, participants are asked the following questions: Thinking about what you’ve learned here today, what specific actions do you intend to take in the next few months? Is there anything you decided not to do as a result of this session? Are there other individuals you plan to share this information with? If so, who? (Example: my employees, my city council, my banker.)
Cohort program:
· A worksheet, My Action Items, is provided to participants to record details about their action steps. The worksheet includes examples (to jump-start their thinking), prompts for detailing each action item, and a required signature (to instill a sense of ownership).
· Staff collect the forms, scan them, and mail the originals back to participants within 1 month.

Post-program enhancements
One-time workshop (evaluator-led):
· A Qualtrics survey, customized with each participant’s action items, is emailed to participants, typically 3 months after the end of the program (each community economics offering has a slightly different time frame); one way such a customized contact file might be assembled is sketched after this table.
· Qualtrics allows the evaluator to send the survey under the name of the educator, which increases response rates.
Cohort program (educator-led):
· Educators send regular mail (letters and/or postcards) monthly and follow up with personal emails monthly.
· Educators connect with participants via social media (a closed Facebook group for participants) and through one-on-one contacts initiated by participants.
· Educators use a Qualtrics survey (16 months after the program) for evaluation, customized with each participant’s action items.

Feedback loop
One-time workshop:
· The actions that participants list alert staff to topics for new curricula.
· The actions that participants list help determine whether marketing materials were clear regarding learning objectives.
· Comparing action items with what people actually did helps identify where more detailed instruction is needed in the curriculum.
Cohort program:
· Educators review action plans to identify areas in which participants might need support and then use that information to design an alumni retreat (3 months after the session) for continued learning.
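For teams that track participant responses in a spreadsheet, the customization step noted in Table 1 (merging each participant’s own action items into the follow-up survey) can be scripted. The short Python sketch below is illustrative only and is not part of the Minnesota method; the file names and column headings are assumptions. Many survey platforms, including Qualtrics, can pipe participant-specific text into questions once it is uploaded alongside the contact list, although the exact upload format varies by platform.

# Illustrative sketch: reshape self-reported action items into one row per
# participant so a survey platform can personalize follow-up questions.
# File names and column headings ("email", "name", "action_item") are assumptions.
import csv
from collections import defaultdict

actions_by_participant = defaultdict(list)
names = {}
# action_items.csv is assumed to hold one row per reported action item.
with open("action_items.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        actions_by_participant[row["email"]].append(row["action_item"].strip())
        names[row["email"]] = row["name"]

# Write one row per participant, with each action item in a numbered column
# (e.g., for upload as embedded data alongside a survey contact list).
max_items = max(len(items) for items in actions_by_participant.values())
fieldnames = ["email", "name"] + [f"action_{i + 1}" for i in range(max_items)]
with open("followup_contacts.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()
    for email, items in actions_by_participant.items():
        row = {"email": email, "name": names[email]}
        row.update({f"action_{i + 1}": item for i, item in enumerate(items)})
        writer.writerow(row)

The resulting file holds one row per participant, with each stated action item in its own column, so a follow-up question such as “You planned to: [action_1]. Have you completed this?” can be generated for each person.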
The following take-home points emerged from the work done in Minnesota (Chazdon, Horntvedt, & Templin, 2016):
By encouraging participants to identify personal action items, we remind ourselves, as well as the participants, of the continuing value of applying the knowledge gained, of evaluation, and of adding value to their lives, organizations, and communities (Chazdon, Horntvedt, & Templin, 2016). To the extent that participants identify specific behavioral changes that move them beyond their individual lives, we also exemplify the public value of Extension programming (Chazdon & Paine, 2014; Franz, 2011; Kalambokidis, 2004).
Bennett, C. (1975). Up the hierarchy. Journal of Extension [online], 13(2), 7–12. Available at: http://www.joe.org/joe/1975march/1975-2-a1.pdf
Chazdon, S., Horntvedt, J., & Templin, E. (2016). From knowledge to action: Tips for encouraging and measuring program-related behavior change. Journal of Extension [online], 54(2). Available at: https://joe.org/joe/2016april/tt1.php
Chazdon, S., & Paine, N. (2014). Evaluating for public value: Clarifying the relationship between public value and program evaluation. Journal of Human Sciences and Extension, 2(2), 100–119. Retrieved from http://media.wix.com/ugd/c8fe6e_8b2458db408640e580cfbeb5f8c339ca.pdf
Clements, J. (1999). Results? Behavior change! Journal of Extension [online], 37(2) Article 2COM1. Available at: http://www.joe.org/joe/1999april/comm1.php
Franz, N. K. (2011). Advancing the public value movement: Sustaining Extension during tough times. Journal of Extension [online], 49(2) Article 2COM2. Available at: http://www.joe.org/joe/2011april/comm2.php
Franz, N., & Townson, L. (2008). The nature of complex organizations: The case of Cooperative Extension. In M. T. Braverman, M. Engle, M. E. Arnold, & R. A. Rennekamp (Eds.), Program evaluation in a complex organizational system: Lessons from Cooperative Extension (pp. 5–14). New Directions for Evaluation, 120. San Francisco, CA: Jossey-Bass. doi:10.1002/ev.272
Garst, B. A., & Bruce, F. A. (2003). Identifying 4-H camping outcomes using a standardized evaluation process across multiple 4-H educational centers. Journal of Extension [online], 41(3) Article 3RIB2. Available at: http://www.joe.org/joe/2003june/rb2.php
Garton, M., Hicks, K., Leatherman, M., Miltenberger, M., Mulkeen, P., Nelson-Mitchell, L., & Winland, C. (2003). Newsletters: Treasures or trash? Parenting newsletter series results in positive behavior changes. Journal of Extension [online], 41(1) Article 1RIB5. Available at: http://www.joe.org/joe/2003february/rb5.php
Jayaratne, K. S. U., Harrison, J. A., & Bales, D. W. (2009). Impact evaluation of food safety self-study Extension programs: Do changes in knowledge relate to changes in behavior of program participants? Journal of Extension [online], 47(3) Article 3RIB1. Available at: http://www.joe.org/joe/2009june/rb1.php
Kalambokidis, L. (2004). Identifying the public value in Extension programs. Journal of Extension [online], 42(2) Article 2FEA1. Available at: http://www.joe.org/joe/2004april/a1.php
Koszewski, W., Sehi, N., Behrends, D., & Tuttle, E. (2011). The impact of SNAP-ED and EFNEP on program graduates 6 months after graduation. Journal of Extension [online], 49(5) Article 5RIB6. Available at: http://www.joe.org/joe/2011october/rb6.php
National Institute of Food and Agriculture. (2015). NIFA 2015 Impacts Report. Retrieved from http://nifa.usda.gov/sites/default/files/resource/NIFA%202015%20Impact%20Report%20Web%20Version.pdf
Workman, J. D., & Scheer, S. D. (2012). Evidence of impact: Examination of evaluation studies published in the Journal of Extension. Journal of Extension [online], 50(2) Article 2FEA1. Available at: http://www.joe.org/joe/2012april/a1.php