The Tuesday Letter
Agricultural Experiment Station & Cooperative Extension Service
Tuesday, November 26, 2013
(Vol. 20 No. 5)
IN THIS ISSUE...
WORD FROM THE ASSOCIATE DIRECTOR - EXTENSION AND APPLIED RESEARCH
For this Thanksgiving holiday, I leave you with a suggested read about one of the greatest to have ever lived. An inspirational book, "Our Daily Bread: The Essential Norman Borlaug" by Noel Vietmeyer, tells the life and passion of Norman Borlaug, who found his calling to feed a starving nation and succeeded beyond belief. Dr. Borlaug understood and relentlessly pursued his "WHY". In tribute to Dr. Norman Borlaug, Dr. Noel Vietmeyer, the author of this book, was on the Kansas State University campus for World Food Day, October 16, 2013. Learn more about the author at the above link. The book is also available for download at this link.
And, I wish you all a wonderful Thanksgiving holiday. I truly feel blessed and know that I have so much for which to be thankful. My thanks to each of you for your passion, commitment, and professionalism in making a positive difference in people's lives through your Extension programs.
- Travel safely, especially with night travel and deer in the headlights.
- Take a moment and spread the thanks to others in your workplace, home, and community during this special holiday time.
We all have far more for which we are blessed and for which to be thankful. Those thoughts should overshadow any doubts. And, blessings to you for a wonderful Thanksgiving! --Daryl Buchholz dbuchhol@ksu.edu
AFFORDABLE CARE ACT WORKSHOPS BY AARP
Where: Southwest Research-Extension Center, Garden City, KS
When: Tuesday, December 3, 2013
Time: English version at 10:00 a.m.; Spanish version at 6:00 p.m.
RSVP by December 2 to Lynn Harshbarger, either by calling 620-275-9164 or by email at harshbar@ksu.edu. --Lynn Harshbarger
COMPARING SURVEY DESIGNS
This installment of Ask the OEIE Evaluator was originally developed for and published in the December 20, 2011 edition of the Tuesday Letter.
Asking participants to complete a survey questionnaire is a common strategy for evaluating the impact of Extension programs. There are several designs you may consider for a survey, each with its own advantages and disadvantages. In this installment of Ask the OEIE Evaluator, we discuss three survey designs.
Q: Should I conduct a pre- and post-test? Just a post-test? A retrospective post-then-pre? How do I decide?
These three survey designs (pre-test post-test, post-test only, and retrospective post-then-pre) differ in when participants complete the survey and in the kinds of information sought at that time. Accordingly, each design presents particular advantages and disadvantages.
With a pre-test post-test design, participants are asked to complete a questionnaire at the beginning of a program and then again at the end. The two questionnaires typically include some or all of the same items so that you can compare responses after participation in the program to responses prior to participation. The main advantage of a pre-test post-test is that you may infer the effect of your program on changes in reported attitudes, knowledge, and perhaps behaviors, based on how participants' responses shift from "pre" to "post." One significant disadvantage, however, is that participants may remember or learn from the pre-test, especially if the time between the pre- and post-test is relatively short, and then answer the post-test based on that rather than on what they gained from the program. Sometimes participants also learn from the program that they did not know as much about a topic as they thought they did, and thus rate themselves lower on the post-test than on the pre-test! This is called "response shift bias."
When there are concerns about the time, energy (yours or participants'), and resources needed to administer both a pre-test and a post-test, you may choose a post-test only design. While this is easier to implement, it allows you to assess participants' reported knowledge, attitudes, or behaviors only after the program. That makes it difficult to compare against where participants stood before the program and, thereby, to determine its actual effect. You can sometimes gain a sense of the program effect if you can also gather information from non-participants or identify comparable information (e.g., results from a published study).
Much like the post-test only design, the retrospective post-then-pre design is both time and cost effective. It is administered at the end of the program and asks about participants' knowledge and behavior after, but also before, having participated. By asking participants to reflect back on their knowledge and behavior before the program and compare it to where they stand afterward, the post-then-pre design resembles the pre-test post-test design in allowing inference about the effect of your program on changes in knowledge, attitudes, and/or behavior. It also helps control response shift bias. However, the post-then-pre retrospective design is not without its own limitations. For example, participants (especially children) may not always be able to accurately recall the requested information. Additionally, participants may report a change even if one did not occur, simply because they know they were supposed to change. This is one form of self-report bias that can occur with the post-then-pre retrospective design.
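For those who tabulate their own results, the short sketch below shows one way the change-score arithmetic could look for the two designs that support a before/after comparison. It is only an illustration: the item, the 1-to-5 scale, and all of the scores are invented, and the snippet is not part of any OEIE instrument or tool.

    # Hypothetical illustration only: the item, 1-5 scale, and scores below
    # are invented and do not come from any OEIE instrument.

    def mean(values):
        """Average of a list of numeric ratings."""
        return sum(values) / len(values)

    # Pre-test post-test design: the same item is answered before and after
    # the program, so the change is the post mean minus the pre mean.
    pre_scores = [2, 3, 2, 4, 3]    # self-rated knowledge before the program
    post_scores = [4, 4, 3, 5, 4]   # self-rated knowledge after the program
    pre_post_change = mean(post_scores) - mean(pre_scores)

    # Retrospective post-then-pre design: both ratings are collected at the
    # end, with participants recalling their "before" level, so the arithmetic
    # is the same even though only one questionnaire is administered.
    then_scores = [2, 2, 3, 3, 2]   # recalled "before the program" ratings
    now_scores = [4, 4, 4, 5, 4]    # "after the program" ratings
    retro_change = mean(now_scores) - mean(then_scores)

    print("Pre/post mean change:      %+.2f" % pre_post_change)
    print("Retrospective mean change: %+.2f" % retro_change)

Either way, the mean change is only as trustworthy as the responses themselves, which is why the biases described above matter when choosing a design.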
Questions about evaluation? Visit the Extension Evaluation Resources website, http://apps.oeie.ksu.edu/extension/index.php, or contact Kathleen Gary, ksgary@ksu.edu, or 785-532-5127, at OEIE. --Office of Educational Innovation and Evaluation
RESOURCE CENTER - LOGIC MODEL GUIDEBOOK
A recent addition to the Resource Center is the second edition of The Logic Model Guidebook: Better Strategies for Great Results, published August 21, 2012. The Logic Model Guidebook offers clear, step-by-step support for creating logic models and the modeling process in a range of contexts. Authors Lisa Wyatt Knowlton and Cynthia C. Phillips describe the structures, processes, and language of logic models as a robust tool to improve the design, development, and implementation of program and organization change efforts.
The text is enhanced by numerous visual learning guides (sample models, checklists, exercises, worksheets) and many new case examples. The authors provide students, practitioners, and beginning researchers with practical support to develop and improve models that reflect knowledge, practice, and beliefs. The text includes logic models for evaluation, discusses archetypes, and explores display and meaning.
In an important contribution to programs and organizations, it emphasizes quality by raising issues like plausibility, feasibility, and strategic choices in model creation.
This should be a useful resource for those wanting to develop a better understanding of how to use a logic model framework to guide their programming. To check this or other materials out, go to http://www.ksre.ksu.edu/resourcecenter or call 785-532-6775. --Marie Blythe mblythe@ksu.edu