Best Practice Model for Evaluation

© By Barry Sweeny, 2010

The following list presents the basic program evaluation model I use to guide my own consulting and program evaluation work. In practice, of course, the steps are more complex than they appear here. Nevertheless, I share the model to help you evaluate your own mentoring program. Following this sequence will assure you that your evaluation includes the right steps in the process. Other pages on this web site will help you be sure that you also use best practices as you carry out each step.

The original ideas for this model are loosely derived from the research and work of both Tom Guskey and Donald Kirkpatrick (see the citations below). The model has been developed through use and refinement over about 20 years of work in curriculum, staff development, grants, and many other kinds of programs. It has been refined and adapted to the point where I feel it represents a “best practice” model.

Please note – the “Tools” menu has a version of this list with added columns for check-offs and notes.

Steps in the Process:

  1. Set the purpose and the goals for the evaluation
  2. Identify the audiences for the conclusions and recommendations
  3. Define the indicators of success for each audience
  4. Check the alignment (relevance) of each indicator to program goals
  5. Establish the evaluation parameters – what you will and will not do
  6. Determine the scope of the evaluation process
  7. Organize the indicators by data type:
    1. Reactions (survey attitudes)
    2. Learning (self-assessment/test)
    3. Behavior (observe extent of use)
    4. Needs Assessment (obstacles to use? needs?)
    5. Results (effectiveness, gains)
  8. State the evaluation questions
  9. Refine the evaluation questions for:
    1. Measurability
    2. Feasibility
    3. Clarity
    4. Validity & Fairness
    5. Consensus
  10. Select how to collect the data:
    1. Method
    2. Design/select the instrument(s)
    3. Check the need to validate instrument(s)
  11. List the evaluation implementation plan:
    1. List the activities
    2. Set the time line
    3. Decide the persons responsible
  12. Estimate the cost of the evaluation
  13. Gain approval to do the evaluation plan
  14. Collect the data
  15. Analyze and interpret the data
  16. Evaluate the evaluation
  17. Report the conclusions & offer recommendations
  18. Develop improvement priorities (what to focus on) and targets (how much improvement is expected)
  19. Develop an improvement action plan

REFERENCES:

Guskey, T. (1998). “The Age of Our Accountability.” Journal of Staff Development, Fall, 36-44. National Staff Development Council. (This is not the original source I used, but it is a good and more recent one.)

Kirkpatrick, D. (1975). Evaluating Training Programs. Madison, WI: American Society for Training and Development.

Kirkpatrick, D. (1977). “Evaluating Training Programs: Evidence vs. Proof.” Training and Development Journal, 31(11), 9-12.