Purpose: Planning how to conduct a program evaluation is the essential first step. The preparatory thinking involves understanding the program being evaluated, organizing an evaluation team, and determining how to conduct the evaluation. Element 1 outlines the pre-planning tasks.
Guideline: When interpreting collected data, do so within the context of the evaluation questions. Depending on the scope and complexity of the data analysis and the available in-house expertise, it may be necessary to acquire external statistical assistance to aid with the interpretation. External assistance may be particularly useful if the program being evaluated stirs emotions among the stakeholders.
This American Statistical Association document encourages ethical and effective statistical work.
Edward Tufte, author of The Visual Display of Quantitative Information, suggests eight practices to consider to ensure the accurate representation of visual data.
This resource describes, in plain English, "some basic concepts in statistics that every writer should know."
Purpose: Conducting the evaluation involves designing data collection so the analysis and interpretation will answer the questions the evaluation sets out to resolve. When developing and implementing the evaluation design, remain flexible enough to collect and analyze data from many perspectives. Data collection should stay focused on the evaluation questions. Element 2 outlines the tasks associated with implementing the evaluation.
Purpose: The evaluation's findings and recommendations have limited value unless they are shared with the stakeholders and used to improve the evaluated program. Using the results to improve the evaluated program and communicating with constituencies are activities that occur in parallel. Element 3 outlines the tasks associated with using the results.