The Program Evaluation KnowledgeBase is an online resource that helps education professionals understand the basics of program evaluation so they can properly assess projects and programs. It is organized around three elements to guide educators through their program evaluation.
Purpose: Planning how to conduct a program evaluation is the essential first step. The preparatory thinking involves understanding the program being evaluated, organizing an evaluation team, and determining how to conduct the evaluation. Element 1 outlines the pre-planning tasks.
Guideline: The evaluation questions determine what data is collected and influence how the data is collected. The collection methods should focus on the project's needs. When identifying which methods to use, consider the resources they require, their responsiveness to program participants, their credibility, and whether the resulting information will be useful to the evaluation process.
This checklist provides guidance in organizing data collection efforts.
This worksheet from the W.K. Kellogg Foundation Evaluation Handbook lists possible evaluation activities for each project stage from planning to policy.
This tool from the U.S. Department of Education's An Idea Book for Planning is useful for managing the data collected during the needs assessment. It consists of two parts: Data Sources Matrix and Data Collection and Analysis Plan.
"The Data Sources Matrix helps organize needs assessment data collection by identifying information sources and methods of data collection. In the matrix, fill in specific sources of information you already have on hand from the school profile (e.g., student achievement data, results from a parent survey that are pertinent to the planning effort) so you do not duplicate efforts. Then, list any additional information the team decides to collect. Examine each focus area to make sure that there are data describing the status of major aspects of the priority focus areas."
"The Data Collection and Analysis Plan prioritizes the "focus areas" for which data will be collected and lays out the data collection and analysis plans. First, define the team's key questions, the data collection methods (e.g., surveys, interviews, focus groups, shadowing), and the instruments to be used by analysis subcommittee members, and summarize the plans for analysis. List two to three "focus areas" the team plans to study, in order from highest (#1) to lowest priority for data gathering. Respond to the questions for each focus area."
This chart compares the pros and cons of eight data collection methods.
This document provides an overview of six data collection techniques that can be useful in profiling the school.
This link is to Part II Chapter 3 of the National Science Foundation's User-Friendly Handbook for Mixed Method Evaluations addressing common qualitative methods used in project evaluations.
Purpose: Conducting the evaluation involves designing data collection so that the analysis and interpretation will answer the questions the evaluation sets out to resolve. When developing and implementing the evaluation design, remain flexible so you can collect and analyze data from many perspectives. The collected data should speak directly to the evaluation questions. Element 2 outlines the tasks associated with implementing the evaluation.
Purpose: The evaluation's findings and recommendations have limited value unless they are shared with stakeholders and used to improve the evaluated program. Using the results to improve the program and communicating with constituencies are activities that occur in parallel. Element 3 outlines the tasks associated with using the results.