The Program Evaluation KnowledgeBase is an online resource that helps education professionals understand the basics of program evaluation so they can properly assess projects and programs. It is organized around three elements to assist educators with their program evaluations.
These guidelines deal with the ethical considerations involved in conducting survey research with children and young people for both economic and sociological purposes. ESOMAR, originally founded as the European Society for Opinion and Marketing Research, is the "world association of research professionals."
The Family Educational Rights and Privacy Act (FERPA) is a federal law designed to protect the privacy of student education records. The law applies to all schools receiving funds under any applicable U.S. Department of Education program. School districts should ensure all staff members and outside contractors, such as counselors and interpreters, are familiar with the applicable policies on the privacy of student records.
A research activity involves human subjects if the activity is research, as defined in the U.S. Department of Education's regulations, and the research activity will involve use of human subjects, as defined in the regulations. Such activities must follow the Regulations for the Protection of Human Subjects. When developing a program evaluation, it is important to keep these restrictions in mind.
This link is to the U.S. Department of Education's Office of the Chief Financial Officer webpage, which provides information on federal requirements for protecting human subjects in research. Educators seeking guidance on federal requirements may find this content useful.
Purpose: Planning how to conduct a program evaluation is the essential first step. The preparatory thinking involves understanding the program being evaluated, organizing an evaluation team, and determining how to conduct the evaluation. Element 1 outlines the pre-planning tasks.
Purpose: Conducting the evaluation involves designing data collection so that the analysis and interpretation will answer the questions the evaluation sets out to resolve. When developing and implementing the evaluation design, remain flexible enough to collect and analyze data from many perspectives. Data collection should stay focused on the evaluation questions. Element 2 outlines the tasks associated with implementing the evaluation.
Purpose: The evaluation's findings and recommendations have limited value unless they are shared with stakeholders and used to improve the evaluated program. Using the results to improve the program and communicating with constituencies are activities that occur in parallel. Element 3 outlines the tasks associated with using the results.