Speaker 1: In the late 1990s, the CDC convened an evaluation working group charged with developing a framework that summarizes and organizes the basic elements of program evaluation. The CDC's Framework for Program Evaluation was developed as a guide for public health professionals in planning evaluations systematically. The framework was developed to summarize and organize the essential elements of program evaluation, provide a common frame of reference for conducting evaluations, clarify the steps in program evaluation, review standards for effective program evaluation, and address misconceptions about the purposes and methods of program evaluation. The framework includes six steps in evaluation practice and four standards for effective evaluation. Central to the framework are the standards adopted from the Joint Committee on Standards for Educational Evaluation.

The first standard is Utility. These considerations are directed toward ensuring that an evaluation will serve the information needs of intended users. They include identifying and addressing the needs of relevant stakeholders; ensuring competent, trustworthy, and credible evaluators; collecting a broad range of information addressing the interests of the program, clients, and stakeholders; describing the perspectives underlying the interpretation of findings; reporting findings in a clear and easy-to-understand way; reporting as evaluation findings become available; and evaluating in a way that encourages follow-through by stakeholders.

The second standard is Feasibility. These considerations are intended to ensure that an evaluation will be realistic, prudent, diplomatic, and frugal. Evaluation procedures should be practical, the evaluation should be carried out in a way that avoids politicizing or misusing the results, and the evaluation should be cost-effective.

The third standard is Propriety. These considerations are intended to ensure that an evaluation will be conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation, as well as those affected by its results. One such consideration is Service Orientation: evaluation should be designed to assist organizations to address and effectively serve the needs of the full range of targeted participants. These standards are similar in nature to those used in institutional review of human subjects research.

The fourth standard is Accuracy. These considerations are intended to ensure that an evaluation will reveal and convey technically adequate information about the features that determine the worth or merit of the program being evaluated. They include reliable documentation of findings using appropriate analyses with justifiable conclusions.

Step one involves engaging the stakeholders. Who are the stakeholders? Anyone who might be affected by the program or policy. Examples of stakeholders might be health department and program staff, program participants and clients, staff in other programs, and potential competitors. Why is it important to include stakeholders? To get agreement on program goals. To get agreement on the purpose of the evaluation; one size doesn't fit all. To bring together lay and professional resources. To build capacity to address health needs and give more control over the factors affecting health. And because it increases the credibility of the evaluation and increases the likelihood that recommendations from the evaluation will be implemented.

The second step of the framework is to describe the program. It is helpful to think in terms of a logic model.
Description of the program should include information about the long-term goals and objectives of the program, as well as the strategies for reaching those goals. In particular, these aspects should be addressed: the need to be addressed by the program; the expected effects of the program, that is, what constitutes success; the stage the program is in, whether planning, implementation, or effects; the context in which the program operates, including history, geography, politics, social and economic conditions, and other factors that can affect results; and a flow chart or logic model that describes how the program is supposed to work.

The third step of the framework is to focus the evaluation design. At issue is striking a balance between meeting the concerns of stakeholders and using resources efficiently. Things to consider at this step are defining the purpose of the evaluation as established by stakeholders; ensuring user participation in determining the focus of the evaluation; identifying and prioritizing the different uses to which evaluation results might be put; determining the questions to be asked (take some time here to collect questions, and identify the need for process, impact, and/or outcome evaluation); specifying evaluation methods, which includes the design of the evaluation and the selection of measures; and developing explicit agreements that summarize procedures and clarify roles and responsibilities, the use of resources, and the protection of human subjects.

The fourth step in the framework is gathering credible evidence. Considerations in gathering credible evidence include which indicators will be used, what sources of evidence are available or useful, what the quality of the data is, how much evidence is necessary, and what the logistics are for gathering this information. Use of multiple procedures for gathering data is important.

The fifth step in the framework is justification of conclusions. Evaluation conclusions are justified when they are linked to the evidence gathered and judged against agreed-upon values or standards set by the stakeholders. Stakeholders must agree that conclusions are justified before they will use the evaluation results with confidence. Elements involved in ensuring justification for conclusions are the use of appropriate analyses for examining the data, summarizing findings, and looking for patterns in the results; interpreting what the findings mean; and making judgments regarding the merit, worth, and significance of the program. This involves comparison of findings and interpretations against agreed-upon standards. Standards reflect the values of stakeholders and are the basis for forming judgments. Finally, recommendations are actions for consideration based on the results. Recommendations regarding the use of evaluation results should be made with regard to the organizational context in which programming decisions will be made.

The final step in the framework involves ensuring that lessons learned from the evaluation are shared in a planned, thoughtful way so that they may be used effectively. Elements that ensure effective use of evaluation information are as follows. Design: remember the third step of the framework? The evaluation design should be focused at that step to ensure effective use of evaluation results. Preparation: steps taken to plan for or rehearse the eventual use of the evaluation findings. How might your findings affect the decision-making of the stakeholders? Feedback: providing feedback on findings to all parties to the evaluation builds trust and keeps the evaluation activities on track.
Follow-up: this includes providing support to the users of the evaluation. Follow-up can also help avoid possible misuse of findings. Dissemination: findings and lessons learned about the program should be communicated to relevant audiences in a timely and appropriate manner. Reports can take many forms: a policy brief, press release, full report, brief report, or executive summary. And that's it. We've completed the framework.