Clear title – Write the title on a page of its own. The title page should contain a recognizable name of the project, the dates of the project, and the general focus of the evaluation plan. Uses and users of the evaluation plan – It is essential to describe clearly who will use the evaluation plan and how.
An effective evaluation plan should show how the project will be monitored and how its objectives will be met. An evaluation plan is needed to effectively complete or implement most projects. There are two basic types of evaluation plans: a formative evaluation plan is completed before or during the project, while a summative evaluation plan assesses outcomes after the project is completed.
Writing an Evaluation Plan An evaluation plan is an integral part of a grant proposal that provides information to improve a project during development and implementation. For small projects, the Office of the Vice President for Research can help you develop a simple evaluation plan.
Before embarking on the design of your course evaluation, you should determine what your goals are for this research – what information would you like to acquire? And, what will you use this information for? Do you want feedback on student satisfaction with course structure, effective teaching methods of instructors, or communication of administ...
The purpose of program evaluation is to systematically collect information about program activities and objectives, monitor progress, and to report and communicate results to network members, partners, stakeholders, and community.
The evaluation process can be broken down into a series of steps, from preparation to implementation and interpretation:
- Develop a conceptual model of the project and identify key evaluation points.
- Create evaluation questions and define measurable outcomes.
- Develop an appropriate evaluation design.
- Collect data.
An evaluation plan is a written document that describes how you will monitor and evaluate your program, as well as how you intend to use evaluation results for program improvement and decision making. The evaluation plan clarifies how you will describe the “What,” the “How,” and the “Why It Matters” for your program.
The 5 Step Approach to Evaluation: Designing and Evaluating Behaviour Change Interventions
- Foreword
- Background: The tricky business of assessing impact in a messy world
- The 5-Step approach
- Step 1: Identify the problem
- Step 2: Review the evidence
- Step 3: Draw a logic model
- Step 4: Monitor your logic model
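Step 3 above calls for drawing a logic model. As a rough illustration only (the field names here are assumptions, not part of any cited framework), the main elements of a logic model can be sketched as a simple data structure, which makes the chain from resources to intended results explicit:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Illustrative sketch of a logic model: resources through intended results."""
    inputs: list = field(default_factory=list)               # resources invested
    activities: list = field(default_factory=list)           # what the program does
    outputs: list = field(default_factory=list)              # direct products of activities
    short_term_outcomes: list = field(default_factory=list)  # early changes expected
    long_term_outcomes: list = field(default_factory=list)   # ultimate intended impact

# Hypothetical example entries for a workshop-based program
model = LogicModel(
    inputs=["staff time", "grant funding"],
    activities=["deliver workshops"],
    outputs=["12 workshops held"],
    short_term_outcomes=["participants gain knowledge"],
    long_term_outcomes=["sustained behaviour change"],
)
```

Writing the model down in this form also supports Step 4 (monitoring): each list gives concrete items whose progress can be checked against the plan.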
The essential first step in an evaluation is identifying the purpose of the evaluation and who the main audience will be. This will help determine the type of evaluation you do and the data you collect. The purpose of an evaluation is often linked to the main intended audience.
Good evaluation is replicable and its methods are as rigorous as circumstances allow. A good evaluation is one that is likely to be replicable, meaning that someone else should be able to conduct the same evaluation and get the same results.
In general, evaluation processes go through four distinct phases: planning, implementation, completion, and reporting.
In this section, each of the four phases is discussed:
- Planning
- Implementation – formative and process evaluation
- Completion – summative, outcome, and impact evaluation
- Dissemination and reporting
Steps for Effective Evaluation
- Understand the evaluation environment. Determine the objective and target group.
- Create a logic model. It explains how and why a project's activities are expected to lead to desired outcomes.
- Determine benchmarks and indicators.
- Outline an evaluation schedule.
- Pick evaluation tools.
Evaluation in Six Steps
- Plan the program/Collect information.
- Write objectives.
- Decide what, how and when to measure.
- Conduct the program and monitor progress.
- Collect information and interpret findings.
- Use results.
To evaluate is to judge the value or worth of someone or something – for example, a teacher reviewing a paper in order to give it a grade. It means to determine the importance, effectiveness, or worth of; to assess, as in "evaluate teacher performance."
A useful starting set of key evaluation questions to guide initial analysis:
- Is the research delivering on outputs and outcomes as planned? (efficiency and effectiveness)
- Have applied activities and their delivery methods been effective?
- Is the wider project story being told?
The following example is from MUS STU 352 – Visitor Experience and Design in Museums, taken from the pre-class case studies. Before the instructor teaches her course, she identifies her goals as well as those of other stakeholders.
The instructor identifies the following kinds of qualitative and quantitative evidence to be collected. This data will help her make evidence-based changes to improve the effectiveness of her course design.
DoIT Academic Technology Evaluation Services has developed some principles of evaluation that can be helpful as you plan to evaluate your course. Good evaluation plans:
There are two types of evaluation typically requested by funders – formative and summative – and which you use is largely dictated by the purpose of the evaluation. Do you want to prove that you achieved the outcomes as intended (summative), or are you doing evaluation to monitor whether you are doing what you said you would in your grant application (formative)? Or both? We can help you prepare and review both types of evaluations outlined below.
For small projects, the Office of the Vice President for Research can help you develop a simple evaluation plan. If you are writing a proposal for larger center grant, using a professional external evaluator is recommended. We can provide recommendations of external evaluators; please contact Amy Carroll at [email protected] or 3-6301. For faculty in BioMed, please contact Judy Kimberly, Evaluation Director, at [email protected] or 3-5171.
Research provides the basis for drawing conclusions; evaluation provides the basis for decision making.
Involves review by the principal investigator, the steering or governance committee, and either an internal or external evaluator (depending on grant requirements)
Not all grant proposals require an evaluation plan; however, many program announcements and funding opportunities stipulate that an evaluation strategy with specific milestones is an important element to consider. If an evaluation plan is required, it will generally be listed in the program announcement.
Ask direct and clear questions. For example, if one of your goals is to collect feedback on the specified course, be sure to ask direct questions pertaining to the course curriculum. Another goal could be to collect feedback on the instructor’s teaching style. In this case, ask direct questions pertaining to how the course was taught and what the professor can do to improve. Ensure that these questions are clear, concise, and directly related to the feedback you are trying to acquire. The clearer the question is, the less likely it is to be open to interpretation and produce a scatter of unrelated answers.
Clearly communicate that evaluation responses will remain strictly confidential. Confidentiality gives students the assurance they need to comfortably provide candid, honest feedback. Again, this is another way to increase response rates.
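Confidentiality also needs to be honored when responses are stored or shared. One way to do this – an illustrative sketch only, with hypothetical column names – is to strip identifying fields from responses before they leave the survey system:

```python
def strip_identifiers(rows, id_fields=("name", "email", "student_id")):
    """Remove identifying columns so stored evaluation responses stay confidential.

    `rows` is a list of dicts (one per response); `id_fields` names the
    columns to drop. Both are assumptions for illustration.
    """
    return [{k: v for k, v in row.items() if k not in id_fields} for row in rows]

# Hypothetical raw responses as exported from a survey tool
responses = [
    {"name": "A. Student", "email": "a@example.edu", "q1": "The pacing was good."},
    {"name": "B. Student", "email": "b@example.edu", "q1": "More examples, please."},
]

anonymous = strip_identifiers(responses)
```

In practice the identifying columns depend on your survey export format, and aggregation (reporting only group-level summaries) may be needed in addition to de-identification for small classes.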
Writing an evaluation plan will not ensure that the evaluation is implemented on time, as intended, or within budget. A critical piece of the evaluation plan is to identify the roles and responsibilities of program staff, evaluation staff, contractors, and stakeholders from the beginning of the planning process. This information should be kept up to date throughout the implementation of the evaluation. Stakeholders must clearly understand their role in the evaluation implementation. Maintaining an involved, engaged network of stakeholders throughout the development and implementation of the plan will increase the likelihood that their participation serves the needs of the evaluation. An evaluation implementation work plan is as critical to the success of the evaluation as a program work plan is to the success of the program. This is even more salient when multiple organizations are involved and/or multiple evaluation activities occur simultaneously.
These two aspects of the evaluation serve as a foundation for evaluation planning, focus, design, and interpretation and use of results. The purpose of an evaluation influences the identification of stakeholders for the evaluation, selection of specific evaluation questions, and the timing of evaluation activities. It is critical that the program is transparent about intended purposes of the evaluation. If evaluation results will be used to determine whether a program should be continued or eliminated, stakeholders should know this up front. The stated purpose of the evaluation drives the expectations and sets the boundaries for what the evaluation can and cannot deliver. In any single evaluation, and especially in a multi-year plan, more than one purpose may be identified; however, the primary purpose can influence resource allocation, use, stakeholders included, and more. Purpose priorities in the plan can help establish the link between purposes and intended use of evaluation information. While there are many ways of stating the identified purpose(s) of the evaluation, they generally fall into three primary categories:
As previously stated, the planning stage is the time for the program to address the best way to share the lessons you will learn from the evaluation. The communication-dissemination phase of the evaluation is a two-way process designed to support use of the evaluation results for program improvement and decision making. In order to achieve this outcome, a program must translate evaluation results into practical applications and must systematically distribute the information or knowledge through a variety of audience-specific strategies.
A program description clarifies the program’s purpose, stage of development, activities, capacity to improve health, and implementation context. A shared understanding of the program and what the evaluation can and cannot deliver is essential to the successful implementation of evaluation activities. This is important because the link between outputs and short-term outcomes remains an empirical question. A narrative description helps ensure a full and complete shared understanding of the program. A logic model may be used to succinctly synthesize the main elements of a program. While a logic model is not always necessary, a program narrative is. The program description is essential for focusing the evaluation design and selecting the appropriate methods. Too often, groups jump to evaluation methods before they have a grasp of what the program is designed to achieve or what the evaluation should deliver. Even though much of this will have been included in your funding application, it is good practice to revisit this description with your ESW to ensure a shared understanding and to confirm that the program is still being implemented as intended. The description will be based on your program’s objectives and context, but most descriptions include at a minimum:
This workbook was developed by the Centers for Disease Control and Prevention’s (CDC’s) Office on Smoking and Health (OSH) and Division of Nutrition, Physical Activity, and Obesity (DNPAO) as part of a series of technical assistance workbooks for use by program managers and evaluators. The workbooks are intended to offer guidance and facilitate capacity building on a wide range of evaluation topics. We encourage users to adapt the tools and resources in this workbook to meet their program’s evaluation needs.
Evaluation results provide a means of demonstrating program progress and impact to members and the community. The results assess the program’s effectiveness and should make it easy for external readers to quickly see and understand the program’s status in terms of effective implementation and positive change toward its goals and objectives.
The purpose of a communication plan is to intentionally and purposefully share program progress and impact with program members, partners, stakeholders, community members, and funders. Thinking through a communication plan focuses network leadership on a critical factor of network development: intentional communication builds trust and credibility between the program and its members and partners, increasing the likelihood of program sustainability. The external reader is interested in understanding how the program plans to communicate its successes and achievements.
Program objectives are identified as either strategic objectives or outcomes, depending on the selected planning framework. The associated data are often referred to as “achievements” or “outcomes.” External readers are primarily interested in progress that demonstrates impact toward the program’s goals – e.g., “What progress has been made on program impact?” In other words, whether program efforts are indeed moving toward the goals. It is this result of the effort – the achievement – that should be measured.
One key to successfully evaluating objectives is to have SMART objectives: specific, measurable, attainable, realistic, and timely. It is worth the time and effort to review and revise objectives so that they are SMART.
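As a hedged sketch of what "SMART" implies in practice (the record fields and check below are illustrative assumptions, not a standard tool), an objective can be treated as a small record whose fields map to the criteria, with a simple check that reports which elements are still missing:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Objective:
    """Hypothetical objective record with fields mapping to SMART criteria."""
    description: str          # specific: what exactly will change, and for whom
    indicator: str            # measurable: the quantity that will be tracked
    target: Optional[float]   # attainable/realistic: the level to be reached
    deadline: str             # timely: date by which the target should be met

def smart_gaps(obj: Objective) -> list:
    """Return the SMART elements an objective is still missing."""
    gaps = []
    if not obj.indicator:
        gaps.append("measurable indicator")
    if obj.target is None:
        gaps.append("target level")
    if not obj.deadline:
        gaps.append("deadline")
    return gaps

vague = Objective("improve attendance", indicator="", target=None, deadline="")
smart = Objective("improve attendance",
                  indicator="% of participants attending weekly sessions",
                  target=0.8, deadline="2025-06-30")
```

Attainability and realism are judgment calls that no checklist can automate; the point of the sketch is only that each objective should name an indicator, a target level, and a deadline before the evaluation begins.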
Program activities are the actions and processes put into place to execute objectives. They help network leaders and stakeholders track the implementation of a program: “What progress has been made toward program implementation?” Implementation activities are typically identified within the program work plan, and the associated data are often referred to as “outputs.” External readers, however, are primarily interested in the key activities that are aligned with the program goals, are considered critical to the program’s success, and will demonstrate successful implementation.