In: Nursing
The information below just needs to be responded to as a discussion.
In planning a program, the evaluation must include the outcome goals, and the objectives must be clearly expressed. As a program is being developed, it is essential to incorporate the planning and evaluation phases into the process (Issel, 2009). In addition, when considering the measurement and evaluation of the effectiveness of a programmatic intervention, it is important to collect and/or measure data at different intervals.
ANSWER:
The content in the question is about Program Evaluation.
Carefully reading and understanding the content given below will help you gain mastery over the topic and approach the discussion with confidence.
INTRODUCTION:
The concept of program evaluation can include a wide variety of methods to evaluate many aspects of programs in nonprofit or for-profit organizations. Typically, organizations work from their mission to identify several overall goals which must be reached to accomplish their mission. In nonprofits, each of these goals often becomes a program. Nonprofit programs are organized methods to provide certain related services to constituents, e.g., clients, customers, patients, etc. In a for-profit, a program is often a one-time effort to produce a new product or line of products. Programs must be evaluated to decide whether they are indeed useful to constituents.
PROGRAM EVALUATION - MEANING:
Evaluation is a process that critically examines a program. It involves collecting and analyzing information about a program's activities, characteristics, and outcomes. Its purpose is to make judgments about a program, to improve its effectiveness, and/or to inform programming decisions.
Program evaluation can include any of a variety of at least 35 different types of evaluation, such as needs assessments, accreditation, cost/benefit analysis, effectiveness, efficiency, formative, summative, goal-based, process, and outcomes evaluations.
MAJOR TYPES OF PROGRAM EVALUATION
1. Formative Evaluation
Evaluates a program during its development in order to make early improvements; it helps to refine or improve the program.
E.g.: when starting a new program, a formative evaluation assists in the early phases of program development by asking:
How well is the program being delivered?
What strategies can we use to improve this program?
2. Summative Evaluation
Provides information on program effectiveness. It is conducted after the completion of the program. It helps to decide whether to continue or end a program. It also helps to determine whether a program should be expanded to other locations.
E.g.:
Should this program continue to be funded?
Should we expand these services to all other after-school programs in the community?
3. Goals-Based Evaluation
Often programs are established to meet one or more specific goals. These goals are often described in the original program plans. Goal-based evaluations assess the extent to which programs are meeting predetermined goals or objectives. Questions to ask yourself when designing an evaluation to see whether you reached your goals include:
E.g.: Do personnel have adequate resources (money, equipment, facilities, training, etc.) to achieve the goals?
4. Process-Based Evaluations
Process-based evaluations are geared toward fully understanding how a program works and how it produces the results that it does.
There are numerous questions that might be addressed in a process evaluation. Examples of questions to ask yourself when designing an evaluation to understand and/or closely examine the processes in your programs include:
E.g.: On what basis do employees and/or the customers decide that products or services are needed?
5. Outcomes-Based Evaluation
Program evaluation with an outcomes focus is increasingly important for nonprofits and is increasingly asked for by funders.
An outcomes-based evaluation asks whether your organization is really doing the right program activities to bring about the outcomes you believe (or, better yet, have verified) to be needed by your clients, rather than just engaging in busy activities that seem reasonable to do at the time.
METHODS TO COLLECT DATA DURING EVALUATIONS:
The major methods used for collecting data during evaluations include questionnaires, interviews, and focus groups; the choice among them depends on your evaluation goals and the kind of information needed.
FOUR LEVELS OF EVALUATION:
There are four levels of evaluation information that can be gathered from clients, including getting their:
1. Reactions and feelings (feelings are often poor indicators that the service made a lasting impact)
2. Learning (enhanced attitudes, perceptions, or knowledge)
3. Changes in skills (applied the learning to enhance behaviors)
4. Effectiveness (improved performance because of enhanced behaviors)
ANALYZING AND INTERPRETING INFORMATION
Analyzing quantitative and qualitative data is often the topic of advanced research and evaluation methods. There are certain basics which can help to make sense of reams of data.
Always start with your evaluation goals:
When analyzing data (whether from questionnaires, interviews, focus groups, or whatever), always start from a review of your evaluation goals.
Example: If you wanted to improve your program by identifying its strengths and weaknesses, you can organize the data into program strengths, weaknesses, and suggestions to improve the program.
BASIC ANALYSIS OF "QUANTITATIVE" INFORMATION (FOR INFORMATION OTHER THAN COMMENTARY, E.G., RATINGS, RANKINGS, YES'S, NO'S, ETC.):
1. Make copies of your data and store the master copy away. Use the copy for making edits, cutting and pasting, etc.
2. Tabulate the information, i.e., add up the number of ratings, rankings, yes's, and no's for each question (a small worked sketch follows this list).
3. For ratings and rankings, consider computing a mean, or average, for each question. For example, "For question #1, the average ranking was 2.4." This is more meaningful than indicating, e.g., how many respondents ranked 1, 2, or 3.
4. Consider conveying the range of answers, e.g., 20 people ranked "1", 30 ranked "2", and 20 people ranked "3".
BASIC ANALYSIS OF "QUALITATIVE" INFORMATION (RESPONDENTS' VERBAL ANSWERS IN INTERVIEWS, FOCUS GROUPS, OR WRITTEN COMMENTARY ON QUESTIONNAIRES):
1. Read through all the data.
2. Organize comments into similar categories, e.g., concerns, suggestions, strengths, weaknesses, similar experiences, program inputs, recommendations, outputs, outcome indicators, etc. (see the sketch after this list).
3. Label the categories or themes, e.g., concerns, suggestions, etc.
4. Attempt to identify patterns, or associations and causal relationships, in the themes, e.g., all people who attended programs in the evening had similar concerns, most people came from the same geographic area, most people were in the same salary range, etc.
5. Keep all commentary for several years after completion in case it is needed for future reference.
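A similar minimal sketch, again in Python, illustrates organizing labeled comments into themes and counting how often each theme appears (steps 2-4). The comments and theme labels are invented placeholders; in practice, assigning a comment to a theme is a judgment the evaluator makes while reading the data.

# Hypothetical comments, each already labeled with a theme by the evaluator
labeled_comments = [
    ("concern", "Evening sessions are hard to reach by public transport"),
    ("suggestion", "Offer a weekend session for working parents"),
    ("strength", "Staff were welcoming and well prepared"),
    ("concern", "Evening sessions end too late for the last bus"),
]

# Steps 2 and 3: organize the comments under their category labels
themes = {}
for label, comment in labeled_comments:
    themes.setdefault(label, []).append(comment)

# Step 4: look for patterns, e.g., which themes come up most often
for label, comments in themes.items():
    print(f"{label}: {len(comments)} comment(s)")
    for comment in comments:
        print(f"  - {comment}")

Here the repeated "concern" about evening sessions is the kind of pattern step 4 asks you to look for.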
INTERPRETING INFORMATION:
1. Attempt to put the information in perspective, e.g., compare results to what you expected or promised; to the expectations of management or program staff; to any common standards for your services; to the original program goals; to indications of accomplishing outcomes (especially if you're conducting an outcomes evaluation); and to descriptions of the program's experiences, strengths, weaknesses, etc. (especially if you're conducting a process evaluation).
2. Consider recommendations to help program staff improve the program, conclusions about program operations or meeting goals, etc.
3. Record conclusions and recommendations in a report document, and include the interpretations that justify your conclusions or recommendations.
REPORTING EVALUATION RESULTS
1. The level and scope of the report's content depend on the audience for which it is intended, e.g., bankers, funders, employees, customers, clients, the public, etc.
2. Be sure employees have a chance to carefully review and discuss
the report. Translate recommendations to action plans, including
who is going to do what about the program and by when.
3. Funders will likely require a report that includes an executive summary; a description of the organization and the program under evaluation; an explanation of the evaluation goals, methods, and analysis procedures; a listing of conclusions and recommendations; and any relevant attachments, e.g., evaluation questionnaires and interview guides.
4. Be sure to record the evaluation plans and activities in an evaluation plan which can be referenced when a similar program evaluation is needed in the future.
PITFALLS TO AVOID IN PROGRAM EVALUATION:
1. There is no "perfect" evaluation design. Don't worry about the plan being perfect. It's far more important to do something than to wait until every last detail has been tested.
2. Work hard to include some interviews in your evaluation methods.
3. Don't interview just the successes. You'll learn a great deal about the program by understanding its failures, dropouts, etc.
4. Don't throw away evaluation results once a report has been generated. Results don't take up much room, and they can provide precious information later when trying to understand changes in the program.