Program Evaluation (ch. 3)
| Question | Answer |
|---|---|
| Objectivism epistemology | Evaluation information must be "scientifically objective" |
| Subjectivism epistemology | Appeals to experience rather than to science; the validity of a subjectivist evaluation depends on the relevance of the evaluator's background and qualifications and the keenness of his or her perceptions |
| Utilitarian | Determines value by assessing the overall impact of a program on those affected |
| Intuitionist-pluralist | The value depends on the impact of the program on EACH individual (tends toward a subjectivist epistemology) |
| Areas in which evaluators disagree | 1. Whether the intent of evaluation is to render a value judgment; 2. General view of the political role of evaluation; 3. How they are influenced by their prior experience; 4. Knowledge and expertise required of evaluators; 5. Whether evaluation needs a wide variety of approaches |
| Five categories of evaluation approaches | 1. Objective-oriented; 2. Management-oriented; 3. Consumer-oriented; 4. Expertise-oriented; 5. Participant-oriented |
| Objective-oriented evaluation approach | the purposes of some activity are specified and then evaluation focuses on the extent to which those purposes are achieved |
| Uses of objective-oriented evaluation | Reformulate the purposes of the activity, change the activity itself, or adjust the assessment procedures and devices used to determine the achievement of purposes |
| Tylerian Evaluation Approach | evaluation is the process of determining the extent to which the objectives of a program are actually being attained. |
| Tylerian Evaluation steps | 1. Establish broad goals or objectives; 2. Classify the goals or objectives; 3. Define objectives in behavioral terms; 4. Find situations in which achievement of objectives can be shown; 5. Develop or select measurement techniques; 6. Collect performance data; 7. Compare performance data with the behaviorally stated objectives |
| Logical methods for evaluating goals and objectives | Examine the goals and objectives for 1. need; 2. consequences; 3. fit with higher-order values (laws, policies) |
| Empirical methods for evaluating goals and objectives | 1. Collecting group data to describe judgments about the value of a goal or objective; 2. Arranging for experts, hearings, or panels to review; 3. Content studies; 4. Pilot study |
| The Evaluation Cube | Three-dimensional framework for analyzing the objects of community-based youth programs |
| Logic Models | Start with the long-term vision of how program participants will improve as a result of the program (ultimate outcome); then determine inputs, activities, outputs, and immediate, intermediate, long-term, and ultimate outcomes |
| Pros of Objective-oriented evaluation | Simplicity - easily understood, followed, and implemented; legitimacy - a program is held accountable for what its designers said it would accomplish |
| Cons of Objective-oriented evaluation | 1. lacks real evaluative component; 2. lacks judgment of difference between objectives and performance; 3. neglects value of objectives; 4. ignores alternatives; 5. neglects context; 6. neglects other outcomes; 7. linear, inflexible |
| goal-free evaluation | 1. avoids becoming aware of goals; 2. predetermined goals are not permitted; 3. examines actual outcome; 4. limit contact with management; 5. increase likelihood of finding unanticipated side effects |
| Management-oriented | An evaluation that serves decision makers; the evaluation is directed to the concerns, informational needs, and effectiveness criteria of the decision maker |
| CIPP Evaluation Model (4 types of evaluation) | 1. Context - planning decisions; 2. Input - structuring decisions; 3. Process - implementing decisions; 4. Product - recycling decisions |
| Management-oriented models | 1. CIPP Evaluation Model; 2. UCLA Evaluation Model; 3. parts of Provus' Discrepancy Evaluation Model; 4. Utilization-focused evaluation approach |
| Management-oriented model strengths | 1. Rational and orderly systems approach; 2. Gives focus to the evaluation; 3. Evaluation can begin before the program begins; 4. Provides timely feedback to decision makers |
| Management-oriented model weaknesses | 1. Important questions may be ignored because they are at odds with the decision maker's concerns; 2. Disenfranchises other stakeholders; 3. Can be costly and complex; 4. Assumes decisions can be identified in advance |
| Consumer-Oriented Evaluation Approaches | A summative evaluation that uses checklists and criteria relevant to consumers of the product or program |
| Consumer-Oriented Strengths | 1. Make evaluation material widely available; 2. Increase consumer knowledge |
| Consumer-Oriented Weaknesses | 1. Increase cost of products; 2. suppress creativity |
| Expertise-Oriented Evaluation Approaches | Depends primarily on professional expertise to judge an institution, program, product, or activity |
| Examples of expertise-oriented evaluation | Doctoral examinations administered by a committee, proposal review panels, professional reviews conducted by professional accreditation bodies |
| Types of Expertise-Oriented Evaluations | 1. Formal review system; 2. Informal review system; 3. Ad hoc panel review; 4. Ad hoc individual review |
| Strengths of Expertise-Oriented | 1. Emphasizes the central role of expert judgment and human wisdom; 2. Formal review boards develop criteria, require self-study and defined objectives, and offer support and protection |
| Weaknesses of Expertise-Oriented | 1. The evaluation may reflect one person's biases; 2. Public concerns over credibility and public cynicism |
| participant-oriented evaluation | Views the participants in the endeavor being evaluated as central to the evaluation |
| Commonalities of participant-oriented | Depend on inductive reasoning; use a multiplicity of data; do not follow a standard plan; record multiple rather than single realities |
| Responsive Evaluation | Central focus is in addressing the concerns and issues of a stakeholder audience; heavily qualitative |
| Naturalistic Evaluation | The evaluator studies the program activity in situ, without constraining, manipulating, or controlling it; uses cross-checking, triangulation, and unobtrusive measures |
| Types of participant-oriented evaluation | 1. Responsive; 2. Naturalistic; 3. Participatory; 4. Utilization-focused; 5. Empowerment |
| Strengths of Participant-oriented evaluation | Can be readily used by any sensitive individual; emphasizes the human element; captures program complexity; fosters and facilitates activism among recipients of program services |
| Negatives of Participant-oriented evaluation | Subjective; high cost; labor intensive; evaluators who become advocates may lose their impartiality |