Foundations of Measurement and Evaluation

Quiz yourself by thinking about what should be in each of the blank spaces below before clicking on it to display the answer.

Question
Answer
Examples of Subjective Assessment   Essay answers.
Define Objective Assessment   Objective assessment is a form of questioning that has a single correct answer.
Validity Evidence   There are different types of validity: face, content, construct, and criterion-related.
Criterion-Related Validity   Validity determined by gauging a newly created test against a standard (which can be another, similar test). The standard may exist in the present or lie in the future.
Examples of Assessment   Achievement tests, aptitude tests, and affective tests.
Evaluation   Evaluations focus on internal situations. An example is collecting data on a specific program.
Threats to Validity   Unclear test items, unknown vocabulary, items not covered in instruction, inconsistent testing procedures, and confusing test directions.
Types of Formative Evaluation Questions   Questions should not be answerable with yes or no. Answers should be directed toward making the instruction better and should give informative feedback for improvements.
Define Reliability   The degree to which a test consistently measures whatever it is measuring. A test may be reliable without being valid, but it cannot be valid without being reliable. A test must have stability, equivalence, and internal consistency to be reliable.
Validity   Assessment validity is the degree to which a test measures what it is intended to measure.
Define Formative Evaluation   Used to improve instruction, or to inform the designer of revisions the instruction needs. These evaluations are performed during the development of instruction. Example: you want to know whether the instructional goal was accomplished and what revisions the product needs.
Split-Half Reliability   Scores are separated into two groups, and the group results are compared to determine their similarity.
Two Types of Criterion-Related Validity   Concurrent and predictive. Concurrent means the standard used to gauge validity exists in the present (another test concurrently in use). Predictive uses future performance as the standard to gauge validity. Example: the SAT.
Examples of Formative Evaluation   SME review, one-on-one interviews, a small group or focus group, and a field trial in the location where the instruction is intended to take place.
Reliability Coefficients   Statistical means of estimating and interpreting the reliability of assessments.
Examples of Summative Evaluation   Did it work? Was it successful? Should we continue funding?
Constructed-Response Tests   Performance tasks such as essays and/or portfolios.
Content Validity   The test only includes items that are from a specific content area.
Examples of Objective Assessment   Multiple-choice, true/false, multiple-response, and matching questions.
Types of Reliability   Internal reliability, rater reliability, equivalence reliability, and stability reliability.
Examples of Authentic Assessment   Performance of skills or demonstrated use of particular knowledge; simulations and role plays; studio portfolios with strategically selected items; exhibitions and displays.
Define Authentic Assessment   Measurement of "intellectual accomplishments."
Face Validity   Whether, on its face, the test appears to be valid.
Program (Summative) Evaluation   The program itself is being evaluated; an evaluation can cover a product, project, or process. The purpose is to make a decision from the results of the evaluation. Decisions can include discontinuing a program, starting a new one, or comparing programs.
Define Subjective Assessment   Subjective assessment is a form of questioning that may have more than one correct answer (or more than one way of expressing the correct answer).
Types of Assessments   Tests are the only type of assessment.
Definition of Assessment   Assessment means collecting information on an individual student's competence, ability, or improvement; basically, gathering data on a student's ability to learn or on what they have already learned.
Selected-Response Tests   Multiple choice, true/false, matching. Students must choose among answers provided by the test maker.
Two Phases of Program (Summative) Evaluation   The expert-judgment phase and the field-trial phase: experts judge whether it works, and the field trial actually puts it to the test.
Inter-Rater Reliability   Determines the similarity among the graders who score tests; there must be a high degree of consistency.
Differences and Similarities between Evaluation and Research   Both use the same tools and methods. Research generalizes, while evaluation particularizes; the difference is in the purpose.
Who Are Stakeholders?   Stakeholders are people who will be impacted by the results of the evaluation. Sometimes this is the developer of the program; other times it is groups such as parents, teachers, and students.
Construct Validity   The experimental demonstration that a test is measuring the construct it claims to measure. Constructs are not directly observable. Examples: honesty and intelligence.
Test-Retest   Test-retest reliability determines whether the same test will give similar results when administered to the same people at two different times. The relationship between the two sets of scores must be determined.
Differences between Summative and Formative Evaluation   Formative evaluation is used to make improvements to the existing program. Summative evaluation is used to make a decision about the program, NOT to make improvements.
Research   Research is conducted to generalize your findings from a sample to a larger population.
Evaluation Tip   With evaluations you do not want to generalize the data to a larger population. The point of the evaluation is to determine whether the program was successful.
Reliability Tests   Test-retest (tests stability), split-half (tests internal reliability), and equivalent-form (tests equivalence reliability).
Importance of Reliability   Reliability is important because it indicates that a test will consistently measure what it says it will measure.
Importance of Validity   Validity is important because it indicates that a test measures what it purports to measure.
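The split-half reliability and reliability-coefficient cards above can be made concrete with a short worked example. This is an illustrative sketch that is not part of the original deck: the item scores are invented, a hand-rolled Pearson correlation stands in for a statistics library, and the Spearman-Brown correction (a standard adjustment, though not named on the cards) is used to scale the half-test correlation up to full test length.

```python
# Split-half reliability sketch with made-up data.
# Rows = students, columns = test items (1 = correct, 0 = incorrect).

def pearson(x, y):
    """Pearson correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

scores = [
    [1, 1, 0, 1, 1, 0],
    [1, 0, 1, 1, 0, 1],
    [0, 1, 1, 0, 1, 1],
    [1, 1, 1, 1, 1, 0],
    [0, 0, 1, 0, 1, 1],
]

# Separate the items into two halves (odd- vs. even-numbered items)
# and total each half for every student.
odd_totals = [sum(row[0::2]) for row in scores]
even_totals = [sum(row[1::2]) for row in scores]

# Correlate the two halves, then apply the Spearman-Brown correction,
# since each half is only half as long as the full test.
r_half = pearson(odd_totals, even_totals)
reliability = 2 * r_half / (1 + r_half)
print(round(reliability, 2))
```

The same `pearson` helper could also estimate test-retest reliability by correlating scores from two administrations of the same test to the same students.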


   

Created by: kasuao1