D293

Assessment and Learning Analytics

Ipsative assessment strategy: Measures progress over time based on the individual's starting and ending points ("I" = ruler: how much have THEY learned?). Self-assessment; uses a pre-assessment plus a later assessment. Ex.: comparing previous work and progress to current work, or a pretest/diagnostic to a summative assessment. MOTIVATING, because it shows the distance progressed even if the learner isn't an expert or at the level of classmates. Learning checks track progress through a unit and suggest learning strategies if progress slows.
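A minimal Python sketch of the ipsative idea (the pretest/posttest scores are hypothetical, not from the course): each learner is measured against their own starting point, so the reported quantity is the gain, not the absolute score.

```python
# Hypothetical pretest/posttest scores for the same learners.
pretest = {"Ana": 40, "Ben": 70, "Cai": 55}
posttest = {"Ana": 72, "Ben": 78, "Cai": 80}

# Ipsative measure: progress relative to each learner's own start,
# not relative to classmates or to a fixed cutoff.
for name in pretest:
    gain = posttest[name] - pretest[name]
    print(f"{name}: {pretest[name]} -> {posttest[name]} (gain {gain:+d})")
# Ana shows the largest gain even though Ben started highest.
```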
Formative assessment strategy: FORMs an understanding of student learning during instruction to guide teaching and learning. Ex.: multiple-choice checks, online presentations, creating a website or blog, learners' online portfolios, online group projects.
Summative assessment strategy: Summarizes learning AFTER instruction (SUMMARY); measures competency.
Diagnostic assessment strategy: Pre-assessment used to DIAGNOSE the learner's understanding before the lesson.
Competency-based assessment strategy: SKILL application in real-world, scenario-based tasks. Ex.: creating a presentation.
Criterion-referenced assessment strategy: NOT compared with peers, but against a predetermined standard, score, or set of criteria. What criteria must be met to pass? (Ex.: every pilot has to take off, fly, and land the plane, and all who can do this pass.) Ex.: Advanced Placement test.
Standards-based assessment strategy: Evaluates a specific standard; the criterion aligns with and supports learning goals. Competence can be measured by both authentic assessments and criterion-referenced traditional assessments. Ex.: state standardized test.
Norm-referenced assessment strategy: Compares the individual against peers. What is NORMal for the group? Class ranking, bell curve. (Ex.: the 7 best pilots pass, regardless of whether they are actually good.) Ex.: a test graded on a curve; competitive. Criterion-referenced assessments, which are based on a concrete learning standard, objective, or outcome, are considered fairer and more equitable than norm-referenced assessments. A sketch contrasting the two appears below.
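A minimal Python sketch contrasting the two strategies (the scores, cutoff, and top-N quota are illustrative assumptions): criterion-referenced passes everyone who meets a fixed standard, while norm-referenced passes a fixed share of the group regardless of absolute performance.

```python
# Hypothetical scores; the cutoff and quota are illustrative assumptions.
scores = {"Ana": 92, "Ben": 78, "Cai": 85, "Dia": 61,
          "Eli": 70, "Fay": 88, "Gus": 55}

# Criterion-referenced: pass everyone who meets the predetermined standard.
CUTOFF = 75
criterion_pass = sorted(name for name, s in scores.items() if s >= CUTOFF)

# Norm-referenced: pass a fixed share of the group (here, the top 3),
# no matter how well or poorly they scored in absolute terms.
TOP_N = 3
norm_pass = sorted(sorted(scores, key=scores.get, reverse=True)[:TOP_N])

print("Criterion-referenced pass:", criterion_pass)  # Ana, Ben, Cai, Fay
print("Norm-referenced pass:", norm_pass)            # Ana, Cai, Fay
```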
Psychomotor learning domain: Physical skills (classroom activities lead to experiential learning). Levels: Perception, Set (feeling ready to act), Guided Response, Mechanism (developing basic proficiency), Complex Overt Response, Adaptation, Origination.
Affective learning domain: Emotional growth and interpersonal development (students interact more with peers, build self-confidence). Levels: Receiving, Responding, Valuing, Organizing, Characterizing.
Cognitive learning domain: Thinking and problem-solving (encourage students to think and problem-solve).
Assessment methods: More specific, picked out of the bucket; determine whether the learning objective has been met (the HOW of assessment: project, discussion, reflection, multiple choice...).
Comprehensive assessment method: Chapter test.
Discussion board assessment method: Online discussion.
Direct assessment method: Task-based; learners demonstrate knowledge, skills, or abilities by directly doing something we assess. Ex.: chapter test.
Indirect assessment method: Collects data that requires reflection on student learning, skills, and behaviors; the learner does NOT actually do something. Ex.: surveys, exit interviews, LMS data about how often a page is visited.
Project-based assessment method: Build something.
Reflection-focused assessment method: Journal.
Bloom's taxonomy: 6. Create (design, assemble, construct, develop, investigate); 5. Evaluate (appraise, argue/defend, judge, critique); 4. Analyze (differentiate, compare/contrast, examine, distinguish); 3. Apply (execute, implement, demonstrate, interpret); 2. Understand (classify, describe, explain, identify); 1. Remember (recall, list, define).
Bloom's 6. Create: design, assemble, construct, develop, investigate
Bloom's 5. Evaluate: appraise, argue/defend, judge, critique
Bloom's 4. Analyze: differentiate, compare/contrast, examine, distinguish
Bloom's 3. Apply: execute, implement, demonstrate, interpret
Bloom's 2. Understand: classify, describe, explain, identify
Bloom's 1. Remember: recall, list, define
Descriptive data/analytics: FACTS; historical data answering "what happened?" (Analogy: you're sick and see the MD; the symptoms are the facts.) Pie charts, bar charts, tables, line graphs; insight into the past through data aggregation and data mining to understand trends. Ex.: student feedback from surveys; analytics at all stages of the student lifecycle.
Diagnostic data/analytics: WHY did it happen? (Analogy: the MD diagnoses why.) Drill-down, data discovery, data mining, correlations. Ex.: informing and uplifting key performance indicators, analyzing patterns, equity-of-access reporting, effective strategies to support students, LMS metrics to improve student engagement.
Predictive data/analytics: WHAT is likely to happen in the future; an educated guess. (Analogy: the MD predicts the course of the illness.) Forecasting, multivariate statistics, pattern matching, predictive modeling, regression analysis. Understanding the future: take students' historical data and apply algorithms that capture relationships between data sets to forecast trends. Ex.: developing staff dashboards to predict student numbers and identify areas for improvement.
Prescriptive data/analytics: ACTION; what should be DONE? (Analogy: the MD prescribes "orders" for what you need to do to feel better.) Relies on the quality of the analysis but also on its accuracy to ensure good decision-making; goes beyond predictive analytics by recommending one or more choices. Focus on subjects and courses where small changes have a big impact on student engagement, feedback, and outcomes. Ex.: data visualization that provides program/degree-level metrics on student enrollments, program stage, results, and survey feedback to give teaching staff visual snapshots of students in their programs.
Metrics: Examples by analytics type. Diagnostic metrics ex.: LMS data showing student engagement. Prescriptive metrics ex.: data visualization providing program/degree-level metrics on student enrollments.
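A minimal Python sketch of the descriptive-to-predictive progression on a hypothetical LMS export (the column names and numbers are illustrative assumptions, not course data): descriptive analytics aggregates what happened, and predictive analytics fits a simple regression to forecast what is likely to happen.

```python
import numpy as np
import pandas as pd

# Hypothetical LMS export: one row per student.
df = pd.DataFrame({
    "student": ["A", "B", "C", "D", "E", "F"],
    "logins": [25, 4, 18, 30, 9, 15],
    "final_score": [88, 52, 75, 93, 60, 71],
})

# Descriptive: what happened? Aggregate the historical data.
print(df["final_score"].describe())      # mean, spread, quartiles
print("mean logins:", df["logins"].mean())

# Predictive: what is likely to happen? Fit a simple linear
# regression of final score on logins, then forecast a new student.
slope, intercept = np.polyfit(df["logins"], df["final_score"], deg=1)
print(f"forecast for 20 logins: {slope * 20 + intercept:.1f}")
```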
Quantitative analysis: NUMBERS; objective, numerical data and statistics about what happened in the past. Used in DESCRIPTIVE analytics to determine what happened. Works well with item analysis of comprehensive, multiple-choice assessments using data measures such as current score, time spent, or frequency of access (see the item-analysis sketch below).
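As an illustration of that item analysis, a minimal Python sketch on a hypothetical 0/1 response matrix (the data are made up; difficulty and point-biserial discrimination are standard psychometric measures, not anything specific to this course):

```python
import numpy as np

# Hypothetical response matrix: rows = students, columns = items,
# 1 = answered correctly, 0 = answered incorrectly.
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 1],
])
totals = responses.sum(axis=1)  # each student's total score

for item in range(responses.shape[1]):
    col = responses[:, item]
    difficulty = col.mean()  # proportion correct (higher = easier item)
    # Point-biserial discrimination: correlation between getting this
    # item right and the total score; near-zero or negative values
    # flag items that don't separate stronger from weaker students.
    discrimination = np.corrcoef(col, totals)[0, 1]
    print(f"item {item + 1}: difficulty={difficulty:.2f}, "
          f"discrimination={discrimination:.2f}")
```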
Qualitative analysis: NON-NUMERICAL information: observations, reflections, interviews. SUBJECTIVE. Used in DIAGNOSTIC analytics to determine WHY something happened; the data considers feelings and experiences. Works well with content analysis of reflection-focused assessments, using data measures that examine the quality of the reflection.
Social network analysis: The study of patterns or trends among groups of learners, or between learners and instructors. Used in PREDICTIVE analytics to predict future behavior. Determines learner engagement by examining behaviors, norms, and interactions in educational settings. Works well with analysis of discussion board assignments, using data measures that focus on interaction (a sketch follows).
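A minimal sketch of social network analysis on a discussion board, using the networkx library (the reply pairs are hypothetical; degree centrality is one simple interaction measure among many):

```python
import networkx as nx

# Hypothetical discussion-board replies: (author, learner replied to).
replies = [
    ("Ana", "Ben"), ("Ben", "Ana"), ("Cai", "Ana"),
    ("Dia", "Ben"), ("Ana", "Cai"), ("Eli", "Ana"),
]

G = nx.Graph()
G.add_edges_from(replies)

# Degree centrality: the share of other participants each learner
# interacts with; higher values suggest more engaged participants.
for learner, centrality in sorted(nx.degree_centrality(G).items(),
                                  key=lambda kv: -kv[1]):
    print(f"{learner}: {centrality:.2f}")
```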
Data for research: WHAT? ("Which of these two learning materials is the most effective way to prepare students for the test?") Gathers insights about the learning process; studies learner behaviors, patterns, and effective instructional strategies. Often a comparison. Used to gather new data and test new theories. Collects MORE data than necessary, in case it might be needed. The testing strategy focuses on one big test.
Data for accountability: WHERE? and WHY? ("Which class performed the best?") Accountability = using learning analytics to track and assess the performance of programs, instructors, and materials against predefined standards; demonstrates the effectiveness and efficiency of instructional design to stakeholders, administrators, accrediting bodies, and funding agencies. Used to evaluate, rate, and rank performance. Collects all recent, relevant data that is available. Interested in performance at a given point in time; not testing anything.
Data for improvement: HOW? ("Is this learning material effective?") Updates teaching practices, instructional design, and course content to enhance learning. Collects "just enough" data. Applies knowledge to make improvements. Tests small changes to see what is and is not working.
Data ethics: Learning experience designers need to consider things that analytics may not, such as access, equity, and accessibility. Includes data ownership and control, third-party sharing, transparency, accessibility of data, validity and reliability of data, institutional responsibility and the obligation to act, communications, cultural values, inclusion, consent, and student agency and responsibility.
Task-level feedback: Related to the specific task or activity being performed: the accuracy, completeness, and quality of the work done. Concrete and objective (a math error, better word choices), aimed at improving the output of specific tasks, and usually immediate. Most effective when it helps build cues and information about erroneous hypotheses and ideas, and then leads to the development of more effective and efficient strategies for processing and understanding the material.
Process-level feedback: Advises on method and strategy improvements; focuses on the process used rather than the task itself. Enhances understanding and skills by addressing how the task was approached, including the strategies, methods, and techniques used. Assists in building better or more effective searching and strategizing.
Regulatory-level feedback: Self-regulation; guides self-monitoring and adjustment so learners monitor, direct, and regulate their own learning and performance. Feedback on self-assessment, goal-setting, and adjusting strategies based on performance outcomes. Encourages individuals to think about their own learning and habits and to take control of their learning; builds more confidence and engagement in further investigation of the subject.
Self-level feedback: Targets personal identity and motivation: the individual's self-esteem, confidence, perceived capabilities, and role in learning or work. Carries a strong emotional charge; less about the specifics of performance. Feedback at the self or personal level (usually praise) is rarely effective, so avoid feedback such as "Great job!" and "You're awesome!" without providing additional, effective elements at the three other levels.
Actionable feedback: Very much like constructive criticism; both explain how something can be improved or what can be done differently to achieve a certain result.
Multiple means of engagement (UDL): The WHY of learning. Real-world applications, student input on class activity design, activity choice, varied difficulty and order, prompt and frequent feedback with rubrics and peer feedback.
Perceivable (WCAG): Information and user interface components must be presentable in ways users can perceive (ex.: red-only cues fail for colorblind users). Text alternatives, alternatives for time-based media, content that adapts across screen sizes.
Operable (WCAG): Navigation, interface components, and resources must be operable (drag-and-drop with a mouse: what if there is no mouse?). Mouse, keyboard, or voice command; keyboard accessibility (keyboard only, no mouse required); time limits removed or extended; navigable structure with headings, labels, and navigation aids.
Understandable (WCAG): Information and the operation of the user interface must be understandable. Clear language, no jargon; a consistent layout helps users focus on content instead of navigation; error identification and suggestions for incorrect answers.
Robust (WCAG): Content must be robust enough to be interpreted reliably across platforms (tablet, computer, phone). Compatibility across browsers and assistive technology; compliance with HTML, CSS, and other coding standards; regular updates so accessibility keeps pace with platform changes.
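A minimal Python sketch of an automated check for two of these principles (the sample markup and the specific checks are illustrative assumptions, not a full accessibility audit; BeautifulSoup is one common HTML parser): a missing alt attribute breaks Perceivable, and an unlabeled input hurts Understandable.

```python
from bs4 import BeautifulSoup

# Hypothetical course-page snippet with two accessibility problems.
html = """
<img src="chart.png">
<img src="logo.png" alt="Enrollment trend chart">
<label for="name">Name</label>
<input type="text" id="name">
<input type="text" id="answer">
"""
soup = BeautifulSoup(html, "html.parser")

# Perceivable: every image needs a text alternative (alt attribute).
for img in soup.find_all("img"):
    if not img.get("alt"):
        print("Missing alt text:", img)

# Understandable: form inputs need an associated <label>.
labeled_ids = {lbl.get("for") for lbl in soup.find_all("label")}
for field in soup.find_all("input"):
    if field.get("id") not in labeled_ids:
        print("Unlabeled input:", field)
```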
Cognitive load: Overwhelming working memory, which can handle only a small amount of information at once. Ex.: overly wordy word problems (does the item test math or reading?).
Construct-validity bias: Inaccuracy in how an assessment measures what it is intended to measure. Ex.: a multiple-choice test about resumes (have learners MAKE a resume instead). "Construction": how is the test constructed? Is this the right type of assessment?
Content-validity bias: Failure of an assessment to fully represent its subject matter (the content that was taught). Ex.: world-history questions on a US history test. "Content": does the assessment accurately reflect the content that was taught?
Item-selection bias: Biased choice of test items that favors or disadvantages certain groups. Ex.: statistics questions in math class written in football jargon. "Specific item": a narrow focus; this particular item on an assessment creates an advantage or disadvantage for a specific group.
Predictive-validity bias: Inaccuracy in forecasting future outcomes across different groups. Ex.: a driver's road test (a multiple-choice test doesn't predict whether you are a good driver; only the road test does). "Predict": future forecast; if someone does well on this assessment now, will they do well with this topic in the real world?
Bias: Assessment bias is when a group of students has a statistically observable unfair advantage on an item or group of items. Bias can lead the teacher or student to inaccurate or misleading conclusions about the student's abilities, based on assessment attributes unrelated to what the student knows or can do.