Violations of Classic Assumptions

Quiz yourself by thinking about what should be in each of the blank spaces below before clicking on it to display the answer.

Question
Answer
Which classical assumption(s) does an Omitted Variable violate?   Violates Classical Assumptions I and III  
What are the consequences of an Omitted Variable?   1) The estimated coefficient does not equal the actual coefficient. 2) Bias is forced onto another coefficient, changing its estimated value. 3) Variance decreases, so the biased estimates can appear more precise.
What factors may indicate Omitted Variable bias?   1) Unexpected signs on coefficients. 2) Coefficients that are too large (e.g., a positive bias acting on a positive coefficient).
How do you solve the problem of Omitted Variable bias?   Add omitted variable or proxy variable.  
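To make the omitted-variable bias concrete, here is a minimal Python simulation sketch (hypothetical data and coefficient values; assumes numpy and statsmodels are available). Leaving out x2, which is correlated with x1, pushes roughly beta2 times the slope of x2 on x1 onto the estimate for x1.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)            # x2 is correlated with x1
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

full = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
omitted = sm.OLS(y, sm.add_constant(x1)).fit()

print(full.params[1])     # close to the true value 2.0
print(omitted.params[1])  # roughly 2.0 + 3.0 * 0.8 = 4.4: the bias forced onto x1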
Which classical assumption(s) does a Redundant Variable violate?   Violates Classical Assumption VI  
What are the consequences of a Redundant Variable?   1) Does NOT introduce bias. 2) Increases variance.  
How do you detect a Redundant Variable?   1) Decreased adjusted R². 2) Wald Test.
How do you solve the problem of a Redundant Variable?   Drop the irrelevant variable.
What do you use to decide whether to include a potentially omitted or redundant variable?   1) Theory. 2) t-tests. 3) Adjusted R² (it decreases if the improvement in overall fit from adding the variable does NOT outweigh the loss of degrees of freedom). 4) Bias.
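A companion Python sketch for the redundant-variable case (hypothetical data; numpy and statsmodels assumed): adding an irrelevant regressor leaves the estimate on x1 roughly unbiased but inflates its standard error, and the adjusted R² criterion above picks this up.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
junk = 0.9 * x1 + rng.normal(scale=0.5, size=n)   # irrelevant, and collinear with x1
y = 1.0 + 2.0 * x1 + rng.normal(size=n)

lean = sm.OLS(y, sm.add_constant(x1)).fit()
bloated = sm.OLS(y, sm.add_constant(np.column_stack([x1, junk]))).fit()

print(lean.params[1], lean.bse[1], lean.rsquared_adj)
print(bloated.params[1], bloated.bse[1], bloated.rsquared_adj)
# The estimate on x1 stays near 2.0 in both fits (no bias), but its standard error
# is noticeably larger in the bloated model; adjusted R² usually falls when the
# added variable's |t| is below 1.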
Which classical assumption(s) does omitting an Intercept violate?   Violates Classical Assumption II: the error term has a zero population mean. The intercept usually absorbs the constant (mean) portion of the error term.
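A quick Python sketch of the intercept point (hypothetical data; numpy and statsmodels assumed): with the constant suppressed, the residuals no longer average to zero, which is the Classical Assumption II problem described above.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
x = rng.uniform(1, 10, size=500)
y = 5.0 + 2.0 * x + rng.normal(size=500)

with_const = sm.OLS(y, sm.add_constant(x)).fit()
no_const = sm.OLS(y, x).fit()                 # intercept omitted

print(with_const.resid.mean())   # essentially zero
print(no_const.resid.mean())     # clearly nonzero: the error no longer has zero mean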
Which classical assumption(s) does Multicollinearity violate?   Violates Classical Assumption VI  
What are the consequences of Multicollinearity?   1) Does not cause bias. 2) Might cause wrong signs because of the increased variance. 3) Increases variance (driven by the correlation r₁₂ between the explanatory variables). 4) t-scores fall because standard errors increase. 5) Overall fit (R², adjusted R²) will not fall much.
How do you detect Multicollinearity?   1) High adjusted R² and low t-scores. 2) High simple correlation coefficients (r₁₂). 3) High Variance Inflation Factor (VIF > 5).
How do you interpret VIF?   When VIF = 5, the variance of that coefficient estimate is 5 times what it would be without the multicollinearity.
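A Python sketch of the VIF check (hypothetical data; statsmodels' variance_inflation_factor assumed available). VIF for variable i is 1 / (1 - R_i²), where R_i² comes from regressing variable i on the other explanatory variables.

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(3)
n = 300
x1 = rng.normal(size=n)
x2 = x1 + 0.3 * rng.normal(size=n)     # highly collinear with x1
x3 = rng.normal(size=n)
X = sm.add_constant(np.column_stack([x1, x2, x3]))

# Column 0 is the constant, so start at 1; VIF > 5 flags the collinear pair here.
for i in range(1, X.shape[1]):
    print(i, variance_inflation_factor(X, i))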
How do you solve the problem of Multicollinearity?   1) Do nothing 2) Drop variables 3) Transform variables 4) Increase sample size  
Which classical assumption(s) does Serial Correlation violate?   Violation of Classical Assumption IV  
What are the consequences of Serial Correlation?   1) No bias in coefficient estimates. 2) The true variance of the coefficient estimates increases. 3) Distorts the SEE portion of the SE. 4) OLS is no longer BLUE. 5) OLS underestimates the standard errors of the coefficients.
How do you detect Serial Correlation?   1) t-scores appear larger than they really are, making a Type I error (rejecting a null hypothesis that is true) more likely. 2) Durbin-Watson test.
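A Python sketch of the Durbin-Watson check (hypothetical AR(1) data; numpy and statsmodels assumed). A statistic near 2 suggests no first-order serial correlation; values well below 2 point to positive serial correlation. Comparing plain OLS standard errors with Newey-West (HAC) standard errors also shows how OLS understates them in this setup.

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(4)
n = 300
x = np.zeros(n)
e = np.zeros(n)
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + rng.normal()       # an autocorrelated regressor
    e[t] = 0.7 * e[t - 1] + rng.normal()       # AR(1) errors: e_t = 0.7*e_{t-1} + u_t
y = 1.0 + 2.0 * x + e

ols = sm.OLS(y, sm.add_constant(x)).fit()
hac = sm.OLS(y, sm.add_constant(x)).fit(cov_type="HAC", cov_kwds={"maxlags": 4})

print(durbin_watson(ols.resid))   # well below 2 with these autocorrelated errors
print(ols.bse)                    # naive OLS standard errors
print(hac.bse)                    # HAC standard errors, noticeably larger here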
How do you solve the problem of Serial Correlation?   1) Better Specification 2) Generalized Least Squares  
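A Python sketch of the Generalized Least Squares remedy (hypothetical AR(1) data again; statsmodels' GLSAR assumed), which estimates the error autocorrelation and transforms the regression accordingly.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 300
x = np.zeros(n)
e = np.zeros(n)
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + rng.normal()
    e[t] = 0.7 * e[t - 1] + rng.normal()
y = 1.0 + 2.0 * x + e

gls = sm.GLSAR(y, sm.add_constant(x), rho=1).iterative_fit(maxiter=10)
print(gls.params)       # coefficient estimates after modeling the AR(1) errors
print(gls.model.rho)    # estimated first-order autocorrelation, near 0.7 here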
Which classical assumption(s) does Heteroskedasticity violate?   Violation of Classical Assumption V. Most common with cross-sectional models: comparing proportionately different observations results in non-constant variance in the error term.
What are the consequences of Heteroskedasticity?   1) No bias in coefficient estimates, but t-scores appear larger than they should. 2) The true variance of the coefficient estimates increases. 3) Distorts the SEE portion of the SE. 4) OLS is no longer BLUE. 5) OLS underestimates the standard errors of the coefficients.
How do you detect Heteroskedasticity?   1) Plot the residuals to see whether the variance is constant; if a bell or flower shape emerges, there is heteroskedasticity. 2) Park Test (for proportionality). 3) White Test.
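A Python sketch of the White test (hypothetical data whose error variance grows with x; statsmodels' het_white assumed). A small p-value is evidence of heteroskedasticity.

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_white

rng = np.random.default_rng(6)
n = 400
x = rng.uniform(1, 10, size=n)
y = 1.0 + 2.0 * x + x * rng.normal(size=n)    # error variance grows with x

res = sm.OLS(y, sm.add_constant(x)).fit()
lm_stat, lm_pvalue, f_stat, f_pvalue = het_white(res.resid, res.model.exog)
print(lm_pvalue)    # small p-value here, consistent with heteroskedasticity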
How do you solve the problem of Heteroskedasticity?   1) Weighted Least Squares. 2) Redefinition of Variables. 3) Heteroskedasticity-Corrected Standard Errors.
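A Python sketch of two of those remedies, under the illustrative assumption that the error standard deviation is proportional to x: heteroskedasticity-corrected (HC1 / White) standard errors, and weighted least squares with weights proportional to 1 / x².

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 400
x = rng.uniform(1, 10, size=n)
y = 1.0 + 2.0 * x + x * rng.normal(size=n)    # error sd proportional to x
X = sm.add_constant(x)

robust = sm.OLS(y, X).fit(cov_type="HC1")     # OLS estimates, corrected standard errors
wls = sm.WLS(y, X, weights=1.0 / x**2).fit()  # weights = 1 / (error variance proxy)

print(robust.params, robust.bse)
print(wls.params, wls.bse)                    # WLS is more efficient when the weights are right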
Created by: kristel387