Regression Word Scramble
Question | Answer |
Which classical assumption(s) does an Omitted Variable violate? | Violates Classical Assumptions I and III |
What are the consequences of an Omitted Variable? | 1) The estimated coefficient doesn’t equal the true coefficient. 2) Bias is forced onto the remaining coefficients, changing their estimated values. 3) Decreases the variance of the estimates, making them appear more precise than they really are. |
What factors may indicate Omitted Variable bias? | 1) Unexpected signs on the coefficients. 2) Implausibly large coefficients (e.g., a positive bias on a positive coefficient inflates it). |
How do you solve the problem of Omitted Variable bias? | Add the omitted variable, or a proxy variable if it is unobservable. |
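The omitted-variable cards above can be illustrated with a small simulation. This is a hypothetical sketch (all variable names and coefficient values are invented for illustration): when a regressor correlated with an included variable is left out, its effect is forced onto the included variable's coefficient, exactly as the consequences card says.

```python
import numpy as np

# Illustrative simulation of omitted variable bias (all names/values hypothetical).
rng = np.random.default_rng(0)
n = 5000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)          # x2 is correlated with x1
y = 1.0 + 2.0 * x1 + 1.5 * x2 + rng.normal(size=n)

# Full model: regress y on [1, x1, x2]; the x1 coefficient is near its true 2.0.
X_full = np.column_stack([np.ones(n), x1, x2])
b_full, *_ = np.linalg.lstsq(X_full, y, rcond=None)

# Short model: omit x2; bias on x1 is beta2 * (slope of x2 on x1) = 1.5 * 0.8 = 1.2.
X_short = np.column_stack([np.ones(n), x1])
b_short, *_ = np.linalg.lstsq(X_short, y, rcond=None)

print(b_full[1])   # close to 2.0 (true value)
print(b_short[1])  # close to 3.2 (true value plus 1.2 of bias)
```

Note that the biased coefficient carries an "unexpectedly large" value, which is the detection symptom listed above.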
Which classical assumption(s) does a Redundant Variable violate? | Violates Classical Assumption VI |
What are the consequences of a Redundant Variable? | 1) Does NOT introduce bias. 2) Increases variance. |
How do you detect a Redundant Variable? | 1) Decreased adjusted R² when the variable is included. 2) Wald Test. |
How do you solve the problem of a Redundant Variable? | Drop the irrelevant variable. |
How do you decide whether to include a potentially omitted or redundant variable? | 1) Theory. 2) t-tests. 3) Adjusted R²: it decreases if the improvement in overall fit from adding the variable does NOT outweigh the loss in degrees of freedom. 4) Bias. |
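The redundant-variable cards can also be checked by simulation. In this hypothetical sketch (names and values are illustrative), an irrelevant regressor that happens to be correlated with a real one adds no bias but inflates the standard error of the real coefficient, which is the "no bias, increased variance" consequence listed above.

```python
import numpy as np

# Illustrative sketch: a redundant regressor inflates variance but adds no bias.
rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
z = 0.9 * x1 + 0.3 * rng.normal(size=n)     # irrelevant, but correlated with x1
y = 1.0 + 2.0 * x1 + rng.normal(size=n)     # z plays no role in the true model

def ols_se(X, y):
    """Return OLS coefficient estimates and their standard errors."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    s2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))
    return b, se

b_ok, se_ok = ols_se(np.column_stack([np.ones(n), x1]), y)       # correct model
b_red, se_red = ols_se(np.column_stack([np.ones(n), x1, z]), y)  # with redundant z

print(se_ok[1] < se_red[1])  # True: redundant z inflates the SE on x1
```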
Which classical assumption(s) does omitting an Intercept violate? | Violates Classical Assumption II: the error term has a zero population mean. Intercept usually absorbs error. |
Which classical assumption(s) does Multicollinearity violate? | Violates Classical Assumption VI |
What are the consequences of Multicollinearity? | 1) Does not cause bias. 2) Might cause wrong signs due to the increased variance. 3) Increases variance because it raises r₁,₂ (the correlation between regressors). 4) t-scores fall because the SEs increase. 5) Overall fit (adjusted R²) will not fall much. |
How do you detect Multicollinearity? | 1) High adjusted R² combined with low t-scores. 2) High simple correlation coefficients (r₁,₂). 3) High Variance Inflation Factor (VIF > 5). |
How do you interpret VIF? | When VIF = 5, the variance of that coefficient estimate is 5 times what it would be without the multicollinearity. |
How do you solve the problem of Multicollinearity? | 1) Do nothing. 2) Drop a variable. 3) Transform the variables. 4) Increase the sample size. |
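The VIF from the cards above can be computed directly from its definition, VIF_k = 1 / (1 − R²_k), where R²_k comes from regressing regressor k on the other regressors. A minimal sketch, with invented data chosen to be highly collinear:

```python
import numpy as np

# Illustrative VIF computation (variable names and data are hypothetical).
rng = np.random.default_rng(2)
n = 1000
x1 = rng.normal(size=n)
x2 = x1 + 0.3 * rng.normal(size=n)          # highly collinear with x1

def vif(target, others):
    """VIF = 1 / (1 - R^2) from regressing `target` on the other regressors."""
    X = np.column_stack([np.ones(len(target))] + others)
    b, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ b
    r2 = 1 - resid @ resid / ((target - target.mean()) ** 2).sum()
    return 1.0 / (1.0 - r2)

print(vif(x2, [x1]))  # well above the rule-of-thumb cutoff of 5
```

Here R² of x2 on x1 is roughly 0.92 by construction, so the VIF lands around 12, flagging the multicollinearity.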
Which classical assumption(s) does Serial Correlation violate? | Violation of Classical Assumption IV |
What are the consequences of Serial Correlation? | 1) No bias in the coefficient estimates. 2) Increased variance in the coefficient estimates. 3) Distorts the SEE portion of the SE. 4) OLS is no longer BLUE. 5) OLS underestimates the standard errors of the coefficients. |
How do you detect Serial Correlation? | 1) t-scores appear larger than they really are, making a Type I error (rejecting a true null hypothesis) more likely. 2) Durbin-Watson test. |
How do you solve the problem of Serial Correlation? | 1) Better Specification 2) Generalized Least Squares |
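The Durbin-Watson test mentioned in the detection card has a simple statistic: DW = Σ(e_t − e_{t−1})² / Σe_t². Values near 2 suggest no serial correlation; values near 0 suggest positive serial correlation. A sketch on simulated residuals (the AR(1) coefficient 0.8 is an arbitrary illustrative choice):

```python
import numpy as np

# Durbin-Watson statistic on simulated residuals (illustrative values).
rng = np.random.default_rng(3)
n = 2000

def durbin_watson(e):
    """DW = sum of squared successive differences over sum of squares."""
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

white = rng.normal(size=n)                  # serially uncorrelated residuals
ar1 = np.empty(n)                           # AR(1) residuals: e_t = 0.8 e_{t-1} + u_t
ar1[0] = rng.normal()
for t in range(1, n):
    ar1[t] = 0.8 * ar1[t - 1] + rng.normal()

print(durbin_watson(white))  # near 2: no serial correlation
print(durbin_watson(ar1))    # well below 2: positive serial correlation
```

Since DW ≈ 2(1 − ρ), the AR(1) series with ρ = 0.8 gives a statistic around 0.4.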
Which classical assumption(s) does Heteroskedasticity violate? | Violates Classical Assumption V. Most common in cross-sectional models: comparing proportionately different observations produces a non-constant variance in the error term. |
What are the consequences of Heteroskedasticity? | 1) No bias in the coefficient estimates, but t-scores appear larger than they really are. 2) Increased variance in the coefficient estimates. 3) Distorts the SEE portion of the SE. 4) OLS is no longer BLUE. 5) OLS underestimates the standard errors of the coefficients. |
How do you detect Heteroskedasticity? | 1) Plot the residuals to see whether their variance is constant; a fan or cone shape indicates heteroskedasticity. 2) Park Test (for proportionality). 3) White Test. |
How do you solve the problem of Heteroskedasticity? | 1) Weighted Least Squares. 2) Redefinition of the variables. 3) Heteroskedasticity-Corrected Standard Errors. |
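The Weighted Least Squares remedy from the card above can be sketched as follows. This assumes (for illustration only) that the error standard deviation is proportional to the regressor x; dividing every observation by x then makes the transformed errors homoskedastic, and OLS on the transformed data is WLS.

```python
import numpy as np

# Minimal WLS sketch under an assumed error sd proportional to x (illustrative).
rng = np.random.default_rng(4)
n = 3000
x = rng.uniform(1.0, 10.0, size=n)
y = 1.0 + 2.0 * x + x * rng.normal(size=n)  # error sd grows with x

# Plain OLS: still unbiased under heteroskedasticity, but inefficient.
X = np.column_stack([np.ones(n), x])
b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# WLS: weight each observation by 1/x (regress y/x on [1/x, 1]).
w = 1.0 / x
b_wls, *_ = np.linalg.lstsq(X * w[:, None], y * w, rcond=None)

print(b_ols[1], b_wls[1])  # both near the true slope 2.0; WLS is more precise
```

Both estimates are unbiased (consequence card: "no bias"), but only the WLS errors satisfy constant variance, restoring BLUE.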
Created by: kristel387