
# P264B Final

### Multiple Regression Special Topics

Question | Answer |
---|---|

Statistic used to measure the proportion of variance accounted for by the nominal variable | R squared |

Statistic used to represent the relationship between the coded aspect of the nominal variable and Y | ryi |

Statistic used to represent the relationship between the coded aspect of the nominal variable and the reference group | pri and sri |

Partial regression coefficients for dummy coded variables | compares each group to the reference, or left out, group |
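The dummy-coding idea above can be sketched in a few lines of pure Python (a hypothetical example; the group labels and the `dummy_code` helper are made up for illustration):

```python
# Dummy coding a 3-level nominal variable with "control" as the reference
# (left-out) group: each dummy is 1 for its group and 0 otherwise.
groups = ["control", "drugA", "drugB", "drugA", "control"]

def dummy_code(labels, reference):
    """Return 0/1 dummy columns for every level except the reference group."""
    levels = sorted(set(labels) - {reference})
    return {lvl: [1 if g == lvl else 0 for g in labels] for lvl in levels}

dummies = dummy_code(groups, reference="control")
# dummies["drugA"] -> [0, 1, 0, 1, 0]
# dummies["drugB"] -> [0, 0, 1, 0, 0]
```

Regressing y on these two dummy columns would then give partial regression coefficients comparing drugA vs. control and drugB vs. control, as the card states.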

Semi-partial correlations between y and dummy coded variable | gives the importance, in terms of y variance, of the chosen group vs reference group distinction when controlling for covariates |

Partial correlation between y and dummy coded variable | point-biserial correlation between outcome variable (y) and the dichotomy formed by the chosen group vs reference group adjusted for any covariates |

R squared | SSR / SST (regression sum of squares divided by total sum of squares) |
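A toy pure-Python illustration of R squared = SSR / SST, using a one-predictor least-squares regression on made-up data:

```python
# R squared = SSR / SST for a toy simple regression (made-up data).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.0, 5.0, 4.0, 6.0]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
# Least-squares slope and intercept
b1 = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
b0 = my - b1 * mx
yhat = [b0 + b1 * xi for xi in x]

sst = sum((yi - my) ** 2 for yi in y)     # total sum of squares
ssr = sum((yh - my) ** 2 for yh in yhat)  # regression sum of squares
r_squared = ssr / sst                     # proportion of y variance explained
```

The same ratio generalizes to multiple regression, where yhat comes from all IVs jointly.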

Hypotheses of multiple regression | H0: R squared y.12 = 0 vs. H1: R squared y.12 > 0; equivalently, H0: Beta1 = Beta2 = . . . = 0 vs. H1: not all betas are zero |

Unstandardized regression coefficient | b (its standard error is SE(b)) |

Standardized regression coefficient | Beta |

Three steps of interpreting interactions | 1. Does an interaction effect exist? 2. What is the strength of the effect? 3. What is the nature of the relationship? |

Bilinear moderated relationship | Slope between y and x1 changes as a linear function of x2 |

interpretation of the partial regression coefficient for the interaction term in a multiple regression model | change in slope between y and x1 for a one-unit increase in x2 |

What type of relationship does a product term interaction represent? | A bilinear relationship, represented by the product x1 * x2 |

Does an interaction effect exist? | Look at the partial regression coefficient for the interaction term: t = b/SE(b) |

What is the strength of the effect? | Check the R squared change when the interaction term is added, or square the semi-partial correlation |

What is the nature of the effect? | Plot the relationship between y and x1 for fixed values of x2 and interpret with 1-2 sentences |
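The "nature of the effect" step can be made concrete with simple slopes: for a fitted model y = b0 + b1*x1 + b2*x2 + b3*(x1*x2), the slope of y on x1 at a fixed x2 is b1 + b3*x2. A sketch with hypothetical coefficient values (not estimates from real data):

```python
# Simple slopes for a bilinear interaction: the x1 slope changes linearly with x2.
# b1 and b3 are hypothetical fitted coefficients chosen for illustration.
b1, b3 = 2.0, -0.5

def simple_slope(x2):
    """Slope of y on x1 when x2 is held fixed at the given value."""
    return b1 + b3 * x2

# Evaluate at low / mean / high x2 (e.g. mean and +/- 1 SD on centered x2):
slopes = {z: simple_slope(z) for z in (-1.0, 0.0, 1.0)}
# slopes -> {-1.0: 2.5, 0.0: 2.0, 1.0: 1.5}
```

Plotting y vs. x1 at each of these fixed x2 values is exactly the recommended interpretation step; note that at x2 = 0 the slope equals b1 itself.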

Is the interaction term symmetric? | Yes - it doesn't matter whether we designate x1 or x2 as the moderator variable; the inference is the same |

VIF | Variance Inflation Factor - represents how much the variances of the individual regression coefficients are inflated relative to when the IVs are not linearly related. |

At what level is VIF a problem | greater than 10 |
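With only two IVs, the VIF reduces to 1 / (1 - r12^2), where r12 is the correlation between the two predictors; with more IVs, r12^2 generalizes to the R squared from regressing one IV on the rest. A pure-Python sketch with made-up data:

```python
# VIF for two predictors: VIF = 1 / (1 - r12^2), using made-up data.
import math

x1 = [1.0, 2.0, 3.0, 4.0, 5.0]
x2 = [1.0, 2.5, 2.5, 4.5, 5.0]  # strongly (but not perfectly) correlated with x1

def pearson_r(a, b):
    """Pearson correlation between two equal-length lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    return cov / math.sqrt(sum((ai - ma) ** 2 for ai in a)
                           * sum((bi - mb) ** 2 for bi in b))

r12 = pearson_r(x1, x2)
vif = 1.0 / (1.0 - r12 ** 2)
# vif > 10 would flag serious multicollinearity per the rule of thumb above
```

Here the two predictors are correlated enough that the VIF exceeds the 10 cutoff.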

What type of variable selection or screening strategies are most useful when evaluating a set of potential predictor variables and interaction among the variables? | Hierarchical selection strategies |

Can interaction terms be interpreted w/o the main effect included in the model? | No |

What are the benefits of centering when an interaction term is present? | Reduces intercorrelation, yielding more stable estimates of regression parameters, which reduces size of SE(b) and makes for a more powerful test of the interaction effect |
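The collinearity-reducing effect of centering can be demonstrated directly: the correlation between a predictor and its raw product term is typically very high, and drops sharply once both constituents are mean-centered. A sketch with made-up data:

```python
# Centering demo: correlation between a predictor and its product term
# drops after mean-centering (the "non-essential" collinearity). Made-up data.
import math

def pearson_r(a, b):
    """Pearson correlation between two equal-length lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    return cov / math.sqrt(sum((ai - ma) ** 2 for ai in a)
                           * sum((bi - mb) ** 2 for bi in b))

x1 = [10.0, 11.0, 12.0, 13.0, 14.0]
x2 = [20.0, 22.0, 21.0, 23.0, 24.0]

raw_product = [a * b for a, b in zip(x1, x2)]
r_raw = pearson_r(x1, raw_product)            # near 1: x1 and x1*x2 move together

m1, m2 = sum(x1) / len(x1), sum(x2) / len(x2)
c1 = [a - m1 for a in x1]
c2 = [b - m2 for b in x2]
centered_product = [a * b for a, b in zip(c1, c2)]
r_centered = pearson_r(c1, centered_product)  # much closer to 0
```

The lower intercorrelation after centering is what yields the more stable estimates and smaller SE(b) the card describes.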

Problems encountered when IVs are highly correlated | 1. adding or deleting an IV changes reg coef 2. estimated SEs of reg coef become large, affecting stability of population estimates 3. individual reg coef may not be significant even though a statistical relationship exists b/w DV and a set of IVs |

Centering | a linear transformation used to help mitigate the effects of high correlation between multiplicative interaction terms and their constituent parts |

Problems involved with detecting interaction effects in the multiple regression model | 1. Measurement Error 2. Mis-specification of functional form of interaction 3. Levels of measurement 4. Statistical power |

Formal diagnostics of multicollinearity | VIF and Tolerance |

Dealing with multicollinearity in multiple regression models | 1. Centering works for interactions 2. Drop one or more IVs (but can cause problems with model specification) 3. Add cases to break patterns 4. Create a construct, or index, variable |

Assumptions of Repeated Measures ANOVA | 1. Normally distributed, continuous DV 2. Same number of repeated measures per subject 3. Same or fixed timing of measurements across subjects 4. Sphericity |

Sphericity | refers to the equality of covariances and is tested by evaluating the variances of differences between all possible pairs of scores |
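The idea behind sphericity can be sketched by computing the variance of the difference scores for every pair of repeated measures; sphericity holds when these variances are roughly equal (the formal test is Mauchly's). Made-up scores for 4 subjects at 3 time points:

```python
# Sphericity sketch: variance of difference scores for each pair of time points.
from itertools import combinations

scores = [  # rows = subjects, columns = time1, time2, time3 (made up)
    [10.0, 12.0, 14.0],
    [8.0,  11.0, 13.0],
    [12.0, 13.0, 16.0],
    [9.0,  10.0, 12.0],
]

def variance(vals):
    """Sample variance (n - 1 denominator)."""
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / (len(vals) - 1)

n_times = len(scores[0])
diff_variances = {
    (i, j): variance([row[i] - row[j] for row in scores])
    for i, j in combinations(range(n_times), 2)
}
# Large discrepancies among these variances suggest a sphericity violation.
```

Each entry of `diff_variances` corresponds to one pair of time points, matching the "all possible pairs of scores" wording on the card.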

Benefits of Generalized Estimating Equations (GEEs) for repeated measures | 1. Allows for discrete or continuous DV 2. Allows subjects to have varying numbers and timing of the repeated measurements 3. Allows a variety of covariance structures to be considered |

Why is Multiple Regression better than ANCOVA when you add covariates? | In ANCOVA you must manually check for a treatment x covariate interaction; if one is present, ANCOVA cannot be used |

Ordinal vs. disordinal interactions | Ordinal: lines never cross. Disordinal: lines cross each other |

When interaction term is present, how does it change the interpretation of the betas? | When you have a higher order term (interaction), lower order terms (main effects) are evaluated at 0 (instead of means). |

Non-essential collinearity | refers to correlation b/w interaction term and its main effects (can be corrected w/centering) |

Benefits of hierarchical variable entry | 1. Causal priority including temporal order can be acknowledged. 2. control for confounding and test complex relationships b/w variables (mediation and moderation) |

Types of automatic variable entry | Forward, Backward, Step-wise |

Limitations of automatic variable entry | 1. Order of variable entry doesn't necessarily reflect the importance of a variable 2. assumes a "single" best subset of variables 3. Theoretically important variables can be removed b/c of correlation w/other variables |

Assumptions of GEE | 1. Clusters (subjects) are independent of one another 2. Observations within a cluster (subject) may be correlated |

Created by: bkflyer