psy400ch14p357-364

Tests of Association: Correlation and Regression

Term: Definition

Pearson correlation: used when you want to determine the degree of linear association between two interval- or ratio-level variables.
Pearson's r assumptions: the two populations are normally distributed and the relationship between the variables is linear.
Linear regression: determines the joint effect of one or more independent variables on a single dependent variable.
Spearman's ρ (rho): computing Spearman's ρ rather than Pearson's r may be more appropriate when your measurement scale is ordinal, your data set violates the assumption of normality, or your two variables are not linearly related.
Spearman's ρ with small samples: when you have a small sample size and a number of outliers in your data, Spearman's ρ is generally the more conservative approach (both measures are sketched below).
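To make the Pearson/Spearman entries above concrete, here is a minimal sketch using SciPy; the variable names and data values are invented for illustration and are not from the source material.

```python
import numpy as np
from scipy import stats

# Hypothetical interval-level measurements on two variables (invented for illustration)
hours_studied = np.array([2, 4, 5, 7, 8, 10, 12, 15], dtype=float)
exam_score = np.array([55, 60, 63, 70, 72, 80, 83, 95], dtype=float)

# Pearson's r: degree of linear association between two interval/ratio variables
r, p_r = stats.pearsonr(hours_studied, exam_score)

# Spearman's rho: rank-based alternative for ordinal, non-normal, or nonlinear data
rho, p_rho = stats.spearmanr(hours_studied, exam_score)

print(f"Pearson r = {r:.3f} (p = {p_r:.3f})")
print(f"Spearman rho = {rho:.3f} (p = {p_rho:.3f})")
```

Because Spearman's ρ works on ranks, a single extreme pair of values shifts it far less than it shifts Pearson's r, which is one reason it is described above as the more conservative choice for small samples with outliers.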
Linear regression assumptions: the dependent variable is interval- or ratio-level; the relationship between the dependent and independent variables is linear; the residuals are independent, have equal variance across all values of the independent variables, and are normally distributed.
Residuals: differences between predicted and actual values of the dependent variable.
Regression equation: expresses a linear relationship between a dependent variable and one or more independent variables.
Regression coefficient: a quantity in a linear regression equation that indicates the change in the dependent variable associated with a unit change in the independent variable.
y-intercept: the point at which a regression line intersects the y-axis.
R-squared: in simple regression, identical to the square of Pearson's r.
Regression analysis output: p values for the multiple correlation R and for each of the coefficients in the regression equation; confidence intervals can (and should) be constructed for each of these values (see the sketch below).
R (or R-squared): serves as an effect size.
General linear model: a more general statistical model under which both ANOVA and regression fall.
Regression and non-manipulable variables: regression is particularly useful when you want to include variables that cannot be experimentally manipulated (e.g., income, socioeconomic status).
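The regression entries above (equation, coefficients, y-intercept, R-squared, p values, confidence intervals, residuals) can all be read off a single fitted model. A minimal sketch with statsmodels; the predictors income and ses and the outcome wellbeing are invented for illustration.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data: a dependent variable and two non-manipulable predictors
# (all names and values invented for illustration)
income = np.array([30, 42, 55, 61, 48, 70, 35, 66], dtype=float)
ses = np.array([2, 3, 4, 4, 3, 5, 2, 5], dtype=float)
wellbeing = np.array([5.1, 6.0, 6.8, 7.2, 6.1, 8.0, 5.4, 7.6])

# Regression equation: wellbeing = b0 + b1*income + b2*ses + residual
X = sm.add_constant(np.column_stack([income, ses]))
model = sm.OLS(wellbeing, X).fit()

print(model.params)      # y-intercept (b0) and regression coefficients (b1, b2)
print(model.pvalues)     # p value for each coefficient
print(model.conf_int())  # confidence interval for each coefficient
print(model.rsquared)    # R-squared, usable as an effect size
print(model.resid)       # residuals: actual minus predicted values
```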
Test on ordinal data: Mann-Whitney U test.
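A minimal sketch of the Mann-Whitney U test with SciPy, assuming two independent groups measured on an ordinal scale; the ratings are invented for illustration.

```python
from scipy import stats

# Hypothetical ordinal ratings (1-5) from two independent groups (invented for illustration)
group_a = [1, 2, 2, 3, 3, 4, 2, 3]
group_b = [3, 4, 4, 5, 3, 5, 4, 4]

# Mann-Whitney U: rank-based test comparing two independent samples
u, p = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")
print(f"U = {u}, p = {p:.3f}")
```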
Assumption violations: may affect the accuracy of confidence intervals, p values, and estimates of quantities such as means and standard deviations, and may increase the probability of Type I or Type II errors.
Independence of observations: measurements from one participant do not depend on the measurements from other participants.
Nonnormal distributions: outliers and skew.
Outlier detection: leverage values, Cook's distance, residuals, histograms, and box-and-whisker plots (see the sketch below).
Leverage value and Cook's distance: measures used to detect outliers in a data set.
Residuals: differences between actual values and predicted values in linear regression or ANOVA.
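The outlier diagnostics above (leverage values, Cook's distance, residuals) come out of a fitted regression model. A minimal sketch with statsmodels, using invented data that contains one deliberately unusual point; the 4/n cutoff is a common rule of thumb, not a rule from the source material.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data with one unusual x value (invented for illustration)
x = np.array([1, 2, 3, 4, 5, 6, 7, 20], dtype=float)
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.2, 14.1, 15.0])

model = sm.OLS(y, sm.add_constant(x)).fit()
influence = model.get_influence()

leverage = influence.hat_matrix_diag   # leverage value for each observation
cooks_d, _ = influence.cooks_distance  # Cook's distance for each observation
residuals = model.resid                # actual minus predicted values

# Flag observations whose Cook's distance exceeds a rule-of-thumb cutoff of 4/n
for i, (h, d) in enumerate(zip(leverage, cooks_d)):
    if d > 4 / len(x):
        print(f"observation {i}: leverage = {h:.2f}, Cook's D = {d:.2f}")
```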
QQ plot (normal quantile-quantile plot): a graphical technique for identifying deviations from normality in a data set.
Quantile: each of a set of values of a variate that divide a frequency distribution into equal groups, each containing the same fraction of the total population.
Shapiro-Wilk test and Kolmogorov-Smirnov test: statistical tests of whether a set of data values is normally distributed.
Assumption checks as hypothesis tests: statistical tests for checking assumptions are themselves hypothesis tests.
Low power: with small sample sizes, low power may prevent such tests from identifying sizeable assumption violations.
Sequential testing risk: using the results of an assumption-violation hypothesis test to determine which subsequent hypothesis test you then apply runs the risk of increasing the probability of a Type I error.
Data transformations: sometimes useful for converting a skewed distribution into one that is more normal in shape; common choices include logarithmic and square- or cube-root transformations (see the sketch after this list).
Transformed data: conclusions apply only to the transformed data, which may make interpretation of your results more difficult.
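A minimal sketch of the normality checks and the logarithmic transformation described above, using SciPy and Matplotlib; the skewed sample is simulated, so the exact numbers are illustrative only.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

rng = np.random.default_rng(0)

# Simulated right-skewed data (invented for illustration)
reaction_times = rng.lognormal(mean=0.0, sigma=0.6, size=40)

# QQ plot: graphical check for deviations from normality
stats.probplot(reaction_times, dist="norm", plot=plt)
plt.savefig("qq_plot.png")

# Shapiro-Wilk test: null hypothesis is that the data are normally distributed
w, p = stats.shapiro(reaction_times)
print(f"Shapiro-Wilk W = {w:.3f}, p = {p:.3f}")

# Logarithmic transformation of the skewed values; any conclusions then apply
# to the transformed data, not to the raw reaction times
log_rt = np.log(reaction_times)
print(f"Shapiro-Wilk on log-transformed data: p = {stats.shapiro(log_rt).pvalue:.3f}")
```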
Created by: james22222222
 

 


