
# Research Ch. 11

### Correlational Designs

Term | Definition |
---|---|

Correlation | a statistical procedure that is used to measure and describe a relationship between two variables. In general, these variables are continuous—interval or ratio scales |

3 Characteristics of a Correlation | 1. The direction of the relationship 2. The Form of the Relation 3. The Degree of the Relationship |

The direction of the relationship | 2 basic categories. -Positive correlation: as the value of one variable increases, the value of the other variable also increases -Negative correlation: as the value of one variable increases, the value of the other variable decreases (an inverse relationship) |

The Form of the Relation | The most common form is linear: a relationship described by a straight line. A perfect linear relationship (r = +1.00 or -1.00) occurs when all the data points lie on a straight line in a scatter plot. Other forms include curvilinear relationships |

The Degree of the Relationship | A perfect relationship is +1.00 or -1.00; the lack of a relationship is r = .00 |
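
As a concrete illustration of direction and degree, Pearson r can be computed straight from its definitional formula, SP / sqrt(SSx · SSy). This is a minimal sketch; the function name and the sample scores below are made up for illustration.

```python
import math

def pearson_r(x, y):
    """Pearson correlation: SP / sqrt(SSx * SSy)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sp  = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))  # sum of products
    ssx = sum((xi - mx) ** 2 for xi in x)                     # sum of squares, X
    ssy = sum((yi - my) ** 2 for yi in y)                     # sum of squares, Y
    return sp / math.sqrt(ssx * ssy)

print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))  # perfect positive relationship: 1.0
print(pearson_r([1, 2, 3], [3, 2, 1]))        # perfect negative relationship: -1.0
```

A perfect linear relationship gives ±1.00, matching the card above; real data fall somewhere in between.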

The 4 Major Uses of Correlations | 1. Prediction 2. Validity 3. Reliability 4. Theory Verification |

Prediction | -If two variables are known to be related in a systematic way, then we can use one of the variables to predict the other. E.g., if we assume that SAT scores are related to college GPA, I can predict a student’s GPA in college from their SAT score |

Validity | -How does a psychologist know if a new test measures what it is supposed to measure? E.g., developing a new depression test and correlating it with the Hamilton Depression Rating Scale. The MTMM (multitrait-multimethod matrix) is a correlation matrix used to assess construct validity |

Reliability | A measurement procedure is considered reliable to the extent that it produces stable, consistent measurements. In other words, the same individuals should obtain very similar scores under similar conditions. Ex: MTMM |

Theory Verification | -Many psychological theories make specific predictions about the relationship between two variables. Ex: behavioral theory suggests that depression is sustained because the depressed patient experiences fewer pleasant events |

Issues in Interpreting Correlations | -does not prove cause-effect -affected by range of scores -measurement error -outliers -not to be interpreted as proportion of "variance explained" -linear transformations -bivariate normal distribution |

Correlation simply describes a relationship between 2 variables | It does NOT explain why the 2 variables are related. It cannot prove a cause-and-effect relationship. This is true because there is no systematic manipulation of a variable |

Ceiling Effect | An undesirable measurement outcome occurring when the dependent measure puts an artificially “low ceiling” on how high a participant may score. Example: a test so easy that nearly everyone scores at or near the maximum |

Floor Effect | The dependent measure artificially restricts how low scores can be. Example: A problem-solving test that is so difficult, no one gets any of the questions right |

Restriction of Range | To observe a sizeable correlation between two variables, both must be allowed to vary widely |

Coefficient of Determination | r^2 measures the proportion of variability in one variable that can be determined from its relationship with the other variable |
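
The card above can be checked numerically: squaring r gives the proportion of variability accounted for. The SAT/GPA-style numbers below are invented for illustration.

```python
import math

# Hypothetical SAT scores and GPAs (made-up numbers)
sat = [500, 550, 600, 650, 700]
gpa = [2.4, 2.6, 3.1, 3.3, 3.6]
n = len(sat)
mx, my = sum(sat) / n, sum(gpa) / n
sp  = sum((x - mx) * (y - my) for x, y in zip(sat, gpa))  # sum of products
ssx = sum((x - mx) ** 2 for x in sat)
ssy = sum((y - my) ** 2 for y in gpa)
r = sp / math.sqrt(ssx * ssy)
print(round(r, 2), round(r ** 2, 2))  # r, then the coefficient of determination
```

Here r ≈ .99, so r² ≈ .98: about 98% of the variability in these (fabricated) GPAs is predictable from SAT scores.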

Linear transformations | It does not matter if the scores of either variable are transformed into standard scores, or have a constant added to them or are multiplied by a (positive) constant; the correlation between the two variables will remain the same |
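
This invariance is easy to demonstrate: multiplying one variable by a positive constant and adding a constant leaves r unchanged. The variable names and scores below are illustrative.

```python
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sp = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sp / math.sqrt(sum((xi - mx) ** 2 for xi in x) *
                          sum((yi - my) ** 2 for yi in y))

raw    = [3, 7, 5, 9, 6]
scores = [10, 18, 13, 20, 15]
transformed = [4 * v + 100 for v in raw]   # linear transformation of raw
print(math.isclose(pearson_r(raw, scores),
                   pearson_r(transformed, scores)))  # True
```

Standardizing (converting to z-scores) is itself a linear transformation, which is why correlations of standard scores equal correlations of raw scores.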

Bivariate normal distribution | -the joint frequency distribution of 2 variables X and Y -this distribution is 3-D, and the sampling distribution approaches normal as sample size increases -the frequency of paired scores is shaped like a pitcher’s mound on a baseball field |

Simple Regression | -Pearson correlation measures the linear relationship between two continuous variables. -In a scatter plot, such data often show a good, but not perfect, linear relationship. -A line is drawn through the middle of the data points |

This line, called the Regression Line, serves 3 purposes | 1. It helps describe the relationship between the two variables. 2. It identifies the center or ”central tendency” of the relation. 3. The line can be used for prediction |

Regression line | the straight line that best describes the linear relationship between two variables |
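
The regression line Ŷ = bX + a can be fit by hand with the standard least-squares formulas b = SP/SSx and a = M_Y − b·M_X, then used for prediction as the cards above describe. The data points here are invented.

```python
# Fit Y-hat = bX + a by least squares (illustrative data)
x = [1, 2, 3, 4, 5]
y = [2, 3, 5, 4, 6]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
sp  = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))  # sum of products
ssx = sum((xi - mx) ** 2 for xi in x)                     # sum of squares, X
b = sp / ssx            # slope
a = my - b * mx         # intercept
print(round(b, 2), round(a, 2))   # 0.9 1.3
print(round(b * 6 + a, 1))        # predicted Y at X = 6 -> 6.7
```

The last line is the "prediction" use of correlation in action: plug a new X into the fitted line to get a predicted Y.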

Homoscedasticity | -the variance of the Y scores is the same for each value of X -equal spread of data points around the regression line for all values of X -can be assumed when the sample size is large; cannot be assumed when the sample size is small |

Standard Error of Estimate | -provides a measure of the standard distance between the regression line and the actual data points -conceptually, the SE of the estimate is like a standard deviation -prediction accuracy is determined by the standard error of estimate -review class notes on SE |
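
The "standard distance" idea can be made concrete: one common form of the standard error of estimate is sqrt(SS_residual / (n − 2)). This sketch assumes that degrees-of-freedom form and reuses invented data.

```python
import math

# Standard error of estimate: sqrt(SS_residual / (n - 2)) (common df form)
x = [1, 2, 3, 4, 5]
y = [2, 3, 5, 4, 6]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
    sum((xi - mx) ** 2 for xi in x)          # least-squares slope
a = my - b * mx                              # intercept
ss_res = sum((yi - (b * xi + a)) ** 2        # squared distances from the line
             for xi, yi in zip(x, y))
se_est = math.sqrt(ss_res / (n - 2))
print(round(se_est, 3))
```

Like a standard deviation, a small value means the points hug the regression line, so predictions from the line are more accurate.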

Regression to the Mean | -In a pre- & posttest study, the sample's posttest mean is closer to the posttest population mean than its pretest mean was to the pretest population mean -occurs when you have a nonrandom sample from a population and 2 measures that are imperfectly correlated |

Multiple Regression | -the weights for this line are selected on the basis of the least-squares criterion, where the sum of the squared residuals is at a minimum and the sum of squares for the regression is at a maximum -ŷ = b1x1 + b2x2 + b3x3 + ... + bixi + a -ex: Venn diagram exercise |
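
For the special case where the predictors are uncorrelated with each other, the least-squares weights can be computed one predictor at a time (the general case needs matrix algebra). The data below are constructed so that y = 2·x1 + 3·x2 + 5 exactly; everything here is illustrative.

```python
# Multiple regression y-hat = b1*x1 + b2*x2 + a, for uncorrelated predictors
x1 = [-1, -1, 1, 1]
x2 = [-1, 1, -1, 1]          # orthogonal to x1, and both mean-centered
y  = [0, 6, 4, 10]           # built as 2*x1 + 3*x2 + 5
n = len(y)
my = sum(y) / n
# With centered, orthogonal predictors each slope is just sum(x*y) / sum(x*x)
b1 = sum(a * b for a, b in zip(x1, y)) / sum(a * a for a in x1)
b2 = sum(a * b for a, b in zip(x2, y)) / sum(a * a for a in x2)
a0 = my                       # intercept; predictors are centered at 0
print(b1, b2, a0)             # recovers 2.0 3.0 5.0
```

The recovered weights match the construction, which is exactly the least-squares solution here because the residuals are zero.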

Factor analysis | a statistical procedure in which the correlations between responses (variables) to questionnaires or other measures are used to discover common underlying factors; those variables that are highly correlated are “grouped” together to make a factor |

Factor | a group of interrelated variables |

Factor loading | the correlation between the variable and the factor; loadings range from -1 to +1. Normally, a variable is considered to contribute meaningfully to a factor with a loading of at least .30 in absolute value |
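
A rough feel for loadings: if a "factor score" is crudely approximated by each respondent's total across the grouped items, a loading behaves like the correlation between one item and that total. The item names and scores are invented, and summing items is only a stand-in for real factor extraction.

```python
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sp = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sp / math.sqrt(sum((xi - mx) ** 2 for xi in x) *
                          sum((yi - my) ** 2 for yi in y))

# Invented questionnaire items (each list = one item's scores across people)
items = {
    "worry":  [4, 5, 2, 5, 1, 3],
    "tense":  [5, 4, 1, 5, 2, 3],
    "sleepy": [2, 1, 5, 2, 4, 1],   # runs opposite to the other two items
}
factor = [sum(vals) for vals in zip(*items.values())]  # crude factor score
for name, scores in items.items():
    print(name, round(pearson_r(scores, factor), 2))   # item "loadings"
```

"worry" and "tense" load well above the .30 rule of thumb, while "sleepy" loads negatively, so it would not be grouped with them.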

Some Uses of Factor Analysis | -To establish patterns between Interdependent or interrelated variables or items -Parsimony or data reduction -Structure -Classification or description -Hypothesis testing -Exploration -Theory |

To establish patterns between interdependent or interrelated variables or items | -Factor analysis may be used to untangle the linear relationships into their separate patterns within items of a measure or groups of measures. -Each pattern will appear as a factor delineating a distinct cluster of interrelated data |

Parsimony or data reduction | -Factor analysis can be useful for reducing a mass of information to an economical description |

Structure | -Factor analysis may be employed to discover the basic structure of a domain. For instance, what makes up the factor social anxiety? |

Classification or description | -Factor analysis can be used to group interdependent variables into descriptive categories, e.g., ideology, revolution, liberal voting, and authoritarianism. -It can be used to classify nation profiles into types with similar characteristics or behavior |

Hypothesis testing | -hypotheses regarding dimensions of attitude, personality, group, etc. can be tested for empirical existence with factor analysis -dimensions can be postulated in advance, and statistical tests of significance can be applied to the factor analysis |

Exploration | -an unknown domain can be explored through factor analysis -factor analysis enables the scientist to untangle interrelationships, to separate different sources of variation, and to partial out or control for undesirable influences on the variables of concern |

Theory | -An analytic framework for social theories or models can be built from the geometric or algebraic structure of factor analysis |