Multiple Regression
Dr. Andy Field

Slide 2: Aims
- Understand when to use multiple regression.
- Understand the multiple regression equation and what the betas represent.
- Understand different methods of regression: hierarchical, stepwise, forced entry.
- Understand how to do a multiple regression in PASW/SPSS.
- Understand how to interpret multiple regression.
- Understand the assumptions of multiple regression and how to test them.

Slide 3: What is Multiple Regression?
- Linear regression is a model for predicting the value of one variable from another.
- Multiple regression is a natural extension of this model: we use it to predict values of an outcome from several predictors.
- It is a hypothetical model of the relationship between several variables.

Regression: An Example
- A record company boss was interested in predicting record sales from advertising.
- Data: 200 different album releases.
- Outcome variable: sales (CDs and downloads) in the week after release.
- Predictor variables:
  - the amount (in £s) spent promoting the record before release (see last lecture);
  - number of plays on the radio (a new variable).

Slide 5: The Model with One Predictor

Slide 6: Multiple Regression as an Equation
- With multiple regression the relationship is described using a variation of the equation of a straight line:

  Y_i = b_0 + b_1*X_1i + b_2*X_2i + ... + b_n*X_ni + e_i

Slide 7: b0
- b_0 is the intercept.
- The intercept is the value of the Y variable when all Xs = 0.
- It is the point at which the regression plane crosses the Y-axis (vertical).

Slide 8: Beta Values
- b_1 is the regression coefficient for variable 1.
- b_2 is the regression coefficient for variable 2.
- b_n is the regression coefficient for the nth variable.

Slide 9: The Model with Two Predictors
[Figure: regression plane with slopes b_adverts and b_airplay and intercept b_0]

Slide 10: Methods of Regression
- Hierarchical: the experimenter decides the order in which variables are entered into the model.
- Forced entry: all predictors are entered simultaneously.
- Stepwise: predictors are selected using their semi-partial correlation with the outcome.

Slide 12: Hierarchical Regression
- Known predictors (based on past research) are entered into the regression model first.
- New predictors are then entered in a separate step/block.
- The experimenter makes the decisions.

Slide 13: Hierarchical Regression
- It is the best method:
  - it is based on theory testing;
  - you can see the unique predictive influence of a new variable on the outcome, because known predictors are held constant in the model.
- Bad point: it relies on the experimenter knowing what they're doing!

Slide 14: Forced Entry Regression
- All variables are entered into the model simultaneously.
- The results obtained depend on the variables entered into the model.
- It is important, therefore, to have good theoretical reasons for including a particular variable.

Slide 15: Stepwise Regression I
- Variables are entered into the model based on mathematical criteria.
- The computer selects variables in steps.
- Step 1: SPSS looks for the predictor that can explain the most variance in the outcome variable.

[Figure: candidate predictors of Exam Performance — Revision Time, Previous Exam, Difficulty]

Slide 18: Stepwise Regression II
- Step 2: having selected the first predictor, a second one is chosen from the remaining predictors.
- The semi-partial correlation is used as the criterion for selection.

Slide 19: Semi-Partial Correlation
- Partial correlation: measures the relationship between two variables, controlling for the effect that a third variable has on them both.
- Semi-partial correlation: measures the relationship between two variables, controlling for the effect that a third variable has on only one of the two.

Slide 20: Partial Correlation vs. Semi-Partial Correlation
[Figure: diagrams contrasting partial and semi-partial correlation]

Slide 21: Semi-Partial Correlation in Regression
- The semi-partial correlation measures the relationship between a predictor and the outcome, controlling for the relationship between that predictor and any others already in the model.
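The distinction between partial and semi-partial correlation can be made concrete with a short calculation. The sketch below uses the standard textbook formulas for both coefficients and a small made-up data set (it is not Field's album-sales data); all variable names are ours.

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    ssx = sum((x - mx) ** 2 for x in xs)
    ssy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(ssx * ssy)

def partial_r(r_yx, r_yz, r_xz):
    """Correlation of y and x with z partialled out of BOTH variables."""
    return (r_yx - r_yz * r_xz) / math.sqrt((1 - r_yz**2) * (1 - r_xz**2))

def semipartial_r(r_yx, r_yz, r_xz):
    """Correlation of y and x with z partialled out of x ONLY."""
    return (r_yx - r_yz * r_xz) / math.sqrt(1 - r_xz**2)

# Toy data (hypothetical):
y = [1, 3, 2, 5, 4]   # outcome
x = [2, 1, 4, 3, 5]   # candidate predictor
z = [1, 2, 3, 4, 5]   # predictor already in the model

r_yx, r_yz, r_xz = pearson(y, x), pearson(y, z), pearson(x, z)
sr = semipartial_r(r_yx, r_yz, r_xz)   # approx -0.567
pr = partial_r(r_yx, r_yz, r_xz)       # approx -0.944

# Why stepwise selection uses it: the SQUARED semi-partial correlation
# equals the increase in R-squared when x is added to a model that
# already contains z (R-squared for two standardised predictors below).
r2_full = (r_yx**2 + r_yz**2 - 2 * r_yx * r_yz * r_xz) / (1 - r_xz**2)
assert abs((r2_full - r_yz**2) - sr**2) < 1e-12
```

Note that the semi-partial correlation is never larger in magnitude than the partial correlation, because only one variable has had the third variable's influence removed from it.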
- It measures the unique contribution of a predictor to explaining the variance of the outcome.

Slide 23: Problems with Stepwise Methods
- They rely on a mathematical criterion: variable selection may depend upon only slight differences in the semi-partial correlation.
- These slight numerical differences can lead to major theoretical differences.
- Stepwise methods should be used only for exploration.

Slide 24: Doing Multiple Regression
[Screenshot: the PASW/SPSS linear regression dialog]

Slide 25: Doing Multiple Regression
[Screenshots: the Regression Statistics and Regression Diagnostics option dialogs]

Slide 28: Output: Model Summary
[Screenshot: SPSS model summary table]

Slide 29: R and R²
- R: the correlation between the observed values of the outcome and the values predicted by the model.
- R²: the proportion of variance accounted for by the model.
- Adjusted R²: an estimate of R² in the population (shrinkage).

Slide 30: Output: ANOVA
[Screenshot: SPSS ANOVA table]

Slide 31: Analysis of Variance: ANOVA
- The F-test looks at whether the variance explained by the model (SSM) is significantly greater than the error within the model (SSR).
- It tells us whether using the regression model is significantly better at predicting values of the outcome than using the mean.

Slide 32: Output: Betas
[Screenshot: SPSS coefficients table]

Slide 33: How to Interpret Beta Values
- Beta values: the change in the outcome associated with a unit change in the predictor.
- Standardised beta values: tell us the same, but expressed in standard deviations.
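The numbers SPSS reports in the model summary, ANOVA and coefficients tables can be reproduced by hand. The sketch below fits a two-predictor model via the normal equations and computes R², adjusted R² and the F-ratio. The data are simulated to stand in for the album-sales example (n = 8, not Field's real 200-album data set), and the variable names are ours.

```python
# Simulated stand-in for the album-sales example: sales are generated
# as 10 + 2*adverts + 3*airplay plus a little noise.
adverts = [1, 2, 3, 4, 5, 6, 7, 8]
airplay = [2, 1, 4, 3, 6, 5, 8, 7]
noise   = [0.2, -0.2, -0.2, 0.2, 0.2, -0.2, -0.2, 0.2]
sales   = [10 + 2*a + 3*p + e for a, p, e in zip(adverts, airplay, noise)]

n, k = len(sales), 2                      # cases, predictors
ma = sum(adverts) / n
mp = sum(airplay) / n
my = sum(sales) / n

# Centred sums of squares / cross-products for the normal equations.
s11 = sum((a - ma)**2 for a in adverts)
s22 = sum((p - mp)**2 for p in airplay)
s12 = sum((a - ma) * (p - mp) for a, p in zip(adverts, airplay))
s1y = sum((a - ma) * (y - my) for a, y in zip(adverts, sales))
s2y = sum((p - mp) * (y - my) for p, y in zip(airplay, sales))

det = s11 * s22 - s12**2
b1 = (s22 * s1y - s12 * s2y) / det        # slope for adverts
b2 = (s11 * s2y - s12 * s1y) / det        # slope for airplay
b0 = my - b1 * ma - b2 * mp               # intercept

predicted = [b0 + b1*a + b2*p for a, p in zip(adverts, airplay)]
ss_t = sum((y - my)**2 for y in sales)                            # SST
ss_r = sum((y - yhat)**2 for y, yhat in zip(sales, predicted))    # SSR
ss_m = ss_t - ss_r                                                # SSM

r2 = ss_m / ss_t                                   # proportion explained
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)      # shrinkage-corrected
f_ratio = (ss_m / k) / (ss_r / (n - k - 1))        # model vs. error

print(f"b0={b0:.3f}, b1={b1:.3f}, b2={b2:.3f}")
print(f"R2={r2:.4f}, adj R2={adj_r2:.4f}, F={f_ratio:.1f}")
```

With so little noise the fit recovers the generating coefficients (b0 near 10, b1 near 2, b2 near 3) and a large F-ratio, illustrating the slide's point: F compares the variance explained by the model (SSM) against the error within it (SSR).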