Friday, April 9, 2010.

Outline: 1. Collect hw. 2. Odds. 3. Likelihood and logistic regression in R. 4. Testing and deviance. 5. Residuals in logistic regression.

1. Hand in hw.

2. Odds. The odds of, or odds in favor of, an event = p/(1-p). The odds against are (1-p)/p. See p. 266. Probability and logistic regression, p. 268. Instead of a one-unit increase in x_i increasing the probability by beta_i, an increase in x_i by 1 increases the log odds of y by beta_i, i.e. it multiplies the odds by exp(beta_i). Interpretation of logistic regression, p. 268.

3. Likelihood and logistic regression in R. The parameters beta = (beta_0, beta_1, ..., beta_p) are fit by maximum likelihood. See p. 269. In R, you can use glm(y ~ x, family = binomial).

4. Testing and deviance. Z-test, p. 270. Deviance is a measure of lack of fit. See p. 271 and p. 272. The deviance should be approximately chi-square with n-(p+1) degrees of freedom. The residual deviance compares the fitted model with a saturated model, i.e. a model with n parameters; significance means that the saturated model fits significantly better. We can also compare with a null model, which has only the intercept beta_0. The difference in deviances is a chi-square random variable with (n-1) - (n-(p+1)) = p degrees of freedom; significance implies that the model under consideration fits significantly better than the null model. R^2, p. 273. Chi-square, p. 274. Chi-square and deviance don't work well with 0-1 responses, i.e. when m_i = 1. See p. 278-280.

5. Residuals in logistic regression. Response residuals, p. 274. Pearson residuals, p. 275. Standardized Pearson residuals, p. 275. Deviance residuals, p. 275. Standardized deviance residuals, p. 275. Look for outliers and patterns in the residuals. Example, p. 276.
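The odds and log-odds relationships above can be checked numerically. A small sketch (the probability 0.75 and the coefficient b1 = 0.4 are made-up values for illustration):

```python
import math

p = 0.75
odds = p / (1 - p)           # odds in favor: 0.75/0.25 = 3.0
odds_against = (1 - p) / p   # reciprocal of the odds in favor

# In logistic regression, log(odds) = b0 + b1*x, so a one-unit increase
# in x adds b1 to the log odds, i.e. multiplies the odds by exp(b1).
b1 = 0.4  # hypothetical coefficient
new_odds = odds * math.exp(b1)
```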
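The notes point to R's glm(y ~ x, family = binomial) for the maximum-likelihood fit. To illustrate what that fit computes, here is a pure-Python Newton-Raphson sketch for a single predictor, along with the residual and null deviances; the data are made up, and (as the notes warn) with m_i = 1 the chi-square approximation to the deviance itself is poor, so this only shows the computation:

```python
import math

def fit_logistic(x, y, iters=25):
    """Fit P(y=1 | x) = 1/(1 + exp(-(b0 + b1*x))) by Newton-Raphson MLE."""
    b0 = b1 = 0.0
    for _ in range(iters):
        p = [1 / (1 + math.exp(-(b0 + b1 * xi))) for xi in x]
        # Score (gradient of the log-likelihood).
        g0 = sum(yi - pi for yi, pi in zip(y, p))
        g1 = sum((yi - pi) * xi for yi, pi, xi in zip(y, p, x))
        # Observed information (negative Hessian), a 2x2 matrix.
        w = [pi * (1 - pi) for pi in p]
        h00 = sum(w)
        h01 = sum(wi * xi for wi, xi in zip(w, x))
        h11 = sum(wi * xi * xi for wi, xi in zip(w, x))
        det = h00 * h11 - h01 * h01
        # Newton step: beta <- beta + information^{-1} * score.
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

def deviance(y, p):
    """-2 log-likelihood for 0/1 responses (m_i = 1)."""
    return -2 * sum(yi * math.log(pi) + (1 - yi) * math.log(1 - pi)
                    for yi, pi in zip(y, p))

# Made-up data for illustration.
x = [0, 1, 2, 3, 4, 5, 6, 7]
y = [0, 0, 1, 0, 1, 1, 0, 1]
b0, b1 = fit_logistic(x, y)
p_hat = [1 / (1 + math.exp(-(b0 + b1 * xi))) for xi in x]

resid_dev = deviance(y, p_hat)  # compares with the saturated model; df = n-(p+1) = 6
null_dev = deviance(y, [sum(y) / len(y)] * len(y))  # intercept-only model; df = n-1 = 7
# null_dev - resid_dev is approximately chi-square with p = 1 df under the null.
```

In R, summary(glm(y ~ x, family = binomial)) reports these same two quantities as the residual deviance and null deviance.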
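The residual types listed in section 5 can be sketched directly from their definitions. A minimal illustration, assuming made-up 0/1 responses y and fitted probabilities p_hat from some logistic fit:

```python
import math

# Hypothetical responses and fitted probabilities (not from real data).
y = [1, 0, 1, 1, 0]
p_hat = [0.8, 0.3, 0.6, 0.9, 0.2]

# Response residuals: y_i - p_hat_i.
response = [yi - pi for yi, pi in zip(y, p_hat)]

# Pearson residuals: (y_i - p_hat_i) / sqrt(p_hat_i * (1 - p_hat_i)).
pearson = [(yi - pi) / math.sqrt(pi * (1 - pi)) for yi, pi in zip(y, p_hat)]

# Deviance residuals: sign(y_i - p_hat_i) * sqrt(d_i), where d_i is the
# observation's contribution to the deviance, -2[y log p + (1-y) log(1-p)].
def dev_resid(yi, pi):
    d = -2 * (yi * math.log(pi) + (1 - yi) * math.log(1 - pi))
    return math.copysign(math.sqrt(d), yi - pi)

dev_residuals = [dev_resid(yi, pi) for yi, pi in zip(y, p_hat)]
```

The squared Pearson residuals sum to the Pearson chi-square statistic, and the squared deviance residuals sum to the deviance; the standardized versions additionally divide by sqrt(1 - h_i) using the leverages, which this sketch omits.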