
Modern regression methods. 2nd ed. (English) Zbl 1166.62049

Wiley Series in Probability and Statistics. Hoboken, NJ: John Wiley & Sons (ISBN 978-0-470-08186-0/hbk). xvii, 642 p. (2009).
[For the review of the first edition from 1997 see Zbl 0885.62074.]
This new edition has been updated to reflect recent advances and research in the evolving field of regression analysis. It covers diagnostics, transformations, linear regression, logistic regression, nonparametric regression, robust regression, ridge regression, nonlinear regression, and experimental designs for regression.
The book comprises 16 chapters. Chapter 1 introduces the fundamentals of simple linear regression. Chapter 2 completes this introduction by discussing diagnostic procedures and additional plots. Chapter 3 covers simple linear regression in matrix notation. Chapter 4 introduces the fundamental concepts of multiple linear regression, including orthogonal and correlated regressors, multicollinearity, signs of regression coefficients, and centering and scaling. Chapter 5 discusses how to use various types of regression plots for detecting influential observations and for detecting the possible need to transform one or more regressors; emphasis is given to partial residual plots, partial regression plots, and variations of each. Chapter 6 examines commonly used methods for choosing a transformation of the dependent variable and/or the regressors. Chapter 7 discusses various methods for selecting a subset of regression variables, and Chapter 8 considers issues in the use of polynomial terms and indicates when trigonometric terms are likely to be of value.
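To illustrate the simple linear regression of Chapters 1-3, the following minimal sketch (not taken from the book; the data and function name are hypothetical) computes the least-squares intercept and slope directly from the usual closed-form formulas:

```python
# Least-squares fit for simple linear regression y = b0 + b1*x.
# Illustrative sketch only; data and names are hypothetical, not from the book.

def fit_simple_linear(x, y):
    """Return (intercept, slope) minimizing the sum of squared residuals."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    sxx = sum((xi - x_bar) ** 2 for xi in x)                 # corrected sum of squares
    sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = y_bar - slope * x_bar
    return intercept, slope

# Data lying exactly on y = 1 + 2x, so the fit recovers that line.
b0, b1 = fit_simple_linear([0, 1, 2, 3], [1, 3, 5, 7])
```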
Chapter 9 contains a survey of the state of the art of many topics in logistic regression. Chapter 10 presents several nonparametric regression techniques, ranging from monotonic regression to local regression. Chapter 11 is devoted to robust regression. Ridge regression is discussed in Chapter 12. Chapter 13 provides an introduction to nonlinear regression, with two examples used to illustrate basic concepts. Chapter 14 focuses on the statistical aspects of experimental designs for regression. Chapter 15 covers regression methods not treated in the preceding chapters, including quantile regression and Poisson regression. Chapter 16 presents analyses of five challenging data sets.
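As a small illustration of the ridge regression of Chapter 12, in the one-predictor case with centered data the ridge estimate is \(\hat\beta_k = S_{xy}/(S_{xx} + k)\), which shrinks the least-squares slope toward zero as the ridge parameter \(k\) grows. A minimal sketch (the data and function name are hypothetical, not from the book):

```python
# One-predictor ridge regression on centered data: slope = Sxy / (Sxx + k).
# Illustrative sketch only; data and names are hypothetical, not from the book.

def ridge_slope(x, y, k):
    """Ridge slope estimate for centered x and y; k = 0 gives least squares."""
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    return sxy / (sxx + k)

x = [-1.5, -0.5, 0.5, 1.5]      # centered regressor
y = [-3.0, -1.0, 1.0, 3.0]      # centered response with exact slope 2
ols = ridge_slope(x, y, 0.0)    # k = 0 recovers the least-squares slope
shrunk = ridge_slope(x, y, 5.0) # a positive k shrinks the estimate
```

Increasing \(k\) trades a little bias for a reduction in variance, which is the motivation for ridge regression under multicollinearity.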
Each chapter contains a brief section for SAS, Minitab, SPSS, BMDP, and Systat users, though no particular software package is emphasized overall. In addition, each chapter ends with a summary, references, and exercises, and some chapters have their own appendices. Answers to selected exercises are provided, and some statistical tables are given at the end of the book.
This book can serve as a textbook for undergraduate and graduate courses on regression models. It can also be used as a reference book for faculty and professionals.

MSC:

62Jxx Linear inference, regression
62G08 Nonparametric regression and quantile regression
62K99 Design of statistical experiments
62-02 Research exposition (monographs, survey articles) pertaining to statistics
62-01 Introductory exposition (textbooks, tutorial papers, etc.) pertaining to statistics

Keywords:

analysis of variance; simple linear regression; confidence intervals; inverse regression; constant variance; autocorrelation; Bayesian regression; bent-cable regression; bootstrapping; calibration; Cochran’s theorem; constrained regression; constructed variables; correlations; partial correlation coefficients; Cytel Software Corporation; error-in-variables regression; mean squared errors; unbiased estimation; experimental designs; inverse projection approach; logistic regression; nonlinear regression; Bayesian D-optimal; polynomial regression; optimal designs; outliers; Palm Beach County data; partial-F test; augmented partial residuals; CORR; CERES; partial regression; partial residuals; Poisson regression; zero-inflated; prediction; mean squared error of prediction; prediction intervals; probit regression; multiple regression; Durbin-Watson test; correlated errors; standardized deletions; supernormality property; simulation envelopes; least squares estimators; regression with life data; ridge regression; robust regression; breakdown points; FAST-LTS; PROGRESS; efficiency; bounded influence; Welsch procedure; exact fit; GM-estimator; least median of squares; least trimmed squares; M-estimators; MM-estimators; S-estimators; least absolute value regression; two-stage procedures; multistage procedures; scaling regression data; semiparametric regression; software for regression; BMDP; LogXact; Minitab; R; S-Plus; SAS; SPSS; STATA; SYSTAT; strategy for analyzing regression data; t-test; transformations; ACE; AVAS; Box-Cox transformation; Box-Tidwell transformation; variance proportions; Mallows’s \(C_p\); forward selection; backward elimination; stepwise regression; polynomial terms; trigonometric terms; weighted least squares; residual plots; non-normality; influence measures; measurement errors; matrix algebra; orthogonal regressors; correlated regressors; multicollinearity; robust variable selection; polynomials; maximum likelihood; deviance; Wald test; score tests;
likelihood ratio test; goodness-of-fit tests; Pearson residuals; deviance residuals; monotone regression; smoother; kernel regression; local regression; spline; nonparametric regression; lack-of-fit tests; model misspecification; quantile regression; water quality data; Scottish Hill races data; leukemia data; dosage response data

Citations:

Zbl 0885.62074