Linear regression analysis : theory and computing
- Author
- Additional Author(s)
-
- Publisher
- Singapore: World Scientific Publishing Co. Pte. Ltd., 2009
- Language
- English
- ISBN
- 9789812834119
- Series
-
- Subject(s)
-
- Notes
- Includes bibliographical references (p. 317-324) and index.
- Abstract
- This volume presents in detail the fundamental theories of linear regression analysis and diagnosis, as well as the relevant statistical computing techniques, so that readers are able to model data using the methods and techniques described in the book. It covers the fundamental theory of linear regression analysis and serves as a useful reference for further research in this area. Examples of regression analysis using the Statistical Analysis System (SAS) are also included. The book is suitable for graduate students who are either majoring in statistics/biostatistics or using linear regression analysis substantially in their subject fields.
Physical Dimension
- Number of Page(s)
- 1 online resource (xix, 328 p.)
- Dimension
- -
- Other Desc.
- ill. (some col.)
Summary / Review / Table of Contents
1. Introduction.
1.1. Regression model.
1.2. Goals of regression analysis.
1.3. Statistical computing in regression analysis.
2. Simple linear regression.
2.1. Introduction.
2.2. Least squares estimation.
2.3. Statistical properties of the least squares estimation.
2.4. Maximum likelihood estimation.
2.5. Confidence interval on regression mean and regression prediction.
2.6. Statistical inference on regression parameters.
2.7. Residual analysis and model diagnosis.
2.8. Example.
3. Multiple linear regression.
3.1. Vector space and projection.
3.2. Matrix form of multiple linear regression.
3.3. Quadratic form of random variables.
3.4. Idempotent matrices.
3.5. Multivariate normal distribution.
3.6. Quadratic form of the multivariate normal variables.
3.7. Least squares estimates of the multiple regression parameters.
3.8. Matrix form of the simple linear regression.
3.9. Test for full model and reduced model.
3.10. Test for general linear hypothesis.
3.11. The least squares estimates of multiple regression parameters under linear restrictions.
3.12. Confidence intervals of mean and prediction in multiple regression.
3.13. Simultaneous test for regression parameters.
3.14. Bonferroni confidence region for regression parameters.
3.15. Interaction and confounding.
3.16. Regression with dummy variables.
3.17. Collinearity in multiple linear regression.
3.18. Linear model in centered form.
3.19. Numerical computation of LSE via QR decomposition.
3.20. Analysis of regression residual.
3.21. Check for normality of the error term in multiple regression.
3.22. Example.
4. Detection of outliers and influential observations in multiple linear regression.
4.1. Model diagnosis for multiple linear regression.
4.2. Detection of outliers in multiple linear regression.
4.3. Detection of influential observations in multiple linear regression.
4.4. Test for mean-shift outliers.
4.5. Graphical display of regression diagnosis.
4.6. Test for influential observations.
4.7. Example.
5. Model selection.
5.1. Effect of underfitting and overfitting.
5.2. All possible regressions.
5.3. Stepwise selection.
5.4. Examples.
5.5. Other related issues.
6. Model diagnostics.
6.1. Test heteroscedasticity.
6.2. Detection of regression functional form.
7. Extensions of least squares.
7.1. Non-full-rank linear regression models.
7.2. Generalized least squares.
7.3. Ridge regression and LASSO.
7.4. Parametric nonlinear regression.
8. Generalized linear models.
8.1. Introduction: a motivating example.
8.2. Components of GLM.
8.3. Maximum likelihood estimation of GLM.
8.4. Statistical inference and other issues in GLM.
8.5. Logistic regression for binary data.
8.6. Poisson regression for count data.
9. Bayesian linear regression.
9.1. Bayesian linear models.
9.2. Bayesian model averaging.
Exemplar(s)
# | Accession No. | Call Number | Location | Status
1. | 00316/19 | 519.536 Yan L | Online | Available