Applied Regression Analysis: A Research Tool

Free Download

Authors: John O. Rawlings, Sastry G. Pantula, David A. Dickey

Edition: 2nd

Series: Springer Texts in Statistics

ISBN: 0387984542, 9780387984544

Size: 7 MB (6934952 bytes)

Pages: 678/678

Least squares estimation, when used appropriately, is a powerful research tool, and a deeper understanding of regression concepts is essential for getting the most out of a least squares analysis. This book builds on the fundamentals of statistical methods and presents the concepts a scientist needs to use least squares effectively. It is aimed at the scientist who wishes to gain a working knowledge of regression analysis, and its basic purpose is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. The book is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. It serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers, and it also provides a bridge between a two-semester introduction to statistical methods and a theoretical linear models course.

The emphasis is on concepts and the analysis of data sets. The book reviews the key ideas of simple linear regression, matrix operations, and multiple regression. Methods and criteria for selecting regression variables and geometric interpretations of least squares are discussed. Polynomial, trigonometric, analysis of variance, nonlinear, time series, logistic, random effects, and mixed effects models are also covered. Detailed case studies and exercises based on real data sets reinforce the concepts.
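As a rough illustration of the least squares machinery the book develops (see Chapters 1 through 3 in the contents below), the following minimal sketch, which is not an excerpt from the book, fits a simple linear regression by solving the normal equations (X'X)b = X'y; NumPy and the small made-up data set are assumptions of this listing, not material from the text.

# Illustrative sketch: ordinary least squares for y = b0 + b1*x + e,
# solved via the normal equations (X'X) b = X'y.
import numpy as np

# Small made-up data set, purely for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

X = np.column_stack([np.ones_like(x), x])   # design matrix with intercept column
beta = np.linalg.solve(X.T @ X, X.T @ y)    # least squares estimates (b0, b1)
residuals = y - X @ beta
sse = residuals @ residuals                 # residual sum of squares

print("intercept, slope:", beta)
print("SSE:", sse)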

Table of contents:
PREFACE……Page 8
CONTENTS……Page 14
1 REVIEW OF SIMPLE REGRESSION……Page 20
1.1 The Linear Model and Assumptions……Page 21
1.2 Least Squares Estimation……Page 22
1.3 Predicted Values and Residuals……Page 25
1.4 Analysis of Variation in the Dependent Variable……Page 26
1.5 Precision of Estimates……Page 30
1.6 Tests of Significance and Confidence Intervals……Page 35
1.7 Regression Through the Origin……Page 40
1.8 Models with Several Independent Variables……Page 46
1.9 Violation of Assumptions……Page 47
1.10 Summary……Page 48
1.11 Exercises……Page 49
2.1 Basic Definitions……Page 56
2.2 Special Types of Matrices……Page 58
2.3 Matrix Operations……Page 59
2.4 Geometric Interpretations of Vectors……Page 65
2.5 Linear Equations and Solutions……Page 69
2.6 Orthogonal Transformations and Projections……Page 73
2.7 Eigenvalues and Eigenvectors……Page 76
2.8 Singular Value Decomposition……Page 79
2.10 Exercises……Page 87
3.1 The Model……Page 94
3.2 The Normal Equations and Their Solution……Page 97
3.3 The Y and Residuals Vectors……Page 99
3.4 Properties of Linear Functions of Random Vectors……Page 101
3.5 Properties of Regression Estimates……Page 106
3.6 Summary of Matrix Formulae……Page 111
3.7 Exercises……Page 112
4 ANALYSIS OF VARIANCE AND QUADRATIC FORMS……Page 120
4.1 Introduction to Quadratic Forms……Page 121
4.2 Analysis of Variance……Page 126
4.3 Expectations of Quadratic Forms……Page 132
4.4 Distribution of Quadratic Forms……Page 134
4.5.1 The General Linear Hypothesis……Page 138
4.5.2 Special Cases of the General Form……Page 140
4.5.3 A Numerical Example……Page 141
4.5.4 Computing Q from Differences in Sums of Squares……Page 145
4.5.5 The R-Notation to Label Sums of Squares……Page 148
4.5.6 Example: Sequential and Partial Sums of Squares……Page 152
4.6.1 Univariate Confidence Intervals……Page 154
4.6.2 Simultaneous Confidence Statements……Page 156
4.6.3 Joint Confidence Regions……Page 158
4.7 Estimation of Pure Error……Page 162
4.8 Exercises……Page 168
5.1 Spartina Biomass Production in the Cape Fear Estuary……Page 180
5.2 Regression Analysis for the Full Model……Page 181
5.2.1 The Correlation Matrix……Page 183
5.2.2 Multiple Regression Results: Full Model……Page 184
5.3 Simplifying the Model……Page 186
5.4 Results of the Final Model……Page 189
5.5 General Comments……Page 196
5.6 Exercises……Page 198
6 GEOMETRY OF LEAST SQUARES……Page 202
6.1 Linear Model and Solution……Page 203
6.2 Sums of Squares and Degrees of Freedom……Page 208
6.3 Reparameterization……Page 211
6.4 Sequential Regressions……Page 215
6.5 The Collinearity Problem……Page 216
6.7 Exercises……Page 220
7 MODEL DEVELOPMENT: VARIABLE SELECTION……Page 224
7.1 Uses of the Regression Equation……Page 225
7.2 Effects of Variable Selection on Least Squares……Page 227
7.3 All Possible Regressions……Page 229
7.4 Stepwise Regression Methods……Page 232
7.5.1 Coefficient of Determination……Page 239
7.5.3 Adjusted Coefficient of Determination……Page 241
7.5.4 Mallows’ C_p Statistic……Page 242
7.5.5 Information Criteria: AIC and SBC……Page 244
7.5.6 “Significance Levels” for Choice of Subset Size……Page 245
7.6 Model Validation……Page 247
7.7 Exercises……Page 250
8 POLYNOMIAL REGRESSION……Page 254
8.1 Polynomials in One Variable……Page 255
8.2 Trigonometric Regression Models……Page 264
8.3.1 Considerations in Specifying the Functional Form……Page 268
8.3.2 Polynomial Response Models……Page 269
8.4 Exercises……Page 281
9 CLASS VARIABLES IN REGRESSION……Page 288
9.1 Description of Class Variables……Page 289
9.2 The Model for One-Way Structured Data……Page 290
9.3 Reparameterizing to Remove Singularities……Page 292
9.3.1 Reparameterizing with the Means Model……Page 293
9.3.2 Reparameterization Motivated by Στ_i = 0……Page 296
9.3.3 Reparameterization Motivated by τ_t = 0……Page 298
9.3.4 Reparameterization: A Numerical Example……Page 299
9.4 Generalized Inverse Approach……Page 301
9.5 The Model for Two-Way Classified Data……Page 303
9.6 Class Variables to Test Homogeneity of Regressions……Page 307
9.7 Analysis of Covariance……Page 313
9.8 Numerical Examples……Page 319
9.8.1 Analysis of Variance……Page 320
9.8.2 Test of Homogeneity of Regression Coefficients……Page 325
9.8.3 Analysis of Covariance……Page 326
9.9 Exercises……Page 335
10 PROBLEM AREAS IN LEAST SQUARES……Page 344
10.1 Nonnormality……Page 345
10.2 Heterogeneous Variances……Page 347
10.3 Correlated Errors……Page 348
10.4 Influential Data Points and Outliers……Page 349
10.5 Model Inadequacies……Page 351
10.6 The Collinearity Problem……Page 352
10.7 Errors in the Independent Variables……Page 353
10.9 Exercises……Page 358
11 REGRESSION DIAGNOSTICS……Page 360
11.1 Residuals Analysis……Page 361
11.1.1 Plot of e Versus Y……Page 365
11.1.2 Plots of e Versus X_i……Page 369
11.1.3 Plots of e Versus Time……Page 370
11.1.4 Plots of e_i Versus e_(i-1)……Page 373
11.1.5 Normal Probability Plots……Page 375
11.1.6 Partial Regression Leverage Plots……Page 378
11.2 Influence Statistics……Page 380
11.2.1 Cook’s D……Page 381
11.2.2 DFFITS……Page 382
11.2.4 COVRATIO……Page 383
11.2.5 Summary of Influence Measures……Page 386
11.3 Collinearity Diagnostics……Page 388
11.3.1 Condition Number and Condition Index……Page 390
11.3.2 Variance Inflation Factor……Page 391
11.3.3 Variance Decomposition Proportions……Page 392
11.4 Regression Diagnostics on the Linthurst Data……Page 396
11.4.1 Plots of Residuals……Page 397
11.4.2 Influence Statistics……Page 407
11.4.3 Collinearity Diagnostics……Page 410
11.5 Exercises……Page 411
12.1 Reasons for Making Transformations……Page 416
12.2 Transformations to Simplify Relationships……Page 418
12.3 Transformations to Stabilize Variances……Page 426
12.4 Transformations to Improve Normality……Page 428
12.5 Generalized Least Squares……Page 430
12.5.1 Weighted Least Squares……Page 433
12.5.2 Generalized Least Squares……Page 436
12.6 Summary……Page 445
12.7 Exercises……Page 446
13 COLLINEARITY……Page 452
13.1 Understanding the Structure of the X-Space……Page 454
13.2.1 Explanation……Page 462
13.2.2 Principal Component Regression……Page 465
13.3 General Comments on Collinearity……Page 476
13.5 Exercises……Page 478
14.1 The Problem……Page 482
14.2 Multiple Regression: Ordinary Least Squares……Page 486
14.3 Analysis of the Correlational Structure……Page 490
14.4 Principal Component Regression……Page 498
14.5 Summary……Page 501
14.6 Exercises……Page 502
15 MODELS NONLINEAR IN THE PARAMETERS……Page 504
15.1 Examples of Nonlinear Models……Page 505
15.2 Fitting Models Nonlinear in the Parameters……Page 513
15.3 Inference in Nonlinear Models……Page 517
15.4.1 Heteroscedastic Errors……Page 526
15.5 Logistic Regression……Page 528
15.6 Exercises……Page 530
16 CASE STUDY: RESPONSE CURVE MODELING……Page 534
16.1 The Ozone–Sulfur Dioxide Response Surface (1981)……Page 536
16.1.1 Polynomial Response Model……Page 539
16.1.2 Nonlinear Weibull Response Model……Page 543
16.2 Analysis of the Combined Soybean Data……Page 549
16.3 Exercises……Page 562
17 ANALYSIS OF UNBALANCED DATA……Page 564
17.1 Sources of Imbalance……Page 565
17.2 Effects of Imbalance……Page 566
17.3 Analysis of Cell Means……Page 568
17.4 Linear Models for Unbalanced Data……Page 572
17.4.1 Estimable Functions with Balanced Data……Page 573
17.4.2 Estimable Functions with Unbalanced Data……Page 577
17.4.3 Least Squares Means……Page 583
17.5 Exercises……Page 587
18 MIXED EFFECTS MODELS……Page 592
18.1 Random Effects Models……Page 593
18.2 Fixed and Random Effects……Page 598
18.3 Random Coefficient Regression Models……Page 603
18.4 General Mixed Linear Models……Page 605
18.5 Exercises……Page 608
19 CASE STUDY: ANALYSIS OF UNBALANCED DATA……Page 612
19.1 The Analysis of Variance……Page 615
19.2 Mean Square Expectations and Choice of Errors……Page 626
19.3 Least Squares Means and Standard Errors……Page 629
19.4 Mixed Model Analysis……Page 634
19.5 Exercises……Page 637
A APPENDIX TABLES……Page 640
REFERENCES……Page 654
INDEX……Page 666
