Mathematical Statistics: A Unified Introduction

Authors: George R. Terrell

Edition: 1st

Series: Springer Texts in Statistics

ISBN: 9781441931412, 1441931414

Size: 2 MB (2,483,720 bytes)

Pages: 478

George R. Terrell

This textbook introduces the mathematical concepts and methods that underlie statistics. The course is unified in the sense that no prior knowledge of probability theory is assumed; it is developed as needed. The book is committed to both a high level of mathematical seriousness and an intimate connection with application. Modern methods, such as logistic regression, are introduced, as are unjustly neglected classical topics, such as elementary asymptotics. The book first develops elementary linear models for measured data and multiplicative models for counted data. Simple probability models for random error follow. The most important families of random variables are then studied in detail, emphasizing their interrelationships and their large-sample behavior. Inference, including classical, Bayesian, finite-population, and likelihood-based approaches, is introduced as the necessary mathematical tools become available.

In teaching style, the book aims to be:

* mathematically complete: every formula is derived and every theorem is proved at the appropriate level;
* concrete: each new concept is introduced and exemplified by interesting statistical problems, and more abstract concepts appear only gradually;
* constructive: direct derivations and proofs are preferred;
* active: students are led to do mathematical statistics, not just to appreciate it, with the assistance of 500 interesting exercises.

The text is aimed at the upper undergraduate or beginning master's level. It assumes the usual two-year college mathematics sequence, including an introduction to multiple integrals, matrix algebra, and infinite series. George R. Terrell received his degrees from Rice University, where he later taught.

Table of contents :
Contents……Page 14
1.1 Introduction……Page 33
1.2.1 Plotting Data……Page 34
1.2.2 Location Models……Page 35
1.3.1 Data from Several Treatments……Page 36
1.3.2 Centered Models……Page 38
1.3.3 Degrees of Freedom……Page 39
1.4.1 Cross-Classified Observations……Page 40
1.4.2 Additive Models……Page 42
1.4.3 Balanced Designs……Page 43
1.4.4 Interaction……Page 45
1.4.5 Centering Full Models……Page 46
1.5.1 Interpolating Between Levels……Page 47
1.5.2 Simple Linear Regression……Page 49
1.6.1 Double Interpolation……Page 51
1.6.2 Multiple Linear Regression……Page 53
1.7.1 Counted Data……Page 54
1.7.2 Independence Models……Page 56
1.7.3 Loglinear Models……Page 57
1.7.4 Loglinear Independence Models……Page 58
1.7.5 Loglinear Saturated Models*……Page 60
1.8.1 Interpolating in Contingency Tables……Page 61
1.8.2 Linear Logistic Regression……Page 63
1.9 Summary……Page 64
1.10 Exercises……Page 65
1.11 Supplementary Exercises……Page 69
2.1 Introduction……Page 75
2.2.1 Multiple Observations as Vectors……Page 76
2.2.2 Distances as Errors……Page 78
2.3.1 Simple Proportion Models……Page 79
2.3.2 Estimating the Constant……Page 81
2.3.3 Solving the Problem Using Matrix Notation……Page 83
2.3.4 Geometric Degrees of Freedom……Page 84
2.3.5 Schwarz’s Inequality……Page 85
2.4.1 Least-Squares Location Estimation……Page 86
2.4.2 Sample Variance……Page 87
2.5.1 Analysis of Variance……Page 88
2.5.2 Geometric Interpretation……Page 90
2.5.3 ANOVA Tables……Page 92
2.5.4 The F-Statistic……Page 93
2.5.5 The Kruskal–Wallis Statistic……Page 95
2.6.1 Estimates for Simple Linear Regression……Page 96
2.6.2 ANOVA for Regression……Page 98
2.7.1 Standardizing the Regression Line……Page 99
2.7.2 Properties of the Sample Correlation……Page 100
2.8.1 ANOVA for Two-Way Layouts……Page 102
2.8.2 Additive Models……Page 104
2.10 Exercises……Page 106
2.11 Supplementary Exercises……Page 109
3.1 Introduction……Page 113
3.2.1 What Is Probability?……Page 114
3.2.2 Probabilities by Counting……Page 115
3.3.1 Basic Rules for Counting……Page 117
3.3.2 Counting Lists……Page 118
3.3.3 Combinations……Page 120
3.3.4 Multinomial Counting……Page 121
3.4.1 Complicated Counts……Page 122
3.4.2 The Birthday Problem……Page 123
3.4.3 General Principles About Probability……Page 124
3.5.1 An Upper Bound……Page 126
3.5.2 A Lower Bound……Page 128
3.5.3 A Useful Approximation……Page 129
3.6 Sampling……Page 130
3.8 Exercises……Page 131
3.9 Supplementary Exercises……Page 134
4.1 Introduction……Page 139
4.2.1 Uniform Geometric Probability……Page 140
4.2.2 General Properties……Page 142
4.3.2 Rules for Combining Events……Page 143
4.4.1 In General……Page 144
4.4.2 Axioms of Probability……Page 145
4.4.3 Consequences of the Axioms……Page 146
4.5.1 Definition……Page 147
4.5.2 Examples……Page 148
4.6.1 Partitions……Page 149
4.6.2 Division into Cases……Page 150
4.6.3 Bayes’s Theorem……Page 152
4.6.4 Bayes’s Theorem Applied to Partitions……Page 153
4.7.1 Irrelevant Conditions……Page 154
4.7.3 Near-Independence……Page 155
4.8.1 Probability Density……Page 156
4.8.2 Sigma Algebras and Borel Algebras*……Page 159
4.8.3 Kolmogorov’s Axiom*……Page 161
4.10 Exercises……Page 164
4.11 Supplementary Exercises……Page 166
5.1 Introduction……Page 169
5.2.1 Some Simple Examples……Page 170
5.2.2 Discrete Random Variables……Page 171
5.2.3 The Negative Hypergeometric Family……Page 172
5.3.1 The Hypergeometric Family……Page 174
5.3.3 Fisher’s Test for Independence……Page 176
5.3.5 The Sign Test……Page 178
5.4.1 Some Properties……Page 179
5.4.2 Continuous Variables……Page 180
5.4.3 Symmetry and Duality……Page 182
5.5.1 Average Values……Page 184
5.5.2 Discrete Random Variables……Page 185
5.5.3 The Method of Indicators……Page 186
5.6.2 Compatibility with the Data……Page 188
5.7 Summary……Page 190
5.8 Exercises……Page 191
5.9 Supplementary Exercises……Page 195
6.1 Introduction……Page 199
6.2.1 The Geometric Approximation……Page 200
6.2.3 Negative Binomial Approximations……Page 201
6.2.4 Negative Binomial Variables……Page 202
6.2.5 Convergence in Distribution……Page 203
6.3.1 Binomial Approximations……Page 204
6.3.2 Binomial Random Variables……Page 205
6.3.3 Bernoulli Processes……Page 207
6.4.1 Poisson Approximation to Binomial Probabilities……Page 208
6.4.2 Approximation to the Negative Binomial……Page 209
6.4.3 Poisson Random Variables……Page 210
6.5 More About Expectation……Page 211
6.6.1 Expectations of Functions……Page 214
6.6.2 Variance……Page 216
6.6.3 Variances of Some Families……Page 217
6.7.1 Estimating Binomial p……Page 219
6.7.2 Confidence Bounds for Binomial p……Page 220
6.7.3 Confidence Intervals……Page 221
6.7.4 Two-Sided Hypothesis Tests……Page 222
6.8 The Poisson Limit of the Negative Hypergeometric Family*……Page 223
6.9 Summary……Page 225
6.10 Exercises……Page 226
6.11 Supplementary Exercises……Page 230
7.1 Introduction……Page 233
7.2.1 Multinomial Random Vectors……Page 234
7.2.2 Marginal and Conditional Distributions……Page 235
7.3.1 Random Coordinates……Page 238
7.3.2 Multivariate Cumulative Distribution Functions……Page 240
7.4.1 Independence and Random Samples……Page 242
7.4.2 Sums of Random Vectors……Page 243
7.4.3 Convolutions……Page 244
7.5.2 Conditional Expectations……Page 245
7.5.3 Regression……Page 246
7.5.4 Linear Regression……Page 247
7.5.5 Covariance……Page 249
7.5.6 The Correlation Coefficient……Page 250
7.6.1 Expectations and Variances……Page 251
7.6.2 The Covariance Matrix……Page 252
7.6.4 Statistical Properties of Sample Means and Variances……Page 253
7.6.5 The Method of Indicators……Page 255
7.7.2 Markov’s Inequality……Page 257
7.7.3 Convergence in Mean Squared Error……Page 258
7.8.1 Parameters in Models as Random Variables……Page 259
7.8.2 An Example of Bayesian Inference……Page 260
7.9 Summary……Page 261
7.10 Exercises……Page 262
7.11 Supplementary Exercises……Page 266
8.1 Introduction……Page 269
8.2.1 Posterior Probability of a Parameter Value……Page 270
8.2.2 Maximum Likelihood……Page 271
8.3.1 Ratio of the Maximum Likelihood to a Hypothetical Likelihood……Page 273
8.3.2 G-Squared……Page 274
8.4.1 Chi-Squared……Page 275
8.4.2 Comparing the Two Statistics……Page 276
8.4.4 Multinomial Models……Page 277
8.5.1 Conditions for a Maximum……Page 278
8.5.2 Proportional Fitting……Page 280
8.5.3 Iterative Proportional Fitting*……Page 281
8.5.4 Why Does It Work?*……Page 284
8.6.1 Relative G-Squared……Page 285
8.6.2 An ANOVA-like Table……Page 286
8.7.2 General Logistic Regression……Page 288
8.8.1 Linear Approximation to a Root……Page 290
8.8.2 Dose–Response with Historical Controls……Page 291
8.9 Summary……Page 292
8.10 Exercises……Page 293
8.11 Supplementary Exercises……Page 295
9.1 Introduction……Page 299
9.2.2 Continuous Variables……Page 300
9.3.1 How Would It Look?……Page 301
9.3.2 How to Construct a Poisson Process……Page 302
9.3.3 Spacings Between Events……Page 304
9.3.4 Gamma Variables……Page 305
9.3.5 Poisson Process as the Limit of a Hypergeometric Process*……Page 306
9.4.1 Transforming Variables……Page 308
9.4.2 Gamma Densities……Page 309
9.4.3 General Properties……Page 310
9.4.4 Interpretation……Page 312
9.5.1 Order Statistics……Page 315
9.5.2 Dirichlet Processes……Page 316
9.5.3 Beta Variables……Page 317
9.5.4 Beta Densities……Page 319
9.5.5 Connections……Page 320
9.6.1 Hypothesis Tests and Parameter Estimates……Page 322
9.6.2 Confidence Intervals……Page 323
9.6.3 Inferences About the Shape Parameter……Page 324
9.7.1 Alternative Hypotheses……Page 325
9.7.2 Most Powerful Tests……Page 326
9.8 Summary……Page 328
9.9 Exercises……Page 329
9.10 Supplementary Exercises……Page 331
10.1 Introduction……Page 333
10.2.2 Quantile Functions in General……Page 334
10.2.3 Continuous Quantile Functions……Page 336
10.3.1 Expectation as the Integral of a Quantile Function……Page 337
10.3.2 Markov’s Inequality Revisited……Page 340
10.4.1 Changing Variables in a Density……Page 341
10.4.2 Expectation in Terms of a Density……Page 342
10.5.1 Shape of a Gamma Density……Page 344
10.5.2 Quadratic Approximation to the Log-Density……Page 345
10.5.3 Standard Normal Density……Page 348
10.5.5 Approximate Gamma Probabilities……Page 350
10.5.6 Computing Normal Probabilities……Page 351
10.5.7 Normal Tail Probabilities……Page 352
10.6.1 Dual Probabilities……Page 353
10.6.2 Continuity Correction……Page 355
10.7.1 The Normal Family……Page 356
10.7.2 Approximate Poisson Intervals……Page 357
10.7.3 Approximate Gamma Intervals……Page 358
10.9 Exercises……Page 359
10.10 Supplementary Exercises……Page 362
11.1 Introduction……Page 365
11.2.2 The General Case……Page 366
11.3.1 Two Order Statistics at Once……Page 367
11.3.2 Joint Density of Two Order Statistics……Page 368
11.3.3 Joint Densities in General……Page 369
11.3.4 The Family of Divisions of an Interval……Page 370
11.4.1 Affine Multivariate Transformations……Page 371
11.4.2 Dirichlet Densities……Page 373
11.4.3 Some Properties of Dirichlet Variables……Page 374
11.4.4 General Change of Variables……Page 376
11.5.1 Gammas Conditioned on Their Sum……Page 377
11.5.3 Gamma Densities in General……Page 378
11.5.4 Chi-Squared Variables……Page 380
11.6.1 Bayes’s Theorem Revisited……Page 381
11.6.2 Application to Gamma Observations……Page 382
11.7.2 Linear Combinations of Normal Variables……Page 384
11.7.4 Approximating a Beta Variable……Page 386
11.8.1 Binomial Variables with Large Variance……Page 387
11.8.2 Negative Binomial Variables with Small Coefficient of Variation……Page 388
11.9.1 Approximating Two Order Statistics……Page 389
11.9.2 Correlated Normal Variables……Page 390
11.10.1 Family Relationships……Page 391
11.10.2 Asymptotic Normality……Page 392
11.11 Summary……Page 393
11.12 Exercises……Page 394
11.13 Supplementary Exercises……Page 396
12.1 Introduction……Page 399
12.2.1 A Probability Model for Errors……Page 400
12.2.2 Statistics of Fit for the Error Model……Page 401
12.3.1 Independence Models for Errors……Page 402
12.3.2 Distribution of R-squared……Page 403
12.3.3 Elementary Errors……Page 404
12.4.1 Continuous Likelihoods……Page 405
12.4.2 Maximum Likelihood with Normal Errors……Page 406
12.4.3 Unbiased Variance Estimates……Page 407
12.5.1 When the Variance Is Known……Page 408
12.5.2 When the Variance Is Unknown……Page 409
12.6.1 Matrix Form……Page 410
12.6.2 Centered Form……Page 411
12.6.3 Least-Squares Estimates……Page 412
12.6.4 Homoscedastic Errors……Page 413
12.6.5 Linear Combinations of Parameters……Page 415
12.7.2 Gauss–Markov Theorem……Page 416
12.8.1 The Score Estimator……Page 417
12.8.2 How Good Is It?……Page 419
12.8.3 The Information Inequality……Page 420
12.9 Summary……Page 422
12.10 Exercises……Page 423
12.11 Supplementary Exercises……Page 424
13.1 Introduction……Page 427
13.2.2 The P.G.F. Representation……Page 428
13.2.3 The P.G.F. As an Expectation……Page 430
13.2.4 Applications to Compound Variables……Page 431
13.2.5 Factorial Moments……Page 433
13.3.1 Comparison with Exponential Variables……Page 434
13.3.2 The M.G.F. as an Expectation……Page 436
13.4.1 Poisson Limits……Page 437
13.4.2 Law of Large Numbers……Page 438
13.4.3 Normal Limits……Page 439
13.4.4 A Central Limit Theorem……Page 440
13.5.1 Natural Exponential Forms……Page 442
13.5.2 Expectations……Page 443
13.5.3 Natural Parameters……Page 444
13.5.5 Other Sufficient Statistics……Page 445
13.6.1 Conditional Improvement……Page 446
13.6.2 Sufficient Statistics……Page 448
13.7.1 Tail Probability Approximation……Page 449
13.7.2 Tilting a Random Variable……Page 450
13.7.3 Normal Tail Approximation……Page 451
13.7.4 Poisson Tail Approximations……Page 453
13.7.5 Small-Sample Asymptotics……Page 454
13.9 Exercises……Page 455
13.10 Supplementary Exercises……Page 458
B……Page 469
D……Page 470
G……Page 471
L……Page 472
N……Page 473
P……Page 474
S……Page 475
U……Page 476
W……Page 477
