Kai-Tai Fang, Runze Li, Agus Sudjianto. ISBN: 1584885467, 9781584885467
Table of contents:
Design and Modeling for Computer Experiments……Page 3
Preface……Page 5
Contents……Page 8
Part I An Overview……Page 12
1.1 Experiments and Their Statistical Designs……Page 13
1.2 Some Concepts in Experimental Design……Page 14
1.3.1 Motivations……Page 20
1.3.2 Metamodels……Page 22
1.3.3 Computer experiments in engineering……Page 26
1.4 Examples of Computer Experiments……Page 30
1.5 Space-Filling Designs……Page 34
1.6 Modeling Techniques……Page 36
1.7 Sensitivity Analysis……Page 41
1.8 Strategies for Computer Experiments and an Illustration Case Study……Page 43
1.9 Remarks on Computer Experiments……Page 48
1.10 Guidance for Reading This Book……Page 50
Part II Designs for Computer Experiments……Page 54
2.1 Latin Hypercube Sampling……Page 55
2.2 Randomized Orthogonal Array……Page 59
2.3 Symmetric and Orthogonal Column Latin Hypercubes……Page 62
2.4.1 IMSE criterion……Page 68
2.4.2 Entropy criterion……Page 70
2.4.3 Minimax and maximin distance criteria and their extension……Page 72
2.4.4 Uniformity criterion……Page 73
3.1 Introduction……Page 75
3.2.1 The star Lp-discrepancy……Page 76
3.2.2 Modified L2-discrepancy……Page 78
3.2.3 The centered discrepancy……Page 79
3.2.4 The wrap-around discrepancy……Page 80
3.2.5 A unified definition of discrepancy……Page 81
3.2.6 Discrepancy for categorical factors……Page 83
3.2.7 Applications of uniformity in experimental designs……Page 84
3.3.1 One-factor uniform designs……Page 86
3.3.2 Symmetrical uniform designs……Page 87
3.3.3 Good lattice point method……Page 88
3.3.4 Latin square method……Page 93
3.3.6 The cutting method……Page 94
3.4 Characteristics of the Uniform Design: Admissibility, Minimaxity, and Robustness……Page 98
3.5.1 Resolvable balanced incomplete block designs……Page 101
3.5.3 New uniform designs……Page 102
3.6.2 Collapsing method……Page 105
3.6.3 Combinatorial method……Page 108
3.6.4 Miscellanea……Page 111
4.1 Optimization Problem in Construction of Designs……Page 113
4.1.2 Neighborhood……Page 114
4.1.3 Replacement rule……Page 115
4.1.4 Iteration formulae……Page 117
4.2.1 Algorithms……Page 121
4.2.2 Local search algorithm……Page 122
4.2.4 Threshold accepting algorithm……Page 123
4.2.5 Stochastic evolutionary algorithm……Page 124
4.3 Lower Bounds of the Discrepancy and Related Algorithm……Page 125
4.3.2 Lower bounds of the wrap-around L2-discrepancy……Page 127
4.3.3 Lower bounds of the centered L2-discrepancy……Page 129
4.3.4 Balance-pursuit heuristic algorithm……Page 130
Part III Modeling for Computer Experiments……Page 133
5.1.1 Mean square error and prediction error……Page 134
5.1.2 Regularization……Page 137
5.2 Polynomial Models……Page 140
5.3 Spline Method……Page 146
5.3.1 Construction of spline basis……Page 147
5.3.2 An illustration……Page 149
5.3.3 Other bases of global approximation……Page 151
5.4 Gaussian Kriging Models……Page 152
5.4.1 Prediction via Kriging……Page 153
5.4.2 Estimation of parameters……Page 154
5.4.3 A case study……Page 160
5.5.1 Gaussian processes……Page 166
5.5.2 Bayesian prediction of deterministic functions……Page 167
5.5.3 Use of derivatives in surface prediction……Page 169
5.5.4 An example: borehole model……Page 172
5.6 Neural Network……Page 174
5.6.1 Multi-layer perceptron networks……Page 175
5.6.2 A case study……Page 179
5.6.3 Radial basis functions……Page 184
5.7.1 Motivation of local polynomial regression……Page 187
5.7.2 Metamodeling via local polynomial regression……Page 190
5.8.1 Connections……Page 191
5.8.2 Recommendations……Page 192
6.1 Introduction……Page 194
6.2.1 Criteria……Page 195
6.2.2 An example……Page 198
6.3.1 Functional ANOVA representation……Page 200
6.3.2 Computational issues……Page 202
6.3.3 Example of Sobol’ global sensitivity……Page 205
6.3.4 Correlation ratios and extension of Sobol’ indices……Page 206
6.3.5 Fourier amplitude sensitivity test……Page 209
6.3.6 Example of FAST application……Page 212
7.1 Computer Experiments with Functional Response……Page 214
7.2.1 Functional response with sparse sampling rate……Page 222
7.2.2 Functional response with intensive sampling rate……Page 225
7.3 Penalized Regression Splines……Page 226
7.4 Functional Linear Models……Page 229
7.4.1 A graphical tool……Page 230
7.4.2 Efficient estimation procedure……Page 231
7.4.3 An illustration……Page 233
7.5.1 Partially linear model……Page 237
7.5.2 Partially functional linear models……Page 241
7.5.3 An illustration……Page 243
Acronyms……Page 248
A.1 Some Basic Concepts in Matrix Algebra……Page 250
A.2.1 Random variables and random vectors……Page 253
A.2.2 Some statistical distributions and Gaussian process……Page 256
A.3 Linear Regression Analysis……Page 258
A.3.1 Linear models……Page 259
A.3.2 Method of least squares……Page 260
A.3.3 Analysis of variance……Page 261
A.3.4 An illustration……Page 262
A.4 Variable Selection for Linear Regression Models……Page 265
A.4.1 Nonconvex penalized least squares……Page 266
A.4.2 Iteratively ridge regression algorithm……Page 267
A.4.3 An illustration……Page 268
References……Page 270