D. M. Titterington
ISBN: 0471907634, 9780471907633
Table of contents :
Cover ……Page 1
Title ……Page 3
Date-line ……Page 4
Contents ……Page 5
PREFACE ……Page 9
1.1 Basic definitions and concepts ……Page 11
1.2.1 Forms of sampling ……Page 13
1.2.2 The number of components ……Page 14
1.2.4 Discriminant analysis and classification ……Page 15
1.3 Other forms of mixture ……Page 16
1.4 The literature ……Page 17
2.1 Direct applications ……Page 18
2.2 Indirect applications ……Page 32
3.1.1 Introduction and definition ……Page 45
3.1.2 Theorems and applications ……Page 46
3.1.3 Further results and literature ……Page 51
3.2 Information ……Page 52
3.3.1 Multimodality ……Page 58
3.3.3 Properties of general mixtures ……Page 60
4.1.1 Methods based on density functions ……Page 62
4.1.2 Methods based on the cumulative distribution function ……Page 68
4.1.3 Methods for mixtures of discrete and multivariate distributions ……Page 77
4.2.1 Introduction ……Page 81
4.2.2 Mixtures of two densities ……Page 82
4.2.3 Mixtures of k densities ……Page 89
4.3.1 Introduction ……Page 92
4.3.2 EM and other numerical algorithms ……Page 94
4.3.3 Theoretical considerations ……Page 101
4.3.4 Further examples ……Page 107
4.4.1 Introduction ……Page 116
4.4.2 Bayesian approaches to outlier models ……Page 118
4.4.3 Bayesian cluster analysis ……Page 123
4.5.1 Introduction to distance measures ……Page 124
4.5.2 Estimation of mixing weights based on quadratic distances ……Page 127
4.5.3 Problems with non-explicit estimators ……Page 131
4.5.4 What to do with extra categorized data? ……Page 135
4.6.1 Introduction ……Page 136
4.6.2 Theoretical aspects of the MGF and CF methods ……Page 138
4.6.3 Illustrations based on the estimation of mixing weights ……Page 140
4.7.1 Some introductory methods ……Page 143
4.7.2 Formal methods for mixtures of exponentials ……Page 147
4.7.3 Medgyessy’s method ……Page 148
4.7.4 Further examples ……Page 152
5.1 Introduction ……Page 158
5.3 Formal techniques for special cases ……Page 159
5.4 General formal techniques ……Page 162
5.5 The structure of modality ……Page 169
5.6 Assessment of modality ……Page 175
5.7 Discriminant analysis ……Page 178
6.1.1 The problem and its Bayesian solution ……Page 186
6.2.1 The two-class problem: Bayesian and related procedures ……Page 189
6.2.2 The two-class problem: a maximum likelihood related procedure ……Page 193
6.2.3 Asymptotic and finite-sample comparisons of the quasi-Bayes and Kazakos procedures ……Page 194
6.2.4 The k-class problem: a quasi-Bayes procedure ……Page 199
6.3.1 A general recursive procedure for a one-parameter mixture ……Page 203
6.3.2 Unsupervised learning for signal versus noise ……Page 206
6.3.3 A quasi-Bayes sequential procedure for the contaminated normal distribution ……Page 209
6.3.4 A quasi-Bayes sequential procedure for bipolar signal detection and related problems ……Page 211
6.4.1 A review of some pragmatic approaches ……Page 213
6.4.2 A general recursion for parameter estimation using incomplete data ……Page 215
6.4.3 Illustrations of the general recursion ……Page 218
6.4.4 Connections with the EM algorithm ……Page 220
6.5.1 Dynamic linear models and finite mixture Kalman filters ……Page 222
6.5.2 An outline of suggested approximation procedures ……Page 224
REFERENCES ……Page 226
INDEX ……Page 248
Series Contents ……Page 254