Applebaum, ISBN 978-0-511-41424-4
Table of contents:
Contents……Page 4
1.1 Chance and information……Page 12
1.2 Mathematical models of chance phenomena……Page 13
1.3 Mathematical structure and mathematical proof……Page 16
1.4 Plan of this book……Page 18
2.1 Counting……Page 21
2.2 Arrangements……Page 22
2.3.2 Sampling without replacement……Page 24
Order irrelevant……Page 25
2.4 Multinomial coefficients……Page 27
2.5 The gamma function……Page 29
Exercises……Page 30
Further reading……Page 32
3.1 The concept of a set……Page 33
3.2 Set operations……Page 36
3.3 Boolean algebras……Page 40
3.4 Measures on Boolean algebras……Page 43
Exercises……Page 48
Further reading……Page 51
4.1.1 Properties of probabilities……Page 52
Principle of symmetry……Page 54
Subjective probabilities……Page 56
Relative frequency……Page 57
4.3 Conditional probability……Page 59
4.4 Independence……Page 66
4.5.1 The classical theory of probability……Page 68
4.5.2 Subjective probabilities……Page 71
4.5.3 The frequentist approach……Page 72
4.6 The historical roots of probability……Page 73
Exercises……Page 75
Further reading……Page 79
5.1 The concept of a random variable……Page 81
5.2 Properties of random variables……Page 83
5.2.2 Bernoulli random variables……Page 86
5.2.3 'Certain' variables……Page 87
5.3 Expectation and variance……Page 89
5.4 Covariance and correlation……Page 94
5.5 Independent random variables……Page 97
5.6 I.I.D. random variables……Page 100
5.7 Binomial and Poisson random variables……Page 102
5.8 Geometric, negative binomial and hypergeometric random variables……Page 106
5.8.1 The negative binomial random variable……Page 108
5.8.2 The hypergeometric random variable……Page 109
Exercises……Page 110
Further reading……Page 115
6.1 What is information?……Page 116
6.2 Entropy……Page 119
6.3 Joint and conditional entropies; mutual information……Page 122
6.4 The maximum entropy principle……Page 126
6.5 Entropy, physics and life……Page 128
6.6 The uniqueness of entropy……Page 130
Exercises……Page 134
Further reading……Page 136
7.1 Transmission of information……Page 138
7.2 The channel capacity……Page 141
7.3 Codes……Page 143
7.4 Noiseless coding……Page 148
7.5.1 Decision rules……Page 154
7.5.3 Average error probability……Page 155
7.5.4 Transmission rate……Page 156
7.6 Brief remarks about the history of information theory……Page 161
Exercises……Page 162
Further reading……Page 164
8.1 Random variables with continuous ranges……Page 166
8.2 Probability density functions……Page 168
8.3 Discretisation and integration……Page 172
8.4 Laws of large numbers……Page 175
8.5 Normal random variables……Page 178
8.6 The central limit theorem……Page 183
8.7 Entropy in the continuous case……Page 190
Exercises……Page 193
Further reading……Page 197
9.1 Cartesian products……Page 199
9.2 Boolean algebras and measures on products……Page 202
9.3 Distributions of random vectors……Page 204
9.4 Marginal distributions……Page 210
9.5 Independence revisited……Page 212
9.6 Conditional densities and conditional entropy……Page 215
9.7 Mutual information and channel capacity……Page 219
Exercises……Page 223
Further reading……Page 227
10.1 Stochastic processes……Page 228
10.2.1 Definition and examples……Page 230
10.2.2 Calculating joint probabilities……Page 234
10.3 The Chapman–Kolmogorov equations……Page 235
10.4 Stationary processes……Page 238
10.5.1 Invariant distributions……Page 240
10.5.2 The detailed balance condition……Page 242
10.5.3 Limiting distributions……Page 243
10.5.4 Doubly stochastic matrices and information theory revisited……Page 245
10.6.1 The chain rule……Page 246
10.6.2 Entropy rates……Page 248
Exercises……Page 251
Further reading……Page 254
Exploring further……Page 256
Appendix 1: Proof by mathematical induction……Page 258
Appendix 2: Lagrange multipliers……Page 260
Appendix 3: Integration of exp(−x²/2)……Page 263
Appendix 4: Table of probabilities associated with the standard normal distribution……Page 265
Appendix 5: A rapid review of matrix algebra……Page 267
Selected solutions……Page 271
Index……Page 279