Probability and information: An integrated approach

Authors: David Applebaum

Edition: 2nd

ISBN: 0521899044, 9780521899048

Size: 1 MB (1375773 bytes)

Pages: 291

File format:

Language: English

Publishing Year:

Category:

This new and updated textbook is an excellent introduction to probability and information theory for new students in mathematics, computer science, engineering, statistics, economics, or business studies. Requiring only a knowledge of basic calculus, it begins by building a clear and systematic foundation in probability and information. Classic topics covered include discrete and continuous random variables, entropy and mutual information, maximum entropy methods, the central limit theorem and the coding and transmission of information. New to this edition is material on Markov chains and their entropy. Examples and exercises are included to illustrate how to use the theory in a wide range of applications, and detailed solutions to most exercises are available online for instructors.

Table of contents :
Cover……Page 1
Half-title……Page 3
Title……Page 5
Copyright……Page 6
Contents……Page 9
Preface to the second edition……Page 13
Preface to the first edition……Page 15
1.1 Chance and information……Page 19
1.2 Mathematical models of chance phenomena……Page 20
1.3 Mathematical structure and mathematical proof……Page 23
1.4 Plan of this book……Page 25
2.1 Counting……Page 28
2.2 Arrangements……Page 29
2.3.2 Sampling without replacement……Page 31
Order irrelevant……Page 32
2.4 Multinomial coefficients……Page 34
2.5 The gamma function Γ(x)……Page 36
Exercises……Page 37
Further reading……Page 39
3.1 The concept of a set……Page 40
3.2 Set operations……Page 43
3.3 Boolean algebras……Page 47
3.4 Measures on Boolean algebras……Page 50
Exercises……Page 55
Further reading……Page 58
4.1.1 Properties of probabilities……Page 59
Principle of symmetry……Page 61
Subjective probabilities……Page 63
Relative frequency……Page 64
4.3 Conditional probability……Page 66
4.4 Independence……Page 73
4.5.1 The classical theory of probability……Page 75
4.5.2 Subjective probabilities……Page 78
4.5.3 The frequentist approach……Page 79
4.6 The historical roots of probability……Page 80
Exercises……Page 82
Further reading……Page 86
5.1 The concept of a random variable……Page 88
5.2 Properties of random variables……Page 90
5.2.2 Bernoulli random variables……Page 93
5.2.3 'Certain' variables……Page 94
5.3 Expectation and variance……Page 96
5.4 Covariance and correlation……Page 101
5.5 Independent random variables……Page 104
5.6 I.I.D. random variables……Page 107
5.7 Binomial and Poisson random variables……Page 109
5.8 Geometric, negative binomial and hypergeometric random variables……Page 113
5.8.1 The negative binomial random variable……Page 115
5.8.2 The hypergeometric random variable……Page 116
Exercises……Page 117
Further reading……Page 122
6.1 What is information?……Page 123
6.2 Entropy……Page 126
6.3 Joint and conditional entropies; mutual information……Page 129
6.4 The maximum entropy principle……Page 133
6.5 Entropy, physics and life……Page 135
6.6 The uniqueness of entropy……Page 137
Exercises……Page 141
Further reading……Page 143
7.1 Transmission of information……Page 145
7.2 The channel capacity……Page 148
7.3 Codes……Page 150
7.4 Noiseless coding……Page 155
7.5.1 Decision rules……Page 161
7.5.3 Average error probability……Page 162
7.5.4 Transmission rate……Page 163
7.6 Brief remarks about the history of information theory……Page 168
Exercises……Page 169
Further reading……Page 171
8.1 Random variables with continuous ranges……Page 173
8.2 Probability density functions……Page 175
8.3 Discretisation and integration……Page 179
8.4 Laws of large numbers……Page 182
8.5 Normal random variables……Page 185
8.6 The central limit theorem……Page 190
8.7 Entropy in the continuous case……Page 197
Exercises……Page 200
Further reading……Page 204
9.1 Cartesian products……Page 206
9.2 Boolean algebras and measures on products……Page 209
9.3 Distributions of random vectors……Page 211
9.4 Marginal distributions……Page 217
9.5 Independence revisited……Page 219
9.6 Conditional densities and conditional entropy……Page 222
9.7 Mutual information and channel capacity……Page 226
Exercises……Page 230
Further reading……Page 234
10.1 Stochastic processes……Page 235
10.2.1 Definition and examples……Page 237
10.2.2 Calculating joint probabilities……Page 241
10.3 The Chapman–Kolmogorov equations……Page 242
10.4 Stationary processes……Page 245
10.5.1 Invariant distributions……Page 247
10.5.2 The detailed balance condition……Page 249
10.5.3 Limiting distributions……Page 250
10.5.4 Doubly stochastic matrices and information theory revisited……Page 252
10.6.1 The chain rule……Page 253
10.6.2 Entropy rates……Page 255
Exercises……Page 258
Further reading……Page 261
Exploring further……Page 263
Appendix 1: Proof by mathematical induction……Page 265
Appendix 2: Lagrange multipliers……Page 267
Appendix 3: Integration of exp(−x²/2)……Page 270
Appendix 4: Table of probabilities associated with the standard normal distribution……Page 272
Appendix 5: A rapid review of matrix algebra……Page 274
Selected solutions……Page 278
Index……Page 286
