Entropy and Information Theory

Authors: Gray R.M.

Size: 1 MB (1535840 bytes)

Pages: 311


Table of contents:
Prologue……Page 11
Probability Spaces and Random Variables……Page 23
Random Processes and Dynamical Systems……Page 27
Distributions……Page 28
Standard Alphabets……Page 32
Expectation……Page 33
Asymptotic Mean Stationarity……Page 36
Ergodic Properties……Page 37
Entropy and Entropy Rate……Page 39
Basic Properties of Entropy……Page 42
Entropy Rate……Page 53
Conditional Entropy and Information……Page 57
Entropy Rate Revisited……Page 63
Relative Entropy Densities……Page 66
Introduction……Page 69
Stationary Ergodic Sources……Page 72
Stationary Nonergodic Sources……Page 78
AMS Sources……Page 81
The Asymptotic Equipartition Property……Page 85
Stationary Codes and Approximation……Page 87
Information Rate of Finite Alphabet Processes……Page 95
Divergence……Page 99
Conditional Relative Entropy……Page 114
Limiting Entropy Densities……Page 126
Information for General Alphabets……Page 128
Some Convergence Results……Page 138
Information Rates for General Alphabets……Page 141
A Mean Ergodic Theorem for Densities……Page 144
Information Rates of Stationary Processes……Page 146
Relative Entropy Densities and Rates……Page 153
Markov Dominating Measures……Page 156
Stationary Processes……Page 159
Mean Ergodic Theorems……Page 162
Stationary Ergodic Sources……Page 167
Stationary Nonergodic Sources……Page 172
AMS Sources……Page 175
Ergodic Theorems for Information Densities……Page 178
Introduction……Page 181
Channels……Page 182
Stationarity Properties of Channels……Page 184
Examples of Channels……Page 187
The Rohlin-Kakutani Theorem……Page 207
Distortion and Fidelity Criteria……Page 213
Performance……Page 215
The rho-bar distortion……Page 217
d-bar Continuous Channels……Page 219
The Distortion-Rate Function……Page 223
Block Source Codes for AMS Sources……Page 233
Block Coding Stationary Sources……Page 243
Block Coding AMS Ergodic Sources……Page 244
Subadditive Fidelity Criteria……Page 250
Asynchronous Block Codes……Page 252
Sliding Block Source Codes……Page 254
A Geometric Interpretation of OPTA’s……Page 263
Noisy Channels……Page 265
Feinstein’s Lemma……Page 266
Feinstein’s Theorem……Page 269
Channel Capacity……Page 271
Robust Block Codes……Page 276
Block Coding Theorems for Noisy Channels……Page 279
Joint Source and Channel Block Codes……Page 280
Synchronizing Block Channel Codes……Page 283
Sliding Block Source and Channel Coding……Page 287
Bibliography……Page 296
Index……Page 297
