A Short Course in Information Theory
by David J. C. MacKay
Publisher: University of Cambridge, 1995
Is it possible to communicate reliably from one point to another if we only have a noisy communication channel? How can the information content of a random variable be measured? This course will discuss the remarkable theorems of Claude Shannon, starting from the source coding theorem, which motivates the entropy as the measure of information, and culminating in the noisy channel coding theorem. Along the way we will study simple examples of codes for data compression and error correction.
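The entropy the description refers to has a one-line formula, H(X) = -sum_x p(x) log2 p(x). As a minimal illustration (not part of the course materials; the coin distributions below are made up for the example), the following Python sketch computes it:

    import math

    def entropy(probs):
        # Shannon entropy in bits: H(X) = -sum p(x) * log2 p(x),
        # with zero-probability outcomes skipped (0 log 0 is taken as 0).
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Illustrative distributions (assumed, not from the course):
    print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit per flip
    print(entropy([0.9, 0.1]))  # biased coin: ~0.47 bits per flip

The biased coin carries less information per outcome, which is why its outcomes can on average be compressed to under one bit each, as the source coding theorem formalizes.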
Categories: Computers & Internet > Computer Science > Information & Coding Theory





Similar Books For Information & Coding Theory

1. Information, Entropy and Their Geometric Structures by Frederic Barbaresco, Ali Mohammad-Djafari
2. Essential Coding Theory by Venkatesan Guruswami, Atri Rudra, Madhu Sudan
3. Data Compression Explained by Matt Mahoney
4. A primer on information theory, with applications to neuroscience by Felix Effenberger
5. Data Compression
6. From Classical to Quantum Shannon Theory by Mark M. Wilde
7. Logic and Information by Keith Devlin
8. Conditional Rate Distortion Theory by Robert M. Gray
9. Algorithmic Information Theory by Peter D. Grünwald, Paul M. B. Vitányi
10. The Limits of Mathematics by Gregory J. Chaitin
11. Quantum Information Theory by Renato Renner
12. Theory of Quantum Information by John Watrous
13. Generalized Information Measures and Their Applications by Inder Jeet Taneja
14. Information Theory, Excess Entropy and Statistical Complexity by David Feldman
15. Information Theory and Statistical Physics by Neri Merhav
16. Quantum Information Theory by Robert H. Schumann
17. Lecture Notes on Network Information Theory by Abbas El Gamal, Young-Han Kim
18. Information-Theoretic Incompleteness by Gregory J. Chaitin
19. Exploring Randomness by Gregory J. Chaitin
20. Entropy and Information Theory by Robert M. Gray
21. Information Theory and Coding by John Daugman
22. Network Coding Theory by Raymond Yeung, S-Y Li, N Cai
23. Algorithmic Information Theory by Gregory J. Chaitin
24. A Mathematical Theory of Communication by Claude Shannon
25. Information Theory, Inference, and Learning Algorithms by David J. C. MacKay


