Information Theory
In print. Ships in 3-5 weeks.
Information Theory: From Coding to Learning
Author: Polyanskiy, Yury (Massachusetts Institute of Technology) / Wu, Yihong (Yale University, Connecticut)
Publisher: Cambridge University Press
ISBN: 9781108832908
Cover: HARDCOVER
Date: January 2025

DESCRIPTION

This enthusiastic introduction to the fundamentals of information theory builds from classical Shannon theory through to modern applications in statistical learning, equipping students with a uniquely well-rounded and rigorous foundation for further study. It introduces core topics such as data compression, channel coding, and rate-distortion theory using a unique finite-blocklength approach. With over 210 end-of-part exercises and numerous examples, students are introduced to contemporary applications in statistics, machine learning, and modern communication theory. The textbook presents information-theoretic methods with applications in statistical learning and computer science, such as f-divergences, PAC-Bayes and variational principles, Kolmogorov's metric entropy, strong data processing inequalities, and entropic upper bounds for statistical estimation. Accompanied by a solutions manual for instructors and additional standalone chapters on more specialized topics in information theory, this is the ideal introductory textbook for senior undergraduate and graduate students in electrical engineering, statistics, and computer science.

* Provides a systematic treatment of information-theoretic techniques in statistical learning and high-dimensional statistics
* Develops information theory for both continuous and discrete variables, with examples relevant to statistical and machine learning applications
* Focuses on finite-blocklength (non-asymptotic) results, equipping students with the information theory knowledge required for modern applications such as 6G and future network design
* Clearly indicates advanced material suitable for skipping on a first reading, enabling a fast introduction to the fundamental concepts that can be deepened with additional material on re-reading

TABLE OF CONTENTS

Part I. Information Measures:
1. Entropy
2. Divergence
3. Mutual information
4. Variational characterizations and continuity of information measures
5. Extremization of mutual information: capacity saddle point
6. Tensorization and information rates
7. f-divergences
8. Entropy method in combinatorics and geometry
9. Random number generators

Part II. Lossless Data Compression:
10. Variable-length compression
11. Fixed-length compression and Slepian-Wolf theorem
12. Entropy of ergodic processes
13. Universal compression

Part III. Hypothesis Testing and Large Deviations:
14. Neyman-Pearson lemma
15. Information projection and large deviations
16. Hypothesis testing: error exponents

Part IV. Channel Coding:
17. Error correcting codes
18. Random and maximal coding
19. Channel capacity
20. Channels with input constraints; Gaussian channels
21. Capacity per unit cost
22. Strong converse; channel dispersion; error exponents; finite blocklength
23. Channel coding with feedback

Part V. Rate-Distortion Theory and Metric Entropy:
24. Rate-distortion theory
25. Rate-distortion: achievability bounds
26. Evaluating the rate-distortion function; lossy source-channel separation
27. Metric entropy

Part VI. Statistical Applications:
28. Basics of statistical decision theory
29. Classical large-sample asymptotics
30. Mutual information method
31. Lower bounds via reduction to hypothesis testing
32. Entropic bounds for statistical estimation
33. Strong data processing inequality