
Mathematical Foundations of Information Theory

by Alexander I. Khinchin
Current price: ₹704.00
Original price: ₹1,008.00 (Save 30%)

Imported Edition - Ships in 18-21 Days

Free Shipping in India on orders above Rs. 500

Request Bulk Quantity Quote
  • ISBN13: 9780486604343
  • Binding: Paperback
  • Subject: N/A
  • Publisher: Dover Publications
  • Publisher Imprint: Dover Publications
  • Publication Date: N/A
  • Pages: 128
  • Original Price: USD 10.95
  • Language: English
  • Edition: N/A
  • Item Weight: 146 grams
  • BISAC Subject(s): Philosophy & Social Aspects and General

The first comprehensive introduction to information theory, this book places the work begun by Shannon and continued by McMillan, Feinstein, and Khinchin on a rigorous mathematical basis. For the first time, mathematicians, statisticians, physicists, cyberneticists, and communications engineers are offered a lucid, comprehensive introduction to this rapidly growing field.
In his first paper, Dr. Khinchin develops the concept of entropy in probability theory as a measure of uncertainty of a finite "scheme," and discusses a simple application to coding theory. The second paper investigates the restrictions previously placed on the study of sources, channels, and codes and attempts "to give a complete, detailed proof of both ... Shannon theorems, assuming any ergodic source and any stationary channel with a finite memory."
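As a quick sketch of the central quantity discussed above (the notation p_1, ..., p_n is assumed here, not taken from the listing): for a finite scheme whose n outcomes occur with probabilities p_1, ..., p_n, the entropy in its standard Shannon form is

% Entropy of a finite scheme; the base of the logarithm is a convention (base 2 gives bits).
\[
  H(p_1, \ldots, p_n) \;=\; -\sum_{k=1}^{n} p_k \log p_k,
  \qquad p_k \ge 0, \quad \sum_{k=1}^{n} p_k = 1,
\]
with the convention $0 \log 0 = 0$. It is largest when all outcomes are equally likely and vanishes when one outcome is certain, which is what makes it a natural measure of the uncertainty of the scheme.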
Partial Contents: I. The Entropy Concept in Probability Theory -- Entropy of Finite Schemes. The Uniqueness Theorem. Entropy of Markov chains. Application to Coding Theory. II. On the Fundamental Theorems of Information Theory -- Two generalizations of Shannon's inequality. Three inequalities of Feinstein. Concept of a source. Stationarity. Entropy. Ergodic sources. The E property. The martingale concept. Noise. Anticipation and memory. Connection of the channel to the source. Feinstein's Fundamental Lemma. Coding. The first Shannon theorem. The second Shannon theorem.
