The Mathematical Theory of Information

Author: Jan Kåhre
Publisher: Springer Science & Business Media
Total Pages: 528
Release: 2002-06-30
Genre: Technology & Engineering
ISBN: 9781402070648

The general concept of information is here, for the first time, defined mathematically, by adding a single axiom to probability theory. This Mathematical Theory of Information is explored in fourteen chapters:

1. Information can be measured in different units, in anything from bits to dollars. We argue here that any measure is acceptable if it does not violate the Law of Diminishing Information. This law is supported by two independent arguments: one derived from the Bar-Hillel ideal receiver, the other based on Shannon's noisy channel. The entropy of 'classical information theory' is one of the measures conforming to the Law of Diminishing Information, but it has properties, such as symmetry, that make it unsuitable for some applications. The measure called reliability is found to be a universal information measure.

2. For discrete and finite signals, the Law of Diminishing Information is defined mathematically, using probability theory and matrix algebra.

3. The Law of Diminishing Information is used as an axiom to derive essential properties of information. Byron's law: there is more information in a lie than in gibberish. Preservation: no information is lost in a reversible channel. And so on. The Mathematical Theory of Information supports colligation, i.e. the property of binding facts together, making 'two plus two greater than four'. Colligation is essential when the information carries knowledge or serves as a basis for decisions. In such cases, reliability is always a useful information measure. Entropy does not allow colligation.
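The Law of Diminishing Information states that information about a source cannot grow when the signal is processed further, much in the spirit of the data processing inequality of classical information theory. The following sketch (an illustration under that reading, not code from the book; it assumes NumPy, and the channel matrices are invented) shows mutual information shrinking as a signal passes through a cascade of two noisy channels:

    import numpy as np

    def mutual_information(px, channel):
        # I(X;Y) in bits, for input distribution px and row-stochastic matrix P(y|x).
        joint = px[:, None] * channel              # P(x, y)
        py = joint.sum(axis=0)                     # marginal P(y)
        mask = joint > 0
        return float((joint[mask] * np.log2(joint[mask] / (px[:, None] * py[None, :])[mask])).sum())

    def bsc(e):
        # Binary symmetric channel with crossover probability e.
        return np.array([[1 - e, e], [e, 1 - e]])

    px = np.array([0.5, 0.5])                      # uniform binary source
    first = bsc(0.1)                               # X -> Y
    cascade = first @ bsc(0.2)                     # X -> Y -> Z

    print(mutual_information(px, first))           # ~0.531 bits about X survive one channel
    print(mutual_information(px, cascade))         # ~0.173 bits: the second channel only loses information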

The Mathematical Theory of Communication

Author: Claude E. Shannon
Publisher: University of Illinois Press
Total Pages: 141
Release: 1998-09-01
Genre: Language Arts & Disciplines
ISBN: 025209803X

Scientific knowledge grows at a phenomenal pace--but few books have had as lasting an impact or played as important a role in our modern world as The Mathematical Theory of Communication, published originally as a paper on communication theory more than fifty years ago. Republished in book form shortly thereafter, it has since gone through four hardcover and sixteen paperback printings. It is a revolutionary work, astounding in its foresight and contemporaneity. The University of Illinois Press is pleased and honored to issue this commemorative reprinting of a classic.

Mathematical Foundations of Information Theory

Author: Aleksandr Yakovlevich Khinchin
Publisher: Courier Corporation
Total Pages: 130
Release: 1957-01-01
Genre: Mathematics
ISBN: 0486604349

First comprehensive introduction to information theory explores the work of Shannon, McMillan, Feinstein, and Khinchin. Topics include the entropy concept in probability theory, fundamental theorems, and other subjects. 1957 edition.
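As a small illustration of the entropy concept the book develops (standard Shannon entropy in bits; the example is ours, not Khinchin's):

    import math

    def entropy(p):
        # Shannon entropy in bits; zero-probability terms contribute nothing.
        return -sum(q * math.log2(q) for q in p if q > 0)

    print(entropy([0.5, 0.5]))      # 1.0 bit: a fair coin
    print(entropy([0.9, 0.1]))      # ~0.469 bits: a biased coin is more predictable
    print(entropy([0.25] * 4))      # 2.0 bits: one of four equally likely outcomes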

Mathematical Theory of Entropy

Author: Nathaniel F. G. Martin
Publisher: Cambridge University Press
Total Pages: 292
Release: 2011-06-02
Genre: Computers
ISBN: 9780521177382

This excellent 1981 treatment of the mathematical theory of entropy gives an accessible exposition of its application to other fields.

Entropy and Information Theory

Author: Robert M. Gray
Publisher: Springer Science & Business Media
Total Pages: 346
Release: 2013-03-14
Genre: Computers
ISBN: 1475739826

This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. These tools form an area common to ergodic theory and information theory and comprise several quantitative notions of the information in random variables, random processes, and dynamical systems. Examples are entropy, mutual information, conditional entropy, conditional information, and discrimination or relative entropy, along with the limiting normalized versions of these quantities such as entropy rate and information rate. Much of the book is concerned with their properties, especially the long-term asymptotic behavior of sample information and expected information. This is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.
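Two of the quantities named above, sketched in code (a hedged illustration assuming NumPy; the distributions and the two-state chain are invented, and the formulas are the standard ones rather than taken from the book): relative entropy between two distributions, and the entropy rate of a stationary Markov chain, the kind of process-level limit the book studies.

    import numpy as np

    def relative_entropy(p, q):
        # D(p || q) in bits; assumes q > 0 wherever p > 0.
        p, q = np.asarray(p, float), np.asarray(q, float)
        mask = p > 0
        return float((p[mask] * np.log2(p[mask] / q[mask])).sum())

    def markov_entropy_rate(P):
        # Entropy rate -sum_i pi_i sum_j P[i,j] log2 P[i,j] of a stationary chain.
        P = np.asarray(P, float)
        vals, vecs = np.linalg.eig(P.T)            # stationary pi: left eigenvector for eigenvalue 1
        pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
        pi /= pi.sum()
        safe = np.where(P > 0, P, 1.0)             # avoid log2(0); those terms are zero anyway
        return float(pi @ (-(P * np.log2(safe)).sum(axis=1)))

    print(relative_entropy([0.5, 0.5], [0.9, 0.1]))      # ~0.737 bits
    print(markov_entropy_rate([[0.9, 0.1], [0.2, 0.8]])) # ~0.553 bits per step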

Information Theory

Author: JV Stone
Publisher: Sebtel Press
Total Pages: 243
Release: 2015-01-01
Genre: Business & Economics
ISBN: 0956372856

Originally developed by Claude Shannon in the 1940s, information theory laid the foundations for the digital revolution, and is now an essential tool in telecommunications, genetics, linguistics, brain sciences, and deep space communication. In this richly illustrated book, accessible examples are used to introduce information theory in terms of everyday games like ‘20 questions’ before more advanced topics are explored. Online MATLAB and Python computer programs provide hands-on experience of information theory in action, and PowerPoint slides give support for teaching. Written in an informal style, with a comprehensive glossary and tutorial appendices, this text is an ideal primer for novices who wish to learn the essential principles and applications of information theory.
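The ‘20 questions’ framing can be made concrete in a few lines (a sketch of the standard counting argument, not one of the book's own MATLAB/Python programs): each well-chosen yes/no question yields at most one bit, so singling out one of N equally likely items needs about log2(N) questions, and binary search attains that bound.

    import math

    def questions_needed(n):
        # Yes/no questions guaranteed to identify one of n equally likely items.
        return math.ceil(math.log2(n))

    def play_twenty_questions(secret, low, high):
        # Halve the candidate range with each question until one item remains.
        asked = 0
        while low < high:
            mid = (low + high) // 2
            asked += 1
            if secret <= mid:       # "Is it at most mid?" -- answer: yes
                high = mid
            else:                   # answer: no
                low = mid + 1
        return low, asked

    print(questions_needed(1_000_000))                  # 20: a million items fit in 20 questions
    print(play_twenty_questions(742_118, 1, 1_000_000)) # finds the secret in at most 20 questions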

Information: A Very Short Introduction

Author: Luciano Floridi
Publisher: Oxford University Press
Total Pages: 153
Release: 2010-02-25
Genre: Computers
ISBN: 0199551375

Introduction; 1 The information revolution; 2 The language of information; 3 Mathematical information; 4 Semantic information; 5 Physical information; 6 Biological information; 7 Economic information; 8 The ethics of information; Conclusion; References.

A Mathematical Theory of Evidence

Author: Glenn Shafer
Publisher: Princeton University Press
Total Pages:
Release: 2020-06-30
Genre: Mathematics
ISBN: 0691214697

Both in science and in practical affairs we reason by combining facts only inconclusively supported by evidence. Building on an abstract understanding of this process of combination, this book constructs a new theory of epistemic probability. The theory draws on the work of A. P. Dempster but diverges from Dempster's viewpoint by identifying his "lower probabilities" as epistemic probabilities and taking his rule for combining "upper and lower probabilities" as fundamental. The book opens with a critique of the well-known Bayesian theory of epistemic probability. It then proceeds to develop an alternative to the additive set functions and the rule of conditioning of the Bayesian theory: set functions that need only be what Choquet called "monotone of infinite order," and Dempster's rule for combining such set functions. This rule, together with the idea of "weights of evidence," leads to both an extensive new theory and a better understanding of the Bayesian theory. The book concludes with a brief treatment of statistical inference and a discussion of the limitations of epistemic probability. Appendices contain mathematical proofs, which are relatively elementary and seldom depend on mathematics more advanced than the binomial theorem.
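Dempster's rule of combination, which the book takes as fundamental, can be sketched in a few lines (the standard formulation; the frame of discernment and the mass values below are invented for illustration): mass functions over subsets of a frame combine by intersecting focal elements, multiplying masses, and renormalizing away the conflict committed to the empty set.

    from itertools import product

    def dempster_combine(m1, m2):
        # Combine two mass functions given as {frozenset: mass} over the same frame.
        combined, conflict = {}, 0.0
        for (a, x), (b, y) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + x * y
            else:
                conflict += x * y              # mass falling on the empty set
        if conflict >= 1.0:
            raise ValueError("total conflict: the two sources are incompatible")
        return {s: v / (1.0 - conflict) for s, v in combined.items()}

    # Two witnesses narrowing down a culprit in the frame {alice, bob, carol}:
    frame = frozenset({"alice", "bob", "carol"})
    m1 = {frozenset({"alice"}): 0.6, frame: 0.4}
    m2 = {frozenset({"alice", "bob"}): 0.7, frame: 0.3}
    print(dempster_combine(m1, m2))   # {alice}: 0.60, {alice, bob}: 0.28, frame: 0.12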