| 000 | 00000cam u22002054a 4500 | |
| 001 | 000046126282 | |
| 005 | 20220901131923 | |
| 008 | 220901s2002 maua b 001 0 eng | |
| 010 | ▼a 2002141574 | |
| 020 | ▼a 1402070640 (acid-free paper) | |
| 020 | ▼a 9781461353324 (pbk.) | |
| 035 | ▼a (KERIS)REF000006732839 | |
| 040 | ▼a DLC ▼c DLC ▼d DLC ▼d 211009 | |
| 042 | ▼a pcc | |
| 050 | 0 0 | ▼a Q360 ▼b .K28 2002 |
| 082 | 0 0 | ▼a 003/.54 ▼2 23 |
| 084 | ▼a 003.54 ▼2 DDCK | |
| 090 | ▼a 003.54 ▼b K12m | |
| 100 | 1 | ▼a Kåhre, Jan. |
| 245 | 1 4 | ▼a The mathematical theory of information / ▼c Jan Kåhre. |
| 260 | ▼a Boston : ▼b Kluwer Academic Publishers, ▼c c2002. | |
| 300 | ▼a xiv, 502 p. : ▼b ill. ; ▼c 24 cm. | |
| 490 | 1 | ▼a The Kluwer international series in engineering and computer science ; ▼v SECS 684 |
| 504 | ▼a Includes bibliographical references (p. 478-490) and index. | |
| 650 | 0 | ▼a Information theory. |
| 830 | 0 | ▼a Kluwer international series in engineering and computer science ; ▼v SECS 684. |
| 945 | ▼a ITMT |
Holdings Information
| No. | Location | Call Number | Registration No. | Status | Due Date | Reservation | Service |
|---|---|---|---|---|---|---|---|
| 1 | Science Library / Sci-Info (2nd-floor stacks) | 003.54 K12m | 121260687 (loaned 1 time) | Available for loan | | | |
Contents Information
Book Description
The general concept of information is here, for the first time, defined mathematically by adding a single axiom to probability theory. This Mathematical Theory of Information is explored in fourteen chapters:

1. Information can be measured in different units, in anything from bits to dollars. We argue here that any measure is acceptable if it does not violate the Law of Diminishing Information. This law is supported by two independent arguments: one derived from the Bar-Hillel ideal receiver, the other based on Shannon's noisy channel. The entropy of 'classical' information theory is one of the measures conforming to the Law of Diminishing Information, but it has properties, such as being symmetric, that make it unsuitable for some applications. The measure reliability is found to be a universal information measure.

2. For discrete and finite signals, the Law of Diminishing Information is defined mathematically, using probability theory and matrix algebra.

3. The Law of Diminishing Information is used as an axiom to derive essential properties of information. Byron's law: there is more information in a lie than in gibberish. Preservation: no information is lost in a reversible channel. Etc.

The Mathematical Theory of Information supports colligation, i.e. the property of binding facts together, making 'two plus two greater than four'. Colligation is a must when the information carries knowledge or is a basis for decisions. In such cases, reliability is always a useful information measure. Entropy does not allow colligation.
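The Shannon special case mentioned in the blurb can be checked numerically. A minimal sketch (not from the book, and using the standard data-processing inequality rather than Kåhre's own formalism): for a Markov chain X → Y → Z, where each stage is a binary symmetric channel represented as a stochastic matrix, mutual information can only decrease, I(X;Z) ≤ I(X;Y).

```python
import math

def mutual_information(p_x, channel):
    """I(X;Y) in bits, given the source distribution P(X)
    and a channel matrix P(Y|X) with rows indexed by X."""
    p_y = [sum(p_x[i] * channel[i][j] for i in range(len(p_x)))
           for j in range(len(channel[0]))]
    total = 0.0
    for i, px in enumerate(p_x):
        for j, pyx in enumerate(channel[i]):
            if px > 0 and pyx > 0 and p_y[j] > 0:
                total += px * pyx * math.log2(pyx / p_y[j])
    return total

def compose(a, b):
    """Cascade two channels: P(Z|X) = P(Y|X) * P(Z|Y) (matrix product)."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

p_x = [0.5, 0.5]                    # uniform binary source
bsc = [[0.9, 0.1], [0.1, 0.9]]      # binary symmetric channel, 10% error

i_xy = mutual_information(p_x, bsc)                # one noisy stage
i_xz = mutual_information(p_x, compose(bsc, bsc))  # two cascaded stages

print(f"I(X;Y) = {i_xy:.4f} bits")
print(f"I(X;Z) = {i_xz:.4f} bits")
assert i_xz <= i_xy  # information diminishes through the cascade
```

Two cascaded BSCs with 10% error form a single BSC with 18% error, so the second mutual information is strictly smaller; per the blurb, any admissible information measure must respect this kind of monotonicity.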
Table of Contents
Summary. Acknowledgements. 1. Introduction. 2. The Law of Diminishing Information. 3. General Properties of Information. 4. Specific Information Measures. 5. Selected Applications. 6. Infodynamics. 7. Statistical Information. 8. Algorithmic Information. 9. Continuous Systems. 10. Continuous Information. 11. Deterministic Dynamics. 12. Control and Communication. 13. Information Physics. 14. The Information Quantum. References. Symbols. Index. (Re)Search hints; J. Hajek.
