Information theory
Notes
Wikidata
- ID: Q131222
Corpus
- Information theory bears on the design and operation of modern-day systems such as smartphones and the Internet.[1]
- Information theory, a mathematical representation of the conditions and parameters affecting the transmission and processing of information.[2]
- The real birth of modern information theory can be traced to the publication in 1948 of Claude Shannon’s “A Mathematical Theory of Communication” in the Bell System Technical Journal.[2]
- Since the 1940s and ’50s the principles of classical information theory have been applied to many fields.[2]
- Indeed, even in Shannon’s day many books and articles appeared that discussed the relationship between information theory and areas such as art and business.[2]
- Information theory holds the exciting answer to these questions.[3]
- A year after he founded and launched information theory, Shannon published a paper that proved that unbreakable cryptography was possible.[4] (See the one-time-pad sketch after this list.)
- From the beginning of information theory's development, how to measure information was understood better than what information actually is.[5] (See the entropy sketch after this list.)
- In information theory, we think about a noisy communication channel used to communicate events from one side to the other.[6] (See the channel sketch after this list.)
- Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPEGs), and channel coding (e.g. for DSL).[7] (See the compression sketch after this list.)
- Information theory studies the transmission, processing, extraction, and utilization of information.[7]
- Concepts, methods and results from coding theory and information theory are widely used in cryptography and cryptanalysis.[7]
- Much of the mathematics behind information theory for events of different probabilities was developed for the field of thermodynamics by Ludwig Boltzmann and J. Willard Gibbs.[7] (The corresponding formulas follow this list.)
- The authors analysed in detail the relative strengths and weaknesses of information theory and Bayesian decoders when they are used to rule out neural codes.[8]
- We present a minimal formal model grounded in information theory and selection, in which successive generations of agents are mapped into transmitters and receivers of a coded message.[9]
- We adopt an information theory perspective in which agents are inference devices interacting with a Boolean environment.[9]
- As will be shown below, it is possible to derive the critical conditions to survive as a function of the agent’s complexity and to connect these conditions to information theory.[9]
- A satisfactory connection between natural selection and information theory can be obtained by mapping our survival function ρ into Shannon’s transmitter–receiver scheme.[9] (See the mutual-information sketch after this list.)
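
The measure referred to in [5] is Shannon entropy, the average number of bits needed to describe the outcome of a random variable. A minimal Python sketch; the function name is mine, not from the source:

```python
import math

def shannon_entropy(probs):
    """Entropy in bits: H = -sum(p * log2(p)), with 0*log(0) treated as 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: a fair coin is maximally unpredictable
print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits: a biased coin carries less information per toss
print(shannon_entropy([0.25] * 4))   # 2.0 bits: four equally likely outcomes need two bits
```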
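
The noisy channel in [6] can be made concrete with the binary symmetric channel, which flips each transmitted bit with some probability p; its capacity is 1 − H2(p) bits per use, where H2 is the binary entropy function. A small self-contained simulation, with names and parameters chosen for illustration only:

```python
import math
import random

def binary_entropy(p):
    """H2(p) in bits: the entropy of a coin that comes up heads with probability p."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc(bits, flip_prob, rng):
    """Binary symmetric channel: flip each bit independently with probability flip_prob."""
    return [b ^ (rng.random() < flip_prob) for b in bits]

rng = random.Random(0)                    # fixed seed so the run is reproducible
flip_prob = 0.1
sent = [rng.getrandbits(1) for _ in range(100_000)]
received = bsc(sent, flip_prob, rng)

observed = sum(s != r for s, r in zip(sent, received)) / len(sent)
capacity = 1 - binary_entropy(flip_prob)  # ~0.53 bits per channel use for p = 0.1
print(f"observed flip rate ~{observed:.3f}, channel capacity ~{capacity:.3f} bits/use")
```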
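
Shannon’s transmitter–receiver scheme, onto which the excerpts from [9] map successive generations of agents, scores any such channel by the mutual information between what is sent and what is received. The sketch below computes that quantity for a simple 2×2 joint distribution; it is a generic illustration of the measure, not the survival model of [9]:

```python
import math

def mutual_information(joint):
    """I(X;Y) = sum over x,y of p(x,y) * log2(p(x,y) / (p(x) * p(y))), in bits."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return sum(
        p_xy * math.log2(p_xy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, p_xy in enumerate(row)
        if p_xy > 0
    )

# Uniform input through a binary symmetric channel that flips 10% of bits:
# p(0,0) = p(1,1) = 0.45 and p(0,1) = p(1,0) = 0.05.
print(mutual_information([[0.45, 0.05], [0.05, 0.45]]))  # ~0.531 bits, matching 1 - H2(0.1)
```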
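
For the lossless-compression application in [7], the source entropy sets a lower bound on the average compressed size. The sketch below compares zlib, an implementation of DEFLATE (the algorithm also used by ZIP), against that bound on a heavily biased binary source; the setup and numbers are illustrative only:

```python
import math
import random
import zlib

rng = random.Random(0)
p_one = 0.1                                    # the source emits 1 with probability 0.1
n = 100_000
data = bytes(int(rng.random() < p_one) for _ in range(n))  # one byte per source symbol

entropy_bits = -(p_one * math.log2(p_one) + (1 - p_one) * math.log2(1 - p_one))
bound_bytes = entropy_bits * n / 8             # Shannon bound: ~0.47 bits per symbol
compressed_bytes = len(zlib.compress(data, 9))

# zlib lands above the entropy bound but far below the raw size of n bytes.
print(f"raw {n} bytes, entropy bound ~{bound_bytes:.0f} bytes, zlib {compressed_bytes} bytes")
```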
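
The proof mentioned in [4] is Shannon’s 1949 result that the one-time pad achieves perfect secrecy: with a truly random key that is at least as long as the message and never reused, the ciphertext reveals nothing about the plaintext. A minimal sketch; the helper name is mine:

```python
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    """XOR each byte with the corresponding key byte; encryption and decryption are the same operation."""
    if len(key) < len(data):
        raise ValueError("a one-time pad key must be at least as long as the message")
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))   # uniformly random, used once, then discarded
ciphertext = otp_xor(message, key)
assert otp_xor(ciphertext, key) == message
```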
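
The thermodynamic connection noted in [7] is visible in the formulas themselves: the Gibbs entropy of statistical mechanics and Shannon’s entropy share the same form, differing only by Boltzmann’s constant and the base of the logarithm.

```latex
\begin{align}
  S &= -k_B \sum_i p_i \ln p_i   && \text{(Gibbs entropy over microstate probabilities)} \\
  H &= -\sum_i p_i \log_2 p_i    && \text{(Shannon entropy, in bits)} \\
  S &= (k_B \ln 2)\, H           && \text{(same distribution } \{p_i\}\text{)}
\end{align}
```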
Sources
- [1] EE 376A: Information Theory
- [2] Information theory | mathematics
- [3] What is information theory? (video)
- [4] Claude E. Shannon: Founder of Information Theory
- [5] Information Theory: a Multifaceted Model of Information
- [6] What is the “Information” in Information Theory?
- [7] Information theory
- [8] Extracting information from neuronal populations: information theory and decoding approaches
- [9] Information theory, predictability and the emergence of complex life