"정보 이론"의 두 판 사이의 차이

수학노트
둘러보기로 가기 검색하러 가기
(→‎메타데이터: 새 문단)
 
25번째 줄: 25번째 줄:
 
  <references />
 
  <references />
  
== 메타데이터 ==
+
==메타데이터==
 
 
 
===위키데이터===
 
===위키데이터===
 
* ID :  [https://www.wikidata.org/wiki/Q131222 Q131222]
 
* ID :  [https://www.wikidata.org/wiki/Q131222 Q131222]
 +
===Spacy 패턴 목록===
 +
* [{'LOWER': 'information'}, {'LEMMA': 'theory'}]
 +
* [{'LOWER': 'information'}, {'LEMMA': 'theory'}]

2021년 2월 17일 (수) 00:49 기준 최신판

Notes

Wikidata

  • ID : Q131222 (https://www.wikidata.org/wiki/Q131222)

Corpus

  1. How information theory bears on the design and operation of modern-day systems such as smartphones and the Internet.[1]
  2. Information theory is a mathematical representation of the conditions and parameters affecting the transmission and processing of information.[2]
  3. The real birth of modern information theory can be traced to the publication in 1948 of Claude Shannon’s “A Mathematical Theory of Communication” in the Bell System Technical Journal.[2]
  4. Since the 1940s and ’50s the principles of classical information theory have been applied to many fields.[2]
  5. Indeed, even in Shannon’s day many books and articles appeared that discussed the relationship between information theory and areas such as art and business.[2]
  6. Information theory holds the exciting answer to these questions.[3]
  7. A year after he founded and launched information theory, Shannon published a paper that proved that unbreakable cryptography was possible.[4]
  8. From the beginning of the development of information theory, more was known about how to measure information than about what information is.[5] (A sketch of the standard measure, Shannon entropy, follows this list.)
  9. In information theory, we think about the noisy communication channel which is used to communicate some events from one side to the other.[6]
  10. Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPEGs), and channel coding (e.g. for DSL).[7]
  11. Information theory studies the transmission, processing, extraction, and utilization of information.[7]
  12. Concepts, methods and results from coding theory and information theory are widely used in cryptography and cryptanalysis.[7]
  13. Much of the mathematics behind information theory for events with different probabilities was developed for the field of thermodynamics by Ludwig Boltzmann and J. Willard Gibbs.[7]
  14. The authors analysed in detail the relative strengths and weaknesses of information theory and Bayesian decoders when they are used to rule out neural codes.[8]
  15. We present a minimal formal model grounded in information theory and selection, in which successive generations of agents are mapped into transmitters and receivers of a coded message.[9]
  16. We adopt an information theory perspective in which agents are inference devices interacting with a Boolean environment.[9]
  17. As will be shown below, it is possible to derive the critical conditions to survive as a function of the agent’s complexity and to connect these conditions to information theory.[9]
  18. A satisfactory connection between natural selection and information theory can be obtained by mapping our survival function ρ into Shannon’s transmitter–receiver scheme.[9]
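
  The following is a minimal sketch of the measurement mentioned in item 8: Shannon entropy, estimated here from symbol frequencies, gives the average number of bits per symbol needed to encode a source, which is also the limit that the lossless compressors of item 10 approach. The function name and example strings are illustrative only and do not come from the sources above.

    import math
    from collections import Counter

    def shannon_entropy(message: str) -> float:
        """Shannon entropy in bits per symbol, estimated from symbol frequencies."""
        counts = Counter(message)
        total = len(message)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    # A repetitive (low-entropy) source needs fewer bits per symbol than an
    # unpredictable (high-entropy) one; lossless compressors exploit this gap.
    print(shannon_entropy("aaaaaaab"))   # ~0.54 bits per symbol
    print(shannon_entropy("abcdefgh"))   # 3.0 bits per symbol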

Sources

Metadata

Wikidata

  • ID : Q131222 (https://www.wikidata.org/wiki/Q131222)

Spacy pattern list

  • [{'LOWER': 'information'}, {'LEMMA': 'theory'}]
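
  The pattern above is a spaCy rule-based Matcher pattern: it matches a token whose lowercase form is "information" followed by a token whose lemma is "theory". A minimal usage sketch follows; the model name "en_core_web_sm" and the example sentence are assumptions for illustration, not part of the original page.

    import spacy
    from spacy.matcher import Matcher

    nlp = spacy.load("en_core_web_sm")   # assumed small English pipeline
    matcher = Matcher(nlp.vocab)

    # The token pattern from the list above.
    pattern = [{"LOWER": "information"}, {"LEMMA": "theory"}]
    matcher.add("INFORMATION_THEORY", [pattern])

    doc = nlp("Claude Shannon founded information theory in 1948.")
    for match_id, start, end in matcher(doc):
        print(doc[start:end].text)        # -> "information theory"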