Information theory is the study of how information can be transferred most effectively.
So-called statistical information theory was founded in the mid-1900s by Claude Shannon as part of the communication theory he developed, in which the information-theoretic concept of entropy plays a central role. In his theory, the amount of information measures the increase in certainty gained when a set of possible alternatives is reduced to a smaller number of options. It is computed as the logarithm of the ratio of the number of options before and after the reduction.
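This logarithmic measure can be sketched in a few lines of Python. The function name below is illustrative, not from the source; the base-2 logarithm expresses the result in bits.

```python
import math

# Shannon's measure: narrowing N equally likely alternatives down to M
# yields log2(N / M) bits of information.
def information_gain(options_before: int, options_after: int) -> float:
    return math.log2(options_before / options_after)

# Example: reducing 8 equally likely options to 1 yields 3 bits.
print(information_gain(8, 1))  # 3.0
```

The choice of logarithm base is a convention; base 2 gives bits, while the natural logarithm gives nats.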
This information measure is used, among other things, in communication theory to calculate the capacity of a channel to transfer information, and in coding theory to calculate redundancy and the degree of data compression that is possible.
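As a minimal sketch of how redundancy is calculated from this measure: the entropy of a discrete source gives the average information per symbol, and the gap between it and the maximum possible entropy (the logarithm of the alphabet size) is the redundancy, which bounds how far the source can be losslessly compressed. The probabilities below are an assumed example.

```python
import math

# Shannon entropy of a discrete source, in bits per symbol.
def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumed example distribution over a 4-symbol alphabet.
probs = [0.5, 0.25, 0.125, 0.125]
h = entropy(probs)               # actual entropy: 1.75 bits/symbol
h_max = math.log2(len(probs))    # maximum entropy: 2.0 bits/symbol
redundancy = h_max - h           # 0.25 bits/symbol available to compression
print(h, h_max, redundancy)
```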
Areas in which the theory has gained importance include telecommunications, computer technology, space technology, and biology. In most cases, the task is to send a complete message from a sender to a receiver.
The history of information theory dates back to the 1940s, when Claude Elwood Shannon made significant contributions to the theory of data transmission and to probability theory.
He asked how to ensure lossless data transmission over electronic (and, today, optical) channels. This involves, in particular, separating the data signals from the background noise. He also sought to identify errors that occur during transmission and to correct them. Additional redundant data (bearing no extra information) must be sent alongside the message so that it can be verified or corrected at the receiver.
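The simplest instance of such deliberate redundancy is a single parity bit, sketched below: one extra bit appended to a data word lets the receiver detect any single flipped bit, though not correct it. The function names are illustrative.

```python
# A single parity bit appended to a data word makes the total number of
# 1-bits even; the receiver re-checks this to detect a single-bit error.
def add_parity(bits):
    return bits + [sum(bits) % 2]

def check_parity(word):
    return sum(word) % 2 == 0

word = add_parity([1, 0, 1, 1])  # [1, 0, 1, 1, 1]
assert check_parity(word)        # transmission arrived intact
word[2] ^= 1                     # simulate a single flipped bit in transit
assert not check_parity(word)    # the error is detected
```

Error-correcting codes such as Hamming codes extend this idea, spending more redundant bits to locate and repair errors rather than merely detect them.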
It is doubtful, and was not claimed by Shannon, that his 1948 paper A Mathematical Theory of Communication is of substantial importance for issues outside communications technology. The concept of entropy he used, borrowed from thermodynamics, is a formal analogy for a mathematical expression. Information theory can broadly be characterized as an engineering theory at a high level of abstraction. It illustrates the trend toward a scientific treatment of technology that led to the formation of the engineering sciences. The reference point of Shannon's theory is the accelerated development of electrical communications technology, with its forms of telegraphy, telephony, radio, and television, in the first half of the 20th century. Before and alongside Shannon, Harry Nyquist, R. V. L. Hartley, and Karl Küpfmüller made important contributions to the theory of communication engineering. Norbert Wiener provided mathematical clarifications relevant to information theory, which gained considerable publicity in the context of his reflections on cybernetics.