In the late 1940s Claude Shannon, a research mathematician at Bell Telephone Laboratories, developed a mathematical theory of communication that provided the first systematic framework for designing telephone systems optimally. The main questions motivating it were how to design telephone systems to carry the maximum amount of information and how to correct for distortions on the lines.
His ground-breaking approach introduced a simple abstraction of human communication, called the channel. Shannon's communication channel consisted of a sender (a source of information), a transmission medium (with noise and distortion), and a receiver (whose goal is to reconstruct the sender's messages).
In order to quantitatively analyze transmission through the channel he also introduced a measure of the amount of information in a message. To Shannon the amount of information is a measure of surprise and is closely related to the chance of one of several messages being transmitted. For Shannon a message is very informative if the chance of its occurrence is small. If, in contrast, a message is very predictable, then it has a small amount of information---one is not surprised to receive it.
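Shannon's measure of surprise can be made concrete. As a minimal sketch (not from the original post), the self-information of a message with probability p is -log2(p) bits, so rare messages carry more information than predictable ones:

```python
import math

def information_content(p):
    """Shannon self-information in bits: the rarer the message, the more surprise."""
    return -math.log2(p)

# A fair coin flip (p = 0.5) carries 1 bit of information;
# a rarer message (p = 1/8) carries 3 bits.
print(information_content(0.5))    # 1.0
print(information_content(0.125))  # 3.0
```

A message that is certain (p = 1) carries zero information, matching the intuition that one is not surprised to receive it.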
To complete his quantitative analysis of the communication channel, Shannon introduced two further quantities: the entropy rate, which measures how fast a source produces information, and the channel capacity, which measures how much information the channel can carry.
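The entropy of a source is the average surprise over all of its messages, H = -Σ p·log2(p). A small illustrative sketch (the example distributions are my own, not from the post):

```python
import math

def entropy(probs):
    """Average information per message in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair two-message source produces 1 bit per message;
# a biased source (0.9 / 0.1) produces less, since its output is more predictable.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # about 0.47
```

The more predictable the source, the lower its entropy rate, and the easier it is to fit its output under a channel's capacity.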
He showed that if the entropy rate, the amount of information you wish to transmit, exceeds the channel capacity, then there are unavoidable and uncorrectable errors in the transmission. This is intuitive enough. What was truly surprising, though, is that he also showed that if the sender's entropy rate is below the channel capacity, then there is a way to encode the information so that it can be received without error. This holds even if the channel distorts the message during transmission.
Shannon also applied his theory to ordinary human (written) language. He showed that it is quite redundant, using more symbols and words than necessary to convey its messages. Presumably, this redundancy helps us recognize messages reliably and communicate different kinds of information.
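One simple way to see this redundancy is to compare the entropy of a text's letter frequencies against the maximum possible entropy of a 26-letter alphabet, log2(26) ≈ 4.7 bits. This is a rough single-letter sketch (the sample sentence is my own; Shannon's full analysis also accounted for letter sequences, which reveals even more redundancy):

```python
import math
from collections import Counter

def letter_entropy(text):
    """Entropy in bits per letter of the text's single-letter frequency distribution."""
    letters = [c for c in text.lower() if c.isalpha()]
    counts = Counter(letters)
    total = len(letters)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

sample = "the quick brown fox jumps over the lazy dog and then runs away again"
h = letter_entropy(sample)
max_h = math.log2(26)  # entropy if all 26 letters were equally likely
redundancy = 1 - h / max_h
print(f"entropy: {h:.2f} bits/letter, redundancy: {redundancy:.0%}")
```

Because real English uses some letters far more often than others, the measured entropy falls below the 4.7-bit maximum, and the shortfall is the redundancy.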
In the study of complex systems today, we view deterministic chaotic processes as information sources and use Shannon's entropy rate, as adapted by Kolmogorov and his student Y. Sinai in the late 1950s, to measure how random a chaotic system is.
SIMPLIFIED VERSION:
- There are four kinds of communication contexts: physical (the environment where communication occurs); social/psychological (which includes the status of a relationship or the seriousness of the situation); temporal (the time of the interaction); and cultural (the backgrounds of the people communicating).
- When you speak to someone, you are the source of the message. When you listen to someone, you are the receiver. These two functions are not mutually exclusive, as we send and receive messages simultaneously in conversations.
- We send messages both verbally and nonverbally, and they can be received through a combination of all of our senses.
- Channels are the media used in communication, such as the telephone or a chat room.
- Noise is anything that interferes with receiving a message, from distracting thoughts to loud sounds that make it difficult to hear.
- Every communication act we engage in has effects. These effects can be cognitive (changing our thinking), affective (changing our feelings), or psychomotor (affecting bodily movements).
DONE BY: ISHVERJIT SINGH