- Entropy
Shannon showed the power of probabilistic models for symbolic-valued signals. The key quantity that characterizes such a signal is the entropy of its alphabet.
Communication theory has been formulated best for symbolic-valued signals. Claude Shannon published in 1948 The Mathematical Theory of Communication, which became the cornerstone of digital communication. He showed the power of probabilistic models for symbolic-valued signals, which allowed him to quantify the information present in a signal. In the simplest signal model, each symbol can occur at index $n$ with a probability $\Pr[a_k]$, $k \in \{1, \ldots, K\}$. What this model says is that for each signal value a $K$-sided coin is flipped (note that the coin need not be fair). For this model to make sense, the probabilities must be numbers between zero and one and must sum to one:
$$0 \le \Pr[a_k] \le 1, \qquad \sum_{k=1}^{K} \Pr[a_k] = 1$$
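To make the coin-flipping model concrete, here is a minimal Python sketch; the alphabet names and probability values are illustrative assumptions, not quantities from the text.

    import random

    # Hypothetical four-symbol alphabet with assumed probabilities
    # (any values in [0, 1] that sum to one would do).
    alphabet = ["a1", "a2", "a3", "a4"]
    probabilities = [0.5, 0.25, 0.125, 0.125]

    # Each signal value is an independent flip of a biased K-sided coin:
    # preceding and succeeding symbols have no influence on the outcome.
    signal = random.choices(alphabet, weights=probabilities, k=20)
    print(signal)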
This coin-flipping model assumes that symbols occur without regard to what preceding or succeeding symbols were, a false assumption for typed text. Despite this probabilistic model's over-simplicity, the ideas we develop here also work when more accurate, but still probabilistic, models are used. The key quantity that characterizes a symbolic-valued signal is the entropy of its alphabet:
$$H(A) = -\sum_{k=1}^{K} \Pr[a_k] \log_2 \Pr[a_k]$$
Because we use the base-2 logarithm, entropy has units of bits. For this definition to make sense, we must take special note of symbols having probability zero of occurring. A zero-probability symbol never occurs; thus, we define $0 \log_2 0 = 0$ so that such symbols do not affect the entropy. The maximum value attainable by an alphabet's entropy occurs when the symbols are equally likely ($\Pr[a_k] = \frac{1}{K}$ for all $k$). In this case, the entropy equals $\log_2 K$. The minimum value occurs when only one symbol occurs; it has probability one of occurring and the rest have probability zero.
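This definition translates directly into a few lines of code. The following minimal Python sketch computes $H(A)$ from a list of probabilities, skipping zero-probability terms to implement the $0 \log_2 0 = 0$ convention, and checks the two extremes just described (the alphabet size $K = 4$ is an illustrative choice):

    from math import log2

    def entropy(probabilities):
        # H(A) = -sum over k of Pr[a_k] * log2(Pr[a_k]), in bits.
        # Zero-probability terms are skipped: 0 * log2(0) is defined as 0.
        return sum(-p * log2(p) for p in probabilities if p > 0)

    K = 4
    print(entropy([1 / K] * K))           # equally likely: log2(4) = 2 bits, the maximum
    print(entropy([1.0, 0.0, 0.0, 0.0]))  # one certain symbol: 0 bits, the minimum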
Derive the maximum-entropy results, both the numeric aspect (entropy equals $\log_2 K$) and the theoretical one (equally likely symbols maximize entropy). Derive the value of the minimum entropy alphabet.
Equally likely symbols each have a probability of $\frac{1}{K}$. Thus,
$$H(A) = -\sum_{k=1}^{K} \frac{1}{K} \log_2 \frac{1}{K} = \log_2 K$$
To prove that this is the maximum-entropy probability assignment, we must explicitly take into account that probabilities sum to one: substituting the constraint gives $\Pr[a_K] = 1 - \Pr[a_1] - \cdots - \Pr[a_{K-1}]$. Focus on a particular symbol, say the first. $\Pr[a_1]$ appears twice in the entropy formula: in the term $\Pr[a_1] \log_2 \Pr[a_1]$ and in $\left(1 - \Pr[a_1] - \cdots - \Pr[a_{K-1}]\right) \log_2 \left(1 - \Pr[a_1] - \cdots - \Pr[a_{K-1}]\right)$. The derivative with respect to this probability (and all the others) must be zero. The derivative equals $\log_2 \left(1 - \Pr[a_1] - \cdots - \Pr[a_{K-1}]\right) - \log_2 \Pr[a_1]$, and all other derivatives have the same form (just substitute your letter's index). Setting each derivative to zero thus requires each probability to equal the others, and we are done. For the minimum entropy answer, one term is $1 \log_2 1 = 0$, and the others are $0 \log_2 0$, which we define to be zero also. The minimum value of entropy is zero.
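As a quick numerical sanity check of the maximization argument (an illustration added here, not part of the original solution), perturbing an equally likely assignment while keeping the probabilities summing to one can only decrease the entropy:

    # Assumes the entropy() function sketched above.
    uniform = [0.25, 0.25, 0.25, 0.25]
    perturbed = [0.30, 0.20, 0.25, 0.25]  # still sums to one
    print(entropy(uniform))    # 2.0 bits, the maximum for K = 4
    print(entropy(perturbed))  # about 1.985 bits, strictly less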
A four-symbol alphabet has the following probabilities:
$$\Pr[a_1] = \frac{1}{2}, \quad \Pr[a_2] = \frac{1}{4}, \quad \Pr[a_3] = \frac{1}{8}, \quad \Pr[a_4] = \frac{1}{8}$$
Note that these probabilities sum to one as they should. As $\frac{1}{2} = 2^{-1}$, $\log_2 \frac{1}{2} = -1$. The entropy of this alphabet equals
$$H(A) = -\left(\frac{1}{2} \log_2 \frac{1}{2} + \frac{1}{4} \log_2 \frac{1}{4} + \frac{1}{8} \log_2 \frac{1}{8} + \frac{1}{8} \log_2 \frac{1}{8}\right) = \frac{1}{2} + \frac{2}{4} + \frac{3}{8} + \frac{3}{8} = 1.75 \text{ bits}$$
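This arithmetic is easy to verify with the entropy() function sketched earlier:

    # Assumes the entropy() function sketched above.
    print(entropy([1 / 2, 1 / 4, 1 / 8, 1 / 8]))  # 1.75 bits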
Source:
OpenStax, Fundamentals of Electrical Engineering I. OpenStax CNX. Aug 06, 2008. Download for free at http://legacy.cnx.org/content/col10040/1.9