

    Information Theory and Coding

Computer Science Tripos Part II, Michaelmas Term

11 Lectures by J G Daugman

    1. Foundations: Probability, Uncertainty, and Information

    2. Entropies Defined, and Why they are Measures of Information

    3. Source Coding Theorem; Prefix, Variable-, & Fixed-Length Codes

    4. Channel Types, Properties, Noise, and Channel Capacity

    5. Continuous Information; Density; Noisy Channel Coding Theorem

    6. Fourier Series, Convergence, Orthogonal Representation

    7. Useful Fourier Theorems; Transform Pairs; Sampling; Aliasing

    8. Discrete Fourier Transform. Fast Fourier Transform Algorithms

    9. The Quantized Degrees-of-Freedom in a Continuous Signal

    10. Gabor-Heisenberg-Weyl Uncertainty Relation. Optimal Logons

    11. Kolmogorov Complexity and Minimal Description Length


    Information Theory and Coding

    J G Daugman

    Prerequisite courses: Probability; Mathematical Methods for CS; Discrete Mathematics

    Aims

The aims of this course are to introduce the principles and applications of information theory. The course will study how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies; how these are used to calculate the capacity of a communication channel, with and without noise; coding schemes, including error correcting codes; how discrete channels and measures of information generalise to their continuous forms; the Fourier perspective; and extensions to wavelets, complexity, compression, and efficient coding of audio-visual information.

    Lectures

Foundations: probability, uncertainty, information. How concepts of randomness, redundancy, compressibility, noise, bandwidth, and uncertainty are related to information. Ensembles, random variables, marginal and conditional probabilities. How the metrics of information are grounded in the rules of probability.

Entropies defined, and why they are measures of information. Marginal entropy, joint entropy, conditional entropy, and the Chain Rule for entropy. Mutual information between ensembles of random variables. Why entropy is the fundamental measure of information content.

Source coding theorem; prefix, variable-, and fixed-length codes. Symbol codes. The binary symmetric channel. Capacity of a noiseless discrete channel. Error correcting codes.

Channel types, properties, noise, and channel capacity. Perfect communication through a noisy channel. Capacity of a discrete channel as the maximum of its mutual information over all possible input distributions.

Continuous information; density; noisy channel coding theorem. Extensions of the discrete entropies and measures to the continuous case. Signal-to-noise ratio; power spectral density. Gaussian channels. Relative significance of bandwidth and noise limitations. The Shannon rate limit and efficiency for noisy continuous channels.

Fourier series, convergence, orthogonal representation. Generalised signal expansions in vector spaces. Independence. Representation of continuous or discrete data by complex exponentials. The Fourier basis. Fourier series for periodic functions. Examples.

Useful Fourier theorems; transform pairs. Sampling; aliasing. The Fourier transform for non-periodic functions. Properties of the transform, and examples. Nyquist's Sampling Theorem derived, and the cause (and removal) of aliasing.

Discrete Fourier transform. Fast Fourier Transform algorithms. Efficient algorithms for computing Fourier transforms of discrete data. Computational complexity. Filters, correlation, modulation, demodulation, coherence.


The quantised degrees-of-freedom in a continuous signal. Why a continuous signal of finite bandwidth and duration has a fixed number of degrees-of-freedom. Diverse illustrations of the principle that information, even in such a signal, comes in quantised, countable, packets.

Gabor-Heisenberg-Weyl uncertainty relation. Optimal logons. Unification of the time-domain and the frequency-domain as endpoints of a continuous deformation. The Uncertainty Principle and its optimal solution by Gabor's expansion basis of logons. Multi-resolution wavelet codes. Extension to images, for analysis and compression.

Kolmogorov complexity. Minimal description length. Definition of the algorithmic complexity of a data sequence, and its relation to the entropy of the distribution from which the data was drawn. Fractals. Minimal description length, and why this measure of complexity is not computable.

    Objectives

    At the end of the course students should be able to

    calculate the information content of a random variable from its probability distribution

relate the joint, conditional, and marginal entropies of variables in terms of their coupled probabilities

define channel capacities and properties using Shannon's Theorems

    construct efficient codes for data on imperfect communication channels

    generalise the discrete concepts to continuous signals on continuous channels

understand Fourier Transforms and the main ideas of efficient algorithms for them

describe the information resolution and compression properties of wavelets

    Recommended book

    * Cover, T.M. & Thomas, J.A. (1991). Elements of information theory. New York: Wiley.


    Information Theory and Coding

    Computer Science Tripos Part II, Michaelmas Term

    11 lectures by J G Daugman

    1. Overview: What is Information Theory?

Key idea: The movements and transformations of information, just like those of a fluid, are constrained by mathematical and physical laws.

    These laws have deep connections with: probability theory, statistics, and combinatorics

    thermodynamics (statistical physics)

    spectral analysis, Fourier (and other) transforms

    sampling theory, prediction, estimation theory

electrical engineering (bandwidth; signal-to-noise ratio)

complexity theory (minimal description length)

    signal processing, representation, compressibility

As such, information theory addresses and answers the two fundamental questions of communication theory:

1. What is the ultimate data compression? (answer: the entropy of the data, H, is its compression limit.)

2. What is the ultimate transmission rate of communication? (answer: the channel capacity, C, is its rate limit.)

All communication schemes lie in between these two limits on the compressibility of data and the capacity of a channel. Information theory can suggest means to achieve these theoretical limits. But the subject also extends far beyond communication theory.


Important questions... to which Information Theory offers answers:

How should information be measured?

How much additional information is gained by some reduction in uncertainty?

How do the a priori probabilities of possible messages determine the informativeness of receiving them?

What is the information content of a random variable?

How does the noise level in a communication channel limit its capacity to transmit information?

How does the bandwidth (in cycles/second) of a communication channel limit its capacity to transmit information?

By what formalism should prior knowledge be combined with incoming data to draw formally justifiable inferences from both?

How much information is contained in a strand of DNA?

How much information is there in the firing pattern of a neurone?

    Historical origins and important contributions:

Ludwig BOLTZMANN (1844-1906), physicist, showed in 1877 that thermodynamic entropy (defined as the energy of a statistical ensemble [such as a gas] divided by its temperature: ergs/degree) is related to the statistical distribution of molecular configurations, with increasing entropy corresponding to increasing randomness. He made this relationship precise with his famous formula S = k log W, where S defines entropy, W is the total number of possible molecular configurations, and k is the constant which bears Boltzmann's name: k = 1.38 x 10^-16 ergs per degree centigrade. (The above formula appears as an epitaph on Boltzmann's tombstone.) This is


equivalent to the definition of the information (negentropy) in an ensemble, all of whose possible states are equiprobable, but with a minus sign in front (and when the logarithm is base 2, k = 1). The deep connections between Information Theory and that branch of physics concerned with thermodynamics and statistical mechanics hinge upon Boltzmann's work.

Leo SZILARD (1898-1964) in 1929 identified entropy with information. He formulated key information-theoretic concepts to solve the thermodynamic paradox known as Maxwell's demon (a thought-experiment about gas molecules in a partitioned box) by showing that the amount of information required by the demon about the positions and velocities of the molecules was equal (negatively) to the demon's entropy increment.

James Clerk MAXWELL (1831-1879) originated the paradox called Maxwell's Demon which greatly influenced Boltzmann and which led to the watershed insight for information theory contributed by Szilard. At Cambridge, Maxwell founded the Cavendish Laboratory which became the original Department of Physics.

R V HARTLEY in 1928 founded communication theory with his paper "Transmission of Information". He proposed that a signal (or a communication channel) having bandwidth Ω over a duration T has a limited number of degrees-of-freedom, namely 2ΩT, and therefore it can communicate at most this quantity of information. He also defined the information content of an equiprobable ensemble of N possible states as equal to log2 N.

Norbert WIENER (1894-1964) unified information theory and Fourier analysis by deriving a series of relationships between the two. He invented white noise analysis of non-linear systems, and made the definitive contribution to modeling and describing the information content of stochastic processes known as Time Series.


Dennis GABOR (1900-1979) crystallized Hartley's insight by formulating a general Uncertainty Principle for information, expressing the trade-off for resolution between bandwidth and time. (Signals that are well specified in frequency content must be poorly localized in time, and those that are well localized in time must be poorly specified in frequency content.) He formalized the Information Diagram to describe this fundamental trade-off, and derived the continuous family of functions which optimize (minimize) the conjoint uncertainty relation. In 1971 Gabor won the Nobel Prize in Physics for his work in Fourier optics, including the invention of holography.

Claude SHANNON (together with Warren WEAVER) in 1949 wrote the definitive, classic, work in information theory: "Mathematical Theory of Communication". Divided into separate treatments for continuous-time and discrete-time signals, systems, and channels, this book laid out all of the key concepts and relationships that define the field today. In particular, he proved the famous Source Coding Theorem and the Noisy Channel Coding Theorem, plus many other related results about channel capacity.

S KULLBACK and R A LEIBLER (1951) defined relative entropy (also called information for discrimination, or K-L Distance).

E T JAYNES (since 1957) developed maximum entropy methods for inference, hypothesis-testing, and decision-making, based on the physics of statistical mechanics. Others have inquired whether these principles impose fundamental physical limits to computation itself.

A N KOLMOGOROV in 1965 proposed that the complexity of a string of data can be defined by the length of the shortest binary program for computing the string. Thus the complexity of data is its minimal description length, and this specifies the ultimate compressibility of data. The Kolmogorov complexity K of a string is approximately equal to its Shannon entropy H, thereby unifying the theory of descriptive complexity and information theory.


2. Mathematical Foundations; Probability Rules; Bayes' Theorem

    What are random variables? What is probability?

Random variables are variables that take on values determined by probability distributions. They may be discrete or continuous, in either their domain or their range. For example, a stream of ASCII encoded text characters in a transmitted message is a discrete random variable, with a known probability distribution for any given natural language. An analog speech signal represented by a voltage or sound pressure waveform as a function of time (perhaps with added noise), is a continuous random variable having a continuous probability density function.

Most of Information Theory involves probability distributions of random variables, and conjoint or conditional probabilities defined over ensembles of random variables. Indeed, the information content of a symbol or event is defined by its (im)probability. Classically, there are two different points of view about what probability actually means:

relative frequency: sample the random variable a great many times and tally up the fraction of times that each of its different possible values occurs, to arrive at the probability of each.

degree-of-belief: probability is the plausibility of a proposition or the likelihood that a particular state (or value of a random variable) might occur, even if its outcome can only be decided once (e.g. the outcome of a particular horse-race).

The first view, the frequentist or operationalist view, is the one that predominates in statistics and in information theory. However, by no means does it capture the full meaning of probability. For example, the proposition that "The moon is made of green cheese" is one which surely has a probability that we should be able to attach to it. We could assess its probability by degree-of-belief calculations which


combine our prior knowledge about physics, geology, and dairy products. Yet the frequentist definition of probability could only assign a probability to this proposition by performing (say) a large number of repeated trips to the moon, and tallying up the fraction of trips on which the moon turned out to be a dairy product....

In either case, it seems sensible that the less probable an event is, the more information is gained by noting its occurrence. (Surely discovering that the moon IS made of green cheese would be more informative than merely learning that it is made only of earth-like rocks.)

    Probability Rules

Most of probability theory was laid down by theologians: Blaise PASCAL (1623-1662) who gave it the axiomatization that we accept today; and Thomas BAYES (1702-1761) who expressed one of its most important and widely-applied propositions relating conditional probabilities.

    Probability Theory rests upon two rules:

Product Rule:

p(A,B) = joint probability of both A and B
       = p(A|B) p(B)

or equivalently,

       = p(B|A) p(A)

Clearly, in case A and B are independent events, they are not conditionalized on each other and so

p(A|B) = p(A) and p(B|A) = p(B),

in which case their joint probability is simply p(A,B) = p(A) p(B).


Sum Rule:

If event A is conditionalized on a number of other events B, then the total probability of A is the sum of its joint probabilities with all B:

p(A) = Σ_B p(A,B) = Σ_B p(A|B) p(B)

From the Product Rule and the symmetry that p(A,B) = p(B,A), it is clear that p(A|B) p(B) = p(B|A) p(A). Bayes' Theorem then follows:

Bayes' Theorem:

p(B|A) = p(A|B) p(B) / p(A)

The importance of Bayes' Rule is that it allows us to reverse the conditionalizing of events, and to compute p(B|A) from knowledge of p(A|B), p(A), and p(B). Often these are expressed as prior and posterior probabilities, or as the conditionalizing of hypotheses upon data.

Worked Example:

Suppose that a dread disease affects 1/1000th of all people. If you actually have the disease, a test for it is positive 95% of the time, and negative 5% of the time. If you don't have the disease, the test is positive 5% of the time. We wish to know how to interpret test results.

Suppose you test positive for the disease. What is the likelihood that you actually have it?

We use the above rules, with the following substitutions of data D and hypothesis H instead of A and B:

D = data: the test is positive
H = hypothesis: you have the disease
H' = the other hypothesis: you do not have the disease


Before acquiring the data, we know only that the a priori probability of having the disease is .001, which sets p(H). This is called a prior. We also need to know p(D).

From the Sum Rule, we can calculate that the a priori probability p(D) of testing positive, whatever the truth may actually be, is:

p(D) = p(D|H) p(H) + p(D|H') p(H') = (.95)(.001) + (.05)(.999) = .051

and from Bayes' Rule, we can conclude that the probability that you actually have the disease given that you tested positive for it, is much smaller than you may have thought:

p(H|D) = p(D|H) p(H) / p(D) = (.95)(.001) / (.051) = 0.019 (less than 2%).

This quantity is called the posterior probability because it is computed after the observation of data; it tells us how likely the hypothesis is, given what we have observed. (Note: it is an extremely common human fallacy to confound p(H|D) with p(D|H). In the example given, most people would react to the positive test result by concluding that the likelihood that they have the disease is .95, since that is the hit rate of the test. They confound p(D|H) = .95 with p(H|D) = .019, which is what actually matters.)

A nice feature of Bayes' Theorem is that it provides a simple mechanism for repeatedly updating our assessment of the hypothesis as more data continues to arrive. We can apply the rule recursively, using the latest posterior as the new prior for interpreting the next set of data. In Artificial Intelligence, this feature is important because it allows the systematic and real-time construction of interpretations that can be updated continuously as more data arrive in a time series, such as a flow of images or spoken sounds that we wish to understand.
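A minimal Python sketch of this reasoning (illustrative code, not part of the original notes): it reproduces the worked example, then applies the rule recursively to a second, independent positive test.

    def posterior(prior_h, p_d_given_h, p_d_given_not_h):
        # p(H|D) for a positive test result D, via the Sum and Product Rules
        p_d = p_d_given_h * prior_h + p_d_given_not_h * (1 - prior_h)
        return p_d_given_h * prior_h / p_d

    p1 = posterior(0.001, 0.95, 0.05)
    print(round(p1, 3))    # 0.019, as in the worked example

    # Recursive updating: the posterior becomes the prior for the next datum.
    p2 = posterior(p1, 0.95, 0.05)
    print(round(p2, 3))    # 0.265 after a second positive test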


3. Entropies Defined, and Why They are Measures of Information

The information content I of a single event or message is defined as the base-2 logarithm of its probability p:

I = log2 p (1)

and its entropy H is considered the negative of this. Entropy can be regarded intuitively as uncertainty, or disorder. To gain information is to lose uncertainty by the same amount, so I and H differ only in sign (if at all): H = -I. Entropy and information have units of bits.

Note that I as defined in Eqt (1) is never positive: it ranges between 0 and -∞ as p varies from 1 to 0. However, sometimes the sign is dropped, and I is considered the same thing as H (as we'll do later too).

No information is gained (no uncertainty is lost) by the appearance of an event or the receipt of a message that was completely certain anyway (p = 1, so I = 0). Intuitively, the more improbable an event is, the more informative it is; and so the monotonic behaviour of Eqt (1) seems appropriate. But why the logarithm?

The logarithmic measure is justified by the desire for information to be additive. We want the algebra of our measures to reflect the Rules of Probability. When independent packets of information arrive, we would like to say that the total information received is the sum of the individual pieces. But the probabilities of independent events multiply to give their combined probabilities, and so we must take logarithms in order for the joint probability of independent events or messages to contribute additively to the information gained.

This principle can also be understood in terms of the combinatorics of state spaces. Suppose we have two independent problems, one with n possible solutions (or states) each having probability p_n, and the other with m possible solutions (or states) each having probability p_m. Then the number of combined states is mn, and each of these has probability p_m p_n. We would like to say that the information gained by specifying the solution to both problems is the sum of that gained from each one. This desired property is achieved:

I_mn = log2(p_m p_n) = log2 p_m + log2 p_n = I_m + I_n (2)

A Note on Logarithms:

In information theory we often wish to compute the base-2 logarithms of quantities, but most calculators (and tools like xcalc) only offer Napierian (base 2.718...) and decimal (base 10) logarithms. So the following conversions are useful:

log2 X = 1.443 loge X = 3.322 log10 X

Henceforward we will omit the subscript; base-2 is always presumed.

Intuitive Example of the Information Measure (Eqt 1):

Suppose I choose at random one of the 26 letters of the alphabet, and we play the game of 25 questions in which you must determine which letter I have chosen. I will only answer yes or no. What is the minimum number of such questions that you must ask in order to guarantee finding the answer? (What form should such questions take? e.g., Is it A? Is it B? ... or is there some more intelligent way to solve this problem?)

The answer to a Yes/No question having equal probabilities conveys one bit worth of information. In the above example with equiprobable states, you never need to ask more than 5 (well-phrased!) questions to discover the answer, even though there are 26 possibilities. Appropriately, Eqt (1) tells us that the uncertainty removed as a result of solving this problem is about -4.7 bits.
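A minimal Python sketch of the more intelligent strategy (illustrative code; it assumes each question may ask about any subset of the letters): every question halves the remaining candidates.

    import math
    import string

    letters = list(string.ascii_uppercase)       # 26 equiprobable candidates
    print(math.ceil(math.log2(len(letters))))    # 5 questions always suffice

    def questions_needed(secret, candidates):
        # Ask "is it in the first half?" until one candidate remains.
        count = 0
        while len(candidates) > 1:
            half = candidates[: len(candidates) // 2]
            count += 1
            candidates = half if secret in half else candidates[len(half):]
        return count

    print(max(questions_needed(s, letters) for s in letters))   # 5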


Entropy of Ensembles

We now move from considering the information content of a single event or message, to that of an ensemble. An ensemble is the set of outcomes of one or more random variables. The outcomes have probabilities attached to them. In general these probabilities are non-uniform, with event i having probability p_i, but they must sum to 1 because all possible outcomes are included; hence they form a probability distribution:

Σ_i p_i = 1 (3)

The entropy of an ensemble is simply the average entropy of all the elements in it. We can compute their average entropy by weighting each of the log p_i contributions by its probability p_i:

H = -I = -Σ_i p_i log p_i (4)

Eqt (4) allows us to speak of the information content or the entropy of a random variable, from knowledge of the probability distribution that it obeys. (Entropy does not depend upon the actual values taken by the random variable! Only upon their relative probabilities.)

Let us consider a random variable that takes on only two values, one with probability p and the other with probability (1 - p). Entropy is a concave function of this distribution, and equals 0 if p = 0 or p = 1.
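A minimal Python sketch of Eqt (4) (illustrative code, not part of the notes), applied to this two-valued variable:

    import math

    def entropy(probs):
        # H = sum of p * log2(1/p), skipping zero-probability outcomes
        return sum(p * math.log2(1 / p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))            # 1.0 bit: maximal uncertainty
    print(round(entropy([0.9, 0.1]), 4))  # 0.469 bits
    print(entropy([1.0, 0.0]))            # 0.0: certainty, an endpoint of the curve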


Example of entropy as average uncertainty:

The various letters of the written English language have the following relative frequencies (probabilities), in descending order:

E     T     O     A     N     I     R     S     H     D     L     C    ...
.105  .072  .066  .063  .059  .055  .054  .052  .047  .035  .029  .023 ...

If they had been equiprobable, the entropy of the ensemble would have been -log2(1/26) = 4.7 bits. But their non-uniform probabilities imply that, for example, an E is nearly five times more likely than a C; surely this prior knowledge is a reduction in the uncertainty of this random variable. In fact, the distribution of English letters has an entropy of only 4.0 bits. This means that as few as only four Yes/No questions are needed, in principle, to identify one of the 26 letters of the alphabet; not five.

How can this be true?

That is the subject matter of Shannon's SOURCE CODING THEOREM (so named because it uses the statistics of the source, the a priori probabilities of the message generator, to construct an optimal code.)

Note the important assumption: that the source statistics are known!

Several further measures of entropy need to be defined, involving the marginal, joint, and conditional probabilities of random variables. Some key relationships will then emerge, that we can apply to the analysis of communication channels.

Notation: We use capital letters X and Y to name random variables, and lower case letters x and y to refer to their respective outcomes. These are drawn from particular sets A and B: x ∈ {a1, a2, ..., aJ}, and y ∈ {b1, b2, ..., bK}. The probability of a particular outcome p(x = ai) is denoted p_i, with 0 ≤ p_i ≤ 1 and Σ_i p_i = 1.


An ensemble is just a random variable X, whose entropy was defined in Eqt (4). A joint ensemble XY is an ensemble whose outcomes are ordered pairs x, y with x ∈ {a1, a2, ..., aJ} and y ∈ {b1, b2, ..., bK}. The joint ensemble XY defines a probability distribution p(x, y) over all possible joint outcomes x, y.

Marginal probability: From the Sum Rule, we can see that the probability of X taking on a particular value x = ai is the sum of the joint probabilities of this outcome for X and all possible outcomes for Y:

p(x = ai) = Σ_y p(x = ai, y)

We can simplify this notation to: p(x) = Σ_y p(x, y)
and similarly: p(y) = Σ_x p(x, y)

Conditional probability: From the Product Rule, we can easily see that the conditional probability that x = ai, given that y = bj, is:

p(x = ai | y = bj) = p(x = ai, y = bj) / p(y = bj)

We can simplify this notation to: p(x|y) = p(x, y) / p(y)
and similarly: p(y|x) = p(x, y) / p(x)

It is now possible to define various entropy measures for joint ensembles:

Joint entropy of XY:

H(X, Y) = Σ_{x,y} p(x, y) log [1 / p(x, y)] (5)

(Note that in comparison with Eqt (4), we have replaced the minus sign in front by taking the reciprocal of p inside the logarithm.)


From this definition, it follows that joint entropy is additive if X and Y are independent random variables:

H(X, Y) = H(X) + H(Y) iff p(x, y) = p(x) p(y)

Prove this.

Conditional entropy of an ensemble X, given that y = bj

measures the uncertainty remaining about random variable X after specifying that random variable Y has taken on a particular value y = bj. It is defined naturally as the entropy of the probability distribution p(x | y = bj):

H(X | y = bj) = Σ_x p(x | y = bj) log [1 / p(x | y = bj)] (6)

If we now consider the above quantity averaged over all possible outcomes that Y might have, each weighted by its probability p(y), then we arrive at the...

Conditional entropy of an ensemble X, given an ensemble Y:

H(X|Y) = Σ_y p(y) Σ_x p(x|y) log [1 / p(x|y)] (7)

and we know from the Sum Rule that if we move the p(y) term from the outer summation over y, to inside the inner summation over x, the two probability terms combine and become just p(x, y) summed over all x, y. Hence a simpler expression for this conditional entropy is:

H(X|Y) = Σ_{x,y} p(x, y) log [1 / p(x|y)] (8)

This measures the average uncertainty that remains about X, when Y is known.


Chain Rule for Entropy

The joint entropy, conditional entropy, and marginal entropy for two ensembles X and Y are related by:

H(X, Y) = H(X) + H(Y|X) = H(Y) + H(X|Y) (9)

It should seem natural and intuitive that the joint entropy of a pair of random variables is the entropy of one plus the conditional entropy of the other (the uncertainty that it adds once its dependence on the first one has been discounted by conditionalizing on it). You can derive the Chain Rule from the earlier definitions of these three entropies.

Corollary to the Chain Rule:

If we have three random variables X, Y, Z, the conditionalizing of the joint distribution of any two of them, upon the third, is also expressed by a Chain Rule:

H(X, Y | Z) = H(X|Z) + H(Y | X, Z) (10)

Independence Bound on Entropy

A consequence of the Chain Rule for Entropy is that if we have many different random variables X1, X2, ..., Xn, then the sum of all their individual entropies is an upper bound on their joint entropy:

H(X1, X2, ..., Xn) ≤ Σ_{i=1}^{n} H(Xi) (11)

Their joint entropy only reaches this upper bound if all of the random variables are independent.
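A short Python check of Eqt (8) and the Chain Rule (illustrative code; the joint distribution is invented for the example):

    import math

    def H(probs):
        # entropy in bits
        return sum(p * math.log2(1 / p) for p in probs if p > 0)

    # An arbitrary joint distribution p(x, y) over two binary variables.
    pxy = {(0, 0): 0.30, (0, 1): 0.20, (1, 0): 0.10, (1, 1): 0.40}
    py = {y: pxy[(0, y)] + pxy[(1, y)] for y in (0, 1)}   # marginal, Sum Rule

    # Conditional entropy H(X|Y) via Eqt (8).
    H_x_given_y = sum(p * math.log2(py[y] / p) for (x, y), p in pxy.items())

    # Chain Rule, Eqt (9): H(X,Y) = H(Y) + H(X|Y).
    print(round(H(pxy.values()), 4))                  # 1.8464
    print(round(H(py.values()) + H_x_given_y, 4))     # 1.8464, as required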

Mutual information between X and Y

How much information do two random variables convey about each other? The mutual information between X and Y averages, over the joint ensemble, the information that each joint outcome provides about the two variables together:

I(X;Y) = Σ_{x,y} p(x, y) log [ p(x, y) / (p(x) p(y)) ] (12)

Because the joint and conditional entropies defined above are related by the Chain Rule, the mutual information can equivalently be written as the reduction in uncertainty about one variable given the other:

I(X;Y) = H(X) - H(X|Y) (13)
       = H(Y) - H(Y|X) (14)
       = H(X) + H(Y) - H(X, Y) (15)

Mutual information is symmetric, I(X;Y) = I(Y;X); it is non-negative; and it is zero exactly when X and Y are independent.

Distance D(X, Y) between X and Y

The amount by which the joint entropy of two random variables exceeds their mutual information is a measure of the distance between them:

D(X, Y) = H(X, Y) - I(X;Y) (16)

Note that this quantity satisfies the standard axioms for a distance: D(X, Y) ≥ 0, D(X, X) = 0, D(X, Y) = D(Y, X), and D(X, Z) ≤ D(X, Y) + D(Y, Z).

Relative entropy, or Kullback-Leibler distance

Another important measure of the distance between two random variables, although it does not satisfy the above axioms for a distance metric, is the relative entropy or Kullback-Leibler distance. It is also called the information for discrimination. If p(x) and q(x) are two probability distributions defined over the same set of outcomes x, then their relative entropy is:

D_KL(p||q) = Σ_x p(x) log [ p(x) / q(x) ] (17)

Note that D_KL(p||q) ≥ 0, and in case p(x) = q(x) then their distance D_KL(p||q) = 0, as one might hope. However, this metric is not strictly a distance, since in general it lacks symmetry: D_KL(p||q) ≠ D_KL(q||p).

The relative entropy D_KL(p||q) is a measure of the inefficiency of assuming that a distribution is q(x) when in fact it is p(x). If we have an optimal code for the distribution p(x) (meaning that we use on average H(p(x)) bits, its entropy, to describe it), then the number of additional bits that we would need to use if we instead described p(x) using an optimal code for q(x), would be their relative entropy D_KL(p||q).
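A minimal Python sketch of Eqt (17) (illustrative code), showing the lack of symmetry:

    import math

    def kl(p, q):
        # D_KL(p||q) in bits; assumes q(x) > 0 wherever p(x) > 0
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    p = [0.5, 0.5]
    q = [0.9, 0.1]
    print(round(kl(p, q), 4))   # 0.737
    print(round(kl(q, p), 4))   # 0.531: not symmetric
    print(kl(p, p))             # 0.0 when the distributions agree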


    Venn Diagram: Relationship among entropies and mutual information.

Fano's Inequality

We know that conditioning reduces entropy: H(X|Y) ≤ H(X). It is clear that if X and Y are perfectly correlated, then their conditional entropy is 0. It should also be clear that if X is any deterministic function of Y, then again, there remains no uncertainty about X once Y is known and so their conditional entropy H(X|Y) = 0.

Fano's Inequality relates the probability of error Pe in guessing X from knowledge of Y to their conditional entropy H(X|Y), when the number of possible outcomes is |A| (e.g. the length of a symbol alphabet):

Pe ≥ (H(X|Y) - 1) / log |A| (18)

The lower bound on Pe is a linearly increasing function of H(X|Y).
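A small numeric illustration of Eqt (18) (the numbers here are invented for illustration):

    import math

    def fano_lower_bound(h_x_given_y, alphabet_size):
        # lower bound on the probability of error Pe
        return (h_x_given_y - 1) / math.log2(alphabet_size)

    # If 3 bits of uncertainty about X remain after observing Y, then over a
    # 26-letter alphabet no guessing strategy errs less than ~42% of the time.
    print(round(fano_lower_bound(3.0, 26), 3))    # 0.425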

The Data Processing Inequality

If random variables X, Y, and Z form a Markov chain (i.e. the conditional distribution of Z depends only on Y and is independent of X), which is normally denoted as X → Y → Z, then the mutual information must be monotonically decreasing over steps along the chain:

I(X;Y) ≥ I(X;Z) (19)

We turn now to applying these measures and relationships to the study of communications channels. (The following material is from McAuley.)


Sources with memory

We will often wish to consider sources with memory, so we also consider Markov processes. Our four event process (a symbol is generated on each edge transition) is shown graphically together with a two state Markov process for the alphabet {A, B, C, D, E} in figure 1. We can then solve for the state occupancy using flow equations (this example is trivial).

Figure 1: Graphs representing a memoryless source (edge labels A, 1/2; B, 1/4; C, 1/8; D, 1/8) and a two state Markov process (edge labels A, 1/2; B, 3/8; C, 1/8; D, 1/8 from one state and A, 1/4; B, 1/4; C, 1/8; E, 1/4 from the other).

In general then we may define any finite state process with states {S1, S2, ..., Sn}, with transition probabilities pi(j) being the probability of moving from state Si to state Sj (with the emission of some symbol). First we can define the entropy of each state in the normal manner:

    Hi = -Σ_j pi(j) log2 pi(j)

and then the entropy of the system to be the sum of these individual state entropy values weighted with the state occupancy Pi (calculated from the flow equations):

    H = Σ_i Pi Hi = -Σ_i Σ_j Pi pi(j) log2 pi(j)

Clearly for a single state, we have the entropy of the memoryless source.

Source coding

Often we wish to efficiently represent the symbols generated by some source. We shall consider encoding the symbols as binary digits.


Figure 2: Discrete memoryless source and encoder (Symbols → Source encoder → Encoding).

Consider encoding the symbols {xj, j = 1, ..., N}, entropy H, as a fixed length block of binary digits. To ensure we can decode these symbols we need a block of d digits per symbol, where:

    d = log2(N)            N a power of 2
    d = ⌊log2(N)⌋ + 1      otherwise

where ⌊X⌋ is the largest integer less than X. The rate is then d bits per symbol, and as we know H ≤ log2(N) ≤ d, the efficiency of the coding E is given by:

    E = H / d

When the symbols are equiprobable, and N is a power of two, E = 1 and H = d. Note that if we try to achieve E = 1 when N is not a power of two, we must allocate at least one encoding to more than one symbol; this means that we are incapable of uniquely decoding.

Still with equiprobable symbols, but when N is not a power of two, this coding is inefficient; to try to overcome this we consider sequences of symbols of length M and consider encoding each possible sequence of length M as a block of binary digits, then we obtain:

    d = (⌊M log2(N)⌋ + 1) / M

where d is now the average number of bits per symbol. Note that as M gets large, E → 1.

In general we do not have equiprobable symbols, and we would hope to achieve some more compressed form of encoding by use of variable length codes; an example of such an encoding is Morse code, dating from the days of telegraphs. We consider again our simple four symbol alphabet and some possible variable length codes:

    X    P(X)    Code 1    Code 2    Code 3
    A    1/2     1         0         0
    B    1/4     00        10        01
    C    1/8     01        110       011
    D    1/8     10        111       111


We consider each code in turn:

1. Using this encoding, we find that presented with a sequence like 1001, we do not know whether to decode it as A B A or as D C. This makes such a code unsatisfactory. Further, in general, even a code where such an ambiguity could be resolved uniquely by looking at bits further ahead in the stream (and backtracking) is unsatisfactory. Observe that for this code, the coding rate, or average number of bits per symbol, is given by:

    (1/2)(1) + (1/4)(2) + (1/8)(2) + (1/8)(2) = 1.5 bits

which is less than the entropy.

2. This code is uniquely decodable; further, this code is interesting in that we can decode instantaneously: no backtracking is required, and once we have the bits of the encoded symbol we can decode without waiting for more. Further, this also satisfies the prefix condition, that is there is no code word which is a prefix (i.e. same bit pattern) of a longer code word. In this case the coding rate is equal to the entropy.

3. While this is decodable (and the coding rate is equal to the entropy again), observe that it does not have the prefix property and is not an instantaneous code.

Shannon's first theorem is the source coding theorem, which is:

    For a discrete memoryless source with finite entropy H, for any (positive) ε it is possible to encode the symbols at an average rate R, such that:

        R = H + ε

(For proof see Shannon & Weaver.) This is also sometimes called the noiseless coding theorem as it deals with coding without consideration of noise processes (i.e. bit corruption etc.). The entropy function then represents a fundamental limit on the number of bits on average required to represent the symbols of the source.

Prefix codes

We have already mentioned the prefix property; we find that for a prefix code to exist, it must satisfy the Kraft inequality. That is, a necessary (not sufficient) condition for a code having binary code words with lengths n1 ≤ n2 ≤ ... ≤ nN to satisfy the prefix condition is:

    Σ_{i=1}^{N} 1/2^{n_i} ≤ 1
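A small Python check of the Kraft inequality and the coding rates of the three codes above (illustrative code, not part of the original notes):

    import math

    probs = {"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125}
    codes = {
        "Code 1": {"A": "1", "B": "00", "C": "01", "D": "10"},
        "Code 2": {"A": "0", "B": "10", "C": "110", "D": "111"},
        "Code 3": {"A": "0", "B": "01", "C": "011", "D": "111"},
    }

    H = sum(p * math.log2(1 / p) for p in probs.values())   # entropy: 1.75 bits
    for name, code in codes.items():
        kraft = sum(2 ** -len(w) for w in code.values())
        rate = sum(probs[s] * len(w) for s, w in code.items())
        print(name, "Kraft sum:", kraft, "rate:", rate)

Code 1 has a Kraft sum of 1.25 > 1, so no uniquely decodable code can have its code word lengths; its apparent rate of 1.5 bits falls below the entropy, which is the giveaway. Codes 2 and 3 both satisfy the inequality with equality and achieve a rate equal to the entropy, 1.75 bits, yet only Code 2 is a prefix code, confirming that the inequality is necessary but not sufficient for the prefix condition.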


The discrete memoryless channel

We have considered the discrete source, now we consider a channel through which we wish to pass symbols generated by such a source by some appropriate encoding mechanism; we also introduce the idea of noise into the system, that is, we consider the channel to modify the input coding and possibly generate some modified version.

We should distinguish between systematic modification of the encoded symbols, i.e. distortion, and noise. Distortion is when an input code always results in the same output code; this process can clearly be reversed. Noise on the other hand introduces the element of randomness into the resulting output code.

Figure 3: Coding and decoding of symbols for transfer over a channel (Symbols → Source encoder → X → Channel → Y → Decoder → Symbols).

We consider an input alphabet X = {x1, ..., xJ} and output alphabet Y = {y1, ..., yK}, and random variables X and Y which range over these alphabets. Note that J and K need not be the same; for example, we may have the binary input alphabet {0, 1} and the output alphabet {0, 1, ?}, where ? represents the decoder identifying some error. The discrete memoryless channel can then be represented as a set of transition probabilities:

    p(yk | xj) = P(Y = yk | X = xj)

That is, the probability that if xj is injected into the channel, yk is emitted; p(yk | xj) is the conditional probability. This can be written as the channel matrix:

    | p(y1|x1)  p(y2|x1)  ...  p(yK|x1) |
    | p(y1|x2)  p(y2|x2)  ...  p(yK|x2) |
    |   ...       ...     ...    ...    |
    | p(y1|xJ)  p(y2|xJ)  ...  p(yK|xJ) |

Note that we have the property that for every input symbol, we will get something out:

    Σ_{k=1}^{K} p(yk | xj) = 1


Next we take the output of a discrete memoryless source as the input to a channel. So we have associated with the input alphabet of the channel the probability distribution of output from a memoryless source {p(xj), j = 1, 2, ..., J}. We then obtain the joint probability distribution of the random variables X and Y:

    p(xj, yk) = P(X = xj, Y = yk) = p(yk | xj) p(xj)

We can then find the marginal probability distribution of Y, that is the probability of output symbol yk appearing:

    p(yk) = P(Y = yk) = Σ_{j=1}^{J} p(yk | xj) p(xj)

If we have J = K, we often identify each output symbol as being the desired result of some input symbol; or we may select some subset of output symbols, as in the input {0, 1} and output {0, 1, ?} of the previous example. We then define the average probability of symbol error as:

    Pe = Σ_{k=1}^{K} Σ_{j=1, j≠k}^{J} P(Y = yk | X = xj) p(xj)

and correspondingly, the average probability of correct reception as 1 - Pe.

The binary symmetric channel has two input and output symbols (usually written {0, 1}) and a common probability, p, of incorrect decoding of an input at the output; this could be a simplistic model of a communications link, figure 4(a). However, to understand the averaging property of the error rate described above, consider the channel in figure 4(b), where we have 10 symbols, of which the first has a probability of being received in error (of 0.1), and the remainder are always received perfectly. Then, observing that most of the terms in the sum on the right of the equation above are zero:

    Pe = P(Y = y2 | X = x1) p(x1) = 0.1 × 0.1 = 0.01


Figure 4: (a) The binary symmetric channel: inputs 0 and 1 pass through correctly with probability 1-p and are transposed with probability p. (b) The tale of the unlucky symbol: of ten equiprobable input symbols, the first is received in error with probability 0.1 and the rest are received correctly with probability 1.0.

Mutual information

Extending the ideas of information and entropy for the discrete source, we can now consider the information about X obtained by observing the value of Y. We define the entropy after observing Y = yk:

    H(X | y = yk) = Σ_{j=1}^{J} p(xj | yk) log [ 1 / p(xj | yk) ]

This is a random variable, so we can take the average again:

    H(X|Y) = Σ_{k=1}^{K} H(X | y = yk) p(yk)
           = Σ_{k=1}^{K} Σ_{j=1}^{J} p(xj | yk) log [ 1 / p(xj | yk) ] p(yk)
           = Σ_{j=1}^{J} Σ_{k=1}^{K} p(xj, yk) log [ 1 / p(xj | yk) ]

H(X|Y) is the conditional entropy. We then write the mutual information of the channel:

    I(X;Y) = H(X) - H(X|Y)

This provides us with a measure of the amount of information or uncertainty reduced about X after observing the output Y of a channel fed by X. The mutual information tells us something about the channel.

An alternative viewpoint is to think about the channel together with a correction device fed with information from observations of both the input and output of the channel, see figure 5. One might ask how much information must be


passed along the correction channel to reconstruct the input; this turns out to be H(X|Y).

Figure 5: Correction system (Source encoder → X → Channel → Y → Decoder, with an Observer watching X and Y and sending correction data to a Correction stage that restores X).

We can further rewrite the entropy function H(X) as a sum over the joint probability distribution:

    H(X) = -Σ_{j=1}^{J} p(xj) log p(xj)
         = -Σ_{j=1}^{J} p(xj) log p(xj) × Σ_{k=1}^{K} p(yk | xj)
         = -Σ_{j=1}^{J} Σ_{k=1}^{K} p(yk | xj) p(xj) log p(xj)
         = -Σ_{j=1}^{J} Σ_{k=1}^{K} p(xj, yk) log p(xj)

Hence we obtain an expression for the mutual information:

    I(X;Y) = Σ_{j=1}^{J} Σ_{k=1}^{K} p(xj, yk) log [ p(xj | yk) / p(xj) ]

We can deduce various properties of the mutual information:

1. I(X;Y) ≥ 0. To show this, we note that p(xj | yk) = p(xj, yk) / p(yk) and substitute this in the equation above:

    I(X;Y) = Σ_{j=1}^{J} Σ_{k=1}^{K} p(xj, yk) log [ p(xj, yk) / (p(xj) p(yk)) ] ≥ 0

by use of the inequality we established previously.


2. I(X;Y) = 0 if X and Y are statistically independent. If X and Y are independent, p(xj | yk) = p(xj), hence the log term becomes zero.

3. I is symmetric, I(X;Y) = I(Y;X). Using p(xj | yk) p(yk) = p(yk | xj) p(xj), we obtain:

    I(X;Y) = Σ_{j=1}^{J} Σ_{k=1}^{K} p(xj, yk) log [ p(xj | yk) / p(xj) ]
           = Σ_{j=1}^{J} Σ_{k=1}^{K} p(xj, yk) log [ p(yk | xj) / p(yk) ]
           = I(Y;X)

The preceding leads to the obvious symmetric definition for the mutual information in terms of the entropy of the output:

    I(X;Y) = H(Y) - H(Y|X)

4. We define the joint entropy of two random variables in the obvious manner, based on the joint probability distribution:

    H(X, Y) = -Σ_{j,k} p(xj, yk) log p(xj, yk)

The mutual information is then more naturally written:

    I(X;Y) = H(X) + H(Y) - H(X, Y)

Consider the conditional entropy and mutual information for the binary symmetric channel. The input source has alphabet X = {0, 1} and associated probabilities {1/2, 1/2}, and the channel matrix is:

    | 1-p    p  |
    |  p    1-p |

Then the entropy, conditional entropy and mutual information are given by:

    H(X) = 1
    H(X|Y) = -p log p - (1 - p) log(1 - p)
    I(X;Y) = 1 + p log p + (1 - p) log(1 - p)

Figure 6(a) shows the capacity of the channel against transition probability. Note that the capacity of the channel drops to zero when the transition probability is 1/2, and is maximized when the transition probability is 0 or 1; if we can reliably transpose the symbols on the wire we also get the maximum amount of information through! Figure 6(b) shows the effect on mutual information for asymmetric input alphabets.
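A minimal Python sketch of these binary symmetric channel quantities (illustrative code, not part of the original notes):

    import math

    def bsc_mutual_information(p):
        # I(X;Y) = 1 + p log p + (1-p) log(1-p), for equiprobable inputs
        if p in (0.0, 1.0):
            return 1.0
        return 1.0 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

    for p in (0.0, 0.1, 0.5, 0.9, 1.0):
        print(p, round(bsc_mutual_information(p), 4))

The transition probability 1/2 gives I(X;Y) = 0 (a useless channel), while both 0 and 1 give the full 1 bit: a channel that reliably transposes every symbol is just as informative as a perfect one.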


Figure 6: Capacity of the binary channel: (a) symmetric, I(X;Y) plotted against P(transition) from 0 to 1; (b) asymmetric, I(X;Y) plotted against P(transition) and P(X=0).

Channel capacity

We wish to define the capacity of a channel, using the model of a free input alphabet and dependent output alphabet given by the channel matrix. We note that for a given channel, if we wish to maximize the mutual information, we must perform a maximization over all probability distributions for the input alphabet X. We define the channel capacity, denoted by C, as:

    C = max_{p(xj)} I(X;Y)

When we achieve this rate we describe the source as being matched to the channel.

Channel coding

To overcome the problem of noise in the system, we might consider adding redundancy during the encoding process to overcome possible errors. The examples that are used here are restricted to sources which would naturally be encoded in a noiseless environment as fixed size block codes, i.e. a source alphabet X which has 2^n equiprobable symbols; however, the discussion applies to more general sources and variable length coding schemes.

One particular aspect to be considered in real uses of channel coding is that many


sources which we are interested in encoding for transmission have a significant amount of redundancy already. Consider sending a piece of syntactically correct and semantically meaningful English or computer program text through a channel which randomly corrupted on average 1 in 100 characters (such as might be introduced by transmission across a rather sickly telex system). e.g.:

1. Bring reinforcements, we're going to advance.
2. It's easy to recognise speech.

Reconstruction from the following, due to corruption of 1 in 100 characters, would be comparatively straightforward:

1. Brizg reinforcements, we're going to advance.
2. It's easy mo recognise speech.

However, while the redundancy of this source protects against such random character error, consider the error due to a human mishearing:

1. Bring three and fourpence, we're going to a dance.
2. It's easy to wreck a nice beach.

The coding needs to consider the error characteristics of the channel and decoder, and try to achieve a significant distance between plausible encoded messages.

One of the simplest codes is a repetition code. We take a binary symmetric channel with a transition probability p; this gives a channel capacity C = 1 + p log p + (1 - p) log(1 - p). The natural binary encoding for this is then {0, 1}; the repetition code will repeat these digits an odd number of times and perform majority voting. Hence we transmit 2m + 1 bits per symbol, and will obtain an error if m + 1 or more bits are received in error, that is:

    Pe = Σ_{i=m+1}^{2m+1} C(2m+1, i) p^i (1 - p)^{2m+1-i}

Consider a transition probability of 0.01. The channel capacity as given by the equation above is C = 0.9192 (figure 7(a)). The code rate of the repetition technique against the residual probability of error is demonstrated in figure 7(b).
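A minimal Python sketch of the repetition-code calculation for p = 0.01 (illustrative code, not part of the original notes):

    from math import comb

    def repetition_error(p, m):
        # residual error after majority voting over 2m+1 repetitions
        n = 2 * m + 1
        return sum(comb(n, i) * p**i * (1 - p) ** (n - i)
                   for i in range(m + 1, n + 1))

    p = 0.01
    for m in (1, 2, 3):
        n = 2 * m + 1
        print(n, "repetitions: code rate", round(1 / n, 3),
              "residual error", repetition_error(p, m))

Three repetitions already push the residual error rate down to about 3 × 10^-4, but at a code rate of only 1/3, far below the capacity of 0.9192 that Shannon's second theorem (next section) promises is achievable.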


Figure 7: (a) Capacity of the binary symmetric channel at low loss rates: I(X;Y)(p) against P(transition), with I(X;Y)(0.01) marked. (b) Efficiency of the repetition code for a transition probability of 0.01: residual error rate against code rate, with H(0.01) marked.

The channel coding theorem

We arrive at Shannon's second theorem, the channel coding theorem:

    For a channel of capacity C and a source of entropy H, if H ≤ C then, for arbitrarily small ε, there exists a coding scheme such that the source is reproduced with a residual error rate less than ε.

Shannon's proof of this theorem is an existence proof rather than a means to construct such codes in the general case. In particular, the choice of a good code is dictated by the characteristics of the channel noise. In parallel with the noiseless case, better codes are often achieved by coding multiple input symbols.

Consider a rather artificial channel which may randomly corrupt one bit in each block of seven used to encode symbols in the channel; we take the probability of a bit corruption event to be the same as that of correct reception. We inject 2^4 = 16 equiprobable input symbols (clearly the limit for unique decoding). What is the capacity of this channel?

We have 2^7 = 128 input and output patterns; for a given input X with binary digit representation b1 b2 b3 b4 b5 b6 b7, we have eight equiprobable (each with probability 1/8) output symbols, and no others:


the uncorrupted block b1 b2 b3 b4 b5 b6 b7 itself, together with the seven blocks obtained from it by inverting exactly one of the bits b1 through b7.

When considering the information capacity per symbol:

I(X; Y) = H(Y) - H(Y|X)
        = log2 128 - log2 8    (the eight outcomes for each input are equiprobable)
        = 7 - 3
        = 4

The capacity of the channel is therefore 4/7 information bits per binary digit of the channel coding. Can we find a mechanism to encode 4 information bits in 7 channel bits, subject to the error property described above?

The Hamming Code provides a systematic code to perform this (a systematic code is one in which the obvious binary encoding of the source symbols is present in the channel encoded form). For our source, which emits at each time interval one of 16 symbols, we take the 4-bit binary representation of this and copy it to bits b3, b5, b6 and b7 of the encoded block; the remaining bits are given by b4, b2 and b1:

b4 = b5 + b6 + b7 (mod 2)
b2 = b3 + b6 + b7 (mod 2)
b1 = b3 + b5 + b7 (mod 2)

On reception the three parity sums are recomputed from the received bits; if the binary number b4 b2 b1 formed from them is 0 then there is no error, else b4 b2 b1 is the position of the bit in error.
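A minimal sketch of this encoder and decoder (the function names are illustrative; a dummy element keeps the list indexing 1-based to match the bit numbering above):

```python
def hamming_encode(d3, d5, d6, d7):
    """Systematic Hamming (7,4): data in positions 3,5,6,7; even-parity
    check bits in positions 1, 2 and 4, as given above."""
    b = [0] * 8                      # b[1]..b[7]; b[0] unused
    b[3], b[5], b[6], b[7] = d3, d5, d6, d7
    b[4] = b[5] ^ b[6] ^ b[7]
    b[2] = b[3] ^ b[6] ^ b[7]
    b[1] = b[3] ^ b[5] ^ b[7]
    return b[1:]

def hamming_decode(block):
    """Recompute the checks; the binary number s4 s2 s1 is 0 for no
    error, else it is the position of the corrupted bit."""
    b = [0] + list(block)
    s1 = b[1] ^ b[3] ^ b[5] ^ b[7]
    s2 = b[2] ^ b[3] ^ b[6] ^ b[7]
    s4 = b[4] ^ b[5] ^ b[6] ^ b[7]
    syndrome = 4 * s4 + 2 * s2 + s1
    if syndrome:
        b[syndrome] ^= 1             # correct the single-bit error
    return b[3], b[5], b[6], b[7]

block = hamming_encode(1, 0, 1, 1)
block[4] ^= 1                        # corrupt bit 5 (list index 4)
print(hamming_decode(block))         # (1, 0, 1, 1) recovered
```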

This Hamming code uses 3 bits to correct 7 (= 2^3 - 1) single-bit error patterns and transfer 4 useful bits. In general a Hamming code uses m bits to correct 2^m - 1 single-bit error patterns and transfer 2^m - 1 - m useful bits. The Hamming codes are called perfect as they use m bits to correct 2^m - 1 errors.


The Hamming codes exist for all pairs (2^m - 1, 2^m - 1 - m) and correct single bit errors. Also the Golay code is a (23, 12) block code which corrects up to three bit errors, and an unnamed code exists at (90, 78) which corrects up to two bit errors.

[Figure: Hamming code residual error rate, plotted against mean bit error rate.]

We can then consider the more general case, where we have random bit errors uniformly distributed (i.e. we allow the possibility of two or more bit errors per 7-bit block). Again we obtain the probability of residual error as the remainder of the binomial series:

E = Σ_{i=2}^{7} C(7, i) p^i q^{7-i}
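This remainder is simple to evaluate; a minimal sketch (illustrative names):

```python
from math import comb

def block_residual_error(p, n=7):
    """Probability of two or more bit errors in an n-bit block:
    the remainder of the binomial series after the i=0 and i=1 terms."""
    q = 1 - p
    return sum(comb(n, i) * p**i * q**(n - i) for i in range(2, n + 1))

print(block_residual_error(0.01))   # ~0.002 residual error per 7-bit block
```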

Continuous information

We now consider the case in which the signals or messages we wish to transfer are continuously variable; that is, both the symbols we wish to transmit are continuous functions, and the associated probabilities of these functions are given by a continuous probability distribution.

We could try to obtain the results as a limiting process from the discrete case. For example, consider a random variable X which takes on the values x_k = k δx (k = 0, ±1, ±2, ...) with probability p(x_k) δx, i.e. with probability density function p(x_k). We have the associated probability normalization:

Σ_k p(x_k) δx = 1


Using our formula for discrete entropy and taking the limit:

H = lim_{δx→0} Σ_k p(x_k) δx log2 ( 1 / (p(x_k) δx) )

  = lim_{δx→0} [ Σ_k p(x_k) log2(1/p(x_k)) δx ] + lim_{δx→0} [ log2(1/δx) Σ_k p(x_k) δx ]

  = ∫ p(x) log2(1/p(x)) dx + lim_{δx→0} log2(1/δx)

  = ∫ p(x) log2(1/p(x)) dx + ∞ !

This is rather worrying, as the latter limit does not exist. However, as we are often interested in the differences between entropies (i.e. in the consideration of mutual entropy or capacity), we define the problem away by using the first term only as a measure of differential entropy:

h(X) = ∫ p(x) log2 (1/p(x)) dx
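The divergent log2(1/δx) term can be made visible numerically. A minimal sketch (assuming a unit-variance Gaussian for p(x); names are illustrative): quantizing X with bin width δx gives a discrete entropy close to h(X) + log2(1/δx).

```python
import numpy as np

# Quantize a Gaussian with bin width dx and compare the discrete entropy
# with h + log2(1/dx), where h = 0.5*log2(2*pi*e) is the differential
# entropy of the unit-variance Gaussian (derived later in this section).
sigma = 1.0
h = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)
for dx in (0.5, 0.1, 0.01):
    x = np.arange(-10, 10, dx)            # bin centres x_k = k*dx
    pk = dx * np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
    pk = pk[pk > 0]
    H = -np.sum(pk * np.log2(pk))         # discrete entropy of quantized X
    print(dx, H, h + np.log2(1 / dx))     # the two agree as dx -> 0
```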

We can extend this to a continuous random vector X of dimension n, concealing the n-fold integral behind vector notation and bold type:

h(X) = ∫ p(x) log2 (1/p(x)) dx

We can then also define the joint and conditional entropies for continuous distributions:

h(X, Y) = ∫∫ p(x, y) log2 ( 1 / p(x, y) ) dx dy
h(X|Y) = ∫∫ p(x, y) log2 ( p(y) / p(x, y) ) dx dy
h(Y|X) = ∫∫ p(x, y) log2 ( p(x) / p(x, y) ) dx dy

with:

p(x) = ∫ p(x, y) dy,   p(y) = ∫ p(x, y) dx

Finally, the mutual information of continuous random variables X and Y is defined using the double integral:

I(X; Y) = ∫∫ p(x, y) log2 ( p(x, y) / (p(x) p(y)) ) dx dy

with the capacity of the channel given by maximizing this mutual information over all possible input distributions for X.


Properties of differential entropy

We obtain various properties analogous to the discrete case. In the following we drop the bold type, but each distribution, variable etc. should be taken to be of n dimensions.

1. Entropy is maximized with equiprobable "symbols": if x is limited to some volume v (i.e. is only non-zero within the volume) then h(x) is maximized when p(x) = 1/v inside the volume and zero outside.

2. What is the maximum differential entropy for a specified variance? We choose this question as the variance is a measure of average power; hence a restatement of the problem is to find the maximum differential entropy for a specified mean power.

Consider the 1-dimensional random variable X (taken to have zero mean), with the constraints:

∫ p(x) dx = 1
∫ x² p(x) dx = σ²

This optimization problem is solved using Lagrange multipliers, maximizing:

∫ [ -p(x) log2 p(x) + λ x² p(x) + μ p(x) ] dx

which is obtained by solving:

-1 - log2 p(x) + λ x² + μ = 0

so that, with due regard for the constraints on the system:

p(x) = ( 1 / √(2πσ²) ) e^{ -x² / (2σ²) }

Hence:

h(X) = (1/2) log2 ( 2πe σ² )

We observe that: (i) for any random variable Y with variance σ², h(Y) ≤ (1/2) log2(2πeσ²); (ii) the differential entropy is dependent only on the variance and is independent of the mean; hence (iii) the greater the power, the greater the differential entropy. This extends to multiple dimensions in the obvious manner.
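These observations can be checked by quadrature. A minimal sketch (illustrative names; a uniform density with the same variance serves as the comparison):

```python
import numpy as np

# Numerical check that the Gaussian attains h(X) = 0.5*log2(2*pi*e*sigma^2),
# while a uniform density of equal variance falls below it.
def diff_entropy(density, x):
    dx = x[1] - x[0]
    p = density(x)
    terms = np.where(p > 0, -p * np.log2(np.where(p > 0, p, 1)), 0.0)
    return np.sum(terms) * dx

sigma = 2.0
x = np.linspace(-40, 40, 400001)
gauss = lambda x: np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
a = np.sqrt(3) * sigma                 # uniform on [-a, a] has variance sigma^2
unif = lambda x: np.where(np.abs(x) <= a, 1 / (2 * a), 0.0)

print(diff_entropy(gauss, x))          # ~3.047 = 0.5*log2(2*pi*e*sigma^2)
print(0.5 * np.log2(2 * np.pi * np.e * sigma**2))
print(diff_entropy(unif, x))           # log2(2a) = 2.793, strictly smaller
```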


Ensembles

In the case of continuous signals and channels we are interested in functions (of, say, time), chosen from a set of possible functions, as input to our channel, with some perturbed version of the functions being detected at the output. These functions then take the place of the input and output alphabets of the discrete case. We must also then include the probability of a given function being injected into the channel, which brings us to the idea of ensembles; this general area of work is known as measure theory.

For example, consider the ensembles:

1. f_θ(t) = sin(t + θ). Each value of θ defines a different function, and together with a probability distribution, say P(θ), we have an ensemble. (Note that here θ may be discrete or continuous; consider phase shift keying.)

2. Consider a set of random variables {a_k}, where each a_k takes on a random value according to a Gaussian distribution with standard deviation σ; then

f(t) = Σ_k a_k sinc(2Wt - k)

is the white noise ensemble, band-limited to W Hertz and with average power σ².

3. More generally, we have for random variables {x_k} the ensemble of band-limited functions:

f(t) = Σ_k x_k sinc(2Wt - k)

where of course we remember from the sampling theorem that:

x_k = f ( k / (2W) )

If we also consider functions limited to the time interval T, then we obtain only 2TW non-zero coefficients, and we can consider the ensemble to be represented by an n-dimensional (n = 2TW) probability distribution p(x_1, x_2, ..., x_n).

4. More specifically, if we consider the ensemble of signals limited in power (by P), band-limited (to W) and time-limited (non-zero only in the interval T), we find that the ensemble is represented by an n-dimensional probability distribution which is zero outside the n-sphere of radius r = √(nP) (with n = 2TW).


By considering the latter types of ensembles, we can fit them into the finite-dimensional continuous differential entropy definitions given in the section above.

The Gaussian channel

We consider channels in which noise is injected independently of the signal; the particular noise source of interest is the so-called additive white Gaussian noise. Further, we restrict considerations to the final class of ensemble above: we have a signal with average power P, time-limited to T and bandwidth-limited to W.

We then consider the 2TW samples X_k and Y_k that can be used to characterise both the input and output signals. The relationship between the input and output is given by:

Y_k = X_k + N_k,   k = 1, ..., 2TW

where N_k is drawn from the band-limited Gaussian distribution with zero mean and variance:

σ² = N₀ W

where N₀ is the power spectral density.

As N is independent of X, we obtain the conditional entropy as being solely dependent on the noise source, and from the Gaussian entropy equation above find its value:

h(Y|X) = h(N) = (1/2) log2 ( 2πe N₀ W )

Hence:

I(X; Y) = h(Y) - h(N)

The capacity of the channel is then obtained by maximizing this over all input distributions; this is achieved by maximizing with respect to the distribution of X, subject to the average power limitation:

E[X_k²] = P

As we have seen, this is achieved when we have a Gaussian distribution; hence we find that both X and Y must have Gaussian distributions. In particular, X has variance P and Y has variance P + N₀W. We can then evaluate the channel capacity:

C = h(Y) - h(N)
  = (1/2) log2 ( 2πe (P + N₀W) ) - (1/2) log2 ( 2πe N₀W )
  = (1/2) log2 ( 1 + P/(N₀W) )


This capacity is in bits per channel symbol, so we can obtain a rate per second by multiplication by 2W (i.e. from n = 2TW symbols in T seconds):

C = W log2 ( 1 + P/(N₀W) )   bits/s

We arrive at Shannon's third theorem, the noisy channel coding theorem:

    The capacity of a channel band-limited to W Hertz, perturbed by additive white Gaussian noise of power spectral density N₀ and band-limited to W, is given by:

    C = W log2 ( 1 + P/(N₀W) )   bits per second

    where P is the average transmitted power.

The second term within the log in this equation is the signal to noise ratio (SNR). Observe that the capacity increases monotonically and without bound as the SNR increases. Similarly, the capacity increases monotonically as the bandwidth increases, but only to a limit. Using Taylor's expansion for ln:

ln(1 + x) = x - x²/2 + x³/3 - ...

we obtain, as W → ∞:

C → (P/N₀) log2 e

This is often rewritten in terms of E_b, the energy per bit, which is defined by P = E_b C. The limiting value is then:

E_b/N₀ → ln 2 ≈ 0.693

This is called the Shannon Limit.
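A minimal numerical sketch of both limits (function names and parameter values are illustrative):

```python
import numpy as np

# Capacity of the band-limited Gaussian channel (equation above) and its
# limiting value as the bandwidth grows.
def capacity(P, N0, W):
    """C = W log2(1 + P/(N0 W)) bits per second."""
    return W * np.log2(1 + P / (N0 * W))

P, N0 = 1.0, 1e-2
for W in (1, 10, 100, 1e4, 1e6):
    print(W, capacity(P, N0, W))        # increases with W, but saturates
print(P / N0 * np.log2(np.e))           # the limiting value (P/N0) log2 e
```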

The capacity of the channel is achieved when the source "looks like noise"; this is the basis for spread spectrum techniques of modulation, and in particular Code Division Multiple Access (CDMA).



[Figure: portrait of Jean Baptiste Joseph Fourier (1768 - 1830).]



    Introduction to Fourier Analysis, Synthesis, and Transforms

It has been said that the most remarkable and far-reaching relationship in all of mathematics is the simple Euler Relation,

e^{iπ} + 1 = 0 (1)

which contains the five most important mathematical constants, as well as harmonic analysis. This simple equation unifies the four main branches of mathematics: {0, 1} represent arithmetic, π represents geometry, i represents algebra, and e = 2.718... represents analysis, since one way to define e is to compute the limit of (1 + 1/n)^n as n → ∞.

Fourier analysis is about the representation of functions (or of data, signals, systems, ...) in terms of such complex exponentials. (Almost) any function f(x) can be represented perfectly as a linear combination of basis functions:

f(x) = Σ_k c_k Φ_k(x) (2)

where many possible choices are available for the expansion basis functions Φ_k(x). In the case of Fourier expansions in one dimension, the basis functions are the complex exponentials:

Φ_k(x) = exp(ikx) (3)

where the complex constant i = √-1. A complex exponential contains both a real part and an imaginary part, both of which are simple (real-valued) harmonic functions:

exp(iθ) = cos(θ) + i sin(θ) (4)

which you can easily confirm by using the power-series definitions for the transcendental functions exp, cos, and sin:

exp(θ) = 1 + θ/1! + θ²/2! + θ³/3! + ... + θⁿ/n! + ... , (5)

cos(θ) = 1 - θ²/2! + θ⁴/4! - θ⁶/6! + ... , (6)

sin(θ) = θ - θ³/3! + θ⁵/5! - θ⁷/7! + ... , (7)

Fourier Analysis computes the complex coefficients c_k that yield an expansion of some function f(x) in terms of complex exponentials:

f(x) = Σ_{k=-n}^{n} c_k exp(ikx) (8)

where the parameter k corresponds to frequency and n specifies the number of terms (which may be finite or infinite) used in the expansion.

Each Fourier coefficient c_k in f(x) is computed as the (inner product) projection of the function f(x) onto the complex exponential exp(ikx) associated with that coefficient:

c_k = (1/T) ∫_{-T/2}^{+T/2} f(x) exp(-ikx) dx (9)

where the integral is taken over one period (T) of the function if it is periodic, or from -∞ to +∞ if it is aperiodic. (An aperiodic function is regarded as a periodic one whose period is ∞.) For periodic functions the frequencies k used are just all multiples of the repetition frequency; for aperiodic functions, all frequencies must be used. Note that the computed Fourier coefficients c_k are complex-valued.
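Equation (9) is easy to evaluate numerically for a periodic test function; a minimal sketch (illustrative function and names, simple Riemann quadrature over one period):

```python
import numpy as np

# Numerical version of equation (9) with T = 2*pi; the sum over uniform
# samples approximates (1/T) * integral of f(x) exp(-i k x) dx.
def fourier_coefficient(f, k, T=2 * np.pi, samples=4096):
    x = np.linspace(-T / 2, T / 2, samples, endpoint=False)
    return np.sum(f(x) * np.exp(-1j * k * x)) / samples

f = lambda x: 3 * np.cos(2 * x) + np.sin(5 * x)
for k in range(-5, 6):
    c = fourier_coefficient(f, k)
    if abs(c) > 1e-9:
        print(k, np.round(c, 6))   # c_{+/-2} = 1.5, c_{+/-5} = -/+ 0.5i
```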



Fourier series for periodic functions

Consider real valued periodic functions f(x), i.e. for some X and all x, f(x + X) = f(x); X is the period. Without loss of generality we take X = 2π.

We observe the orthogonality properties of sin(mx) and cos(nx) for integers m and n:

∫_0^{2π} cos(mx) cos(nx) dx = π δ_mn   (= 2π when m = n = 0)

∫_0^{2π} sin(mx) sin(nx) dx = π δ_mn   (= 0 when m = n = 0)

∫_0^{2π} sin(mx) cos(nx) dx = 0

We use the Kronecker δ function to mean:

δ_mn = 1 if m = n, and 0 otherwise.

Then the Fourier Series for f(x) is:

f(x) = a₀/2 + Σ_{k=1}^{∞} ( a_k cos(kx) + b_k sin(kx) )

where the Fourier Coefficients are:

a_k = (1/π) ∫_0^{2π} f(x) cos(kx) dx,   k ≥ 0

b_k = (1/π) ∫_0^{2π} f(x) sin(kx) dx,   k ≥ 1

We hope that the Fourier Series provides an expansion of the function f(x) in terms of cosine and sine terms.

Approximation by least squares

Let S_N(x) be any sum of sine and cosine terms:

S_N(x) = α₀/2 + Σ_{k=1}^{N-1} ( α_k cos(kx) + β_k sin(kx) )

and S′_N(x) the truncated Fourier Series for a function f(x):

S′_N(x) = a₀/2 + Σ_{k=1}^{N-1} ( a_k cos(kx) + b_k sin(kx) )


where a_k and b_k are the Fourier coefficients. Consider the integral giving the discrepancy between f(x) and S_N(x) (assuming f(x) is well behaved enough for the integral to exist):

∫_0^{2π} [ f(x) - S_N(x) ]² dx

which simplifies to:

∫_0^{2π} f²(x) dx - π [ a₀²/2 + Σ_{k=1}^{N-1} (a_k² + b_k²) ]
    + π [ (α₀ - a₀)²/2 + Σ_{k=1}^{N-1} ( (α_k - a_k)² + (β_k - b_k)² ) ]

Note that the terms involving α and β are all non-negative and vanish when α_k = a_k and β_k = b_k; that is, among all trigonometric sums of order N, the truncated Fourier series minimizes the mean squared discrepancy.

Convergence of Fourier series

The Fourier Series and Fourier coefficients are defined as above. However, we may encounter some problems:

1. the integrals defining a_k and b_k fail to exist, e.g.:

f(x) = 1/x,   or   f(x) = 1 for x rational, 0 for x irrational

2. although the a_k and b_k exist, the series does not converge,

3. even though the series converges, the result is not f(x).

For example, consider:

f(x) = +1 for 0 ≤ x < π,   -1 for π ≤ x < 2π

then:

a_k = 0 for all k

b_k = (2/(kπ)) (1 - cos kπ) = 4/(kπ) for k odd,   0 for k even

so:

f(x) = (4/π) [ sin(x) + sin(3x)/3 + sin(5x)/5 + ... ]

but the series gives f(0) = 0.
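The truncated series can be evaluated directly; a minimal sketch (the fseries naming follows the figure legends below; the persistent overshoot of roughly 9% near the discontinuity is the Gibbs phenomenon shown in the second figure):

```python
import numpy as np

# Partial sums of the square-wave series above: the overshoot near the
# discontinuity stays at about 9% however many terms are taken.
def fseries(n, x):
    k = np.arange(1, n + 1, 2)                 # odd harmonics only
    return (4 / np.pi) * np.sum(np.sin(np.outer(x, k)) / k, axis=1)

x = np.linspace(1e-4, 0.5, 20000)
for n in (7, 63, 163):
    print(n, fseries(n, x).max())              # -> ~1.18 in every case
```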


[Figure: approximations to the rectangular pulse; fseries(1,x), fseries(7,x) and fseries(63,x) plotted against rect(x).]

[Figure: Gibbs phenomenon; fseries(63,x) and fseries(163,x) plotted against rect(x) near the discontinuity.]


However, in real life examples encountered in signal processing things are simpler, as:

1. If f(x) is bounded in (0, 2π) and piecewise continuous, the Fourier series converges to [ f(x⁻) + f(x⁺) ] / 2 at interior points, and to [ f(0⁺) + f(2π⁻) ] / 2 at 0 and 2π.

2. If f(x) is also continuous on (0, 2π), the sum of the Fourier Series is equal to f(x) at all points.

3. The a_k and b_k tend to zero at least as fast as 1/k.

Complex form

We rewrite, using coefficients c_k, in the obvious way from the formulae for sin and cos:

f(x) = a₀/2 + Σ_{k=1}^{∞} ( a_k cos(kx) + b_k sin(kx) )
     = a₀/2 + Σ_{k=1}^{∞} [ a_k (e^{ikx} + e^{-ikx})/2 + b_k (e^{ikx} - e^{-ikx})/(2i) ]
     = Σ_{k=-∞}^{+∞} c_k e^{ikx}

with:

c₀ = a₀/2,   c_k = (a_k - i b_k)/2,   c_{-k} = (a_k + i b_k)/2

Observe:

c_{-k} = c_k*

where * denotes the complex conjugate, and:

c_k = ( 1/(2π) ) ∫_0^{2π} f(x) e^{-ikx} dx

Approximation by interpolation

Consider the value of a periodic function f(x) at N discrete equally spaced values of x:

x_n = nε   (ε = 2π/N, n = 0, 1, ..., N - 1)

and try to find coefficients c_k such that:

f(nε) = Σ_{k=0}^{N-1} c_k e^{iknε}

(Recall that e^{iθ} = cos θ + i sin θ, and e^{-iθ} = cos θ - i sin θ.)


Multiply by e^{-imnε} and sum with respect to n:

Σ_{n=0}^{N-1} f(nε) e^{-imnε} = Σ_{n=0}^{N-1} Σ_{k=0}^{N-1} c_k e^{i(k-m)nε}

But the sum of a geometric series (and a blatant assertion) gives:

Σ_{n=0}^{N-1} e^{i(k-m)nε} = N if k = m,   0 otherwise

so:

c_m = (1/N) Σ_{n=0}^{N-1} f(nε) e^{-imnε}

Exercise for the reader: show that inserting this expression for c_m back into the interpolation equation satisfies the original equations.
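A minimal numerical check of this result (illustrative sawtooth samples; the last line also observes that these c_k are exactly the scaled discrete Fourier transform):

```python
import numpy as np

# Compute the c_k from the N samples as above, then confirm that
# sum_k c_k e^{i k n eps} returns f(n eps) exactly.
N = 8
eps = 2 * np.pi / N
n = np.arange(N)
f = n * eps - np.pi                             # sawtooth samples, say

c = np.array([np.sum(f * np.exp(-1j * m * n * eps)) / N for m in range(N)])
recon = np.array([np.sum(c * np.exp(1j * np.arange(N) * j * eps)) for j in n])
print(np.allclose(recon.real, f))               # True: interpolation is exact
print(np.allclose(c, np.fft.fft(f) / N))        # same c_k as the scaled FFT
```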

[Figure: 8 and 64 point trigonometric interpolations to the sawtooth function saw(x).]

We can translate this back into cos and sin series:

f(x_n) = a₀/2 + Σ_k ( a_k cos(k x_n) + b_k sin(k x_n) )