
IT2303 INFORMATION THEORY AND CODING

SYLLABUS

UNIT I INFORMATION THEORY

Information – Entropy, Information rate, classification of codes, Kraft-McMillan inequality, Source coding theorem, Shannon-Fano coding, Huffman coding, Extended Huffman coding - Joint and conditional entropies, Mutual information - Discrete memoryless channels – BSC, BEC – Channel capacity, Shannon limit.

UNIT II SOURCE CODING: TEXT, AUDIO AND SPEECH

Text: Adaptive Huffman Coding, Arithmetic Coding, LZW algorithm – Audio: Perceptual coding, Masking techniques, Psychoacoustic model, MPEG Audio layers I, II, III, Dolby AC3 - Speech: Channel Vocoder, Linear Predictive Coding

UNIT III SOURCE CODING: IMAGE AND VIDEO

Image and Video Formats – GIF, TIFF, SIF, CIF, QCIF – Image compression: READ, JPEG – Video compression: Principles - I, B, P frames, Motion estimation, Motion compensation, H.261, MPEG standard

UNIT IV ERROR CONTROL CODING: BLOCK CODES

Definitions and Principles: Hamming weight, Hamming distance, Minimum distance decoding - Single parity codes, Hamming codes, Repetition codes - Linear block codes, Cyclic codes - Syndrome calculation, Encoder and decoder - CRC

UNIT V ERROR CONTROL CODING: CONVOLUTIONAL CODES

Convolutional codes – code tree, trellis, state diagram - Encoding – Decoding: Sequential search and Viterbi algorithm – Principle of Turbo coding

 

REFERENCE BOOKS

TEXT BOOKS:

1. R. Bose, “Information Theory, Coding and Cryptography”, TMH, 2007
2. Fred Halsall, “Multimedia Communications: Applications, Networks, Protocols and Standards”, Pearson Education Asia, 2002

REFERENCES:

1. K. Sayood, “Introduction to Data Compression”, 3/e, Elsevier, 2006
2. S. Gravano, “Introduction to Error Control Codes”, Oxford University Press, 2007
3. Amitabha Bhattacharya, “Digital Communication”, TMH, 2006

 

UNIT I INFORMATION THEORY

Contents

Information – Entropy, Information rate, classification of codes, Kraft-McMillan inequality, Source coding theorem, Shannon-Fano coding, Huffman coding, Extended Huffman coding, Joint and conditional entropies, Mutual information, Discrete memoryless channels – BSC, BEC, Channel capacity, Shannon limit.

Communication system

Information is closely related to uncertainty or surprise.
• When the message from the source is known, there is no surprise and hence no information.
• When the probability of a message is low, there is more surprise and more information.
• The amount of information is the inverse of the probability of occurrence.

What is information theory?
◦ Information theory is needed to enable the communication system to carry information (signals) from sender to receiver over a communication channel.
◦ It deals with the mathematical modelling and analysis of a communication system.
◦ Its major task is to answer the questions of signal compression and transfer rate.
◦ Those answers can be found by means of entropy and channel capacity.

Uncertainty, surprise & Information

• Before the event X = x_i occurs, there is an amount of uncertainty.
• When the event X = x_i occurs, there is an amount of surprise.
• After the occurrence of X = x_i, there is a gain in the amount of information.
• The amount of information is related to the inverse of the probability of occurrence: I(x_i) = log2(1 / p(x_i)).
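As a quick illustration (a minimal Python sketch, not from the slides; the probabilities are made up), the standard measure I(x) = log2(1/p(x)) makes this inverse relationship concrete:

```python
import math

def self_information(p: float) -> float:
    """I(x) = log2(1 / p): information gained when an outcome of probability p occurs."""
    return math.log2(1.0 / p)

# A rare outcome carries more information than a common one.
print(self_information(0.5))    # 1.0 bit
print(self_information(0.125))  # 3.0 bits
```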

Entropy

Entropy H(X) is the average information content per source symbol: H(X) = Σ p_k log2(1 / p_k), where p_k is the probability of the k-th source symbol.

Property of entropy

Entropy is bounded by

0 ≤ H(X) ≤ log2 K

where K is the number of symbols in the source alphabet.

• The entropy is maximum with a uniform distribution and minimum (zero) when there is only one possible value.
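A short Python sketch (example distributions are assumed, not from the slides) that checks these bounds numerically:

```python
import math

def entropy(probs):
    """H(X) = sum of p * log2(1/p), skipping zero-probability symbols."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

K = 4
print(entropy([1.0 / K] * K))              # 2.0 = log2(K), the maximum
print(entropy([1.0, 0.0, 0.0, 0.0]))       # 0.0, the minimum
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75, in between
```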

Source Coding Theorem

Source coding is an effective representation of the data generated by a discrete source.
◦ The representation is produced by a source encoder.
◦ The statistics of the source must be known (e.g. if coding priorities exist).

• Two types of coding:

1) Fixed-length code
2) Variable-length code (Morse code)

• In Morse code, letters are encoded as dots “.” and dashes “-”.
• A short code word is assigned to a frequently occurring source symbol (e.g. “e”).
• A long code word is assigned to a rare source symbol (e.g. “q”).
• An efficient source code should satisfy two conditions:

i. The code words produced by the encoder are in binary form.
ii. The source code should be uniquely decodable.

Shannon’s first Theorem

L represents the average code word length, and Lmin the minimum possible value of L, so that L ≥ Lmin. Coding efficiency is defined as

η = Lmin / L

According to the source coding theorem, H(X) represents the fundamental limit on the average number of bits per source symbol, so we can equate H(X) to Lmin:

η = H(X) / L
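A small Python sketch of the efficiency calculation (the source probabilities and code word lengths are hypothetical):

```python
import math

def entropy(probs):
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Hypothetical 4-symbol source with prefix code {0, 10, 110, 111}.
probs   = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]

L = sum(p * l for p, l in zip(probs, lengths))  # average code word length
H = entropy(probs)                              # 1.75 bits/symbol

print(f"L = {L}, H(X) = {H}, efficiency = {H / L:.3f}")  # efficiency = 1.000
```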

Data Compaction

Data compaction (lossless data compression) means that redundant information is removed from the signal prior to transmission.
◦ Basically this is achieved by assigning short descriptions to the most frequent outcomes of the source output, and longer descriptions to the rarer ones.

Source-coding schemes used in data compaction include prefix coding, Huffman coding, Lempel-Ziv, and Shannon-Fano.

Prefix Coding

In a prefix code, no code word is the prefix of any other code word; such a code is uniquely and instantaneously decodable, and its code word lengths satisfy the Kraft-McMillan inequality Σ 2^(-l_k) ≤ 1.
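A minimal Python check of the prefix property and the Kraft-McMillan sum (the example code is hypothetical):

```python
def kraft_sum(lengths, r=2):
    """Kraft-McMillan sum over the code word lengths."""
    return sum(r ** -l for l in lengths)

def is_prefix_free(code_words):
    """True if no code word is a prefix of another code word."""
    return not any(a != b and b.startswith(a)
                   for a in code_words for b in code_words)

code = ["0", "10", "110", "111"]
print(kraft_sum([len(w) for w in code]))  # 1.0 <= 1: lengths are feasible
print(is_prefix_free(code))               # True: instantaneously decodable
```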

Huffman Coding

Huffman coding builds an optimal prefix code: the two least probable symbols are repeatedly merged into a single node until one node remains, and the code words are read off the resulting binary tree, as sketched below.

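A sketch of the Huffman procedure in Python (the symbol probabilities are made up; this illustrates the merge-two-smallest idea rather than reproducing the slides’ worked example):

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Merge the two least probable nodes until one tree remains,
    then read code words off the resulting binary tree."""
    tiebreak = count()  # avoids comparing non-comparable tree nodes in the heap
    heap = [(p, next(tiebreak), sym) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, left = heapq.heappop(heap)
        p2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, next(tiebreak), (left, right)))
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):          # internal node: branch 0 / 1
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                                # leaf: record the code word
            codes[node] = prefix or "0"
    walk(heap[0][2], "")
    return codes

print(huffman_code({"a": 0.4, "b": 0.2, "c": 0.2, "d": 0.1, "e": 0.1}))
```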

Discrete memoryless channels

A discrete memoryless channel has an input alphabet X and an output alphabet Y, and is described by a set of transition probabilities p(y_j | x_k); each output depends only on the current input.

Entropy

Contd.

Conditional entropy H(X|Y) (equivocation) is the amount of uncertainty remaining about the channel input after the channel output is observed.

The marginal probability distribution of the output random variable Y is obtained by averaging out the dependence of the joint distribution on the input: p(y_j) = Σ_k p(y_j | x_k) p(x_k).
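A short Python sketch computing the output marginal and the equivocation from a hypothetical joint distribution:

```python
import math

# Hypothetical joint distribution p(x, y) of channel input and output.
joint = {("x0", "y0"): 0.4, ("x0", "y1"): 0.1,
         ("x1", "y0"): 0.1, ("x1", "y1"): 0.4}

# Marginal p(y): sum the joint distribution over the inputs.
p_y = {}
for (x, y), p in joint.items():
    p_y[y] = p_y.get(y, 0.0) + p

# Equivocation H(X|Y) = sum of p(x,y) * log2(p(y) / p(x,y)).
h_x_given_y = sum(p * math.log2(p_y[y] / p) for (x, y), p in joint.items() if p > 0)

print(p_y)          # {'y0': 0.5, 'y1': 0.5}
print(h_x_given_y)  # uncertainty about the input remaining after observing Y
```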

Binary symmetric channel

The binary symmetric channel (BSC) has binary input and output; each transmitted bit is received correctly with probability 1 − p and flipped with crossover probability p.

BSC (contd.)

With conditional probability of error p, the channel capacity is

C = 1 − H(p), where H(p) = −p log2 p − (1 − p) log2(1 − p)

• C varies with the probability of error in a convex manner and is symmetric about p = 1/2.
• When the channel is noise-free, set p = 0: C attains its maximum value of one bit per channel use, and H(p) attains its minimum value of zero.
• When the error probability p = 1/2, C attains its minimum value of zero, whereas the entropy H(p) attains its maximum value of unity, and the channel is said to be useless.
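A minimal Python sketch of C = 1 − H(p) over a few crossover probabilities, showing the endpoints described above:

```python
import math

def h2(p):
    """Binary entropy function H(p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """C = 1 - H(p) for a BSC with crossover probability p."""
    return 1.0 - h2(p)

for p in [0.0, 0.1, 0.25, 0.5]:
    print(f"p = {p:4.2f}  C = {bsc_capacity(p):.3f} bits/channel use")
# C = 1 at p = 0 (noise-free) and C = 0 at p = 1/2 (useless channel).
```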

Mutual information

I(X;Y) = H(X) − H(X|Y)


Properties of Mutual information

• Symmetric: I(X;Y) = I(Y;X)
• Non-negative: I(X;Y) ≥ 0
• The mutual information of a channel is related to the joint entropy of the channel input and channel output by

I(X;Y) = H(X) + H(Y) − H(X,Y)
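A small Python check of these properties on a hypothetical joint distribution:

```python
import math

def H(probs):
    """Entropy of a distribution given as an iterable of probabilities."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution over (X, Y) with uniform marginals.
joint = [0.4, 0.1, 0.1, 0.4]
h_x, h_y = H([0.5, 0.5]), H([0.5, 0.5])
h_xy = H(joint)

i_xy = h_x + h_y - h_xy   # I(X;Y) = H(X) + H(Y) - H(X,Y)
print(i_xy)               # about 0.278 bits
print(i_xy >= 0)          # non-negative, and symmetric in X and Y by the formula
```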

Channel Capacity

Definition – channel capacity

The channel capacity C of a discrete memoryless channel is the maximum mutual information I(X;Y) in any single use of the channel (i.e., signaling interval), where the maximization is over all possible input probability distributions.

C is measured in bits per channel use, or bits per transmission.
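For a BSC this maximization can be brute-forced over the input distribution; a Python sketch (grid search, hypothetical crossover probability) recovers C = 1 − H(p) at the uniform input:

```python
import math

def h2(q):
    return 0.0 if q in (0.0, 1.0) else -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def mutual_information(p_x0, p):
    """I(X;Y) = H(Y) - H(Y|X) for a BSC with crossover p and input P(X=0) = p_x0."""
    p_y0 = p_x0 * (1 - p) + (1 - p_x0) * p  # output distribution
    return h2(p_y0) - h2(p)                 # for a BSC, H(Y|X) = H(p)

p = 0.1
best_I, best_input = max((mutual_information(q / 1000, p), q / 1000)
                         for q in range(1001))
print(best_I, best_input)  # ~0.531 bits/use, attained at the uniform input 0.5
```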

Channel coding theorem

“If a discrete memoryless source with an alphabet S has an entropy H(S) and produces symbols every Ts seconds, and a discrete memoryless channel has a capacity I(X,Y)Max and is used once every Tc seconds, then if

H(S) / Ts ≤ I(X,Y)Max / Tc

There exists a coding scheme for which the source output can be transmitted over the channel and be reconstructed with an arbitrarily small probability of error. The parameter C/Tc is called the critical rate. When this condition is satisfied with the equality sign, the system is said to be signaling at the critical rate.

Conversely, if

H(S) / Ts > I(X,Y)Max / Tc

it is not possible to transmit information over the channel and reconstruct it with an arbitrarily small probability of error.
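A tiny Python sketch of the rate check (all numbers hypothetical):

```python
# Rate check for the channel coding theorem: H(S)/Ts vs. I(X,Y)Max/Tc.
H_S = 1.75    # source entropy, bits per symbol (hypothetical)
Ts  = 1e-3    # one source symbol every 1 ms
C   = 0.531   # channel capacity, bits per channel use (hypothetical)
Tc  = 2e-4    # one channel use every 0.2 ms

source_rate   = H_S / Ts  # 1750 bits/s
critical_rate = C / Tc    # 2655 bits/s, the critical rate C/Tc

print(source_rate <= critical_rate)  # True: reliable transmission is possible
```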
