
Page 1: unit-1.ppt

IT2303 INFORMATION THEORY AND CODING

Page 2: unit-1.ppt

SYLLABUS

UNIT I INFORMATION THEORY

Information – Entropy, Information rate, classification of codes, Kraft McMillan inequality, Source coding theorem, Shannon-Fano coding, Huffman coding, Extended Huffman coding - Joint and conditional entropies, Mutual information - Discrete memoryless channels – BSC, BEC – Channel capacity, Shannon limit.

UNIT II SOURCE CODING: TEXT, AUDIO AND SPEECH

Text: Adaptive Huffman Coding, Arithmetic Coding, LZW algorithm – Audio: Perceptual coding, Masking techniques, Psychoacoustic model, MPEG Audio layers I, II, III, Dolby AC3 – Speech: Channel Vocoder, Linear Predictive Coding

Page 3: unit-1.ppt

UNIT III SOURCE CODING: IMAGE AND VIDEO

Image and Video Formats – GIF, TIFF, SIF, CIF, QCIF – Image compression: READ, JPEG – Video Compression: Principles-I,B,P frames, Motion estimation, Motion compensation, H.261, MPEG standard

UNIT IV ERROR CONTROL CODING: BLOCK CODES

Definitions and Principles: Hamming weight, Hamming distance, Minimum distance decoding - Single parity codes, Hamming codes, Repetition codes - Linear block codes, Cyclic codes - Syndrome calculation, Encoder and decoder - CRC

UNIT V ERROR CONTROL CODING: CONVOLUTIONAL CODES

Convolutional codes – code tree, trellis, state diagram - Encoding – Decoding: Sequential search and Viterbi algorithm – Principle of Turbo coding

 

Page 4: unit-1.ppt

REFERENCE BOOKS

TEXT BOOKS:

R Bose, “Information Theory, Coding and Cryptography”, TMH, 2007

Fred Halsall, “Multimedia Communications: Applications, Networks, Protocols and Standards”, Pearson Education Asia, 2002

REFERENCES:

K Sayood, “Introduction to Data Compression”, 3/e, Elsevier, 2006

S Gravano, “Introduction to Error Control Codes”, Oxford University Press, 2007

Amitabha Bhattacharya, “Digital Communication”, TMH, 2006

 

Page 5: unit-1.ppt

UNIT I INFORMATION THEORY

Page 6: unit-1.ppt

Contents

Information – Entropy, Information rate, classification of codes, Kraft-McMillan inequality, Source coding theorem, Shannon-Fano coding, Huffman coding, Extended Huffman coding

Joint and conditional entropies, Mutual information

Discrete memoryless channels – BSC, BEC

Channel capacity, Shannon limit.

Page 7: unit-1.ppt
Page 8: unit-1.ppt

Communication system

Page 9: unit-1.ppt
Page 10: unit-1.ppt
Page 11: unit-1.ppt

Information is closely related to uncertainty or surprise. When the message from the source is already known, there is no surprise and hence no information. When the probability of a message is low, there is more surprise and more information. The amount of information is inversely related to the probability of occurrence.

Page 12: unit-1.ppt

What is information theory?
◦ Information theory enables a communication system to carry information (signals) from sender to receiver over a communication channel.
◦ It deals with the mathematical modelling and analysis of a communication system.
◦ Its major task is to answer the questions of signal compression and transfer rate.
◦ Those answers are found using entropy and channel capacity.

Page 13: unit-1.ppt
Page 14: unit-1.ppt

Uncertainty, surprise & Information

Before the event X = xi occurs, there is an amount of uncertainty.

When the event X = xi occurs, there is an amount of surprise.

After the occurrence of X = xi, there is a gain in the amount of information.

The amount of information is related to the inverse of the probability of occurrence.
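As a small illustration (the probabilities below are made up for the example), the self-information of an outcome with probability p is log2(1/p) bits, so likely outcomes carry little information and rare outcomes carry a lot:

```python
import math

def self_information(p: float) -> float:
    """Self-information, in bits, of an outcome with probability p."""
    return math.log2(1.0 / p)

# A likely outcome carries little information, a rare one carries a lot.
print(self_information(0.9))    # ~0.152 bits
print(self_information(0.01))   # ~6.64 bits
```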

Page 15: unit-1.ppt
Page 16: unit-1.ppt

Entropy

Page 17: unit-1.ppt

Property of entropy

Entropy is bounded by

0 ≤ H(X) ≤ log2 K

Page 18: unit-1.ppt

• The entropy is maximum for a uniform distribution and minimum when there is only one possible value.
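A short sketch makes the bound concrete (the distributions below are assumed for illustration): the uniform distribution over K symbols attains H(X) = log2 K, a distribution concentrated on one symbol attains H(X) = 0, and anything else falls in between.

```python
import math

def entropy(probs):
    """H(X) = sum over k of p_k * log2(1/p_k), in bits (terms with p_k = 0 contribute 0)."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

K = 4
uniform = [1.0 / K] * K          # maximum entropy: log2(4) = 2 bits
certain = [1.0, 0.0, 0.0, 0.0]   # minimum entropy: 0 bits
skewed  = [0.5, 0.25, 0.125, 0.125]

print(entropy(uniform))  # 2.0
print(entropy(certain))  # 0.0
print(entropy(skewed))   # 1.75, between 0 and log2(K)
```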

Page 19: unit-1.ppt

Source Coding Theorem

Source coding is an efficient representation of the data generated by a discrete source.
◦ The representation is produced by the source encoder.
◦ The statistics of the source must be known (e.g. if coding priorities exist).

Page 20: unit-1.ppt

• Two types of coding:

1) Fixed-length code

2) Variable-length code (e.g. Morse code)

• In Morse code, letters are encoded as dots “.” and dashes “-”.

• A short code word is assigned to a frequently occurring source symbol (e.g. ‘e’).

• A long code word is assigned to a rare source symbol (e.g. ‘q’).

• An efficient source code should satisfy two conditions:

i. The code words produced by the encoder are in binary form.

ii. The source code should be uniquely decodable.

Page 21: unit-1.ppt

Shannon’s first Theorem

L represents the average code word length.

Lmin represents the minimum possible value of L.

Coding efficiency is defined as η = Lmin / L, where L ≥ Lmin.

According to the source coding theorem, H(X) is the fundamental limit on the average number of bits per source symbol, so we can equate Lmin to H(X):

η = H(X) / L
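As a worked illustration (the symbol probabilities and code word lengths below are assumed), the average code word length and the coding efficiency can be computed directly:

```python
import math

def entropy(probs):
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Hypothetical source with four symbols and a variable-length binary code.
probs   = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]            # code word lengths in bits

L_avg = sum(p * l for p, l in zip(probs, lengths))   # average code word length
H = entropy(probs)                                   # fundamental limit Lmin
efficiency = H / L_avg

print(L_avg)        # 1.75 bits/symbol
print(H)            # 1.75 bits/symbol
print(efficiency)   # 1.0 -> this code reaches the source coding limit
```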

Page 22: unit-1.ppt

Data Compaction

Data compaction (lossless data compression) means removing redundant information from the signal prior to transmission.
◦ Basically this is achieved by assigning short descriptions to the most frequent outcomes of the source output, and vice versa.

Source-coding schemes used for data compaction include prefix coding, Huffman coding, Lempel-Ziv and Shannon-Fano coding.

Page 23: unit-1.ppt

Prefix Coding
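A prefix code is one in which no code word is the prefix of any other code word, and any such binary code satisfies the Kraft-McMillan inequality, sum of 2^(-lk) ≤ 1. A minimal check, using an assumed example code:

```python
def is_prefix_free(codewords):
    """True if no code word is a prefix of another code word."""
    for a in codewords:
        for b in codewords:
            if a != b and b.startswith(a):
                return False
    return True

def kraft_sum(codewords):
    """Kraft-McMillan sum: sum of 2**(-length) over all code words."""
    return sum(2.0 ** -len(c) for c in codewords)

code = ["0", "10", "110", "111"]      # hypothetical prefix code
print(is_prefix_free(code))           # True
print(kraft_sum(code))                # 0.5 + 0.25 + 0.125 + 0.125 = 1.0 <= 1
```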

Page 24: unit-1.ppt

Huffman Coding

Page 25: unit-1.ppt

Contd.,
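The Huffman procedure repeatedly merges the two least probable nodes, prefixing 0 to one branch and 1 to the other. A minimal sketch of that procedure (the symbol probabilities are assumed, not taken from the slides):

```python
import heapq

def huffman_code(probs):
    """Return a dict mapping each symbol to a binary code word, built by
    repeatedly merging the two least probable nodes."""
    # Heap entries: (probability, unique tie-breaker, dict of partial code words)
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, code1 = heapq.heappop(heap)   # least probable node
        p2, _, code2 = heapq.heappop(heap)   # second least probable node
        merged = {s: "0" + c for s, c in code1.items()}
        merged.update({s: "1" + c for s, c in code2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

probs = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}   # assumed example source
print(huffman_code(probs))   # e.g. {'a': '0', 'b': '10', 'd': '110', 'c': '111'}
```

For this assumed source the average code word length is 1.9 bits per symbol against an entropy of about 1.85 bits, giving an efficiency of roughly 0.97.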

Page 26: unit-1.ppt

Discrete memoryless channels

Page 27: unit-1.ppt

Entropy

Page 28: unit-1.ppt

Contd.,

Conditional entropy (equivocation) is the amount of uncertainty remaining about the channel input after the channel output is observed.

The marginal probability distribution of the output random variable Y is obtained by averaging out the dependence of the transition probabilities on the channel input.
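A small numerical sketch of both ideas, using an assumed joint distribution p(x, y): the marginal p(y) is obtained by summing the joint distribution over x, and the equivocation H(X|Y) is the remaining uncertainty about X given Y.

```python
import math

# Assumed joint distribution p(x, y): rows = input symbols x, columns = output symbols y.
joint = [[0.30, 0.10],
         [0.05, 0.55]]

# Marginal distribution of the output Y: average out the dependence on X.
p_y = [sum(joint[x][y] for x in range(len(joint))) for y in range(len(joint[0]))]

# Conditional entropy H(X|Y) = sum over x,y of p(x,y) * log2( p(y) / p(x,y) ).
H_X_given_Y = sum(
    joint[x][y] * math.log2(p_y[y] / joint[x][y])
    for x in range(len(joint))
    for y in range(len(joint[0]))
    if joint[x][y] > 0
)

print(p_y)           # [0.35, 0.65]
print(H_X_given_Y)   # remaining uncertainty about X after observing Y (in bits)
```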

Page 29: unit-1.ppt
Page 30: unit-1.ppt
Page 31: unit-1.ppt
Page 32: unit-1.ppt

Binary symmetric channel

Page 33: unit-1.ppt
Page 34: unit-1.ppt

BSC

For a conditional probability of error p, the channel capacity is C = 1 − H(p), where H(p) is the binary entropy function.

C varies with the probability of error p in a convex manner, symmetric about p = 1/2.

When the channel is noise free, set p = 0: C attains its maximum value of one bit per channel use, and H(p) attains its minimum value of zero.

When p = 1/2, C attains its minimum value of zero, whereas the entropy H(p) attains its maximum value of unity, and the channel is said to be useless.
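A minimal sketch (the crossover probabilities are chosen only for illustration) that evaluates C = 1 − H(p) and confirms this behaviour:

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

for p in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(p, bsc_capacity(p))
# p = 0 or 1 -> C = 1 bit per channel use (noise free, or deterministic inversion)
# p = 0.5    -> C = 0, the channel is useless
```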

Page 35: unit-1.ppt

Mutual information

I(X;Y) = H(X) – H(X|Y)

Page 36: unit-1.ppt

Mutual information

Page 37: unit-1.ppt
Page 38: unit-1.ppt

Properties of Mutual information

Symmetric: I(X;Y) = I(Y;X)

Non-negative: I(X;Y) ≥ 0

The mutual information of a channel is related to the joint entropy of the channel input and channel output by

I(X;Y) = H(X) + H(Y) – H(X,Y)
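These identities can be checked numerically from any joint distribution; a short sketch with an assumed joint distribution p(x, y):

```python
import math

joint = [[0.30, 0.10],   # assumed joint distribution p(x, y)
         [0.05, 0.55]]

p_x = [sum(row) for row in joint]
p_y = [sum(joint[x][y] for x in range(2)) for y in range(2)]

def H(probs):
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

H_X  = H(p_x)
H_Y  = H(p_y)
H_XY = H([joint[x][y] for x in range(2) for y in range(2)])

# I(X;Y) = H(X) + H(Y) - H(X,Y); it is symmetric in X and Y by construction.
I_XY = H_X + H_Y - H_XY
print(I_XY)   # > 0 because X and Y are dependent in this example
```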

Page 39: unit-1.ppt
Page 40: unit-1.ppt

Channel Capacity

Page 41: unit-1.ppt

Definition – channel capacity

The channel capacity C of a discrete memoryless channel is the maximum mutual information I(X;Y) in any single use of the channel (i.e. signaling interval), where the maximization is over all possible input probability distributions.

C is measured in bits per channel use, or bits per transmission.
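Because the maximization is over the input distribution, the capacity of a simple binary-input channel can be approximated by searching over input probabilities. A rough sketch, with an assumed BSC transition matrix and a coarse grid search standing in for a proper optimization:

```python
import math

def mutual_information(p_x, channel):
    """I(X;Y) in bits for input distribution p_x and transition matrix channel[x][y]."""
    p_y = [sum(p_x[x] * channel[x][y] for x in range(len(p_x)))
           for y in range(len(channel[0]))]
    I = 0.0
    for x in range(len(p_x)):
        for y in range(len(channel[0])):
            joint = p_x[x] * channel[x][y]
            if joint > 0 and p_y[y] > 0:
                I += joint * math.log2(channel[x][y] / p_y[y])
    return I

# Assumed binary symmetric channel with crossover probability 0.1.
bsc = [[0.9, 0.1],
       [0.1, 0.9]]

# Coarse grid search over the input distribution; the maximum approximates C.
capacity = max(mutual_information([q, 1 - q], bsc)
               for q in (i / 1000 for i in range(1001)))
print(capacity)   # ~0.531 bits per channel use, attained near the uniform input
```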

Page 42: unit-1.ppt

Channel coding theorem

“If a discrete memoryless source with an alphabet S has an entropy H(S) and produces symbols once every Ts seconds, and a discrete memoryless channel has a capacity C = max I(X;Y) and is used once every Tc seconds, then provided that

H(S) / Ts ≤ C / Tc

there exists a coding scheme for which the source output can be transmitted over the channel and be reconstructed with an arbitrarily small probability of error.” The parameter C/Tc is called the critical rate. When this condition is satisfied with the equality sign, the system is said to be signaling at the critical rate.
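A tiny sketch of this condition, with H(S), Ts, C and Tc all assumed purely for illustration:

```python
# Assumed values: source entropy, source symbol interval, capacity, channel-use interval.
H_S = 1.75    # bits per source symbol
T_s = 0.001   # seconds between source symbols
C   = 0.531   # bits per channel use
T_c = 0.0002  # seconds between channel uses

source_rate   = H_S / T_s   # bits per second produced by the source
critical_rate = C / T_c     # maximum rate at which reliable transmission is possible

# Reliable transmission (arbitrarily small error probability) is possible
# whenever the source information rate does not exceed the critical rate.
print(source_rate, critical_rate, source_rate <= critical_rate)
```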

Page 43: unit-1.ppt

Conversely, if

H(S) / Ts > C / Tc

it is not possible to transmit information over the channel and reconstruct it with an arbitrarily small probability of error.

Page 44: unit-1.ppt
Page 45: unit-1.ppt