
Q1.
A. Calculate the amount of information in decits (hartleys) if p(Xi) = 1/4.
B. What is information rate?
C. What is the difference between conditional entropy and joint entropy?
D. How is mutual information related to channel capacity?
E. What is source coding?

Q2. Show that for a binary memoryless source (BMS), H(X) is maximum when the two symbols X1 and X2 are equiprobable.

Q3. For a noiseless channel with m inputs and m outputs, prove that H(X) = H(Y) and H(Y/X) = 0.

Q4. Determine the Huffman code for the following messages and their probabilities:
X1 = 0.05, X2 = 0.15, X3 = 0.2, X4 = 0.05, X5 = 0.15, X6 = 0.3, X7 = 0.1
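A quick numerical check for Q1.A (a sketch, not a model answer): information measured in decits (hartleys) uses the base-10 logarithm, I = log10(1/p).

```python
import math

p = 1 / 4                      # given symbol probability
I_decit = math.log10(1 / p)    # information in decits (hartleys), ~0.602
I_bit = math.log2(1 / p)       # same quantity in bits, for comparison: 2.0
print(I_decit, I_bit)
```

Changing the logarithm base only rescales the unit: decit (base 10), bit (base 2), nat (base e).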
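For Q2 the exam expects the analytic argument (set dH/dp = 0 and find p = 1/2), but the claim is easy to sanity-check numerically. `H2` below is a helper name of my choosing, not standard library API.

```python
import math

def H2(p):
    """Binary entropy in bits; H2(0) = H2(1) = 0 by the 0*log0 = 0 convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Sample the unit interval: the maximum (1 bit) occurs at p = 0.5,
# i.e. when the two source symbols are equiprobable.
samples = [i / 100 for i in range(101)]
best = max(samples, key=H2)
print(best, H2(best))
```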
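For Q3, the two identities can be verified numerically for one example distribution (the input distribution below is an arbitrary assumption; the question still requires the general proof). In a noiseless channel each input maps to exactly one output, so the transition matrix is the identity.

```python
import math

# Noiseless m-input/m-output channel: P(Y=j | X=i) = 1 if j == i, else 0.
px = [0.5, 0.25, 0.25]          # example input distribution (assumed)
m = len(px)
P = [[1.0 if j == i else 0.0 for j in range(m)] for i in range(m)]

# Output distribution p(y_j) = sum_i p(x_i) P(y_j | x_i); identity => py == px
py = [sum(px[i] * P[i][j] for i in range(m)) for j in range(m)]
HX = -sum(p * math.log2(p) for p in px if p > 0)
HY = -sum(p * math.log2(p) for p in py if p > 0)
# H(Y/X): every row of P is deterministic, so each conditional term is zero
HYgX = -sum(px[i] * P[i][j] * math.log2(P[i][j])
            for i in range(m) for j in range(m) if P[i][j] > 0)
print(HX, HY, HYgX)   # H(X) = H(Y) and H(Y/X) = 0
```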
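For Q4, a sketch of the Huffman procedure (the `huffman` helper is my own, not from any named library): repeatedly merge the two least probable nodes, prefixing one subtree's codewords with 0 and the other's with 1.

```python
import heapq
from itertools import count

def huffman(probs):
    """Build a binary Huffman code; returns {symbol: codeword}."""
    tie = count()  # tie-breaker so heapq never has to compare dicts
    heap = [(p, next(tie), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # two least probable nodes
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tie), merged))
    return heap[0][2]

probs = {"X1": 0.05, "X2": 0.15, "X3": 0.2, "X4": 0.05,
         "X5": 0.15, "X6": 0.3, "X7": 0.1}
code = huffman(probs)
avg_len = sum(probs[s] * len(code[s]) for s in probs)
print(code, avg_len)
```

For these probabilities the average codeword length works out to 2.6 bits/symbol, slightly above the source entropy of about 2.57 bits/symbol, as it must be.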

Upload: bhargabjyoti-saikia

Post on 18-Dec-2015


