Multiple Access Channels
TRANSCRIPT
-
8/8/2019 Multiple Access Channels
Network Information Theory
Multiple access channels
Broadcast channel
Capacity region
-
Network Information Theory
Communication problems: interference, cooperation and feedback.
Many senders and receivers and a channel transition matrix.
Distributed source coding (data compression), distributed communication (capacity region).
Multiple access channels
Broadcast channel
-
Examples of large communication networks: computer networks, satellite networks and the phone system.
Other channels: the relay channel, the interference channel and the two-way channel.
Relay channel: there is one source and one destination, but one or more intermediate sender-receiver pairs act as relays to facilitate the communication between the source and the destination.
-
Gaussian Multiple User Channels
The channel with input power P and additive white Gaussian noise of variance N is modeled by
Y_i = X_i + Z_i, \quad i = 1, 2, \ldots, n,
where the Z_i are i.i.d. Gaussian random variables with mean zero and variance N.
The signal X = (X_1, X_2, \ldots, X_n) has the power constraint
\frac{1}{n} \sum_{i=1}^{n} X_i^2 \le P.
For a single-user Gaussian channel Y = X + Z, the capacity is
C = \max_{p(x) : E[X^2] \le P} I(X; Y) = \frac{1}{2} \log \left( 1 + \frac{P}{N} \right) \text{ bits per transmission},
and any rate R < \frac{1}{2} \log \left( 1 + \frac{P}{N} \right) is achievable.
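As a quick numeric sanity check, the single-user formula can be evaluated directly (the function name and the sample values P = 15, N = 1 are illustrative, not from the slides):

```python
import math

def gaussian_capacity(snr):
    """AWGN channel capacity C = (1/2) * log2(1 + P/N), in bits per transmission."""
    return 0.5 * math.log2(1 + snr)

# P = 15, N = 1 gives C = (1/2) * log2(16) = 2 bits per transmission
print(gaussian_capacity(15.0))  # 2.0
```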
-
The Gaussian multiple access channel with m users:
Y = \sum_{i=1}^{m} X_i + Z, \quad \text{and} \quad C(x) = \frac{1}{2} \log (1 + x)
denotes the capacity of a single user at signal-to-noise ratio x.
The achievable rate region for the Gaussian multiple access channel is given by
R_i < C(P/N)
R_i + R_j < C(2P/N)
R_i + R_j + R_k < C(3P/N)
\vdots
\sum_{i=1}^{m} R_i < C(mP/N).
Here we have m codebooks, the i-th codebook having 2^{nR_i} codewords of power P.
Each of the independent transmitters chooses an arbitrary codeword from its own codebook, and all of them simultaneously send these vectors. The Gaussian noise is added: Y = \sum_i X_i + Z.
The receiver looks for the m codewords whose sum is closest to Y.
If (R_1, R_2, \ldots, R_m) is in the capacity region, then the probability of error goes to 0 as n tends to infinity.
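Since the region is defined by one constraint per nonempty subset of users, a membership test is easy to sketch (the function name and the symmetric example values are illustrative):

```python
import math
from itertools import combinations

def C(x):
    """Single-user Gaussian capacity C(x) = (1/2) * log2(1 + x)."""
    return 0.5 * math.log2(1 + x)

def in_gaussian_mac_region(rates, P, N):
    """Check whether a rate tuple lies in the m-user symmetric Gaussian MAC
    region: for every nonempty subset S, sum_{i in S} R_i <= C(|S| * P / N)."""
    m = len(rates)
    for k in range(1, m + 1):
        for subset in combinations(range(m), k):
            if sum(rates[i] for i in subset) > C(k * P / N):
                return False
    return True

# Symmetric sum-rate point: each user gets C(mP/N)/m
P, N, m = 1.0, 1.0, 3
sym = C(m * P / N) / m
print(in_gaussian_mac_region([sym] * m, P, N))    # True
print(in_gaussian_mac_region([C(P / N)] * m, P, N))  # False: pairwise sums too large
```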
-
The Gaussian broadcast channel
The model of the channel:
Y_1 = X + Z_1
Y_2 = X + Z_2
where Z_1 and Z_2 are arbitrarily correlated Gaussian random variables with variances N_1 and N_2 (with N_1 < N_2, so receiver 2 is the worse receiver). The sender wishes to send independent messages at rates R_1 and R_2 to receivers Y_1 and Y_2.
The capacity region:
R_1 < C\left( \frac{\alpha P}{N_1} \right) \quad \text{and} \quad R_2 < C\left( \frac{(1 - \alpha) P}{\alpha P + N_2} \right), \quad 0 \le \alpha \le 1.
The transmitter generates two codebooks, one with power \alpha P at rate R_1 and one with power (1 - \alpha) P at rate R_2, with codewords indexed by i \in \{1, 2, \ldots, 2^{nR_1}\} and j \in \{1, 2, \ldots, 2^{nR_2}\}.
The transmitter sends the sum of the codewords, X = X_1(i) + X_2(j).
Each receiver decodes its own message.
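The boundary of this region can be traced by sweeping the power split alpha; a minimal sketch, assuming the parametrization above (names and the sample values P = 10, N1 = 1, N2 = 4 are illustrative):

```python
import math

def C(x):
    """Single-user Gaussian capacity C(x) = (1/2) * log2(1 + x)."""
    return 0.5 * math.log2(1 + x)

def broadcast_boundary(P, N1, N2, steps=5):
    """Boundary of the Gaussian broadcast capacity region (N1 < N2), traced by
    the power split alpha: R1 = C(aP/N1), R2 = C((1-a)P/(aP + N2))."""
    pts = []
    for k in range(steps + 1):
        a = k / steps
        pts.append((C(a * P / N1), C((1 - a) * P / (a * P + N2))))
    return pts

for r1, r2 in broadcast_boundary(P=10.0, N1=1.0, N2=4.0):
    print(f"R1 = {r1:.3f}, R2 = {r2:.3f}")
```

At alpha = 0 all power serves the weak receiver; at alpha = 1 all power serves the strong one.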
-
The Gaussian relay channel
For the relay channel, we have a sender X and an ultimate intended receiver Y. Also present is a relay (receiving Y_1, transmitting X_1) intended solely to help the receiver.
Y_1 = X + Z_1
Y = X + Z_1 + X_1 + Z_2
where Z_1 and Z_2 are independent zero-mean Gaussian random variables with variances N_1 and N_2, respectively. The relay transmission is a causal function of its past observations:
X_{1i} = f_i(Y_{11}, Y_{12}, \ldots, Y_{1(i-1)}).
X has power P and X_1 has power P_1. The capacity is
C = \max_{0 \le \alpha \le 1} \min \left\{ C\left( \frac{P + P_1 + 2\sqrt{(1 - \alpha) P P_1}}{N_1 + N_2} \right), \ C\left( \frac{\alpha P}{N_1} \right) \right\}.
If \frac{P_1}{N_2} \ge \frac{P}{N_1}, then C = C\left( \frac{P}{N_1} \right).
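The max-min expression can be evaluated by a grid search over alpha; a numerical sketch only (names and sample values are illustrative):

```python
import math

def C(x):
    """Single-user Gaussian capacity C(x) = (1/2) * log2(1 + x)."""
    return 0.5 * math.log2(1 + x)

def relay_capacity(P, P1, N1, N2, steps=10000):
    """Gaussian relay channel capacity:
    C = max over alpha in [0, 1] of
        min{ C((P + P1 + 2*sqrt((1-a)*P*P1)) / (N1 + N2)), C(a*P/N1) }."""
    best = 0.0
    for k in range(steps + 1):
        a = k / steps
        r = min(C((P + P1 + 2 * math.sqrt((1 - a) * P * P1)) / (N1 + N2)),
                C(a * P / N1))
        best = max(best, r)
    return best

# When P1/N2 >= P/N1, the relay is strong enough that C = C(P/N1).
P, P1, N1, N2 = 1.0, 5.0, 1.0, 1.0
print(relay_capacity(P, P1, N1, N2))  # matches C(P/N1)
print(C(P / N1))
```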
-
The Gaussian relay channel
The channel appears to be noise-free after the relay, and the capacity C(P/N_1) from X to the relay can be achieved. Thus the rate C(P/(N_1 + N_2)) without the relay is increased by the presence of the relay to C(P/N_1). For large N_2, and for \frac{P_1}{N_2} \ge \frac{P}{N_1}, we see that the rate increases from C(P/(N_1 + N_2)) \approx 0 to C(P/N_1).
-
The Gaussian Two-way Channel
Very similar to the interference channel, with the additional provision that sender 1 is attached to receiver 2 and sender 2 is attached to receiver 1. The channel is described by p(y_1, y_2 | x_1, x_2).
Let P_1 and P_2 be the powers of transmitters 1 and 2, and N_1 and N_2 the noise variances of the two channels. Then the rates
R_1 < C(P_1/N_1) \quad \text{and} \quad R_2 < C(P_2/N_2)
can both be achieved.
-
Jointly Typical Sequences
Let (X_1, X_2, \ldots, X_k) denote a finite collection of discrete random variables with some fixed joint distribution p(x_1, x_2, \ldots, x_k). Let S denote a subset of these random variables and consider n independent copies of S. Thus
P(\mathbf{S} = \mathbf{s}) = \prod_{i=1}^{n} P(S_i = s_i).
For example, if S = (X_j, X_l), then
P(\mathbf{S} = \mathbf{s}) = P((\mathbf{X}_j, \mathbf{X}_l) = (\mathbf{x}_j, \mathbf{x}_l)) = \prod_{i=1}^{n} p(x_{ji}, x_{li}).
By the law of large numbers, for any subset S of the random variables,
-\frac{1}{n} \log p(S_1, S_2, \ldots, S_n) = -\frac{1}{n} \sum_{i=1}^{n} \log p(S_i) \to H(S).
Definition: The set A_\epsilon^{(n)} of \epsilon-typical n-sequences (\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_k) is defined by
A_\epsilon^{(n)}(X_1, X_2, \ldots, X_k) = \left\{ (\mathbf{x}_1, \ldots, \mathbf{x}_k) : \left| -\frac{1}{n} \log p(\mathbf{s}) - H(S) \right| < \epsilon \ \ \forall S \subseteq \{X_1, \ldots, X_k\} \right\}.
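The law-of-large-numbers statement can be illustrated empirically for a small pmf (the function name and the example distribution are illustrative):

```python
import math
import random

def empirical_aep(p, n, trials=500, seed=0):
    """Average of -(1/n) * log2 p(S_1, ..., S_n) over i.i.d. length-n draws
    from a pmf p (dict symbol -> probability); by the AEP this concentrates
    around the entropy H(S)."""
    rng = random.Random(seed)
    symbols = list(p)
    weights = [p[s] for s in symbols]
    total = 0.0
    for _ in range(trials):
        seq = rng.choices(symbols, weights=weights, k=n)
        total += -sum(math.log2(p[s]) for s in seq) / n
    return total / trials

p = {"a": 0.5, "b": 0.25, "c": 0.25}
H = -sum(q * math.log2(q) for q in p.values())  # 1.5 bits
print(H, empirical_aep(p, n=1000))  # both close to 1.5
```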
-
Jointly Typical Sequences
If S = (X_1, X_2), we have
A_\epsilon^{(n)}(X_1, X_2) = \left\{ (\mathbf{x}_1, \mathbf{x}_2) : \left| -\frac{1}{n} \log p(\mathbf{x}_1, \mathbf{x}_2) - H(X_1, X_2) \right| < \epsilon, \ \left| -\frac{1}{n} \log p(\mathbf{x}_1) - H(X_1) \right| < \epsilon, \ \left| -\frac{1}{n} \log p(\mathbf{x}_2) - H(X_2) \right| < \epsilon \right\}.
Notation: a_n \doteq 2^{n(b \pm \epsilon)} means \left| \frac{1}{n} \log_2 a_n - b \right| < \epsilon for sufficiently large n.
Theorem: For any \epsilon > 0, for sufficiently large n,
1. P(A_\epsilon^{(n)}(S)) \ge 1 - \epsilon, \ \forall S \subseteq \{X_1, X_2, \ldots, X_k\}.
2. \mathbf{s} \in A_\epsilon^{(n)}(S) \Rightarrow p(\mathbf{s}) \doteq 2^{-n(H(S) \pm 2\epsilon)}.
3. |A_\epsilon^{(n)}(S)| \doteq 2^{n(H(S) \pm 2\epsilon)}.
4. Let S_1, S_2 \subseteq \{X_1, X_2, \ldots, X_k\}. If (\mathbf{s}_1, \mathbf{s}_2) \in A_\epsilon^{(n)}(S_1, S_2), then p(\mathbf{s}_1 | \mathbf{s}_2) \doteq 2^{-n(H(S_1 | S_2) \pm 2\epsilon)}.
-
Jointly Typical Sequences
Theorem: Let S_1, S_2 be two subsets of X_1, X_2, \ldots, X_k. For any \epsilon > 0, define A_\epsilon^{(n)}(S_1 | \mathbf{s}_2) to be the set of \mathbf{s}_1 sequences that are jointly \epsilon-typical with a particular \mathbf{s}_2 sequence. If \mathbf{s}_2 \in A_\epsilon^{(n)}(S_2), then for sufficiently large n, we have
|A_\epsilon^{(n)}(S_1 | \mathbf{s}_2)| \le 2^{n(H(S_1 | S_2) + 2\epsilon)}
and
(1 - \epsilon) \, 2^{n(H(S_1 | S_2) - 2\epsilon)} \le \sum_{\mathbf{s}_2} p(\mathbf{s}_2) \, |A_\epsilon^{(n)}(S_1 | \mathbf{s}_2)|.
Let A_\epsilon^{(n)} denote the typical set for the probability mass function p(s_1, s_2, s_3), and let
P(\mathbf{S}_1' = \mathbf{s}_1, \mathbf{S}_2' = \mathbf{s}_2, \mathbf{S}_3' = \mathbf{s}_3) = \prod_{i=1}^{n} p(s_{1i} | s_{3i}) \, p(s_{2i} | s_{3i}) \, p(s_{3i}).
Then
P\{ (\mathbf{S}_1', \mathbf{S}_2', \mathbf{S}_3') \in A_\epsilon^{(n)} \} \doteq 2^{-n(I(S_1; S_2 | S_3) \pm 6\epsilon)}.
-
The multiple access channel
Definition: A discrete memoryless multiple access channel consists of three alphabets, X_1, X_2 and Y, and a probability transition matrix p(y | x_1, x_2).
A ((2^{nR_1}, 2^{nR_2}), n) code for the multiple access channel consists of two sets of integers W_1 = \{1, 2, \ldots, 2^{nR_1}\} and W_2 = \{1, 2, \ldots, 2^{nR_2}\}, called the message sets, two encoding functions
X_1 : W_1 \to X_1^n \quad \text{and} \quad X_2 : W_2 \to X_2^n,
and a decoding function
g : Y^n \to W_1 \times W_2.
[Figure: senders with messages W_1, W_2 produce inputs X_1, X_2 to the channel p(y | x_1, x_2); the receiver decodes (\hat{W}_1, \hat{W}_2) from Y.]
The average probability of error is
P_e^{(n)} = \frac{1}{2^{n(R_1 + R_2)}} \sum_{(w_1, w_2) \in W_1 \times W_2} P\{ g(Y^n) \ne (w_1, w_2) \mid (w_1, w_2) \text{ sent} \}.
-
The multiple access channel
The capacity region of the multiple access channel is the closure of the set of achievable rate pairs (R_1, R_2).
Theorem: The capacity region of a multiple access channel (X_1, X_2, p(y | x_1, x_2), Y) is the closure of the convex hull of all (R_1, R_2) satisfying
R_1 < I(X_1; Y | X_2), \quad R_2 < I(X_2; Y | X_1), \quad R_1 + R_2 < I(X_1, X_2; Y)
for some product distribution p_1(x_1) p_2(x_2) on X_1 \times X_2.
[Figure: example of the capacity region for a multiple access channel, a pentagon in the (R_1, R_2) plane with axis intercepts C_1 and C_2 and corner points involving I(X_1; Y) and I(X_2; Y).]
-
Independent binary symmetric channels
[Figure: two independent binary symmetric channels, X_1 \to Y_1 with crossover probability p_1 and X_2 \to Y_2 with crossover probability p_2.]
Capacity region: the rectangle R_1 \le C_1 = 1 - H(p_1), R_2 \le C_2 = 1 - H(p_2).
Binary multiplier channel: Y = X_1 X_2.
Setting X_2 = 1, we can send at a rate of 1 bit per transmission from sender 1 to the receiver. Similarly, if X_1 = 1, we can achieve R_2 = 1. The boundary of the capacity region is R_1 + R_2 = 1.
[Figure: capacity region with C_1 = C_2 = 1, bounded by R_1 + R_2 \le 1.]
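The rectangle corners C_i = 1 - H(p_i) are easy to evaluate numerically (the function name and the sample crossover probability are illustrative):

```python
import math

def bsc_capacity(p):
    """Capacity C = 1 - H(p) of a binary symmetric channel with crossover p."""
    if p in (0.0, 1.0):
        return 1.0
    return 1.0 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

# crossover 0.11 gives a capacity of roughly half a bit per transmission
print(bsc_capacity(0.11))
print(bsc_capacity(0.5))  # a completely noisy BSC carries nothing
```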
-
Binary erasure multiple access channel
Binary inputs X_1, X_2 \in \{0, 1\} and a ternary output Y = X_1 + X_2 \in \{0, 1, 2\}.
[Figure: capacity region with C_1 = C_2 = 1 and corner points at rate 1/2, i.e. R_1 \le 1, R_2 \le 1, R_1 + R_2 \le 3/2.]
Connection with the binary erasure channel (C_{BEC} = 1 - p): when sender 1 transmits at rate 1, sender 2 effectively sees an erasure channel with erasure probability p = 1/2, so it can still achieve 1 - p = 1/2 bit per transmission.
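For deterministic channels like the multiplier and erasure MACs, the three bounds of the capacity theorem reduce to conditional output entropies, since H(Y | X_1, X_2) = 0. A short script can compute them (function names are illustrative):

```python
import math
from itertools import product

def H(dist):
    """Entropy in bits of a dict mapping outcome -> probability."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def mac_bounds(px1, px2, f):
    """MAC theorem bounds for a deterministic channel Y = f(X1, X2) under a
    product input distribution. Since H(Y | X1, X2) = 0 here:
    I(X1; Y | X2) = H(Y | X2), I(X2; Y | X1) = H(Y | X1), I(X1, X2; Y) = H(Y)."""
    py, px2y, px1y = {}, {}, {}
    for x1, x2 in product(px1, px2):
        p = px1[x1] * px2[x2]
        y = f(x1, x2)
        py[y] = py.get(y, 0) + p
        px2y[(x2, y)] = px2y.get((x2, y), 0) + p
        px1y[(x1, y)] = px1y.get((x1, y), 0) + p
    return (H(px2y) - H(px2),   # I(X1; Y | X2) = H(X2, Y) - H(X2)
            H(px1y) - H(px1),   # I(X2; Y | X1)
            H(py))              # I(X1, X2; Y)

# Binary erasure MAC, Y = X1 + X2, uniform inputs
u = {0: 0.5, 1: 0.5}
print(mac_bounds(u, u, lambda a, b: a + b))  # (1.0, 1.0, 1.5)
```

The sum-rate bound of 3/2 bits matches the region above.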
-
The capacity region of the multiple access channel
The closure of the set of achievable (R_1, R_2).
Theorem: The capacity region of a multiple access channel (X_1, X_2, p(y | x_1, x_2), Y) is the closure of the convex hull of all (R_1, R_2) satisfying
R_1 < I(X_1; Y | X_2), \quad R_2 < I(X_2; Y | X_1), \quad R_1 + R_2 < I(X_1, X_2; Y)
for some product distribution p_1(x_1) p_2(x_2) on X_1 \times X_2.
The point A corresponds to the maximum rate achievable from sender 1 to the receiver when sender 2 is not sending any information. This is
\max R_1 = \max_{p_1(x_1) p_2(x_2)} I(X_1; Y | X_2).
For any distribution p_1(x_1) p_2(x_2),
I(X_1; Y | X_2) = \sum_{x_2} p_2(x_2) \, I(X_1; Y | X_2 = x_2) \le \max_{x_2} I(X_1; Y | X_2 = x_2).
[Figure: achievable region of the multiple access channel for a fixed input distribution, a pentagon with corners A, B, C, D bounded by I(X_1; Y | X_2), I(X_2; Y | X_1), I(X_1; Y), I(X_2; Y) and the sum-rate constraint.]
-
Multiple access channel: Gaussian MAC
Independent Gaussian input distributions achieve the capacity region:
I(X_1; Y | X_2) = h(Y | X_2) - h(Y | X_1, X_2)
= h(X_1 + X_2 + Z | X_2) - h(X_1 + X_2 + Z | X_1, X_2)
= h(X_1 + Z | X_2) - h(Z | X_1, X_2)
= h(X_1 + Z) - h(Z)
\le \frac{1}{2} \log 2\pi e (P_1 + N) - \frac{1}{2} \log 2\pi e N
= \frac{1}{2} \log \left( 1 + \frac{P_1}{N} \right).
-
Multiple access channel: Gaussian capacity region
With C(x) = \frac{1}{2} \log (1 + x), the capacity region is
R_1 < C\left( \frac{P_1}{N} \right), \quad R_2 < C\left( \frac{P_2}{N} \right), \quad R_1 + R_2 < C\left( \frac{P_1 + P_2}{N} \right).
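The two corner points of this pentagon correspond to successive (onion-peeling) decoding: one user is decoded first, treating the other as noise, then subtracted. A sketch with illustrative names and values:

```python
import math

def C(x):
    """Single-user Gaussian capacity C(x) = (1/2) * log2(1 + x)."""
    return 0.5 * math.log2(1 + x)

def gaussian_mac_corners(P1, P2, N):
    """Corner points of the two-user Gaussian MAC pentagon, each achieved by
    successive decoding."""
    # decode user 1 first: user 1 sees user 2's signal as extra noise
    corner_a = (C(P1 / (P2 + N)), C(P2 / N))
    # decode user 2 first
    corner_b = (C(P1 / N), C(P2 / (P1 + N)))
    return corner_a, corner_b

a, b = gaussian_mac_corners(P1=1.0, P2=1.0, N=1.0)
print(a, b)
# both corners lie on the sum-rate bound C((P1 + P2) / N)
print(a[0] + a[1], C(2.0))
```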
-
Example:
-
Distributed Source Coding
How to encode a source X: a rate R > H(X) is sufficient.
Two sources (X, Y) ~ p(x, y): a rate R > H(X, Y) is sufficient if they are encoded jointly.
If the X-source and the Y-source are described separately, R = R_x + R_y > H(X) + H(Y) is sufficient.
Surprisingly, a total rate R = H(X, Y) is sufficient even with separate encoders.
-
Slepian-Wolf Theorem
Theorem: For the distributed source coding problem for the source (X, Y) drawn i.i.d. ~ p(x, y), the achievable rate region is given by
R_1 \ge H(X | Y), \quad R_2 \ge H(Y | X), \quad R_1 + R_2 \ge H(X, Y).
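The three Slepian-Wolf bounds can be computed for any joint pmf; a sketch with an illustrative doubly symmetric binary source (helper names are not from the slides):

```python
import math

def H(probs):
    """Entropy in bits of an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def slepian_wolf_region(pxy):
    """Slepian-Wolf bounds for a joint pmf {(x, y): p}:
    R1 >= H(X|Y), R2 >= H(Y|X), R1 + R2 >= H(X, Y)."""
    px, py = {}, {}
    for (x, y), p in pxy.items():
        px[x] = px.get(x, 0) + p
        py[y] = py.get(y, 0) + p
    Hxy = H(pxy.values())
    return Hxy - H(py.values()), Hxy - H(px.values()), Hxy

# X ~ Bernoulli(1/2), Y = X flipped with probability 0.1
pxy = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
h1, h2, hxy = slepian_wolf_region(pxy)
print(h1, h2, hxy)  # H(X|Y) = H(Y|X) = H(0.1) ≈ 0.469, H(X,Y) ≈ 1.469
```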
-
Broadcast Channel
One-to-many channel: downlink of cellular or satellite channels; TV and radio broadcasting, DMB, DVB.
-
General capacity region: unknown.
Capacity region known for the degraded broadcast channel.
Physically degraded: X \to Y_1 \to Y_2 forms a Markov chain.
Stochastically degraded: same conditional marginal distributions as a physically degraded channel.
Broadcast capacity depends only on the conditional marginal distributions, since the users do not cooperate.
Superposition coding is optimal. Example: the Gaussian broadcast channel.
Broadcast capacity region: the convex hull of the closure of all (R_1, R_2) such that
R_1 \le I(X; Y_1 | U) \quad \text{and} \quad R_2 \le I(U; Y_2)
for some joint distribution p(u) p(x | u) p(y_1, y_2 | x).
For the binary symmetric broadcast channel with crossover probabilities p_1 < p_2,
R_2 \le I(U; Y_2) = H(Y_2) - H(Y_2 | U) = 1 - H(\beta * p_2),
where \beta * p_2 = \beta (1 - p_2) + (1 - \beta) p_2.
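The boundary of this region can be swept numerically. The sketch below uses the standard superposition-coding parametrization of the degraded binary symmetric broadcast channel, in which the companion bound R_1 \le H(\beta * p_1) - H(p_1) (not shown on the slide) pairs with the R_2 bound above; function names and sample values are illustrative:

```python
import math

def Hb(p):
    """Binary entropy H(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def conv(b, p):
    """beta * p = beta*(1 - p) + (1 - beta)*p, the crossover of two cascaded BSCs."""
    return b * (1 - p) + (1 - b) * p

def bsc_broadcast_boundary(p1, p2, steps=4):
    """Boundary of the degraded BSC broadcast region (p1 < p2 < 1/2), swept by
    the auxiliary parameter beta in [0, 1/2]:
    R1 = H(beta * p1) - H(p1), R2 = 1 - H(beta * p2)."""
    pts = []
    for k in range(steps + 1):
        b = 0.5 * k / steps
        pts.append((Hb(conv(b, p1)) - Hb(p1), 1 - Hb(conv(b, p2))))
    return pts

for r1, r2 in bsc_broadcast_boundary(0.1, 0.2):
    print(f"R1 = {r1:.3f}, R2 = {r2:.3f}")
```

At beta = 0 all of the rate goes to the weak receiver (R_2 = 1 - H(p_2)); at beta = 1/2 the channel to receiver 2 is useless and R_1 reaches 1 - H(p_1).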
-
Gaussian Broadcast Channel
-
Gaussian Broadcast Channel
-
Example:
-
Example:
-
Gaussian Vector Broadcast Channel