Example: 2-input, 4-output DMC
Example
Example: BSC

• Transition probability p = 0.3
• Recall: MLD decodes to the closest codeword in Hamming distance
• Note: Extremely poor channel! Channel capacity = 0.12
• Code rate = 5/21 ≈ 0.24
Example

[Trellis diagram with branch and accumulated path metrics; numeric labels not recoverable]
The Viterbi algorithm for the binary-input AWGN channel

• Recall from Chapter 10: the metrics are correlations
• Decode to the (modulated) codeword v whose correlation r⋅v with the received sequence r is maximum
• Bit metric: M(r_l|v_l) = r_l⋅v_l
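As a sketch of this decoding rule, the Python below implements a Viterbi decoder that maximizes the accumulated correlation r⋅v over the trellis. The rate-1/2, memory-2 code with octal generators (7,5) is an assumed example code, not necessarily the one used on these slides.

```python
# Sketch: Viterbi decoding with the correlation metric M(r_l|v_l) = r_l * v_l.
# The (7,5) octal rate-1/2 code is a hypothetical stand-in.

G = [0b111, 0b101]  # generator taps; register bits are (input, s1, s0)

def encode(bits):
    """Convolutional encoder; returns one 0/1 output pair per input bit."""
    state, out = 0, []
    for b in bits:
        reg = (b << 2) | state
        out += [bin(reg & g).count("1") % 2 for g in G]
        state = reg >> 1
    return out

def modulate(code_bits):
    """BPSK mapping 0 -> +1, 1 -> -1."""
    return [1 - 2 * b for b in code_bits]

def viterbi_decode(r, n_info):
    """Maximize the correlation sum(r_l * v_l) over all trellis paths."""
    NEG = float("-inf")
    metric = [0.0, NEG, NEG, NEG]          # start in the all-zero state
    paths = [[], None, None, None]
    for t in range(n_info):
        rl = r[2 * t: 2 * t + 2]
        new_metric = [NEG] * 4
        new_paths = [None] * 4
        for s in range(4):
            if metric[s] == NEG:
                continue
            for b in (0, 1):
                reg = (b << 2) | s
                v = [1 - 2 * (bin(reg & g).count("1") % 2) for g in G]
                m = metric[s] + rl[0] * v[0] + rl[1] * v[1]  # correlation
                ns = reg >> 1
                if m > new_metric[ns]:     # keep the survivor per state
                    new_metric[ns] = m
                    new_paths[ns] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    best = max(range(4), key=lambda s: metric[s])
    return paths[best]
```

With two zero tail bits the encoder returns to the all-zero state, and in the noiseless case the decoder recovers the input sequence exactly.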
Performance bounds

• Linear code and symmetric DMC
• We can w.l.o.g. assume that the all-zero codeword is transmitted
• If we make a mistake, the wrong codeword is something else
• Thus, the decoding mistake takes place at the time instant in the trellis when the wrong codeword and the all-zero codeword merge
• Therefore, we are only interested in the elementary codewords (first event errors)
Performance bounds: BSC

• Assume that the wrong codeword has weight d
• Then, the first event error probability at time t is bounded as follows:

  P_f(E,t) ≤ Σ_{d=d_free}^∞ A_d P_d

• P_d is channel specific
Performance bounds

a) Note: the bound for P_f(E,t) is a union bound
   – The probability of a union of events is upper bounded by the sum of the probabilities of the individual events
b) In the bound for P_f(E,t) we only need to sum the contributions of all incorrect paths of length t branches or less, since these are the paths that can cause a first event error at time t
c) Of course, we can also extend the summation to take into account all incorrect paths of any length, to make the bound independent of t
d) When the bound is independent of t, it holds for all time instants
e) Thus, P_f(E) (the first event error probability at any time) is bounded by the same expression
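The union bound can be evaluated numerically. The sketch below uses the slide's BSC with p = 0.3 and a hypothetical truncated weight spectrum A_d; P_d here is the probability that a weight-d codeword beats the all-zero codeword under MLD.

```python
from math import comb

def P_d(d, p):
    """Pairwise error probability between two codewords at Hamming
    distance d on a BSC(p); ties (even d) are broken randomly."""
    prob = sum(comb(d, e) * p**e * (1 - p)**(d - e)
               for e in range(d // 2 + 1, d + 1))
    if d % 2 == 0:  # half of the tie events cause an error
        prob += 0.5 * comb(d, d // 2) * (p * (1 - p))**(d // 2)
    return prob

def union_bound(A, p):
    """First event error bound sum_d A_d * P_d for a spectrum {d: A_d}."""
    return sum(a * P_d(d, p) for d, a in A.items())

# Hypothetical truncated weight spectrum {distance: multiplicity}
A = {5: 1, 6: 2}
bound = union_bound(A, 0.3)
```

Note how large the individual P_d terms are at p = 0.3, matching the slide's remark that this is an extremely poor channel.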
Simplifications of P_d for odd d
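The slide's equations did not survive extraction; the standard BSC expressions consistent with the surrounding bounds are, for odd d,

```latex
P_d = \sum_{e=\frac{d+1}{2}}^{d} \binom{d}{e}\, p^{e} (1-p)^{d-e},
```

which (for any d, and in particular odd d) can be further bounded by the simpler closed form

```latex
P_d < \left[\, 2\sqrt{p(1-p)}\, \right]^{d}.
```

The second form is what makes the sums over d tractable in the bounds that follow.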
Event error probability

a) After one or more event errors have occurred, the two paths compared at the all-zero state at time t, v' and v'', will both be incorrect paths. The correct path v has been eliminated for the first time by v' at a previous time instant
b) If the partial path metric for v'' exceeds the partial path metric for v' at time t, then it must also exceed the partial path metric for v
c) Thus, the event that v'' has a better metric than v' at time t is contained in the event that v'' has a better metric than v at time t, and it follows that P(E,t) ≤ P_f(E,t)
Event error probability

a) The situation on the previous slide is not the only way an event error can follow an event error made earlier
b) The same bound on P(E,t) holds for any event error occurring at time t
c) To summarize: the event error probability at any time can be upper bounded using the upper bound for P_f(E,t), which is independent of t
Bit error probability

a) Each event error causes a number of information bit errors equal to the number of non-zero information bits on the incorrect path
b) If each event error probability term P_d is weighted by the total number of non-zero information bits on all weight-d paths, a bound on the expected number of information bit errors made at any time results
c) Divide by k to obtain a bound on P_b(E). Then, we get the following expression:

  P_b(E) < Σ_{d=d_free}^∞ B_d P_d
Performance bounds: Binary-input hard quantized AWGN channel

• BSC derived from the AWGN channel:

  p = Q(√(2E_s/N_0)) ≈ (1/2) e^{−E_s/N_0}

• Asymptotic coding gain γ = 10 log₁₀(R·d_free/2)

  P_b(E) ≈ B_{d_free} 2^{d_free/2} e^{−d_free E_s/(2N_0)} = B_{d_free} 2^{d_free/2} e^{−d_free R E_b/(2N_0)}
Performance bounds: Binary-input Q-ary quantized AWGN channel

• P(E) < A(X)|_{X=D_0} and P_b(E) < B(X)|_{X=D_0}, where D_0 is the Bhattacharyya parameter:

  D_0 = Σ_{j=0}^{Q−1} √( P(j|0) P(j|1) )
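A quick numeric sketch of the Bhattacharyya parameter for a Q-level quantized channel; the transition probabilities below are hypothetical. As a sanity check, with Q = 2 the channel is a BSC and D_0 reduces to 2√(p(1−p)).

```python
from math import sqrt

def bhattacharyya(p_given_0, p_given_1):
    """D0 = sum_j sqrt(P(j|0) * P(j|1)) for a Q-ary output channel."""
    return sum(sqrt(a * b) for a, b in zip(p_given_0, p_given_1))

# Q = 2 (BSC with crossover 0.3): D0 = 2*sqrt(0.3*0.7)
d0_bsc = bhattacharyya([0.7, 0.3], [0.3, 0.7])

# Q = 4: hypothetical soft-quantized transition probabilities
d0_q4 = bhattacharyya([0.5, 0.3, 0.15, 0.05], [0.05, 0.15, 0.3, 0.5])
```

A smaller D_0 gives a tighter bound; in this (made-up) example the 4-level quantization yields a smaller D_0 than the hard-decision BSC.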
P_d for the binary-input AWGN channel

• Count only the positions where v' and v differ
• v correct: (−1, ..., −1); v' incorrect
• Sum of d independent Gaussian random variables ~ N(−1, N_0/(2E_s))
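The slide's final equation is missing from the extract; from the statistics just stated (a sum of d independent N(−1, N_0/2E_s) variables, which errs when the sum exceeds zero), the standard form follows as:

```latex
P_d
  = Q\!\left( \frac{d}{\sqrt{d N_0 / 2E_s}} \right)
  = Q\!\left( \sqrt{\frac{2 d E_s}{N_0}} \right)
  = Q\!\left( \sqrt{\frac{2 d R E_b}{N_0}} \right),
```

using E_s = R·E_b in the last step.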
P_d for the binary-input AWGN channel

• Asymptotic coding gain γ = 10 log₁₀(R·d_free)

  P(E) < Σ_{d=d_free}^∞ A_d Q(√(2 d R E_b/N_0))

  P_b(E) < Σ_{d=d_free}^∞ B_d Q(√(2 d R E_b/N_0)) ≈ B_{d_free} e^{−d_free R E_b/N_0}
Rate 1/3 convolutional code

• Asymptotic coding gain γ = 10 log₁₀((1/3)·7) = 3.68 dB
• Coding gain observed in the plot: ≈ 3.3 dB
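The asymptotic gains quoted here can be checked directly; the d_free = 7 value is taken from the slide's γ computation, and the hard-decision formula from the earlier slide is included for comparison.

```python
from math import log10

def acg_soft(R, dfree):
    """Asymptotic coding gain (dB), unquantized AWGN: 10*log10(R*dfree)."""
    return 10 * log10(R * dfree)

def acg_hard(R, dfree):
    """Asymptotic coding gain (dB), hard-quantized: 10*log10(R*dfree/2)."""
    return 10 * log10(R * dfree / 2)

gain = acg_soft(1 / 3, 7)   # the slide's rate-1/3 example: 3.68 dB
```

The soft/hard difference is 10·log₁₀(2) ≈ 3 dB for any code, which is the familiar asymptotic penalty for hard decisions.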
Some further comments

• The union bound is poor for small SNRs
  • Contributions from higher-order terms become significant
  • When can you stop the summation?
• Other bounds: divide the summation into two parts (near / far):

  P_b(E) < Σ_{d=d_free}^∞ B_d Q(√(2 d R E_b/N_0))
         = Σ_{d=d_free}^{d_m} B_d Q(√(2 d R E_b/N_0)) + Σ_{d=d_m+1}^∞ B_d Q(√(2 d R E_b/N_0))

• Replace the second ("far") sum by a bound
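A numeric sketch of the near/far split: evaluate the first terms up to d_m exactly with Q(x) = erfc(x/√2)/2; the remaining tail would be replaced by a closed-form bound (e.g., the Bhattacharyya-based B(X) bound from a few slides back). The distance spectrum B_d below is hypothetical.

```python
from math import erfc, sqrt

def Q(x):
    """Gaussian tail function Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * erfc(x / sqrt(2))

def near_sum(B, R, EbN0, d_m):
    """'Near' part of the bound: exact Q-function terms for d <= d_m."""
    return sum(Bd * Q(sqrt(2 * d * R * EbN0))
               for d, Bd in B.items() if d <= d_m)

# Hypothetical distance spectrum {d: B_d} for a rate-1/2 code
B = {5: 1, 6: 4, 7: 12, 8: 32}
bound = near_sum(B, 0.5, 10 ** (4 / 10), d_m=6)  # Eb/N0 = 4 dB
```

Because Q decays so quickly in d, the near terms dominate at moderate-to-high SNR, which is why truncating and bounding the tail gives a usable result there but not at small SNR.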
Performance: Optimum R=1/2 codes on the AWGN channel
Performance: Optimum R=1/2 codes on the hard quantized AWGN channel
R=1/2, ν=4 code with varying quantization
Rate 1/3 codes
Rate 2/3 codes
Code design: Computer search

• Criteria:
  • High d_free
  • Low A_{d_free} (and low B_{d_free}?)
  • High slope / low truncation length
• Use the Viterbi algorithm to determine the weight properties
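The slide proposes running the Viterbi algorithm on path weights; the sketch below does the equivalent with a Dijkstra search over the trellis (branch metric = Hamming weight of the branch output), finding the free distance as the minimum-weight path that leaves the all-zero state and first returns to it. The (7,5) octal rate-1/2 code is a hypothetical example; its free distance is 5.

```python
import heapq

G = [0b111, 0b101]  # hypothetical (7,5) octal rate-1/2 code

def branch_weight(state, b):
    """Hamming weight of the branch output and the next state."""
    reg = (b << 2) | state
    w = sum(bin(reg & g).count("1") % 2 for g in G)
    return w, reg >> 1

def free_distance():
    """Minimum weight over all paths that diverge from and remerge
    with the all-zero state (Dijkstra over the 4-state trellis)."""
    w0, s0 = branch_weight(0, 1)   # force a '1' to leave the zero state
    dist = {s0: w0}
    pq = [(w0, s0)]
    while pq:
        w, s = heapq.heappop(pq)
        if s == 0:                 # first return to the zero state
            return w
        if w > dist.get(s, float("inf")):
            continue
        for b in (0, 1):
            bw, ns = branch_weight(s, b)
            if w + bw < dist.get(ns, float("inf")):
                dist[ns] = w + bw
                heapq.heappush(pq, (w + bw, ns))
```

Extending this search to accumulate path multiplicities (rather than only the minimum) yields the A_d and B_d spectrum terms used in the bounds above.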
Example: Viterbi algorithm for distance properties calculation

[Trellis diagram with accumulated path weights; numeric labels not recoverable]