Encoding, decoding, and Shannon's Theorem
Exercise 1: Assume communication is over a BSC with crossover probability ς.
(a) Using Example 1.11.7, compute Perr for the extended Hamming code H^3.
(b) Prove that the values of Perr for both H3, found in Example 1.11.9, and H^3 are equal.
(c) Which code, H3 or H^3, would be better to use when communicating over a BSC? Why?
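As a numerical check on parts (a)-(c), the following sketch evaluates Perr from a coset-leader weight distribution, assuming nearest-neighbor decoding and the standard formula Pcorr = Σ_i a_i ς^i (1 − ς)^(n−i), where a_i is the number of coset leaders of weight i. The coset-leader counts used (1, 7 for the [7, 4] Hamming code H3; 1, 8, 7 for the [8, 4] extended code H^3) are standard, but the variable names and the sample crossover probability are illustrative choices, not from the text.

```python
# Decoding-error probability over a BSC from a coset-leader
# weight distribution: P_corr = sum_i a_i * p^i * (1-p)^(n-i).

def perr(n, leader_counts, p):
    """P_err for a length-n code whose coset leaders have the given
    weight distribution (leader_counts[i] = number of weight i)."""
    p_corr = sum(a * p**i * (1 - p)**(n - i)
                 for i, a in enumerate(leader_counts))
    return 1 - p_corr

crossover = 0.01  # illustrative value of the crossover probability
# H3 = [7, 4] Hamming code (perfect): coset leaders of weight 0 and 1.
p_h3 = perr(7, [1, 7], crossover)
# H^3 = [8, 4] extended Hamming code: leaders of weight 0, 1, and 2.
p_h3_ext = perr(8, [1, 8, 7], crossover)
print(p_h3, p_h3_ext)  # the two values agree, as part (b) asserts
```

Expanding both expressions shows they share the common factor (1 − ς)^6 (1 + 6ς), which is the algebraic identity part (b) asks for.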
Exercise 2: Assume communication is over a BSC with crossover probability ς using the [23, 12, 7] binary Golay code G23.
(a) In Exercises 78 and 80 you will see that for G23 there are (23 choose i) cosets of weight i for 0 ≤ i ≤ 3 and no others. Compute Perr for this code.
(b) Compare Perr for sending 2^12 binary messages unencoded to encoding with G23 when ς = 0.01.
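A sketch of the comparison in part (b), assuming the coset-leader counts from part (a): G23 is perfect with covering radius 3, so its coset leaders are exactly the vectors of weight at most 3. The unencoded baseline treats a 12-bit message as received correctly only if no bit flips; the helper name and printout are illustrative, not from the text.

```python
from math import comb

def perr(n, leader_counts, p):
    # P_err = 1 - sum_i a_i * p^i * (1-p)^(n-i) over leader weights i
    return 1 - sum(a * p**i * (1 - p)**(n - i)
                   for i, a in enumerate(leader_counts))

p = 0.01
# G23 is perfect: (23 choose i) coset leaders of each weight i <= 3.
p_g23 = perr(23, [comb(23, i) for i in range(4)], p)
# Unencoded: a 12-bit message survives only if no bit is flipped.
p_unencoded = 1 - (1 - p)**12
print(p_unencoded, p_g23)
```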
Exercise 3: Assume communication is over a BSC with crossover probability ς using the [24, 12, 8] extended binary Golay code G24.
(a) In Example 8.3.2 you will see that for G24 there are 1, 24, 276, 2024, and 1771 cosets of weights 0, 1, 2, 3, and 4, respectively. Compute Perr for this code.
(b) Prove that the values of Perr for both G23, found in Exercise 2, and G24 are equal.
(c) Which code, G23 or G24, would be better to use when communicating over a BSC? Why?
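The equality claimed in part (b) can be checked numerically with the same coset-leader formula, using the G24 leader counts quoted in part (a). This is only a sanity check under an illustrative crossover probability, not the proof the exercise asks for.

```python
from math import comb

def perr(n, leader_counts, p):
    # P_err = 1 - sum_i a_i * p^i * (1-p)^(n-i) over leader weights i
    return 1 - sum(a * p**i * (1 - p)**(n - i)
                   for i, a in enumerate(leader_counts))

p = 0.01
# G23: perfect, so leaders are all vectors of weight <= 3.
p_g23 = perr(23, [comb(23, i) for i in range(4)], p)
# G24: coset-leader weight distribution quoted in part (a).
p_g24 = perr(24, [1, 24, 276, 2024, 1771], p)
print(p_g23, p_g24)  # numerically equal, as part (b) claims
```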
For a BSC with crossover probability ς, the capacity of the channel is
C(ς) = 1 + ς log2 ς + (1 − ς) log2(1 − ς).
The capacity C(ς) = 1 − H2(ς), where H2(ς) is the binary entropy function that we define in Section 2.10.3. For binary symmetric channels, Shannon's Theorem is as follows.
Theorem 1.11.10 (Shannon) Let δ > 0 and R < C(ς). Then for large enough n, there exists an [n, k] binary linear code C with k/n ≥ R such that Perr < δ when C is used for communication over a BSC with crossover probability ς. Furthermore, no such code exists if R > C(ς).
Shannon's Theorem remains valid for nonbinary codes and other channels provided the channel capacity is defined appropriately. The fraction k/n is called the rate, or information rate, of an [n, k] code and gives a measure of how much information is being transmitted; we discuss this more extensively in Section 2.10.
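The capacity formula above can be sketched directly; this snippet just evaluates C(ς) at a couple of points, taking the endpoints ς = 0 and ς = 1 as the limiting value C = 1 (an assumption made here to avoid log2(0); the sample points are illustrative).

```python
from math import log2

def capacity(p):
    """BSC capacity C(p) = 1 + p*log2(p) + (1-p)*log2(1-p),
    with the endpoints p = 0, 1 taken as the limit C = 1."""
    if p in (0.0, 1.0):
        return 1.0
    return 1 + p * log2(p) + (1 - p) * log2(1 - p)

# By Shannon's Theorem, any rate R < capacity(p) is achievable.
print(capacity(0.1))  # about 0.531
print(capacity(0.5))  # 0: the rate region collapses at p = 1/2
```

Note the symmetry C(ς) = C(1 − ς), which is visible in the graph requested in Exercise 4.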
Exercise 4: Do the following.
(a) Graph the channel capacity as a function of ς for 0 < ς < 1.
(b) In your graph, what is the region in which arbitrarily reliable communication can occur according to Shannon's Theorem?
(c) What is the channel capacity when ς = 1/2? What does Shannon's Theorem say about communication when ς = 1/2? (See Footnote 5 earlier in this section.)