Problem 1: Entropy and mutual information proofs.
1. If A → B → C is a Markov chain, prove that I(A; B) ≥ I(A; C). Recall that A → B → C is a Markov chain if A and C are conditionally independent given B. (A numerical sanity check for this part appears after this problem.)
2. Show that H(Y|X) = 0 if and only if there exists a function φ such that Y = φ(X), i.e. if and only if Y is determined by X.
3. Prove that H(g(X)) ≤ H(X) for any function g.
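As a sanity check on part 1, here is a minimal Python sketch that verifies the inequality on one small, made-up Markov chain A → B → C; the particular distributions (p_a, p_b_given_a, p_c_given_b) are illustrative assumptions, not part of the problem.

import numpy as np

# Illustrative (assumed) distributions for a Markov chain A -> B -> C.
p_a = np.array([0.3, 0.7])                       # P(A)
p_b_given_a = np.array([[0.9, 0.1],              # P(B | A), rows indexed by a
                        [0.2, 0.8]])
p_c_given_b = np.array([[0.6, 0.4],              # P(C | B), rows indexed by b
                        [0.1, 0.9]])

def mutual_information(p_xy):
    # I(X; Y) in bits for a joint distribution given as a 2-D array.
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x * p_y)[mask])))

# Joint distributions implied by the chain: P(A, B) and P(A, C).
p_ab = p_a[:, None] * p_b_given_a        # P(A = a, B = b) = P(a) P(b | a)
p_ac = p_ab @ p_c_given_b                # P(A = a, C = c) = sum_b P(a, b) P(c | b)

print("I(A;B) =", mutual_information(p_ab))
print("I(A;C) =", mutual_information(p_ac))   # expect I(A;B) >= I(A;C)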
Problem 2: The joint density function of X and Y is given by:
f(x, y) = (1/y) e^(-(y + x/y)), x > 0, y > 0
Find E[X], E[Y], and show that Cov(X, Y) = 1.
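A quick Monte Carlo check of the requested quantities, under the assumption that the density above factors as f(y) = e^(-y) (Y exponential with mean 1) times f(x|y) = (1/y) e^(-x/y) (X given Y = y exponential with mean y); a sketch, not a solution:

import numpy as np

rng = np.random.default_rng(0)
n = 10**6

# Assumed factorization: f(x, y) = (1/y) e^(-(y + x/y)) = [e^(-y)] * [(1/y) e^(-x/y)],
# i.e. Y ~ Exponential(mean 1) and X | Y = y ~ Exponential(mean y).
y = rng.exponential(scale=1.0, size=n)
x = rng.exponential(scale=y)

print("E[X] ~", x.mean())                  # should be close to 1
print("E[Y] ~", y.mean())                  # should be close to 1
print("Cov(X, Y) ~", np.cov(x, y)[0, 1])   # should be close to 1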
Problem 3: The inhabitants of a certain village are divided into two groups A and B. Half the people in group A always tell the truth, three tenths always lie, and two tenths always refuse to answer. In group B, three tenths of the people are truthful, half are liars, and two tenths always refuse to answer. Let p be the probability that a person selected at random belongs to group A, and let I(p) be the information conveyed about a person's truth-telling status by specifying his group membership. What proportion of group A in the village would maximize this information, and what is the value of this maximal information?
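A hedged numerical sketch of the quantity being maximized: writing G for group membership (A with probability p) and T for truth-telling status (truthful, liar, refuses), I(p) = I(G; T) = H(T) - H(T|G), which can then be scanned over p. The function names below (entropy, info) are illustrative only.

import numpy as np

def entropy(p):
    # Shannon entropy in bits; zero-probability outcomes contribute 0.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Truth-telling distributions within each group: (truthful, liar, refuses).
t_given_A = np.array([0.5, 0.3, 0.2])
t_given_B = np.array([0.3, 0.5, 0.2])

def info(p):
    # I(p) = I(G; T) = H(T) - H(T | G) when P(G = A) = p.
    t_marginal = p * t_given_A + (1 - p) * t_given_B
    return entropy(t_marginal) - (p * entropy(t_given_A) + (1 - p) * entropy(t_given_B))

ps = np.linspace(0.0, 1.0, 1001)
vals = np.array([info(p) for p in ps])
print("maximizing p ~", ps[vals.argmax()])
print("maximal information (bits) ~", vals.max())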
Problem 4: We are investigating two SNP biomarkers B and C and a binary phenotype Y. Let XB, XC denote discrete random variables for the two SNPs B and C, both taking values in {AA, Aa, aa}. Assume the joint distribution of Y, XB, XC is determined by the following probability distributions:
XB ⊥ Y
P(XB = aa) = 0.2, P(XB = Aa) = 0.5, P(XB = AA) = 0.3
P(XC = aa|Y = 0) = 0.25, P(XC = Aa|Y = 0) = 0.5, P(XC = AA|Y = 0) = 0.25
P(XC = aa|Y = 1) = 0, P(XC = Aa|Y = 1) = 0.1, P(XC = AA|Y = 1) = 0.9
P(Y = 0) = 0.6, P(Y = 1) = 0.4
1. Compute the entropies H(XB) and H(XC).
2. Compute H(Y).
3. Compute the conditional entropies H(XB|Y) and H(XC|Y).
4. Let PB(x) and PC(x) denote the probability mass functions of XB and XC. Compute the KL divergence D(PB||PC). (A numerical sketch covering parts 1-4 follows.)
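A minimal Python sketch, assuming entropies and the KL divergence are measured in bits, that computes parts 1-4 directly from the distributions above; the marginal of XC is obtained by mixing its conditional distributions with P(Y).

import numpy as np

def H(p):
    # Shannon entropy in bits, ignoring zero-probability outcomes.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

p_y = np.array([0.6, 0.4])                      # P(Y = 0), P(Y = 1)
p_xb = np.array([0.2, 0.5, 0.3])                # P(XB = aa, Aa, AA)
p_xc_given_y = np.array([[0.25, 0.50, 0.25],    # P(XC = aa, Aa, AA | Y = 0)
                         [0.00, 0.10, 0.90]])   # P(XC = aa, Aa, AA | Y = 1)
p_xc = p_y @ p_xc_given_y                       # marginal P(XC) = sum_y P(y) P(XC | y)

print("H(XB) =", H(p_xb))
print("H(XC) =", H(p_xc))
print("H(Y)  =", H(p_y))
print("H(XB|Y) =", H(p_xb))                     # XB is independent of Y, so H(XB|Y) = H(XB)
print("H(XC|Y) =", sum(py * H(row) for py, row in zip(p_y, p_xc_given_y)))
print("D(PB||PC) =", float(np.sum(p_xb * np.log2(p_xb / p_xc))))   # finite: PC > 0 wherever PB > 0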