Problem Details
[Question Set]
(d) (5%) Suppose the prior probability is not known in advance. We manage to produce an estimated prior probability p̂ = P̂(X = 0), which may or may not be equal to the true prior probability P(X = 0) = p. In this case, the cross-entropy between the true and estimated prior distributions P and P̂ is defined by

H(P, P̂) = −p log₂ p̂ − (1 − p) log₂(1 − p̂),

which can be considered as an approximated entropy of X. Please show that the cross-entropy H(P, P̂) is always no less than the true entropy H(X) of X, i.e., H(P, P̂) ≥ H(X).

(Hint: You may use Jensen's inequality: p log₂ a + (1 − p) log₂ b ≤ log₂(pa + (1 − p)b) for 0 ≤ p ≤ 1, a > 0, and b > 0.)
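As a quick numerical sanity check of the claimed inequality (not part of the original problem), the following sketch evaluates the binary entropy and cross-entropy, in bits, over a grid of true and estimated priors and confirms that the cross-entropy never falls below the true entropy; the function names `entropy` and `cross_entropy` are illustrative choices, not from the source:

```python
import math

def entropy(p):
    """True entropy H(X) in bits for a binary X with P(X = 0) = p."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def cross_entropy(p, q):
    """Cross-entropy H(P, P_hat) in bits, where q is the estimated P(X = 0)."""
    return -p * math.log2(q) - (1 - p) * math.log2(1 - q)

# Check H(P, P_hat) >= H(X) for all (p, q) on an interior grid of (0, 1).
for i in range(1, 20):
    for j in range(1, 20):
        p, q = i / 20, j / 20
        assert cross_entropy(p, q) >= entropy(p) - 1e-12

# Equality holds exactly when the estimate matches the true prior (q = p).
print(abs(cross_entropy(0.3, 0.3) - entropy(0.3)) < 1e-12)
```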
Reference Answer
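The official reference answer was not included in the source. Below is a proof sketch, under the standard approach suggested by the hint: apply Jensen's inequality with a = p̂/p and b = (1 − p̂)/(1 − p).

```latex
\begin{aligned}
H(P,\hat P) - H(X)
&= -p\log_2\hat p - (1-p)\log_2(1-\hat p)
   + p\log_2 p + (1-p)\log_2(1-p) \\
&= -\left[\, p\log_2\frac{\hat p}{p} + (1-p)\log_2\frac{1-\hat p}{1-p} \,\right] \\
&\ge -\log_2\!\left( p\cdot\frac{\hat p}{p} + (1-p)\cdot\frac{1-\hat p}{1-p} \right)
   \qquad \text{(Jensen's inequality with } a=\tfrac{\hat p}{p},\ b=\tfrac{1-\hat p}{1-p}\text{)} \\
&= -\log_2\!\left(\hat p + (1-\hat p)\right) = -\log_2 1 = 0,
\end{aligned}
```

so H(P, P̂) ≥ H(X), with equality when p̂ = p (the estimate matches the true prior).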