[Exam] 103-2 林茂昭 Information Theory Final Exam

Author: bearhaha (囧王熊)   2015-06-23 11:34:44
Course name: Information Theory
Course type: Elective
Instructor: 林茂昭
College: Electrical Engineering and Computer Science
Department: Graduate Institute of Communication Engineering
Exam date (Y/M/D): 2015.6.23
Exam duration (minutes): 100
Exam questions:
Final Exam of Information Theory
June 23, 2015
1. Consider the discrete-time memoryless channel for which the output Y is
the sum of the input X and the noise Z, where X and Y are real numbers
and Z is a Gaussian random variable with zero mean and variance N.
Suppose the power constraint on X is E[X^2] ≦ P.
(a)(5%) What is the channel capacity of this channel?
(b)(5%) What is the least signal-to-noise ratio needed to achieve a coding
rate of 1/2 bit per transmission?
(c)(5%) Suppose that X = {±1, ±3} and p(x) = 1/4 for x ∈ {±1, ±3}.
What is the capacity for large signal-to-noise ratios?
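
A minimal numerical sketch for Problem 1, assuming the standard AWGN capacity
formula C = (1/2)log2(1 + P/N) bits per transmission; the printed values are
only illustrative checks.

```python
import numpy as np

def awgn_capacity(snr):
    """Capacity of the real AWGN channel in bits per transmission: 0.5*log2(1 + SNR)."""
    return 0.5 * np.log2(1.0 + snr)

# (b) smallest SNR achieving rate 1/2: solve 0.5*log2(1 + SNR) = 0.5 => SNR = 2^1 - 1 = 1 (0 dB)
snr_min = 2 ** (2 * 0.5) - 1
print(snr_min, awgn_capacity(snr_min))   # 1.0 and 0.5 bits per transmission

# (c) with equiprobable inputs {±1, ±3}, the mutual information approaches the
# input entropy log2(4) = 2 bits per transmission as the SNR grows large.
print(np.log2(4))
```
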
2. (a)(5%) Find the differential entropy h(X) = -∫ f ln f for the random
variable with the exponential density f(x) = λ exp(-λx), x ≧ 0.
(b)(5%) Show that the exponential distribution with mean 1/λ is the
maximum-entropy distribution among all continuous distributions
supported on [0, ∞) that have mean 1/λ.
(c)(5%) Let Yi = Xi + Zi, where the Zi are i.i.d. exponential with mean μ.
Assume that we have a mean constraint on the signal (i.e.
E[Xi] ≦ λ). Show that the capacity of such a channel is C = log(1 + λ/μ).
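
A numerical check for part (a), assuming the closed form h(X) = 1 - ln λ nats
(equivalently log2(e/λ) bits); the value of λ used below is illustrative.

```python
import numpy as np
from scipy.integrate import quad

lam = 2.0  # illustrative rate parameter

# -f(x) ln f(x) rewritten as f(x) * (lam*x - ln(lam)) so it stays finite for large x
integrand = lambda x: lam * np.exp(-lam * x) * (lam * x - np.log(lam))
h_numeric, _ = quad(integrand, 0, np.inf)   # differential entropy in nats
h_closed = 1.0 - np.log(lam)                # closed form: 1 - ln(λ)
print(h_numeric, h_closed)                  # both ≈ 0.3069 nats for λ = 2
```
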
3. (a)(5%) Consider a binary symmetric channel with transition probability ε.
Please find its channel capacity.
(b)(5%) Consider a channel with input alphabet X = {a1, a2, a3, a4, a5, a6},
output alphabet Y = {b1, b2, b3}, and transition probabilities
P(b1|a1) = 1, P(b1|a2) = 1,
P(b2|a3) = 1, P(b2|a4) = 1, P(b2|a5) = 1,
P(b3|a6) = 1.
Please find its channel capacity.
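
A short sketch, assuming the standard result C = 1 - H(ε) for the BSC and, for
the channel in (b), C = log2|Y| because Y is a deterministic function of X and
every output is reachable; the ε below is illustrative.

```python
import numpy as np

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

eps = 0.11             # illustrative crossover probability
print(1 - h2(eps))     # (a) BSC capacity = 1 - H(eps)
print(np.log2(3))      # (b) deterministic channel: capacity = log2 3 bits
```
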
4. (10%) Please find the rate-distortion function of the N(0, σ^2) Gaussian
source X under squared-error distortion.
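
A one-function sketch, assuming the standard Gaussian rate-distortion result
R(D) = (1/2)log2(σ²/D) for 0 < D ≦ σ² and R(D) = 0 otherwise.

```python
import numpy as np

def gaussian_rd(D, sigma2):
    """R(D) for an N(0, sigma^2) source under squared-error distortion."""
    return 0.5 * np.log2(sigma2 / D) if 0 < D < sigma2 else 0.0

print(gaussian_rd(0.25, 1.0))   # 1.0 bit per sample
```
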
5. (a)(5%) Use the sliding window Lempel-Ziv algorithm to encode the sequence
0000001101010000011010111. (window size = 4 bits)
(b)(5%) Use the tree-structured Lempel-Ziv algorithm to encode the sequence
0000001101010000011010111.
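
A small sketch of the tree-structured (LZ78-style) parse asked for in part (b):
the sequence is split into the shortest phrases not seen before, each emitted as
(index of its longest earlier phrase, new bit). The sliding-window variant of
part (a) is not covered here, and the output convention below is one common
choice, not necessarily the one used in class.

```python
def lz78_parse(bits):
    """Tree-structured Lempel-Ziv parse: each new phrase extends an earlier
    phrase by one bit and is emitted as (index of that earlier phrase, new bit)."""
    dictionary = {"": 0}               # phrase -> index; index 0 is the empty phrase
    pairs, current = [], ""
    for b in bits:
        if current + b in dictionary:  # keep extending while the string is already a phrase
            current += b
        else:
            pairs.append((dictionary[current], b))
            dictionary[current + b] = len(dictionary)
            current = ""
    if current:                        # a final phrase that repeats an earlier one
        pairs.append((dictionary[current[:-1]], current[-1]))
    return pairs

seq = "0000001101010000011010111"
print(lz78_parse(seq))   # phrases: 0, 00, 000, 1, 10, 101, 0000, 01, 1010, 11, 1
```
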
6. (10%) Describe the water-filling principle for the parallel Gaussian
channels.
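
A minimal water-filling sketch: allocate P_i = max(ν - N_i, 0) with the water
level ν chosen by bisection so the powers sum to the budget. The noise levels
and power budget below are illustrative.

```python
import numpy as np

def water_filling(noise, total_power, iters=100):
    """Water-filling power allocation over parallel Gaussian channels."""
    noise = np.asarray(noise, dtype=float)
    lo, hi = noise.min(), noise.max() + total_power   # bracket for the water level
    for _ in range(iters):                            # bisection on the water level nu
        nu = 0.5 * (lo + hi)
        if np.maximum(nu - noise, 0.0).sum() > total_power:
            hi = nu
        else:
            lo = nu
    power = np.maximum(nu - noise, 0.0)
    capacity = 0.5 * np.log2(1.0 + power / noise).sum()   # bits per transmission
    return power, capacity

print(water_filling([1.0, 2.0, 4.0], total_power=4.0))    # the noisiest channel may get zero power
```
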
7. Consider two random variables X and Y with alphabets X = Y = {0, 1} and
joint probability p(x, y) given by
     X \ Y |    0      1
   ─────────────────────
       0   |  0.72   0.18
       1   |  0.02   0.08
(a)(5%) Please give an example of a jointly typical sequence in X^10000 × Y^10000.
(b)(5%) Suppose that source A and source B are separately encoded and
jointly decoded. What is the achievable rate region for sources A and B?
(c)(5%) Please give a heuristic explanation for the method to achieve the
rate region given in (b).
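
A short sketch for part (b), assuming source A corresponds to X and source B to
Y, and that the intended region is the Slepian-Wolf region R_A ≧ H(X|Y),
R_B ≧ H(Y|X), R_A + R_B ≧ H(X, Y).

```python
import numpy as np

# joint pmf p(x, y) from the table above (rows: x = 0, 1; columns: y = 0, 1)
p = np.array([[0.72, 0.18],
              [0.02, 0.08]])

def H(pmf):
    """Entropy in bits of a pmf given as an array (zero entries are ignored)."""
    pmf = np.asarray(pmf).ravel()
    pmf = pmf[pmf > 0]
    return float(-(pmf * np.log2(pmf)).sum())

H_XY = H(p)                  # joint entropy H(X, Y)
H_X = H(p.sum(axis=1))       # marginal entropy of X
H_Y = H(p.sum(axis=0))       # marginal entropy of Y
H_X_given_Y = H_XY - H_Y     # conditional entropies via the chain rule
H_Y_given_X = H_XY - H_X

# corner values of the Slepian-Wolf region
print(H_X_given_Y, H_Y_given_X, H_XY)
```
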
8. (8%) Consider a Gaussian multiple-access channel with two users. Two
senders, X1 and X2, communicate with a single receiver, Y. The received
signal at time i is Yi = X1i + X2i + Zi,
where {Zi} is a sequence of i.i.d. zero mean Gaussian random variables with
variance N. Assume that there is a power constraint Pj on sender j; that is,
we have
           n
   (1/n)   Σ  x_ji^2(w_j) ≦ Pj,   wj ∈ {1, 2, ..., 2^(nRj)},   j = 1, 2.
          i=1
Please describe the achievable regions of R1 and R2.
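
A sketch of the pentagon-shaped region, assuming the standard two-user Gaussian
MAC bounds R1 ≦ C(P1/N), R2 ≦ C(P2/N), R1 + R2 ≦ C((P1 + P2)/N) with
C(x) = (1/2)log2(1 + x); the powers and noise variance below are illustrative.

```python
import numpy as np

def C(x):
    """Gaussian capacity function in bits: C(x) = 0.5*log2(1 + x)."""
    return 0.5 * np.log2(1.0 + x)

P1, P2, N = 3.0, 1.0, 1.0        # illustrative powers and noise variance

print("R1      <=", C(P1 / N))
print("R2      <=", C(P2 / N))
print("R1 + R2 <=", C((P1 + P2) / N))
```
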
9. (7%) Assume that we have a sender of power P and two distant receivers.
The model of the channel is Y1 = X + Z1 and Y2 = X + Z2, where Z1 and Z2
are arbitrarily correlated Gaussian random variables with variances N1 and
N2 respectively and N1 < N2.
Describe the capacity region of this Gaussian broadcast channel and the
procedure of decoding.
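
A sketch of the degraded Gaussian broadcast-channel region under superposition
coding, sweeping the power split α; the values of P, N1, N2 below are
illustrative. The better receiver (noise N1) decodes and subtracts the other
user's message before decoding its own, while the worse receiver treats the
αP component as noise.

```python
import numpy as np

def C(x):
    return 0.5 * np.log2(1.0 + x)

P, N1, N2 = 10.0, 1.0, 4.0       # illustrative total power and noise variances, N1 < N2

for alpha in (0.0, 0.25, 0.5, 0.75, 1.0):
    R1 = C(alpha * P / N1)                      # receiver 1 after cancelling user 2's signal
    R2 = C((1 - alpha) * P / (alpha * P + N2))  # receiver 2 treats user 1's signal as noise
    print(f"alpha = {alpha:4.2f}:  R1 <= {R1:.3f},  R2 <= {R2:.3f}")
```
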
