(i) Verify that the Kullback-Leibler divergence of two univariate Gaussians Xi ~ N(x̄i, σi²), i = 1, 2, is given by

DKL(πX1 || πX2) = ∫_R ln(πX1(x)/πX2(x)) πX1(x) dx = (1/2) ( σ1²/σ2² + (x̄2 − x̄1)²/σ2² − 1 − 2 ln(σ1/σ2) ).

(ii) Verify that the Wasserstein distance between πX1 and πX2 is given by

W2(πX1, πX2)² = (x̄1 − x̄2)² + (σ1 − σ2)².
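Both closed forms can be sanity-checked numerically. The sketch below (with illustrative parameter values, not ones from the problem) compares the KL formula against direct quadrature of the defining integral, and evaluates the 2-Wasserstein expression:

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

x1bar, s1 = 1.0, 2.0    # illustrative mean/std of X1
x2bar, s2 = -0.5, 1.5   # illustrative mean/std of X2

# Closed-form KL divergence from part (i)
kl_closed = 0.5 * (s1**2 / s2**2 + (x2bar - x1bar)**2 / s2**2
                   - 1 - 2 * np.log(s1 / s2))

# Numerical KL via quadrature of the defining integral
integrand = lambda x: norm.pdf(x, x1bar, s1) * (norm.logpdf(x, x1bar, s1)
                                                - norm.logpdf(x, x2bar, s2))
kl_num, _ = integrate.quad(integrand, -30, 30)

# 2-Wasserstein distance between the two Gaussians (part (ii))
w2 = np.sqrt((x1bar - x2bar)**2 + (s1 - s2)**2)
print(kl_closed, kl_num, w2)
```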
Given a continuous random variable X with variance σ², prove that the sample variance S² = (n − 1)⁻¹ Σᵢ₌₁ⁿ (Xᵢ − X̄)² converges to σ² as n → ∞.
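The claimed consistency is easy to see in simulation. A minimal sketch, assuming N(0, 4) draws as an illustrative choice of distribution:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 4.0  # true variance of N(0, 4), an illustrative choice

# Sample variance S^2 = (n-1)^{-1} * sum (X_i - Xbar)^2 for growing n
errors = []
for n in [10**2, 10**4, 10**6]:
    x = rng.normal(0.0, np.sqrt(sigma2), size=n)
    s2 = x.var(ddof=1)          # ddof=1 gives the (n-1) divisor
    errors.append(abs(s2 - sigma2))
print(errors)
```

The absolute error |S² − σ²| shrinks toward 0 as n grows, consistent with the law-of-large-numbers argument the proof formalizes.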
4. Let X1, X2, …, Xn be a random sample from a N(μ, σ²) distribution, and let σ̂² = n⁻¹ Σᵢ₌₁ⁿ (Xᵢ − X̄)² and S² = (n − 1)⁻¹ Σᵢ₌₁ⁿ (Xᵢ − X̄)² be the estimators of σ². (i) Show that the MSE of σ̂² is smaller than the MSE of S². (ii) Find E[√S²] and suggest an unbiased estimator of σ.
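Both parts can be previewed by Monte Carlo before proving them. The sketch below (sample size and parameters are illustrative choices) compares the two MSEs and checks the standard bias-correction constant for √S², namely E[S] = σ √(2/(n−1)) Γ(n/2)/Γ((n−1)/2):

```python
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(1)
mu, sigma, n, reps = 0.0, 2.0, 10, 200_000  # illustrative values

x = rng.normal(mu, sigma, size=(reps, n))
ss = ((x - x.mean(axis=1, keepdims=True))**2).sum(axis=1)
sig_hat2 = ss / n        # divide-by-n estimator
s2 = ss / (n - 1)        # usual sample variance

mse_n = np.mean((sig_hat2 - sigma**2)**2)
mse_s2 = np.mean((s2 - sigma**2)**2)

# c_n * sqrt(S^2) is unbiased for sigma, where
# c_n = sqrt((n-1)/2) * Gamma((n-1)/2) / Gamma(n/2)
c_n = np.sqrt((n - 1) / 2) * np.exp(gammaln((n - 1) / 2) - gammaln(n / 2))
mean_unbiased = np.mean(c_n * np.sqrt(s2))
print(mse_n, mse_s2, mean_unbiased)
```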
5. Let X̄ = n⁻¹ Σᵢ₌₁ⁿ Xᵢ = 1 with n = 100. (i) Obtain Σᵢ₌₁ⁿ Xᵢ. (ii) Evaluate Σᵢ₌₁ⁿ (Xᵢ − X̄) and …
Let X1 ~ Pn(2) and X2 ~ Pn(5) be two independent random variables, and it is shown that Y = X1 + X2 ~ Pn(7). (a) Given Y = n, n ≥ 0, what are the possible values of X1? (b) Calculate the conditional distribution of X1 given Y = n for n ≥ 0.
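The conditional law worked out in (b) is Binomial(n, λ1/(λ1+λ2)) = Binomial(n, 2/7), which can be verified directly from the joint Poisson pmf. A sketch, conditioning on Y = 6 as an illustrative value:

```python
from math import comb, exp, factorial

lam1, lam2 = 2.0, 5.0   # Pn(2) and Pn(5)
n = 6                   # condition on Y = 6, an illustrative choice

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

# P(X1 = k | Y = n) from the joint pmf of the independent Poissons
py_n = poisson_pmf(n, lam1 + lam2)
cond = [poisson_pmf(k, lam1) * poisson_pmf(n - k, lam2) / py_n
        for k in range(n + 1)]

# Binomial(n, lam1/(lam1+lam2)) pmf, the claimed conditional distribution
p = lam1 / (lam1 + lam2)
binom = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
print(cond)
print(binom)
```

Note the possible values of X1 given Y = n are exactly 0, 1, …, n, which is why both lists have n + 1 entries.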
3. Let X1, …, Xn be i.i.d. Lognormal(μ, σ²). (a) Suppose σ = 1; prove that S = X(n)/X(1) is an ancillary statistic. (b) Suppose μ = 0; prove that T = X(n) is a sufficient and complete statistic. (c) Find a minimal sufficient statistic.
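For part (a), the key observation is that log Xᵢ = μ + Zᵢ with Zᵢ standard normal when σ = 1, so X(n)/X(1) = exp(Z(n) − Z(1)) does not involve μ at all. A small sketch making this cancellation concrete (sample size and the two μ values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 8, 1000
z = rng.normal(size=(reps, n))  # shared standard normal draws

def stat_S(mu):
    # Lognormal(mu, 1) sample built from the same underlying z
    x = np.exp(mu + z)
    return x.max(axis=1) / x.min(axis=1)   # S = X(n) / X(1)

# Shifting mu leaves S unchanged draw by draw: S is ancillary
s0, s5 = stat_S(0.0), stat_S(5.0)
print(np.allclose(s0, s5))
```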
5. Let X1, X2, …, Xn be a random sample from a distribution with finite variance. Show that (i) Cov(Xᵢ − X̄, X̄) = 0 for each i, and (ii) ρ(Xᵢ − X̄, Xⱼ − X̄) = −(n − 1)⁻¹ for i ≠ j, i, j = 1, …, n. (Recall that ρ(X, Y) = Cov(X, Y)/√(Var(X) Var(Y)) for any two random variables X and Y.)
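Both identities are distribution-free, so a simulation with any finite-variance law should reproduce them. A sketch using exponential draws and n = 5 as illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 5, 500_000
x = rng.exponential(scale=1.0, size=(reps, n))  # any finite-variance law works
d = x - x.mean(axis=1, keepdims=True)           # deviations X_i - Xbar

# (i) Cov(X_i - Xbar, Xbar) should be ~ 0
cov_i_mean = np.cov(d[:, 0], x.mean(axis=1))[0, 1]

# (ii) corr(X_i - Xbar, X_j - Xbar), i != j, should be ~ -1/(n-1)
rho = np.corrcoef(d[:, 0], d[:, 1])[0, 1]
print(cov_i_mean, rho, -1 / (n - 1))
```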
6.2.12. Recall that θ̂ = −n / Σᵢ₌₁ⁿ log Xᵢ is the mle of θ for a beta(θ, 1) distribution. Also, W = −Σᵢ₌₁ⁿ log Xᵢ has the gamma distribution Γ(n, 1/θ). (a) Show that 2θW has a χ²(2n) distribution. (b) Using part (a), find c1 and c2 so that P(c1 < 2θW < c2) = 1 − α (6.2.35) for 0 < α < 1. Next, obtain a (1 − α)100% confidence interval for θ.
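The pivot in (a) leads to the interval (c1/(2W), c2/(2W)) with c1, c2 the χ²(2n) tail quantiles, and its coverage can be checked by simulation. A sketch with illustrative θ, n, and α (note X ~ beta(θ, 1) can be drawn as U^(1/θ) with U uniform):

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(4)
theta, n, alpha, reps = 3.0, 15, 0.05, 20_000  # illustrative values

# X_i ~ beta(theta, 1) via inverse-cdf: X = U^(1/theta)
u = rng.uniform(size=(reps, n))
x = u ** (1 / theta)
w = -np.log(x).sum(axis=1)   # W = -sum log X_i ~ Gamma(n, 1/theta)

# (a) 2*theta*W ~ chi^2(2n); (b) take c1, c2 as chi^2(2n) quantiles
c1 = chi2.ppf(alpha / 2, 2 * n)
c2 = chi2.ppf(1 - alpha / 2, 2 * n)

# c1 < 2*theta*W < c2  <=>  c1/(2W) < theta < c2/(2W)
lo, hi = c1 / (2 * w), c2 / (2 * w)
coverage = np.mean((lo < theta) & (theta < hi))
print(coverage)
```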
Given the production function y = f(x1, x2) = (α·x1^((σ−1)/σ) + (1 − α)·x2^((σ−1)/σ))^(σ/(σ−1)), consider α = 0.2 and σ = 0.7. The first factor is currently used in the amount x1 = 9, and the second factor is used in the amount x2 = 3. a) When (x1, x2) = (9, 3), how much output is being produced? b) When (x1, x2) = (9, 3), what is the marginal product of factor 1? c) When (x1, x2) = (9, 3), what is the average product of factor 1? d) When …
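Parts a)–c) are direct evaluations of this CES function. A sketch computing output, the marginal product of factor 1 (via a finite difference, so no derivative formula is assumed), and the average product:

```python
# CES production function with the stated parameters
alpha, sigma = 0.2, 0.7
rho = (sigma - 1) / sigma   # the exponent (sigma - 1)/sigma

def f(x1, x2):
    return (alpha * x1**rho + (1 - alpha) * x2**rho) ** (1 / rho)

x1, x2 = 9.0, 3.0
y = f(x1, x2)                                  # a) output
h = 1e-6
mp1 = (f(x1 + h, x2) - f(x1, x2)) / h          # b) marginal product of factor 1
ap1 = y / x1                                   # c) average product of factor 1
print(round(y, 3), round(mp1, 4), round(ap1, 4))
```

The finite-difference value of mp1 should agree with the analytic CES marginal product α·x1^(ρ−1)·y^(1−ρ).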
1. For each question: i) verify that y1(x) is a solution; ii) use reduction of order to find the general solution; iii) find a fundamental solution set; iv) find the Wronskian, and list its zeroes and discontinuities. Verify that the Wronskian is nonzero and continuous on the given interval.
(e) y″ + 4y′ + 4y = 0, y1 = e^(−2x), (−∞, ∞).
(f) x²y″ − 2xy′ + 2y = 0, y1 = x, (0, ∞).
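Step i) and the general solution that reduction of order should reproduce can be checked symbolically. A sketch using SymPy (an illustrative tool choice, not part of the assignment):

```python
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

# (e) y'' + 4y' + 4y = 0 with y1 = exp(-2x) on (-inf, inf)
ode_e = sp.Eq(y(x).diff(x, 2) + 4 * y(x).diff(x) + 4 * y(x), 0)
y1_e = sp.exp(-2 * x)
check_e = sp.simplify(ode_e.lhs.subs(y(x), y1_e).doit())  # 0 if y1 solves it

# (f) x^2 y'' - 2x y' + 2y = 0 with y1 = x on (0, inf)
ode_f = sp.Eq(x**2 * y(x).diff(x, 2) - 2 * x * y(x).diff(x) + 2 * y(x), 0)
y1_f = x
check_f = sp.simplify(ode_f.lhs.subs(y(x), y1_f).doit())

# General solutions, which reduction of order should reproduce
gen_e = sp.dsolve(ode_e, y(x))
gen_f = sp.dsolve(ode_f, y(x))
print(check_e, check_f)
print(gen_e.rhs, gen_f.rhs)
```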
Let X1, X2, …, Xn be a random sample from a normal distribution with a known mean μ and unknown variance σ². Let σ̂² = n⁻¹ Σᵢ₌₁ⁿ (Xᵢ − μ)². Show that a (1 − α)100% confidence interval for σ² is (nσ̂²/χ²_{α/2,n}, nσ̂²/χ²_{1−α/2,n}).
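The interval rests on the pivot nσ̂²/σ² ~ χ²(n) (note the n, not n − 1, degrees of freedom, since μ is known). Its coverage can be checked by simulation; parameter values below are illustrative:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(5)
mu, sigma2, n, alpha, reps = 1.0, 4.0, 12, 0.05, 20_000  # illustrative values

x = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
sig_hat2 = ((x - mu)**2).mean(axis=1)  # uses the known mean, so n*sig_hat2/sigma2 ~ chi2(n)

# chi^2_{alpha/2, n} (upper tail) and chi^2_{1 - alpha/2, n}
upper = chi2.ppf(1 - alpha / 2, n)
lower = chi2.ppf(alpha / 2, n)
lo, hi = n * sig_hat2 / upper, n * sig_hat2 / lower
coverage = np.mean((lo < sigma2) & (sigma2 < hi))
print(coverage)
```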