Let $Y \sim N(\mu_Y, \sigma^2)$. Then $X := \exp(Y)$ is said to be lognormally distributed with p.d.f.
$$f(x) = \frac{1}{x\sigma\sqrt{2\pi}} \exp\left(-\frac{(\ln x - \mu_Y)^2}{2\sigma^2}\right), \quad x > 0,$$
denoted $LN(\mu_Y, \sigma^2)$. Let $X_1, \ldots, X_n$ be a random sample from the $LN(\mu_Y, \sigma^2)$ distribution.
EXPLANATION ::-
(a) Find the maximum likelihood estimator of $\mu_Y$, which we denote as $\hat{\mu}_Y$. (Hint: use the fact that $Y = \ln(X)$ is normally distributed with known mean and variance.) Verify that the stationary point found is a maximum.
SOL ::-
a) Since $\ln X_i \sim N(\mu_Y, \sigma^2)$, the joint p.d.f. of the sample is
$$L(\mu_Y) = \prod_{i=1}^n \frac{1}{x_i\sigma\sqrt{2\pi}} \exp\left(-\frac{(\ln x_i - \mu_Y)^2}{2\sigma^2}\right).$$
The log-likelihood function is
$$\ell(\mu_Y) = -\sum_{i=1}^n \ln x_i - n\ln\left(\sigma\sqrt{2\pi}\right) - \frac{1}{2\sigma^2}\sum_{i=1}^n (\ln x_i - \mu_Y)^2.$$
Taking the partial derivative with respect to $\mu_Y$ and equating it to 0,
$$\frac{\partial \ell}{\partial \mu_Y} = \frac{1}{\sigma^2}\sum_{i=1}^n (\ln x_i - \mu_Y) = 0 \quad\Longrightarrow\quad \hat{\mu}_Y = \frac{1}{n}\sum_{i=1}^n \ln x_i.$$
The second derivative is
$$\frac{\partial^2 \ell}{\partial \mu_Y^2} = -\frac{n}{\sigma^2} < 0.$$
Hence the stationary point found is a maximum.
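The closed-form result above can be checked numerically. Below is a small sketch (with hypothetical parameter values $\mu_Y = 1.5$, $\sigma = 0.8$) that draws lognormal data, computes $\hat{\mu}_Y$ as the mean of the log-data, and confirms it maximizes the log-likelihood over a grid:

```python
import numpy as np

rng = np.random.default_rng(0)
mu_y, sigma = 1.5, 0.8                      # hypothetical true parameters
x = rng.lognormal(mean=mu_y, sigma=sigma, size=10_000)

# Closed-form MLE derived above: the sample mean of the log-data.
mu_hat = np.mean(np.log(x))

# Log-likelihood in mu_Y (terms not involving mu_Y dropped).
def loglik(mu):
    return -np.sum((np.log(x) - mu) ** 2) / (2 * sigma**2)

# Grid search around mu_hat: the maximizer should coincide with mu_hat.
grid = np.linspace(mu_hat - 0.5, mu_hat + 0.5, 1001)
best = grid[np.argmax([loglik(m) for m in grid])]
print(mu_hat, best)   # both should be close to mu_y = 1.5
```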
(b) Verify that the estimator is unbiased and derive its variance. Is the estimator consistent?
SOL ::-
Since $\ln X_i \sim N(\mu_Y, \sigma^2)$,
$$E(\hat{\mu}_Y) = \frac{1}{n}\sum_{i=1}^n E(\ln X_i) = \mu_Y.$$
Hence $\hat{\mu}_Y$ is unbiased. Its variance is
$$\mathrm{Var}(\hat{\mu}_Y) = \frac{1}{n^2}\sum_{i=1}^n \mathrm{Var}(\ln X_i) = \frac{\sigma^2}{n}.$$
Since $\hat{\mu}_Y$ is unbiased and $\mathrm{Var}(\hat{\mu}_Y) \to 0$ as $n \to \infty$, the estimator is consistent.
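Both claims can be seen in a simulation. A sketch (with hypothetical values $\mu_Y = 0.5$, $\sigma = 1.2$, $n = 50$, and 20,000 replications) that compares the empirical mean and variance of $\hat{\mu}_Y$ with $\mu_Y$ and $\sigma^2/n$:

```python
import numpy as np

rng = np.random.default_rng(1)
mu_y, sigma, n, reps = 0.5, 1.2, 50, 20_000   # hypothetical values
x = rng.lognormal(mean=mu_y, sigma=sigma, size=(reps, n))

# One MLE per replication: mean of the log-data in each row.
mu_hats = np.log(x).mean(axis=1)

print(mu_hats.mean())   # should be close to mu_y       (unbiased)
print(mu_hats.var())    # should be close to sigma**2/n (variance)
```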
(c) Assuming $\hat{\mu}_Y$ is normally distributed, how many data points would you need to collect to obtain a 95% random interval of $\hat{\mu}_Y \pm 1$ for $\mu_Y$ if $\sigma^2 = 2$?
SOL ::-
We need the margin of error of the 95% interval to satisfy
$$1.96\sqrt{\frac{\sigma^2}{n}} \le 1.$$
With $\sigma^2 = 2$, this gives $n \ge (1.96)^2 \cdot 2 \approx 7.68$, so $n = 8$ data points are needed.
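The arithmetic above can be reproduced in a couple of lines:

```python
import math

sigma2 = 2.0    # given variance
z = 1.96        # 95% two-sided normal quantile
margin = 1.0    # desired half-width of the interval

# Solve 1.96 * sqrt(sigma2 / n) <= margin for the smallest integer n.
n = math.ceil(z**2 * sigma2 / margin**2)
print(n)        # -> 8
```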
(d) It is straightforward to show that $\mu_X := E(X) = \exp(\mu_Y + \sigma^2/2)$. Verify that although $\hat{\mu}_Y$ is an unbiased estimator of $\mu_Y$, $\exp(\hat{\mu}_Y + \sigma^2/2)$ is not an unbiased estimator of $\mu_X$, for $n \ge 1$.
SOL ::-
Since $\hat{\mu}_Y \sim N(\mu_Y, \sigma^2/n)$, the moment generating function of the normal distribution gives
$$E\left[\exp\left(\hat{\mu}_Y + \frac{\sigma^2}{2}\right)\right] = \exp\left(\mu_Y + \frac{\sigma^2}{2n} + \frac{\sigma^2}{2}\right) = \mu_X \exp\left(\frac{\sigma^2}{2n}\right) \ne \mu_X.$$
Hence $\exp(\hat{\mu}_Y + \sigma^2/2)$ is not an unbiased estimator of $\mu_X$; the bias factor $\exp(\sigma^2/(2n))$ vanishes only as $n \to \infty$.
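The bias factor derived above shows up clearly in a simulation. A sketch (with hypothetical values $\mu_Y = 0$, $\sigma = 1$, $n = 5$) comparing the Monte Carlo mean of the plug-in estimator with both $\mu_X$ and the predicted biased mean $\mu_X \exp(\sigma^2/(2n))$:

```python
import numpy as np

rng = np.random.default_rng(2)
mu_y, sigma, n, reps = 0.0, 1.0, 5, 200_000   # hypothetical values
mu_x = np.exp(mu_y + sigma**2 / 2)            # true mean of X

x = rng.lognormal(mean=mu_y, sigma=sigma, size=(reps, n))
# Plug-in estimator exp(mu_hat + sigma^2/2), one value per replication.
est = np.exp(np.log(x).mean(axis=1) + sigma**2 / 2)

print(est.mean())                           # systematically above mu_x
print(mu_x * np.exp(sigma**2 / (2 * n)))    # the derived biased mean
```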