Question

Question 4 [16 marks] Let $Y \sim N(\mu_Y, \sigma_Y^2)$. Then $X := \exp(Y)$ is said to be lognormally distributed with p.d.f.

$$f_X(x) = \frac{1}{x\sqrt{2\pi\sigma_Y^2}} \exp\left(-\frac{(\ln(x)-\mu_Y)^2}{2\sigma_Y^2}\right), \qquad x > 0,$$

and denoted as $LN(\mu_Y, \sigma_Y^2)$. Let $X_1, \ldots, X_n$ be random samples from the $LN(\mu_Y, \sigma_Y^2)$ distribution.
Answer #1

IF YOU HAVE ANY DOUBTS, COMMENT BELOW AND I WILL BE THERE TO HELP YOU. ALL THE BEST.

As for the given data:

Let $Y \sim N(\mu_Y, \sigma_Y^2)$. Then $X := \exp(Y)$ is lognormally distributed with p.d.f.

$$f_X(x) = \frac{1}{x\sqrt{2\pi\sigma_Y^2}} \exp\left(-\frac{(\ln(x)-\mu_Y)^2}{2\sigma_Y^2}\right), \qquad x > 0,$$

denoted as $LN(\mu_Y, \sigma_Y^2)$. Let $X_1, \ldots, X_n$ be random samples from the $LN(\mu_Y, \sigma_Y^2)$ distribution.

EXPLANATION ::-

We have

$$f_X(x_i) = \frac{1}{x_i\sqrt{2\pi\sigma_Y^2}} \exp\left(-\frac{(\ln(x_i)-\mu_Y)^2}{2\sigma_Y^2}\right), \qquad 0 < x_i < \infty, \quad i = 1, 2, \ldots, n.$$
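
As a quick numerical sanity check (not part of the original question), this density should agree with scipy.stats.lognorm when the shape parameter is set to $\sigma_Y$ and the scale to $e^{\mu_Y}$. A minimal sketch, with arbitrary illustrative values of $\mu_Y$ and $\sigma_Y$:

```python
# Sanity-check sketch: the LN(mu_Y, sigma_Y^2) density above should match
# scipy.stats.lognorm with shape s = sigma_Y and scale = exp(mu_Y).
# mu_Y = 1.0 and sigma_Y = 0.5 are arbitrary illustrative values.
import numpy as np
from scipy.stats import lognorm

mu_Y, sigma_Y = 1.0, 0.5
x = np.linspace(0.1, 10.0, 200)

# Density from the formula in the question.
pdf_formula = np.exp(-(np.log(x) - mu_Y) ** 2 / (2 * sigma_Y ** 2)) / (x * np.sqrt(2 * np.pi) * sigma_Y)

# The same density via scipy's parametrisation of the lognormal.
pdf_scipy = lognorm.pdf(x, s=sigma_Y, scale=np.exp(mu_Y))

assert np.allclose(pdf_formula, pdf_scipy)
print("formula and scipy.stats.lognorm agree")
```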

(a) Find the maximum likelihood estimator for $\mu_Y$, which we denote as $\widehat{\mu}_Y$. (Hint: use the fact that $Y = \ln(X)$ is normally distributed with known mean and variance.) Verify that the sought stationary point is a maximum.

SOL ::-

(a) The joint p.d.f. is

$$f_{X_1, X_2, \ldots, X_n}(x_1, x_2, \ldots, x_n) = \prod_{i=1}^{n} f_X(x_i) = \prod_{i=1}^{n} \frac{1}{x_i\sqrt{2\pi\sigma_Y^2}} \exp\left(-\frac{(\ln(x_i)-\mu_Y)^2}{2\sigma_Y^2}\right).$$

The log-likelihood function is

$$\ell(x_1, \ldots, x_n; \mu_Y, \sigma_Y) = \ln L(x_1, \ldots, x_n; \mu_Y, \sigma_Y) = -\sum_{i=1}^{n}\ln(x_i) - \frac{n}{2}\ln\left(2\pi\sigma_Y^2\right) - \frac{1}{2\sigma_Y^2}\sum_{i=1}^{n}(\ln(x_i)-\mu_Y)^2.$$
Taking the partial derivative with respect to $\mu_Y$ and equating it to 0,

$$\frac{\partial \ell}{\partial \mu_Y} = \frac{1}{\sigma_Y^2}\sum_{i=1}^{n}(\ln(x_i)-\mu_Y) = 0 \quad\Longrightarrow\quad \widehat{\mu}_Y = \frac{1}{n}\sum_{i=1}^{n}\ln(X_i).$$

The second derivative is

$$\frac{\partial^2 \ell}{\partial \mu_Y^2} = -\frac{n}{\sigma_Y^2} < 0.$$

Hence the sought stationary point is a maximum.
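
To see the algebra in action, here is a small simulation sketch (the parameter values are my own arbitrary choices, not from the question): it draws a lognormal sample, computes $\widehat{\mu}_Y = \frac{1}{n}\sum_i \ln(X_i)$, and confirms numerically that this value maximises the log-likelihood above.

```python
# Simulation sketch with arbitrary illustrative parameters (mu_Y = 1.0, sigma_Y = 0.5, n = 1000).
import numpy as np

rng = np.random.default_rng(0)
mu_Y, sigma_Y, n = 1.0, 0.5, 1000
X = np.exp(rng.normal(mu_Y, sigma_Y, size=n))   # X = exp(Y), Y ~ N(mu_Y, sigma_Y^2)

mu_hat = np.mean(np.log(X))                     # closed-form MLE derived above

def log_lik(mu):
    # Log-likelihood l(x_1, ..., x_n; mu, sigma_Y) with sigma_Y treated as known.
    return (-np.sum(np.log(X))
            - n / 2 * np.log(2 * np.pi * sigma_Y ** 2)
            - np.sum((np.log(X) - mu) ** 2) / (2 * sigma_Y ** 2))

# The closed-form MLE should coincide with a brute-force maximiser on a fine grid.
grid = np.linspace(mu_hat - 1.0, mu_hat + 1.0, 2001)
brute_force = grid[np.argmax([log_lik(m) for m in grid])]
print(f"closed-form MLE: {mu_hat:.4f}, grid maximiser: {brute_force:.4f}")
```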

(b) Verify that the estimator $\widehat{\mu}_Y$ is unbiased and derive its variance. Is the estimator consistent?

SOL ::-

$$E(\widehat{\mu}_Y) = E\left(\frac{1}{n}\sum_{i=1}^{n}\ln(X_i)\right) = \frac{1}{n}\sum_{i=1}^{n}E(Y_i) = \frac{1}{n}\cdot n\mu_Y = \mu_Y.$$

Hence $\widehat{\mu}_Y$ is unbiased.

$$\operatorname{Var}(\widehat{\mu}_Y) = \operatorname{Var}\left(\frac{1}{n}\sum_{i=1}^{n}\ln(X_i)\right) = \frac{1}{n^2}\sum_{i=1}^{n}\operatorname{Var}(Y_i) = \frac{\sigma_Y^2}{n}.$$

Since $\widehat{\mu}_Y$ is unbiased and $\operatorname{Var}(\widehat{\mu}_Y) = \sigma_Y^2/n \to 0$ as $n \to \infty$, it converges in probability to $\mu_Y$ (by Chebyshev's inequality), so the estimator is consistent.
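
A short Monte Carlo sketch (again with arbitrary parameter values of my own) can be used to check both claims: the average of $\widehat{\mu}_Y$ over many replications should land near $\mu_Y$, and its empirical variance near $\sigma_Y^2/n$.

```python
# Monte Carlo check of unbiasedness and variance; parameters are arbitrary illustrative values.
import numpy as np

rng = np.random.default_rng(1)
mu_Y, sigma_Y, n, reps = 1.0, 0.5, 50, 20000

# Each row is one sample of size n; since mu_hat = mean(ln X_i) = mean(Y_i),
# we can simulate Y directly.
Y = rng.normal(mu_Y, sigma_Y, size=(reps, n))
mu_hat = Y.mean(axis=1)

print("mean of mu_hat:", round(mu_hat.mean(), 4), "(target mu_Y =", mu_Y, ")")
print("var  of mu_hat:", round(mu_hat.var(), 6), "(target sigma_Y^2/n =", sigma_Y ** 2 / n, ")")
```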

(c) Assuming $\widehat{\mu}_Y$ is normally distributed, how many data points would you need to collect to obtain a 95% random interval of $\widehat{\mu}_Y \pm 1$ for $\mu_Y$ if $\sigma_Y = 2$?

SOL ::-

We need the margin of error to satisfy

$$\text{ME} = 1.96\,\frac{\sigma_Y}{\sqrt{n}} \le 1 \quad\Longrightarrow\quad n \ge 1.96^2\,\sigma_Y^2 = 1.96^2 \times 4 \approx 15.37 \quad\Longrightarrow\quad n \ge 16.$$
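
The same arithmetic in code, as a sketch of the calculation above with $z = 1.96$, $\sigma_Y = 2$ and margin of error 1:

```python
# Sample-size calculation from part (c): 1.96 * sigma_Y / sqrt(n) <= 1.
import math

z, sigma_Y, margin = 1.96, 2.0, 1.0
n_required = math.ceil((z * sigma_Y / margin) ** 2)   # 1.96^2 * 4 = 15.3664 -> round up to 16
print(n_required)  # 16
```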

(d) It is straightforward to show that $\mu_X := E(X) = \exp(\mu_Y + \sigma_Y^2/2)$. Verify that although $\widehat{\mu}_Y$ is an unbiased estimator of $\mu_Y$, $\exp(\widehat{\mu}_Y + \sigma_Y^2/2)$ is not an unbiased estimator of $\mu_X$, for $n = 1$.

SOL ::-

For $n = 1$, $\widehat{\mu}_Y = \ln(X_1)$. Then

$$E\!\left(e^{\widehat{\mu}_Y + \sigma_Y^2/2}\right) = E\!\left(e^{\ln(X_1) + \sigma_Y^2/2}\right) = e^{\sigma_Y^2/2}\,E(X_1) = \mu_X\, e^{\sigma_Y^2/2} \neq \mu_X.$$

Hence $e^{\widehat{\mu}_Y + \sigma_Y^2/2}$ is not an unbiased estimator of $\mu_X$.
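
A quick simulation sketch (with my own arbitrary choices $\mu_Y = 0$, $\sigma_Y^2 = 1$, not values from the question) makes the bias for $n = 1$ visible: the average of $\exp(\widehat{\mu}_Y + \sigma_Y^2/2)$ lands near $\mu_X e^{\sigma_Y^2/2}$ rather than $\mu_X$.

```python
# Bias of exp(mu_hat + sigma_Y^2/2) for n = 1; mu_Y and sigma_Y^2 are arbitrary illustrative values.
import numpy as np

rng = np.random.default_rng(2)
mu_Y, sigma2_Y, reps = 0.0, 1.0, 200_000

Y1 = rng.normal(mu_Y, np.sqrt(sigma2_Y), size=reps)   # one observation per replication
estimator = np.exp(Y1 + sigma2_Y / 2)                 # exp(mu_hat + sigma_Y^2/2) with mu_hat = ln(X_1) = Y_1

mu_X = np.exp(mu_Y + sigma2_Y / 2)
print("Monte Carlo mean of estimator:", round(estimator.mean(), 3))
print("mu_X                         :", round(mu_X, 3))
print("mu_X * exp(sigma_Y^2/2)      :", round(mu_X * np.exp(sigma2_Y / 2), 3))
```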

I HOPE YOU UNDERSTAND..

PLS RATE THUMBS UP..IT HELPS ME A LOT..

THANK YOU...!!
