7. A positive random variable Y is said to be a lognormal random variable, LOGN(μ, σ^2), if ln Y ~ N(μ, σ^2). We assume that Y_i ~ LOGN(μ_i, σ_i^2), i = 1, ..., n, are independent.
(a) [5] Find the distribution of T = ∏_{i=1}^n Y_i.
(b) [4] Find E(T) and Var(T).
(c) [5] If we assume that μ_1 = ... = μ_n = μ and σ_1 = ... = σ_n = σ, what does the successive geometric average, lim_{n→∞} (∏_{i=1}^n Y_i)^(1/n), converge in probability to? Justify your answer.
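A quick Monte Carlo sketch of part (c), as an aside: since ln of the geometric average is the sample mean of the ln Y_i ~ N(μ, σ^2), the weak law of large numbers suggests convergence in probability to e^μ. The parameter values below are arbitrary choices for illustration.

```python
import numpy as np

# Sketch for part (c): the geometric average (prod Y_i)^(1/n) should
# converge in probability to e^mu, since its log is the sample mean of
# the ln Y_i ~ N(mu, sigma^2).  mu and sigma here are arbitrary.
rng = np.random.default_rng(0)
mu, sigma = 0.5, 1.0

for n in (10**2, 10**4, 10**6):
    y = rng.lognormal(mean=mu, sigma=sigma, size=n)
    # (prod y_i)^(1/n), computed in log space for numerical stability
    geo_avg = np.exp(np.mean(np.log(y)))
    print(n, geo_avg)
```

The printed values should approach e^0.5 ≈ 1.6487 as n grows.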
If we observe y_0 as the value of a geometric random variable Y, then P(Y = y_0) is maximized when p = 1/y_0. The maximum likelihood estimator for p is therefore 1/Y (note that Y here is the geometric random variable, not a particular value of it). Derive E(1/Y).
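As a numerical sanity check (an aside, not the requested derivation): for a geometric Y on {1, 2, ...} with P(Y = y) = p(1−p)^(y−1), summing the series gives the closed form E(1/Y) = −p ln(p)/(1−p). That formula is stated here as an assumption to be verified in the exercise; the sketch below compares it against a Monte Carlo estimate.

```python
import numpy as np

# Compare a Monte Carlo estimate of E(1/Y) against the series result
# E(1/Y) = -p ln(p) / (1 - p), assumed here and derived in the exercise.
rng = np.random.default_rng(1)
p = 0.3
y = rng.geometric(p, size=10**6)     # geometric on support {1, 2, ...}
mc = np.mean(1.0 / y)
closed = -p * np.log(p) / (1 - p)
print(mc, closed)
```

The two printed numbers should agree to roughly two decimal places.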
6. Let Y be a continuous random variable with probability density function
f(y) = θ y^(θ−1) for 0 < y < k, and f(y) = 0 otherwise,
where θ > 1 and k > 0.
(a) Show that k = 1.
(b) Find E(Y) and Var(Y) in terms of θ.
(c) Derive θ̃, the moment estimator of θ, based on a random sample Y_1, ..., Y_n.
(d) Derive θ̂, the maximum likelihood estimator of θ, based on a random sample Y_1, ..., Y_n.
(e) A random sample of n = ...
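A simulation sketch for parts (c) and (d), assuming the standard answers θ̃ = Ȳ/(1 − Ȳ) (from E(Y) = θ/(θ+1)) and θ̂ = −n/Σ ln Y_i; both formulas are stated here as assumptions to be derived in the exercise. Since F(y) = y^θ on (0, 1), inverse-CDF sampling gives Y = U^(1/θ) for U uniform.

```python
import numpy as np

# Check the (assumed) moment and ML estimators on simulated data.
# F(y) = y^theta on (0, 1), so Y = U^(1/theta) samples from the density.
rng = np.random.default_rng(2)
theta = 3.0
n = 10**5
y = rng.uniform(size=n) ** (1.0 / theta)   # inverse-CDF sampling

theta_mom = np.mean(y) / (1 - np.mean(y))  # assumed: Ybar/(1 - Ybar)
theta_mle = -n / np.sum(np.log(y))         # assumed: -n / sum(ln y_i)
print(theta_mom, theta_mle)
```

Both estimates should land close to the true θ = 3.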
X is a random variable with a lognormal distribution and Y = ln(X) ~ N(μ, σ^2). Prove that μ_X = e^(μ + σ^2/2).
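One standard route is via the normal moment-generating function, a sketch:

```latex
% Since Y = ln X ~ N(mu, sigma^2), we have X = e^Y, so mu_X is the
% normal MGF of Y evaluated at t = 1:
\mu_X = E(X) = E\left(e^{Y}\right) = m_Y(1)
      = \left. e^{\mu t + \sigma^2 t^2/2} \right|_{t=1}
      = e^{\mu + \sigma^2/2}.
% The MGF itself follows by completing the square in
% \int_{-\infty}^{\infty} e^{ty}\,
%   \frac{1}{\sigma\sqrt{2\pi}} e^{-(y-\mu)^2/(2\sigma^2)}\,dy.
```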
[Probability] Let N be a geometric random variable with parameter p. Given N, generate N i.i.d. random numbers U_1, U_2, ..., U_N uniformly from [0, 1]. Let M = max_{1≤i≤N} U_i. Find the CDF of M, i.e., find P(M ≤ x).
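A simulation sketch, assuming N is geometric on {1, 2, ...}: conditioning on N gives P(M ≤ x) = Σ_n P(N = n) x^n = p x / (1 − (1−p) x) for 0 ≤ x ≤ 1, a geometric series. That closed form is stated here as an assumption to check, not as the graded derivation.

```python
import numpy as np

# Compare the empirical CDF of M at one point against the (assumed)
# closed form P(M <= x) = p x / (1 - (1 - p) x).
rng = np.random.default_rng(3)
p = 0.4
trials = 10**5
n = rng.geometric(p, size=trials)          # N on {1, 2, ...}
m = np.array([rng.uniform(size=k).max() for k in n])  # M = max of N uniforms

x = 0.7
empirical = np.mean(m <= x)
closed = p * x / (1 - (1 - p) * x)
print(empirical, closed)
```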
(2) Let Y be a binomial random variable with parameters n and p. Recall that E(Y) = np and V(Y) = np(1 − p). We know that Y/n is an unbiased estimator of p. Now we want to estimate the variance of Y with n(Y/n)(1 − Y/n).
(a) Find the expected value of this estimator.
(b) Find an unbiased estimator that is a simple modification of the proposed estimator.
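A Monte Carlo check of part (a), assuming the standard result E[n(Y/n)(1 − Y/n)] = (n − 1)p(1 − p); this is stated here as an assumption to be derived in the exercise, and the specific n and p below are arbitrary.

```python
import numpy as np

# Average the proposed estimator n*(Y/n)*(1 - Y/n) over many binomial
# draws and compare to the (assumed) expectation (n - 1) p (1 - p).
rng = np.random.default_rng(4)
n, p = 20, 0.3
y = rng.binomial(n, p, size=10**6)
estimator = n * (y / n) * (1 - y / n)
print(np.mean(estimator), (n - 1) * p * (1 - p))
```

If the assumed expectation is right, rescaling by n/(n − 1) makes the estimator unbiased for np(1 − p), which is the idea behind part (b).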
Let X_1, ..., X_n be i.i.d. N(μ, σ^2) random variables, where μ is a known parameter and σ^2 is the unknown parameter. Let γ(σ^2) = σ^2.
(i) Find the CRLB for γ(σ^2).
(ii) Recall that S^2 is an unbiased estimator of σ^2. Compare Var(S^2) to the CRLB for σ^2.
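A simulation sketch, assuming the standard facts: with μ known, the CRLB for σ^2 from n normal observations is 2σ^4/n, while S^2 (dividing by n − 1) has Var(S^2) = 2σ^4/(n − 1) > CRLB. Both formulas are assumptions to be derived in the exercise.

```python
import numpy as np

# Estimate Var(S^2) by simulation and compare it to the (assumed)
# exact value 2 sigma^4/(n - 1) and to the (assumed) CRLB 2 sigma^4/n.
rng = np.random.default_rng(5)
mu, sigma2 = 0.0, 4.0
n, trials = 10, 10**5

x = rng.normal(mu, np.sqrt(sigma2), size=(trials, n))
s2 = x.var(axis=1, ddof=1)         # sample variance S^2 in each replicate
print(s2.var(), 2 * sigma2**2 / (n - 1), 2 * sigma2**2 / n)
```

The simulated variance should match 2σ^4/(n − 1) and sit strictly above the CRLB, illustrating that S^2 does not attain the bound here.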
QUESTION 7. Let Y_1, Y_2, ..., Y_n denote a random sample of size n from a population whose density is given by
(a) Find an estimator for θ by the maximum likelihood method.
(b) Find the maximum likelihood estimator for E(Y^4).