7. A positive random variable Y is said to be a lognormal random variable, LOGN(μ, σ²), if ln Y ~ N(μ, σ²). We assume that Yᵢ ~ LOGN(μᵢ, σᵢ²), i = 1, …, n, are independent.
(a) [5] Find the distribution of T = Y₁·Y₂···Yₙ.
(b) [4] Find E(T) and Var(T).
(c) [5] If we assume that μ₁ = … = μₙ and σ₁ = … = σₙ, what does the successive geometric average, limₙ→∞ (Y₁···Yₙ)^(1/n), converge to in probability? Justify your answer.
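For intuition on part (c): by the law of large numbers applied to the logs, the geometric average should converge to e^μ when all the μᵢ are equal. A quick simulation sketch (the parameter values are assumptions for illustration, not part of the problem):

```python
import math
import random

random.seed(0)

mu, sigma, n = 0.5, 1.0, 200_000  # assumed values for illustration

# Y_i ~ LOGN(mu, sigma^2): ln Y_i ~ N(mu, sigma^2)
ys = [random.lognormvariate(mu, sigma) for _ in range(n)]

# Geometric average, computed as exp of the mean of the logs
# (numerically stable compared with multiplying 200,000 factors)
geo_avg = math.exp(sum(math.log(y) for y in ys) / n)

print(geo_avg, math.exp(mu))  # the two values should be close
```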
If we observe y₀ as the value of a geometric random variable Y, P(Y = y₀) is maximized when p = 1/y₀. The maximum likelihood estimator for p is therefore 1/Y (note that Y is the geometric random variable, not a particular value of it). Derive E(1/Y).
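The series E(1/Y) = Σ_{y≥1} (1/y)·p·(1 − p)^(y−1) sums, via the logarithmic series −ln(1 − x) = Σ xʸ/y, to −p·ln(p)/(1 − p). A quick numeric check of that closed form (p = 0.3 is an assumed value for illustration):

```python
import math

p = 0.3  # assumed value for illustration

# Partial sum of E(1/Y) = sum over y >= 1 of (1/y) * p * (1-p)^(y-1);
# the tail beyond y = 10,000 is negligible for this p
numeric = sum((1 / y) * p * (1 - p) ** (y - 1) for y in range(1, 10_000))

# Closed form obtained from the logarithmic series
closed = -p * math.log(p) / (1 - p)

print(numeric, closed)
```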
6. Let Y be a continuous random variable with probability density function
f(y) = θ·y^(θ−1) for 0 < y < k; f(y) = 0 otherwise,
where θ > 1 and k > 0.
(a) Show that k = 1.
(b) Find E(Y) and Var(Y) in terms of θ.
(c) Derive θ̃, the moment estimator of θ based on a random sample Y₁, …, Yₙ.
(d) Derive θ̂, the maximum likelihood estimator of θ based on a random sample Y₁, …, Yₙ.
(e) A random sample of n = ...
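For this density the standard results are E(Y) = θ/(θ + 1), which gives the moment estimator θ̃ = Ȳ/(1 − Ȳ), and log-likelihood ℓ(θ) = n·ln(θ) + (θ − 1)·Σ ln Yᵢ, which gives the MLE θ̂ = −n/Σ ln Yᵢ. A simulation sketch comparing the two (the true θ is an assumed value, not from the problem):

```python
import math
import random

random.seed(1)

theta_true, n = 3.0, 100_000  # assumed values for illustration

# Inverse-CDF sampling: F(y) = y^theta on (0, 1), so Y = U^(1/theta);
# 1 - random.random() lies in (0, 1], avoiding log(0) below
ys = [(1 - random.random()) ** (1 / theta_true) for _ in range(n)]

ybar = sum(ys) / n

# Moment estimator from E(Y) = theta / (theta + 1)
theta_mom = ybar / (1 - ybar)

# MLE from the log-likelihood n*ln(theta) + (theta - 1)*sum(ln y_i)
theta_mle = -n / sum(math.log(y) for y in ys)

print(theta_mom, theta_mle)  # both should be near theta_true
```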
Let X₁, …, Xₙ be a random sample from a normal random variable X with E(X) = 0 and Var(X) = θ, i.e., X ~ N(0, θ).
(a) What is the pdf of X?
(b) Find the likelihood function, L(θ), and the log-likelihood function, ℓ(θ).
(c) Find the maximum likelihood estimator of θ, θ̂.
(d) Is θ̂ unbiased?
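For X ~ N(0, θ) the MLE works out to θ̂ = (1/n)·Σ Xᵢ², and since E(Xᵢ²) = θ it is unbiased. A simulation sketch averaging the estimator over many samples (θ, n, and the replication count are assumed values):

```python
import random

random.seed(2)

theta_true, n, reps = 2.0, 50, 2_000  # assumed values for illustration

# MLE for N(0, theta): average of the squared observations
def theta_hat(xs):
    return sum(x * x for x in xs) / len(xs)

# Monte Carlo check of E(theta_hat): average over many independent samples
avg = sum(
    theta_hat([random.gauss(0, theta_true ** 0.5) for _ in range(n)])
    for _ in range(reps)
) / reps

print(avg)  # should be near theta_true if the estimator is unbiased
```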
A random variable X is said to follow a lognormal
distribution if Y = log(X) follows a normal
distribution. The lognormal is sometimes used as a model for
heavy-tailed skewed distributions.
Please answer the following:
110 15 60 5419 15 73 190 57 4344 18 37 43 55 19 23 82 175 50 80 65 63 36 6 10 17 52 43 70 22 95 20 4 17 15 12 29 29 6 22 40 17 26 30 16 116...
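If the task is to fit a lognormal to these data, the MLE works on the logged sample: μ̂ and σ̂ are the mean and (divide-by-n) standard deviation of the ln(xᵢ). A sketch using only the first fifteen values shown above (the full sample is truncated in the source):

```python
import math

# First fifteen values from the sample listed above
data = [110, 15, 60, 5419, 15, 73, 190, 57, 4344, 18, 37, 43, 55, 19, 23]

# Lognormal MLE: take logs, then use the sample mean and the
# maximum-likelihood (divide-by-n) standard deviation of the logs
logs = [math.log(x) for x in data]
n = len(logs)
mu_hat = sum(logs) / n
sigma_hat = math.sqrt(sum((v - mu_hat) ** 2 for v in logs) / n)

print(mu_hat, sigma_hat)
```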
If the random variable Y denotes an individual's income, Pareto's law claims that P(Y > y) = …, where k is the entire population's minimum income. It follows that … . The income information has been collected on a random sample of n individuals: … .
To answer this question, enter your answer as a formula. In addition to the usual guidelines, a few more instructions for this problem: write … as the single variable p and … as m. These can be used as the input...
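The formulas above did not survive extraction, but under the common Pareto form P(Y > y) = (k/y)^θ for y ≥ k (an assumption here, not the original's notation), the maximum likelihood estimator of the exponent given the minimum income k is θ̂ = n/Σ ln(yᵢ/k). A simulation sketch under that assumed form:

```python
import math
import random

random.seed(3)

k, theta_true, n = 1.5, 2.5, 100_000  # assumed values for illustration

# Inverse-CDF sampling from P(Y > y) = (k/y)^theta: Y = k * U^(-1/theta);
# 1 - random.random() lies in (0, 1], avoiding division by zero
ys = [k * (1 - random.random()) ** (-1 / theta_true) for _ in range(n)]

# MLE of the Pareto exponent when the minimum income k is known
theta_mle = n / sum(math.log(y / k) for y in ys)

print(theta_mle)  # should be near theta_true
```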
4. Let X be a random variable that describes the annual counts of tropical cyclones in the North Atlantic. Assume that X₁, …, Xₙ is a random sample that describes the counts of tropical cyclones in the North Atlantic during n years, and assume they are distributed according to a geometric distribution with probability parameter θ and p.f. given by
f_{X|Θ}(x | θ) = θ(1 − θ)^(x−1)·1{1, 2, …}(x),  0 < θ < 1.
(a) Write the statistical model.
(b) Find the maximum likelihood estimator of θ.
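For the pmf θ(1 − θ)^(x−1) on {1, 2, …}, the MLE comes out to θ̂ = n/Σ xᵢ = 1/x̄. A simulation sketch (the true θ and sample size are assumed values, not cyclone data):

```python
import random

random.seed(4)

theta_true, n = 0.4, 100_000  # assumed values for illustration

def geom(theta):
    # Number of Bernoulli(theta) trials until the first success,
    # so the support is {1, 2, ...}
    x = 1
    while random.random() >= theta:
        x += 1
    return x

xs = [geom(theta_true) for _ in range(n)]

# MLE for f(x | theta) = theta * (1 - theta)^(x - 1): the reciprocal
# of the sample mean
theta_mle = n / sum(xs)

print(theta_mle)  # should be near theta_true
```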
1. Consider a variable y = θ + e, where θ is an unknown parameter and e is a random variable with mean zero.
(a) What is the expected value of y?
(b) Suppose you draw a sample of n observations of y. Derive the least squares estimator for θ. For full credit you must check the second-order condition.
(c) Can this estimator (θ̂) be described as a method of moments estimator?
(d) Now suppose e is independent normally distributed with mean 0 and ...
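The least squares logic in (b) can be checked numerically: minimizing S(θ) = Σ(yᵢ − θ)² gives θ̂ = ȳ, and S″(θ) = 2n > 0 confirms a minimum. A minimal sketch with made-up data:

```python
# Least squares for y_i = theta + e_i: minimize S(theta) = sum((y_i - theta)^2).
# Setting S'(theta) = -2 * sum(y_i - theta) = 0 gives theta_hat = y_bar;
# S''(theta) = 2n > 0 confirms this is a minimum.
ys = [2.1, 1.9, 2.4, 1.6, 2.0]  # made-up sample for illustration

theta_hat = sum(ys) / len(ys)

def S(t):
    return sum((y - t) ** 2 for y in ys)

# Numerical check: theta_hat beats small perturbations in either direction
print(theta_hat, S(theta_hat), S(theta_hat + 0.1), S(theta_hat - 0.1))
```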
(2) Let Y be a binomial random variable with parameters n and p. Remember that E(Y) = np and V(Y) = np(1 − p). We know that Y/n is an unbiased estimator of p. Now we want to estimate the variance of Y with n(Y/n)(1 − Y/n).
(a) Find the expected value of this estimator.
(b) Find an unbiased estimator that is a simple modification of the proposed estimator.
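Assuming the proposed estimator is n(Y/n)(1 − Y/n), part (a) can be verified exactly by summing it against the binomial pmf: the standard result is E[n(Y/n)(1 − Y/n)] = (n − 1)p(1 − p), so scaling by n/(n − 1) restores unbiasedness. A sketch with assumed n and p:

```python
from math import comb

n, p = 10, 0.3  # assumed values for illustration

def pmf(y):
    # Binomial(n, p) probability mass function
    return comb(n, y) * p ** y * (1 - p) ** (n - y)

def est(y):
    # Proposed estimator of Var(Y)
    return n * (y / n) * (1 - y / n)

# Exact expectation of the estimator, summed over the full support
expected = sum(est(y) * pmf(y) for y in range(n + 1))

true_var = n * p * (1 - p)            # the target, np(1-p)
biased_mean = (n - 1) * p * (1 - p)   # what the estimator actually averages to

print(expected, biased_mean, true_var)
```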
Consider a random vector Y = (y(1), y(2), …, y(k)) whose elements are y(j) = x + w(j), j = 1, …, k, where the w(j) are independent, identically distributed, Gaussian, zero-mean, and with variance σ², i.e., N(0, σ²).
1. Find the maximum likelihood (ML) estimator for x, i.e., x̂_ML.
2. Find the mean square error (MSE) of the ML estimator, i.e., MSE(x̂_ML) ≜ Var(x̂_ML).
3. Is this estimator consistent? Prove your answer.
4. Is this estimator efficient? Prove your answer.
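Under this signal-plus-noise model the ML estimator is the sample mean of the y(j), with MSE σ²/k. A Monte Carlo sketch (the values of x, σ, k, and the replication count are assumptions for illustration):

```python
import random

random.seed(5)

x_true, sigma, k, reps = 1.0, 2.0, 50, 2_000  # assumed values

# ML estimate under y(j) = x + w(j), w(j) ~ N(0, sigma^2): the sample mean
def x_ml(ys):
    return sum(ys) / len(ys)

# Monte Carlo estimate of MSE(x_ML); theory predicts sigma^2 / k = 0.08
errs = []
for _ in range(reps):
    ys = [x_true + random.gauss(0, sigma) for _ in range(k)]
    errs.append((x_ml(ys) - x_true) ** 2)

mse = sum(errs) / reps
print(mse)
```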