Problem 1: Let (X1, ..., Xn) denote a random sample from X having the log-normal density fX(x) = φ(ln(x) − m)/x, x > 0, where m is an unknown parameter. Show that (1/n) Σ(i=1 to n) ln(Xi) is an MVU estimator for m.
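As a quick numerical sanity check (not a proof), note that under this density ln(Xi) ~ N(m, 1), so the estimator is just the sample mean of the logs. The sketch below uses illustrative values m = 1.5, n = 50 chosen here for demonstration:

```python
import numpy as np

# Sanity check (not a proof): under f_X(x) = phi(ln(x) - m)/x,
# ln(X) ~ N(m, 1), so X can be simulated as exp(N(m, 1)).
rng = np.random.default_rng(0)
m, n, reps = 1.5, 50, 20000          # illustrative values

x = np.exp(rng.normal(loc=m, scale=1.0, size=(reps, n)))  # log-normal draws
m_hat = np.log(x).mean(axis=1)                            # (1/n) sum ln(X_i)

print(round(m_hat.mean(), 3))   # close to m (consistent with unbiasedness)
print(round(m_hat.var(), 4))    # close to 1/n, the Cramer-Rao bound
```

The simulated variance matching 1/n is what the MVU claim predicts, since the Cramér-Rao bound for a N(m, 1) mean is 1/n.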
Problem 2: Let (X1, ..., Xn) denote a random sample from X having density fX(x) = 1/β, 0 < x < β, where β > 0 is an unknown parameter. Explain why the Cramér-Rao theorem cannot be applied to show that an unbiased estimator of β is MVU. (Hint: see slides, Condition (A) of the Cramér-Rao theorem.)
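The failure is easy to see numerically (a sketch, not the requested explanation): the support (0, β) depends on β, so the regularity condition fails, and the unbiased estimator ((n+1)/n)·max(Xi) in fact has variance below the nominal Cramér-Rao bound. Values β = 2, n = 10 are illustrative:

```python
import numpy as np

# Illustration (not a proof): for Uniform(0, beta) the support depends on
# beta, so Condition (A) of the Cramer-Rao theorem fails. The unbiased
# estimator ((n+1)/n) * max(X_i) has variance beta^2 / (n(n+2)), which is
# BELOW the would-be bound beta^2 / n.
rng = np.random.default_rng(1)
beta, n, reps = 2.0, 10, 200000      # illustrative values

x = rng.uniform(0.0, beta, size=(reps, n))
beta_hat = (n + 1) / n * x.max(axis=1)   # unbiased estimator of beta

nominal_crb = beta**2 / n                # would-be Cramer-Rao bound
print(round(beta_hat.mean(), 3))         # ~ beta (unbiased)
print(beta_hat.var() < nominal_crb)      # True: "bound" is beaten
```

An estimator beating the bound is only possible because the theorem's hypotheses do not hold here.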
Let X1, ..., Xn be i.i.d. random variables with the following Riemann density, where the unknown parameter θ ∈ Θ = (0, ∞): (a) Calculate the distribution function Fθ of X1. (b) Let x1, ..., xn be a realization of X1, ..., Xn. What is the log-likelihood function for the parameter θ? (c) Calculate the maximum-likelihood estimator θ̂(x1, ..., xn) for the unknown parameter θ.
8. Let X1, ..., Xn denote a random sample of size n from an exponential distribution with density function fX(x) = (1/θ) e^(−x/θ), x > 0. (a) Show that θ̂1 = n·Y(1) is an unbiased estimator for θ and find MSE(θ̂1). (Hint: what is the distribution of Y(1)?) (b) Show that θ̂2 = Ȳ is an unbiased estimator for θ and find MSE(θ̂2). (c) Find the efficiency of θ̂1 relative to θ̂2. Which estimator is "better" (i.e. more efficient)?
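A simulation sketch, assuming the standard reading of this exercise: θ is the exponential mean, θ̂1 = n·Y(1) with Y(1) = min(Xi), and θ̂2 = Ȳ the sample mean. Since Y(1) is exponential with mean θ/n, both estimators are unbiased, but their MSEs differ by a factor of n:

```python
import numpy as np

# Sketch (assumed setup: theta is the exponential mean). Y_(1) = min(X_i)
# is exponential with mean theta/n, so theta1 = n * Y_(1) is unbiased with
# MSE theta^2, while theta2 = Ybar is unbiased with MSE theta^2 / n.
rng = np.random.default_rng(2)
theta, n, reps = 3.0, 8, 200000      # illustrative values

x = rng.exponential(scale=theta, size=(reps, n))
theta1 = n * x.min(axis=1)    # n * Y_(1)
theta2 = x.mean(axis=1)       # Ybar

mse1 = ((theta1 - theta) ** 2).mean()   # ~ theta^2
mse2 = ((theta2 - theta) ** 2).mean()   # ~ theta^2 / n
print(round(mse2 / mse1, 3))            # relative efficiency ~ 1/n
```

The ratio near 1/n is the answer the simulation suggests for (c): Ȳ is the more efficient estimator.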
Let X1, X2, ..., Xn denote a random sample of size n from a population whose density function is given by f(x) = 3β³ x^(−4) for β ≤ x, and f(x) = 0 elsewhere, where β > 0 is unknown. Consider the estimator β̂ = min(X1, X2, ..., Xn). Derive the bias of the estimator β̂.
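A quick simulation of the bias, assuming the density is f(x) = 3β³/x⁴ for x ≥ β (a Pareto density with shape 3 and scale β). Then min(Xi) is Pareto with shape 3n and scale β, giving E[β̂] = 3nβ/(3n − 1) and bias β/(3n − 1); β = 2 and n = 5 are illustrative:

```python
import numpy as np

# Sketch: f(x) = 3 b^3 / x^4 for x >= b is Pareto(shape=3, scale=b),
# sampled by inversion as b * U**(-1/3) since P(X > x) = (b/x)^3.
# min of n draws is Pareto(shape=3n, scale=b), so the bias of
# b_hat = min(X_i) is b/(3n - 1): the minimum overestimates b.
rng = np.random.default_rng(3)
b, n, reps = 2.0, 5, 400000          # illustrative values

u = rng.uniform(size=(reps, n))
x = b * u ** (-1.0 / 3.0)            # inverse-CDF draws from f
b_hat = x.min(axis=1)

theoretical_bias = b / (3 * n - 1)
print(round(b_hat.mean() - b, 3))    # ~ theoretical_bias
```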
Suppose that x1, . . . , xn are a random sample having probability density function f(x; θ) = (θ + 1)x^θ, 0 < x < 1. (1) Here the parameter θ > 0. (a) Determine the log-likelihood, l(θ), and a 1-dimensional sufficient statistic. (b) Show that P(Xi ≤ b; θ) = b^(θ+1) for f(x; θ) given in (1). (c) Suppose now that because of a recurring computer glitch in storing the observations, only a random subset of the xi are observed. For the rest of the observations, it...
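Part (b) can be sanity-checked by simulation (a sketch, not the requested derivation): the cdf is F(x) = ∫₀ˣ (θ+1)t^θ dt = x^(θ+1), so X can be drawn by inversion as U^(1/(θ+1)). Values θ = 2, b = 0.7 are illustrative:

```python
import numpy as np

# Check of P(X <= b) = b^(theta+1): F(x) = x^(theta+1) on (0, 1),
# so inversion gives X = U**(1/(theta+1)).
rng = np.random.default_rng(4)
theta, reps, b = 2.0, 400000, 0.7    # illustrative values

x = rng.uniform(size=reps) ** (1.0 / (theta + 1.0))
print(round((x <= b).mean(), 3))     # ~ b**(theta + 1)
```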
3. Let X1, X2, . . . , Xn be independent samples of a random variable with the probability density function (PDF): fX(x) = θ(x − 1/2) + 1 for 0 ≤ x ≤ 1, and 0 otherwise, where θ ∈ [−2, 2] is an unknown parameter. We define the estimator θ̂n = 12X̄ − 6 to estimate θ. (a) Is θ̂n an unbiased estimator of θ? (b) Is θ̂n a consistent estimator of θ? (c) Find the mean squared...
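A simulation sketch for (a), assuming X̄ denotes the sample mean. Since E[X] = θ/12 + 1/2 under this density, 12X̄ − 6 should center on θ; draws use accept-reject, valid because f ≤ 2 on [0, 1] for θ ∈ [−2, 2]. Values θ = 1, n = 40 are illustrative:

```python
import numpy as np

# Sketch: sample f(x) = theta*(x - 1/2) + 1 on [0, 1] by accept-reject
# (f is bounded by 2 for theta in [-2, 2]). E[X] = theta/12 + 1/2, so
# the estimator 12*Xbar - 6 should be centered on theta.
rng = np.random.default_rng(5)
theta, n, reps = 1.0, 40, 5000       # illustrative values

def sample_f(size, theta, rng):
    """Exact accept-reject draws from f(x) = theta*(x - 1/2) + 1 on [0, 1]."""
    out = np.empty(size)
    filled = 0
    while filled < size:
        cand = rng.uniform(size=size)                       # proposals
        acc = rng.uniform(0.0, 2.0, size=size) <= theta * (cand - 0.5) + 1.0
        take = cand[acc][: size - filled]
        out[filled : filled + take.size] = take
        filled += take.size
    return out

theta_hat = np.array(
    [12.0 * sample_f(n, theta, rng).mean() - 6.0 for _ in range(reps)]
)
print(round(theta_hat.mean(), 2))    # ~ theta (consistent with unbiasedness)
```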
Let X be a random variable with probability density function (pdf) fX(x | θ) on its support and 0 elsewhere, where θ > 0 is an unknown parameter. (a) Find the cumulative distribution function (cdf) of the random variable Y and identify the distribution. Let X1, X2, . . . , Xn be a random sample of size n > 2 from fX(x | θ). (b) Find the maximum likelihood estimator, θ̂mle, for θ. (c) Find the Uniform Minimum Variance Unbiased Estimator (UMVUE), θ̂umvue, for θ...
2. Let X1, X2, . . . , Xn denote a random sample from the probability density function. Show that X(1) = min{X1, X2, . . . , Xn} is sufficient for θ. Hint: use an indicator function, since the support depends on θ.