Suppose Y1, Y2, ..., Yn are such that Yi ~ Bernoulli(p), and let X = Y1 + Y2 + ... + Yn. ...
Suppose X1, X2, ..., Xn are iid Bernoulli(p) and Y1, Y2, ..., Ym are iid Bernoulli(p + q), where 0 < p, q < 0.5. Compute the method-of-moments estimators of p and q using first moments.
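Matching first moments gives E[X] = p and E[Y] = p + q, so the estimators are p̂ = X̄ and q̂ = Ȳ − X̄. A minimal simulation sketch (the parameter values, sample sizes, and seed are illustrative choices, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
p, q = 0.3, 0.1          # illustrative true parameters
n, m = 100_000, 100_000  # large samples so the estimates settle down

X = rng.binomial(1, p, size=n)       # X_i ~ Bernoulli(p)
Y = rng.binomial(1, p + q, size=m)   # Y_j ~ Bernoulli(p + q)

# First-moment equations: E[X] = p and E[Y] = p + q, so
p_hat = X.mean()
q_hat = Y.mean() - X.mean()
print(p_hat, q_hat)
```

With samples this large, both estimates land close to the true values 0.3 and 0.1.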
Let Y1, Y2, ..., Yn be independent and identically distributed random variables such that for 0 < p < 1, P(Yi = 1) = p and P(Yi = 0) = q = 1 − p. (Such random variables are called Bernoulli random variables.) (a) Find the moment-generating function for the Bernoulli random variable Yi. (b) Find the moment-generating function for W = Y1 + Y2 + ... + Yn. (c) What is the distribution of W?
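For part (a), m_Y(t) = p e^t + q; by independence, part (b) gives m_W(t) = (p e^t + q)^n, which is the Binomial(n, p) MGF, answering part (c). A quick numerical sanity check (p, n, t, and the seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
p, n_terms = 0.4, 5
q = 1 - p
t = 0.3

# Empirical MGF of W = Y1 + ... + Yn, estimated by simulation.
W = rng.binomial(1, p, size=(200_000, n_terms)).sum(axis=1)
empirical = np.exp(t * W).mean()

# Part (a): m_Y(t) = p e^t + q.  Part (b): independence gives
# m_W(t) = (p e^t + q)^n, the Binomial(n, p) MGF (part (c)).
theoretical = (p * np.exp(t) + q) ** n_terms
print(empirical, theoretical)
```

The empirical average of e^{tW} agrees closely with the closed-form MGF.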
QUESTION 5. Let Y1, Y2, ..., Yn denote a random sample of size n from a population whose density is given by (a) Find the method-of-moments estimator for β, given that α is known. (b) Find the mean and variance of β̂. (c) Show that β̂ is a consistent estimator of β.
Let Y1, Y2, ..., Yn be a random sample from the distribution with density f(y) = θy^(θ−1) for 0 < y < 1, where 0 < θ < ∞. Show that the maximum likelihood estimator (MLE) for θ is θ̂ = −n / Σ ln(Yi).
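Setting the derivative of the log-likelihood n ln θ + (θ − 1) Σ ln Yi to zero gives θ̂ = −n / Σ ln(Yi). A simulation sketch that checks this numerically, sampling via the inverse CDF (since F(y) = y^θ, we can take Y = U^(1/θ)); the true θ, sample size, and seed are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
theta = 2.5   # illustrative true value
n = 200_000

# Inverse-CDF sampling: F(y) = y**theta on (0, 1), so Y = U**(1/theta).
Y = rng.uniform(size=n) ** (1 / theta)

# Solving d/dtheta [n*log(theta) + (theta - 1)*sum(log(Y))] = 0 gives:
theta_mle = -n / np.log(Y).sum()
print(theta_mle)
```

For a sample this large the MLE sits very close to the true θ = 2.5.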
Let Y1, Y2, ..., Yn denote independent and identically distributed uniform random variables on the interval (0, 4λ). Obtain the method-of-moments estimator λ̂ for λ, and calculate the mean squared error of this estimator when estimating λ. (Your answer will be a function of the sample size n and λ.)
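Since E[Y] = 2λ, the method-of-moments estimator is λ̂ = Ȳ/2; it is unbiased, so its MSE equals its variance, Var(Ȳ)/4 = (16λ²/12)/(4n) = λ²/(3n). A simulation sketch confirming the MSE formula (λ, n, the number of replications, and the seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
lam, n, reps = 2.0, 50, 200_000   # illustrative values

# E[Y] = 4*lam / 2 = 2*lam, so the method-of-moments estimator is Ybar / 2.
Y = rng.uniform(0, 4 * lam, size=(reps, n))
lam_hat = Y.mean(axis=1) / 2

# lam_hat is unbiased, so MSE = Var = (16*lam**2 / 12) / (4*n) = lam**2 / (3*n).
mse_empirical = ((lam_hat - lam) ** 2).mean()
mse_theoretical = lam ** 2 / (3 * n)
print(mse_empirical, mse_theoretical)
```

Across many replications the empirical MSE matches λ²/(3n) closely.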
Suppose that X1, X2, ..., Xn and Y1, Y2, ..., Yn are independent random samples from populations with the same mean μ and variances σ1² and σ2², respectively. That is, Xi ~ N(μ, σ1²) and Yi ~ N(μ, σ2²). Show that (2X̄ + 3Ȳ)/5 is a consistent estimator of μ.
1. Suppose Y1, Y2, ..., Yn is an iid sample from a Bernoulli(p) population distribution, where 0 < p < 1 is unknown. The population pmf is p_Y(y | p) = p^y (1 − p)^(1−y) for y = 0, 1, and 0 otherwise. (a) Prove that Ȳ is the maximum likelihood estimator of p. (b) Find the maximum likelihood estimator of τ(p) = log[p/(1 − p)], the log-odds of p.
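For part (b), the invariance property of MLEs gives τ̂ = log(p̂/(1 − p̂)) = log(Ȳ/(1 − Ȳ)). A minimal simulation sketch (the true p, sample size, and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
p, n = 0.3, 100_000   # illustrative true parameter and sample size

Y = rng.binomial(1, p, size=n)
p_mle = Y.mean()      # part (a): the MLE of p is Ybar

# Part (b): by the invariance property of MLEs, the MLE of the
# log-odds tau(p) = log(p / (1 - p)) is tau evaluated at p_mle.
tau_mle = np.log(p_mle / (1 - p_mle))
print(p_mle, tau_mle)
```

With n = 100,000 the estimates are close to p = 0.3 and log(0.3/0.7) ≈ −0.847.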
Let Y1, Y2, ..., Yn denote a random sample from the probability density function. Show that Ȳ is a consistent estimator of 1
Suppose Y1, Y2, ..., Yn are mutually independent random variables with Yi ~ N(μi, (σi)^2) for i = 1, ..., n. Find the distribution of U = Σ (from i=1 to n) ((Yi − μi)/σi)^2. I am not sure where to start with this question; could you please show me in detail how to do it? Thanks :)
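A place to start: each (Yi − μi)/σi is standard normal, so each squared term is chi-square with 1 degree of freedom, and the sum of n independent chi-square(1) variables is chi-square(n). A simulation sketch checking the implied moments E[U] = n and Var(U) = 2n (the μi, σi, n, and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
n, reps = 4, 500_000
mus = np.array([1.0, -2.0, 0.5, 3.0])      # illustrative means
sigmas = np.array([0.5, 1.0, 2.0, 1.5])    # illustrative sds

# Each (Yi - mu_i)/sigma_i is standard normal, so each squared term is
# chi-square(1); the sum of n independent such terms is chi-square(n).
Y = rng.normal(mus, sigmas, size=(reps, n))
U = (((Y - mus) / sigmas) ** 2).sum(axis=1)

print(U.mean(), U.var())   # should be close to n and 2n
```

The sample mean and variance of U land near 4 and 8, matching a chi-square(4) distribution.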