Suppose Y1, Y2, ... Yn are mutually independent random variables with
Y1 ~ N(μ1, (σ1)^2)
Y2 ~ N(μ2, (σ2)^2)
...
Yn ~ N(μn, (σn)^2)
Find the distribution of U = Σ_{i=1}^{n} ((Yi − μi)/σi)^2.
I am not sure where I should start with this question. Could you please show the detailed steps for how to do it? Thanks :)
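A useful starting point: each standardized term (Yi − μi)/σi is N(0, 1), its square is chi-square with 1 degree of freedom, and a sum of n independent χ²(1) variables is χ²(n), so U ~ χ²(n). A quick Monte Carlo sketch (NumPy assumed; the specific n, μi, and σi values below are illustrative, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
mu = np.array([1.0, -2.0, 0.5, 3.0, 0.0])     # illustrative means
sigma = np.array([0.5, 1.0, 2.0, 1.5, 3.0])   # illustrative std devs

# Draw many samples of (Y1, ..., Yn) and form U = sum(((Yi - mu_i)/sigma_i)^2)
Y = rng.normal(mu, sigma, size=(200_000, n))
U = (((Y - mu) / sigma) ** 2).sum(axis=1)

# A chi-square(n) variable has mean n and variance 2n; the empirical
# moments of U should match.
print(U.mean(), n)
print(U.var(), 2 * n)
```

The empirical mean and variance land on n and 2n, consistent with U ~ χ²(n).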
Let X1, ..., Xm be iid N(μ1, σ^2) and Y1, ..., Yn be iid N(μ2, σ^2), with the X's and Y's independent. Here −∞ < μ1, μ2 < ∞ and 0 < σ < ∞ are unknown. Derive the MLE for (μ1, μ2, σ^2). Is the MLE sufficient for (μ1, μ2, σ^2)? Also derive the MLE for (μ1 − μ2)/σ.
Let Y1, Y2, . . . , Yn be independent random variables with Exponential distribution with mean β. Let Y(n) = max(Y1,Y2,...,Yn) and Y(1) = min(Y1,Y2,...,Yn). Find the probability P(Y(1) > y1,Y(n) < yn).
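For the exponential order-statistics question above, the event {Y(1) > y1, Y(n) < yn} says exactly that every observation lands in (y1, yn), so for 0 ≤ y1 < yn independence gives P = (e^{−y1/β} − e^{−yn/β})^n. A simulation sketch (NumPy assumed; β, n, y1, yn below are illustrative values, not from the question):

```python
import numpy as np

rng = np.random.default_rng(1)
beta, n = 2.0, 4          # illustrative mean and sample size
y1, yn = 0.5, 5.0         # illustrative thresholds with y1 < yn

samples = rng.exponential(beta, size=(500_000, n))
empirical = np.mean((samples.min(axis=1) > y1) & (samples.max(axis=1) < yn))

# Closed form: all n observations must land in (y1, yn), each with
# probability F(yn) - F(y1) = e^{-y1/beta} - e^{-yn/beta}.
theoretical = (np.exp(-y1 / beta) - np.exp(-yn / beta)) ** n
print(empirical, theoretical)
```

The empirical frequency agrees with the closed form.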
Let Y1, Y2, ..., Yn be independent random variables each having uniform distribution on the interval (0, θ). Find Var(Y(j) − Y(i)).
Suppose Y1, Y2, …, Yn are independent and identically distributed random variables from a uniform distribution on [0, k]. a. Determine the density of Y(n) = max(Y1, Y2, …, Yn). b. Compute the bias of the estimator k̂ = Y(n) for estimating k.
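For the uniform-maximum question, F_{Y(n)}(y) = (y/k)^n gives density n y^{n−1}/k^n on (0, k), so E[Y(n)] = nk/(n+1) and the bias of k̂ = Y(n) is E[Y(n)] − k = −k/(n+1). A quick numeric check (NumPy assumed; k and n below are illustrative values):

```python
import numpy as np

rng = np.random.default_rng(2)
k, n = 3.0, 10            # illustrative values, not from the question
trials = rng.uniform(0, k, size=(400_000, n))
y_max = trials.max(axis=1)

# Part a: density f(y) = n*y**(n-1)/k**n on (0, k) gives E[Y(n)] = n*k/(n+1)
print(y_max.mean(), n * k / (n + 1))

# Part b: bias of k_hat = Y(n) is E[Y(n)] - k = -k/(n+1)
print(y_max.mean() - k, -k / (n + 1))
```

The negative bias shows Y(n) systematically underestimates k, which is why the corrected estimator (n+1)Y(n)/n is often used.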
Let Y1, Y2, ..., Yn be independent and identically distributed random variables such that for 0 < p < 1, P(Yi = 1) = p and P(Yi = 0) = q = 1 − p. (Such random variables are called Bernoulli random variables.) a. Find the moment-generating function for the Bernoulli random variable Y. b. Find the moment-generating function for W = Y1 + Y2 + … + Yn. c. What is the distribution of W?
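For the Bernoulli question, m_Y(t) = q + p e^t; by independence, m_W(t) = (q + p e^t)^n, which is the MGF of a Binomial(n, p), so W ~ Binomial(n, p). A numerical check of that identity (p, n, t below are illustrative values):

```python
import math

p, n, t = 0.3, 6, 0.7     # illustrative values, not from the question
q = 1 - p

# MGF of W as the product of n identical Bernoulli MGFs
mgf_product = (q + p * math.exp(t)) ** n

# Binomial(n, p) MGF computed directly from its pmf:
# sum over k of e^{tk} * C(n, k) * p^k * q^(n-k)
mgf_binomial = sum(
    math.exp(t * k) * math.comb(n, k) * p**k * q ** (n - k) for k in range(n + 1)
)
print(mgf_product, mgf_binomial)  # the two agree
```

Since MGFs determine distributions (where they exist), the match confirms W ~ Binomial(n, p).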
1. Let X1, ..., Xn, Y1, ..., Yn be mutually independent random variables, and Z = Σ_{i=1}^{n} Xi Yi. Suppose for each i ∈ {1, ..., n}, Xi ~ Bernoulli(p) and Yi ~ Binomial(n, p). What is Var[Z]?
Let Y1, Y2, ..., Yn be independent random variables each having uniform distribution on the interval (0, θ). (a) Find the distribution of Y(n) and find its expected value. (b) Find the joint density function of Y(i) and Y(j) where 1 ≤ i < j ≤ n. Hence find Cov(Y(i), Y(j)). (c) Find Var(Y(j) − Y(i)).
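For the uniform order-statistics question, the standard results are E[Y(n)] = nθ/(n+1) for part (a) and, for i < j in part (b), Cov(Y(i), Y(j)) = i(n − j + 1)θ²/((n+1)²(n+2)). A simulation sketch checking both (NumPy assumed; θ, n, i, j below are illustrative values):

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n = 2.0, 8
i, j = 2, 5               # illustrative order-statistic indices (1-based), i < j

# Sort each row to obtain the order statistics Y(1) <= ... <= Y(n)
samples = np.sort(rng.uniform(0, theta, size=(400_000, n)), axis=1)
yi, yj = samples[:, i - 1], samples[:, j - 1]

# Part (a): E[Y(n)] = n*theta/(n+1)
print(samples[:, -1].mean(), n * theta / (n + 1))

# Part (b): Cov(Y(i), Y(j)) = i*(n-j+1)*theta^2 / ((n+1)^2 * (n+2))
cov_formula = i * (n - j + 1) * theta**2 / ((n + 1) ** 2 * (n + 2))
print(np.cov(yi, yj)[0, 1], cov_formula)
```

Part (c) then follows from Var(Y(j) − Y(i)) = Var(Y(j)) + Var(Y(i)) − 2 Cov(Y(i), Y(j)).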
Let the independent normal random variables Y1, Y2, ..., Yn have the respective distributions N(μ, γ^2 xi^2), i = 1, 2, ..., n, where x1, x2, ..., xn are known, not all the same, and none equal to zero. Find the maximum likelihood estimators for μ and γ^2.
Suppose that X1, X2, ..., Xn and Y1, Y2, ..., Yn are independent random samples from populations with the same mean μ and variances σ1^2 and σ2^2, respectively. That is, Xi ~ N(μ, σ1^2) and Yi ~ N(μ, σ2^2). Show that (2X̄ + 3Ȳ)/5 is a consistent estimator of μ.