Suppose X1, X2, ..., Xn are i.i.d. random variables with Xi ~ Γ(α, β). ...
Suppose X1, X2, ..., Xn are i.i.d. random variables with Xi ~ Exp(λ). Show that Σ_{i=1}^n Xi ~ Γ(n, λ).
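The claim can be sanity-checked numerically before proving it. A minimal Monte Carlo sketch, assuming the rate parametrization (so Exp(λ) has mean 1/λ, and Γ(n, λ) has mean n/λ and variance n/λ²):

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam, reps = 5, 2.0, 200_000

# Each row is one realization of X1 + ... + Xn with Xi ~ Exp(lam).
sums = rng.exponential(scale=1 / lam, size=(reps, n)).sum(axis=1)

# Gamma(n, lam) has mean n/lam = 2.5 and variance n/lam**2 = 1.25.
print(round(sums.mean(), 2), round(sums.var(), 2))
```

The empirical moments match the Γ(n, λ) moments; the actual proof goes through the mgf, since (1 − t/λ)^(−1) raised to the n-th power is the Γ(n, λ) mgf.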
Let X1, ..., Xn be a random sample from a Gamma(α, β) distribution. Find the MLE of β, assuming α is known.
4. Let X1, X2, ..., Xn be n i.i.d. exponential random variables with parameter λ > 0. Let X(1) < X(2) < ... < X(n) be their order statistics. Define Y1 = nX(1) and Yk = (n + 1 − k)(X(k) − X(k−1)) for 1 < k ≤ n. Find the joint probability density function of Y1, ..., Yn. Are they independent?
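Not part of the problem statement, but the expected answer (Rényi's representation: the normalized spacings Y1, ..., Yn are again i.i.d. Exp(λ)) can be sketched numerically. The rate parametrization and the convention X(0) = 0 are assumptions here:

```python
import numpy as np

rng = np.random.default_rng(1)
n, lam, reps = 4, 1.5, 100_000

x = np.sort(rng.exponential(scale=1 / lam, size=(reps, n)), axis=1)
# Spacings X(k) - X(k-1), with the convention X(0) = 0 for the first gap.
gaps = np.diff(x, axis=1, prepend=0.0)
# Y_k = (n + 1 - k) * (X(k) - X(k-1)); in particular Y_1 = n * X(1).
y = gaps * (n + 1 - np.arange(1, n + 1))

print(y.mean(axis=0).round(2))  # every column close to 1/lam
```

Each column has mean about 1/λ and the empirical correlations are near zero, consistent with the Y's being i.i.d. Exp(λ).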
Let X1, X2, X3 be i.i.d. N(0, 1) random variables. Suppose Y1 = X1 + X2 + X3, Y2 = X1 − X2, Y3 = X1 − X3. Find the joint pdf of Y = (Y1, Y2, Y3) using: a. the method of variable transformations (Jacobian); b. multivariate normal distribution properties.
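Part (b) can be cross-checked mechanically: writing Y = AX with X ~ N(0, I3) gives Y ~ N(0, AAᵀ), and |det A| is the Jacobian factor needed in part (a). A sketch, assuming the transformation Y1 = X1 + X2 + X3, Y2 = X1 − X2, Y3 = X1 − X3:

```python
import numpy as np

A = np.array([[1, 1, 1],    # Y1 = X1 + X2 + X3
              [1, -1, 0],   # Y2 = X1 - X2
              [1, 0, -1]])  # Y3 = X1 - X3

cov = A @ A.T               # covariance of Y = A X when X ~ N(0, I)
print(cov)
print(round(abs(np.linalg.det(A))))  # Jacobian factor |det A| = 3
```

The covariance AAᵀ is not diagonal (Cov(Y2, Y3) = 1), so the Yi are jointly normal but not independent.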
Suppose that the random variables Y1, ..., Yn satisfy Yi = α + βxi + εi, i = 1, ..., n, where x1, ..., xn are fixed constants and ε1, ..., εn are i.i.d. N(0, σ²), with σ² a fixed constant. (a) What distribution do Y1, ..., Yn follow? What is your reasoning? (b) Find the MLEs for α and β.
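For part (b), maximizing the normal likelihood in α and β is equivalent to minimizing Σ(Yi − α − βxi)², so the MLEs are the least-squares estimates. A simulation sketch on hypothetical data (the values of α, β, σ below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.arange(10.0)
alpha, beta, sigma = 1.0, 2.0, 0.1           # hypothetical true values
y = alpha + beta * x + rng.normal(0.0, sigma, size=x.size)

# Closed-form MLEs under iid N(0, sigma^2) errors (= least squares):
#   beta_hat = Sxy / Sxx,  alpha_hat = ybar - beta_hat * xbar
beta_hat = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()
alpha_hat = y.mean() - beta_hat * x.mean()
print(round(alpha_hat, 2), round(beta_hat, 2))
```

With small σ the estimates land close to the true (α, β), as the closed forms predict.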
Problem 3.1 Suppose that X1, X2, ..., Xn is a random sample of size n from a Bernoulli distribution for which the value of the parameter θ is unknown, and the prior distribution of θ is a Beta(α, β) distribution. Represent the mean of this prior distribution as μ0 = α/(α + β). The posterior distribution of θ is Beta(αn, βn), where αn = α + ΣXi and βn = β + n − ΣXi. (a) Show that the mean of the posterior distribution is a weighted average of the form μn = γn X̄n + (1 − γn) μ0, where γn and ...
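A quick exact-arithmetic check of the weighted-average identity in part (a), with made-up prior parameters and data counts (the Beta(2, 3) prior, n = 10, and ΣXi = 6 are assumptions for illustration only):

```python
from fractions import Fraction

a, b = Fraction(2), Fraction(3)        # hypothetical Beta(a, b) prior
n, s = 10, 6                           # hypothetical sample size and sum(X_i)

post_mean = (a + s) / (a + b + n)      # mean of Beta(a + s, b + n - s)
mu0 = a / (a + b)                      # prior mean
gamma = Fraction(n) / (a + b + n)      # weight on the sample mean
weighted = gamma * Fraction(s, n) + (1 - gamma) * mu0

print(post_mean, weighted, post_mean == weighted)
```

Here γn = n/(α + β + n), so as n grows the posterior mean moves from the prior mean μ0 toward the sample mean X̄n.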
Specifically, suppose that X1, X2, ..., Xn denote n payments, modeled as i.i.d. random variables with common Weibull pdf f(x | θ) = (m/θ) x^(m−1) e^(−x^m/θ) for x > 0, and 0 otherwise, where m > 0 is known and θ is unknown. In turn, suppose that θ ~ IG(α, β), that is, θ has an inverted gamma (prior) pdf π(θ) ∝ θ^(−(α+1)) e^(−β/θ) for θ > 0, and 0 otherwise. (a) Prove that the inverted gamma IG(α, β) prior is a conjugate prior for the Weibull family above. (b) Suppose that m = 2, α = 0.5, and β = 2. Here are n = 10 insurance payments ...
Let X1, ..., Xn be a random sample from a Gamma(α, β) distribution, α > 0, β > 0. Show that T = (Σ_{i=1}^n Xi, Π_{i=1}^n Xi) is complete and sufficient for (α, β).
Explain the answer. Suppose that X1, X2, ..., Xn are independent random variables. Assume that E[Xi] = μi and Var(Xi) = σi², where i = 1, 2, ..., n. If a1, a2, ..., an are constants: (i) write down expressions for E(Σ_{i=1}^n ai Xi) and Var(Σ_{i=1}^n ai Xi); (ii) rewrite the expressions if the Xi's are not independent.
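A small simulation sketch of the independent case, with arbitrary made-up μi, σi, and ai (normal Xi are chosen only for convenience; the moment identities hold for any distributions with these means and variances):

```python
import numpy as np

rng = np.random.default_rng(3)
mu = np.array([1.0, -2.0, 0.5])   # hypothetical means
sd = np.array([1.0, 2.0, 0.5])    # hypothetical standard deviations
a = np.array([2.0, 1.0, -3.0])    # hypothetical constants
reps = 400_000

x = rng.normal(mu, sd, size=(reps, 3))   # independent columns
s = x @ a                                # sum_i a_i X_i, one value per row

# (i): E = sum a_i mu_i = -1.5, and Var = sum a_i^2 sigma_i^2 = 10.25.
print(round(s.mean(), 2), round(s.var(), 2))
```

Without independence, the expectation formula is unchanged, but the variance picks up the cross terms 2 Σ_{i<j} ai aj Cov(Xi, Xj).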
6. Suppose that X, , ,x, are i.id. randon variables and let X-n Σηί Xi. (a) Show that the sum of residuals always equal to zero, .e., show that (b) Show that (c) Please answer the following questions and provide a brief explanation to your answers What is the degree of freedom of (X1,.,Xn)? What is the degree of freedom of (Xi,... ,Xn, X) ? What is the degree of freedom of the residuals: (Xi -X,... ,X.-X)? What is the...
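Part (a) follows from Σ(Xi − X̄) = ΣXi − nX̄ = 0. A one-line numerical check (the sample values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=8)           # any sample works
resid = x - x.mean()

# sum(X_i - Xbar) = sum(X_i) - n * Xbar = 0 (up to float rounding).
print(abs(resid.sum()) < 1e-12)
```

This single linear constraint is also why the residual vector has n − 1 degrees of freedom in part (c).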