4. Let X1, . . . , Xn be iid discrete uniform random variables with common pmf fθ, with...
5. Let X1, . . . , Xn (n ≥ 3) be iid Bernoulli random variables with parameter θ, where 0 < θ < 1. Let T = Σi Xi, and let δ(X1, . . . , Xn) = . . . and 0 otherwise. (a) Derive Eθ[δ(X1, . . . , Xn)]. (b) Derive Eθ[δ(X1, . . . , Xn) | T = t], for t = 0, 1, . . . , n.
4. Let X1, X2, . . . be uncorrelated random variables, such that Xn has a uniform distribution over [−1/n, 1/n]. Does the sequence converge in probability? 5. Let X1, X2, . . . be independent random variables, such that P(Xn = . . .) = . . . Does the sequence (X1 + X2 + · · · + Xn)/n satisfy the WLLN? Does it converge in probability to 0?
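As a numerical illustration of problem 4: here P(|Xn| > ε) = max(0, 1 − εn), which shrinks to 0 as n grows for every fixed ε > 0, and that is exactly convergence in probability to 0. A minimal simulation sketch (the sample sizes and ε are arbitrary choices, not part of the exercise):

```python
# Illustration for problem 4: X_n ~ Uniform[-1/n, 1/n].
# Convergence in probability to 0 means P(|X_n| > eps) -> 0 for every eps > 0.
import numpy as np

rng = np.random.default_rng(0)
eps = 0.05
for n in (1, 10, 100):
    xn = rng.uniform(-1.0 / n, 1.0 / n, size=100_000)
    estimate = np.mean(np.abs(xn) > eps)
    exact = max(0.0, 1.0 - eps * n)   # P(|X_n| > eps) = 1 - eps*n when eps <= 1/n
    print(f"n={n}: simulated {estimate:.3f}, exact {exact:.3f}")
```

By n = 100 the interval [−1/n, 1/n] lies entirely inside (−ε, ε), so the exceedance probability is exactly 0.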
Conditional on θ, the random variables X1, X2, . . . , Xn are iid from . . . In turn, the parameter θ is best regarded as random with prior distribution αθ . . . , where α > 0 is known. (a) Find the posterior mean of θ. (b) Discuss how you would formulate the Bayesian test of . . . versus . . .
Let X1, . . . , Xn be a random sample from the discrete uniform distribution on {1, 2, . . . , θ}. Using the definition of a sufficient statistic, show that X(n) is a sufficient statistic for θ.
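A numerical illustration (not part of the exercise): the joint pmf is θ^(−n)·1{x(n) ≤ θ}, which depends on the data only through the maximum, so two samples of the same size with the same maximum give identical likelihood functions in θ. A quick sketch, with hypothetical data:

```python
# Sketch: for the discrete uniform on {1, ..., theta}, the likelihood
# L(theta; x) = theta**(-n) * 1{max(x) <= theta} depends on x only via max(x).

def likelihood(theta, xs):
    """Joint pmf of an iid discrete-uniform sample, as a function of theta."""
    if max(xs) > theta:
        return 0.0
    return theta ** (-len(xs))

sample_a = [2, 7, 4, 1, 7]   # hypothetical data, max = 7
sample_b = [7, 3, 3, 5, 6]   # different data, same size and same max
for theta in range(1, 11):
    assert likelihood(theta, sample_a) == likelihood(theta, sample_b)
```

This equality of likelihood functions is the intuition the factorization-style proof makes rigorous.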
3. Let X1, . . . , Xn be a random sample from the Uniform[θ + 1, 2θ] distribution, where θ > 1 is unknown. Let X(1) = min(X1, . . . , Xn) and X(n) = max(X1, . . . , Xn). (a) Derive the cdf of X(n) and then its pdf. (b) Derive Eθ[X(n)]. (c) Find a function g(X(n)) such that Eθ[g(X(n))] = θ for all θ > 1. (d) Replace X(n) by X(1) in parts (a)–(c) above and answer those questions.
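Under the Uniform[θ + 1, 2θ] reading of this exercise, Eθ[X(n)] = ((2n + 1)θ + 1)/(n + 1), which suggests the candidate g(x) = ((n + 1)x − 1)/(2n + 1) for part (c). A Monte Carlo sanity check of that candidate (a sketch, not a proof; the values of θ, n, and the replication count are arbitrary):

```python
# Monte Carlo check that g(X_(n)) = ((n+1) X_(n) - 1) / (2n + 1) is
# (approximately) unbiased for theta when X_i ~ Uniform[theta + 1, 2*theta].
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 3.0, 5, 200_000
samples = rng.uniform(theta + 1.0, 2.0 * theta, size=(reps, n))
x_max = samples.max(axis=1)
g = ((n + 1) * x_max - 1.0) / (2 * n + 1)
print(np.mean(g))   # should be close to theta = 3.0
```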
Let X1, . . . , Xn be i.i.d. random variables with the following Riemann density: . . . , with unknown parameter θ ∈ Θ = (0, ∞). (a) Calculate the distribution function Fθ of X1. (b) Let x1, . . . , xn be a realization of X1, . . . , Xn. What is the log-likelihood function for the parameter θ? (c) Calculate the maximum-likelihood estimator θ̂(x1, . . . , xn) for the unknown parameter θ.
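The density itself did not survive extraction, so the sketch below runs the (b)–(c) workflow for a stand-in density fθ(x) = θe^(−θx) on (0, ∞); this is an assumed placeholder, not the exercise's density. It compares a grid maximization of the log-likelihood with the analytic maximizer:

```python
# Sketch of the log-likelihood / MLE workflow with a STAND-IN density
# f_theta(x) = theta * exp(-theta * x), x > 0 (not the exercise's density,
# which was lost in extraction). Here the MLE is 1 / mean(x) in closed form.
import numpy as np

def log_likelihood(theta, xs):
    """Log-likelihood of an iid Exponential(theta) sample."""
    return len(xs) * np.log(theta) - theta * np.sum(xs)

xs = np.array([0.8, 1.3, 0.2, 2.1, 0.6])   # hypothetical realization
grid = np.linspace(0.01, 5.0, 2000)
theta_hat_grid = grid[np.argmax([log_likelihood(t, xs) for t in grid])]
theta_hat_closed = 1.0 / xs.mean()          # analytic maximizer for this density
print(theta_hat_grid, theta_hat_closed)
```

Whatever the actual density, the mechanics are the same: write the log-likelihood as a sum, then maximize over θ ∈ (0, ∞) analytically or numerically.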
Let X1, . . . , Xn be independent Poisson(θ) random variables with parameter θ > 0. (1) Find the Bayes estimator of θ for a Gamma(α, β) prior. (2) Find the MSE of the Bayes estimator.
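Assuming the Gamma(α, β) prior uses β as a rate (density proportional to θ^(α−1)e^(−βθ)), the posterior is Gamma(α + Σxi, β + n), so the Bayes estimator under squared-error loss is the posterior mean (α + Σxi)/(β + n), with MSE (nθ + (α − βθ)²)/(β + n)². A simulation sketch checking that MSE formula (parameter values are arbitrary):

```python
# Sketch: Bayes estimator for Poisson(theta) under a Gamma(alpha, beta) prior
# (rate parameterization assumed), and a Monte Carlo check of its MSE.
import numpy as np

alpha, beta, theta, n = 3.0, 2.0, 2.0, 10
rng = np.random.default_rng(2)

s = rng.poisson(theta * n, size=200_000)   # S = sum of n Poisson(theta) draws
bayes = (alpha + s) / (beta + n)           # posterior mean, the Bayes estimator
mse_sim = np.mean((bayes - theta) ** 2)
mse_theory = (n * theta + (alpha - beta * theta) ** 2) / (beta + n) ** 2
print(mse_sim, mse_theory)
```

The MSE formula comes from variance plus squared bias: Var = nθ/(β + n)² and bias = (α − βθ)/(β + n).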
Let X1, . . . , Xn be independent Beta(θ, 1) random variables with parameter θ > 0. (1) Find the Bayes estimator of θ for a Gamma(α, β) prior. (2) Find the MSE of the Bayes estimator.
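Again assuming a rate-β Gamma prior: the Beta(θ, 1) likelihood is θ^n (Πxi)^(θ−1), so the posterior is Gamma(α + n, β − Σ log xi) and the Bayes estimator is (α + n)/(β − Σ log xi). A sketch checking the closed form against a direct numerical average over the (unnormalized) posterior, on hypothetical data:

```python
# Sketch: posterior mean for Beta(theta, 1) data with a Gamma(alpha, beta)
# prior (rate parameterization assumed), checked by numerical integration.
import numpy as np

alpha, beta = 2.0, 1.0
xs = np.array([0.9, 0.5, 0.7, 0.8])          # hypothetical data in (0, 1)
n, log_sum = len(xs), np.sum(np.log(xs))

closed_form = (alpha + n) / (beta - log_sum)  # Gamma(alpha+n, beta-log_sum) mean

# Riemann-sum posterior mean on a uniform grid (the grid spacing cancels).
theta = np.linspace(1e-6, 50.0, 400_000)
unnorm = theta ** (alpha + n - 1) * np.exp(-(beta - log_sum) * theta)
numeric = (theta * unnorm).sum() / unnorm.sum()
print(closed_form, numeric)
```

Since log xi < 0 for xi in (0, 1), the posterior rate β − Σ log xi is always positive, so the posterior is proper.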
Assume that the random variables X1, . . . , Xn form a random sample of size n from the distribution specified below, and show that the statistic T specified is a sufficient statistic for the parameter: the uniform distribution on the interval [a, b], where the value of a is known and the value of b is unknown (b > a); T = max(X1, · · · , Xn).
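A sketch of why T = max works here: given a, the likelihood of a U[a, b] sample is (b − a)^(−n)·1{max xi ≤ b}, so as a function of b it depends on the data only through the maximum. The code below (hypothetical data, a = 0 assumed) shows two samples with the same maximum producing identical likelihood functions:

```python
# Sketch: the U[a, b] likelihood with a known and b unknown depends on the
# data only through max(x) -- the factorization-theorem picture.
def likelihood(b, xs, a=0.0):
    """Joint density of an iid Uniform[a, b] sample, as a function of b."""
    if max(xs) > b:                  # b must be at least every observation
        return 0.0
    return (b - a) ** (-len(xs))

xs1 = [0.2, 0.9, 0.4]                # hypothetical data, max = 0.9
xs2 = [0.9, 0.1, 0.6]                # different data, same size and same max
for b in (0.5, 1.0, 2.0, 5.0):
    assert likelihood(b, xs1) == likelihood(b, xs2)
```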
1.9 Let X1, . . . , Xn be nonnegative integer-valued random variables with pfs fX1(·), . . . , fXn(·). A discrete mixture distribution W is created with pf fW(x) = p1 fX1(x) + · · · + pn fXn(x), where pi ≥ 0 for i = 1, . . . , n and Σi pi = 1. Another random variable Y is defined by Y = . . . (a) Compare the mean of W and Y. (b) If X1, . . . , Xn are independent, compare the variance of W and Y.
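The definition of Y was lost in extraction; assuming Y is the weighted sum p1X1 + · · · + pnXn (a common version of this exercise), W and Y share the same mean Σ pi E[Xi], while the mixture has the larger variance. A simulation sketch with hypothetical independent Poisson components:

```python
# Sketch comparing a mixture W (pick component i with prob p_i, then draw X_i)
# with the weighted sum Y = sum_i p_i X_i -- an ASSUMED definition of Y.
import numpy as np

rng = np.random.default_rng(3)
mus = np.array([1.0, 4.0, 9.0])            # hypothetical Poisson means
ps = np.array([0.5, 0.3, 0.2])             # mixture weights, sum to 1
reps = 400_000

idx = rng.choice(3, size=reps, p=ps)       # mixture: random component label
w = rng.poisson(mus[idx])                  # then draw from that component
y = rng.poisson(mus, size=(reps, 3)) @ ps  # weighted sum of independent draws

print(w.mean(), y.mean())                  # both estimate sum(p_i * mu_i) = 3.5
print(w.var(), y.var())                    # mixture variance is much larger
```

Intuitively, W keeps the full spread of whichever component is drawn, while Y averages the components together, which shrinks the variance.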