A parameter ξ is obtained from n independent observations (x1, x2, ...). Construct a posterior probability density function p(ξ|x1,x2,...) for ξ.
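The exercise does not fix a likelihood, so as a minimal sketch assume a hypothetical Bernoulli model for the observations with a conjugate Beta prior on ξ; Bayes' rule then gives a closed-form posterior:

```python
import math

# Sketch (hypothetical model -- the exercise does not specify a likelihood):
# with Bernoulli(xi) observations and a Beta(a, b) prior on xi, Bayes' rule
# gives the conjugate posterior p(xi | x1..xn) = Beta(a + s, b + n - s),
# where s is the number of successes among the n observations.
def beta_posterior_params(data, a=1.0, b=1.0):
    s = sum(data)
    return a + s, b + len(data) - s

def beta_pdf(x, a, b):
    """Density of Beta(a, b), using the stdlib gamma function."""
    const = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return const * x ** (a - 1) * (1 - x) ** (b - 1)

# Example: 5 observations with 3 successes and a flat Beta(1, 1) prior.
a_post, b_post = beta_posterior_params([1, 0, 1, 1, 0], a=1.0, b=1.0)
post_mean = a_post / (a_post + b_post)  # posterior mean of xi
```

With a flat prior the posterior here is Beta(4, 3), so the posterior mean is 4/7; any other likelihood/prior pair follows the same normalize-the-product recipe.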
3. Given X1 ~ N(θ, 1) and X2 ~ N(2θ, 1) with unknown parameter θ, where X1 and X2 are independent. Derive the most powerful α-level test for H0 : θ = 0 vs. H1 : θ = 1 using both X1 and X2, and give an implementable form of this test. (Note that the sample X1 and X2 now have different distributions, but you can still write out the likelihood function for X1 and X2 jointly and then apply the Neyman–Pearson lemma as usual.)
Let N, X1, X2, ... be random variables over a probability space. Assume N takes nonnegative integer values. Let Z = max{X1, ..., XN} and W = min{X1, ..., XN}. Find the distribution functions of Z and W, supposing N, X1, X2, ... are independent random variables and the Xi have the same distribution function F, when a) N − 1 is a geometric random variable with parameter p (P(N = k), k = 1, 2, ...), and b) N − 1 is a Poisson random variable with...
Suppose X1,X2,…,Xn represent the outcomes of n independent
Bernoulli trials, each with success probability p. Note that we can
write the Bernoulli distribution as:
f(x; p) = p^x (1 − p)^(1−x) for x = 0, 1, and 0 otherwise. Given the Bernoulli distributional family and the iid sample of Xi's, the likelihood function is: L(p) = ∏_{i=1}^{n} p^{x_i} (1 − p)^{1−x_i}. a. Find an expression for p̂, the MLE of p...
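As a quick check on part (a), maximizing the standard Bernoulli likelihood gives the closed form p̂ = sample mean, which a short simulation confirms:

```python
import random

# Sketch (standard Bernoulli likelihood): maximizing
# L(p) = prod p^{x_i} (1 - p)^{1 - x_i} in p gives p_hat = sum(x_i) / n,
# the proportion of successes.
def bernoulli_mle(xs):
    """MLE of p for an iid Bernoulli(p) sample."""
    return sum(xs) / len(xs)

# Simulated data with true p = 0.3; p_hat should land nearby.
random.seed(1)
p_true = 0.3
sample = [1 if random.random() < p_true else 0 for _ in range(100_000)]
p_hat = bernoulli_mle(sample)
```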
Let X1, X2 be two independent exponential random variables with λ = 1. Compute P(X1 + X2 < t) using the joint density function. Then let Z be a gamma random variable with parameters (2, 1) and compute P(Z < t). What do you find by comparing P(X1 + X2 < t) and P(Z < t)? Finally, compare P(X1 + X2 + X3 < t) for Xi iid (independent and identically distributed) ~ Exp(1) with P(Z < t) for Z ~ Gamma(3, 1). (You don't have to compute this last case.)
(Hint: you can use the fact that Γ(2) = 1, Γ(3) = 2.)
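A numerical sanity check of the intended conclusion (a sketch, not part of the problem): the convolution integral for two iid Exp(1) variables gives 1 − e^(−t)(1 + t), which is exactly the Gamma(2, 1) CDF, and Monte Carlo agrees:

```python
import math
import random

# Sketch: integrating the joint density of two iid Exp(1) variables over
# {x1 + x2 < t} gives P(X1 + X2 < t) = 1 - exp(-t) * (1 + t), which is the
# Gamma(2, 1) CDF -- the comparison the exercise is driving at.
def gamma2_cdf(t):
    return 1 - math.exp(-t) * (1 + t)

# Monte Carlo estimate of P(X1 + X2 < t) at t = 2.
random.seed(2)
t = 2.0
n = 200_000
hits = sum(random.expovariate(1) + random.expovariate(1) < t for _ in range(n))
mc_estimate = hits / n
```

The same pattern extends to three terms: X1 + X2 + X3 ~ Gamma(3, 1), as the last part of the question suggests.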
Problem 2 [10 points] Let...
Let X1 and X2 be independent random variables with distribution functions F1 and F2, respectively. Let Y be a Bernoulli random variable with parameter p, and suppose that Y, X1, and X2 are independent. Prove, using the definition of the distribution function, that the distribution function of Z = Y·X1 + (1 − Y)·X2 is F = p·F1 + (1 − p)·F2. (Don't use moment generating functions or characteristic functions.)
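Before proving it, the mixture identity can be checked numerically; this sketch picks hypothetical concrete choices F1 = Exp(1), F2 = Uniform(0, 1), and p = 0.4 (none of which are fixed by the problem):

```python
import math
import random

# Sketch with hypothetical F1 = Exp(1), F2 = Uniform(0, 1), p = 0.4, just to
# illustrate the claim F_Z(z) = p*F1(z) + (1-p)*F2(z) by simulation.
random.seed(3)
p = 0.4
n = 200_000

def draw_z():
    y = 1 if random.random() < p else 0   # Bernoulli(p) switch
    x1 = random.expovariate(1)            # draw from F1
    x2 = random.random()                  # draw from F2
    return y * x1 + (1 - y) * x2

z = 0.5
empirical = sum(draw_z() <= z for _ in range(n)) / n
theoretical = p * (1 - math.exp(-z)) + (1 - p) * z  # p*F1(z) + (1-p)*F2(z)
```

The empirical CDF at z matches the p-weighted mixture of the two CDFs, which is precisely what the conditioning argument in the proof formalizes.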
Let X1, X2, ..., Xr be independent exponential random variables with parameter λ. a. Find the moment-generating function of Y = X1 + X2 + ... + Xr. b. What is the distribution of the random variable Y?
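A sketch of the expected answer: multiplying r identical exponential MGFs gives M_Y(t) = (λ/(λ − t))^r for t < λ, the MGF of a Gamma(r, λ) distribution. A simulation can compare an empirical MGF estimate against that closed form (λ = 2, r = 3, t = 0.5 are arbitrary illustrative choices):

```python
import math
import random

# Sketch: each Exp(lam) has MGF lam / (lam - t) for t < lam, so the sum of r
# independent copies has M_Y(t) = (lam / (lam - t))**r, i.e. Y ~ Gamma(r, lam).
random.seed(4)
lam, r, t = 2.0, 3, 0.5
n = 200_000

def draw_y():
    return sum(random.expovariate(lam) for _ in range(r))

empirical_mgf = sum(math.exp(t * draw_y()) for _ in range(n)) / n
closed_form = (lam / (lam - t)) ** r  # (2 / 1.5)^3 = 64/27
```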
Help PLEASE!
2. Find the MLE for the parameter θ. There are 3 observations, X1 = 1, X2 = 2, X3 = 1, with P(X = 1) = 3(1 − θ)/(3 − θ) and P(X = 2) = 2θ/(3 − θ).
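A sketch, assuming the garbled pmf reconstructs as P(X = 1) = 3(1 − θ)/(3 − θ) and P(X = 2) = 2θ/(3 − θ) (these sum to 1, which supports the reading): with observations 1, 2, 1 the log-likelihood is 2·log P(1) + log P(2), and a grid search locates its maximum.

```python
import math

# Sketch under the assumed pmf P(X=1) = 3(1-theta)/(3-theta),
# P(X=2) = 2*theta/(3-theta).  For the data 1, 2, 1 the log-likelihood is
# 2*log P(1) + log P(2); setting its derivative to zero gives the linear
# equation -7*theta + 3 = 0, i.e. theta_hat = 3/7 under this reconstruction.
def log_lik(theta):
    p1 = 3 * (1 - theta) / (3 - theta)
    p2 = 2 * theta / (3 - theta)
    return 2 * math.log(p1) + math.log(p2)

# Grid search over (0, 1) as a numerical check of the calculus.
grid = [k / 10000 for k in range(1, 10000)]
theta_hat = max(grid, key=log_lik)
```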
Let X1, X2, ..., Xn be a random sample of size n from a distribution with probability density function f(x; λ). Obtain the maximum likelihood estimator λ̂ of λ. Calculate an estimate using this maximum likelihood estimator when x1 = 0.10, x2 = 0.20, x3 = 0.30, x4 = 0.70.
Suppose you have a sample of n independent observations X1, X2, ..., Xn from a normal population with mean μ (known) and variance σ2 (unknown). (a) Find the ML estimator of σ2. (b) Show that the ML estimator in (a) is a consistent estimator of σ2. (c) Find a sufficient statistic for σ2. (d) Give a MVUE for σ2 based on the sufficient statistic.
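A sketch of the expected answer to (a): with μ known, maximizing the normal likelihood in σ2 gives σ̂2 = (1/n)·Σ(xi − μ)2, and a simulation illustrates the consistency claimed in (b):

```python
import random

# Sketch: with mu known, differentiating the normal log-likelihood in sigma^2
# gives sigma2_hat = (1/n) * sum((x_i - mu)^2).  It is a function of the
# sufficient statistic sum((x_i - mu)^2) and is unbiased for sigma^2.
def sigma2_mle(xs, mu):
    return sum((x - mu) ** 2 for x in xs) / len(xs)

# Consistency check: with a large sample, the estimate approaches sigma^2.
random.seed(5)
mu, sigma = 1.0, 2.0
sample = [random.gauss(mu, sigma) for _ in range(100_000)]
s2_hat = sigma2_mle(sample, mu)  # should be near sigma**2 = 4
```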
Minimum and maximum of n independent exponentials. Let X1, X2, ..., Xn be independent, each with exponential(λ) distribution. Let V = min(X1, X2, ..., Xn) and W = max(X1, X2, ..., Xn). Find the joint density of V and W.
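A useful intermediate fact when working this problem (a sketch, not the full joint density): the minimum of n iid Exp(λ) variables is itself Exp(nλ), since P(V > v) = P(all Xi > v) = e^(−nλv). A quick simulation confirms that marginal:

```python
import math
import random

# Sketch: V = min(X1, ..., Xn) for iid Exp(lam) satisfies
# P(V > v) = exp(-n * lam * v), i.e. V ~ Exp(n * lam).  Checking this
# marginal is a good sanity test before deriving the joint density of (V, W).
random.seed(6)
lam, n, reps = 1.0, 5, 200_000
v = 0.2
hits = sum(min(random.expovariate(lam) for _ in range(n)) > v for _ in range(reps))
survival = hits / reps                 # empirical P(V > v)
theoretical = math.exp(-n * lam * v)   # e^{-1}
```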