We are given that the per-trial probabilities for categories 3 and 4 are p3 = 1/8 and p4 = 3/8.
Therefore, the probability that a single trial falls in neither category 3 nor category 4 is 1 - (1/8) - (3/8) = 1/2.
Collapsing categories 1 and 2 into a single "other" category, (X3, X4, n - X3 - X4) is again multinomial, now with three categories and probabilities 1/8, 3/8, 1/2. The joint mass function of (X3, X4) is therefore
P(X3 = x3, X4 = x4) = probability of getting x3 trials in category 3, x4 trials in category 4, and the remaining (n - x3 - x4) trials in any other category.
Writing this out gives
P(X3 = x3, X4 = x4) = [n! / (x3! x4! (n - x3 - x4)!)] (1/8)^x3 (3/8)^x4 (1/2)^(n - x3 - x4), for x3, x4 >= 0 with x3 + x4 <= n.
This is the required joint PMF of (X3, X4).
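Not part of the original answer, but as a quick sanity check of the formula above: the short Python sketch below (using an assumed value n = 5 purely for illustration) compares the collapsed three-category PMF against a brute-force marginalization of the full four-category PMF over (x1, x2).

```python
from math import factorial

# Hypothetical illustration: n = 5 trials; the category probabilities
# (1/6, 1/3, 1/8, 3/8) come from the problem statement.
n = 5
p1, p2, p3, p4 = 1/6, 1/3, 1/8, 3/8

def multinomial_pmf(counts, probs):
    """Generic multinomial PMF: n!/(x1!...xk!) * p1^x1 * ... * pk^xk."""
    total = sum(counts)
    coef = factorial(total)
    for c in counts:
        coef //= factorial(c)
    prod = 1.0
    for c, q in zip(counts, probs):
        prod *= q ** c
    return coef * prod

def joint_x3_x4(x3, x4):
    """Derived answer: (X3, X4, 'everything else') is Multinomial(n; 1/8, 3/8, 1/2)."""
    return multinomial_pmf([x3, x4, n - x3 - x4], [p3, p4, 1 - p3 - p4])

def joint_x3_x4_brute(x3, x4):
    """Brute force: marginalize the full 4-category PMF over (x1, x2)."""
    rest = n - x3 - x4
    return sum(multinomial_pmf([x1, rest - x1, x3, x4], [p1, p2, p3, p4])
               for x1 in range(rest + 1))

for x3 in range(n + 1):
    for x4 in range(n + 1 - x3):
        assert abs(joint_x3_x4(x3, x4) - joint_x3_x4_brute(x3, x4)) < 1e-12
print("derived joint PMF matches the marginalized full PMF for n =", n)
```

The check passes for any choice of n, reflecting the general fact that grouping categories of a multinomial vector again yields a multinomial vector, which is why the derivation needs almost no computation.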
Question: Problem 3. Let (X1, X2, X3, X4) be Multinomial(n, 4; 1/6, 1/3, 1/8, 3/8). Derive the joint mass function of the pair (X3, X4). You should be able to do this with almost no computation.