Suppose that x1, ..., xn are some fixed predictors and that Y ...
Let N, X1, X2, ... be random variables over a probability space. It is assumed that N takes nonnegative integer values. Let Z = max{X1, ..., XN} and W = min{X1, ..., XN}. Find the distribution functions of Z and W, supposing that N, X1, X2, ... are independent random variables, that the Xi have the same distribution function F, and that
a) N − 1 is a geometric random variable with parameter p (P(N = k) = ..., k = 1, 2, ...);
b) N − 1 is a Poisson random variable with ...
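Conditioning on N gives the key identity P(Z ≤ z) = E[F(z)^N] whenever N is independent of the Xi. A Monte Carlo sanity check of that identity; the parameter values, the uniform Xi, and the choice of N geometric on {1, 2, ...} are illustrative assumptions, not part of the problem:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.4                 # illustrative geometric parameter
n_sims = 200_000

# Illustrative model: N geometric on {1, 2, ...} with P(N = k) = (1-p)^(k-1) p,
# and X_i ~ Uniform(0, 1), so F(z) = z for z in [0, 1].
N = rng.geometric(p, size=n_sims)
Z = np.array([rng.random(k).max() for k in N])

z = 0.7
# P(Z <= z) = E[F(z)^N]; for this N the probability generating function
# is E[s^N] = p*s / (1 - (1-p)*s), evaluated at s = F(z).
s = z
theory = p * s / (1 - (1 - p) * s)
empirical = (Z <= z).mean()
print(empirical, theory)
```

The same conditioning argument with P(W > w) = E[(1 − F(w))^N] handles the minimum.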
2. Suppose that X1, ..., Xn are sampled from the following truncated Poisson distribution:

P(X = x) = exp(−λ) λ^x / (x! π(r)) for x = r + 1, r + 2, ...,

for some integer r > ..., where π(r) = Σ_{k=r+1}^{∞} P(k) = 1 − Σ_{k=0}^{r} exp(−λ) λ^k / k!. Such a sample might arise if we were sampling from a Poisson population but were unable...
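Left-truncated Poisson samples of this kind can be generated by rejection: draw from the full Poisson distribution and discard values ≤ r. A minimal sketch, with λ, r, and the checked value x all chosen purely for illustration:

```python
import math
import numpy as np

rng = np.random.default_rng(1)
lam, r = 3.0, 1                      # illustrative values, not from the problem
draws = rng.poisson(lam, size=500_000)
sample = draws[draws > r]            # rejection: keep only values r+1, r+2, ...

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

# pi(r) = P(X > r) = 1 - sum_{k=0}^{r} e^{-lam} lam^k / k!
pi_r = 1 - sum(poisson_pmf(k, lam) for k in range(r + 1))

x = 4
emp = (sample == x).mean()           # empirical truncated pmf at x
exact = poisson_pmf(x, lam) / pi_r   # P(X = x) under the truncated law
print(emp, exact)
```

The acceptance probability is π(r), so roughly n/π(r) raw Poisson draws are needed to obtain n truncated samples.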
Let X1, X2, ... be a sequence of random variables. Suppose that Xn → a in probability for some a ∈ R. Show that (Xn) is Cauchy convergent in probability; that is, show that for all ε > 0 we have P(|Xn − Xm| > ε) → 0 as n, m → ∞. Is the converse true? (Prove it if "yes"; find a counterexample if "no".)
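One way to start the forward direction: from the triangle inequality |Xn − Xm| ≤ |Xn − a| + |Xm − a|, the event {|Xn − Xm| > ε} can occur only if at least one of the two terms exceeds ε/2, giving

```latex
P\bigl(|X_n - X_m| > \varepsilon\bigr)
  \le P\bigl(|X_n - a| > \varepsilon/2\bigr)
    + P\bigl(|X_m - a| > \varepsilon/2\bigr),
```

and both terms on the right tend to 0 as n, m → ∞ by the assumed convergence in probability.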
4. Let X1, X2, ... be independent random variables satisfying E(X_n^4) ≤ B for some finite B > 0 and all n.
(a) Show that the Y_n = X_n − E(X_n) are independent and satisfy E(Y_n) = 0 and E(Y_n^4) ≤ 16B.
(b) Show that for Ȳ_n = (Y_1 + ... + Y_n)/n,

E(Ȳ_n^4) = (1/n^4) Σ_{i=1}^{n} E(Y_i^4) + (6/n^4) Σ_{i<j} E(Y_i^2 Y_j^2) ≤ 16B/n^3 + ...

(c) Show that Σ_n P(|Ȳ_n| > ε) < ∞ and conclude that Ȳ_n → 0 almost surely.
(d) Show that ...
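A numerical illustration of the conclusion in (c), under an assumed concrete model Xn ~ Uniform(0, 1) (so fourth moments are bounded and E(Xn) = 1/2); this only illustrates the statement, it is not a proof:

```python
import numpy as np

rng = np.random.default_rng(2)
# Assumed model for illustration: X_n ~ Uniform(0, 1), so E(X_n^4) <= 1
# and Y_n = X_n - E(X_n) = X_n - 0.5.
X = rng.random(100_000)
Y = X - 0.5
# Running means Ybar_n = (Y_1 + ... + Y_n) / n
Ybar = np.cumsum(Y) / np.arange(1, len(Y) + 1)
print(abs(Ybar[99]), abs(Ybar[-1]))  # typically much smaller at the larger n
```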
2. Suppose that {X1, ..., Xn} are independent and identically distributed random variables from a distribution with p.d.f.

f(x) = θ e^(−θx) if x > 0, and f(x) = 0 if x ≤ 0.

Let Y = min_{1≤i≤n} X_i. Find the p.d.f. of Y.
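The standard route is P(Y > y) = P(X1 > y)^n = e^(−nθy), so Y ~ Exp(nθ) with p.d.f. nθ e^(−nθy). A simulation check, with θ, n, and the evaluation point y chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n, sims = 2.0, 5, 200_000     # illustrative values, not from the problem
X = rng.exponential(scale=1/theta, size=(sims, n))
Y = X.min(axis=1)                    # Y = min of n iid Exp(theta) draws

# If each X_i ~ Exp(theta), then P(Y > y) = (e^{-theta y})^n = e^{-n theta y}.
y = 0.1
emp = (Y > y).mean()
theo = np.exp(-n * theta * y)
print(emp, theo)
```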
Q3. Suppose X1, X2, ..., Xn are i.i.d. Poisson random variables with expected value λ. It is well known that X̄ is an unbiased estimator for λ because λ = E(X̄).
1. Show that (X1 + Xn)/2 is also an unbiased estimator for λ.
2. Show that S² = Σ_{i=1}^n (Xi − X̄)² / (n − 1) is also an unbiased estimator for λ.
3. Find MSE(S²). (We will need two facts.)
Fact 1: Var(S²) = ... (see ...com/questions/2476527/variance-of-sample-variance).
Fact 2: For the Poisson distribution, E[(X − μ)^4] = 3λ² + λ. (See ...)
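A simulation check of part 2 (E(S²) = λ for Poisson data), with λ and n chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
lam, n, sims = 3.0, 10, 100_000      # illustrative values, not from the problem
X = rng.poisson(lam, size=(sims, n))
S2 = X.var(axis=1, ddof=1)           # sample variance with the 1/(n-1) divisor
print(S2.mean())                     # close to lam if S^2 is unbiased
```

Since S² is unbiased, MSE(S²) = Var(S²), which is where the two quoted facts come in.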
Problem D: Suppose X1, ..., X4 are independent random variables. Let Y be their sum, that is, Y = Σ_{i=1}^{4} Xi. Find/prove the mgf of Y and find E(Y), Var(Y), and P(8 ≤ Y ≤ 9) if
a) X1, ..., X4 are Poisson random variables with means 5, 1, 4, and 2, respectively;
b) [separately from part a)] X1, ..., X4 are Geometric random variables with p = 3/4.
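For part a), independence gives M_Y(t) = Π M_{Xi}(t) = exp((5 + 1 + 4 + 2)(e^t − 1)), so Y ~ Poisson(12) and E(Y) = Var(Y) = 12. A quick numerical evaluation, reading the event as {Y ∈ {8, 9}}:

```python
import math

# Y = X_1 + ... + X_4 ~ Poisson(5 + 1 + 4 + 2) = Poisson(12)
lam = 12

def pmf(k):
    return math.exp(-lam) * lam**k / math.factorial(k)

prob = pmf(8) + pmf(9)   # P(8 <= Y <= 9) for integer-valued Y
print(prob)
```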
(a) Suppose X ∼ Poisson(λ) and Y ∼ Poisson(γ) are independent; prove that X + Y ∼ Poisson(λ + γ).
(b) Let X1, ..., Xn be an i.i.d. random sample from Poisson(λ); provide a sufficient statistic for λ and justify your answer.
(c) Under the setting of part (b), show that λ̂ = (1/n) Σ_{i=1}^n Xi is a consistent estimator of λ.
(d) Use the Central Limit Theorem to find an asymptotic normal distribution for λ̂ defined in part (c); justify...
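A Monte Carlo check of part (a); the rates λ = 2 and γ = 3 and the checked value k are assumptions chosen only for illustration:

```python
import math
import numpy as np

rng = np.random.default_rng(5)
lam, gam = 2.0, 3.0                  # illustrative rates
S = rng.poisson(lam, 300_000) + rng.poisson(gam, 300_000)

k = 4
# If the claim holds, S ~ Poisson(lam + gam) = Poisson(5).
target = math.exp(-(lam + gam)) * (lam + gam)**k / math.factorial(k)
emp = (S == k).mean()
print(emp, target)
```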
Let λ > 0 and suppose that X1, X2, ..., Xn are i.i.d. random variables with Xi ∼ Exp(λ). Find the PDF of X1 + ··· + Xn. Use the convolution formula and prove by induction.
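A sketch of the induction step: assuming S_{n−1} = X1 + ··· + X_{n−1} has the Erlang/Gamma(n − 1, λ) density λ^(n−1) s^(n−2) e^(−λs)/(n − 2)!, the convolution formula gives

```latex
f_{S_n}(t) = \int_0^t f_{S_{n-1}}(s)\, \lambda e^{-\lambda (t-s)}\, ds
           = \frac{\lambda^n e^{-\lambda t}}{(n-2)!} \int_0^t s^{n-2}\, ds
           = \frac{\lambda^n t^{n-1} e^{-\lambda t}}{(n-1)!}, \qquad t > 0,
```

which is the Gamma(n, λ) density, completing the induction.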
Suppose the random variables X1, X2, ..., Xn are independent, each with the distribution f(x; θ) = ... Find the maximum likelihood estimate for θ.
a) ... Σ_{i=1}^n ln(x_i) + n ln(2) ...
b) ...
c) ...
d) ... Σ_{i=1}^n ln(x_i) − n ln(2) ...
e) None of the above.