Let Mn be the maximum of n independent U(0, 1) random variables.
a. Derive the exact expression for P(|Mn − 1| > ε). Hint: see Section 8.4.
b. Show that limn→∞ P(|Mn − 1| > ε) = 0. Can this be derived from Chebyshev's inequality or the law of large numbers?
The pdf of U(a, b) is 1/(b − a) for a ≤ x ≤ b, and 0 elsewhere.
The expression in (a) can be expanded further using the binomial theorem.
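A minimal derivation sketch for (a) and (b), using only the fact that Mn ≤ 1, so the event |Mn − 1| > ε is the event Mn < 1 − ε:

```latex
P(|M_n - 1| > \varepsilon) = P(M_n < 1 - \varepsilon)
  = \prod_{i=1}^{n} P(U_i < 1 - \varepsilon)
  = (1 - \varepsilon)^n, \qquad 0 < \varepsilon < 1,
```

and since 0 ≤ 1 − ε < 1, (1 − ε)^n → 0 as n → ∞. For (b): Chebyshev also suffices, because E[Mn] = n/(n + 1) → 1 and Var(Mn) → 0; the law of large numbers does not apply directly, since Mn is a maximum, not a sample mean.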
Let X1, X2, … be independent random variables with Xn ∼ U(−1/n, 1/n), and let X be a random variable with P(X = 0) = 1. (a) What is the CDF of Xn? (b) Does Xn converge to X in distribution? In probability?
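Since |Xn| ≤ 1/n, we get P(|Xn − 0| > ε) = 0 as soon as n ≥ 1/ε, which gives convergence in probability (and hence in distribution). A quick stdlib-only simulation sketch of this (function name is my own):

```python
import random

random.seed(0)

def max_abs_xn(n, reps=10_000):
    """Largest |Xn| among reps draws of Xn ~ U(-1/n, 1/n)."""
    return max(abs(random.uniform(-1 / n, 1 / n)) for _ in range(reps))

# P(|Xn| > eps) = 0 once n >= 1/eps: even the most extreme of
# 10,000 draws is bounded by 1/n, so Xn -> 0 in probability.
for n in (1, 10, 100, 1000):
    print(n, max_abs_xn(n))
```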
2) Let X1, …, Xn be i.i.d. N(0, 1) random variables. Define U = … . Find the limiting distribution of Zn. (Hint: recall that if X and Y are independent N(0, 1) random variables, then X/Y has a Cauchy distribution.)
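Whatever the exact definition of Zn (garbled in the statement), the hint's fact can be checked empirically: the ratio of two independent N(0, 1) variables is standard Cauchy, whose quartiles are exactly ±1. A stdlib-only sketch (function name is mine):

```python
import random
import statistics

random.seed(1)

def normal_ratio_quartiles(reps=200_000):
    """Sample X/Y with X, Y independent N(0, 1); return (Q1, Q3)."""
    ratios = [random.gauss(0, 1) / random.gauss(0, 1) for _ in range(reps)]
    q1, _, q3 = statistics.quantiles(ratios, n=4)
    return q1, q3

q1, q3 = normal_ratio_quartiles()
print(q1, q3)  # a standard Cauchy has Q1 = -1, Q3 = +1
```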
3. (15 Points) Let X1 ∼ Bernoulli(p) and X2 ∼ Bernoulli(3p) be independent Bernoulli random variables, where p ∈ [0, 1/3]. Derive the maximum likelihood estimator (MLE) of p. Denote it by p̂.
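A brute-force grid check of this MLE (my own sketch); the closed-form answer it supports is p̂ = 0, 1/6, 1/3, 1/3 for (x1, x2) = (0,0), (1,0), (0,1), (1,1) respectively, where the last two hit the boundary p = 1/3:

```python
def likelihood(p, x1, x2):
    """L(p) = p^x1 (1-p)^(1-x1) * (3p)^x2 (1-3p)^(1-x2)."""
    return p**x1 * (1 - p) ** (1 - x1) * (3 * p) ** x2 * (1 - 3 * p) ** (1 - x2)

def mle_grid(x1, x2, steps=3000):
    """Maximise L over an even grid of p in [0, 1/3]."""
    grid = [i / (3 * steps) for i in range(steps + 1)]
    return max(grid, key=lambda p: likelihood(p, x1, x2))

for x1, x2 in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    print((x1, x2), mle_grid(x1, x2))
```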
Let X1, X2, …, Xn be a collection of independent discrete random variables that all take the value 1 with probability p and the value 0 with probability (1 − p). a) Compute the mean and the variance of X1 (which are the same for X2, X3, etc.). b) Use your answer to (a) to compute the mean and variance of p̂ = (1/n)(X1 + X2 + ⋯ + Xn), which is the proportion of "ones" observed in the n instances of Xi. c) Suppose...
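A sketch of the standard answers to (a) and (b), using linearity of expectation and independence:

```latex
E[X_1] = p, \qquad
\operatorname{Var}(X_1) = E[X_1^2] - (E[X_1])^2 = p - p^2 = p(1-p),
```

```latex
E[\hat p] = \frac{1}{n}\sum_{i=1}^{n} E[X_i] = p, \qquad
\operatorname{Var}(\hat p) = \frac{1}{n^2}\sum_{i=1}^{n} \operatorname{Var}(X_i) = \frac{p(1-p)}{n}.
```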
4. Let Z1, Z2, … be a sequence of independent standard normal random variables. Define X0 = 0 and Xn+1 = … for n = 0, 1, 2, …. The stochastic process Xn, n = 0, 1, 2, 3, … is a Markov chain, but with a continuous state space. (a) Find E[Xn] and Var(Xn). (b) Give the probability distribution of Xn. (c) Find limn→∞ P(Xn > ε) for any ε > 0. (d) Simulate two realisations of the Markov process from n = 0 until...
1.2 Let Y1 and Y2 be independent random variables with Y1 ∼ N(0, 1) and Y2 ∼ N(3, 4). (a) What is the distribution of Y1²? (b) If y = (Y1, (Y2 − 3)/2)ᵀ, obtain an expression for yᵀy. What is its distribution? (c) If y ∼ MVN(μ, V), obtain an expression for yᵀV⁻¹y. What is its distribution?
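Assuming y = (Y1, (Y2 − 3)/2)ᵀ, as the garbled statement seems to intend, both components are independent standard normal, so a sketch of the key facts (the quadratic form in (c) is chi-squared once centred at μ):

```latex
y^\top y = Y_1^2 + \left(\frac{Y_2 - 3}{2}\right)^2 \sim \chi^2_2,
\qquad
(y - \mu)^\top V^{-1} (y - \mu) = \left\| V^{-1/2}(y - \mu) \right\|^2 \sim \chi^2_2,
```

since the components of y (respectively of V^{−1/2}(y − μ)) are independent N(0, 1), and a sum of k independent squared standard normals is χ²_k.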
2. Let U and V be independent random variables, with P(U = 1) = P(V = 1) = 1/4 and P(U = −1) = P(V = −1) = 3/4. Define X = U/V and Y = UV. (a) Give the joint pmf of X and Y. [4] (b) Calculate Cov(X, Y). [4]
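The probabilities above are garbled; a plausible reading is P(U = 1) = P(V = 1) = 1/4 and P(U = −1) = P(V = −1) = 3/4. Under that assumption the joint pmf and covariance can be enumerated exactly. Note that with U, V ∈ {−1, 1} we have 1/V = V, so X = U/V and Y = UV coincide:

```python
from itertools import product

# Assumed marginals (reconstructed from the garbled statement):
p_u = {1: 0.25, -1: 0.75}
p_v = {1: 0.25, -1: 0.75}

joint = {}  # pmf of (X, Y) = (U/V, U*V)
for (u, pu), (v, pv) in product(p_u.items(), p_v.items()):
    key = (u / v, u * v)
    joint[key] = joint.get(key, 0.0) + pu * pv

ex = sum(x * p for (x, _), p in joint.items())
ey = sum(y * p for (_, y), p in joint.items())
exy = sum(x * y * p for (x, y), p in joint.items())
cov = exy - ex * ey
print(joint)  # X and Y are equal with probability 1
print(cov)    # Cov(X, Y) = Var(UV) = 1 - (1/4)^2 = 15/16
```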
(a) Let X and Y be independent random variables, both with the same mean μ ≠ 0. Define a new random variable W = aX + bY, where a and b are constants. (i) Obtain an expression for E(W). (ii) What constraint is there on the values of a and b so that W is an unbiased estimator of μ? Hence write all unbiased versions of W as a formula involving a, X and Y only (and not b). [2]
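A sketch of (i) and (ii):

```latex
E[W] = aE[X] + bE[Y] = (a + b)\mu,
\qquad\text{so}\qquad
E[W] = \mu \iff (a + b)\mu = \mu \iff a + b = 1 \ (\text{since } \mu \neq 0),
```

hence every unbiased version of W has the form W = aX + (1 − a)Y.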
(4 marks) Let m be an integer with m ≥ 2. Suppose X1, X2, …, Xm are independent Binomial(n, p) random variables, where n is known and p is unknown, with p ∈ (0, 1). Write down the expression for the likelihood function L(p). We assume that min(x1, …, xm) < n and max(x1, …, xm) > 0. (5 marks) Find dL/dp, and give all possible solutions to the equation dL/dp = 0...
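A sketch of the likelihood for i.i.d. Binomial(n, p) observations:

```latex
L(p) = \prod_{i=1}^{m} \binom{n}{x_i}\, p^{x_i} (1-p)^{n-x_i}
     = \left[\prod_{i=1}^{m} \binom{n}{x_i}\right] p^{\sum_i x_i}\,(1-p)^{mn - \sum_i x_i},
```

so on (0, 1) the equation dL/dp = 0 is equivalent to d log L/dp = Σᵢxᵢ/p − (mn − Σᵢxᵢ)/(1 − p) = 0, whose unique solution is p̂ = Σᵢxᵢ/(mn); the side conditions min xᵢ < n and max xᵢ > 0 guarantee 0 < p̂ < 1.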
1. (10pts) Let U1, U2, …, Un be independent uniform random variables over [0, θ], θ > 0, with probability density function (p.d.f.) f(u) = 1/θ for u ∈ [0, θ]. Let U(1), U(2), …, U(n) be the order statistics. Also let X = U(1)/U(n) and Y = U(n). (a) (5pts) Find the joint probability density function of (X, Y). (b) (5pts) From part (a), show that X and Y are independent variables.
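A derivation sketch for (a) and (b): the joint pdf of (U(1), U(n)) is n(n − 1)(v − u)^{n−2}/θⁿ for 0 < u < v < θ; substituting u = xy, v = y (Jacobian y) factorises the density:

```latex
f_{X,Y}(x, y) = \frac{n(n-1)\,(y - xy)^{n-2}}{\theta^n}\, y
  = \underbrace{(n-1)(1-x)^{n-2}}_{f_X(x)} \cdot \underbrace{\frac{n\, y^{n-1}}{\theta^n}}_{f_Y(y)},
  \qquad 0 < x < 1,\ 0 < y < \theta,
```

a product of a function of x and a function of y on a product region, which proves independence; each factor also integrates to 1, so they are the marginal densities.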