Let X0 and X1 be independent exponentially distributed random variables with respective parameters λ0 and ...
Let X1, ..., Xn be independent random variables with Xi ∼ N(μi, σ²). Let Q = (X1 − μ1)² + ... + (Xn − μn)². Find E(Q) as a function of σ and n.
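A quick Monte Carlo sketch to sanity-check the answer to the preceding problem. Since each (Xi − μi)/σ is standard normal, Q/σ² is chi-squared with n degrees of freedom, so E(Q) = nσ²; the means below are arbitrary illustrative values.

```python
import random

# Monte Carlo check that E(Q) = n * sigma^2 (Q/sigma^2 ~ chi^2_n).
random.seed(0)
sigma = 2.0
mus = [1.0, -3.0, 0.5, 2.0, 10.0]   # arbitrary illustrative means; n = 5
n = len(mus)
trials = 200_000
total = 0.0
for _ in range(trials):
    # Q = sum of squared deviations of X_i ~ N(mu_i, sigma^2) from mu_i
    total += sum((random.gauss(mu, sigma) - mu) ** 2 for mu in mus)
est = total / trials
print(est)   # should be close to n * sigma^2 = 20
```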
(5) Let X1, X2, ..., Xn be independent identically distributed (i.i.d.) random variables from U(0,1). Denote V = max{X1, ..., Xn} and W = min{X1, ..., Xn}. (a) Find the distributions and the densities of V and W. (b) Find E(V) and E(W).
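A simulation sketch for checking part (b): the standard results E(V) = n/(n+1) and E(W) = 1/(n+1) for the max and min of n i.i.d. U(0,1) variables, estimated empirically with n = 10.

```python
import random

# Estimate E(V) and E(W) for V = max and W = min of n i.i.d. U(0,1) draws.
random.seed(1)
n, trials = 10, 100_000
sum_v = sum_w = 0.0
for _ in range(trials):
    xs = [random.random() for _ in range(n)]
    sum_v += max(xs)
    sum_w += min(xs)
ev, ew = sum_v / trials, sum_w / trials
print(ev, ew)   # near n/(n+1) = 10/11 ≈ 0.909 and 1/(n+1) = 1/11 ≈ 0.091
```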
Let X1, X2, ... be independent random variables, Xn ∼ U(−1/n, 1/n). Let X be a random variable with P(X = 0) = 1. (a) What is the CDF of Xn? (b) Does Xn converge to X in distribution? In probability?
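A small simulation illustrating what the convergence in the preceding problem looks like: the mass of Xn ∼ U(−1/n, 1/n) piles up at 0, so P(|Xn| ≥ ε) drops to exactly 0 once 1/n ≤ ε (the choice ε = 0.05 is illustrative).

```python
import random

# Empirical P(|X_n| >= eps) for X_n ~ U(-1/n, 1/n) and a fixed eps.
random.seed(2)
eps = 0.05
fracs = {}
for n in (10, 100, 1000):
    draws = [random.uniform(-1 / n, 1 / n) for _ in range(10_000)]
    fracs[n] = sum(abs(x) >= eps for x in draws) / len(draws)
print(fracs)   # ~0.5 for n = 10; exactly 0 once 1/n <= eps
```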
(a) Suppose that X1, X2, ... are independent and identically distributed random variables, each taking the value 1 with probability p and the value −1 with probability 1 − p. For n = 1, 2, ..., define Yn = X1 + X2 + ... + Xn. Is {Yn} a Markov chain? If so, write down its state space and transition probability matrix. (b) Let X1, X2, ... be independent and identically distributed random variables taking values in {0, 1, 2, ...} with probabilities pi = P(X1 = i). Define Yn = min(X1, X2, ..., Xn). Is {Yn} a Markov chain? If so, write down its state space and transition probability matrix.
3. Let U1, U2, ... be a sequence of independent Ber(p) random variables. Define X0 = 0 and Xn+1 = Xn + 2Un+1 − 1 for n = 0, 1, 2, .... (a) Show that Xn, n = 0, 1, 2, ..., is a Markov chain, and give its transition graph. (b) Find E(Xn) and Var(Xn). (c) Give P(X ...
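A simulation sketch for part (b), assuming (as the recurrence suggests) that each step of the chain is +1 with probability p and −1 with probability 1 − p, which gives E(Xn) = n(2p − 1) and Var(Xn) = 4np(1 − p).

```python
import random

# Simulate the walk X_{k+1} = X_k + 2U - 1 with U ~ Ber(p), starting at 0.
random.seed(3)
p, n, trials = 0.7, 50, 50_000
finals = []
for _ in range(trials):
    x = 0
    for _ in range(n):
        u = 1 if random.random() < p else 0
        x += 2 * u - 1          # +1 w.p. p, -1 w.p. 1-p
    finals.append(x)
mean = sum(finals) / trials
var = sum((x - mean) ** 2 for x in finals) / trials
print(mean, var)   # near n(2p-1) = 20 and 4np(1-p) = 42
```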
(4 marks) Let m be an integer with the property that m ≥ 2. Consider that X1, X2, ..., Xm are independent Binomial(n, p) random variables, where n is known and p is unknown. Note that p ∈ (0, 1). Write down the expression of the likelihood function L(p). We assume that min(x1, ..., xm) < n and max(x1, ..., xm) > 0. (5 marks) Find dL/dp, and give all possible solutions to the equation dL/dp = 0.
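A numerical sketch for the second part. Under the stated conditions the interior stationary point of L(p) should be p̂ = (x1 + ... + xm)/(mn), the standard binomial MLE; the observed values below are hypothetical, and the grid search over the log-likelihood is just a check of that solution.

```python
import math

xs = [3, 5, 2, 7]     # hypothetical observed values x_1, ..., x_m
n = 10                # known number of trials per X_i
m = len(xs)

def loglik(p):
    # log L(p) = sum over i of log C(n, x_i) + x_i log p + (n - x_i) log(1 - p)
    return sum(math.log(math.comb(n, x)) + x * math.log(p)
               + (n - x) * math.log(1 - p) for x in xs)

grid = [i / 1000 for i in range(1, 1000)]
best = max(grid, key=loglik)
print(best, sum(xs) / (m * n))   # grid maximizer matches sum(x_i)/(m*n)
```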
4. Let x0, x1, ..., xn ∈ R, let ε0, ε1, ..., εn be independent normally distributed random variables with common mean 0 and common variance σ², and suppose Yi = a + b·xi + εi. Let â, b̂ and σ̂² be the maximum likelihood estimators of a, b and σ². Note that these expressions involve only the training data (x1, Y1), ..., (xn, Yn); they omit the test data (x0, Y0). The training error of our regression model is ..., while its test (prediction) error is .... We know that .... In this exercise, we prove MSE = (1 + 1/n + (x0 − x̄)²/Σi(xi − x̄)²)·σ². Note that ...
Let X1, Y1, X2, Y2, ..., X16, Y16 be independent random variables, uniformly distributed in [0, 1], and let W = .... Find a numerical approximation to P(|W − E(W)| < 0.001). HINT: Use the central limit theorem.
Let N, X1, X2, ... be random variables over a probability space. It is assumed that N takes nonnegative integer values. Let Z = max{X1, ..., XN} and W = min{X1, ..., XN}. Find the distribution functions of Z and W, supposing that N, X1, X2, ... are independent random variables, the Xi have the same distribution function F, and (a) N − 1 is a geometric random variable with parameter p (so that P(N = k) = p(1 − p)^(k−1), k = 1, 2, ...); (b) N − 1 is a Poisson random variable with ...
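A simulation sketch for part (a). Conditioning on N gives F_Z(z) = E[F(z)^N], and for P(N = k) = p(1 − p)^(k−1) this probability generating function evaluates to pF(z)/(1 − (1 − p)F(z)); the check below uses Xi ∼ U(0,1), so F(z) = z, with illustrative values of p and z.

```python
import random

# Compare the empirical P(Z <= z) against p*z / (1 - (1-p)*z)
# for Z = max of a geometric number N of U(0,1) draws.
random.seed(4)
p, z, trials = 0.3, 0.8, 100_000
hits = 0
for _ in range(trials):
    n_draw = 1
    while random.random() >= p:     # P(N = k) = p(1-p)^(k-1), k = 1, 2, ...
        n_draw += 1
    zmax = max(random.random() for _ in range(n_draw))
    hits += zmax <= z
emp = hits / trials
theo = p * z / (1 - (1 - p) * z)
print(emp, theo)   # both ≈ 0.545
```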
Consider a sequence of random variables X1, ..., Xn, ..., where for each n, Xn ∼ tn. We will use Slutsky's Theorem to show that, as the degrees of freedom go to infinity, the distribution converges to a standard normal. (a) Let V1, ..., Vn, ... be such that Vn ∼ χ²n. Find the value b such that Vn/n converges in probability to b. (b) Letting U ∼ N(0, 1), show that Tn = U/√(Vn/n) ∼ tn and that Tn converges in distribution to N(0, 1).
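A simulation sketch of the conclusion: sampling Tn = U/√(Vn/n) with U ∼ N(0,1) and Vn built as a sum of n squared standard normals, the sample mean and variance for large n are close to the N(0, 1) values 0 and 1 (the exact tn variance is n/(n − 2)).

```python
import math
import random
import statistics

# Sample T_n = U / sqrt(V_n / n); for large n this is nearly N(0, 1).
random.seed(5)
n, trials = 200, 20_000
samples = []
for _ in range(trials):
    u = random.gauss(0, 1)
    v = sum(random.gauss(0, 1) ** 2 for _ in range(n))   # V_n ~ chi^2_n
    samples.append(u / math.sqrt(v / n))
m_est = statistics.mean(samples)
v_est = statistics.variance(samples)
print(m_est, v_est)   # mean ≈ 0, variance ≈ n/(n-2) ≈ 1.01
```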