Only need parts c, e, j, m, and p
Help me find: suppose X_i ~ N(μ, σ²) for i = 1, ..., n and Z_i ~ N(0, 1) for i = 1, ..., k, and all variables are independent. State the distribution of each of the following variables, parts (a)-(p), if it is a "named" distribution, or otherwise state "unknown":
(b) X_1 + 2X_3
...
(d) Σ_{i=1}^{k} Z_i²
(e) (n − 1)S²/σ² = Σ_{i=1}^{n} (X_i − X̄)²/σ²
...
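Several of the listed parts are unreadable in this copy, so as a sanity check here is a Python sketch (numpy assumed) simulating two forms that typically appear in such lists; these specific forms are my assumption, not necessarily the parts asked about:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, reps = 10, 5, 200_000
mu, sigma = 2.0, 3.0

# Simulate sqrt(n) * (Xbar - mu) / sigma: should be N(0, 1).
X = rng.normal(mu, sigma, size=(reps, n))
T = np.sqrt(n) * (X.mean(axis=1) - mu) / sigma
print(round(T.mean(), 2), round(T.var(), 2))   # approximately 0 and 1

# Simulate the sum of Z_i^2: should be chi-square with k degrees of
# freedom, hence mean k and variance 2k.
Z = rng.normal(0.0, 1.0, size=(reps, k))
S = (Z**2).sum(axis=1)
print(round(S.mean(), 1), round(S.var(), 1))   # approximately 5 and 10
```

Matching a simulated mean and variance against the named distribution's is a quick way to confirm (or rule out) a guess for any of the parts.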
5. Let X_1, X_2, ..., X_n be a random sample from a distribution with finite variance. Show that
(i) Cov(X_i − X̄, X̄) = 0 for each i, and
(ii) ρ(X_i − X̄, X_j − X̄) = −1/(n − 1) for i ≠ j, i, j = 1, ..., n.
(Recall that for any two random variables X and Y, ρ = Cov(X, Y)/√(Var(X) Var(Y)).)
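Before writing the proof, both claims can be checked numerically. A Python sketch (the Exponential(1) choice is arbitrary; any distribution with finite variance works, which is the point of the problem):

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 6, 400_000

# Draw many samples of size n from Exponential(1).
X = rng.exponential(1.0, size=(reps, n))
Xbar = X.mean(axis=1)
D = X - Xbar[:, None]          # deviations X_i - Xbar

# (i) Cov(X_i - Xbar, Xbar) should be ~ 0.
cov_i = np.mean(D[:, 0] * (Xbar - Xbar.mean()))
print(round(cov_i, 3))

# (ii) corr(X_i - Xbar, X_j - Xbar) should be ~ -1/(n-1) = -0.2 here.
rho = np.corrcoef(D[:, 0], D[:, 1])[0, 1]
print(round(rho, 2))           # approximately -0.2
```

The negative correlation in (ii) makes intuitive sense: the deviations from the sample mean sum to zero, so one being large forces the others down.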
EXERCISE 6. Let Z_1, Z_2, ..., Z_16 be an i.i.d. sample of size 16 from the standard normal distribution N(0, 1). Let X_1, X_2, ..., X_64 be an i.i.d. sample of size 64 from the normal distribution N(μ, 1). The two samples are independent.
a. What is the distribution of Y, where Y = Σ_{i=1}^{16} Z_i² + Σ_{i=1}^{64} (X_i − μ)²?
b. Find E[Y].
c. Find Var(Y).
d. Approximate P(Y > 105).
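Assuming part d asks for P(Y > 105): Y is a sum of 16 + 64 = 80 independent squared standard normals, so Y ~ χ²_80 with E[Y] = 80 and Var(Y) = 160, and a normal approximation of the tail follows. A Python sketch:

```python
from math import erf, sqrt

# Y ~ chi-square with 80 degrees of freedom.
df = 16 + 64
mean_Y = df          # E[Y] = 80
var_Y = 2 * df       # Var(Y) = 160

# CLT approximation: P(Y > 105) ~= 1 - Phi((105 - 80) / sqrt(160)).
z = (105 - mean_Y) / sqrt(var_Y)
Phi = 0.5 * (1 + erf(z / sqrt(2)))   # standard normal CDF via erf
print(round(1 - Phi, 3))             # about 0.024
```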
9. Consider the following hidden Markov model (HMM) (this is the same HMM as in the previous HMM problem):
· X = (X_1, ..., X_n) ∈ {0, 1}^n [i.e., X is a binary sequence of length n] and Y = (Y_1, ..., Y_n) ∈ ℝ^n [i.e., Y is a sequence of n real numbers].
· X_1 ~ Bernoulli(1/2), and P(X_{t+1} ≠ X_t | X_t) = p. [p is the switching probability; when p is small the Markov chain likes to stay in the same state]
· Conditioned on X, the random variables Y_1, ..., Y_n are ...
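A minimal sketch of the hidden chain, assuming the transition rule stated above (stay with probability 1 − p, flip with probability p); the emission model for Y is truncated in the problem, so only X is sampled here:

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_chain(n, p):
    """Sample X_1 ~ Bernoulli(1/2), then flip the state with probability p."""
    x = np.empty(n, dtype=int)
    x[0] = rng.integers(0, 2)
    flips = rng.random(n - 1) < p
    for t in range(1, n):
        x[t] = 1 - x[t - 1] if flips[t - 1] else x[t - 1]
    return x

# With small p the chain rarely switches state.
x = sample_chain(10_000, 0.01)
switch_rate = np.mean(x[1:] != x[:-1])
print(round(switch_rate, 3))   # close to p = 0.01
```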
2. The random variables X_1, X_2 and X_3 are independent, with X_1 ~ N(0, 1), X_2 ~ N(1, 4) and X_3 ~ N(−1, 2). Consider the random column vector X = [X_1, X_2, X_3]^T.
(a) Write X in the form X = μ + BZ, where Z is a vector of i.i.d. standard normal random variables, μ is a 3 × 1 vector, and B is a 3 × 3 matrix.
(b) What is the covariance matrix of X?
(c) Determine the expectation of Y_1 = X_1 + X_3.
(d) Determine the distribution of Y_2 ...
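A sketch of one valid choice for part (a), assuming the garbled "N(-1.2)" means X_3 ~ N(−1, 2) with variance 2: since the coordinates are independent, B may be taken diagonal with the standard deviations on the diagonal, and then Cov(X) = BBᵀ.

```python
import numpy as np

# X = mu + B Z with Z ~ N(0, I_3); independence lets B be diagonal.
mu = np.array([0.0, 1.0, -1.0])
B = np.diag([1.0, 2.0, np.sqrt(2.0)])   # sds of N(0,1), N(1,4), N(-1,2)

# Covariance matrix of X is B B^T.
cov = B @ B.T
print(cov)             # diagonal matrix with entries 1, 4, 2

# E[Y1] for Y1 = X1 + X3 is E[X1] + E[X3]:
print(mu[0] + mu[2])   # -1.0
```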
Prove Theorem 1 (the Neyman factorization theorem) from Robert V. Hogg and Allen Craig, Introduction to Mathematical Statistics. In some instances a difficult problem can be avoided if we first prove the following factorization theorem of Neyman.
Theorem 1. Let X_1, X_2, ..., X_n denote a random sample from a distribution that has p.d.f. f(x; θ), θ ∈ Ω. The statistic Y_1 = u_1(X_1, X_2, ..., X_n) is a sufficient statistic for θ if and only if we can find two nonnegative functions, k_1 and k_2, such that f(x_1; θ) f(x_2; θ) ⋯ f(x_n; θ) = k_1[u_1(x_1, x_2, ..., x_n); θ] · k_2(x_1, x_2, ..., x_n), where k_2(x_1, x_2, ..., x_n) does not depend on θ.
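To see the factorization in action before proving it, here is a standard worked example (not part of the exercise) for a Poisson(θ) sample:

```latex
\[
\prod_{i=1}^{n} f(x_i;\theta)
  = \prod_{i=1}^{n} \frac{e^{-\theta}\,\theta^{x_i}}{x_i!}
  = \underbrace{\theta^{\sum_{i} x_i}\, e^{-n\theta}}_{k_1\!\left(\sum_i x_i;\,\theta\right)}
    \cdot
    \underbrace{\frac{1}{\prod_{i=1}^{n} x_i!}}_{k_2(x_1,\dots,x_n)},
\]
```

so Y_1 = Σ_{i=1}^{n} X_i is sufficient for θ by the theorem: all dependence on θ enters through Σ x_i alone.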
4. Suppose X_1, ..., X_n are independent, normally distributed with mean E(X_i) = μ_i and variance Var(X_i) = σ_i². Let Z_i = (X_i − μ_i)/σ_i, so that Z_1, ..., Z_n are independent and each has a N(0, 1) distribution. Show that Σ_{i=1}^{n} Z_i² has a χ²_n distribution. Hint: use the fact that each Z_i² has a χ²_1 distribution.
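One clean way to use the hint is through moment generating functions: each Z_i² ~ χ²_1 has MGF (1 − 2t)^{−1/2} for t < 1/2, and independence makes the MGFs multiply:

```latex
\[
M_{\sum_{i=1}^{n} Z_i^2}(t)
  = \prod_{i=1}^{n} M_{Z_i^2}(t)
  = \left[(1-2t)^{-1/2}\right]^{n}
  = (1-2t)^{-n/2}, \qquad t < \tfrac{1}{2},
\]
```

which is the MGF of a χ²_n distribution; since MGFs determine distributions, the claim follows.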
Need to check my work; I just need parts (b) and (c).
Problem 2. Recall that a discrete random variable X has a Poisson distribution with parameter λ if the probability mass function of X is f_X(x) = e^{−λ} λ^x / x!, x ∈ {0, 1, 2, ...}. This distribution is often used to model the number of events which will occur in a given time span, given that λ such events occur on average.
(a) Prove by direct computation that the mean of a Poisson random ...
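A numerical companion to part (a), with an arbitrary λ of my choosing: truncating the series Σ_{x≥0} x · f_X(x) far into the tail should reproduce λ.

```python
from math import exp, factorial

lam = 3.7

# E[X] = sum over x of x * e^{-lam} * lam^x / x!; the tail beyond
# x = 100 is astronomically small for this lam, so truncate there.
mean = sum(x * exp(-lam) * lam**x / factorial(x) for x in range(100))
print(round(mean, 6))   # 3.7, matching lam
```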
(e) (4 marks) Let m be an integer with the property that m ≥ 2. Consider that X_1, X_2, ..., X_m are independent Binomial(n, p) random variables, where n is known and p is unknown. Note that p ∈ (0, 1). Write down the expression of the likelihood function L(p). We assume that min(x_1, ..., x_m) < n and max(x_1, ..., x_m) > 0.
(5 marks) Find dL/dp, and give all possible solutions to the equation dL/dp = 0 ...
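For intuition, a Python sketch with hypothetical data (the values m = 4, n = 10 and the sample below are my choices, not from the problem): the stationary point of L(p) is p̂ = Σ x_i / (mn), and the stated assumptions on min and max guarantee it lies strictly inside (0, 1).

```python
from math import comb, log

# Hypothetical data: m = 4 observations from Binomial(n = 10, p).
n, xs = 10, [3, 5, 4, 6]
m = len(xs)

# Candidate maximizer from dL/dp = 0: p_hat = (sum of x_i) / (m n).
p_hat = sum(xs) / (m * n)
print(p_hat)   # 0.45

# Sanity check: log-likelihood at p_hat beats nearby values.
def loglik(p):
    return sum(log(comb(n, x)) + x * log(p) + (n - x) * log(1 - p)
               for x in xs)

assert loglik(p_hat) > max(loglik(p_hat - 0.01), loglik(p_hat + 0.01))
```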
Central Limit Theorem: let X_1, X_2, ..., X_n be i.i.d. random variables with E(X_i) = μ and Var(X_i) = σ². Define Z = X_1 + X_2 + ... + X_n. The distribution of Z converges to a Gaussian distribution: P(Z ≤ z) ≈ 1 − Q((z − μ_Z)/σ_Z), where μ_Z = nμ and σ_Z = √n · σ.
Use MATLAB to demonstrate the central limit theorem. To achieve this, generate N i.i.d. random variables (with a distribution of your choice) and show that the distribution of their sum approaches a Gaussian distribution. Plot the distribution and include the MATLAB code. Hint: you may find the hist() function helpful.
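The exercise asks for MATLAB; as a reference for the structure, here is the same experiment as a Python sketch (numpy's histogram standing in for hist()), using Uniform(0, 1) summands: the standardized sum's histogram tracks the N(0, 1) density.

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 50, 100_000

# I.i.d. Uniform(0, 1) summands: mu = 1/2, sigma^2 = 1/12.
x = rng.random(size=(reps, n))
Z = x.sum(axis=1)

mu_Z = n * 0.5                 # E[Z] = n * mu
sigma_Z = np.sqrt(n / 12.0)    # sd of Z is sqrt(n) * sigma

# Standardize and compare the empirical histogram with the N(0,1) density.
W = (Z - mu_Z) / sigma_Z
counts, edges = np.histogram(W, bins=60, range=(-4, 4), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
phi = np.exp(-centers**2 / 2) / np.sqrt(2 * np.pi)
print(round(np.abs(counts - phi).max(), 2))   # small maximum deviation
```

In MATLAB the same steps are sum over rows, standardize, then hist() (or histogram()) against the normpdf curve.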