Problem 6. Consider the Pareto distribution

f(x) = β θ^β / x^(β+1) for x ≥ θ, and f(x) = 0 for x < θ.

Find the maximum likelihood estimator for β > 0 if θ > 0 is known.
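Setting the score of the log-likelihood n log β + nβ log θ − (β+1) Σ log x_i to zero gives the closed form β̂ = n / Σ log(x_i/θ). A quick numerical sketch of that answer (function and variable names are illustrative):

```python
import math
import random

def pareto_beta_mle(xs, theta):
    # log-likelihood: n*log(beta) + n*beta*log(theta) - (beta+1)*sum(log x_i)
    # score = 0  =>  beta_hat = n / sum(log(x_i / theta))
    return len(xs) / sum(math.log(x / theta) for x in xs)

random.seed(0)
theta, beta = 2.0, 3.0
# inverse-CDF sampling: if U ~ Uniform(0,1), then theta * U**(-1/beta) is Pareto(theta, beta)
xs = [theta * random.random() ** (-1.0 / beta) for _ in range(100_000)]
beta_hat = pareto_beta_mle(xs, theta)
print(round(beta_hat, 2))  # close to the true beta = 3.0
```

With 100,000 draws the estimator lands within a few hundredths of the true β, as the usual √n rate predicts.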
4. Let y_i | θ ~ iid Uniform(0, θ), for i = 1, ..., n. Assume the prior distribution for θ is Pareto(a, b), where p(θ) = b a^b / θ^(b+1) for θ > a and 0 otherwise. Find the posterior distribution of θ.
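The kernel algebra collapses to a conjugate update: the likelihood contributes θ^(−n) 1{θ > max y_i}, the prior θ^(−(b+1)) 1{θ > a}, so the posterior is Pareto(max(a, max_i y_i), b + n). A minimal sketch (parameter values are illustrative):

```python
import random

def uniform_pareto_posterior(ys, a, b):
    # posterior kernel: theta^-(n) * 1{theta > max y} * theta^-(b+1) * 1{theta > a}
    #                 = Pareto(max(a, max y_i), b + n)
    return max([a] + list(ys)), b + len(ys)

random.seed(1)
theta_true, a, b = 5.0, 1.0, 2.0
ys = [random.uniform(0, theta_true) for _ in range(200)]
a_post, b_post = uniform_pareto_posterior(ys, a, b)
post_mean = b_post * a_post / (b_post - 1)  # mean of Pareto(a*, b*), valid since b* > 1
print(a_post, b_post, round(post_mean, 3))  # posterior mean is near theta_true = 5.0
```

The posterior mean b* a* / (b* − 1) concentrates near the true θ as n grows, since a* → θ and b* → ∞.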
Let X1, X2, ..., Xn denote a random sample of size n from a Pareto distribution. X(1) = min(X1, X2, ..., Xn) has the cumulative distribution function given by

F(x) = 1 − (β/x)^(αn) for x ≥ β, and F(x) = 0 for x < β.

Show that X(1) is a consistent estimator of β.
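Consistency follows directly from the CDF, since P(X(1) > β + ε) = (β/(β+ε))^(αn) → 0 for every ε > 0, and a simulation makes the convergence visible; a sketch with illustrative parameter values:

```python
import random

def sample_pareto(beta, alpha, n, rng):
    # inverse-CDF sampling: X = beta * U**(-1/alpha) for U ~ Uniform(0,1)
    return [beta * rng.random() ** (-1.0 / alpha) for _ in range(n)]

rng = random.Random(0)
beta, alpha = 2.0, 1.5
# the sample minimum for increasing n: it can only overshoot beta, and the
# overshoot is of order beta / (alpha * n)
mins = {n: min(sample_pareto(beta, alpha, n, rng)) for n in (10, 100, 10_000)}
for n, m in mins.items():
    print(n, round(m, 4))  # shrinks toward beta = 2.0 as n grows
```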
Consider the following linear regression model.
1. For any X = x, let Y = x'β + U, where β ∈ R^k.
2. X is exogenous.
3. The probability model is {f(u; θ): f is a distribution on R with E_f[U] = 0 and Var_f[U] = σ², σ > 0}.
4. Sampling model: {Y_i} is an independent sample, sequentially generated using Y_i = x_i'β + U_i, where the U_i are IID(0, σ²).

(i) Let K > 0 be a given number. We wish to estimate β using least squares subject to the constraint β'β ≤ K². Write...
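Part (i) is the problem min_β ||y − Xβ||² subject to β'β ≤ K². By the KKT conditions, either the OLS estimate is already feasible, or the solution takes the ridge form (X'X + λI)^(−1) X'y with λ > 0 chosen so the constraint binds. A numerical sketch of that characterization (NumPy, with illustrative data):

```python
import numpy as np

def constrained_ls(X, y, K, tol=1e-10):
    """Least squares subject to ||beta|| <= K: OLS if feasible, else a ridge
    solution with the multiplier found by bisection so that ||beta(lam)|| = K."""
    XtX, Xty = X.T @ X, X.T @ y
    beta_ols = np.linalg.solve(XtX, Xty)
    if np.linalg.norm(beta_ols) <= K:
        return beta_ols
    p = X.shape[1]
    beta_norm = lambda lam: np.linalg.norm(np.linalg.solve(XtX + lam * np.eye(p), Xty))
    lo, hi = 0.0, 1.0
    while beta_norm(hi) > K:       # grow hi until the shrunken norm falls below K
        hi *= 2.0
    while hi - lo > tol:           # bisect on the multiplier lam
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if beta_norm(mid) > K else (lo, mid)
    return np.linalg.solve(XtX + hi * np.eye(p), Xty)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + rng.normal(size=50)
b = constrained_ls(X, y, K=1.0)
print(np.round(b, 3), round(float(np.linalg.norm(b)), 3))  # norm is (about) 1.0
```

Because ||β(λ)|| is continuous and strictly decreasing in λ, the bisection is guaranteed to find the binding multiplier.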
Problem 8: [5 points] Let X1, ..., Xn be IID from a Uniform distribution on (−θ, θ), where θ > 0 is an unknown parameter. (a) Find a minimal sufficient statistic T. (b) Define V = ... . Show that T and V are independent.
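For part (a), the joint density is (2θ)^(−n) 1{max_i |x_i| ≤ θ}, so the likelihood depends on the data only through T = max_i |X_i|; two samples sharing that maximum have identical likelihood functions. A small sketch illustrating this (sample values are arbitrary):

```python
def likelihood(xs, theta):
    # Uniform(-theta, theta): L(theta) = (2*theta)**-n * 1{max|x_i| <= theta}
    n = len(xs)
    return (2.0 * theta) ** -n if max(abs(x) for x in xs) <= theta else 0.0

# both samples have T = max|x_i| = 0.9, so their likelihood functions agree
s1 = [0.3, -0.9, 0.1]
s2 = [-0.9, 0.5, 0.2]
same_likelihoods = all(likelihood(s1, t) == likelihood(s2, t) for t in (0.5, 1.0, 2.0))
print(same_likelihoods)  # True
```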
4) Let X1, X2, ..., Xn be iid N(μ, σ²) RVs. Consider the problem of testing H0: μ ≤ 0 against H1: μ > 0. (a) It suffices to restrict attention to the sufficient statistic (U, V), where U = X̄ and V = S². Show that the problem of testing H0 is invariant under the group of scale transformations G = {g_c: c > 0}, g_c(x) = cx, and that a maximal invariant is T = U/√V. (b) Show that the distribution of T has MLR, ...
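The MLR property in part (b) implies that T = U/√V is stochastically increasing in μ/σ, so a test rejecting for large T is the natural invariant test. A Monte Carlo sketch of that stochastic ordering (sample size and cutoff are illustrative):

```python
import math
import random

def t_stat(xs):
    n = len(xs)
    u = sum(xs) / n                               # U = sample mean
    v = sum((x - u) ** 2 for x in xs) / (n - 1)   # V = sample variance S^2
    return u / math.sqrt(v)                       # maximal invariant T = U / sqrt(V)

rng = random.Random(0)
n, reps = 10, 5_000
probs = {}
for mu in (0.0, 1.0):
    hits = sum(t_stat([rng.gauss(mu, 1.0) for _ in range(n)]) > 1.0
               for _ in range(reps))
    probs[mu] = hits / reps
print(probs)  # P(T > 1) is far larger under mu = 1 than under mu = 0
```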
Problem 5. Let f: [0,1] → R be continuous and assume f(x) ∈ (0,1) for all x ∈ (0,1). Let n ∈ N with n ≥ 2. Show that there is exactly one solution x in (0,1) of the equation

nx + ∫₀ˣ f(t) dt = n − ∫ₓ¹ f(t) dt.
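A sketch of the usual argument, assuming the (garbled) equation reads nx + ∫₀ˣ f(t) dt = n − ∫ₓ¹ f(t) dt: combine the intermediate value theorem with strict monotonicity of the difference of the two sides.

```latex
% Define g(x) = nx + \int_0^x f(t)\,dt + \int_x^1 f(t)\,dt - n on [0,1].
% Since 0 < f < 1 on (0,1):
\[
  g(0) = \int_0^1 f(t)\,dt - n < 1 - n \le -1 < 0, \qquad
  g(1) = \int_0^1 f(t)\,dt > 0 .
\]
% g is differentiable with g'(x) = n + f(x) - f(x) = n > 0, so g is strictly
% increasing. By the intermediate value theorem g has a root in (0,1), and
% strict monotonicity makes that root unique.
```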
Consider the normal distribution f(x|θ) = [1 / sqrt(2π)] exp(−1/2 (x − θ)^2 ) for all x. Let the prior distribution for θ be f(θ) = [1 / sqrt(2π)] exp[(−1/2) (θ^2)] for all θ. (a) Show that the posterior distribution is a normal distribution. With what parameters? (b) Find the Bayes’ estimator for θ.
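Completing the square in the exponent of the product exp(−½(x−θ)²) exp(−½θ²) shows the posterior is N(x/2, 1/2), so the Bayes estimator under squared-error loss is the posterior mean x/2. A numerical sketch checking those moments by direct integration (grid limits and the observed x are arbitrary):

```python
import math

def posterior_moments(x, lo=-10.0, hi=10.0, steps=40_001):
    # integrate the unnormalized posterior exp(-(x-t)^2/2 - t^2/2) on a grid
    h = (hi - lo) / (steps - 1)
    ts = [lo + i * h for i in range(steps)]
    w = [math.exp(-0.5 * (x - t) ** 2 - 0.5 * t ** 2) for t in ts]
    z = sum(w) * h                                   # normalizing constant
    mean = sum(t * wi for t, wi in zip(ts, w)) * h / z
    var = sum((t - mean) ** 2 * wi for t, wi in zip(ts, w)) * h / z
    return mean, var

m, v = posterior_moments(x=1.6)
print(round(m, 4), round(v, 4))  # about 0.8 and 0.5, i.e. N(x/2, 1/2)
```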
Problem 1. (Bivariate Normal Distribution) Let Z1, Z2 be i.i.d. N(0,1) distributed random variables, and let ρ be a constant between −1 and 1. Define X1, X2 as:

X1 = sqrt((1+ρ)/2) Z1 + sqrt((1−ρ)/2) Z2,
X2 = sqrt((1+ρ)/2) Z1 − sqrt((1−ρ)/2) Z2.

1) Show that (X1, X2)' follows a bivariate normal distribution, and find the mean vector and the covariance matrix. 2) Write down the moment generating function, and show that when ρ = 0, X1 and X2 are independent.
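The covariance algebra gives Var(X1) = Var(X2) = (1+ρ)/2 + (1−ρ)/2 = 1 and Cov(X1, X2) = (1+ρ)/2 − (1−ρ)/2 = ρ, which a simulation can confirm; a sketch with an arbitrary ρ:

```python
import math
import random

def simulate_x(rho, reps, rng):
    a, b = math.sqrt((1 + rho) / 2), math.sqrt((1 - rho) / 2)
    xs1, xs2 = [], []
    for _ in range(reps):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        xs1.append(a * z1 + b * z2)   # X1
        xs2.append(a * z1 - b * z2)   # X2
    return xs1, xs2

rng = random.Random(0)
rho, reps = 0.6, 200_000
x1, x2 = simulate_x(rho, reps, rng)
var1 = sum(u * u for u in x1) / reps              # population means are 0
cov = sum(u * v for u, v in zip(x1, x2)) / reps
print(round(var1, 2), round(cov, 2))  # close to 1 and to rho = 0.6
```

Setting ρ = 0 makes the covariance vanish, which for jointly normal variables is equivalent to independence (the claim in part 2).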
(b) Let f: [0,1] → R be a C² function and let g, h: [0,∞) → R be C¹. Consider the initial-boundary value problem

w_t = k w_xx,
w(x, 0) = f(x),
w(0, t) = g(t),
w(1, t) = h(t),

for a function w: [0,1] × [0,∞) → R such that w, w_t, w_x, and w_xx exist and are continuous. Show that the solution to this problem is unique; that is, if w1, w2: [0,1] × [0,∞) → R both satisfy these conditions, then w1 = w2.
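The usual route is the energy method applied to the difference u = w1 − w2, which satisfies the same heat equation with zero initial and boundary data; a sketch, assuming the diffusivity k > 0:

```latex
% u = w_1 - w_2 solves u_t = k u_{xx}, with u(x,0) = 0 and u(0,t) = u(1,t) = 0.
% Define the energy and differentiate under the integral sign:
\[
  E(t) = \int_0^1 u(x,t)^2 \, dx, \qquad
  E'(t) = 2\int_0^1 u\, u_t \, dx = 2k \int_0^1 u\, u_{xx} \, dx .
\]
% Integrate by parts; the boundary term vanishes because u(0,t) = u(1,t) = 0:
\[
  E'(t) = 2k \,\bigl[ u\, u_x \bigr]_0^1 - 2k \int_0^1 u_x^2 \, dx
        = -2k \int_0^1 u_x^2 \, dx \le 0 .
\]
% E(0) = 0, E is nonnegative and nonincreasing, so E(t) = 0 for all t;
% continuity of u then forces u \equiv 0, i.e. w_1 = w_2.
```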