Consider a random vector Y = (y(1), y(2), …, y(k)) whose elements y(i) are weighted by w(i). Suppose that you have collected n observations y_1, …, y_n, and that y_i = μ + ε_i. You can assume that E[ε_i] = 0 for all i, E[ε_i²] = σ² for all i, and E[ε_i ε_j] = 0 for all i ≠ j. You want to estimate a sample mean, and your friend tells you to use the following estimator:

μ̂ = Σ_{i=1}^n w_i y_i,

where w_i is a known sample weight for observation i (this means w_i is non-random).
(a) Find E(μ̂).
(b) Under what conditions, if any, is μ̂ an unbiased estimator? Under…
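A quick Monte Carlo sketch of part (a): since E(μ̂) = μ Σ w_i, the estimator is unbiased exactly when the weights sum to 1. The particular numbers below (μ, σ, the weights) are illustrative, not from the problem.

```python
import random

# Simulate y_i = mu + eps_i and average the weighted estimator
# mu_hat = sum_i w_i * y_i over many replications (illustrative values).
random.seed(0)
mu, sigma = 5.0, 2.0
weights = [0.4, 0.3, 0.2, 0.1]   # known, non-random; they sum to 1
reps = 200_000

total = 0.0
for _ in range(reps):
    y = [mu + random.gauss(0.0, sigma) for _ in weights]
    total += sum(w * yi for w, yi in zip(weights, y))

print(total / reps)   # close to mu, since the weights sum to 1
```

If the weights summed to, say, 0.8 instead, the average would hover near 0.8·μ, matching E(μ̂) = μ Σ w_i.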
4. (24 marks) Suppose that the random variables Y_1, …, Y_n satisfy

Y_i = β_0 + β_1 X_i + ε_i,  i = 1, …, n,

where β_0 and β_1 are parameters, X_1, …, X_n are constants, and ε_1, …, ε_n are independent and identically distributed random variables with ε_i ~ N(0, σ²), where σ² is a third unknown parameter. This is the familiar form of a simple linear regression model, where the parameters β_0, β_1, and σ² explain the relationship between a dependent (or response) variable Y…
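As a numerical companion to this setup (not the graded derivation), the closed-form least squares estimates of β_0 and β_1, and the MLE of σ² (which divides by n), can be checked on simulated data. The true parameter values below are made up for illustration.

```python
import random

# Simulate Y_i = b0 + b1*X_i + eps_i with fixed constants X_i,
# then compute the standard closed-form estimates.
random.seed(1)
b0_true, b1_true, sigma = 1.0, 2.0, 0.5
x = [i / 10 for i in range(1, 51)]   # fixed constants X_1,...,X_50
y = [b0_true + b1_true * xi + random.gauss(0, sigma) for xi in x]

n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
sxx = sum((xi - xbar) ** 2 for xi in x)

b1_hat = sxy / sxx                 # slope estimate
b0_hat = ybar - b1_hat * xbar      # intercept estimate
sigma2_hat = sum((yi - b0_hat - b1_hat * xi) ** 2
                 for xi, yi in zip(x, y)) / n   # MLE divides by n, not n-2

print(b0_hat, b1_hat, sigma2_hat)
```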
Let Y_1, Y_2, …, Y_n be iid random variables with the following probability density function:

f_Y(y) = (1/α) y^((1−α)/α) for 0 < y < 1, and 0 otherwise,

where α > 0 is an unknown parameter.
(a) Find the maximum likelihood estimator α̂ of α.
(b) Show this is an unbiased estimator for α. Hint: make use of the fact that −ln Y follows an exponential distribution with mean α, i.e., −ln Y ~ Exp(α).
(c) Find the MSE of the maximum likelihood estimator…
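A simulation sketch of the hint (this is a sanity check, not the requested proof): setting the score to zero gives α̂ = −(1/n) Σ ln Y_i, and since each −ln Y_i ~ Exp(mean α), averaging α̂ over many samples should recover α. The values of α and n below are arbitrary.

```python
import math
import random

# Draw Y = exp(-X) with X ~ Exp(mean a_true), so Y has the stated density,
# then average the MLE a_hat = -(1/n) * sum(ln y_i) over many replications.
random.seed(2)
a_true, n, reps = 1.5, 20, 50_000

total = 0.0
for _ in range(reps):
    y = [math.exp(-random.expovariate(1.0 / a_true)) for _ in range(n)]
    total += -sum(math.log(yi) for yi in y) / n

print(total / reps)   # hovers near a_true, consistent with unbiasedness
```

Since α̂ is the mean of n iid Exp(α) variables, Var(α̂) = α²/n, which is also the MSE once unbiasedness is established.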
1. Consider a variable y = θ + ε, where θ is an unknown parameter and ε is a random variable with mean zero.
(a) What is the expected value of y?
(b) Suppose you draw a sample y_1, …, y_n. Derive the least squares estimator for θ. For full credit you must check the second-order condition.
(c) Can this estimator (θ̂) be described as a method of moments estimator?
(d) Now suppose ε is independent normally distributed with mean 0 and…
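For part (b), the least squares criterion S(t) = Σ (y_i − t)² has derivative −2 Σ (y_i − t) and second derivative 2n > 0, so the unique minimizer is the sample mean. A tiny numeric check of this fact (the data values are arbitrary):

```python
# Verify that S(t) = sum((y_i - t)^2) is minimized at the sample mean
# by brute-force grid search; S is strictly convex (S'' = 2n > 0).
y = [2.0, 3.5, 4.0, 1.5, 3.0]
ybar = sum(y) / len(y)

def S(t):
    return sum((yi - t) ** 2 for yi in y)

grid = [i / 1000 for i in range(0, 6001)]   # t in [0, 6] with step 0.001
t_best = min(grid, key=S)
print(ybar, t_best)   # both 2.8
```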
2. Consider a simple linear regression model for a response variable Y_i, a single predictor variable x_i, i = 1, …, n, and having Gaussian (i.e. normally distributed) errors:

Y_i = β x_i + ε_i,  ε_i iid N(0, σ²).

This model is often called "regression through the origin" since E(Y_i) = 0 if x_i = 0.
(a) Write down the likelihood function for the parameters β and σ².
(b) Find the MLEs for β and σ², explicitly showing that they are unique maximizers of the likelihood function. (Hint: The function g(x) = log(x) + 1 − x…
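Setting the partial derivatives of the log-likelihood to zero gives the stationary points β̂ = Σ x_i Y_i / Σ x_i² and σ̂² = (1/n) Σ (Y_i − β̂ x_i)². The sketch below only checks these formulas on simulated data; it does not replace the uniqueness argument the problem asks for, and the parameter values are invented.

```python
import random

# Simulate regression through the origin and evaluate the closed-form MLEs.
random.seed(3)
beta_true, sigma = 1.8, 0.4
x = [0.2 * i for i in range(1, 41)]
y = [beta_true * xi + random.gauss(0, sigma) for xi in x]

beta_hat = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
n = len(x)
sigma2_hat = sum((yi - beta_hat * xi) ** 2 for xi, yi in zip(x, y)) / n

print(beta_hat, sigma2_hat)   # near beta_true and sigma^2
```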
Linear statistical modeling & regression

1) Consider n data points with 3 covariates and observations {x_{i,1}, x_{i,2}, x_{i,3}, y_i}; i = 1, …, n, and you fit the following model:

y_i = β_0 + β_1 x_{i,1} + β_2 x_{i,2} + β_3 x_{i,3} + ε_i,

where the ε_i's are independent normal with mean zero and variance σ². Here, n = 50. Let Y be the vector (Y_1, …, Y_n). Assume the covariates are centered: Σ_i x_{i,k} = 0, k = 1, 2, 3. Assume X'X is a diagonal matrix…
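When the covariates are centered and X'X is diagonal, the least squares problem decouples: β̂_0 = ȳ and β̂_k = Σ_i x_{i,k} y_i / Σ_i x_{i,k}² for each k. The toy check below uses a hypothetical orthogonal ±1 design with n = 48 (rather than the problem's n = 50) purely to make the columns exactly centered and orthogonal.

```python
import random

# Orthogonal, centered toy design: repeat a 4-row Hadamard-style block.
# Column sums are 0 and columns are mutually orthogonal, so X'X is diagonal.
random.seed(4)
block = [(1, 1, 1), (-1, 1, -1), (1, -1, -1), (-1, -1, 1)]
X = block * 12                         # n = 48 rows
b0, b = 0.5, (1.0, -2.0, 3.0)          # illustrative true coefficients
y = [b0 + sum(bk * xk for bk, xk in zip(b, row)) + random.gauss(0, 0.3)
     for row in X]

n = len(X)
beta0_hat = sum(y) / n                 # intercept = ybar (columns sum to zero)
beta_hat = [sum(row[k] * yi for row, yi in zip(X, y)) /
            sum(row[k] ** 2 for row in X)
            for k in range(3)]         # slopes decouple when X'X is diagonal

print(beta0_hat, beta_hat)
```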
1. Suppose Y_1, Y_2, …, Y_n is an iid sample from a Bernoulli(p) population distribution, where 0 < p < 1 is unknown. The population pmf is

p_Y(y|p) = p^y (1−p)^(1−y) for y = 0, 1, and 0 otherwise.

(a) Prove that Ȳ is the maximum likelihood estimator of p.
(b) Find the maximum likelihood estimator of τ(p) = log[p/(1 − p)], the log-odds of p.
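By the invariance property of maximum likelihood, once part (a) gives p̂ = Ȳ, the MLE of the log-odds in part (b) is τ(Ȳ) = log[Ȳ/(1 − Ȳ)]. The simulation below only illustrates these two estimators on fake data (p and n are arbitrary); it is not the requested proof.

```python
import math
import random

# Draw a Bernoulli(p) sample and evaluate the MLEs of p and the log-odds.
random.seed(5)
p_true, n = 0.3, 500
y = [1 if random.random() < p_true else 0 for _ in range(n)]

p_hat = sum(y) / n                          # MLE of p is the sample mean
tau_hat = math.log(p_hat / (1 - p_hat))     # MLE of log-odds, by invariance
print(p_hat, tau_hat)
```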
4) Consider n data points with 2 covariates and observations {x_{i,1}, x_{i,2}, y_i}; i = 1, …, n, where the y_i's are indicator variables for the experiment, that is, whether a particular medicine is effective on some individual. Here, x_{i,1} and x_{i,2} are the age and blood pressure of the i-th individual, respectively. Our assumption is that the log odds ratio follows a linear model. That is, with p_i = P(y_i = 1),

log(p_i / (1 − p_i)) = β_0 + β_1 x_{i,1} + β_2 x_{i,2}.

(b) What should be a good estimator for …?
(e) Suppose β̂_{0,n}, β̂_{1,n}, …
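A minimal fitting sketch for this logistic model (everything numeric here is hypothetical, and the covariates are standardized for stability): the coefficients are usually estimated by maximizing the Bernoulli log-likelihood, typically via Newton-Raphson/IRLS; plain gradient ascent stands in for that below.

```python
import math
import random

# Simulate data from logit P(y=1) = b0 + b1*x1 + b2*x2, where x1 and x2 are
# standardized age and blood pressure, then fit by gradient ascent on the
# log-likelihood (illustrative substitute for Newton-Raphson / IRLS).
random.seed(6)
b_true = [-0.2, 0.8, -0.6]   # hypothetical coefficients, not from the problem
data = []
for _ in range(400):
    x1 = (random.uniform(20, 70) - 45) / 15     # age, standardized
    x2 = (random.uniform(80, 160) - 120) / 25   # blood pressure, standardized
    eta = b_true[0] + b_true[1] * x1 + b_true[2] * x2
    p = 1 / (1 + math.exp(-eta))
    data.append(((1.0, x1, x2), 1 if random.random() < p else 0))

b = [0.0, 0.0, 0.0]
for _ in range(3000):
    grad = [0.0, 0.0, 0.0]
    for x, yi in data:
        p = 1 / (1 + math.exp(-sum(bk * xk for bk, xk in zip(b, x))))
        for k in range(3):
            grad[k] += (yi - p) * x[k]    # score: sum_i (y_i - p_i) x_i
    b = [bk + 1e-3 * gk for bk, gk in zip(b, grad)]

print(b)   # roughly recovers b_true
```

The gradient here is the score Σ_i (y_i − p_i) x_i, which is exactly what part (b)'s "good estimator" (the MLE) sets to zero.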