6. In class, we saw two different expressions for the Fisher Information (and hence, the CRLB). Here, you will show the two expressions are equal (assuming that you may switch the order of differentiation and integration as needed). As a hint: differentiate ∫₋∞^∞ f_X(x; θ) dx = 1 with respect to θ to show that ∫₋∞^∞ (∂/∂θ) ln f_X(x; θ) · f_X(x; θ) dx = 0. Then, differentiate this equation with respect to θ again.
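The identity asked for is I(θ) = E[((∂/∂θ) ln f_X(X; θ))²] = −E[(∂²/∂θ²) ln f_X(X; θ)]. As a sanity check (my own, not part of the problem), both expressions can be evaluated for a model where they are known in closed form. For X ~ N(θ, 1), the score is (∂/∂θ) ln f = x − θ, the second derivative is −1, and both forms give I(θ) = 1:

```python
import numpy as np

# Monte Carlo check (illustrative, assumed model N(theta, 1)) that the two
# Fisher Information expressions agree:
#   E[(d/dtheta ln f)^2]  vs.  -E[d^2/dtheta^2 ln f]
rng = np.random.default_rng(0)
theta = 2.0
x = rng.normal(theta, 1.0, size=200_000)

score = x - theta            # d/dtheta ln f(x; theta) for N(theta, 1)
form1 = np.mean(score**2)    # expectation of the squared score
form2 = 1.0                  # -E[d^2/dtheta^2 ln f] = -E[-1] = 1, exactly

print(form1, form2)          # form1 is close to 1
```

The second form is exact here because the second derivative of the Gaussian log-density in θ is the constant −1, which makes the agreement easy to see numerically.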
B2. (a) Suppose θ is an unknown parameter which is to be estimated from a single measurement X, distributed according to some probability density function f(x; θ). The Fisher information I(θ) is defined by I(θ) = E[((∂/∂θ) ln f(X; θ))²]. Show that, under some suitable regularity conditions, the variance of any unbiased estimator θ̂ of θ is then bounded by the reciprocal of the Fisher information: Var[θ̂] ≥ 1/I(θ). Note that the suitable regularity conditions, which are not specified here, allow the interchange of...
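A quick illustration of the bound (my own check, using an assumed model rather than anything specified in the problem): for X₁, …, Xₙ i.i.d. N(θ, 1), the Fisher information of the sample is I(θ) = n, so the CRLB says any unbiased estimator has variance at least 1/n. The sample mean is unbiased and attains the bound:

```python
import numpy as np

# Simulation sketch: for n i.i.d. N(theta, 1) samples, I(theta) = n, so the
# CRLB is 1/n. The sample mean should be unbiased with variance ~ 1/n.
rng = np.random.default_rng(1)
theta, n, trials = 5.0, 10, 100_000
samples = rng.normal(theta, 1.0, size=(trials, n))
theta_hat = samples.mean(axis=1)     # the estimator, once per trial

print(theta_hat.mean())              # ~ theta (unbiased)
print(theta_hat.var())               # ~ 1/n = 0.1, the CRLB
```

An estimator attaining the CRLB like this is called efficient; most unbiased estimators sit strictly above the bound.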
Fix θ > 0 and let X₁, …, Xₙ be i.i.d. Unif[0, θ]. We saw in class that the MLE of θ, θ̂_MLE = max(X₁, …, Xₙ), is biased. I give two other estimators of θ, θ̂₁ = c₁ max(X₁, …, Xₙ) and θ̂₂ = c₂ · (…), which can be made unbiased by appropriate choice of constants c₁, c₂. We have two questions: (1) Find values of c₁, c₂ for which these estimators are unbiased. Note that c₁, c₂ may depend on n. (2) Which of these estimators...
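For the first estimator the constant can be checked by simulation: since E[max(X₁, …, Xₙ)] = nθ/(n + 1) for Unif[0, θ], choosing c₁ = (n + 1)/n makes c₁ · max(Xᵢ) unbiased. A Monte Carlo sketch (this is a check of the standard result, not an answer key for the problem):

```python
import numpy as np

# Monte Carlo check: E[max of n Unif[0, theta]] = n*theta/(n+1), so
# c1 = (n+1)/n de-biases the scaled maximum.
rng = np.random.default_rng(2)
theta, n, trials = 3.0, 5, 200_000
x = rng.uniform(0, theta, size=(trials, n))
est = (n + 1) / n * x.max(axis=1)   # c1 * max(X_1, ..., X_n)

print(est.mean())                   # ~ theta = 3.0
```

The form of the second estimator was lost in extraction, so only c₁ is checked here.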
Need help with (a) and (b). In class we saw how the objective of OLS is to choose coefficients β₀ and β₁ that minimize the squared residuals, summed over all the sample data points, i.e. Σᵢ eᵢ². Since Yᵢ = β₀ + β₁Xᵢ + eᵢ, we can re-arrange the objective function to: Σᵢ (Yᵢ − β₀ − β₁Xᵢ)² (1). Differentiating equation (1) with respect to β₀ and β₁ leads to the two normal equations: Σᵢ 2(Yᵢ − β̂₀ − β̂₁Xᵢ)(−1) = 0 (2) and Σᵢ 2(Yᵢ − β̂₀ − β̂₁Xᵢ)(−Xᵢ) = 0 (3). We have two equations, (2) and (3), and two unknowns, β̂₀ and β̂₁. Please solve...
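Solving the two normal equations gives the standard closed form β̂₁ = Σᵢ(Xᵢ − X̄)(Yᵢ − Ȳ) / Σᵢ(Xᵢ − X̄)² and β̂₀ = Ȳ − β̂₁X̄. A sketch verifying this against NumPy's least-squares fit (the data points are made up for illustration):

```python
import numpy as np

# Closed-form solution of the two OLS normal equations, checked against
# numpy.polyfit on hypothetical data.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

ref1, ref0 = np.polyfit(x, y, 1)   # returns [slope, intercept]
print(b0, b1)                      # should match ref0, ref1
```

The two expressions drop out directly: equation (2) gives β̂₀ = Ȳ − β̂₁X̄, and substituting that into equation (3) isolates β̂₁.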
5. In class we saw that the function r(u, v) = (sin u, (2 + cos u) cos v, (2 + cos u) sin v), 0 ≤ u ≤ 2π, 0 ≤ v ≤ 2π, parametrizes a torus T, which is depicted below. (a) Calculate ‖r_u × r_v‖. (b) Show that T is smooth. (c) Find the equation of the tangent plane to T at (0, …). (d) Find the surface area of T. (e) Earlier in the semester, we observed that a torus can be built out of...
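For this parametrization, a direct computation gives ‖r_u × r_v‖ = 2 + cos u (independent of v), and the surface area is the double integral of that over [0, 2π] × [0, 2π], which comes out to 8π² (consistent with the general torus formula 4π²Rr with R = 2, r = 1). A numeric check of parts (a) and (d):

```python
import numpy as np

# Numeric sanity check: integrate ||r_u x r_v|| = 2 + cos(u) over
# [0, 2pi] x [0, 2pi]; the result should be 8*pi^2 ~ 78.957.
u = np.linspace(0, 2 * np.pi, 2001)
integrand = 2 + np.cos(u)                        # independent of v
area = integrand.mean() * (2 * np.pi) ** 2       # average value * domain area

print(area, 8 * np.pi**2)                        # both ~ 78.957
```

Since cos u averages to zero over a full period, the mean of the integrand is exactly 2, which makes the 8π² answer easy to see.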
In class we discussed the relationship between the hyperbolic functions and a hyperbola, then showed that it is analogous to that of the trigonometric functions and a circle. a. Derive an analogue to the Pythagorean identities (cos²x + sin²x = 1, etc.) for the hyperbolic functions. Hint: which hyperbola and which circle? (This will give you the relationship between cosh x and sinh x, and the others are then easily found, as they were in the case of...
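The analogue the hint points at is cosh²x − sinh²x = 1: the point (cosh x, sinh x) lies on the unit hyperbola x² − y² = 1, just as (cos x, sin x) lies on the unit circle x² + y² = 1. A quick numeric confirmation:

```python
import math

# Check the hyperbolic Pythagorean identity cosh^2(x) - sinh^2(x) = 1
# (note the minus sign, matching the hyperbola x^2 - y^2 = 1).
for x in [-2.0, -0.5, 0.0, 1.0, 3.0]:
    print(x, math.cosh(x) ** 2 - math.sinh(x) ** 2)   # 1.0 each time
```

It also falls out algebraically from cosh x = (eˣ + e⁻ˣ)/2 and sinh x = (eˣ − e⁻ˣ)/2: the cross terms cancel, leaving exactly 1.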
2. Sales Tax with Limited Attention. In class, we saw evidence that consumers are not fully responsive to "shrouded" prices, like shipping costs and sales taxes. In this problem we will work through a model of rational limited attention. Consider a consumer whose income is $100. Her utility function is given by: U(X, Y, c) = ln(X) + Y − …, where ln(·) is the natural logarithm. The derivative of the natural logarithm is d ln(X)/dX = 1/X. The price of both X and Y...
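The attention-cost term in the utility was garbled in extraction, so as a benchmark sketch I drop it and work with the quasilinear part U = ln(X) + Y, income 100, and the price of Y normalized to 1 (all of these normalizations are my assumptions, not stated in the problem). The interior first-order condition sets the marginal utility of X equal to its price, 1/X = p, so demand is X* = 1/p with the rest of income spent on Y:

```python
# Quasilinear demand sketch under assumed normalizations:
# U = ln(X) + Y, income 100, price of Y = 1, price of X = p.
def demand_x(p):
    """Interior optimum from 1/X = p, i.e. X* = 1/p."""
    return 1.0 / p

def demand_y(p, income=100.0):
    """Budget balance: whatever is not spent on X goes to Y."""
    return income - p * demand_x(p)

print(demand_x(2.0), demand_y(2.0))   # 0.5, 99.0
```

Quasilinearity is what makes a sales-tax exercise clean here: demand for X depends only on the (perceived) price of X, so inattention to the tax shifts X* directly.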
Newtonian Cosmology. 1. In class, we solved the Friedmann equation for the critical case, where the constant of integration was set to k = 0; this resulted in the Einstein-de Sitter model, where a ∝ t^(2/3). Now, let us consider the closed case (k = +1), where the universe starts with a Big Bang, reaches a maximum expansion, turns around, and eventually ends in a Big Crunch. For the closed model, it is convenient to write the Friedmann equation as follows: 8π...
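The closed model has the well-known cycloid solution, written parametrically as a = (C/2)(1 − cos η), t = (C/2)(η − sin η) in units where the k = +1 Friedmann equation reads ȧ² = C/a − 1 (the constant C and these units are my assumed conventions, since the equation is truncated above). A numeric check that the cycloid satisfies the equation:

```python
import numpy as np

# Verify the cycloid solution of adot^2 = C/a - 1 (closed, k = +1 case):
#   a = (C/2)(1 - cos eta),  t = (C/2)(eta - sin eta)
# so adot = (da/deta)/(dt/deta) = sin(eta)/(1 - cos(eta)).
C = 2.0
eta = np.linspace(0.1, 2 * np.pi - 0.1, 500)   # avoid the singular endpoints
a = (C / 2) * (1 - np.cos(eta))
adot = np.sin(eta) / (1 - np.cos(eta))

lhs = adot**2
rhs = C / a - 1
print(np.max(np.abs(lhs - rhs)))   # ~ 0: the identity holds
```

The parametrization also exhibits the qualitative behavior the problem describes: a = 0 at η = 0 (Big Bang), maximum expansion a = C at η = π (where ȧ = 0), and a = 0 again at η = 2π (Big Crunch).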
4. Let x = (x₁, x₂)ᵀ be a two-dimensional feature vector. (a) Suppose that we collect the following four measurements for an input belonging to class ω₁: −6, −10, 10, −6, … Assuming that the class conditional density is Gaussian, find the maximum likelihood estimates of the mean vector and covariance matrix for class ω₁. (b) Suppose that data come from two classes, ω₁ and ω₂. Assume that: the a priori class probabilities are equal; the class conditional densities are...
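For part (a), the MLE formulas are standard: μ̂ is the sample mean and Σ̂ = (1/n) Σᵢ (xᵢ − μ̂)(xᵢ − μ̂)ᵀ, with the 1/n (not 1/(n−1)) normalization. Since the measurement table was garbled in extraction, the data below are hypothetical, chosen only to show the computation:

```python
import numpy as np

# MLE for a 2-D Gaussian on four HYPOTHETICAL measurements (the real ones
# were lost in extraction): mean = sample mean, covariance divides by n.
X = np.array([[-6.0, -10.0],
              [10.0,  -6.0],
              [ 2.0,   4.0],
              [-2.0,   0.0]])
n = X.shape[0]

mu_hat = X.mean(axis=0)                 # [1.0, -3.0] for these numbers
centered = X - mu_hat
sigma_hat = centered.T @ centered / n   # MLE covariance (divide by n)

print(mu_hat)
print(sigma_hat)
```

Note `np.cov(X.T)` would give the 1/(n−1) estimator instead; for the MLE, pass `bias=True` or compute it by hand as above.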