Let δ be a minimax estimator of g(θ) under squared error loss. Show that aδ + b is minimax for a·g(θ) + b.
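A sketch of the usual argument (assuming a ≠ 0, which the problem implicitly needs):

```latex
R(\theta,\, a\delta + b)
  = E_\theta\bigl[(a\delta + b - a\,g(\theta) - b)^2\bigr]
  = a^2\, E_\theta\bigl[(\delta - g(\theta))^2\bigr]
  = a^2\, R(\theta, \delta).
% Sup-risks therefore scale by a^2.  If some estimator \delta' satisfied
% \sup_\theta R(\theta, \delta') < \sup_\theta R(\theta, a\delta + b),
% then (\delta' - b)/a would have sup-risk strictly below
% \sup_\theta R(\theta, \delta) for estimating g(\theta),
% contradicting the minimaxity of \delta.
```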
1. Suppose that X | λ ~ Poisson(λ). Assuming squared error loss, derive the Bayes estimator of λ with respect to a Gamma prior distribution: first by explicitly deriving the marginal probability mass function of X, obtaining an expression for the posterior density of λ and evaluating E(λ | x); and secondly by identifying g(λ | x) by inspection and noting that it is a familiar distribution with a known mean.
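A numerical sanity check of the conjugacy the problem is driving at. The prior's parameters are garbled in the scan, so the values of `alpha`, `beta`, and `x` below are assumed for illustration; the point is that the grid-integrated posterior mean matches the closed-form Bayes estimator.

```python
import math

# Assumed Gamma(shape=alpha, rate=beta) prior and an assumed observation x.
alpha, beta = 2.0, 3.0
x = 4  # observed Poisson count

# By conjugacy the posterior is Gamma(alpha + x, beta + 1),
# so the Bayes estimator (posterior mean) is (alpha + x) / (beta + 1).
bayes_closed = (alpha + x) / (beta + 1)

# Cross-check: posterior mean by grid integration of likelihood * prior
# (normalizing constants cancel in the ratio of the two sums).
def unnorm_posterior(lam):
    return (lam ** x) * math.exp(-lam) * (lam ** (alpha - 1)) * math.exp(-beta * lam)

grid = [0.001 * i for i in range(1, 50001)]  # lambda in (0, 50]
weights = [unnorm_posterior(l) for l in grid]
bayes_numeric = sum(l * w for l, w in zip(grid, weights)) / sum(weights)

print(bayes_closed, round(bayes_numeric, 4))
```

The second route in the problem ("by inspection") is exactly the conjugacy step the closed-form line uses.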
Let X1, ..., Xn be a random sample from a population with pdf f(x | θ), where f(x | θ) = 0 for x < 0 and θ > 0 is unknown. (a) Show that the Gamma(a, b) prior, whose pdf vanishes for θ < 0, is a conjugate prior for θ (a > 0 and b > 0 are known constants). (b) Find the Bayes estimator of θ under squared error loss. (c) Find the Bayes estimator of (2π − 10)^{1/2} under squared error...
Let X ~ POI(μ), and let θ = P(X = 0) = e^{−μ}. (a) Is θ̃ = e^{−X} an unbiased estimator of θ? (b) Show that θ̂ = u(X) is an unbiased estimator of θ, where u(0) = 1 and u(x) = 0 if x > 0. (c) Compare the MSEs of θ̃ and θ̂ for estimating θ = e^{−μ} when μ = 1 and μ = 2.
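Part (c) can be checked exactly by summing over the (truncated) Poisson pmf. A small sketch, taking θ̃ = e^{−X} and θ̂ = u(X) as reconstructed above:

```python
import math

def pois_pmf(k, mu):
    return math.exp(-mu) * mu ** k / math.factorial(k)

def mse(estimator, mu, kmax=100):
    # Exact MSE of estimator(X) for theta = e^{-mu}; the pmf beyond kmax
    # is negligible for the small mu considered here.
    theta = math.exp(-mu)
    return sum(pois_pmf(k, mu) * (estimator(k) - theta) ** 2
               for k in range(kmax + 1))

theta_tilde = lambda x: math.exp(-x)           # plug-in estimator (biased)
theta_hat = lambda x: 1.0 if x == 0 else 0.0   # u(X); unbiased since E[u(X)] = P(X=0)

for mu in (1.0, 2.0):
    print(mu, round(mse(theta_tilde, mu), 5), round(mse(theta_hat, mu), 5))
```

Since θ̂ is unbiased, its MSE equals its variance θ(1 − θ), which the sum reproduces.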
4.6 Let X1, ..., Xn be independent r.v.'s distributed as P(θ), θ ∈ Ω = (0, ∞). Consider the estimate δ(x1, ..., xn) = x̄ and the loss function L(θ; δ) = [θ − δ(x1, ..., xn)]² / θ. (i) Calculate the risk R(θ; δ) = E_θ[L(θ; δ(X1, ..., Xn))] and show that it is independent of θ. (ii) Can you conclude that the estimate is minimax by using Theorem 9?
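The scan leaves P(θ) ambiguous; assuming it denotes the Poisson(θ) family, the risk is E[(x̄ − θ)²]/θ = (θ/n)/θ = 1/n for every θ, which a Monte Carlo check confirms (n and the θ values below are assumed):

```python
import math, random

random.seed(0)

def pois_sample(mu):
    # Knuth's multiplication sampler; fine for the small means used here.
    limit, k, p = math.exp(-mu), 0, 1.0
    while True:
        k += 1
        p *= random.random()
        if p <= limit:
            return k - 1

def risk(theta, n=10, reps=40_000):
    total = 0.0
    for _ in range(reps):
        xbar = sum(pois_sample(theta) for _ in range(n)) / n
        total += (xbar - theta) ** 2 / theta
    return total / reps

# Var(xbar) = theta/n, so the risk should be 1/n = 0.1 regardless of theta.
for theta in (0.5, 2.0, 5.0):
    print(theta, round(risk(theta), 3))
```

Constant risk is exactly the hypothesis the minimaxity theorem in part (ii) asks about.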
2a) Let a, b ∈ ℝ with a < b and let g: [a, b] → ℝ be continuous. Show that ∫_a^b g(x) cos(nx) dx → 0 as n → ∞. Hint: Let ε > 0. By uniform continuity of g, there exists δ > 0 such that |g(x) − g(y)| < ε / (2(b − a)) whenever |x − y| < δ. Choose points a = x0 < x1 < ... < xm = b such that |x_k − x_{k−1}| < δ. Then we may write ∫_a^b g(x) cos(nx) dx = A_n + B_n where ...
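The truncated hint presumably continues with the standard step-function split; a sketch of how it would go:

```latex
A_n = \sum_{k=1}^{m} g(x_{k-1}) \int_{x_{k-1}}^{x_k} \cos(nx)\,dx,
\qquad
B_n = \sum_{k=1}^{m} \int_{x_{k-1}}^{x_k} \bigl(g(x) - g(x_{k-1})\bigr)\cos(nx)\,dx.
% Uniform continuity gives |B_n| < \sum_k \frac{\varepsilon}{2(b-a)}(x_k - x_{k-1})
%   = \varepsilon/2 for every n, while
% \left|\int_{x_{k-1}}^{x_k}\cos(nx)\,dx\right|
%   = \frac{|\sin(n x_k) - \sin(n x_{k-1})|}{n} \le \frac{2}{n},
% so |A_n| \le \frac{2}{n}\sum_k |g(x_{k-1})| < \varepsilon/2 for n large.
```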
Let X1, ..., Xn be i.i.d. with density f(x) = (1/θ) exp(−x/θ) for x > 0 and θ > 0. a. Find the Pitman estimator of θ. b. Show that the Pitman estimator has smaller risk than the UMVUE of θ when the loss function is (t − θ)²/θ². c. Suppose f(x) = θ exp(−θx) and that θ has a Gamma prior with parameters α and p; find the Bayes estimator of θ. d. Find the minimum Bayes risk. e. Find the minimax estimator of θ if one exists.
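For part (b): under the scaled loss (t − θ)²/θ², the best scale-equivariant (Pitman) estimator works out to S/(n + 1) with S = ΣX_i, versus the UMVUE X̄ = S/n, with exact risks 1/(n + 1) and 1/n. A Monte Carlo comparison (n and θ below are assumed; the loss is scale-free, so the value of θ should not matter):

```python
import random

random.seed(42)
n, theta, reps = 5, 2.0, 100_000

def scaled_loss(t, theta):
    return (t - theta) ** 2 / theta ** 2

risk_pitman = risk_umvue = 0.0
for _ in range(reps):
    # S = sum of n Exp(mean = theta) draws
    s = sum(random.expovariate(1.0 / theta) for _ in range(n))
    risk_pitman += scaled_loss(s / (n + 1), theta)  # Pitman: S/(n+1)
    risk_umvue += scaled_loss(s / n, theta)         # UMVUE:  S/n

print(risk_pitman / reps, risk_umvue / reps)  # expect ~1/6 and ~1/5
```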
2. Suppose that X | θ ~ U(0, θ), the uniform distribution on the interval (0, θ). Assuming squared error loss, derive the Bayes estimator of θ with respect to the prior distribution P(α, θ0), the two-parameter Pareto model specified in (3.36): first by explicitly deriving the marginal density function of X, obtaining an expression for the posterior density of θ and evaluating E(θ | x); and secondly by identifying g(θ | x) by inspection and noting that it is a familiar distribution with a known mean.
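A numerical sanity check of the Pareto–uniform conjugacy (the values of α, θ0, and x below are assumed, since the parameters from (3.36) are not reproduced here):

```python
# With a Pareto(alpha, theta0) prior, density alpha * theta0^alpha / theta^(alpha+1)
# for theta >= theta0, and likelihood 1/theta on theta >= x, the posterior given
# X = x is Pareto(alpha + 1, max(theta0, x)), whose mean -- the Bayes estimator --
# is (alpha + 1) * max(theta0, x) / alpha.
alpha, theta0, x = 3.0, 1.0, 2.5

m = max(theta0, x)
bayes_closed = (alpha + 1) * m / alpha

# Grid cross-check: posterior is proportional to theta^{-(alpha+2)} on theta >= m.
grid = [m + 0.001 * i for i in range(400_000)]  # theta in [m, m + 400)
weights = [t ** (-(alpha + 2)) for t in grid]
bayes_numeric = sum(t * w for t, w in zip(grid, weights)) / sum(weights)

print(bayes_closed, round(bayes_numeric, 3))
```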
4. (a) Let A = [0, ∞) and let f, g: A → ℝ be functions which are continuous at 0 and are such that f(0) = g(0) = 1. Show that there exists some δ > 0 such that if x ∈ [0, δ) then ... (b) Consider the function f(x) = 1 if x ∈ ℝ is rational, 0 if x ∈ ℝ is irrational. Show that lim_{x→c} f(x) does not exist for any c ∈ ℝ.
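For part (b), the standard density argument can be sketched as follows:

```latex
% For any c \in \mathbb{R}, pick rationals q_n \to c and irrationals r_n \to c
% (both exist by density of \mathbb{Q} and of \mathbb{R}\setminus\mathbb{Q}).
% Then f(q_n) = 1 \to 1 while f(r_n) = 0 \to 0.
% If \lim_{x \to c} f(x) = L existed, both sequential limits would equal L,
% forcing 1 = L = 0, a contradiction.  Hence the limit fails at every c.
```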
(1) Let (G, ·) be a group. Show that Z(G) ≤ G, where Z(G) := {a ∈ G | ag = ga for all g ∈ G} is the center of G. (So, show that Z(G) is a nonempty subset of G closed under the group operation and inverses.) (2) Let G be a group, g ∈ G and H ≤ G. Prove that K ≤ G where (i) K = C_G(g) := {a ∈ G | ag = ga} is the centralizer of g in G, and (ii) K = N_G(H) := {a ∈ G | aH = Ha} is the normalizer of H in G.
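A sketch of the subgroup check for the center; the centralizer and normalizer verifications follow the same three-step pattern:

```latex
% Nonempty: e \in Z(G) since eg = g = ge for all g \in G.
% Closure: if a, b \in Z(G), then for all g,
%   (ab)g = a(bg) = a(gb) = (ag)b = (ga)b = g(ab),  so ab \in Z(G).
% Inverses: if a \in Z(G), multiplying ag = ga by a^{-1} on both sides
%   gives g a^{-1} = a^{-1} g for all g, so a^{-1} \in Z(G).
% Hence Z(G) \le G.
```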
Problem 1. Let X1, ..., Xn be a random sample from a Normal distribution with mean μ and variance 1. Answer the following questions for 8 points total. (a) Derive the moment generating function of the distribution. (1 point) Hint: use the fact that the PDF of a distribution always integrates to 1. (b) Show that the mean of the distribution is μ (proof needed). (1 point) (c) Using the random sample X1, ..., Xn, derive the maximum likelihood estimator of μ. (2...
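For part (c), the MLE for μ under N(μ, 1) is the sample mean, since the log-likelihood is −(1/2)Σ(x_i − μ)² up to a constant. A quick numerical sanity check (simulated data; n and the true μ below are assumed):

```python
import random
import statistics

random.seed(1)
n, mu_true = 500, 2.0
xs = [random.gauss(mu_true, 1.0) for _ in range(n)]

# The log-likelihood -(1/2) * sum((x_i - mu)^2) + const is maximized at mu = xbar.
xbar = statistics.fmean(xs)

def neg_loglik(mu):
    return 0.5 * sum((x - mu) ** 2 for x in xs)

# Nudging mu away from xbar can only increase the negative log-likelihood.
assert neg_loglik(xbar) <= min(neg_loglik(xbar - 0.01), neg_loglik(xbar + 0.01))
print(round(xbar, 3))
```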