This is an example of a Bayesian problem.
To compute the posterior distribution p(μ|y) we use Bayes' formula,
p(μ|y) = p(y|μ) p(μ) / p(y),
so first we need to calculate the likelihood p(y|μ).
Here the sampling model is Yi | μ ~ N(μ, σ²) with σ² known, and the prior distribution is μ ~ N(θ, τ²).
The probability density function of each Yi is
p(yi|μ) = (2πσ²)^(-1/2) exp(-(yi - μ)²/(2σ²)).
Since y1, y2, ..., yn are independent, the likelihood is the product of the individual densities,
p(y|μ) = ∏ p(yi|μ) = (2πσ²)^(-n/2) exp(-Σ(yi - μ)²/(2σ²)).
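As a quick numerical check of the product form of the likelihood (a sketch only; σ² and the data values below are invented for illustration, not part of the problem), the product of the individual densities should equal the compact form:

```python
import math

def normal_pdf(y, mu, sigma2):
    """Density of N(mu, sigma2) evaluated at y."""
    return (2 * math.pi * sigma2) ** -0.5 * math.exp(-(y - mu) ** 2 / (2 * sigma2))

# Illustrative values (not from the problem statement).
y = [1.2, 0.7, 1.9, 1.1]
mu, sigma2 = 1.0, 0.5
n = len(y)

# Product of the individual densities.
product_form = math.prod(normal_pdf(yi, mu, sigma2) for yi in y)

# Compact form: (2*pi*sigma2)^(-n/2) * exp(-sum((yi - mu)^2) / (2*sigma2)).
rss = sum((yi - mu) ** 2 for yi in y)
compact_form = (2 * math.pi * sigma2) ** (-n / 2) * math.exp(-rss / (2 * sigma2))

assert math.isclose(product_form, compact_form)
```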
The prior is
p(μ) = (2πτ²)^(-1/2) exp(-(μ - θ)²/(2τ²)).
The marginal p(y) = ∫ p(y|μ) p(μ) dμ does not depend on μ, so in
p(μ|y) = p(y|μ) p(μ) / p(y)
it acts only as a normalizing constant; we can work with the numerator p(y|μ) p(μ) and normalize at the end.
p(y|μ) p(μ) = (2πσ²)^(-n/2) (2πτ²)^(-1/2) exp{-(1/2)[Σ(yi - μ)²/σ² + (μ - θ)²/τ²]}.
Expand the square bracket in the exponent and complete the square in μ by adding and subtracting [θ/τ² + Σyi/σ²]² / [1/τ² + n/σ²].
Here the term involving μ² is (1/τ² + n/σ²) μ²,
the term involving μ is -2μ (θ/τ² + Σyi/σ²),
and the remaining terms are θ²/τ² + Σyi²/σ².
After collecting these terms, define
τ″ = [1/τ² + n/σ²]^(-1/2)  and  μ″ = [θ/τ² + Σyi/σ²] / [1/τ² + n/σ²].
The expression inside the bracket of the exponent is then
((μ - μ″)/τ″)² + {θ²/τ² + Σyi²/σ² - [θ/τ² + Σyi/σ²]² / [1/τ² + n/σ²]},
where the second group of terms does not involve μ and so is absorbed into the normalizing constant.
The posterior pdf for μ is
p(μ|y) = (2πτ″²)^(-1/2) exp(-(μ - μ″)²/(2τ″²)),
so the posterior distribution is normal with mean μ″ and variance τ″².
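One way to sanity-check this conclusion numerically (again a sketch with invented numbers for θ, τ², σ², and the data) is to verify that the closed-form N(μ″, τ″²) density is proportional to the numerator p(y|μ) p(μ): their ratio should be the same constant for every value of μ.

```python
import math

def normal_pdf(x, mean, var):
    """Density of N(mean, var) evaluated at x."""
    return (2 * math.pi * var) ** -0.5 * math.exp(-(x - mean) ** 2 / (2 * var))

# Illustrative values, not from the problem statement.
theta, tau2, sigma2 = 0.0, 4.0, 1.0
y = [1.2, 0.7, 1.9, 1.1]
n = len(y)

precision = 1 / tau2 + n / sigma2
mu_post = (theta / tau2 + sum(y) / sigma2) / precision   # mu''
tau2_post = 1 / precision                                # tau''^2

def unnormalized_posterior(mu):
    """p(y|mu) * p(mu), without the normalizing constant p(y)."""
    lik = math.prod(normal_pdf(yi, mu, sigma2) for yi in y)
    return lik * normal_pdf(mu, theta, tau2)

# The ratio unnormalized / closed-form should be constant in mu.
grid = [0.0, 0.5, 1.0, 1.5, 2.0]
ratios = [unnormalized_posterior(m) / normal_pdf(m, mu_post, tau2_post) for m in grid]
assert all(math.isclose(r, ratios[0]) for r in ratios)
```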
NOTE: Solve the exponent brackets as per the instructions given above; all the terms that can appear in the exponent brackets are listed there.