Suppose we are given two separate groups of data, y1 = (y_1, ..., y_n)^T and y2 = (y_{n+1}, y_{n+2}, ..., y_{n+m})^T, such that y_i ~ N(μ, σ^2) for i = 1, 2, ..., n and y_i ~ N(μ*, σ*^2) for i = n+1, n+2, ..., n+m. Assume all observations are mutually independent (so each group is an iid sample), and T denotes transpose.
a) If σ^2 = σ*^2 = σ0^2, where σ0^2 is known, find the posterior distribution p(μ, μ* | y1, y2) and find the Bayes estimators for μ and μ* (assume normal priors).
b) Now assume μ, μ*, σ^2, σ*^2 are all unknown. Assume normal priors for both μ and μ*, and inverse-gamma priors for σ^2 and σ*^2. Find the joint posterior and the marginal densities.
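For part (a), with a known common variance and independent normal priors, the likelihood factors across the two groups, so each mean gets the standard normal-normal conjugate update and the Bayes estimator under squared-error loss is the posterior mean. A sketch of that update in Python (the prior hyperparameters m0 and s0sq below are illustrative choices, not values given in the problem):

```python
def normal_posterior(ybar, n, sigma2, m0, s0sq):
    """Posterior of a normal mean with known variance sigma2 and a
    N(m0, s0sq) prior: precisions add, and the posterior mean is the
    precision-weighted average of prior mean and sample mean."""
    prec = 1.0 / s0sq + n / sigma2            # posterior precision
    mean = (m0 / s0sq + n * ybar / sigma2) / prec
    return mean, 1.0 / prec                   # (posterior mean, posterior variance)

# Illustrative numbers: prior N(0, 1), known sigma^2 = 1, n = 4, ybar = 2
m, v = normal_posterior(ybar=2.0, n=4, sigma2=1.0, m0=0.0, s0sq=1.0)
print(m, v)   # 1.6 0.2  -- the Bayes estimator of mu is the posterior mean 1.6
```

The same update applied to the second group's sample mean gives the posterior for μ*; independence of the priors and of the two samples makes the joint posterior the product of the two marginals.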
Suppose Y1, Y2, ..., Yn are mutually independent random variables with Yi ~ N(μi, σi^2) for i = 1, ..., n. Find the distribution of U = Σ_{i=1}^n ((Yi - μi)/σi)^2. I am not sure where I should start this question; could you please show me the details of how you do these two parts? Thanks :)
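Each (Yi - μi)/σi is standard normal, so U is a sum of n independent squared standard normals, which is chi-square with n degrees of freedom (mean n, variance 2n). A simulation sanity check (the means, standard deviations, and repetition count are my own choices):

```python
import random

random.seed(42)

n = 5
mus = [1.0, -2.0, 0.5, 3.0, 0.0]     # arbitrary means
sigmas = [0.5, 2.0, 1.0, 3.0, 1.5]   # arbitrary standard deviations

reps = 100_000
total = 0.0
for _ in range(reps):
    # Standardize each normal draw, square, and sum
    u = sum(((random.gauss(m, s) - m) / s) ** 2
            for m, s in zip(mus, sigmas))
    total += u

print(total / reps)   # close to n = 5, the chi-square(n) mean
```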
Problem #2: Suppose that Y1, Y2, ..., Yn denote a random sample of size n from a normal population with mean μ and variance σ^2. Then it can be shown that (n-1)S^2/σ^2 has a chi-square distribution with (n-1) degrees of freedom.
a. Show that S^2 is an unbiased estimator of σ^2.
b. ...
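Part (a) amounts to showing E[S^2] = σ^2, which follows directly from E[(n-1)S^2/σ^2] = n - 1 for a chi-square(n-1) variable. A simulation check (the sample size, mean, and variance are arbitrary choices for illustration):

```python
import random

random.seed(7)

n = 10                   # sample size
mu, sigma2 = 5.0, 4.0    # arbitrary population mean and variance
sigma = sigma2 ** 0.5

def sample_var(ys):
    """Unbiased sample variance S^2 with divisor n - 1."""
    ybar = sum(ys) / len(ys)
    return sum((y - ybar) ** 2 for y in ys) / (len(ys) - 1)

reps = 200_000
total = 0.0
for _ in range(reps):
    ys = [random.gauss(mu, sigma) for _ in range(n)]
    total += sample_var(ys)

print(total / reps)   # close to sigma2 = 4, illustrating unbiasedness
```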
1. Suppose we are going to sample n individuals and ask each sampled person whether they support policy A or not. Let Yi = 1 if person i in the sample supports the policy, and Yi = 0 otherwise.
(a) Assume Y1, ..., Yn are, conditional on θ, i.i.d. binary random variables with expectation θ. Write down the joint distribution Pr(Y1 = y1, ..., Yn = yn | θ) in a compact form. Also write down the form of Pr(Σ Yi = y | θ).
(b) For the moment, suppose...
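For part (a), the compact joint pmf is Pr(Y1 = y1, ..., Yn = yn | θ) = θ^Σyi (1 - θ)^(n - Σyi), and the sum Σ Yi is Binomial(n, θ). A quick numerical check that the binomial pmf is exactly the joint pmf summed over all 0/1 vectors with a given total (θ and n below are illustrative):

```python
from itertools import product
from math import comb

def joint_pmf(ys, theta):
    """Product of Bernoulli(theta) pmfs: theta^s * (1 - theta)^(n - s)."""
    s, n = sum(ys), len(ys)
    return theta ** s * (1 - theta) ** (n - s)

def sum_pmf(y, n, theta):
    """Pr(sum of Y_i = y | theta): Binomial(n, theta)."""
    return comb(n, y) * theta ** y * (1 - theta) ** (n - y)

theta, n = 0.3, 4
for y in range(n + 1):
    # Sum the joint pmf over all 2^n outcomes whose total is y
    total = sum(joint_pmf(ys, theta)
                for ys in product((0, 1), repeat=n) if sum(ys) == y)
    print(y, total, sum_pmf(y, n, theta))   # the two columns agree
```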
Suppose Y1, Y2, ..., Yn is an iid sample from a Pareto population distribution described by the pdf f_Y(y | θ) = θ α0^θ y^(-θ-1), y > α0, where the parameter α0 > 0 is known. The unknown parameter is θ > 0. (a) Find the MOM estimator of θ. (b) Find the MLE of θ.
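For this Pareto family, maximizing the log-likelihood n log θ + nθ log α0 - (θ+1) Σ log yi gives the MLE θ̂ = n / Σ log(yi/α0), and setting E[Y] = θα0/(θ-1) equal to ȳ gives the MOM estimator θ̂ = ȳ/(ȳ - α0), valid when θ > 1. A simulation check via inverse-CDF sampling (the values of α0, θ, and n are my own choices):

```python
import math
import random

random.seed(3)

alpha0 = 2.0   # known scale parameter (illustrative)
theta = 3.0    # true shape parameter (illustrative)

# Inverse-CDF sampling: F(y) = 1 - (alpha0 / y)^theta for y > alpha0,
# so Y = alpha0 * U^(-1/theta) with U uniform on (0, 1).
n = 100_000
ys = [alpha0 * random.random() ** (-1.0 / theta) for _ in range(n)]

# MLE: theta_hat = n / sum(log(y_i / alpha0))
theta_mle = n / sum(math.log(y / alpha0) for y in ys)

# MOM (needs theta > 1 so E[Y] is finite): theta_hat = ybar / (ybar - alpha0)
ybar = sum(ys) / n
theta_mom = ybar / (ybar - alpha0)

print(theta_mle, theta_mom)   # both near the true theta = 3
```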
Let Y1, Y2, ..., Yn be independent, normal random variables, each with mean μ and variance σ^2.
(a) Find the density function of the sample mean Ȳ:
f_Ȳ(u) =
(b) If σ^2 = 25 and n = 9, what is the probability that the sample mean, Ȳ, takes on a value that is within one unit of the population mean, μ? That is, find P(|Ȳ - μ| ≤ 1). (Round your answer to four decimal places.)
P(|Ȳ - μ| ≤ 1) =
(c)...
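For part (b), Ȳ ~ N(μ, σ^2/n), so the probability reduces to a standard normal CDF evaluation. A quick check in Python using only the standard library (the numbers 25 and 9 come from the problem statement):

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

sigma2, n = 25.0, 9                 # variance and sample size from part (b)
sd_ybar = math.sqrt(sigma2 / n)     # standard deviation of Ybar: 5/3
z = 1.0 / sd_ybar                   # standardize the one-unit half-width
p = 2 * normal_cdf(z) - 1           # P(|Ybar - mu| <= 1) = P(|Z| <= 0.6)
print(round(p, 4))                  # → 0.4515
```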
Let a two-output cost function be given by: C(y1, y2) = y1 + y2 + y1*y2 - (y1*y2)^2 + y2. Assume that y1 > 1 and y2 > 1. Does this cost function exhibit economies of scope?
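Economies of scope hold when joint production is cheaper than separate production, i.e. C(y1, 0) + C(0, y2) > C(y1, y2). A numerical spot-check in Python, assuming the intended cost function is C(y1, y2) = y1 + y2 + y1*y2 - (y1*y2)^2 + y2 (the squared term and the duplicated y2 are reconstructed from the garbled original, so treat this as a sketch, not a definitive answer):

```python
def C(y1, y2):
    # Cost function as reconstructed from the problem statement;
    # note the y2 term appears twice in the source text.
    return y1 + y2 + y1 * y2 - (y1 * y2) ** 2 + y2

def has_economies_of_scope(y1, y2):
    """True if producing jointly is cheaper than producing separately."""
    return C(y1, 0.0) + C(0.0, y2) > C(y1, y2)

# For y1, y2 > 1 we have y1*y2 > 1, so y1*y2 - (y1*y2)^2 < 0 and the
# joint cost falls below the sum of stand-alone costs at every point.
for y1 in (1.5, 2.0, 5.0):
    for y2 in (1.5, 3.0, 10.0):
        print(y1, y2, has_economies_of_scope(y1, y2))
```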
...samples; the other in small samples. Which is which? Explain. (d) Suppose we know that the 5 values are from a symmetric distribution. Then the sample median is also unbiased and consistent for the population mean. The sample mean has lower variance. Would you prefer to use the sample...
4. Suppose Y1, ..., Yn are iid random variables with E(Yi) = μ, Var(Yi) = σ^2 < ∞. For large n, find the approximate...
5. Suppose we observe Y1, ..., Yn from a normal distribution...
2. [12 points] Bjorn, a member of ABBA, knows that X (which is the amount an ABBA song brightens or worsens a person's day) has a Normal distribution with a mean of 0... and he's OK with that. What he wants to learn more about is the precision of X, which is the inverse of the variance. Bjorn thinks that the precision has a Gamma distribution as described below. Help Bjorn out by finding the posterior distribution for the precision.
Suppose Y1, Y2, ..., Yn | τ ~ iid N(μ0, τ^-1). The population mean μ0 is known. The unknown parameter τ > 0, which is the inverse of the population variance, is called the precision. The pdf of N(μ0, τ^-1) is given by
f_{Y|τ}(y | τ) = sqrt(τ/(2π)) exp(-(τ/2)(y - μ0)^2).
Let's now derive the posterior distribution of τ from the Bayesian perspective.
(a) Define U = Σ_{i=1}^n (Yi - μ0)^2. Show that U is a sufficient statistic for τ using the Factorization...
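With a Gamma(a, b) prior on τ (shape a, rate b), the likelihood depends on the data only through U, and the conjugate posterior is Gamma(a + n/2, b + U/2). A small simulation sketch; the specific values of a, b, μ0, and the true precision are illustrative, not from the problem:

```python
import random

random.seed(1)

a, b = 2.0, 1.0       # illustrative Gamma(shape=a, rate=b) prior on tau
mu0 = 10.0            # known population mean (illustrative)
tau_true = 4.0        # true precision used to simulate data (variance 0.25)

n = 5000
ys = [random.gauss(mu0, tau_true ** -0.5) for _ in range(n)]

# Sufficient statistic from part (a)
U = sum((y - mu0) ** 2 for y in ys)

# Conjugate update: posterior is Gamma(a + n/2, b + U/2)
a_post = a + n / 2
b_post = b + U / 2
posterior_mean = a_post / b_post   # concentrates near tau_true as n grows
print(posterior_mean)
```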
As before, suppose we have N data points of the form (x1, y1), (x2, y2), ..., (xN, yN). We consider the linear model yi = β0 + β1*xi + εi. Substitute into this equation the formula for β̂0 given at the bottom of page 2 in the "linearregression.pdf" notes. Then sum both sides from i = 1 to i = N. In this way, compute the total sum of errors. What does this sum equal? (Please enter a numerical value.)
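When the intercept is estimated by least squares as β̂0 = ȳ - β̂1*x̄ (the standard OLS formula, which I am assuming is the one in the referenced notes), summing the fitted equation over i forces the residuals to sum to zero. A quick numerical check with made-up data:

```python
import random

random.seed(0)

# Made-up data for illustration
N = 50
xs = [random.uniform(0, 10) for _ in range(N)]
ys = [3.0 + 2.0 * x + random.gauss(0, 1) for x in xs]

xbar = sum(xs) / N
ybar = sum(ys) / N

# Ordinary least squares slope and intercept
b1 = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
      / sum((x - xbar) ** 2 for x in xs))
b0 = ybar - b1 * xbar   # the intercept formula being substituted

# Total sum of errors (residuals): zero up to floating-point noise
total = sum(y - (b0 + b1 * x) for x, y in zip(xs, ys))
print(total)
```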