(e)
R code:
n <- c(25, 64, 100, 400, 1000)          # sample sizes
k <- 10000                              # number of simulated samples per n
p <- 0.05                               # true success probability
hat_p <- matrix(0, nrow = 5, ncol = k)  # sample proportions
LB <- UB <- hat_p                       # lower/upper Wald limits
LB1 <- UB1 <- numeric(5)                # average limits for each n
for (i in 1:5) {
  for (j in 1:k) {
    # sample proportion: sum of n[i] Bernoulli(p) draws divided by n[i]
    hat_p[i, j] <- sum(rbinom(n[i], 1, p)) / n[i]
    # 95% Wald interval: hat_p +/- z_{0.975} * sqrt(hat_p * (1 - hat_p) / n)
    LB[i, j] <- hat_p[i, j] - qnorm(0.975) * sqrt(hat_p[i, j] * (1 - hat_p[i, j]) / n[i])
    UB[i, j] <- hat_p[i, j] + qnorm(0.975) * sqrt(hat_p[i, j] * (1 - hat_p[i, j]) / n[i])
  }
  # average the limits over the k replications
  LB1[i] <- mean(LB[i, ])
  UB1[i] <- mean(UB[i, ])
}
LB1
UB1
Output:
> LB1
[1] -0.0192771189 -0.0006119577  0.0085541845  0.0287154169  0.0365087445
> UB1
[1] 0.11950912 0.09977758 0.09132982 0.07110208 0.06344866
Note that for n = 25 the average lower limit is negative: when p is as small as 0.05, the Wald interval can extend below 0. The limits tighten toward p = 0.05 as n grows.
R code:
# Same simulation as above, now with true p = 0.5
n <- c(25, 64, 100, 400, 1000)          # sample sizes
k <- 10000                              # number of simulated samples per n
p <- 0.5                                # true success probability
hat_p <- matrix(0, nrow = 5, ncol = k)
LB <- UB <- hat_p
LB1 <- UB1 <- numeric(5)
for (i in 1:5) {
  for (j in 1:k) {
    hat_p[i, j] <- sum(rbinom(n[i], 1, p)) / n[i]
    LB[i, j] <- hat_p[i, j] - qnorm(0.975) * sqrt(hat_p[i, j] * (1 - hat_p[i, j]) / n[i])
    UB[i, j] <- hat_p[i, j] + qnorm(0.975) * sqrt(hat_p[i, j] * (1 - hat_p[i, j]) / n[i])
  }
  LB1[i] <- mean(LB[i, ])
  UB1[i] <- mean(UB[i, ])
}
LB1
UB1
Output:
> LB1
[1] 0.3092422 0.3776961 0.4021676 0.4508827 0.4690739
> UB1
[1] 0.6932538 0.6208132 0.5971704 0.5487613 0.5310219
Here the average limits are roughly symmetric about p = 0.5 and narrow steadily as n increases.
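The inner loop over j can be collapsed: the sum of n[i] independent Bernoulli(p) draws is a single Binomial(n[i], p) draw, so one rbinom() call of length k produces all k sample proportions at once. A minimal vectorized sketch of the same computation (the set.seed() call and variable names z and se are additions, not part of the original code):

```r
n <- c(25, 64, 100, 400, 1000)  # sample sizes
k <- 10000                      # number of simulated samples per n
p <- 0.05                       # true success probability
z <- qnorm(0.975)               # 97.5% normal quantile
set.seed(1)                     # for reproducibility
LB1 <- UB1 <- numeric(length(n))
for (i in seq_along(n)) {
  # k Binomial(n[i], p) draws, each divided by n[i], give k sample proportions
  hat_p <- rbinom(k, n[i], p) / n[i]
  # Wald standard error for each simulated proportion
  se <- sqrt(hat_p * (1 - hat_p) / n[i])
  LB1[i] <- mean(hat_p - z * se)
  UB1[i] <- mean(hat_p + z * se)
}
LB1
UB1
```

This produces the same quantities as the double loop but runs noticeably faster, since rbinom() and the arithmetic on hat_p are vectorized over all k replications.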