Question

QUESTION 3 (17) Let X_{1}, X_{2}, ....., X_{n} be a random sample from a distribution with probability density function f(x; \beta ) = \beta e^{-\beta x} for x > 0, and 0 elsewhere. (a) What is the likelihood L(\beta ) = L(x_{1}, x_{2}, ....., x_{n}\mid \beta ) of the sample? Simplify it. (b) Use the factorization criterion/theorem to show that \sum x_{i} is a sufficient statistic for \beta .

Answer #1

Let X_{1},X_{2},.....,X_{n} be a random sample from a distribution with probability density function

f(x;\beta )=\beta e^{-\beta x} for x> 0

=0 elsewhere.

(a) The likelihood of the sample is calculated as

L(\beta\mid x )=\prod_{i=1}^{n}f(x_{i},\beta )=\prod_{i=1}^{n}\beta e^{-\beta x_{i}}

  =\beta e^{-\beta x_{1}}\times \beta e^{-\beta x_{2}}\times \beta e^{-\beta x_{3}}\times ......\times \beta e^{-\beta x_{n}}

  =\beta^{n} e^{-\beta \sum_{i=1}^{n}x_{i}}

Taking the logarithm of both sides, we get

log L(\beta\mid x )=log(\beta^{n} e^{-\beta \sum_{i=1}^{n}x_{i}})

  =n log\beta -\beta \sum_{i=1}^{n}x_{i} ..................(1)

Taking the derivative of both sides with respect to \beta , we get

\frac{d log L(\beta\mid x )}{d\beta }=\frac{n}{\beta }-\sum_{i=1}^{n}x_{i}=0 (first-order condition of optimization)

or, \frac{n}{\beta }=\sum_{i=1}^{n}x_{i}

or, \hat{\beta }=\frac{n}{\sum_{i=1}^{n}x_{i}}=\frac{1}{\bar{x}} since \bar{x}=\frac{\sum_{i=1}^{n}x_{i}}{n}

The second derivative, \frac{d^{2} log L(\beta\mid x )}{d\beta^{2}}=-\frac{n}{\beta^{2}}< 0 , so this critical point is a maximum. Therefore the likelihood of the sample is L(\beta\mid x )=\beta^{n} e^{-\beta \sum_{i=1}^{n}x_{i}} , and the maximum likelihood estimator is \hat{\beta }=\frac{1}{\bar{x}} .
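As a quick numerical sanity check (not part of the original answer; a Python environment with NumPy and SciPy is assumed here), the sketch below simulates data from f(x;\beta )=\beta e^{-\beta x} and confirms that the numerical maximizer of the log-likelihood agrees with the closed form 1/\bar{x} :

import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative sketch only: simulate from f(x; beta) = beta * exp(-beta * x)
# and check that the numerical maximizer of log L matches 1 / x_bar.
rng = np.random.default_rng(0)
beta_true = 2.0
x = rng.exponential(scale=1.0 / beta_true, size=1000)  # NumPy uses scale = 1 / beta

def neg_log_likelihood(beta):
    # -log L(beta | x) = -(n * log(beta) - beta * sum(x)), from equation (1)
    return -(len(x) * np.log(beta) - beta * x.sum())

res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 100.0), method="bounded")
print("numerical MLE:", res.x)
print("1 / x_bar    :", 1.0 / x.mean())

The two printed values should agree to several decimal places, matching the analytic result above.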

(b) For the random sample X_{1},X_{2},.....,X_{n} from this distribution, the joint probability density function is

f(x_{1},x_{2},.....,x_{n},\beta )=f(x_{1},\beta )\times f(x_{2},\beta )\times f(x_{3},\beta )\times .....\times f(x_{n},\beta )

Substituting f(x,\beta )=\beta e^{-\beta x} , we get

f(x_{1},x_{2},.....,x_{n},\beta )=\beta e^{-\beta x_{1}}\times \beta e^{-\beta x_{2}}\times \beta e^{-\beta x_{3}}\times .....\times \beta e^{-\beta x_{n}}

f(x_{1},x_{2},.....,x_{n},\beta )=\beta^{n} e^{-\beta\sum_{i=1}^{n} x_{i}}
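For reference, the factorization criterion (Fisher–Neyman) states that a statistic T(X_{1},.....,X_{n}) is sufficient for \beta if and only if the joint density can be written as

f(x_{1},x_{2},.....,x_{n},\beta )=g\left ( T(x_{1},.....,x_{n}),\beta \right )\times h(x_{1},.....,x_{n})

where g depends on the data only through T and h does not depend on \beta .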

Now, according to the factorization theorem, we can factor the joint probability density function into one factor that depends on the parameter \beta and on the data only through the statistic \sum_{i=1}^{n}X_{i} , and another factor that is independent of the parameter \beta :

f(x_{1},x_{2},.....,x_{n},\beta )=[\beta^{n} e^{-\beta\sum_{i=1}^{n} x_{i}}]\times 1

where [\beta^{n} e^{-\beta\sum_{i=1}^{n} x_{i}}] depends on the parameter \beta and on the data only through \sum_{i=1}^{n} x_{i} , and [1] is independent of the parameter \beta .

Therefore the factorization theorem shows that \sum_{i=1}^{n}X_{i} is a sufficient statistic for \beta , and since \bar{X} is a one-to-one function of \sum_{i=1}^{n}X_{i} , \bar{X} is also a sufficient statistic for \beta .
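As an optional illustration (Python with NumPy is assumed; the sample values below are made up for the example), the likelihood \beta^{n} e^{-\beta\sum x_{i}} depends on the sample only through \sum x_{i} , so two samples of the same size with the same total give identical likelihood values for every \beta :

import numpy as np

# Illustrative sketch only: the likelihood depends on the data only
# through sum(x), which is what sufficiency of sum(X_i) means here.
def likelihood(x, beta):
    x = np.asarray(x, dtype=float)
    return beta ** len(x) * np.exp(-beta * x.sum())

x_a = [0.5, 1.5, 2.0]   # sum = 4.0
x_b = [1.0, 1.0, 2.0]   # same size, same sum, different values

for beta in (0.5, 1.0, 2.0):
    print(beta, likelihood(x_a, beta), likelihood(x_b, beta))
# Each row prints two identical likelihood values.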
