(a) Observe that Y is the number of successes in X independent Bernoulli trials, where X is itself the number of trials needed to obtain the first success, so X is Geometric(p) with E(X) = 1/p. Given X = x, the average number of successes in x trials with success probability p is xp. Since the trials that generate Y are independent of X, averaging over X gives E(Y) = p * E(X) = p * (1/p) = 1.
(b) E(Y) = E(E(Y|X)) = E(Xp) = p * E(X) = p * (1/p) = 1
(c) Given X, Y is Binomial(X, p), so E(Y(Y-1)|X) = X(X-1)p^2 and hence E(Y^2|X) = X(X-1)p^2 + Xp. For geometric X we have E(X) = 1/p and E(X^2) = (2-p)/p^2, so E(X^2 - X) = (2-p)/p^2 - 1/p = 2(1-p)/p^2. Therefore
E(Y^2) = E(E(Y^2|X)) = p^2 * E(X^2 - X) + E(Xp) = p^2 * 2(1-p)/p^2 + 1 = 2(1-p) + 1 = 3 - 2p
Var(Y) = E(Y^2) - [E(Y)]^2 = 3 - 2p - 1 = 2(1-p)
(As a check, when p = 1 both X and Y are identically 1, and indeed Var(Y) = 2(1-1) = 0.)
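The two-stage setup above is easy to check numerically. The sketch below (the parameter value p = 0.4 and the helper name simulate are arbitrary choices for illustration) draws X as the trial number of the first success and then draws Y as a Binomial(X, p) count from a fresh, independent batch of trials:

```python
# Monte Carlo sketch of the setup: X ~ Geometric(p) is the number of trials
# up to and including the first success; given X = x, Y ~ Binomial(x, p)
# from an independent set of trials.
import random

def simulate(p, n_sims=200_000, seed=42):
    rng = random.Random(seed)
    ys = []
    for _ in range(n_sims):
        # Draw X: count trials until the first success.
        x = 1
        while rng.random() >= p:
            x += 1
        # Given X = x, draw Y as the number of successes in x fresh trials.
        y = sum(rng.random() < p for _ in range(x))
        ys.append(y)
    mean = sum(ys) / n_sims
    var = sum((y - mean) ** 2 for y in ys) / n_sims
    return mean, var

p = 0.4
mean, var = simulate(p)
print(f"estimated E(Y)  = {mean:.3f}")
print(f"estimated Var(Y) = {var:.3f}")
```

For p = 0.4 the estimates land near E(Y) = 1 and Var(Y) = 2(1 - p) = 1.2, in agreement with the derivation.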