c) The following code gives us an estimate of the empirical mean (i.e., the sample mean) and the pmf of the sample:
It produces the following output:
d & e) The code below serves both parts d) and e): it checks memorylessness by conditioning on the times t = 3, 6, and 9:
It gives the following output:
For part e) we also obtain the same pmf from the conditioned sample, which confirms the memoryless property; the conditional mean likewise matches the mean of X.
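Since the referenced code and its output are not included above, here is a minimal sketch of parts c)-e), assuming X is the geometric number of plays until the first win with win probability p = 0.1 (as in the wheel-of-fortune setup); the parameter and sample size are taken from the problem statement:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.1  # assumed per-play win probability from the wheel-of-fortune setup

# X = number of plays until the first win (geometric, support 1, 2, ...)
x = rng.geometric(p, size=1_000_000)
print("empirical mean:", x.mean())  # theoretical mean is 1/p = 10

# Memorylessness: the law of X - t given X > t should match the law of X,
# so the conditional mean should also be ~10 for t = 3, 6, 9
for t in (3, 6, 9):
    tail = x[x > t] - t
    print(f"t = {t}: conditional mean = {tail.mean():.3f}")
```

The empirical pmf of `tail` can be compared bin-by-bin against that of `x` to make the part e) claim concrete.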
Python for C-E please. C) Generate 1,000,000 samples from the random variable X of part B. Estimate the empirical mean o...
*ONLY PART C: Plotting the PMF*
Please do this in Python.
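A minimal Python sketch for part C's plot, assuming X from part B (not shown here) is geometric with parameter p = 0.1; the parameter value and variable names are placeholders:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
p = 0.1  # assumed parameter of X from part B
samples = rng.geometric(p, size=1_000_000)

# Empirical pmf: relative frequency of each observed value
values, counts = np.unique(samples, return_counts=True)
pmf = counts / samples.size

plt.stem(values, pmf)
plt.xlabel("x")
plt.ylabel("empirical P(X = x)")
plt.title("Empirical PMF of X")
plt.savefig("pmf.png")
print("P(X = 1) ~", pmf[0])  # theory: p = 0.1
```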
2.1 Discrete memorylessness: Wheel of Fortune. Imagine a wheel of fortune at a casino in Vegas. By playing the wheel each time, you will be a winner of the grand prize with a probability of 0.1 (so generous of a casino!). A) (Short answer) What is the distribution of the probability of winning? Does the probability of winning each time depend on how many times you have already played...
python coding please
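A hedged simulation sketch for part A's question, under the stated setup (win probability 0.1 on each play): it checks empirically that the chance of winning on play k, given you are still playing, does not depend on k.

```python
import numpy as np

rng = np.random.default_rng(2)
p = 0.1
n_players = 1_000_000

# Play number of each simulated player's first win (geometric distribution)
plays = rng.geometric(p, size=n_players)

# P(win on play k | still playing at play k) should be ~0.1 for every k
for k in (1, 2, 5, 10):
    still = (plays >= k).sum()   # players who reached play k
    won = (plays == k).sum()     # players who won exactly on play k
    print(f"play {k}: P(win | still playing) = {won / still:.3f}")
```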
1.2 Sum of Independent Random Variables. Consider a set of n random variables X_1, X_2, ..., X_n. Let's define the random variable Y as the summation of all the X_i variables. A) For the case m = 10 with the X_i being independent uniform variables on the interval [-0.5, 0.5], generate 100,000 samples of Y. Use the discretization technique from the previous section for the [-5, 5] interval and plot the pmf of Y. B) Now increase m to...
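A minimal sketch of part A, using the values given in the problem (m = 10 uniforms on [-0.5, 0.5], 100,000 samples, discretization of [-5, 5]); the bin count of 100 is an assumption, since the previous section's discretization is not shown:

```python
import numpy as np

rng = np.random.default_rng(3)
m, n_samples = 10, 100_000

# Y = sum of m independent Uniform(-0.5, 0.5) variables
y = rng.uniform(-0.5, 0.5, size=(n_samples, m)).sum(axis=1)

# Discretize [-5, 5] into 100 bins and estimate the pmf per bin
bins = np.linspace(-5, 5, 101)
counts, edges = np.histogram(y, bins=bins)
pmf = counts / n_samples
print("mean ~", y.mean(), " var ~", y.var())  # theory: 0 and m/12
```

Plotting `pmf` against the bin centers shows the familiar bell shape predicted by the central limit theorem.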
Part B: Sampling and Random Variable You already have ten marked pennies (ones with numbers from Part A) and 15 unmarked pennies. Thought experiment: Throw them all in a jar and shake. Without looking, pull three out and record how many of them are marked (have a number). You will get 0, 1, 2, or 3 marked coins. How many different samples of 3 pennies out of 25 can you get? (Order doesn’t matter.) Answer: 2,300 Show why 2,300 is...
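The count 2,300 is the binomial coefficient C(25, 3), and the number of marked coins drawn follows a hypergeometric distribution (10 marked, 15 unmarked, 3 drawn). A quick numerical check:

```python
from math import comb

# Number of ways to choose 3 pennies out of 25, order ignored:
print(comb(25, 3))  # 25! / (3! * 22!) = (25 * 24 * 23) / 6 = 2300

# Hypergeometric pmf: P(k marked among the 3 drawn), 10 marked / 15 unmarked
for k in range(4):
    p_k = comb(10, k) * comb(15, 3 - k) / comb(25, 3)
    print(f"P({k} marked) = {p_k:.4f}")
```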
You only need to do Q2 (a)'s (i) and (ii). No need to do part B.
2. (a) Let X be a random variable with a continuous distribution F. (i) Show that the random variable Y = F(X) is uniformly distributed over (0,1). (Hint: Although F is the distribution of X, regard it simply as a function satisfying certain properties required to make it a CDF!) (ii) Now, given that Y = y, a random variable Z is...
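Part (i) is the probability integral transform, and it can be sanity-checked numerically. A sketch using the exponential distribution as an illustrative choice of F (any continuous F would do), where F(x) = 1 - exp(-x):

```python
import numpy as np

rng = np.random.default_rng(4)

# X ~ Exponential(1), so F(x) = 1 - exp(-x); then Y = F(X) should be Uniform(0,1)
x = rng.exponential(1.0, size=100_000)
y = 1 - np.exp(-x)

# Uniform(0,1) has mean 1/2 and variance 1/12
print("mean ~", y.mean(), " var ~", y.var())

# The empirical CDF of Y at q should be ~q itself
for q in (0.1, 0.5, 0.9):
    print(f"P(Y <= {q}) ~ {(y <= q).mean():.3f}")
```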
Please answer parts (a) through (d).
Problem 2. Let X be a random variable with one of the following cumulative distribution functions. [Figure: two candidate CDF plots; x-axis from -1.0 to 2.0, y-axis from 0.0 to 1.2.] Pick the correct cumulative distribution function plot and answer the questions: Write down the probability mass function. What is the PMF of X? A. Poisson (3...
Q5. Simulations to estimate the expectation. Let X be a Gaussian random variable with mean 0 and variance 1 (i.e., μ = 0 and σ = 1). Use R code to take 10k samples from X. (a) Plot the histogram and compare it with the p.d.f. of X (using the formula from the textbook or Wikipedia). Show both plots. (b) Compute E[X] empirically (i.e., average the samples); now repeat this computation with 50k samples.
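The question asks for R; for consistency with the rest of this thread, here is an equivalent sketch in Python showing the same two steps (the sample sizes 10k and 50k come from the problem):

```python
import numpy as np

rng = np.random.default_rng(5)

# Part (b): empirical E[X] at both sample sizes; should approach 0
for n in (10_000, 50_000):
    x = rng.normal(0.0, 1.0, size=n)  # mu = 0, sigma = 1
    print(f"n = {n}: empirical E[X] = {x.mean():+.4f}")

# Part (a): a density histogram of the samples compared against the
# Gaussian pdf f(t) = exp(-t**2 / 2) / sqrt(2 * pi) on a grid
counts, edges = np.histogram(x, bins=60, range=(-4, 4), density=True)
centers = (edges[:-1] + edges[1:]) / 2
pdf = np.exp(-centers**2 / 2) / np.sqrt(2 * np.pi)
print("max |histogram - pdf| =", np.abs(counts - pdf).max())
```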
2. Let X be a geometric random variable with mean 5. Define a new random variable Y using the following function: Y = |X - 3| if X < 5, and Y = 2 if X ≥ 5, where | · | denotes the absolute value. (a) Find the PMF of Y. (b) Find the CDF of Y. (c) Find E[Y] and Var[Y]. (d) Find P[Y = 1 | Y < 3].
Please explain where the "e^-2" comes from.
Consider a discrete random variable X whose PMF is given by p_X(i) = c * 2^i / i!, for i = 0, 1, 2, .... a) Determine the constant c. What class of distribution does X belong to, and what is the parameter of X as a member of that class? Since the sum over i ≥ 0 of 2^i / i! equals e^2, normalization requires c = e^-2. Note that P(X = i) = e^-2 * 2^i / i!, which is the PMF of a Poisson distribution with parameter 2.
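The e^-2 is just the normalizing constant: the series sum of 2^i / i! is the Taylor expansion of e^2, so c must be its reciprocal. A quick numerical confirmation:

```python
from math import exp, factorial

# Normalization of p_X(i) = c * 2**i / i!:
# the series sum_{i>=0} 2**i / i! is the Taylor series of e**2,
# so c = 1 / e**2 = e**(-2).
series = sum(2**i / factorial(i) for i in range(50))
print(series, exp(2))     # both ~ 7.389056

c = 1 / series
print(c, exp(-2))         # both ~ 0.135335
```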
7. Let X1, ..., Xn be a random sample from a Bernoulli distribution with parameter p. A random variable X with Bernoulli distribution has probability mass function (pmf) f(x; p) = p^x (1-p)^(1-x), x ∈ {0, 1}, with E(X) = p and Var(X) = p(1-p). (a) Find the method of moments (MOM) estimator of p. (b) Find a sufficient statistic for p. (Hint: Be careful when you write the joint pmf. Don't forget to sum the whole power of each term, that is, for the second term you will have (1...
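For part (a), the method of moments matches the first moment E[X] = p to the sample mean, giving p-hat = X-bar. A sketch with a hypothetical true parameter (p = 0.3 and n = 100,000 are illustrative choices, not from the problem):

```python
import numpy as np

rng = np.random.default_rng(6)
p_true, n = 0.3, 100_000  # hypothetical values for illustration

x = rng.binomial(1, p_true, size=n)  # Bernoulli(p) sample

# MOM: set E[X] = p equal to the sample mean, so p_hat = x.mean()
p_hat = x.mean()
print("MOM estimate of p:", p_hat)

# sum(x) carries the same information as p_hat (it is sufficient for p)
print("sum(x):", x.sum())
```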
3. Consider a discrete random variable X which follows the geometric distribution f(x; p) = p^(x-1) (1-p), x = 1, 2, ..., 0 < p < 1. Recall that E(X) = 1/(1-p). (a) Find the Fisher information I(p). (b) Show that the Cramer-Rao inequality is strict. (c) Let X1, ..., Xn ~ X. Find the maximum likelihood estimator of p. Note that the expression you find may look complicated and hard to evaluate. (d) Now modify your view by setting μ = 1/(1-p) such that...
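For part (a), differentiating log f(x; p) = (x-1) log p + log(1-p) twice and taking expectations gives I(p) = 1/(p (1-p)^2); that closed form is our own derivation from the pmf above, so here is a numerical cross-check via I(p) = E[score^2] (the value p = 0.4 is an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(7)
p = 0.4  # illustrative; in f(x; p) = p**(x-1) * (1-p) the "success" prob is 1-p

# numpy's geometric is parametrized by the success probability, hence 1 - p
x = rng.geometric(1 - p, size=1_000_000)

# Score function: d/dp log f(x; p) = (x-1)/p - 1/(1-p); its mean is 0
score = (x - 1) / p - 1 / (1 - p)

# Fisher information I(p) = E[score**2]; closed form: 1 / (p * (1-p)**2)
print("empirical I(p):", np.mean(score**2))
print("closed form   :", 1 / (p * (1 - p)**2))
```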