I am attaching the t-table below.
You can see that for 24 degrees of freedom, the critical t-value at the 0.001 level of significance is 3.745, which means that if our calculated t-value were 3.745, the p-value would be exactly 0.001.
But here the calculated t-value is 13, which is far beyond 3.745, so the corresponding p-value is much lower than 0.001.
So clearly, p-value < 0.001 < 0.10, and hence we have enough evidence to reject our null hypothesis at the 10% level of significance.
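As a quick check, here is a minimal Python sketch of this comparison. It assumes a two-tailed test (since the table entry 3.745 matches df = 24 at a two-tailed level of 0.001) and uses scipy's t-distribution to reproduce the critical value and compute the p-value:

```python
from scipy import stats

# Values taken from the answer above; a two-tailed test is assumed here,
# since 3.745 is the df = 24 critical value at a two-tailed alpha of 0.001.
t_calc = 13.0
df = 24

# Critical value at alpha = 0.001 (two-tailed): should reproduce the
# t-table entry of 3.745.
t_crit = stats.t.ppf(1 - 0.001 / 2, df)
print(f"critical t at alpha = 0.001: {t_crit:.3f}")

# Two-tailed p-value: 2 * P(T > |t_calc|), using the survival function.
p_value = 2 * stats.t.sf(abs(t_calc), df)
print(f"p-value: {p_value:.3e}")  # far below 0.001, so reject H0 at the 10% level
```

Since t_calc = 13 is so far into the tail, the p-value comes out many orders of magnitude below 0.001, which is exactly the comparison the table-based argument above makes.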