The answers are obtained with the following R code.
CODE:
x <-
c(0.16,0.22,0.3,0.3,0.32,0.33,0.35,0.37,0.37,0.38,0.5,0.53,0.6,0.6,0.61,0.63,0.73,0.76,0.84,0.85)
y <-
c(3.3,11.7,13.6,14.3,14.4,16,16.7,17.8,19.1,21,22.2,23.7,27.5,27.9,28.5,28.6,28.7,29.2,31.1,31.5)
reg <- lm(y~x)
reg
predict(reg,data.frame(x = 0.47))
14.4 - predict(reg,data.frame(x = 0.32))
OUTPUT:
> x <-
c(0.16,0.22,0.3,0.3,0.32,0.33,0.35,0.37,0.37,0.38,0.5,0.53,0.6,0.6,0.61,0.63,0.73,0.76,0.84,0.85)
> y <-
c(3.3,11.7,13.6,14.3,14.4,16,16.7,17.8,19.1,21,22.2,23.7,27.5,27.9,28.5,28.6,28.7,29.2,31.1,31.5)
> reg <- lm(y~x)
> reg
Call:
lm(formula = y ~ x)
Coefficients:
(Intercept)            x
      3.844       35.888
> predict(reg,data.frame(x = 0.47))
       1
20.71195
> 14.4 - predict(reg,data.frame(x = 0.32))
         1
-0.9286828
(a) The least squares estimates are slope = 35.888 and intercept = 3.844, so the fitted regression line is Y = 3.844 + 35.888*X.
(b) For X = 0.47, the predicted value is Y = 20.71195.
(c) For X = 0.32 (observed y = 14.4), the residual is 14.4 - 15.32868 = -0.92868. The negative residual indicates that the fitted line overestimates the observed value at this point.
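As a cross-check on the lm() output, the slope and intercept can also be computed directly from the closed-form least squares formulas (slope = Sxy/Sxx, intercept = ybar - slope*xbar), and the residual at the observation (0.32, 14.4) follows by subtracting the fitted value. This is a minimal verification sketch using only the data already given above:

```r
x <- c(0.16,0.22,0.30,0.30,0.32,0.33,0.35,0.37,0.37,0.38,
       0.50,0.53,0.60,0.60,0.61,0.63,0.73,0.76,0.84,0.85)
y <- c(3.3,11.7,13.6,14.3,14.4,16,16.7,17.8,19.1,21,
       22.2,23.7,27.5,27.9,28.5,28.6,28.7,29.2,31.1,31.5)

# Closed-form least squares estimates
Sxy   <- sum((x - mean(x)) * (y - mean(y)))
Sxx   <- sum((x - mean(x))^2)
slope <- Sxy / Sxx                   # should match lm(): ~35.888
inter <- mean(y) - slope * mean(x)   # should match lm(): ~3.844

# Residual at the data point (0.32, 14.4): observed minus fitted
res <- 14.4 - (inter + slope * 0.32)  # ~ -0.929
```

Comparing slope, inter, and res against the lm() and predict() output above confirms the numbers reported in parts (a)-(c).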