Question

Show the per-iteration computational cost of Gradient Descent for Linear Regression is O(nd); n is the sample size, d is the dimension.

Answer #1

To answer this question we need a basic understanding of the following terms:

  1. Linear regression: a supervised machine-learning technique that predicts a continuous output as a linear function of the input attributes (features). The weights attached to the attributes are the trainable parameters; they are learned during the training phase and used for prediction afterward. The dimension of a linear regression problem is the number of attributes: a problem with d attributes is said to have dimension d.
  2. Gradient descent: an iterative optimization algorithm used to minimize the error between the regression model's predictions and the target outputs during training. The gradient of this error with respect to the trainable parameters tells us how to adjust them. The basic update rule for the weights is: new_weights = old_weights - (learning_rate * gradient)
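The update rule above can be sketched in code as follows (a minimal illustration, assuming a mean-squared-error loss; the function name `gd_step` and the default learning rate are my own choices, not part of the question):

```python
import numpy as np

def gd_step(X, y, w, learning_rate=0.01):
    """One gradient-descent step for linear regression under squared-error loss.

    X: (n, d) feature matrix, y: (n,) targets, w: (d,) current weights.
    Returns the updated weight vector.
    """
    n = X.shape[0]
    residual = X @ w - y             # predictions minus targets, shape (n,)
    gradient = (X.T @ residual) / n  # gradient of (1/2n) * sum of squared errors
    return w - learning_rate * gradient
```

Each call performs exactly one iteration: the matrix-vector products dominate the cost, and the weight update itself touches each of the d entries once.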

So we can already see one dependency when gradient descent is run on a linear regression model: the update step must touch each of the d attributes (weights), so the update alone costs O(d).

In addition, computing the gradient requires evaluating the model on each of the n training samples: each prediction is a dot product over the d attributes, i.e. O(d) work per sample, and this is repeated for all n samples in the dataset. So there is a second dependency, on the sample size.

Combining the two, the total computational cost (time complexity) per iteration is O(nd).
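The argument above can be made explicit by writing the gradient computation with its loops spelled out (a didactic sketch; in practice you would use the vectorized form `X.T @ (X @ w - y) / n`):

```python
def gradient_mse(X, y, w):
    """Gradient of (1/2n) * sum_i (x_i . w - y_i)^2, written with explicit
    loops so the O(nd) per-iteration cost is visible.

    X: list of n rows, each of length d; y: list of n targets; w: d weights.
    """
    n, d = len(X), len(w)
    grad = [0.0] * d
    for i in range(n):                                    # n samples ...
        pred = sum(X[i][j] * w[j] for j in range(d))      # ... O(d) dot product each
        err = pred - y[i]
        for j in range(d):                                # O(d) accumulation per sample
            grad[j] += err * X[i][j] / n
    return grad
```

The inner work is O(d) per sample and runs n times, giving O(nd) for the gradient; adding the O(d) weight update leaves the total at O(nd) per iteration.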
