Linear regression is a linear model, i.e. a model that assumes a linear relationship between the input variables (x) and the single output variable (y). More specifically, it assumes that y can be calculated as a linear combination of the input variables (x).
In this analysis, 100 random input samples are drawn, each with a value less than 1. The underlying function is y = 3x, and some random noise has been added to it.
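The data described above can be generated as in the following sketch; the random seed and the noise scale of 0.1 are assumptions, since the text does not specify them.

```python
import numpy as np

rng = np.random.default_rng(0)      # fixed seed (an assumption, for reproducibility)
x = rng.random(100)                 # 100 samples drawn uniformly from [0, 1)
noise = rng.normal(0, 0.1, 100)     # Gaussian noise; scale 0.1 is an assumption
y = 3 * x + noise                   # the underlying function y = 3x, plus noise
```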
Straight lines are of the form y = mx + c.
Here, we treat m as the weight and c as the bias.
Therefore, we can write y = wx + b.
y = [w, b] . [x, 1]^T
Here, we create a new variable x_dash = [x, 1].
The appended 1 is a constant feature that multiplies the bias b, so the prediction becomes a single dot product between [w, b] and x_dash.
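The augmented-input trick can be sketched as follows; the sample values and the weights [3.0, 0.5] are hypothetical, chosen only to show that the dot product with x_dash reproduces wx + b.

```python
import numpy as np

x = np.array([0.2, 0.5, 0.9])                    # hypothetical input samples
x_dash = np.column_stack([x, np.ones_like(x)])   # append a constant-1 column: [x, 1]
w_vec = np.array([3.0, 0.5])                     # [w, b] stacked into one vector
y_pred = x_dash @ w_vec                          # one dot product per sample = w*x + b
```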
The cost here is the squared error (equal to the mean squared error up to a constant factor, which does not change the minimizer):
Cost = Sum (y_predicted - y_actual)^2
     = Sum (w^T x_dash - y_actual)^2
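Evaluating this cost is a one-liner; the predicted and actual values below are hypothetical numbers used only to illustrate the computation.

```python
import numpy as np

y_actual = np.array([1.0, 2.0, 3.0])   # hypothetical targets
y_pred = np.array([1.1, 1.9, 3.2])     # hypothetical predictions
residuals = y_pred - y_actual          # per-sample errors
cost = np.sum(residuals ** 2)          # sum of squared errors
```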
Residuals = w^T x_dash - y_actual
Gradient vector = x_dash * (Residuals)
(Differentiating the squared-error cost gives 2 * x_dash * (Residuals); the constant factor of 2 is absorbed into the learning rate.)
w(t+1) = w(t) - (Learning Rate) * (Gradient vector)
The code has been run for 100 epochs with a learning rate of 0.01.
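Putting the pieces together, a minimal gradient-descent sketch of the procedure described above might look like this. The zero initialization of the weights, the random seed, and the noise scale are assumptions; the learning rate (0.01) and epoch count (100) come from the text.

```python
import numpy as np

# Generate the data: 100 samples with x < 1, target y = 3x plus noise
rng = np.random.default_rng(0)                  # seed is an assumption
x = rng.random(100)
y = 3 * x + rng.normal(0, 0.1, 100)

x_dash = np.column_stack([x, np.ones_like(x)])  # augment each sample to [x, 1]
w = np.zeros(2)                                 # [w, b], initialized at zero (assumption)

lr, epochs = 0.01, 100                          # values stated in the text
for _ in range(epochs):
    residuals = x_dash @ w - y                  # w^T x_dash - y_actual
    gradient = x_dash.T @ residuals             # x_dash * residuals, summed over samples
    w = w - lr * gradient                       # w(t+1) = w(t) - lr * gradient
```

After training, w[0] should approach the true slope 3 and w[1] (the bias) should stay near 0, since the generating function has no intercept.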