The code implements gradient-descent optimization of different cost functions for linear and polynomial regression models. The cost functions are then compared on how well the fitted linear/polynomial function matches the data and how close the learned parameters come to the desired parameters.
For linear regression, the desired value of theta is [[15], [4]].
One cost function converged to theta = [[14.99448849], [3.90734145]]; the other converged to theta = [[14.6476], [4.06246221]].
Comparing the parameters, the absolute error cost function gives slightly better parameter values.
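The linear-regression comparison above can be sketched with a small gradient-descent loop. This is a minimal illustration, not the original script: the synthetic data, learning rates, and iteration counts are assumptions, and the two cost functions are taken to be mean squared error and mean absolute error.

```python
import numpy as np

def gradient_descent(X, y, grad_fn, lr, n_iters):
    """Plain gradient descent; grad_fn returns the cost gradient w.r.t. theta."""
    theta = np.zeros((X.shape[1], 1))
    for _ in range(n_iters):
        theta -= lr * grad_fn(X, y, theta)
    return theta

def mse_grad(X, y, theta):
    # Gradient of mean squared error: (2/m) * X^T (X theta - y)
    m = len(y)
    return (2 / m) * X.T @ (X @ theta - y)

def mae_grad(X, y, theta):
    # Subgradient of mean absolute error: (1/m) * X^T sign(X theta - y)
    m = len(y)
    return (1 / m) * X.T @ np.sign(X @ theta - y)

# Synthetic data generated around the desired theta [[15], [4]] (assumed setup)
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=(200, 1))
X = np.hstack([np.ones_like(x), x])              # bias column + feature
y = 15 + 4 * x + rng.normal(0, 1, size=x.shape)  # noisy targets

theta_mse = gradient_descent(X, y, mse_grad, lr=0.01, n_iters=20000)
theta_mae = gradient_descent(X, y, mae_grad, lr=0.05, n_iters=20000)
```

With a constant step size, the MAE iterates oscillate in a small band around the optimum rather than converging exactly, which is why its estimate is typically a bit noisier than the MSE one.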
For polynomial regression, the desired value of theta is [[1], [1], [2]].
One cost function converged to theta = [[1.01463795], [0.99882387], [1.98143011]]; the other converged to theta = [[1.00994373], [1.02028566], [1.83644596]].
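The polynomial case works the same way once the inputs are expanded into polynomial features. The sketch below fits a degree-2 model with a squared-error cost; the data range, noise level, and hyperparameters are assumptions chosen so that the true coefficients match the desired theta [[1], [1], [2]] (i.e. y = 1 + x + 2x^2).

```python
import numpy as np

# Synthetic data around the desired theta [[1], [1], [2]] (assumed setup)
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=(300, 1))
X = np.hstack([np.ones_like(x), x, x**2])            # polynomial features [1, x, x^2]
y = 1 + x + 2 * x**2 + rng.normal(0, 0.1, size=x.shape)

theta = np.zeros((3, 1))
lr, m = 0.1, len(y)
for _ in range(20000):
    grad = (2 / m) * X.T @ (X @ theta - y)           # MSE gradient
    theta -= lr * grad
```

Keeping x in [-1, 1] keeps the feature matrix well conditioned, so a plain constant learning rate is enough; for wider input ranges the higher powers of x would dominate and the step size would need to shrink accordingly.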