The Cost Function
We can measure the accuracy of our hypothesis function by using a cost function.
The cost function is defined as:

$$J(\theta_0, \theta_1) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2$$

The cost function is actually $\frac{1}{2}\bar{x}$, where $\bar{x}$ is the mean of the squared differences between the predicted values $h_\theta(x^{(i)})$ and the actual values $y^{(i)}$; it is called the "Mean squared error (MSE)" or "Squared error function". The mean is halved as a convenience for the computation of the gradient descent: differentiating the square produces a factor of $2$ that cancels out the $\frac{1}{2}$ term.
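As a minimal sketch, the cost function above can be computed directly in plain Python (the function and variable names here are illustrative, not part of the course material):

```python
def compute_cost(x, y, theta0, theta1):
    """Cost J(theta0, theta1): mean of squared errors, halved
    as a convenience for gradient descent."""
    m = len(x)  # number of training examples
    total = 0.0
    for xi, yi in zip(x, y):
        prediction = theta0 + theta1 * xi  # hypothesis h_theta(x)
        total += (prediction - yi) ** 2
    return total / (2 * m)
```

For data lying exactly on the line $y = x$, choosing `theta0 = 0, theta1 = 1` gives a cost of zero, the smallest possible value.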
If we try to think of it in visual terms, our training data set is scattered on the x-y plane. We are trying to make a straight line (defined by $h_\theta(x) = \theta_0 + \theta_1 x$) pass through this scattered set of data. Our objective is to get the best possible line, which will be the one for which the cost function is minimized with respect to $\theta_0$ and $\theta_1$.
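A self-contained sketch of this idea: evaluating the cost for two candidate lines, one close to the data and one far from it, shows the better-fitting line has a much lower cost (the data and parameter values here are made up for illustration):

```python
def cost(x, y, t0, t1):
    """Squared error cost J(t0, t1) for the line h(x) = t0 + t1 * x."""
    m = len(x)
    return sum((t0 + t1 * xi - yi) ** 2 for xi, yi in zip(x, y)) / (2 * m)

# Data lying roughly on the line y = 2x
x = [1.0, 2.0, 3.0, 4.0]
y = [2.1, 3.9, 6.2, 7.8]

good = cost(x, y, 0.0, 2.0)  # candidate line close to the data
bad = cost(x, y, 0.0, 0.5)   # candidate line far from the data
# good is far smaller than bad: the best line minimizes J.
```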
The cost function is sometimes also called the optimization objective or the objective function.
A contour plot is a graph that contains many contour lines. A contour line of a two-variable function has a constant value at all points along the same line. Picking any one contour line (any single color) and tracing along the 'circle', one would get the same value of the cost function at every point.
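The data behind such a contour plot can be sketched by evaluating $J(\theta_0, \theta_1)$ over a grid of parameter values; each contour line then connects grid points with equal cost (a toy illustration, with made-up data and grid values):

```python
def cost(x, y, t0, t1):
    """Squared error cost J(t0, t1) for the line h(x) = t0 + t1 * x."""
    m = len(x)
    return sum((t0 + t1 * xi - yi) ** 2 for xi, yi in zip(x, y)) / (2 * m)

# Data lying exactly on y = x, so J is minimized at (theta0, theta1) = (0, 1)
x = [1.0, 2.0, 3.0]
y = [1.0, 2.0, 3.0]

# Evaluate J over a small grid of (theta0, theta1) pairs; a contour plot
# would draw curves of constant J over a (much finer) grid like this one.
grid = {
    (t0, t1): cost(x, y, t0, t1)
    for t0 in [-1.0, 0.0, 1.0]
    for t1 in [0.0, 1.0, 2.0]
}
best = min(grid, key=grid.get)  # the bottom of the bowl: (0.0, 1.0)
```

Passing such a grid to a plotting library (e.g. `matplotlib.pyplot.contour`) produces the concentric 'circles' described above, centered on the minimizing parameters.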
- $m$: Number of training examples
- $x$: Input variable
- $y$: Output variable / Target variable
- $(x, y)$: Training examples
- $(x^{(i)}, y^{(i)})$: $i$th training example
- $h_\theta(x)$: Hypothesis function
- $\theta$'s: parameters of the model