Logistic Regression: Hypothesis Representation
Recall that in linear regression, our hypothesis function looked like:

$$h_\theta(x) = \theta^T x$$
Intuitively, it also doesn't make sense for $h_\theta(x)$ to take values larger than 1 or smaller than 0 when we know that $y \in \{0, 1\}$. To fix this, let's change the form of our hypothesis to satisfy $0 \le h_\theta(x) \le 1$. This is accomplished by plugging $\theta^T x$ into the Logistic Function.
The "Sigmoid Function" or "Logistic Function" is given as:

$$g(z) = \frac{1}{1 + e^{-z}}$$
The Sigmoid Function maps any real number to the $(0, 1)$ interval, making it useful for transforming an arbitrary-valued function into a function better suited for classification.
Note that $g(z) \to 1$ as $z \to \infty$, $g(z) \to 0$ as $z \to -\infty$, and $g(0) = 0.5$.
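These properties are easy to check numerically. Below is a minimal sketch of the logistic function in NumPy (the function name `sigmoid` is our own choice, not from the notes):

```python
import numpy as np

def sigmoid(z):
    # Logistic function: g(z) = 1 / (1 + e^{-z})
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(0))     # g(0) = 0.5
print(sigmoid(10))    # close to 1 for large positive z
print(sigmoid(-10))   # close to 0 for large negative z
```

Because it is written with `np.exp`, the same function also works elementwise on arrays, which the vectorized implementation later relies on.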
Plugging $\theta^T x$ into the Logistic Function, we get our hypothesis function for logistic regression:

$$h_\theta(x) = g(\theta^T x) = \frac{1}{1 + e^{-\theta^T x}}$$
Interpretation of hypothesis output:
$h_\theta(x)$ is the estimated probability that $y = 1$ on input $x$. Formally,

$$h_\theta(x) = P(y = 1 \mid x; \theta)$$

Since $y$ must be either 0 or 1, the probability that $y = 0$ is the complement: $P(y = 0 \mid x; \theta) = 1 - P(y = 1 \mid x; \theta)$.
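As a concrete sketch of this interpretation (the parameter values and input below are made up for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical parameters and input; x includes the intercept term x0 = 1
theta = np.array([-3.0, 1.0, 1.0])
x = np.array([1.0, 2.0, 2.5])

h = sigmoid(theta @ x)   # estimated P(y = 1 | x; theta)
print(h)                 # probability that y = 1
print(1 - h)             # probability that y = 0
```

Whatever value `h` takes, it always lies strictly between 0 and 1, so it can be read directly as a probability.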
Logistic regression is actually a classification algorithm; the word 'regression' in its name is there only for historical reasons.
A vectorized implementation: let $X$ be the $m \times (n+1)$ design matrix whose rows are the training examples (each with $x_0 = 1$), and let $\theta$ be the $(n+1)$-dimensional parameter vector. Then

$$h = g(X\theta)$$
Note that in this vectorized implementation, we calculate hypotheses of all training examples at once.
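A minimal sketch of this vectorized computation, using a made-up toy dataset ($m = 4$ examples, $n = 2$ features) and hypothetical parameter values:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical raw features: m = 4 examples, n = 2 features
X_raw = np.array([[2.0, 1.0],
                  [0.5, 3.0],
                  [1.5, 1.5],
                  [3.0, 0.2]])
m = X_raw.shape[0]

# Design matrix X is m x (n+1): prepend a column of ones for x0
X = np.hstack([np.ones((m, 1)), X_raw])

theta = np.array([-2.0, 1.0, 0.5])   # hypothetical parameters, shape (n+1,)

# One matrix-vector product computes the hypotheses for all m examples
h = sigmoid(X @ theta)               # shape (m,)
print(h)
```

The single call `sigmoid(X @ theta)` replaces a loop over the $m$ training examples, which is both shorter and faster on array-oriented hardware.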