Logistic Regression: Simplified Cost function

 

Because y is always either 0 or 1, we can compress our cost function's two conditional cases into a single expression:

\mathrm{Cost}(h_\theta(x),y) = - y \; \log(h_\theta(x)) - (1 - y) \log(1 - h_\theta(x))
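To see that the compressed form reproduces both conditional cases, note that when y = 1 the second term vanishes, and when y = 0 the first term vanishes. A minimal sketch (the helper name `cost` is hypothetical, not from the course):

```python
import math

def cost(h, y):
    """Compressed per-example logistic cost (hypothetical helper name)."""
    return -y * math.log(h) - (1 - y) * math.log(1 - h)

# y = 1: the (1 - y) term drops out, leaving -log(h)
case_y1 = cost(0.9, 1)
# y = 0: the y term drops out, leaving -log(1 - h)
case_y0 = cost(0.9, 0)
```

Both branches of the original piecewise definition fall out of the one formula.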

Now, we can fully write out our entire cost function as follows:

J(\theta) = -\frac{1}{m} \displaystyle \sum_{i=1}^m \left[ y^{(i)}\log (h_\theta (x^{(i)})) + (1 - y^{(i)})\log (1 - h_\theta(x^{(i)})) \right]
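The summation above can be sketched as a straightforward loop over the m training examples; this is an unvectorized illustration (function and variable names are assumptions, and h uses the sigmoid hypothesis from earlier in the course notes):

```python
import math

def compute_cost(theta, X, y):
    """Unvectorized J(theta): average the per-example cost over m examples."""
    m = len(y)
    total = 0.0
    for i in range(m):
        # z = theta . x^(i), then h_theta(x^(i)) = sigmoid(z)
        z = sum(theta[j] * X[i][j] for j in range(len(theta)))
        h = 1.0 / (1.0 + math.exp(-z))
        total += y[i] * math.log(h) + (1 - y[i]) * math.log(1 - h)
    return -total / m
```

With theta at the origin, h is 0.5 for every example, so each example contributes -log(0.5) = log 2 to the average.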

Vectorized implementation:

\begin{aligned} & h = g(X\theta)\\ & J(\theta) = -\frac{1}{m} \left(y^{T}\log(h)+(1-y)^{T}\log(1-h)\right) \end{aligned}
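The vectorized form replaces the loop with matrix products: h = g(Xθ) computes all m hypotheses at once, and the two dot products carry out the summation. A NumPy sketch (function names are assumptions):

```python
import numpy as np

def sigmoid(z):
    """Logistic function g(z), applied elementwise."""
    return 1.0 / (1.0 + np.exp(-z))

def cost_vectorized(theta, X, y):
    """J(theta) = -(1/m) * (y' log(h) + (1-y)' log(1-h)), with h = g(X theta)."""
    m = y.size
    h = sigmoid(X @ theta)                      # all m hypotheses in one product
    return -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
```

This returns the same value as the element-by-element summation, but lets the linear-algebra library do the work.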