## Logistic Regression: Simplified Cost function

Because $y$ is always either $0$ or $1$, we can compress our cost function's two conditional cases into a single expression:

$\mathrm{Cost}(h_\theta(x),y) = - y \; \log(h_\theta(x)) - (1 - y) \log(1 - h_\theta(x))$
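To see that this matches the original two-case definition, substitute each value of $y$ — one of the two terms always vanishes:

$y = 1: \quad \mathrm{Cost}(h_\theta(x),1) = -\log(h_\theta(x))$

$y = 0: \quad \mathrm{Cost}(h_\theta(x),0) = -\log(1 - h_\theta(x))$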

Now, we can fully write out our entire cost function as follows:

$J(\theta) = - \space \frac{1}{m} \displaystyle \sum_{i=1}^m \left[ y^{(i)}\log (h_\theta (x^{(i)})) + (1 - y^{(i)})\log (1 - h_\theta(x^{(i)})) \right]$

Vectorized implementation:

$$\begin{aligned} & h = g(X\theta)\\ & J(\theta) = - \space \frac{1}{m} \left(y^{T}\log(h)+(1-y)^{T}\log(1-h)\right) \end{aligned}$$
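The vectorized formula translates almost line-for-line into NumPy. A minimal sketch (the `sigmoid` helper and the toy data below are illustrative, not from the original notes):

```python
import numpy as np

def sigmoid(z):
    """The logistic function g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    """Vectorized logistic regression cost J(theta)."""
    m = y.size
    h = sigmoid(X @ theta)  # h = g(X * theta), shape (m,)
    # J = -(1/m) * (y' log(h) + (1 - y)' log(1 - h))
    return -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m

# Toy example: 3 samples, intercept column plus one feature
X = np.array([[1.0, 0.5],
              [1.0, -1.5],
              [1.0, 2.0]])
y = np.array([1.0, 0.0, 1.0])
theta = np.zeros(2)

# With theta = 0, h = 0.5 for every sample, so J = -log(0.5) = log(2)
print(cost(theta, X, y))  # ≈ 0.6931
```

A quick sanity check like the zero-vector case above (where the cost must equal $\log 2 \approx 0.693$ regardless of the data) is a common way to verify a cost-function implementation before running gradient descent.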