Objective and loss functions

Jihong on May 1, 2017

0/1 loss

$$ J(\theta) = \sum_{i=1}^m \mathbf{1}\{\hat{y}^{(i)} \neq y^{(i)}\} $$

where $\mathbf{1}\{\cdot\}$ is the indicator function (1 when its argument is true, 0 otherwise) and $\hat{y}^{(i)}$ is the predicted label for example $i$. The 0/1 loss simply counts misclassifications; it is non-differentiable, which is why the surrogate losses below are used for optimization in practice.
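A minimal NumPy sketch of the 0/1 loss (the function name is illustrative, not from any particular library):

```python
import numpy as np

def zero_one_loss(y_true, y_pred):
    """Count the misclassified examples: sum of 1{y_pred != y_true}."""
    return int(np.sum(np.asarray(y_true) != np.asarray(y_pred)))

y_true = np.array([1, 0, 1, 1])
y_pred = np.array([1, 1, 1, 0])
print(zero_one_loss(y_true, y_pred))  # 2 misclassifications
```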

Hinge loss

$$ J(\theta) = \sum_{i=1}^m \max\left(0,\ 1 - y^{(i)} \theta^\top x^{(i)}\right) $$

where the labels $y^{(i)} \in \{-1, +1\}$. The hinge loss penalizes any example whose margin $y^{(i)} \theta^\top x^{(i)}$ falls below 1; it is the loss minimized by support vector machines.
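A short NumPy sketch of the hinge loss under the $\{-1, +1\}$ label convention (names are illustrative):

```python
import numpy as np

def hinge_loss(theta, X, y):
    """Total hinge loss: sum of max(0, 1 - y * (x . theta)), y in {-1, +1}."""
    margins = y * (X @ theta)
    return float(np.sum(np.maximum(0.0, 1.0 - margins)))

theta = np.array([1.0, 0.0])
X = np.array([[2.0, 0.0], [0.5, 0.0], [-1.0, 0.0]])
y = np.array([1, 1, -1])
print(hinge_loss(theta, X, y))  # margins are 2, 0.5, 1 -> losses 0, 0.5, 0 -> 0.5
```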

Logistic loss

$$ J(\theta) = -\frac{1}{m} \sum_{i=1}^m \left[ y^{(i)} \log h_\theta(x^{(i)}) + \left(1 - y^{(i)}\right) \log\left(1 - h_\theta(x^{(i)})\right) \right] $$

where $ h_\theta(x) = \frac{1}{1+\exp(-\theta^\top x)} $ is the sigmoid activation function and the labels $y^{(i)} \in \{0, 1\}$. This is the negative log-likelihood of the data under a Bernoulli model, averaged over the $m$ examples.
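The sigmoid and the averaged negative log-likelihood above can be sketched in NumPy as follows (the small `eps` guards against `log(0)`; function names are illustrative):

```python
import numpy as np

def sigmoid(z):
    """h_theta(x) = 1 / (1 + exp(-theta^T x))."""
    return 1.0 / (1.0 + np.exp(-z))

def logistic_loss(theta, X, y):
    """Average negative log-likelihood for labels y in {0, 1}."""
    h = sigmoid(X @ theta)
    eps = 1e-12  # avoid log(0) when h saturates at 0 or 1
    return float(-np.mean(y * np.log(h + eps) + (1 - y) * np.log(1 - h + eps)))

# With theta = 0 every prediction is 0.5, so the loss is -log(0.5) = log 2.
theta = np.zeros(2)
X = np.array([[1.0, 2.0], [3.0, -1.0]])
y = np.array([0, 1])
print(logistic_loss(theta, X, y))  # approx 0.6931
```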

Cross entropy

$$ J(\theta) = -\frac{1}{m} \sum_{i=1}^m \sum_{k=1}^K \mathbf{1}\{y^{(i)} = k\} \log P(y^{(i)} = k \vert x^{(i)}; \theta) $$

where $P(y^{(i)} = k \vert x^{(i)}; \theta) = \frac{\exp(\theta^{(k)\top}x^{(i)})}{\sum_{j=1}^K \exp(\theta^{(j)\top} x^{(i)})}$ is the softmax activation function. Cross entropy generalizes the logistic loss from 2 classes to $K$ classes; for $K = 2$ the two coincide.
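A NumPy sketch of the softmax and the cross-entropy loss above, using the standard max-subtraction trick for numerical stability (names and shapes are my assumptions, not from the post):

```python
import numpy as np

def softmax(scores):
    """Row-wise softmax; subtracting the row max leaves the result unchanged
    but keeps exp() from overflowing."""
    shifted = scores - scores.max(axis=1, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=1, keepdims=True)

def cross_entropy(Theta, X, y):
    """Average cross-entropy; Theta is (n_features, K), y holds class indices 0..K-1."""
    P = softmax(X @ Theta)                       # (m, K) class probabilities
    eps = 1e-12                                  # avoid log(0)
    return float(-np.mean(np.log(P[np.arange(len(y)), y] + eps)))

# With Theta = 0 every class gets probability 1/K, so the loss is log K.
Theta = np.zeros((2, 3))
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, -1.0]])
y = np.array([0, 1, 2, 0])
print(cross_entropy(Theta, X, y))  # approx log(3) = 1.0986
```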

Mean squared error (MSE)

$$ J(\theta) = \frac{1}{m} \sum_{i=1}^m \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2 $$

MSE is the standard loss for regression, where $h_\theta(x^{(i)})$ is the model's real-valued prediction for example $i$.
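A one-function NumPy sketch of MSE (the function name is illustrative):

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean of the squared residuals: (1/m) * sum (y_pred - y_true)^2."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean((y_pred - y_true) ** 2))

print(mse([1.0, 2.0, 3.0], [1.0, 2.0, 5.0]))  # (0 + 0 + 4) / 3 = 1.333...
```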