
Objective and loss functions

Jihong on May 1, 2017

0/1 loss

$$J(\theta) = \sum_{i=1}^{m} L_{0/1}(\theta^T x^{(i)})$$

where

$$L_{0/1}(f) = \begin{cases} 1 & \text{if } yf < 0 \\ 0 & \text{if } yf \geq 0 \end{cases}$$
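A minimal NumPy sketch of the 0/1 loss over a dataset; the parameters, data, and labels $y \in \{-1, +1\}$ below are illustrative:

```python
import numpy as np

def zero_one_loss(theta, X, y):
    """Total 0/1 loss: counts examples whose score disagrees in sign with y."""
    f = X @ theta                  # raw scores theta^T x, one per example
    return int(np.sum(y * f < 0))  # L_{0/1} is 1 exactly when y*f < 0

# Hypothetical data: the second example is misclassified
theta = np.array([1.0, -1.0])
X = np.array([[2.0, 1.0],   # score  1, agrees with label +1
              [1.0, 2.0]])  # score -1, disagrees with label +1
y = np.array([1, 1])
print(zero_one_loss(theta, X, y))  # -> 1
```

Because the 0/1 loss is neither convex nor differentiable, it is rarely minimized directly; the losses below serve as surrogates.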

Hinge loss

$$J(\theta) = \max(0,\, 1 - yf)$$

where $f = \theta^T x$.
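A corresponding NumPy sketch, summing the hinge loss over a hypothetical dataset with labels in $\{-1, +1\}$:

```python
import numpy as np

def hinge_loss(theta, X, y):
    """Sum of max(0, 1 - y*f) with f = theta^T x; labels y in {-1, +1}."""
    f = X @ theta
    return float(np.sum(np.maximum(0.0, 1.0 - y * f)))

theta = np.array([1.0, 0.0])
X = np.array([[2.0, 0.0],    # margin y*f = 2.0 -> no loss
              [0.5, 0.0]])   # margin y*f = 0.5 -> loss 0.5
y = np.array([1, 1])
print(hinge_loss(theta, X, y))  # -> 0.5
```

Unlike the 0/1 loss, the hinge loss still penalizes correctly classified points whose margin falls below 1.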

Logistic loss

$$J(\theta) = -\left[\sum_{i=1}^{m} y^{(i)} \log h_\theta(x^{(i)}) + \left(1 - y^{(i)}\right) \log\left(1 - h_\theta(x^{(i)})\right)\right]$$

where $h_\theta(x) = \frac{1}{1 + \exp(-\theta^T x)}$ is the sigmoid activation function.
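A sketch of this negative log-likelihood in NumPy, with binary labels $y \in \{0, 1\}$; the data is illustrative:

```python
import numpy as np

def sigmoid(z):
    """The sigmoid activation 1 / (1 + exp(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def logistic_loss(theta, X, y):
    """Negative log-likelihood for binary labels y in {0, 1}."""
    h = sigmoid(X @ theta)  # h_theta(x) per example
    return float(-np.sum(y * np.log(h) + (1 - y) * np.log(1 - h)))

# With theta = 0 every prediction is 0.5, so each example contributes log 2
theta = np.zeros(2)
X = np.ones((4, 2))
y = np.array([0, 1, 0, 1])
print(logistic_loss(theta, X, y))  # -> 4*log(2) ≈ 2.7726
```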

Cross entropy

$$J(\theta) = -\left[\sum_{i=1}^{m} \sum_{k=0}^{1} 1\{y^{(i)} = k\} \log P(y^{(i)} = k \mid x^{(i)}; \theta)\right]$$

where

$$P(y^{(i)} = k \mid x^{(i)}; \theta) = \frac{\exp(\theta^{(k)T} x^{(i)})}{\sum_{j=1}^{K} \exp(\theta^{(j)T} x^{(i)})}$$

is the softmax activation function.
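A NumPy sketch of the softmax cross entropy for the general $K$-class case; here `Theta` is assumed to hold one parameter column per class, and the data is illustrative:

```python
import numpy as np

def softmax(scores):
    """Row-wise softmax; subtracting the row max avoids overflow in exp."""
    z = scores - scores.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(Theta, X, y):
    """-sum_i log P(y_i | x_i; Theta), with integer class labels y."""
    P = softmax(X @ Theta)  # (m, K) class probabilities
    return float(-np.sum(np.log(P[np.arange(len(y)), y])))

# With Theta = 0 every class gets probability 1/K, so each example adds log K
Theta = np.zeros((2, 3))    # 2 features, K = 3 classes
X = np.ones((4, 2))
y = np.array([0, 1, 2, 0])  # integer class labels
print(cross_entropy(Theta, X, y))  # -> 4*log(3) ≈ 4.3944
```

Only the log-probability of the true class of each example enters the sum, which is what the indicator $1\{y^{(i)} = k\}$ selects.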

Mean squared error (MSE)

$$J(\theta) = \sum_{i} \left\lVert y^{(i)} - \theta^T x^{(i)} \right\rVert^2$$
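The sum of squared residuals above can be sketched as follows; the parameters and data are illustrative:

```python
import numpy as np

def squared_error(theta, X, y):
    """Sum of squared residuals between predictions theta^T x and targets y."""
    r = y - X @ theta
    return float(np.sum(r ** 2))

theta = np.array([1.0])
X = np.array([[1.0], [2.0]])  # predictions: [1.0, 2.0]
y = np.array([2.0, 2.0])      # residuals:   [1.0, 0.0]
print(squared_error(theta, X, y))  # -> 1.0
```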