criterion = nn.ClassNLLCriterion()
The negative log likelihood criterion. It is useful to train a classification problem with `n` classes. The input given through a `forward()` is expected to contain log-probabilities of each class: `input` has to be a 1D tensor of size `n`. Obtaining log-probabilities in a neural network is easily achieved by adding a `LogSoftMax` layer as the last layer of your neural network.
This criterion expects a class index (1 to the number of classes) as `target` when calling `forward(input, target)` and `backward(input, target)`.
The loss can be described as:
loss(x, class) = forward(x, class) = -x[class]
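
As a quick sketch of the formula, the loss can be checked by hand; the values below are illustrative, assuming the `nn` package is loaded:

```lua
require 'nn'

local criterion = nn.ClassNLLCriterion()
-- log-probabilities for 2 classes (log(0.8) and log(0.2))
local input = torch.Tensor({math.log(0.8), math.log(0.2)})
local target = 2
local loss = criterion:forward(input, target)
-- loss equals -input[2] = -log(0.2), roughly 1.609
```

Note that the input already holds log-probabilities, so the criterion simply negates and selects the entry for the target class.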
The following is a code fragment showing how to make a gradient step given an input `x`, a desired output `y` (an integer 1 to `n`, in this case `n` = 2 classes), a network `mlp` and a learning rate `learningRate`:
```lua
function gradUpdate(mlp, x, y, learningRate)
   local criterion = nn.ClassNLLCriterion()
   -- forward pass: network prediction and loss
   local pred = mlp:forward(x)
   local err = criterion:forward(pred, y)
   -- backward pass: clear old gradients, then backpropagate
   mlp:zeroGradParameters()
   local t = criterion:backward(pred, y)
   mlp:backward(x, t)
   -- take one gradient descent step
   mlp:updateParameters(learningRate)
end
```
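
For example, `gradUpdate` could be driven as follows; the network shape and input size here are illustrative, not part of the criterion itself:

```lua
require 'nn'

-- a simple 2-class model ending in LogSoftMax, as the criterion expects
local mlp = nn.Sequential()
mlp:add(nn.Linear(5, 2))
mlp:add(nn.LogSoftMax())

local x = torch.randn(5)  -- a random 5-dimensional input
local y = 1               -- desired class index (1 or 2)

for i = 1, 100 do
   gradUpdate(mlp, x, y, 0.01)
end
-- after these steps, mlp:forward(x) should assign a higher
-- log-probability to class 1 than to class 2
```

The `LogSoftMax` layer at the end is what makes the network's output valid input for `ClassNLLCriterion`.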