CMul

module = nn.CMul(inputDimension)

Applies a component-wise multiplication to the incoming data, i.e. y_i = w_i * x_i.
Example:
Example:

mlp = nn.Sequential()
mlp:add(nn.CMul(5))

y = torch.Tensor(5)
sc = torch.Tensor(5)
for i = 1, 5 do sc[i] = i end -- scale input with these factors

function gradUpdate(mlp, x, y, criterion, learningRate)
   local pred = mlp:forward(x)
   local err = criterion:forward(pred, y)
   local gradCriterion = criterion:backward(pred, y)
   mlp:zeroGradParameters()
   mlp:backward(x, gradCriterion)
   mlp:updateParameters(learningRate)
   return err
end

for i = 1, 10000 do
   x = torch.rand(5)
   y:copy(x)
   y:cmul(sc)
   err = gradUpdate(mlp, x, y, nn.MSECriterion(), 0.01)
end

print(mlp:get(1).weight)

gives the output:
 1.0000
 2.0000
 3.0000
 4.0000
 5.0000
[torch.Tensor of dimension 5]

i.e. the network successfully learns that the input x was scaled by those factors to produce the output y.
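To make the mechanics explicit, the forward and backward passes of such a component-wise multiplication layer can be sketched outside of Torch. The following is a minimal NumPy analogue of the Lua training loop above (it is an illustration, not part of the nn API): the forward pass computes y_i = w_i * x_i, and the gradient with respect to each weight is dL/dw_i = dL/dy_i * x_i, updated by plain SGD against an MSE loss.

```python
import numpy as np

rng = np.random.default_rng(0)

w = np.ones(5)               # the layer's learnable per-component weights
sc = np.arange(1.0, 6.0)     # target scaling factors 1..5, as in the Lua example
lr = 0.01

for _ in range(10000):
    x = rng.random(5)
    target = x * sc
    y = w * x                            # forward: y_i = w_i * x_i
    grad_y = 2.0 * (y - target) / 5.0    # MSE gradient dL/dy (mean over 5 components)
    grad_w = grad_y * x                  # backward: dL/dw_i = dL/dy_i * x_i
    w -= lr * grad_w                     # SGD update

print(np.round(w, 4))  # should converge toward the scaling factors 1..5
```

The weight vector recovers the per-component scale because each weight only ever interacts with its own input component, so the five one-dimensional regression problems decouple.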