StochasticGradient has several fields that affect a call to train().

learningRate
This is the learning rate used during training. The parameters are updated as
parameters = parameters - learningRate * parameters_gradient
Default value is 0.01.
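The update rule above can be sketched in plain Lua on a toy parameter table (the real trainer operates on torch Tensors; the variable names below are illustrative, not part of the API):

```lua
-- Illustrative sketch of parameters = parameters - learningRate * parameters_gradient
local learningRate = 0.01
local parameters = {0.5, -0.3}
local gradients  = {0.2,  0.1}   -- hypothetical parameters_gradient values

for i = 1, #parameters do
  parameters[i] = parameters[i] - learningRate * gradients[i]
end
-- parameters is now {0.498, -0.301}
```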
learningRateDecay
The learning rate decay. If non-zero, the effective learning rate (the learningRate field itself keeps its value) is recomputed after each iteration (pass over the dataset) as:
current_learning_rate = learningRate / (1 + iteration * learningRateDecay)
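A short sketch of how this formula shrinks the effective learning rate, assuming learningRate = 0.01 and learningRateDecay = 0.1 (the helper function below is illustrative, not part of the API):

```lua
local learningRate, learningRateDecay = 0.01, 0.1

-- effective learning rate after a given number of iterations
local function currentLearningRate(iteration)
  return learningRate / (1 + iteration * learningRateDecay)
end

-- iteration 0 gives 0.01; iteration 10 gives 0.01 / 2 = 0.005
```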
maxIteration
The maximum number of iterations (passes over the dataset). Default is 25.
shuffleIndices
Boolean indicating whether examples are randomly sampled during training. Default is true. If false, examples are taken in the order of the dataset.
hookExample
A hook function which, if non-nil, is called during training after each example has been forwarded and backwarded through the network. The function takes (self, example) as parameters. Default is nil.
hookIteration
A hook function which, if non-nil, is called during training after each complete pass over the dataset. The function takes (self, iteration) as parameters. Default is nil.
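The fields above can be configured together before calling train(). A minimal sketch, assuming a module mlp and a dataset that follows the nn training conventions (dataset:size() and dataset[i] defined) already exist:

```lua
require 'nn'

-- mlp and dataset are assumed to be defined elsewhere
local criterion = nn.MSECriterion()
local trainer = nn.StochasticGradient(mlp, criterion)

trainer.learningRate      = 0.01
trainer.learningRateDecay = 0.001
trainer.maxIteration      = 25
trainer.shuffleIndices    = true

-- called after each example's forward/backward pass
trainer.hookExample = function(self, example)
  -- e.g. inspect example[1] (input) and example[2] (target)
end

-- called after each complete pass over the dataset
trainer.hookIteration = function(self, iteration)
  print('# iteration ' .. iteration .. ' complete')
end

trainer:train(dataset)
```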