sgd {CNN} R Documentation

Stochastic gradient descent

Description

Stochastic gradient descent (often abbreviated SGD), also known as incremental gradient descent, is a
stochastic approximation of the gradient descent optimization method for minimizing an objective function
that is written as a sum of differentiable functions. In other words, SGD iteratively searches for minima
or maxima of the objective.

Usage

sgd(batch.size,
    learning.rate = 0.01,
    momentum = 0.9,
    eps = 1E-08,
    l2.decay = 0.001);

Arguments

batch.size

[as integer] the number of training samples in each mini-batch used for a single gradient update.

learning.rate

[as double] the step size applied to each gradient update; defaults to 0.01.

momentum

[as double] the momentum coefficient applied to the accumulated update vector; defaults to 0.9.

eps

[as double] a small constant used for numerical stability; defaults to 1E-08.

l2.decay

[as double] the L2 regularization (weight decay) coefficient; defaults to 0.001.

Details
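
With momentum and L2 weight decay, a single SGD parameter update can be sketched as follows. This is a
minimal illustration in plain R, assuming a scalar weight w, its gradient g, and a velocity term v; it is
not the internal MLkit implementation:

```r
# One SGD update step with momentum and L2 weight decay (illustrative sketch).
sgd_step <- function(w, g, v, learning.rate = 0.01, momentum = 0.9, l2.decay = 0.001) {
  g <- g + l2.decay * w                  # add the L2 penalty term to the gradient
  v <- momentum * v - learning.rate * g  # update the velocity (momentum) term
  w <- w + v                             # apply the update to the weight
  list(w = w, v = v)
}

# Example: a single step from w = 1 with gradient g = 0.5 and zero initial velocity.
step <- sgd_step(w = 1, g = 0.5, v = 0)
```

Because each step uses the gradient of only a small mini-batch rather than the full dataset, the updates
are noisy but far cheaper to compute, which is what makes the method stochastic.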

Authors

MLkit

Value

This function returns a data object of type TrainerAlgorithm.

clr value class

Examples
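
A minimal usage sketch, assuming only the signature shown in the Usage section above; the batch sizes and
hyperparameter values here are illustrative:

```r
# Create an SGD trainer with a batch size of 32 and the default hyperparameters.
optimizer <- sgd(batch.size = 32);

# A smaller learning rate with stronger L2 regularization.
optimizer2 <- sgd(batch.size = 64,
                  learning.rate = 0.005,
                  l2.decay = 0.01);
```

The returned TrainerAlgorithm object configures the optimizer for a subsequent training run.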


[Package CNN version 1.0.0.0 Index]