#include <SGD.h>
Public Member Functions

  void update(ConstAlignedMapVec &dvec, AlignedMapVec &vec)
  virtual void reset()

Public Attributes

  Scalar m_lrate
  Scalar m_decay
The Stochastic Gradient Descent (SGD) algorithm
Definition at line 16 of file SGD.h.
◆ update()
void MiniDNN::SGD::update(ConstAlignedMapVec &dvec, AlignedMapVec &vec)  [inline, virtual]
Update the parameter vector using its gradient.
It is assumed that the memory addresses of dvec and vec do not change during the training process. This assumption is used to implement optimization algorithms that have "memories"; see the AdaGrad algorithm for an example.
Parameters
  dvec  The gradient of the parameter. Read-only.
  vec   On entry, the current parameter vector; on exit, the updated parameters.
Implements MiniDNN::Optimizer.
Definition at line 31 of file SGD.h.
The documentation for this class was generated from the following file: SGD.h