- $\min_{x} F(x) = f(x) + h(x)$, where $f$ is a convex, continuously differentiable (smooth) function,
- $h$ is a convex, non-differentiable (non-smooth) function
- Choose a set of indices $I_k \subset \{1, \dots, n\}$ without replacement, $|I_k| = b$, where $b$ is the batch size.
- Compute the mini-batch gradient $g_k = \frac{1}{b} \sum_{i \in I_k} \nabla f_i(x_k)$, where $I_k$ is the chosen batch of indices and $f_i$ is the $i$-th component of the smooth term.
- Convergence check: stop if $\|U\|_d < \varepsilon$, where $U$ is an algorithm-specific vector (the argument or the gradient) and $d$ is an algorithm-specific power of the Lebesgue norm ($\ell_d$).
- Compute $x_{k+1} = T(x_k, g_k)$ using the algorithm-specific transformation $T$ that updates the function's argument, with $g_k$ the mini-batch gradient.
- Update the intrinsic parameters $\theta_{k+1} = U(\theta_k)$, where $U$ is an algorithm-specific update of the set of intrinsic parameters.
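The template above can be sketched as a concrete instance — mini-batch proximal gradient descent, assuming for illustration a least-squares smooth term and an $\ell_1$ non-smooth term. All names here (`soft_threshold`, `minibatch_prox_gd`, the choice of norm in the stopping test) are illustrative assumptions, not taken from the source:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1, i.e. the non-smooth term h."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def minibatch_prox_gd(A, y, lam=0.1, lr=0.01, b=8, tol=1e-6,
                      max_iter=5000, seed=0):
    """Minimize (1/n) sum_i 0.5*(a_i^T x - y_i)^2 + lam*||x||_1."""
    rng = np.random.default_rng(seed)
    n, p = A.shape
    x = np.zeros(p)
    for _ in range(max_iter):
        # Choose b indices without replacement (the batch I_k).
        idx = rng.choice(n, size=b, replace=False)
        # Mini-batch gradient of the smooth part f.
        g = A[idx].T @ (A[idx] @ x - y[idx]) / b
        # Transformation T: gradient step on f, then prox of h.
        x_new = soft_threshold(x - lr * g, lr * lam)
        # Convergence check: stop if the step norm (here d = 2) is small.
        if np.linalg.norm(x_new - x, ord=2) < tol:
            return x_new
        x = x_new
    return x
```

Here the "intrinsic parameters" of the template are trivial (a fixed step size `lr`); an adaptive method would update them each iteration in place of that constant.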