PHP MCordingley\Regression\Algorithm\GradientDescent\StoppingCriteria Namespace

Interfaces

Name Description
StoppingCriteria Interface implemented by all stopping criteria in this namespace; lets a gradient descent decide when to halt.

Classes

Name Description
Any Stops as soon as any one of a given list of stopping criteria is met.
CoefficientEquality Stops when the coefficients themselves have completely stopped changing between iterations.
DescentSpy StoppingCriteria decorator that calls a given callback on each iteration before delegating to the wrapped criteria. Useful for debugging gradient descent convergence.
GradientNorm Stops when the norm of the gradient vector drops below a specified threshold, eta.
MaxIterations Places a cap on the total number of iterations of the descent. This does not guarantee actual convergence, but works well when "good enough" results can be expected after a set number of passes through the data. Note that an iteration does not necessarily correspond to a complete epoch of the data; check your descent method for how many records are processed per iteration. Pairs well with a criteria that checks for actual convergence inside of an `Any` object, so the descent can stop early once convergence has occurred.
MaxTime Stops the descent after a set amount of time. Goes well with a criteria that checks for convergence inside an instance of `Any` to cap how long a descent can run, ensuring that a descent doesn't become runaway on a production server.
NthIteration Decorator that consults the underlying StoppingCriteria only once every N iterations. Useful when the stopping criteria is expensive to calculate, such as one that computes the cost of the entire data set given the current set of weights.
SteppedCriteria Decorates another StoppingCriteria and passes it the gradient as updated by the provided Schedule object. This lets you test your stopping criteria not against the raw current gradient, but against the gradient as it is actually applied to coefficient updates. Useful for stochastic and MiniBatch gradient descent with small batches, where the raw gradient never quite settles down even though the descent itself has converged.
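To illustrate how these criteria compose, the sketch below re-implements the `MaxIterations`-inside-`Any` pattern described above. Note that the interface shown is a simplified stand-in: the library's real StoppingCriteria interface may have a different method name and signature, and the class bodies here are illustrative rather than the library's actual implementations.

```php
<?php

// Hypothetical stand-in for the library's StoppingCriteria interface;
// the real method signature may differ.
interface StoppingCriteria
{
    public function shouldStop(array $gradient, array $coefficients): bool;
}

// Illustrative version of the MaxIterations idea: stop once a fixed
// number of iterations has elapsed, regardless of convergence.
final class MaxIterations implements StoppingCriteria
{
    private int $iteration = 0;

    public function __construct(private int $maxIterations)
    {
    }

    public function shouldStop(array $gradient, array $coefficients): bool
    {
        return ++$this->iteration >= $this->maxIterations;
    }
}

// Illustrative version of the Any combinator: stop as soon as any one
// of the wrapped criteria reports that it is satisfied.
final class Any implements StoppingCriteria
{
    /** @param StoppingCriteria[] $criteria */
    public function __construct(private array $criteria)
    {
    }

    public function shouldStop(array $gradient, array $coefficients): bool
    {
        foreach ($this->criteria as $criterion) {
            if ($criterion->shouldStop($gradient, $coefficients)) {
                return true;
            }
        }

        return false;
    }
}
```

Under this sketch, a descent loop would call `$stop->shouldStop($gradient, $coefficients)` once per iteration on an `Any` that wraps both a convergence check and a `MaxIterations` cap, halting on whichever fires first.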