LM101-068: How to Design Automatic Learning Rate Selection for Gradient Descent Type Machine Learning Algorithms
Description
Simple mathematical formulas are presented that ensure convergence of a sequence of parameter vectors generated by an iterative algorithm which repeatedly adds a stepsize number multiplied by a search direction vector to the current parameter values. These formulas may be used as the basis for designing artificially intelligent, smart automatic learning rate selection algorithms. Please visit: www.learningmachines101.com
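The update rule described above can be sketched in code. The following is a minimal illustration, not the episode's exact method: it assumes the search direction is the negative gradient and that the stepsize is chosen automatically by a backtracking (Armijo) line search, one standard automatic stepsize rule. The function names and the test objective are illustrative only.

```python
import numpy as np

def gradient_descent_auto_step(f, grad, theta, max_iter=100, tol=1e-8):
    """Iterate theta <- theta + eta * d with an automatically chosen eta.

    Here d is the negative gradient, and eta is picked by a backtracking
    (Armijo) line search -- one common automatic stepsize rule, assumed
    for illustration; the episode may discuss different formulas.
    """
    for _ in range(max_iter):
        g = grad(theta)
        if np.linalg.norm(g) < tol:
            break                        # gradient is (nearly) zero: stop
        d = -g                           # search direction vector
        eta = 1.0                        # initial stepsize guess
        # Halve eta until the sufficient-decrease (Armijo) condition holds.
        while f(theta + eta * d) > f(theta) - 0.5 * eta * np.dot(g, g):
            eta *= 0.5
        theta = theta + eta * d          # stepsize * search direction update
    return theta

# Example: minimize f(x) = (x0 - 3)^2 + (x1 + 1)^2, minimized at (3, -1).
f = lambda x: (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2
grad = lambda x: np.array([2.0 * (x[0] - 3.0), 2.0 * (x[1] + 1.0)])
theta_star = gradient_descent_auto_step(f, grad, np.zeros(2))
```

On this quadratic the backtracking rule settles on eta = 0.5, which lands exactly on the minimizer; in general the loop simply shrinks the stepsize until the objective decreases sufficiently, which is one way to satisfy convergence conditions of the kind discussed in the episode.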
More Episodes
This 86th episode of Learning Machines 101 discusses the problem of assigning probabilities to a possibly infinite set of observed outcomes in a space-time continuum which corresponds to our physical world. The machine learning algorithm uses information about the frequency of environmental...
Published 07/20/21
This 85th episode of Learning Machines 101 discusses formal convergence guarantees for a broad class of machine learning algorithms designed to minimize smooth non-convex objective functions using batch learning methods. Simple mathematical formulas are presented based upon research from the late...
Published 05/21/21