Monograph on gradient-based learning and optimization by Bernd Heidergott and Felisa Vázquez-Abad
The monograph Optimization and Learning via Stochastic Gradient Descent by research fellow Bernd Heidergott (Vrije Universiteit Amsterdam) and Felisa Vázquez-Abad (Hunter College, City University of New York), published by Princeton University Press, offers a comprehensive treatment of recursive optimization and learning algorithms for stochastic problems.
The book explores gradient-based stochastic optimization through the methodologies of stochastic approximation and gradient estimation. While the presentation is primarily theoretical, the authors emphasize the development and implementation of practical algorithms. The guiding philosophy is that effective problem-solving arises from the interplay between mathematical theory, modeling, and numerical methods—none of which should dominate the others.
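To make the central idea concrete, here is a minimal stochastic-approximation sketch in the spirit of the Robbins–Monro scheme the book builds on. This is an illustration, not code from the monograph: the quadratic objective, the Gaussian noise model, and all variable names are assumptions chosen for simplicity.

```python
import random

# Minimal stochastic gradient descent sketch (illustrative, not the book's code).
# We minimize f(theta) = E[(theta - D)^2], where D is a random variable we can
# only sample. The minimizer is theta* = E[D]; each sample yields an unbiased
# single-draw gradient estimate 2 * (theta - d).
random.seed(0)

def noisy_gradient(theta):
    # Unbiased one-sample gradient of (theta - D)^2, with D ~ N(5, 1),
    # so the optimum is theta* = 5 (an assumed toy distribution).
    d = random.gauss(5.0, 1.0)
    return 2.0 * (theta - d)

theta = 0.0
for n in range(1, 20_001):
    step = 1.0 / n  # Robbins-Monro step sizes: sum diverges, sum of squares converges
    theta -= step * noisy_gradient(theta)

print(theta)  # should be close to 5.0
```

The diminishing step sizes are the key ingredient: large enough in total to reach the optimum from anywhere, yet shrinking fast enough to average out the sampling noise.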
The first part of the text presents a rigorous treatment of stochastic approximation theory, including advanced models and state-of-the-art analytical techniques, focusing first on applications that do not require gradient estimation. The second part introduces modern approaches to gradient estimation, integrating cutting-edge numerical algorithms. The final part presents a series of rich case studies that synthesize the earlier concepts into complete, worked examples. Applications in statistics and machine learning are discussed in depth, and selected gradient estimation methods receive detailed theoretical analysis.
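As a taste of what "gradient estimation" means in this setting, the following sketch uses the score-function (likelihood-ratio) identity, one standard approach in this family. It is a generic illustration under assumed choices (a Gaussian sampling distribution and the test function f(x) = x²), not an example drawn from the book.

```python
import random

# Illustrative score-function (likelihood-ratio) gradient estimator.
# Goal: estimate d/dtheta E[X^2] for X ~ N(theta, 1) from samples alone,
# using d/dtheta E[f(X)] = E[f(X) * d/dtheta log p_theta(X)].
# For N(theta, 1) the score is (x - theta); the true gradient is 2 * theta.
random.seed(1)

theta = 1.0
n_samples = 100_000
total = 0.0
for _ in range(n_samples):
    x = random.gauss(theta, 1.0)
    total += (x ** 2) * (x - theta)  # f(x) times the score at x
estimate = total / n_samples

print(estimate)  # should be close to 2 * theta = 2.0
```

The appeal of such estimators is that they need no derivative of f itself, only the ability to simulate from the parameterized distribution and evaluate its score.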
Numerous examples throughout the book illustrate how these methods can be applied in practice. End-of-chapter exercises help readers consolidate their understanding, while sections on “Practical Considerations” highlight common implementation trade-offs. Offering the first unified treatment of this field, the book addresses a broad audience of researchers and graduate students in applied mathematics, engineering, computer science, physics, and economics.