
Evolution Strategies

Published: Sep 5, 2019
Lil'Log

Analysis

The article introduces black-box optimization algorithms as alternatives to stochastic gradient descent for optimizing deep learning models. It highlights the scenario where the target function's analytic form is unknown, making gradient-based methods infeasible. The article mentions examples such as Simulated Annealing, Hill Climbing, and the Nelder-Mead method, providing a basic overview of the topic.
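To make the black-box setting concrete, here is a minimal hill-climbing sketch in Python. The toy objective `f`, the function name `hill_climb`, and all hyperparameters are illustrative assumptions, not taken from the article; the point is only that the optimizer uses function evaluations and never a gradient.

```python
import numpy as np

def f(x):
    # Toy black-box objective: we can evaluate it, but we pretend we
    # cannot differentiate it (illustrative, not from the article).
    return float(np.sum((x - 3.0) ** 2))

def hill_climb(f, x0, step=0.5, iters=1000, seed=0):
    """Hill climbing: propose a random perturbation and keep it
    only if it improves the objective -- no gradients needed."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(iters):
        candidate = x + step * rng.standard_normal(x.shape)
        fc = f(candidate)
        if fc < fx:  # accept only strict improvements
            x, fx = candidate, fc
    return x, fx

x_best, f_best = hill_climb(f, x0=np.zeros(2))
print(x_best, f_best)  # x_best approaches [3., 3.] from evaluations alone
```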

Reference

Stochastic gradient descent is a universal choice for optimizing deep learning models. However, it is not the only option. With black-box optimization algorithms, you can evaluate a target function $f(x): \mathbb{R}^n \to \mathbb{R}$, even when you don’t know the precise analytic form of $f(x)$ and thus cannot compute gradients or the Hessian matrix.
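Since the post is about evolution strategies, a minimal sketch of a vanilla ES update may help illustrate this: sample Gaussian perturbations of the current point, evaluate $f$ at each, and move in the fitness-weighted average direction of the noise. This is a generic ES sketch under assumed hyperparameters, not the article's specific method; the objective and all parameter values below are placeholders.

```python
import numpy as np

def f(x):
    # Black-box objective to minimize; only evaluations of f are used
    # (placeholder objective, not from the article).
    return float(np.sum((x - 3.0) ** 2))

def evolution_strategy(f, x0, sigma=0.1, lr=0.05, pop=50, iters=300, seed=0):
    """Vanilla ES: estimate a search direction from Gaussian perturbations,
    weighting each noise vector by its standardized (negated) fitness."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        noise = rng.standard_normal((pop, x.size))      # eps_i ~ N(0, I)
        fitness = np.array([f(x + sigma * n) for n in noise])
        # Standardize fitness for scale invariance; negate because we
        # minimize (lower f should pull the update toward its noise).
        advantage = -(fitness - fitness.mean()) / (fitness.std() + 1e-8)
        x = x + lr / (pop * sigma) * (noise.T @ advantage)
    return x

print(evolution_strategy(f, x0=np.zeros(2)))  # converges toward [3., 3.]
```

The update uses no derivative information at all, which is exactly why such methods remain applicable when the analytic form of $f(x)$ is unknown.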