Random Gradient-Free Optimization in Infinite Dimensional Spaces
Published: Dec 25, 2025 05:00
1 min read
ArXiv Stats ML
Analysis
This paper introduces a random gradient-free optimization method tailored to infinite-dimensional Hilbert spaces, addressing functional optimization problems directly rather than through finite-dimensional parameterizations. The approach sidesteps the computational difficulty of infinite-dimensional gradients by requiring only directional derivatives and a pre-basis for the Hilbert space, both of which are typically far easier to obtain than the orthonormal bases or reproducing kernels that other approaches demand. This marks a departure from traditional methods that run finite-dimensional gradient descent over function parameterizations. The method's applicability is demonstrated by solving partial differential equations in a physics-informed neural network (PINN) style, highlighting its potential for provable convergence. Overall, the work offers a promising avenue for optimization in complex functional spaces.
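To make the mechanism concrete, here is a minimal sketch of the kind of update the abstract describes: sample a random direction from a pre-basis, estimate the directional derivative by finite differences, and step against it. The monomial pre-basis, the toy least-squares functional, and the step size are all illustrative assumptions, not the paper's construction.

```python
import numpy as np

# Minimal sketch, under assumptions: the iterate is a finite pre-basis
# expansion u(x) = sum_k c[k] * phi_k(x) with monomials phi_k(x) = x^k
# on [0, 1] (an illustrative pre-basis, not the paper's), and the
# functional being minimized is a toy squared L2 distance to a target.

K = 8                                        # pre-basis elements used
xs = np.linspace(0.0, 1.0, 200)              # quadrature grid on [0, 1]
dx = xs[1] - xs[0]
phi = np.stack([xs**k for k in range(K)])    # phi[k, i] = phi_k(xs[i])

def J(c):
    """Toy functional: ||u - sin(pi .)||_{L2}^2, approximated on the grid."""
    u = c @ phi
    return np.sum((u - np.sin(np.pi * xs))**2) * dx

def directional_derivative(func, c, d, h=1e-6):
    """Central finite-difference estimate of D_d func(u) -- the only
    derivative information the method needs (no full gradient)."""
    return (func(c + h * d) - func(c - h * d)) / (2 * h)

rng = np.random.default_rng(0)
c = np.zeros(K)                              # coefficients of current iterate
lr = 0.5                                     # illustrative step size

for _ in range(2000):
    k = rng.integers(K)                      # random pre-basis direction
    d = np.eye(K)[k]
    c -= lr * directional_derivative(J, c, d) * d

print(f"final objective: {J(c):.2e}")
```

The appeal of a pre-basis is visible here: the loop never orthonormalizes anything and never touches a full gradient, only one-dimensional slopes along directions spanned by the pre-basis.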
Key Takeaways
- Introduces a random gradient-free optimization method for infinite-dimensional Hilbert spaces.
- Relies on directional derivatives and pre-bases to avoid computing infinite-dimensional gradients.
- Demonstrates application to solving partial differential equations using physics-informed neural networks (PINNs); see the sketch after this list.
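The PDE application can be sketched the same way: under the standard PINN-style assumption that the objective is a squared PDE residual plus a boundary penalty, the gradient-free loop above only needs a different functional. Everything below (the Poisson problem, the penalty weight, the step size) is a hypothetical stand-in for the paper's experiments.

```python
import numpy as np

# Hypothetical PINN-style functional for -u''(x) = pi^2 sin(pi x) on [0, 1]
# with u(0) = u(1) = 0 (exact solution: u(x) = sin(pi x)). Basis derivatives
# are available in closed form, so evaluating the residual needs no autodiff.

K = 8
xs = np.linspace(0.0, 1.0, 200)
dx = xs[1] - xs[0]
phi = np.stack([xs**k for k in range(K)])                      # phi_k(x) = x^k
phi_xx = np.stack([k * (k - 1) * xs**max(k - 2, 0) for k in range(K)])

def pde_loss(c):
    """Squared PDE residual plus a boundary penalty (a standard PINN loss)."""
    residual = -(c @ phi_xx) - np.pi**2 * np.sin(np.pi * xs)
    u = c @ phi
    return np.sum(residual**2) * dx + 10.0 * (u[0]**2 + u[-1]**2)

rng = np.random.default_rng(1)
c = np.zeros(K)
lr, h = 1e-3, 1e-6
for _ in range(20000):
    d = np.eye(K)[rng.integers(K)]           # random pre-basis direction
    dd = (pde_loss(c + h * d) - pde_loss(c - h * d)) / (2 * h)
    c -= lr * dd * d                          # gradient-free descent step

print(f"final PDE loss: {pde_loss(c):.3e}")
```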
Reference
“To overcome this limitation, our framework requires only the computation of directional derivatives and a pre-basis for the Hilbert space domain.”