Overlap-Resampled L-BFGS for Physics-Informed Neural Networks
Abstract
Physics-informed neural networks (PINNs) benefit from both L-BFGS optimization, which provides fast convergence via curvature information, and collocation resampling, which improves domain coverage. However, these techniques are at odds: L-BFGS requires consistent gradients across iterations, while resampling changes the loss function at each step. We propose \emph{overlap-resampled L-BFGS}, which computes curvature pairs only on points that persist between consecutive collocation batches, combined with cautious updates that filter unreliable curvature estimates. On the ice-shelf inverse problem, our method outperforms both Adam with resampling (7% improvement) and fixed-collocation L-BFGS (30% improvement). On the 2D Poisson forward problem, it provides an 8.5× improvement over Adam-only training while retaining the ability to resample. The method operates stably at overlap fractions as low as 25%, well below theoretical thresholds, demonstrating practical robustness for PINN applications.
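The core idea in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the helper names (`resample_with_overlap`, `cautious_curvature_pair`, `grad_fn`) and the toy quadratic loss are assumptions introduced here. A fraction of the collocation batch is retained across resampling steps, the L-BFGS curvature pair (s, y) is formed from gradients evaluated on those shared points at both iterates, and a cautious (Li–Fukushima-style) condition s·y > ε‖s‖² filters unreliable pairs.

```python
import numpy as np

def resample_with_overlap(batch, domain_sampler, overlap_frac, rng):
    """Keep an `overlap_frac` fraction of the current collocation batch
    and redraw the remainder from the domain (hypothetical helper)."""
    n = len(batch)
    n_keep = int(round(overlap_frac * n))
    keep_idx = rng.choice(n, size=n_keep, replace=False)
    kept = batch[keep_idx]
    fresh = domain_sampler(n - n_keep, rng)
    return np.concatenate([kept, fresh]), kept

def cautious_curvature_pair(theta_old, theta_new, grad_fn, overlap_pts, eps=1e-6):
    """Form an L-BFGS pair (s, y) from gradients on the SAME overlap points
    at both iterates; accept only if the cautious condition holds."""
    s = theta_new - theta_old
    y = grad_fn(theta_new, overlap_pts) - grad_fn(theta_old, overlap_pts)
    if s @ y > eps * (s @ s):
        return s, y   # reliable pair: would be stored in L-BFGS history
    return None       # filtered: skip storing this pair

# Toy stand-in for a PINN residual loss over collocation points x_i:
# L(theta; x) = mean_i (theta . x_i)^2, so grad = 2 * mean_i (theta.x_i) x_i.
def grad_fn(theta, pts):
    return 2.0 * np.mean((pts @ theta)[:, None] * pts, axis=0)

rng = np.random.default_rng(0)
sampler = lambda n, rng: rng.uniform(-1.0, 1.0, size=(n, 2))

batch = sampler(100, rng)
new_batch, overlap = resample_with_overlap(batch, sampler, 0.25, rng)
theta0 = np.array([1.0, -0.5])
theta1 = theta0 - 0.1 * grad_fn(theta0, overlap)  # one gradient step
pair = cautious_curvature_pair(theta0, theta1, grad_fn, overlap)
```

Because both gradients in `y` are computed on the same retained points, the pair reflects genuine curvature of one fixed objective restricted to the overlap, rather than the spurious differences that arise when the batch changes underneath the optimizer.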