Greedy Stein Variational Gradient Descent: An algorithmic approach for wave prospection problems
Speaker(s): Marcos Aurelio Capistrán Ocampo, José Luis Varona Santana
In this project, we propose a variational inference algorithm to approximate a posterior distribution. Building on the results of Liu (2016) and Blei (2017), we develop the Greedy Stein Variational Gradient Descent (G-SVGD) method, introducing a loss function that combines a weighted gradient with the Evidence Lower Bound (ELBO). The learning rate is chosen within a gradient descent scheme as a suboptimal minimizer of this loss function, with the aim of accelerating convergence to the target posterior.
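To make the particle-update structure concrete, the following is a minimal Python sketch of the standard SVGD direction of Liu (2016), together with a hypothetical greedy learning-rate rule that picks, from a small candidate set, the step that most decreases a user-supplied surrogate loss. The specific G-SVGD loss (weighted gradient plus ELBO) and its suboptimal minimization are not detailed here, so `greedy_step` is an illustrative stand-in, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(X, h=None):
    """RBF kernel matrix and its gradient w.r.t. the first argument.
    Bandwidth h defaults to the median heuristic of Liu (2016)."""
    diffs = X[:, None, :] - X[None, :, :]          # (n, n, d)
    sq_dists = np.sum(diffs ** 2, axis=-1)         # (n, n)
    if h is None:
        h = np.median(sq_dists) / np.log(X.shape[0] + 1.0) + 1e-12
    K = np.exp(-sq_dists / h)
    grad_K = -2.0 / h * diffs * K[:, :, None]      # d k(x_i, x_j) / d x_i
    return K, grad_K

def svgd_direction(X, grad_logp):
    """Standard SVGD update direction phi*(x_i) (Liu, 2016)."""
    n = X.shape[0]
    K, grad_K = rbf_kernel(X)
    G = grad_logp(X)                               # (n, d) log-posterior gradients
    # Attraction term K @ G plus repulsive term sum_j grad_{x_j} k(x_j, x_i).
    return (K @ G + grad_K.sum(axis=0)) / n

def greedy_step(X, phi, loss, candidates=(1e-3, 1e-2, 1e-1)):
    """Illustrative 'greedy' learning-rate rule (an assumption, not the
    talk's exact scheme): pick the candidate step size that most decreases
    a surrogate loss, e.g. an ELBO-based objective supplied by the user."""
    losses = [loss(X + eps * phi) for eps in candidates]
    return candidates[int(np.argmin(losses))]
```

A full iteration would then update the particles as `X += eps * svgd_direction(X, grad_logp)` with `eps` returned by `greedy_step`, repeating until the surrogate loss stabilizes.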
The convergence speed is compared against the standard SVGD method with the ADAM optimizer for learning-rate selection, as well as against an MCMC method. The algorithms are applied to two wave prospection models, representing low-contrast and high-contrast scenarios. A five-point operator is employed to improve the numerical approximation in the forward model solver, and the adjoint method is used for accurate evaluation of the gradient of the log-posterior.
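As an illustration of the discretization, below is a minimal sketch of a second-order five-point Laplacian stencil of the kind typically used in such forward wave solvers. The grid spacing, field array, and boundary handling are assumptions for the sketch, not the authors' implementation; in an adjoint-based gradient computation the same operator is reused in the adjoint solve.

```python
import numpy as np

def laplacian_5pt(u, h):
    """Five-point stencil for the 2D Laplacian on a uniform grid with
    spacing h. Only interior points are updated; boundary conditions
    (e.g. Dirichlet or absorbing layers) are left to the caller."""
    lap = np.zeros_like(u)
    lap[1:-1, 1:-1] = (u[2:, 1:-1] + u[:-2, 1:-1]
                       + u[1:-1, 2:] + u[1:-1, :-2]
                       - 4.0 * u[1:-1, 1:-1]) / h**2
    return lap
```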