ABSTRACT

We have studied the preconditioned conjugate gradient (CG) algorithm in the context of shot-record extended model domain least-squares migration. The CG algorithm is a powerful iterative technique that can solve the least-squares migration problem efficiently; however, to realize the benefits of least-squares migration, one must run the algorithm for several iterations. Generally speaking, the convergence rate of the CG algorithm depends on the condition number of the operator. Preconditioners are a family of operators that are easy to build and invert. A proper preconditioner clusters the eigenvalues of the original operator and hence reduces the condition number of the operator one wishes to invert. Accordingly, preconditioning the operator can, in theory, improve the convergence rate of the algorithm. In least-squares migration, the diagonal scaling of the Hessian and the approximated inverse of the Hessian have been shown to work well as preconditioners. We develop and apply two types of preconditioners for the shot-record extended model domain least-squares migration problem. The first preconditioner belongs to the diagonal scaling category; the second is a filter-based approach that approximates the partial Hessian operators by local convolutional filters. The goal is to increase the convergence rate of shot-record extended model domain least-squares migration using a cost function reformulated with a preconditioned operator. Experiments with the synthetic Sigsbee model and a real data example from the Mississippi Canyon data set (Gulf of Mexico) indicate that preconditioning the linear system of equations improves the convergence rate of the algorithm.
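The convergence argument above can be illustrated with a minimal sketch of preconditioned CG. This is not the paper's migration operator: the `pcg` routine, the synthetic matrix, and the Jacobi (diagonal) preconditioner below are hypothetical stand-ins for the Hessian and its diagonal scaling, chosen only to show that clustering the eigenvalues reduces the iteration count.

```python
import numpy as np

def pcg(A, b, apply_M_inv, tol=1e-8, max_iter=1000):
    """Preconditioned CG for a symmetric positive-definite A.

    apply_M_inv(r) applies the inverse of the preconditioner M to a
    residual vector; passing the identity recovers plain CG.
    Returns the approximate solution and the iteration count."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = apply_M_inv(r)
    p = z.copy()
    rz = r @ z
    b_norm = np.linalg.norm(b)
    for k in range(1, max_iter + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) <= tol * b_norm:
            return x, k
        z = apply_M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, max_iter

# Hypothetical ill-conditioned SPD stand-in for the Hessian: a
# well-conditioned core B wrapped in a widely varying diagonal scale,
# so the condition number is dominated by row/column scaling that a
# diagonal preconditioner can undo.
rng = np.random.default_rng(0)
n = 100
M = rng.standard_normal((n, n))
B = M @ M.T / n + np.eye(n)           # modest condition number
d_scale = np.logspace(0, 3, n)        # scale spread of 1e3
A = np.sqrt(d_scale)[:, None] * B * np.sqrt(d_scale)[None, :]
b = rng.standard_normal(n)

diag_A = np.diag(A)                   # diagonal (Jacobi) scaling
x_plain, it_plain = pcg(A, b, lambda r: r)         # no preconditioner
x_prec, it_prec = pcg(A, b, lambda r: r / diag_A)  # diagonal preconditioner
print(f"plain CG: {it_plain} iterations, diagonal PCG: {it_prec} iterations")
```

With the diagonal scaling applied, the effective spectrum is that of the well-conditioned core, and the iteration count drops accordingly; this is the same mechanism the abstract invokes for the diagonal-scaling preconditioner of the Hessian.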
