The modified conjugate gradient (CG) method, called the conjugate guided gradient (CGG) method, is a robust iterative inversion method producing a parsimonious model estimate. The CG method for solving least-squares (LS) (i.e., 2-norm minimization) problems is modified to solve for different norms or different minimization criteria by guiding the gradient vector appropriately during the iteration steps. Guiding is achieved by iteratively reweighting either the residual vector or the gradient vector during the iteration steps, as the iteratively reweighted least-squares (IRLS) method does. Robustness is achieved by weighting the residual vector, and parsimonious model estimation is obtained by weighting the gradient vector. Unlike the IRLS method, however, the CGG method does not change the corresponding forward operator of the problem and is implemented in a linear inversion template. Therefore, the CGG method requires less computation than the IRLS method. Since the solution in the CGG method is found in a least-squares sense along the gradient direction guided by the weights, this solution can be interpreted as the LS solution located in the guided gradient direction. Guiding the gradient gives more flexibility in the choice of weighting parameters than the IRLS method. I applied the CGG method to velocity-stack inversion, and the results show that the CGG method gives a far more robust and parsimonious model estimate than the standard 2-norm solution, with results comparable to the 1-norm IRLS solution.
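The reweighting idea can be illustrated with a minimal numerical sketch. The following is a schematic toy implementation, not the paper's exact CGG algorithm: it uses a guided steepest-descent update rather than full conjugate directions, and the function name `cgg_sketch` and its parameters are my own. It shows the two ingredients the abstract describes: an IRLS-style weight on the residual (to mimic a robust 1-norm misfit) and an exact least-squares line search along the guided direction.

```python
import numpy as np

def cgg_sketch(A, d, n_iter=50, eps=1e-8):
    """Illustrative guided-gradient iteration for A m ~ d (sketch only).

    Each iteration reweights the residual by 1/(|r| + eps), in the IRLS
    spirit, so the misfit behaves like a 1-norm (robust) criterion; the
    step length is then found by an exact least-squares line search
    along the guided gradient direction.
    """
    m = np.zeros(A.shape[1])
    for _ in range(n_iter):
        r = d - A @ m                     # current residual
        wr = 1.0 / (np.abs(r) + eps)      # 1-norm-style residual weights
        g = A.T @ (wr * r)                # guided gradient direction
        Ag = A @ g
        denom = Ag @ Ag
        if denom < eps:                   # nothing left to update
            break
        alpha = (r @ Ag) / denom          # LS-optimal step along g
        m = m + alpha * g
    return m

# Toy example: a small diagonal operator with exact data.
A = np.diag([1.0, 2.0, 3.0])
d = A @ np.ones(3)
m_est = cgg_sketch(A, d)
```

Because the step length is the least-squares minimizer along a descent direction, the 2-norm misfit is non-increasing even though the direction itself is guided by 1-norm-style weights. The full CGG method would analogously retain the conjugate-direction machinery of CG and could also weight the gradient vector itself to favor a parsimonious (sparse) model.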
