During seismic wave propagation, intrinsic attenuation inside the earth gives rise to amplitude loss and phase dispersion. Without appropriate correction strategies in migration, these effects degrade the amplitudes and resolution of migrated images. Based on a new time-domain viscoacoustic wave equation, we have developed a viscoacoustic reverse time migration (RTM) approach to correct attenuation-associated dispersion and dissipation effects. A time-reverse wave equation is derived to extrapolate the receiver wavefields, in which the sign of the dissipation term is reversed, whereas the dispersion term remains unchanged. This difference between the forward and time-reverse wave equations is consistent with the physical insight that attenuation must be compensated during wavefield backpropagation. Due to the introduction of an imaginary unit in the dispersion term, the forward and time-reverse wave equations are complex valued. They are similar to the time-dependent Schrödinger equation, whose real and imaginary parts are coupled during wavefield extrapolation. The analytic property of the extrapolated source and receiver wavefields allows us to explicitly separate up- and downgoing waves. A causal imaging condition is implemented by crosscorrelating the downgoing source and upgoing receiver wavefields to remove low-wavenumber artifacts in migrated images. Numerical examples demonstrate that our viscoacoustic RTM approach produces subsurface reflectivity images with correct spatial locations and amplitudes.
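The up-/downgoing separation and causal imaging condition described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: it assumes the wavefield snapshots are already complex valued (analytic in time), separates wave components by masking the sign of the vertical wavenumber, and crosscorrelates the results. The function names, array layout (axis 0 = depth, axis 1 = lateral position), and the sign convention tying positive wavenumbers to downgoing energy are all hypothetical choices for the sketch.

```python
import numpy as np

def updown_separate(wavefield):
    """Split a complex (analytic-in-time) wavefield snapshot into up- and
    downgoing parts by masking the sign of the vertical wavenumber k_z.

    wavefield: 2-D complex array, axis 0 = depth z, axis 1 = lateral x.
    Sign convention (positive k_z -> downgoing) is an assumption here;
    it depends on the Fourier convention of the actual extrapolator.
    """
    nz = wavefield.shape[0]
    kz = np.fft.fftfreq(nz)[:, None]        # vertical wavenumbers
    W = np.fft.fft(wavefield, axis=0)       # transform depth axis to k_z
    down = np.fft.ifft(np.where(kz > 0, W, 0), axis=0)
    up = np.fft.ifft(np.where(kz < 0, W, 0), axis=0)
    return up, down

def causal_image(src_snapshots, rec_snapshots):
    """Causal imaging condition: crosscorrelate the downgoing source
    wavefield with the upgoing receiver wavefield, summed over time steps,
    to suppress low-wavenumber backscattering artifacts."""
    image = np.zeros(src_snapshots[0].shape)
    for s, r in zip(src_snapshots, rec_snapshots):
        _, s_down = updown_separate(s)
        r_up, _ = updown_separate(r)
        image += np.real(s_down * np.conj(r_up))
    return image
```

In a full viscoacoustic RTM the snapshots would come from the forward extrapolation of the source wavefield and the time-reverse (sign-flipped dissipation) extrapolation of the receiver wavefield; the sketch only shows the imaging step applied to those snapshots.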
