Abstract

We have developed a statistical modeling method to quantify the chronostratigraphic meaning of primary seismic reflections, with an emphasis on applications at reservoir scale (<50 m). The time-correlation error (TCE) is defined as the difference between seismic events and relative geologic time (RGT). A series of statistically simulated impedance models with flat chronostratigraphic surfaces was generated from a subsurface data set to describe gradual lithofacies changes in contemporaneous strata and to account for vertical cyclicity from seed wireline logs. We converted these models to realistic seismic records using an exploding-reflector algorithm. The TCE from the seismic models was positively correlated with lateral impedance variation, and the TCE magnitude for a model of complex impedance variation could be substantial; for example, a maximum two-event TCE of 32.5 m was observed in a small 1×1-km model. An increase in wavelet frequency in general reduced the TCE and improved the seismic chronostratigraphic correlation. In addition, a preliminary test confirmed that amplitude variance in the seismic model was related to lateral impedance variation and could be used to predict TCE. Therefore, certain attributes (such as amplitude variance) were useful in developing tools for generating TCE-based, hybrid, and RGT volumes from field seismic data. This strategy integrated the advantages, and avoided the disadvantages, of existing methods.
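
As a rough illustration of the TCE concept only (not the authors' modeling workflow), the sketch below builds a small synthetic section in which a flat chronostratigraphic surface has a laterally varying impedance contrast, generates traces by convolving the reflectivity with a Ricker wavelet (a simple stand-in for the exploding-reflector algorithm described above), tracks the corresponding peak event, and reports its time deviation from the flat surface together with the amplitude variance along the event. All parameter values and the interfering second reflector are illustrative assumptions.

```python
import numpy as np

def ricker(freq, dt, length=0.128):
    """Zero-phase Ricker wavelet of dominant frequency `freq` (Hz)."""
    t = np.arange(-length / 2, length / 2, dt)
    a = (np.pi * freq * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

dt, nt, ntrace = 0.001, 501, 101           # 1-ms sampling, 0.5 s, 101 traces (assumed)
t = np.arange(nt) * dt
x = np.linspace(0.0, 1.0, ntrace)          # normalized lateral position

# Flat chronostratigraphic surface at 0.25 s; its reflection coefficient varies
# laterally to mimic gradual lithofacies change in contemporaneous strata.
t_chrono = 0.25
refl = np.zeros((ntrace, nt))
contrast = 0.05 + 0.10 * x                 # laterally varying impedance contrast
refl[np.arange(ntrace), int(round(t_chrono / dt))] = contrast
# A dipping boundary just above interferes with the flat event (illustrative).
t_interf = t_chrono - 0.004 - 0.010 * x
refl[np.arange(ntrace), np.round(t_interf / dt).astype(int)] = -0.08

w = ricker(30.0, dt)                       # 30-Hz dominant frequency (assumed)
traces = np.array([np.convolve(r, w, mode="same") for r in refl])

# Track the peak event nearest the chronostratigraphic surface on each trace.
window = (t > t_chrono - 0.02) & (t < t_chrono + 0.02)
picked = np.array([t[window][np.argmax(tr[window])] for tr in traces])

# TCE here: deviation of the tracked seismic event from the flat RGT surface,
# in two-way time (a velocity would convert it to a depth equivalent).
tce = picked - t_chrono
print(f"max |TCE| = {1e3 * np.abs(tce).max():.1f} ms")

# Amplitude variance along the tracked event, the attribute suggested above
# as a possible TCE predictor.
amp = traces[np.arange(ntrace), np.searchsorted(t, picked)]
print(f"amplitude variance along event = {amp.var():.3e}")
```

Raising the wavelet frequency in this toy model (e.g., 30 Hz to 60 Hz) shrinks the interference-induced pick shift, consistent with the abstract's observation that higher frequency reduces TCE.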
