Abstract Small geologic features manifest themselves in seismic data in the form of diffracted waves, which are fundamentally different from seismic reflections. Using two field-data examples and one synthetic example, we demonstrate the possibility of separating seismic diffractions in the data and imaging them with optimally chosen migration velocities. Our criteria for separating reflection and diffraction events are the smoothness and continuity of local event slopes that correspond to reflection events. For optimal focusing, we develop the local varimax measure. The objectives of this work are velocity analysis implemented in the poststack domain and high-resolution imaging of small-scale heterogeneities. Our examples demonstrate the effectiveness of the proposed method for high-resolution imaging of such geologic features as faults, channels, and salt boundaries.
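The varimax focusing criterion behind this abstract can be illustrated with a toy computation. In the paper the measure is evaluated in local windows during migration-velocity scanning; the sketch below, with an assumed function name, computes only a global varimax norm to show why a focused (spiky) image scores higher than a smeared one of equal energy.

```python
import numpy as np

def varimax(image):
    """Global varimax norm: N * sum(s^4) / (sum(s^2))^2.

    A better-focused image concentrates its energy in fewer samples,
    which raises the fourth-power sum relative to the squared energy,
    so sharper images score higher.
    """
    s = np.asarray(image, dtype=float).ravel()
    return s.size * np.sum(s**4) / np.sum(s**2) ** 2

# A spiky (focused) signal scores higher than a smeared one of equal energy.
focused = np.array([0.0, 0.0, 2.0, 0.0, 0.0])
smeared = np.array([0.8, 0.8, 1.2, 0.8, 0.8])  # same total energy, spread out
assert varimax(focused) > varimax(smeared)
```

Scanning migration velocity and keeping the one that maximizes this measure in each window is the essence of the focusing analysis the abstract describes.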
Abstract Spectral decomposition is a standard tool to facilitate and accelerate seismic interpretation. Applications include highlighting changes in layer thicknesses (for example, in meandering channels and turbidite layers) as well as exploring for low-frequency gas shadows. We argue that local phase analysis serves as a complementary aid in seismic interpretation because the layer thickness, type of impedance contrast, and boundary shape determine the amplitude, peak frequency, and phase of the locally observed wavelet.
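Spectral decomposition as described above maps a trace into a time-frequency panel so that thin-bed tuning and low-frequency shadows become visible. The following is a minimal single-trace short-time Fourier sketch (the function name and windowing choices are ours, not from the paper):

```python
import numpy as np

def stft_decompose(trace, dt, win_len):
    """Minimal short-time Fourier spectral decomposition of one trace.

    Returns (freqs, amp) where amp[i, k] is the spectral amplitude of
    frequency freqs[k] in a Hanning-tapered window centered on sample i.
    """
    n = len(trace)
    half = win_len // 2
    padded = np.pad(np.asarray(trace, float), half)  # zero-pad the edges
    taper = np.hanning(win_len)
    amp = np.array([np.abs(np.fft.rfft(padded[i:i + win_len] * taper))
                    for i in range(n)])
    freqs = np.fft.rfftfreq(win_len, dt)
    return freqs, amp

# A 30 Hz sinusoid should light up near 30 Hz at its center time.
dt = 0.002                              # 2 ms sampling
t = np.arange(0, 0.5, dt)
trace = np.sin(2 * np.pi * 30 * t)
freqs, amp = stft_decompose(trace, dt, win_len=64)
peak = freqs[np.argmax(amp[len(t) // 2])]
assert abs(peak - 30.0) < freqs[1]      # within one frequency bin
```

Interpreters typically extract constant-frequency slices from such a panel; the trade-off between the window length and the frequency-bin spacing (here 1/(64 x 0.002) ≈ 7.8 Hz) is the classic time-frequency resolution compromise.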
Abstract Because of their high acoustic velocities, their ability to create steep slopes, and their substantial postdepositional diagenetic modification, carbonate rocks are notoriously more difficult to image and interpret with seismic data than siliciclastic rocks. This paper shows how building a 3-D synthetic seismogram based on well-constrained, outcrop-based 3-D geocellular models can help in seismic interpretation and seismic-based reservoir characterization. A workflow to populate a 3-D geological model with velocity is presented; it is based on building statistical distributions of velocity per facies, lithostratigraphic unit, or diagenetic feature and extrapolating velocity through the model using stochastic Gaussian simulation. A 3-D model built from Lower Permian deep-water carbonate gravity flows is used to demonstrate the complexity of interpreting intricate 3-D geometries using 2-D planar seismic slices and to assess the volumetric error associated with the intrinsic resolution loss of seismic data. Modern karst morphology is used to assess the seismic response of caves, sinkholes, and karst topography. Finally, a Permian dolomitized ramp-crest grainstone complex is used to test the sensitivity of prestack techniques to pore-type changes in grainy carbonate rocks. These examples illustrate the strength of building a well-calibrated 3-D synthetic seismogram based on a 3-D geocellular model so that some of the complexity of the seismic response of carbonate rocks might be unraveled.
AVO: Interpretation of AVO anomalies
Abstract We investigate the effects of changes in rock and fluid properties on amplitude-variation-with-offset (AVO) responses. In the slope-intercept domain, reflections from wet sands and shales fall on or near a trend that we call the fluid line. Reflections from the top of sands containing gas or light hydrocarbons fall on a trend approximately parallel to the fluid line; reflections from the base of gas sands fall on a parallel trend on the opposing side of the fluid line. The polarity standard of the seismic data dictates whether these reflections from the top of hydrocarbon-bearing sands are below or above the fluid line. Typically, rock properties of sands and shales differ, and therefore reflections from sand/shale interfaces are also displaced from the fluid line. The distance of these trends from the fluid line depends upon the contrast in the ratio of P-wave velocity (VP) to S-wave velocity (VS). This ratio is a function of pore-fluid compressibility, which implies that distance from the fluid line increases with increasing compressibility. Reflections from wet sands are closer to the fluid line than hydrocarbon-related reflections. Porosity changes affect acoustic impedance but do not significantly impact the VP/VS contrast. As a result, porosity changes move the AVO response along trends approximately parallel to the fluid line. These observations are useful for interpreting AVO anomalies in terms of fluids, lithology, and porosity.
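The slope-intercept domain referred to above comes from fitting each reflection's angle-dependent amplitude with the two-term Shuey approximation R(theta) ≈ A + B sin²(theta), then crossplotting intercept A against gradient B. A minimal least-squares sketch of that fit (function name assumed, values synthetic):

```python
import numpy as np

def intercept_gradient(angles_deg, reflectivity):
    """Least-squares fit of the two-term Shuey approximation
    R(theta) ~ A + B * sin^2(theta); returns (A, B)."""
    theta = np.radians(np.asarray(angles_deg, dtype=float))
    G = np.column_stack([np.ones_like(theta), np.sin(theta) ** 2])
    coeffs, *_ = np.linalg.lstsq(G, np.asarray(reflectivity, float), rcond=None)
    return coeffs[0], coeffs[1]

# Synthetic gas-sand-like top reflection: negative intercept and negative
# gradient, which would plot away from the fluid line in the A-B crossplot.
angles = [0, 10, 20, 30]
A_true, B_true = -0.05, -0.15
refl = [A_true + B_true * np.sin(np.radians(a)) ** 2 for a in angles]
A, B = intercept_gradient(angles, refl)
assert abs(A - A_true) < 1e-9 and abs(B - B_true) < 1e-9
```

On real gathers the fit is done sample by sample, and the displacement of each (A, B) pair from the fluid-line trend is the anomaly attribute the abstract interprets in terms of fluids, lithology, and porosity.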
Anisotropy: Seismic anisotropy in exploration and reservoir characterization: An overview
Abstract Recent advances in parameter estimation and seismic processing have allowed incorporation of anisotropic models into a wide range of seismic methods. In particular, vertical and tilted transverse isotropy are currently treated as an integral part of velocity fields employed in prestack depth migration algorithms, especially those based on the wave equation. We briefly review the state of the art in modeling, processing, and inversion of seismic data for anisotropic media. Topics include optimal parameterization, body-wave modeling methods, P-wave velocity analysis and imaging, processing in the τ-p domain, anisotropy estimation from vertical-seismic-profiling (VSP) surveys, moveout inversion of wide-azimuth data, amplitude-variation-with-offset (AVO) analysis, processing and applications of shear and mode-converted waves, and fracture characterization. When outlining future trends in anisotropy studies, we emphasize that continued progress in data-acquisition technology is likely to spur transition from transverse isotropy to lower anisotropic symmetries (e.g., orthorhombic). Further development of inversion and processing methods for such realistic anisotropic models should facilitate effective application of anisotropy parameters in lithology discrimination, fracture detection, and time-lapse seismology.
Abstract Rock physics has evolved to become a key tool of reservoir geophysics and an integral part of quantitative seismic interpretation. Rock-physics models adapted to site-specific deposition and compaction help extrapolate rock properties away from existing wells and, by so doing, facilitate early exploration and appraisal. Many rock-physics models are available, each having benefits and limitations. During early exploration or in frontier areas, direct use of empirical site-specific models may not help because such models have been created for areas with possibly different geologic settings. At the same time, more advanced physics-based models can be too uncertain because of poor constraints on the input parameters without well or laboratory data to adjust these parameters. A hybrid modeling approach has been applied to siliciclastic unconsolidated to moderately consolidated sediments. Specifically in sandstones, a physical-contact theory (such as the Hertz-Mindlin model) combined with theoretical elastic bounds (such as the Hashin-Shtrikman bounds) mimics the elastic signatures of porosity reduction associated with depositional sorting and diagenesis, including mechanical and chemical compaction. For soft shales, the seismic properties are quantified as a function of pore shape and occurrence of cracklike porosity with low aspect ratios. A work flow for upscaling interbedded sands and shales using Backus averaging follows the hybrid modeling of individual homogenous sand and shale layers. Different models can be included in site-specific rock-physics templates and used for quantitative interpretation of lithology, porosity, and pore fluids from well-log and seismic data.
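The upscaling step in the work flow above uses Backus averaging of interbedded sands and shales. Full Backus averaging of isotropic layers yields a transversely isotropic effective medium; the sketch below (function name ours) keeps only the vertical P-wave part: a thickness-weighted harmonic mean of the P modulus and an arithmetic mean of density.

```python
import numpy as np

def backus_vertical_vp(vp, rho, h):
    """Vertical P-wave velocity of a finely layered isotropic stack,
    upscaled in the spirit of the Backus (1962) average:
    harmonic (thickness-weighted) mean of the P modulus M = rho*vp^2,
    arithmetic mean of density."""
    vp = np.asarray(vp, float)
    rho = np.asarray(rho, float)
    w = np.asarray(h, float)
    w = w / w.sum()                          # thickness fractions
    M_eff = 1.0 / np.sum(w / (rho * vp**2))  # harmonic modulus average
    rho_eff = np.sum(w * rho)                # arithmetic density average
    return np.sqrt(M_eff / rho_eff)

# Equal proportions of a 3000 m/s sand and a 2200 m/s shale:
v_eff = backus_vertical_vp([3000, 2200], [2300, 2400], [1, 1])
assert 2200 < v_eff < 3000
# A uniform stack upscales to its own velocity.
assert abs(backus_vertical_vp([2500, 2500], [2300, 2300], [1, 2]) - 2500) < 1e-9
```

In the hybrid modeling described above, the individual sand and shale layer properties would first come from the contact-theory and pore-shape models, with this averaging applied afterward.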
Electrical and Electromagnetic Methods: Electromagnetic geophysics: Notes from the past and the road ahead
Abstract During the last century, electrical geophysics has been transformed from a simple resistivity method to a modern technology that uses complex data-acquisition systems and high-performance computers for enhanced data modeling and interpretation. Not only have the methods and equipment changed, but our ideas about the geoelectrical models used for interpretation have also been modified tremendously. This paper describes the evolution of the conceptual and technical foundations of EM methods. It outlines a framework for further development, which should focus on multitransmitter and multireceiver surveys, analogous to seismic data-acquisition systems. Important potential topics of future research efforts are in the areas of multidimensional modeling and inversion, including a new approach to the formulation and understanding of EM fields based on flux and voltage representation, which corresponds well to geophysical experiments involving the measurement of voltage and flux of electric and magnetic fields.
Abstract Marine controlled-source electromagnetic (CSEM) surveying has been in commercial use for predrill reservoir appraisal and hydrocarbon exploration for 10 years. Although a recent decrease has occurred in the number of surveys and publications associated with this technique, the method has become firmly established as an important geophysical tool in the offshore environment. This is a consequence of two important aspects associated with the physics of the method: First, it is sensitive to high electrical resistivity, which, although not an unambiguous indicator of hydrocarbons, is an important property of economically viable reservoirs. Second, although the method lacks the resolution of seismic wave propagation, it has a much better intrinsic resolution than potential-field methods such as gravity and magnetic surveying, which until now have been the primary nonseismic data sets used in offshore exploration. Although by many measures marine CSEM is still in its infancy, the reliability and noise floors of the instrument systems have improved significantly over the last decade, and interpretation methodology has progressed from simple anomaly detection to 3D anisotropic inversion of multicomponent data using some of the world’s fastest supercomputers. Research directions presently include tackling the airwave problem in shallow water by applying time-domain methodology, continuous profiling tools, and the use of CSEM for reservoir monitoring during production.
Abstract Today, surface-wave analysis is widely adopted for building near-surface S-wave velocity models. The surface-wave method continues to evolve rapidly, thanks in part to lively scientific debate among different disciplines, and interest in the technique has increased significantly during the last decade. A comprehensive review of the literature in the main scientific journals provides historical perspective, methodological issues, applications, and the most promising recent approaches. Higher modes in the inversion and the retrieval of lateral variations are dealt with in detail, and the current scientific debate on these topics is reported. A best-practices guideline is also outlined.
Gravity Exploration Methods: 75th Anniversary: Historical development of the gravity method in exploration
Abstract The gravity method was the first geophysical technique to be used in oil and gas exploration. Despite being eclipsed by seismology, it has continued to be an important and sometimes crucial constraint in a number of exploration areas. In oil exploration the gravity method is particularly applicable in salt provinces, overthrust and foothills belts, underexplored basins, and targets of interest that underlie high-velocity zones. The gravity method is used frequently in mining applications to map subsurface geology and to directly calculate ore reserves for some massive sulfide orebodies. There is also a modest increase in the use of gravity techniques in specialized investigations for shallow targets. Gravimeters have undergone continuous improvement during the past 25 years, particularly in their ability to function in a dynamic environment. This and the advent of global positioning systems (GPS) have led to a marked improvement in the quality of marine gravity and have transformed airborne gravity from a regional technique to a prospect-level exploration tool that is particularly applicable in remote areas or transition zones that are otherwise inaccessible. Recently, moving-platform gravity gradiometers have become available and promise to play an important role in future exploration. Data reduction, filtering, and visualization, together with low-cost, powerful personal computers and color graphics, have transformed the interpretation of gravity data. The state of the art is illustrated with three case histories: 3D modeling of gravity data to map aquifers in the Albuquerque Basin, the use of marine gravity gradiometry combined with 3D seismic data to map salt keels in the Gulf of Mexico, and the use of airborne gravity gradiometry in exploration for kimberlites in Canada.
Abstract During the past 80 years, ground-penetrating radar (GPR) has evolved from a skeptically received glacier sounder to a full multicomponent 3D volume-imaging and characterization device. The tool can be calibrated to allow for quantitative estimates of physical properties such as water content. Because of its high resolution, GPR is a valuable tool for quantifying subsurface heterogeneity, and its ability to see nonmetallic and metallic objects makes it a useful mapping tool to detect, localize, and characterize buried objects. No tool solves all problems, so studying the reasons for past failures provides an understanding of the basics, which in turn helps determine whether GPR is appropriate for a given problem. We discuss the specific aspects of borehole radar and describe recent developments in polarimetric uses of radar data that make the method more sensitive to orientation and exploit the supplementary information in the different components. Multicomponent GPR data contain more diverse geometric information than single-channel data, and this is exploited in dedicated imaging algorithms. The evolution of these imaging schemes is discussed for ground-coupled and air-coupled antennas. For air-coupled antennas, the measured radiated wavefield can be used as the basis for the wavefield extrapolator in linear inversion schemes with an imaging condition, which eliminates the source-time function and corrects for the measured radiation pattern. A handheld GPR system coupled with a metal detector is ready for routine use in minefields. Recent advances in modeling, tomography, and full-waveform inversion, as well as Green’s function extraction through correlation and deconvolution, show much promise in this field.
Interpretation Methods: From reflection elements to structure — A look at the history of data interpretation
Abstract Traditionally, input acquired in the field consisted of the original paper records; output submitted to the client consisted of structural sections and depth-contour maps of selected interfaces. Before the introduction of magnetic recording, it was common practice to do the conversion in the field office. Tools for this conversion ranged from slide rules and desk calculators to wavefront charts. These tools were based on the geometry of rays in media where velocity is a function of depth only. The detailed algorithms underlying the conversion were often developed in the exploration companies and — originally — were carefully guarded. But at least the underlying principles were exchanged throughout the industry through books, journal articles, and presentations at meetings, such as noted in nearly 300 references in C. H. Dix’s Seismic Prospecting for Oil (1952). The techniques of data acquisition and data interpretation have changed considerably, but the underlying principles of ray geometry are the same. Therefore, many new methods are based on ideas formulated in the early times of the industry.
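The ray-geometry principles this abstract credits to the early industry include the interval-velocity formula from C. H. Dix's era, still in routine use. A minimal sketch of the Dix (1955) conversion from rms velocities to an interval velocity (function name and the synthetic two-layer example are ours):

```python
import math

def dix_interval_velocity(v_rms_upper, t_upper, v_rms_lower, t_lower):
    """Dix (1955) interval velocity between two reflectors, from their
    rms velocities and zero-offset two-way traveltimes (t_lower > t_upper)."""
    num = v_rms_lower**2 * t_lower - v_rms_upper**2 * t_upper
    return math.sqrt(num / (t_lower - t_upper))

# Two layers: 2000 m/s over 3000 m/s, one second of two-way time in each.
# The rms velocity down to the second reflector follows from its definition,
# and the Dix formula recovers the 3000 m/s interval velocity exactly.
v_rms2 = math.sqrt((2000**2 * 1.0 + 3000**2 * 1.0) / 2.0)
assert abs(dix_interval_velocity(2000.0, 1.0, v_rms2, 2.0) - 3000.0) < 1e-9
```

Exactly this kind of depth-only velocity assumption underlay the wavefront charts and field-office time-to-depth conversion the abstract describes.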
Practicing Geophysics: 3D seismic volume visualization and interpretation: An integrated workflow with case studies
Abstract One of the major problems in subsurface seismic exploration is the uncertainty (nonuniqueness) in geologic interpretation because of the complexity of subsurface geology and the limited dimension of the data available. Case studies from worldwide exploration projects indicate that an integrated, three-dimensional (3D) seismic volume visualization and interpretation workflow contributes to resolving the problem by mining and exposing critical geologic information from within seismic data volumes. Following 3D seismic data acquisition and processing, the interpretation workflow consists of four integrated phases from data selection and conditioning, to structure and facies characterization, to prospect evaluation and generation, to well-bore planning. In the data selection and conditioning phase, the most favored and frequently used data are the full-angle, limited-angle, and limited-azimuth stack amplitude with significant structure and facies enhancements. Signal-to-noise ratio, color scheme, dynamic range, bit resolution, and visual contrast all affect the visibility of features of interest. In the structure and facies characterization phase, vertical slicing along arbitrary traverses demonstrates structure styles, stratigraphic architecture, and reservoir geometry in the cross-sectional view. Time/depth slicing defines lateral and vertical variability in the structural trend and areal extent in the map view. Stratal slicing and fault slicing map chronostratigraphic seismic facies and cross-stratal, along-fault seismic signature. Volume flattening and structure restoration aid in unraveling paleostructural framework and stratigraphic architecture and their growth histories. In the prospect evaluation and generation phase, a combination of volume trimming, co-rendering, transparency, attribute analysis, and attribute-body detection is instrumental in delineating volumetric extent and evaluating spatial connectivity of critical seismic features. Finally, in the well-bore planning phase, informed decision-making relies on the integration of all the information and knowledge interrogated from 3D seismic data. Most importantly, interpreters’ geologic insight and play concept are crucial to optimal well-bore planning with high geologic potential and low economic risk.
Magnetic Exploration Methods: 75th Anniversary: The historical development of the magnetic method in exploration
Abstract The magnetic method, perhaps the oldest of geophysical exploration techniques, blossomed after the advent of airborne surveys in World War II. With improvements in instrumentation, navigation, and platform compensation, it is now possible to map the entire crustal section at a variety of scales, from strongly magnetic basement at regional scale to weakly magnetic sedimentary contacts at local scale. Methods of data filtering, display, and interpretation have also advanced, especially with the availability of low-cost, high-performance personal computers and color raster graphics. The magnetic method is the primary exploration tool in the search for minerals. In other arenas, the magnetic method has evolved from its sole use for mapping basement structure to include a wide range of new applications, such as locating intrasedimentary faults, defining subtle lithologic contacts, mapping salt domes in weakly magnetic sediments, and better defining targets through 3D inversion. These new applications have increased the method’s utility in all realms of exploration — in the search for minerals, oil and gas, geothermal resources, and groundwater, and for a variety of other purposes such as natural hazards assessment, mapping impact structures, and engineering and environmental studies.
Passive Seismic: Petroleum reservoir characterization using downhole microseismic monitoring
Abstract Imaging of microseismic data is the process by which we use information about the source locations, timing, and mechanisms of the induced seismic events to make inferences about the structure of a petroleum reservoir or the changes that accompany injections into or production from the reservoir. A few key projects were instrumental in the development of downhole microseismic imaging. Most recent microseismic projects involve imaging hydraulic-fracture stimulations, which has grown into a widespread fracture diagnostic technology. This growth in the application of the technology is attributed to the success of imaging the fracture complexity of the Barnett Shale in the Fort Worth basin, Texas, and the commercial value of the information obtained to improve completions and ultimately production in the field. The use of commercial imaging in the Barnett is traced back to earlier investigations to prove the technology with the Cotton Valley imaging project and earlier experiments at the M-Site in the Piceance basin, Colorado. Perhaps the earliest example of microseismic imaging using data from downhole recording was a hydraulic fracture monitored in 1974, also in the Piceance basin. However, early work is also documented where investigators focused on identifying microseismic trace characteristics without attempting to locate the microseismic sources. Applications of microseismic reservoir monitoring can be traced through current steam-injection imaging, deformation associated with reservoir compaction in the Yibal field in Oman and the Ekofisk and Valhall fields in the North Sea, and production-induced activity in Kentucky, U.S.A.
Abstract Microseismic monitoring of reservoir processes can be performed using surface or near-surface arrays. We review the published technical basis for the use of the arrays and the historical development of the method, beginning with locating earthquakes through geothermal exploration to the growing field of hydraulic-fracture monitoring. Practical considerations for the array deployment and data processing are presented. The road ahead for the technology includes a move toward life-of-field buried arrays as well as opportunities for extended interpretation of the data, particularly inversion for source-mechanism estimation and measurement of anisotropy in the monitored subsurface.
Abstract One major cause of elastic wave attenuation in heterogeneous porous media is wave-induced flow of the pore fluid between heterogeneities of various scales. It is believed that for frequencies below 1 kHz, the most important cause is the wave-induced flow between mesoscopic inhomogeneities, which are large compared with the typical individual pore size but small compared with the wavelength. Various laboratory experiments in some natural porous materials provide evidence for the presence of centimeter-scale mesoscopic heterogeneities. Laboratory and field measurements of seismic attenuation in fluid-saturated rocks provide indications of the role of the wave-induced flow. Signatures of wave-induced flow include the frequency and saturation dependence of P-wave attenuation and its associated velocity dispersion, frequency-dependent shear-wave splitting, and attenuation anisotropy. During the last four decades, numerous models for attenuation and velocity dispersion from wave-induced flow have been developed with varying degrees of rigor and complexity. These models can be categorized roughly into three groups according to their underlying theoretical framework. The first group of models is based on Biot’s theory of poroelasticity. The second group is based on elastodynamic theory, where local fluid flow is incorporated through an additional hydrodynamic equation. The third group of models is derived using the theory of viscoelasticity. Though all models predict attenuation and velocity dispersion typical for a relaxation process, there exist differences that can be related to the type of disorder (periodic, random, space dimension) and to the way the local flow is incorporated. The differences manifest themselves in different asymptotic scaling laws for attenuation and in different expressions for characteristic frequencies. In recent years, some theoretical models of wave-induced fluid flow have been validated numerically, using finite-difference, finite-element, and reflectivity algorithms applied to Biot’s equations of poroelasticity. Application of theoretical models to real seismic data requires further studies using broadband laboratory and field measurements of attenuation and dispersion for different rocks as well as development of more robust methods for estimating dissipation attributes from field data.
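The relaxation behavior common to all the model groups above has the generic shape of a standard linear solid (Zener) attenuation peak: 1/Q rises with frequency, peaks at a characteristic relaxation frequency, and falls off again. The sketch below is purely illustrative of that shape, not of any specific wave-induced-flow model from the review; the function name and parameterization are ours.

```python
import numpy as np

def zener_inverse_q(freq_hz, f_relax, q_peak_inv):
    """Inverse quality factor 1/Q for a standard linear solid (Zener)
    relaxation: a Debye-type peak of height q_peak_inv at f_relax.
    Attenuation is symmetric about f_relax on a logarithmic axis."""
    x = np.asarray(freq_hz, float) / f_relax
    return q_peak_inv * 2.0 * x / (1.0 + x**2)

# Attenuation peaks at the relaxation frequency ...
assert abs(zener_inverse_q(50.0, 50.0, 0.05) - 0.05) < 1e-12
# ... and is weaker both one decade below and one decade above it.
assert zener_inverse_q(5.0, 50.0, 0.05) < 0.05
assert zener_inverse_q(500.0, 50.0, 0.05) < 0.05
```

The asymptotic slopes of such a peak (here linear in frequency on both flanks) are exactly the kind of scaling law the abstract notes can differ between model groups, which is what makes broadband attenuation measurements diagnostic.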