
#### Handbook of Poststack Seismic Attributes

The Handbook of Poststack Seismic Attributes is a general reference for poststack seismic attributes, intended for reflection seismologists in petroleum exploration. The goal of the book is to bring greater understanding and order to the important and rapidly evolving science of seismic attributes, so that geophysicists can apply attributes more effectively to interpret seismic data. To this end, the emphasis is on what all attributes have in common, what they mean, and what they measure, and the argument is made that the meaning of an attribute should guide both its implementation and its application. Certain attributes are judged to be useful and others to be useless, and the advantages as well as the shortcomings of attribute analysis are considered. Sufficient mathematics is provided to implement the attributes, favoring clarity and simplicity over mathematical rigor. In the manner of a handbook, the methods and ideas covered are those most likely to be encountered in practice, with no pretense of being comprehensive.

The book begins by introducing the fundamental ideas that underlie all seismic attribute analysis and reviewing the history of seismic attributes from their origins to current developments. The characteristics of key and familiar poststack attributes are described, starting with attribute maps and interval statistics and progressing through complex trace attributes, 3D attributes that quantify aspects of geologic structure and stratigraphy, seismic discontinuity attributes, spectral decomposition, thin-bed analysis, waveform classification, recursive inversion for relative acoustic impedance, and spectral ratioing for Q estimation. How attributes are usefully combined in multiattribute analysis through volume blending, cross-plotting, principal component analysis, and unsupervised classification is discussed. The book ends with a brief overview of how seismic attributes aid data interpretation, with a look at bright spots, frequency shadows, faults, channels, diapirs, and data reconnaissance. The glossary provides definitions of seismic attributes and methods, and appendices provide the necessary background mathematics.
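The complex trace attributes described above can be sketched numerically. The following hypothetical example (not taken from the book; trace parameters chosen for illustration) uses SciPy's Hilbert transform to form the analytic trace and extract instantaneous amplitude, phase, and frequency:

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic poststack trace: a 30 Hz carrier in a Gaussian envelope
# (hypothetical example, standing in for a real seismic trace)
dt = 0.002                       # sample interval, s
t = np.arange(0.0, 1.0, dt)      # 1 s trace
trace = np.sin(2 * np.pi * 30 * t) * np.exp(-((t - 0.5) / 0.1) ** 2)

# Complex (analytic) trace: trace + i * Hilbert transform of trace
analytic = hilbert(trace)

envelope = np.abs(analytic)              # instantaneous amplitude (reflection strength)
phase = np.unwrap(np.angle(analytic))    # instantaneous phase, radians
# Instantaneous frequency: time derivative of phase, converted to Hz
inst_freq = np.gradient(phase, dt) / (2 * np.pi)
```

The envelope bounds the trace from above by construction, and in the well-excited middle of the trace the instantaneous frequency recovers the 30 Hz carrier.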

#### Multicomponent Seismic Technology

Bob A. Hardage, Michael V. DeAngelo, Paul E. Murray, Diana Sava

A principle that is emphasized throughout this book is that the physics of any multicomponent seismic technology cannot be understood unless that technology is viewed in terms of the particle-displacement vectors associated with the various modes of a seismic wavefield. This material therefore begins with a discussion of seismic vector-wavefield behavior to set the stage for subsequent chapters.

Several approaches can be used to explain why each wave mode of nine-component (9C) and three-component (3C) seismic data that propagates through subsurface geology provides a different amount and type of rock/fluid information about the geology that the wave modes illuminate. Some approaches appeal to people who have limited interest in mathematics. Others need to be structured for people who have an appreciation of the mathematics of wavefield reflectivity. Another argument focuses on the fundamental differences between P-wave and S-wave radiation patterns and the distinctions in target illumination associated with 9C and 3C seismic sources. We will consider all of those paths of logic.

A principle that will be stressed is that each mode of a multicomponent seismic wavefield senses a different earth fabric along its propagation path because its particle-displacement vector is oriented in a different direction than are the particle-displacement vectors of its companion modes. Although estimates of earth fabric obtained from the various modes of a multicomponent seismic wavefield can differ, each estimate still can be correct, because each wave mode deforms a unit volume of rock in a different direction, depending on the orientation of its particle-displacement vector. Those deformations sense a different earth resistance in directions parallel and normal to various symmetry planes in real-earth media. The logic of that nonmathematical approach appeals to people who are interested in the geologic and petrophysical information that multicomponent seismic data can provide and are less concerned about theory and mathematics.

A second approach that is helpful for distinguishing one-component (1C), 3C, and 9C wavefield behavior focuses on the mathematics of the reflectivity equation associated with each mode of the full-elastic seismic wavefield. The mathematical structure of the reflectivity equation for each seismic wave mode describes why and how petrophysical properties of the propagation medium affect different wave modes in different ways. The logic of that analytical approach is appreciated by scientists who are comfortable with mathematics.

All of these concepts lead to the development of a new seismic-interpretation science based on multicomponent seismic data, called elastic wavefield seismic stratigraphy.

#### Seismology of Azimuthally Anisotropic Media and Seismic Fracture Characterization

Ilya Tsvankin, Vladimir Grechka

Traveltimes of reflected waves (reflection moveout) in heterogeneous anisotropic media are usually modeled by multioffset and multiazimuth ray tracing (e.g., Gajewski and Pšenčík, 1987). Whereas anisotropic ray-tracing codes are sufficiently fast for forward modeling, their application in moveout inversion requires repeated generation of azimuthally dependent traveltimes around many common-midpoint (CMP) locations, which makes the inversion procedure extremely time-consuming. Also, purely numerical solutions do not give insight into the influence of anisotropy on reflection traveltimes.

This chapter is devoted to an analytic treatment of conventional-spread reflection moveout in anisotropic media. For models with moderate structural complexity and spreadlength-to-depth ratios close to unity, traveltimes in CMP geometry are well described by normal-moveout (NMO) velocity defined in the zero-spread limit (Tsvankin and Thomsen, 1994; Tsvankin, 2005). Even in the presence of nonhyperbolic moveout, NMO velocity (Vnmo) is still responsible for the most stable, conventional-offset portion of the moveout curve. The description of Vnmo given here provides an analytic basis for moveout inversion, helps evaluate the contribution of the anisotropy parameters to reflection traveltimes, and leads to a significant increase in the efficiency of traveltime modeling and inversion methods.
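The role of Vnmo on conventional spreads can be illustrated with a small sketch (all values hypothetical, and the moveout taken as purely hyperbolic, which the chapter shows is a good short-spread approximation): squaring the moveout equation t²(x) = t0² + x²/Vnmo² turns it into a straight line in t² versus x², so Vnmo and t0 follow from a line fit to CMP traveltimes.

```python
import numpy as np

# Hyperbolic moveout in a CMP gather (hypothetical model values)
t0 = 1.0        # zero-offset two-way time, s
vnmo = 2500.0   # NMO velocity, m/s
offsets = np.linspace(0.0, 2500.0, 26)              # source-receiver offsets, m
times = np.sqrt(t0**2 + (offsets / vnmo) ** 2)      # t(x) = sqrt(t0^2 + x^2/Vnmo^2)

# Recover Vnmo and t0 from a straight-line fit of t^2 against x^2:
# slope = 1/Vnmo^2, intercept = t0^2
slope, intercept = np.polyfit(offsets**2, times**2, 1)
vnmo_est = 1.0 / np.sqrt(slope)
t0_est = np.sqrt(intercept)
```

In an azimuthally anisotropic medium, the slope (and hence the apparent Vnmo) varies with source-receiver azimuth, which is what makes the azimuthal moveout signature diagnostic of the anisotropy parameters.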

#### Digital Imaging and Deconvolution: The ABCs of Seismic Exploration and Processing

Enders A. Robinson, Sven Treitel

Digital Imaging and Deconvolution: The ABCs of Seismic Exploration and Processing (SEG Geophysical References Series No. 15) covers the basic ideas and methods used in seismic processing, concentrating on the fundamentals of seismic imaging and deconvolution. Most chapters are followed by problem sets. Some exercises supplement textual material; others are meant to stimulate classroom discussions. Text and exercises deal mostly with simple examples that can be solved with nothing more than pencil and paper. The book covers wave motion; digital imaging; digital filtering; various visualization aspects of the seismic reflection method; sampling theory; the frequency spectrum; synthetic seismograms; wavelets and wavelet processing; deconvolution; the need for continuing interaction between the seismic interpreter and the computer; seismic attributes; phase rotation; and seismic attenuation. The last of the 15 chapters gives a detailed mathematical overview. Digital Imaging and Deconvolution, nominated for the Association of Earth Science Editors award for best geoscience publication of 2008–2009, will interest professional geophysicists, graduate students, and upper-level undergraduates in geophysics. The book also will be helpful to scientists and engineers in other disciplines who use digital signal processing to analyze and image wave-motion data in remote-detection applications. The methods described are important in optical imaging, video imaging, medical and biological imaging, acoustical analysis, radar, and sonar.

Geophysicists are often turned off by equations. This is unfortunate because equations are simply compact, quantitative expressions of relationships, and one should make an effort to understand the information that they convey. They tell us what factors are important in a relationship and their relative importance. They also suggest what factors are not relevant, except perhaps through indirect effects on the relevant factors. Graphs often help us visualize equations more clearly. We may think of derivatives as simply measures of the slopes of curves, maxima and minima being merely the places where the slopes are zero, and integration as simply summing up the area under a curve. An imaginary exponential indicates a periodic function. Limitations imposed by initial assumptions or by approximations in their derivations apply to most equations, and these should be appreciated in order to avoid drawing erroneous conclusions from the equations.
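Two of the claims in the preceding paragraph can be checked numerically in a few lines (values chosen arbitrarily for illustration): an imaginary exponential exp(iωt) repeats with period 2π/ω, and a derivative is simply the slope of the curve.

```python
import numpy as np

# "An imaginary exponential indicates a periodic function":
# exp(i*w*t) repeats with period T = 2*pi/w.
w = 2 * np.pi * 5.0          # angular frequency of a 5 Hz oscillation
T = 2 * np.pi / w            # period, here 0.2 s
t = np.linspace(0.0, 1.0, 1001)
z = np.exp(1j * w * t)
z_shifted = np.exp(1j * w * (t + T))   # shifting by one period changes nothing

# "Derivatives as measures of the slopes of curves":
# d/dt Re[exp(i*w*t)] = d/dt cos(w*t) = -w*sin(w*t),
# and the numerical slope agrees with that analytic derivative.
slope = np.gradient(z.real, t)
```

The shifted and unshifted exponentials coincide to machine precision, and away from the endpoints the finite-difference slope matches -w sin(wt).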

Three-dimensional (3-D) seismic surveys have become a major tool in the exploration and exploitation of hydrocarbons. The first few 3-D seismic surveys were acquired in the late 1970s, but it took until the early 1990s before they gained general acceptance throughout the industry. Until then, the subsurface was mapped using two-dimensional (2-D) seismic surveys.

Theories on the best way of sampling 2-D seismic lines were not published until the late 1980s, notably by Anstey, Ongkiehong and Askin, and Vermeer. These theories were all based on the insight that offset forms a third dimension, for which sampling rules must be given.

The design of the first 3-D surveys was severely limited by what technology could offer. Gradually, the number of channels that could be used increased, leading to discussions on what constitutes a good 3-D acquisition geometry. The general philosophy was to expand lessons learned from 2-D acquisition to 3-D. This approach led to much emphasis on the properties of the CMP gather (or bin), because good sampling of offsets in a CMP gather was the main criterion in 2-D design. 3-D design programs were developed that concentrated mainly on analysis of bin attributes and, in particular, on offset sampling (regularity, effective fold, azimuth distribution, etc.).

This conventional approach to 3-D survey design is limited by an incomplete understanding of the differing properties of the many geometries that can be used in 3-D seismic surveys. In particular, the sampling requirements for optimal prestack imaging were not properly taken into account. This book addresses these problems and provides a new methodology for the design of 3-D seismic surveys.

The approach used in this book is the same as that employed in my Seismic Wavefield Sampling, a book on 2-D seismic survey design published in 1990: before the sampling problem can be addressed, it is essential to develop a good understanding of the continuous wavefield to be sampled. In 2-D acquisition, only a 3-D wavefield has to be studied, consisting of the temporal coordinate t and two spatial coordinates: the shot coordinate xs and the receiver coordinate xr. In 3-D acquisition, the prestack wavefield is 5-D, with two extra spatial coordinates: the shot coordinate ys and the receiver coordinate yr.

In practice, not all four spatial coordinates of the prestack wavefield can be properly sampled (proper sampling is defined as a sampling technique that allows the faithful reconstruction of the underlying continuous wavefield). Instead, it is possible to define three-dimensional subsets of the 5-D prestack wavefield that can be properly sampled. In fact, the 2-D seismic line is but one example of such 3-D subsets.
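The bin-attribute analysis described above can be sketched with a toy orthogonal geometry (all coordinates and spacings hypothetical): each shot-receiver pair contributes one trace at the source-receiver midpoint, and the fold of a bin is the number of midpoints that fall inside it.

```python
import numpy as np

# Hypothetical 3-D geometry: regular grids of shots (xs, ys) and receivers (xr, yr)
shots = np.array([(x, y) for x in range(0, 1000, 100)
                         for y in range(0, 1000, 200)], dtype=float)
recvs = np.array([(x, y) for x in range(0, 1000, 50)
                         for y in range(0, 1000, 400)], dtype=float)

bin_size = 25.0           # square bin edge length, m
fold = {}                 # bin index -> number of traces (fold)
for sx, sy in shots:
    for rx, ry in recvs:
        mx, my = (sx + rx) / 2.0, (sy + ry) / 2.0         # midpoint coordinates
        key = (int(mx // bin_size), int(my // bin_size))  # bin containing the midpoint
        fold[key] = fold.get(key, 0) + 1

max_fold = max(fold.values())
total_traces = sum(fold.values())   # every shot-receiver pair yields one trace
```

Offset and azimuth per bin could be accumulated the same way; the book's point is precisely that such bin statistics alone do not capture the sampling requirements of prestack imaging.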

#### Geophysics in the Affairs of Mankind: A Personalized History of Exploration Geophysics

L. C. (Lee) Lawyer, Charles C. Bates, Robert B. Rice

Scientists have long been trained to build on the successes or failures of their predecessors, their teachers, and their fellows, largely through scientific associations and their publications. Such societies range from small, local ones to huge organizations with membership drawn from more than 100 countries. The oldest and most prestigious for geophysicists is the Royal Society, given both its name and charter by Britain's King Charles II back in 1660. The Royal Astronomical Society, chartered in 1820, has also had a marked interest in geophysical matters, even to the extent of publishing a Geophysical Journal, because the earth is very much a part of the planetary system. Within the United States, the prestigious National Academy of Sciences (NAS) was started as an ally of government at the initiative of President Abraham Lincoln, who asked the scientific community in 1863 for technical assistance with the war effort. Geophysical societies per se did not appear until the early 1900s. As a result of the great San Francisco earthquake, the Seismological Society of America (SSA) was formed in 1906. The International Union of Geodesy and Geophysics (IUGG) came into being in 1911, while its U.S. interface, the American Geophysical Union (AGU), was finally organized in 1919. The field of exploration geophysics lagged even further, with the Society of Exploration Geophysicists not incorporated until 1930.

Long before the advent of scientific societies, perceptive men had been contending with the physical forces of nature. Aristotle (384–322 BC) compiled the first known geophysical treatise, the Meteorologica, less than half of which pertained to weather matters; the remainder dealt with oceanography, astronomy, and meteors (also called shooting stars). Formal seismic instrumentation appeared as early as A.D. 132, when Chang Heng set up a seismoscope in China that indicated not only that an earthquake had occurred but also the direction of the first motion. However, man's formal knowledge of the physics of the earth did not change much from the time of Aristotle until late in the European Renaissance, when the fertile mind of Leonardo da Vinci (1452–1519) initiated new thinking on this subject, as it did on so many others. Early in the 16th century, he studied, for example, the tides of the Euxine (Black) and Caspian Seas, as well as the mechanics and inherent dangers of rock slippage along a geological fault near Florence, Italy. He also deduced that Alpine rocks were at one time submerged, for he found sea shell fossils embedded in them.

“This reference manual is designed to enable more geophysicists to appreciate static corrections, especially their limitations, their relationship with near-surface geology, and their impact on the quality of final interpreted sections. The book is addressed to those involved in data acquisition (datum static corrections), data processing (datum static and residual static corrections), and interpretation (the impact that unresolved static corrections, especially the long-wavelength or low-spatial-frequency component, have on interpretation of the final section). Simple explanations of the underlying principles are included in an attempt to remove some of the mystique of static corrections. The principles involved are illustrated with simple models, supplemented with many data examples. This book details differences in approaches that must be considered among 2D, 3D, and crooked-line recordings as well as between P-wave and S-wave surveys. Static corrections are shown to be a simplified yet practical approach to modeling the effects of the near surface where a more correct wavefield or raypath-modeled method might not be undertaken efficiently. Chapters cover near-surface topography and geology; computation of datum static corrections; uphole surveys; refraction surveys; the limitations of static corrections and their effect on seismic data processes; residual static corrections; and interpretation aspects. An extensive index and a large list of references are included.”
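The datum static correction mentioned above can be sketched in its simplest form (all velocities and elevations hypothetical; this assumes a single weathering layer with the datum below its base, which is only one of the configurations the book treats): the correction strips the traveltime between the surface and the datum from each station.

```python
# Minimal datum-static sketch (hypothetical values)
v_weathering = 600.0   # weathering-layer velocity, m/s
v_sub = 2400.0         # subweathering (replacement) velocity, m/s
datum = 300.0          # flat datum elevation, m

def datum_static(elevation, weathering_thickness):
    """One-way static (s) shifting a station from the surface down to the datum.

    The recorded time includes travel through the slow weathering layer plus
    subweathering rock down to the datum; the correction removes that time,
    so it is negative (subtracted from the trace time).
    """
    t_weathering = weathering_thickness / v_weathering
    t_sub = (elevation - weathering_thickness - datum) / v_sub
    return -(t_weathering + t_sub)

# Total static applied to a trace = shot static + receiver static
shot_static = datum_static(elevation=380.0, weathering_thickness=15.0)
recv_static = datum_static(elevation=350.0, weathering_thickness=10.0)
total_static = shot_static + recv_static
```

Even this toy version shows why the weathering layer dominates: a few tens of meters of 600 m/s material contributes as much time as a much thicker subweathering interval.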

#### Tensors of Geophysics, Volume 2: Generalized Functions and Curvilinear Coordinates

Frank Hadsell, Richard Hansen

You must get out into the world if you are to understand your home. If you have never been “beyond the outhouse and the spring,” then you lack a certain perspective (John Niland, Wyoming sheepherder). This is also true in the abstract world of mathematics. A study of complex numbers imparts a greater understanding of real numbers, and one has a much better grasp of arithmetic after studying algebra. This chapter shows how one can generalize the idea of functions to obtain not only a better appreciation of functions but also a more versatile language for the study of geophysical theory.

One can argue that all field measurements can be represented by functions and that the “aim of the game” is to predict future measurements. Why, then, is increased versatility needed in mathematics? Why is something more general than functions needed? The answer is that increased versatility is needed to analyze more complicated physical systems in terms of simpler component parts, and these relatively simple component parts often are not describable in terms of functions. For example, when dealing with waves, one can define a propagation distribution that describes the process of propagation separately from the source and receiver processes. This propagation distribution is generally not a function.

The algebra of distributions was designed by Schwartz (1952) on a very pragmatic theme. Walt Whitman (Colorado School of Mines emeritus professor of mathematics and geophysics) says it is a “formalization of a potpourri of tricks” that have evolved in engineering and science since the time of Oliver Heaviside (1891) and before. A thorough understanding of distributions requires an appreciation of sets and measure as presented by Roman (1974). These subjects are treated briefly in TOG5.

Early in academic training, we were exposed to the function as an ordered sequence of pairs of numbers. We also could view the (single-valued) function as the process or rule that associates another number with a given number. From such a viewpoint, sin(t) could be considered the process of associating with any given number t another number g, where g = sin(t). Examples from this ordered sequence are shown in Table 1. Note that there is more to the sine story than appears in Table 1. Not only are there other values of the variable, but there are also other variables, as discussed later in this chapter.

1-D tensors.—Think of t as the prototype coordinate of a one-space.
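The two views of a function described above, as an ordered sequence of pairs and as a rule, can be mirrored in a few lines (the sample points below are chosen for illustration and do not reproduce the book's Table 1):

```python
import math

# View 1: the function as an ordered sequence of (t, g) pairs, g = sin(t)
samples = [(t, math.sin(t)) for t in (0.0, math.pi / 6, math.pi / 2, math.pi)]

# View 2: the same function as a rule, the process sending t to sin(t)
rule = math.sin

# Both views agree on every sampled pair
assert all(abs(rule(t) - g) < 1e-12 for t, g in samples)
```

As the text notes, a finite table of pairs never exhausts the function; the rule view covers every value of the variable at once, and distributions generalize further to objects that no table or pointwise rule can describe.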

#### Geologic Applications of Gravity and Magnetics: Case Histories

Richard I. Gibson, Patrick S. Millegan

The idea for this book came from a perceived lack of recent, instructive examples of exploration-oriented interpretations of gravity and magnetic data. The Society of Exploration Geophysicists' two volumes of Geophysical Case Histories are probably closest in philosophy to this book. Published in 1948 and 1956, those volumes contain many examples that are relatively dated and specific to particular areas. We hope this new book provides an update that includes lessons about gravity and magnetic exploration that can be applied to many parts of the world. The Utility of Regional Gravity and Magnetic Anomaly Maps (SEG, 1985, W. J. Hinze, editor) contains some excellent papers dealing with tectonics that have clear bearing on hydrocarbon exploration, but no paper shows the relationships among hydrocarbon accumulations, exploration, gravity, and magnetics. Geophysical texts focusing on gravity and magnetics, including L. L. Nettleton's classics, include only a few (albeit often excellent) case histories, and many are dated.

Thus, this book's target audience is geologists and geophysicists in operations offices, actively involved in exploration at any level from basin analysis to prospect generation. Although most of the papers deal with hydrocarbon exploration, several relate to gravity and magnetic data in mining and environmental applications. A final section is included on new developments, the state of the art. The book is not intended for gravity and magnetics specialists (although we hope they will find it interesting), or for geophysicists interested in theory, acquisition, and processing, unless those aspects are important to the geologic exploration problem and to the decision-making process.

We believe that the philosophical approach to interpretation is almost as important as some aspects of a technical interpretation itself. This book reveals the diversity of philosophies that gravity and magnetic interpreters embrace, as well as the common threads to which all interpreters aspire. This book is not a textbook, although we have tried hard to highlight the exploration lessons inherent in each technical paper. Additional instructional aspects of the book are the glossary of gravity and magnetic terms, provided by Integrated Geophysics Corporation (with assistance from Richard Hansen of Pearson, DeRidder & Johnson), and an annotated bibliography, which has pointers to the rich literature of gravity and magnetics. Other short "lessons" can be found in stand-alone illustrations or short features throughout the book.

We thank Ray Thomasson for continual encouragement, suggestions, and prodding. Reviewers, whose efforts are greatly appreciated, include Dale Bird, Bill Pearson, Mark Odegard, and several anonymous reviewers. We appreciate the help of the AAPG, especially Ken Wolgemuth, in this, the first effort at serious book publication by the coeditors.

“Geophysicists come from diverse academic disciplines including physics, geology, mathematics, engineering, and computer science. Students need a source where they can acquire a common language of mathematics that is appropriate to geophysics. This volume relies on five basic principles: conservation of momentum, conservation of energy, Maxwell's equations, conservation of mathematical form, and embedding of calculi. It is assumed that those who study this book have a respectable background in mathematics, physics, and computer science as applied to time-series analysis. This book is intended for students who wish to acquire more depth in the field of geophysics.”

“Written for both the nongeophysicist and the practicing geophysicist, this book collects many of the formulas, principles, concepts, and field approximations of seismic survey design. The basics of 2D and 3D design in this book offer an introduction to the nongeophysicist and provide a good review for the practicing geophysicist. Arrays, obstacles, and special problems are discussed, as are aspects introduced by 3D surveys. The author explores design attributes such as fold, costs, and field time.”

#### Seismic Wavefield Sampling: A Wave Number Approach to Acquisition Fundamentals

Michael R. Cooper, Gijs J. O. Vermeer

The use of the reflection seismic method for the exploration of hydrocarbons entered a new epoch with the introduction of the horizontal stacking method in the early 1950s (Mayne, 1962; Sheriff and Geldart, 1982). In his pioneering article, Mayne explained the benefit of using shorter receiver patterns for the preservation of reflected signals while delegating some of the ground-roll suppression to the stacking process.

For a long time, the 6-fold stack was the ultimate of the seismic method, owing to limitations in acquisition and processing hardware. Not until the late 1970s did reductions in station spacing, combined with a large increase in the number of recorded channels, allow higher stacking multiplicities, leading to further drastic improvements in the quality of the final seismic reflection sections.

The introduction of multiple-coverage recording called for the use of multichannel processing. Stacking, the multichannel process introduced first, was soon followed by poststack multichannel processes such as velocity filtering (Fail and Grau, 1963; Embree et al., 1963) and migration. Prestack multichannel processes became more feasible, and even mandatory, when the increase in the number of recorded channels led to larger multiplicities and the reduced pattern lengths resulted in lower signal-to-noise (S/N) ratios. Some multichannel processes developed in the 1970s and 1980s are multiple elimination, surface-consistent statics correction, surface-consistent deconvolution, slant stacking, dip-moveout (DMO) correction, and prestack migration. Most, if not all, of these techniques perform best for high multiplicities and regular spatial sampling.

Though a variety of sophisticated multichannel processing techniques were developed by the best scientists in the industry, not much attention has been paid to the basic theories required for an optimal definition of acquisition parameters. Progress in the field of data acquisition has been driven by technology, experimentation, and intuition rather than by sound theories. A breakthrough in the formulation of seismic data acquisition techniques came with Nigel Anstey's papers on the stack-array approach (1986a and b). Anstey's recommendations, which were based on experience and intuition, need to be modified and extended on the basis of a more theoretical approach to the three-dimensional aspects of multiple-coverage data. This theoretical background is provided here, and the consequences for seismic data acquisition and seismic data processing are elaborated upon. A basic knowledge of the seismic method and of signal processing is assumed. Sheriff and Geldart (1982) is recommended as a more general introduction to seismic exploration.

#### An Overview of Exploration Geophysics in China — 1988

Zhao Jingxiang, Wang Yanjun, Fu Xuexin, Stanley H. Ward

“This is the first collection of technical papers providing a general picture of exploration geophysics in China. Many case histories are included, plus some theory and technical developments.”

#### A Practical Introduction to Borehole Geophysics: An Overview of Wireline Well Logging Principles for Geophysicists

J. Labo, Samuel H. Mentemeier, Charles A. Cleneay

“The introduction to borehole geophysics presented here emphasizes hardware, operational aspects, key geophysical measurements along with their pitfalls, and an overview of well log interpretation principles. This introduction gives an explanation of what is seen at the wellsite, while the interpretation chapters aid in understanding how logs are used for formation evaluation, their most immediate purpose. This overview will help in understanding how each piece of a logging course fits together. By understanding well-logging principles, an explorationist will have a better knowledge of geophysical well logging than is provided by an interpretation course alone and will develop a better background from which to make log quality judgments.”