Data Quality and Management
The importance of data quality and management continues to increase as we address ever more demanding interpretation challenges and move irreversibly farther into the workstation environment. In the glare of technological developments enabled by massive growth in computing power, we must not lose sight of the critical needs to assess data quality as an essential element of every interpretation project and to manage the burgeoning volumes of data and interpretation products that threaten to overwhelm us. More than you may realize, these two concerns can affect the quality, timeliness, and ultimately the business value of your interpretation.
A discussion of seismic data quality necessarily begins by defining exactly what is meant by “quality.” In its most general sense, quality is the degree to which something fulfills its intended purpose. All measures of seismic data quality are inherently subjective, so it is important to know why a particular data set was acquired and processed the way it was so as to set the proper context for assessing its quality. For example, you wouldn't grade data from a conventional 3D survey purposely acquired and processed for deep exploration as poor quality because they aren't suitable for evaluating shallow drilling hazards. Similarly, you wouldn't consider data from a 2D high-resolution shallow hazards survey as poor quality because they are useless for deep exploration (compare Figures 1 and 2). Given the purpose for a data set, you evaluate quality based on specific characteristics according to the degree to which the data set suits its purpose.
Assessing seismic data quality is one of the most important aspects of your job as a seismic interpreter. It is an expectation that you satisfy and a requirement that you meet in every interpretation project. Your ability to describe and effectively communicate your evaluation of data quality develops over time as you gain experience, expand your knowledge of seismic data acquisition and processing, and broaden your exposure to different elements of geology in a wide range of settings and environments.
First Steps in Seismic Interpretation
Accurate interpretation of geophysical data — in particular, reflection seismic data — is one of the most important elements of a successful oil and gas exploration program. Despite technological advances in data acquisition and processing and the regular use of powerful computers and sophisticated software applications, you still face a tremendous challenge each time you begin to reconstruct the geologic story contained in a grid or volume of seismic data — that is, to interpret the data. On occasion, this interpretive tale can be clearly told, but most of the time each page of each chapter is slowly turned, and rarely is the full meaning of the story completely understood.
Where the correlation of one reflection record with another is very easy, little needs to be said. Almost anyone can understand such a correlation. On the other hand, this is a rare occurrence. The usual thing is for the correlation to be so difficult as to be impossible. It is for this reason that correlation procedure can hardly be described in words (Dix, 1952).
Although Dix is speaking about the correlation of individual reflection records, which were used routinely before the advent of continuous common-depth-point (CDP) profiling, he clearly recognized the essence of interpretation as the considered extraction of geologic information from indirect geophysical measurements. His words are no less relevant and applicable now than they were 60 years ago, even in view of the high standards of data quality made possible by advances in seismic acquisition and processing, to say nothing of accompanying developments in interpretation technology. In the modern interpretation environment, you still face correlations that are “so difficult as to be impossible” because these correlations define the frontiers of opportunity, the ones posing the sternest challenges and ultimately leading to the greatest rewards.
The primary aim of this book is to describe Dix's correlation procedure in terms of the science, data, tools, and techniques now used in seismic interpretation in the oil and gas industry. As an individual geoscientist, you develop and apply your own approach and style when interpreting seismic data. You continually revise and refine correlation procedures during the course of your career and expand them as you complete different interpretation projects. With experience, you learn to check and recheck the validity of your procedures to fully understand the rules of evidence that govern their use:
You must have a good understanding of seismic acquisition and processing principles as well as fundamentals of geology before beginning to collect interpretive evidence and solve interpretation problems correctly.