SEISMIC PROCESSING

version 1.0 released 29/1/99

CONTENTS

 


INTRODUCTION

The purpose of seismic processing is to manipulate the acquired data into an image that can be used to infer the sub-surface structure. Only minimal processing would be required if we had a perfect acquisition system. Processing consists of the application of a series of computer routines to the acquired data, guided by the hand of the processing geophysicist. There is no single "correct" processing sequence for a given volume of data. At several stages, judgements or interpretations have to be made which are often subjective and rely on the processor's experience or bias. The interpreter should be involved at all stages to check that processing decisions do not radically alter the interpretability of the results in a detrimental manner. A further chapter provides guidelines for specific processing problems encountered in various geological provinces.

Processing routines generally fall into one of the following categories:

In this chapter we briefly describe working with contractors and define a basic 2D marine processing flow which contains links to the more specialised descriptions of various processing routines.


 


CONTRACTORS

Today most processing is carried out by contractors who are able to perform most jobs quickly and cheaply with specialised staff, software and computer hardware. There are currently five main contractors who are likely to have an office or an affiliation almost anywhere in the world where oil exploration is taking place. In addition there are many smaller localised contractors principally in London and Houston, and also some specialised contractors who concentrate on particular processing areas. These are summarised in the following table.

CONTRACTOR                   SIZE         SPECIALITY
CGG                          Large        3D, PSDM, Land
WESTERN GEOPHYSICAL          Large        3D
GECO-PRAKLA                  Large        3D, Land
PGS                          Large        3D
VERITAS (formerly DIGICON)   Large        3D, 2D, Land
ENSIGN                       Moderate     3D marine, some 2D
SPECTRUM                     Moderate     2D, basic 3D, scanning
GEOTEAM                      Small        2D
ROBERTSONS                   Small        2D, basic 3D
ODEGAARD                     Specialised  Inversion
JASON                        Specialised  Inversion
GXT                          Specialised  PSDM (2D & 3D)
PARADIGM                     Specialised  PSDM (2D & 3D)

 

In order to place a contract with one of the contractors, a budget and AFE (authorisation for expenditure) should be in place. The processing manager (or nominated staff) can then raise a CRD (contract request document) with contracts staff in Aberdeen. A large contract would typically be tendered to all the major contractors and a subset of the minor contractors, with the successful bid chosen by the contracts engineer and processing manager using criteria such as cost, turnaround, quality and specialised knowledge. Amerada Hess Limited has call-off contracts with all the major contractors, so the tendering, evaluation and award process can be completed in less than two weeks if required. For typical tasks most of the major contractors can be considered approximately equivalent, although to a certain extent this depends on the individual personnel involved. For certain smaller specialised projects a single-source purchase order may be raised, subject to appropriate justification. Depending on size, a typical 2D project may take 4-12 weeks and a typical 3D project 4-12 months to complete.

It is usual for the processing manager or a nominated representative to follow the progress of the contract and liaise with the exploration staff for its duration. It is usually the responsibility of the exploration staff to keep partners informed of progress. The contractor will typically produce a schedule at the beginning of a project and will almost certainly fail to keep to it. The schedule should include all appropriate testing, meetings, production and final copies (including reporting) for the project.


 


A PROCESSING FLOW

 

[Figure: processing job]

A processing flow is a collection of processing routines applied to a data volume. The processor will typically construct several jobs which string certain processing routines together in a sequential manner. Most processing routines accept input data, apply a process to it and produce output data which is saved to disk or tape before passing to the next processing stage. Several of the stages will be strongly interdependent, and each of the processing routines will require several parameters, some of which may be defaulted. Some of the parameters are defined, for example, by the acquisition geometry, and some must be determined for the particular data being processed by testing. It is reiterated that the parameter choice is often subjective. The best rule is to keep things as simple as possible unless forced to do otherwise.
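The idea of a job as a sequential chain of parameterised routines can be sketched as follows. This is an illustrative sketch only: the routine names, parameters and gain law are hypothetical, not any contractor's actual software.

```python
import numpy as np

def gain(data, power=2.0, dt=0.004):
    """Illustrative time-power gain: scale each sample by t**power."""
    t = np.arange(data.shape[1]) * dt          # time of each sample
    return data * t**power

def bandpass_stub(data, low=8.0, high=60.0):
    """Placeholder frequency filter; a real job would design a proper filter."""
    return data                                 # no-op stand-in

def run_job(data, flow):
    """Apply each (routine, parameters) stage in sequence, passing the
    output of one stage as the input of the next."""
    for routine, params in flow:
        data = routine(data, **params)
    return data

# A small synthetic "data volume": 48 traces of 1000 samples.
traces = np.random.default_rng(0).normal(size=(48, 1000))

# The job: a chain of routines, each with its parameter choices.
flow = [(gain, {"power": 2.0, "dt": 0.004}),
        (bandpass_stub, {"low": 8.0, "high": 60.0})]

result = run_job(traces, flow)
```

The structure makes the interdependence explicit: changing a parameter in an early stage changes the input seen by every later stage, which is why testing proceeds stage by stage.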

TESTING

For the unknown parameters the processor will establish a series of tests which usually take a representative sample of the data (e.g. from X to Y in the figure) and subject it to several parameter choices (B, C, D, E). The processor and interpreter then compare the test panels and pick the parameter which they judge best corrects the data. The bulk of the data is then processed with these parameter choices (production processing). When production is underway, the next processing stage can be tested. Often tests are constructed which trial several parameters simultaneously, and it can become very easy to design tests which are too complicated and consequently poorly diagnostic. The general rule is "the simpler the better". Particular care must be taken when scaling the tests (e.g. to raw panel A) to ensure that fair comparisons are made between them - a so-called "apples to apples" comparison.
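The testing procedure above can be sketched in a few lines: take a representative slice of the data, run the routine under test with each candidate parameter value, and scale every panel to the RMS level of the raw panel so the comparison is "apples to apples". The routine and parameter values here are stand-ins invented for illustration.

```python
import numpy as np

def agc_stub(data, window):
    """Hypothetical stand-in for the routine under test; it ignores the
    window length and simply normalises by mean absolute amplitude."""
    return data / (np.abs(data).mean() + 1e-12)

def build_test_panels(data, routine, param_name, values, x0, x1):
    """Build a raw panel plus one panel per candidate parameter value,
    each scaled back to the RMS of the raw panel for fair comparison."""
    sample = data[x0:x1]                        # representative traces X..Y
    raw_rms = np.sqrt(np.mean(sample**2))
    panels = {"raw": sample}
    for v in values:
        panel = routine(sample, **{param_name: v})
        rms = np.sqrt(np.mean(panel**2))
        panels[f"{param_name}={v}"] = panel * (raw_rms / (rms + 1e-12))
    return panels

rng = np.random.default_rng(1)
data = rng.normal(size=(200, 500))              # synthetic line: 200 traces

# One parameter varied at a time - "the simpler the better".
panels = build_test_panels(data, agc_stub, "window", [100, 250, 500], 50, 100)
```

Varying one parameter per test, as here, keeps the panels diagnostic; trialling several parameters at once quickly becomes uninterpretable.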

For a 2D project, typically a dip line and a strike line would be chosen for testing. Sometimes parameters would be checked on other lines too. A brute stack would often be produced for each 2D section, sometimes onboard the acquisition vessel. This brute stack typically uses a single set of parameters (particularly a single velocity function) for the entire data volume but may be very useful for highlighting areas where changing geology or other conditions may require different seismic processing parameters. Some tests would be performed on whole lines (ideal), but some may be performed on smaller areas, e.g. 500-trace segments.

Often a zone of interest is defined by the interpreter so that routines can be concentrated in this area. Note that this can result in degradation of other parts of the section. This can be a significant problem when processing a regional 3D survey when only a single 2D line is chosen for testing.

For 3D projects several representative 2D lines should be chosen for initial testing. As soon as the data grid has been established then several crosslines should also be selected for testing. Most contractors can now extract crossline information following the first production processing stage. Crosslines can be particularly useful for confirming test parameters. In addition any onboard processed data (if available) should be used.


 


A Simple Marine Processing Flow

A typical processing sequence has been established over the years for marine 2D processing. The sequence is displayed graphically in the following figure. Click on any item in the sequence or figure to obtain more detailed information, or proceed to the later written description of the sequence.

[Figure: processing flow]

Phases 1 to 5 are described as prestack processes. The stack process typically reduces the data volume by a factor of 30 to 60, so post-stack processes are often much quicker to apply. This course concentrates on post-stack processing, following a discussion of key prestack processes such as velocity analysis.
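The 30-to-60-times reduction corresponds to the CMP fold: the stack collapses every trace in a CMP gather into a single trace. A back-of-envelope check, using the standard 2D marine fold formula (the geometry numbers below are illustrative, not from the text):

```python
def fold(n_channels, shot_interval, group_interval):
    """CMP fold for simple 2D marine geometry:
    fold = (channels * group interval) / (2 * shot interval)."""
    return (n_channels * group_interval) / (2 * shot_interval)

# Illustrative geometries: 96 or 120 channels, 25 m shot and group intervals.
print(fold(96, 25.0, 25.0))    # 48.0
print(fold(120, 25.0, 25.0))   # 60.0
```

Folds of 48-60 are typical of late-1990s marine acquisition, which is where the stated 30-60x reduction in data volume comes from.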


 


Advanced Processing Flows

Alternative processing flows can be selected from the following list (not complete as of 29/1/99).


 



Prestack Processing Flows

The full prestack processing sequence follows:

SHOT DOMAIN

  1. DATA INPUT & QC: usually reformatted to an internal format more efficient than that provided by SEG standards. Bad or noisy traces are edited and geometry is applied according to the observers' logs.
  2. DESIGNATURE: conversion of source wavelet to minimum phase equivalent to prepare data for deconvolution.
  3. RESAMPLING: data are often re-sampled from 2ms to 4ms following anti-alias filtering. This makes subsequent processing cheaper and does not appreciably reduce frequency content for typical deep targets.
  4. GAIN CORRECTION: to compensate for geometric divergence and other amplitude losses.
  5. (TRACE REDUCTION): often applied to increase the group interval from 12.5m to 25m and consequently the CMP interval from 6.25m to 12.5m. By halving the number of traces the subsequent processing stages will be cheaper, at little reduction in resolution. The reduction may be applied by summing adjacent traces (either with or without NMO), but should be performed by K-filter and trace drop.
  6. (MULTIPLE SUPPRESSION): some routines perform better in the shot domain for example tau-p domain deconvolution and certain wave-equation methods.

Items in () are optional.
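Steps 3 and 4 above can be sketched under simplifying assumptions: resampling is shown as decimation after a crude moving-average anti-alias filter, and the gain is a simple t-squared divergence compensation. Real systems use properly designed anti-alias filters and velocity-dependent gain laws; this is only an illustration of the arithmetic.

```python
import numpy as np

def resample_2ms_to_4ms(trace):
    """Crude sketch: smooth (stand-in for an anti-alias filter), then
    keep every other sample to go from 2 ms to 4 ms sampling."""
    smoothed = np.convolve(trace, np.ones(3) / 3.0, mode="same")
    return smoothed[::2]

def divergence_gain(trace, dt=0.004):
    """Simplified geometric-divergence compensation: scale by t**2."""
    t = np.arange(trace.size) * dt
    return trace * t**2

trace_2ms = np.random.default_rng(2).normal(size=1500)   # 3 s at 2 ms
trace_4ms = resample_2ms_to_4ms(trace_2ms)               # 750 samples at 4 ms
gained = divergence_gain(trace_4ms)

# Nyquist frequency drops from 250 Hz to 125 Hz - still ample for the
# frequency content of typical deep targets.
print(1 / (2 * 0.002), 1 / (2 * 0.004))   # 250.0 125.0
```

The halved sample count (and halved trace count from optional trace reduction) is exactly why these steps make all subsequent processing cheaper.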

CMP DOMAIN

  1. CMP GATHER: Mandatory sorting process from shot gathers to CMP gathers.
  2. DECONVOLUTION: To collapse seismic wavelet and suppress short period multiple reflections.
  3. (MULTIPLE SUPPRESSION): either using moveout or periodicity filters.
  4. (DMO OR PRESTACK MIGRATION): to remove the effects of dip and structure from velocity analysis.
  5. VELOCITY ANALYSIS: To obtain NMO corrections prior to stacking.
  6. NMO CORRECTION: Apply NMO using the velocities determined above.
  7. MUTE: removal of unwanted direct arrivals, refractions and NMO stretch.

Items in () are optional, but are almost always applied.
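The NMO correction in step 6 applies the standard hyperbolic moveout equation t(x) = sqrt(t0^2 + x^2/v^2): for each zero-offset time t0 on the output trace, the corrected value is read from the input trace at time t(x). A minimal sketch (the velocity, offset and event time below are illustrative):

```python
import numpy as np

def nmo_correct(trace, offset, velocity, dt):
    """Flatten hyperbolic moveout: output sample at zero-offset time t0 is
    read from the input trace at t(x) = sqrt(t0**2 + (offset/velocity)**2)."""
    n = trace.size
    t0 = np.arange(n) * dt
    tx = np.sqrt(t0**2 + (offset / velocity)**2)   # travel time at this offset
    idx = tx / dt                                  # fractional sample position
    return np.interp(idx, np.arange(n), trace, right=0.0)

dt = 0.004
trace = np.zeros(1000)
v, t0_event, offset = 2000.0, 1.0, 1000.0          # illustrative values

# Place a spike where an event at t0 = 1.0 s would arrive at this offset.
t_event = np.sqrt(t0_event**2 + (offset / v)**2)
trace[int(round(t_event / dt))] = 1.0

corrected = nmo_correct(trace, offset, v, dt)
print(np.argmax(corrected) * dt)                   # close to 1.0 s
```

After correction the event sits near its zero-offset time, so traces across the gather align and can be stacked. The stretch this interpolation causes at early times and far offsets is what the mute in step 7 removes.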


 


Post-Stack Processing Flows

  1. STACK: increase signal-to-noise, attenuates multiples.
  2. (NOISE SUPPRESSION): e.g. FK, FX deconvolution.
  3. (DECONVOLUTION): Further multiple suppression.
  4. MIGRATION: collapses diffractions and correctly positions dipping events.
  5. (SPECTRAL BALANCING/WHITENING): resolution improvement.
  6. (ZERO-PHASE CONVERSION): improved well ties and resolution.
  7. (FILTER): removal of high and low frequency noise.
  8. (SCALE): bring key reflectors to a suitable gain level for interpretation.
  9. DISPLAY: either on paper or loaded to a workstation.
  10. REPORT: essential to describe the processing tests performed, reasons for the decisions made etc.

 

Items in () are optional, but are almost always applied.
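The signal-to-noise benefit of the stack in step 1 can be demonstrated directly: coherent signal adds linearly across the fold while random noise adds as its square root, so S/N improves by roughly the square root of the fold. The fold and noise level below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
fold, n_samples = 48, 1000

# A common signal seen on every trace of the (NMO-corrected) CMP gather...
signal = np.sin(2 * np.pi * 30 * np.arange(n_samples) * 0.004)
# ...plus independent random noise on each trace.
gather = signal + rng.normal(scale=1.0, size=(fold, n_samples))

# The stack: one output trace per CMP, the mean over the fold.
stack = gather.mean(axis=0)

noise_in = np.std(gather[0] - signal)     # noise on a single input trace
noise_out = np.std(stack - signal)        # residual noise after stacking
print(noise_in / noise_out)               # close to sqrt(48), about 6.9
```

The same averaging attenuates multiples because, after NMO with primary velocities, multiples are not flat across the gather and so do not add coherently.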


 


FORMATS

Almost every seismic processing system has its own format for seismic trace data, velocity files, horizon files etc. This can lead to considerable confusion. A number of standard interchange formats are used by Amerada Hess. These are:
