OpendTect Workflows Documentation version 4.4

dGB Earth Sciences

Copyright © 2002-2012 by dGB Beheer B.V. All rights reserved. No part of this publication may be reproduced and/or published by print, photo print, microfilm or any other means without the written consent of dGB Beheer B.V. Under the terms and conditions of any of the following licenses, holders are permitted to make hardcopies for internal use: GNU GPL, Commercial agreement, Academic agreement.

Table of Contents

1. Introduction
   1.1. Basic manipulation
   1.2. Start a New Project
   1.3. Import data
        1.3.1. Import Seismic
        1.3.2. Import Horizons
        1.3.3. Import Wells
        1.3.4. Import Faults
        1.3.5. Import FaultStickSets
2. Attributes
   2.1. Evaluate attributes
   2.2. Dip-steering - Background vs Detailed
   2.3. Spectral Decomposition
3. Filters
   3.1. Dip-steered diffusion
   3.2. Dip-steered median filter
   3.3. Fault enhancement filter
   3.4. Ridge enhancement filter
   3.5. Spectral Blueing
4. Inversion and Rock Property Prediction
   4.1. Colored Inversion
   4.2. MPSI Stochastic Inversion
        4.2.1. Variogram Analysis
   4.3. Neural Network Rock Property Prediction
5. Object detection
   5.1. Common Contour Binning
   5.2. Chimney Cube
   5.3. Fault Cube
   5.4. Fingerprint
   5.5. UVQ waveform segmentation
6. Sequence Stratigraphy
   6.1. Chrono-stratigraphy
   6.2. Wheeler Transformation
   6.3. Stratal Slicing
   6.4. Systems Tracts Interpretation
7. Processing
   7.1. Time-depth conversion of 2D data


Chapter 1. Introduction

Table of Contents
1.1. Basic manipulation
1.2. Start a New Project
1.3. Import data

This document describes various workflows in OpendTect and its plugins. For each workflow we describe the purpose, the software that is needed (OpendTect only, or commercial plugins), and how to do it. The workflows are given in the form of a bullet list describing the sequential steps to be taken. Where possible, links to the User documentation and the OpendTect and dGB websites are given for further information. For these links to work, you need Internet access.

1.1. Basic manipulation

Purpose: To load (calculate), move, and display seismic information.

Theory: OpendTect is a system for interactive data analysis. Seismic data is either retrieved from files stored on disk or calculated on-the-fly. You only retrieve or calculate what is needed at user-specified locations. All elements (inlines, crosslines, time-slices, random lines, sub-volumes, 2D seismic, horizons, faults (v3.2+), picksets, wells, bodies, annotations) are controlled from the tree. Only data that is present in the tree currently resides in memory. In the scene, there are two basic modes of operation: view mode is for rotating, zooming, and panning; interact mode is for moving and resizing elements. Toggle between the modes by pressing the Escape key on the keyboard or clicking the hand or arrow icon.

Software: OpendTect

Workflow:

1. To view a seismic inline from a 3D volume: Right-click on the Inline entry in the tree.

2. Right-click on the new element and select Attribute -> Stored data -> your seismic file (to do this you must already have imported seismic data into OpendTect).

3. To position the data either: (1) right-click on the element -> Position, (2) fill in the line number in the position field (upper-left corner of the UI), or (3) switch to interact mode (arrow icon) and use the green anchors to resize, left-click and drag to move, and shift + left-click and drag to pan. Use right-click to reset accidental moves.

4. To change the view switch to view mode (hand icon). To rotate: left click and drag; to zoom: middle mouse wheel; to pan: middle mouse button click and drag.

Tips:

1. Mouse button settings can be changed under Utilities -> Settings -> Mouse controls.

2. Controlled zoom in view mode: press the S key on the keyboard and left-click in the scene at the position to zoom to.

For more info, see this Tutorial video:

Basic interaction (flash video)


1.2. Start a New Project

Purpose: Set up the structure for a new OpendTect project.

Theory: OpendTect stores data in flat files organized in directories and managed by OpendTect's file management system. The root directory for projects must be created by OpendTect. Usually, a root directory is created at installation time; alternatively, a new root directory can be created when you define the new project. For each project, OpendTect requires the survey boundaries and the transformation between inline/crossline numbers and X,Y rectangular co-ordinates to be specified. OpendTect projects can be set up for 3D seismic only, 2D seismic only, or 2D+3D seismic. OpendTect does not require 2D lines to lie inside the survey definition boundaries.

Software: OpendTect

Workflow:

1. Press the Survey Selection icon and press New to create a new project.

2. Specify a Survey name and Survey directory name and the type of project (2D, 3D, or both 2D and 3D).

3. Afterwards, specify the boundaries and inline, crossline to X,Y transformation. This can be done in different ways: (1) set working areas for 2D only (2) scan a 3D seismic data set, (3) copy from another survey, (4) manually specify 3 corner points (2 on the same inline, coordinate settings: easy) or (5) manually specify the transformation (coordinate settings: advanced).

4. The most commonly used method here is to scan a 3D seismic data set. Go to Range/coordinate settings: scan SEG-Y files. Afterwards, in the SEG-Y tool window select an input SEG-Y file and type (e.g. 3D volume or 2D line). Further, in this window you can also manipulate the headers of the SEG-Y file by pressing Manipulate.

5. Manipulating the headers may be necessary when prior information indicates that a change is required in the sampling rate, start time, or number of samples per trace (this is done by editing the binary header). Further, a static shift can be applied to the X/Y coordinates (e.g. to move them northwards). This is done by editing the trace headers (viz. "xcdp" and "ycdp") using the "Trace header calculation" option and specifying the formulas xcdp = xcdp + a and ycdp = ycdp + b, where a and b are the required static shifts in the X and Y directions respectively. (Optional)

6. Moreover, if the trace numbers are missing from the SEG-Y headers, they can be computed, provided that the survey area is rectangular, the inline/crossline range is known, and there are no gaps in the inlines and crosslines. This can be done by specifying the following formula in the "Trace header calculation" dialogue box. For any inline IL and crossline XL, the associated trace number is: Trace-Number = (total number of crosslines) x (IL - first-inline-number) + (XL - first-crossline-number + 1). (Optional)

7. Leave the rest of the options at their defaults and press Next.

8. In the SEG-Y revision dialogue, choose "No" or "Almost" if you are not certain that the file is in correct SEG-Y Rev. 1 format.


9. If "No" is chosen, the Inline/Crossline and/or X/Y-coordinate byte locations can be modified. Further, you can overrule the coordinate scaling, start time, and sampling rate (this can also be achieved using the previously described Manipulate option).

10. If "Almost" is chosen, you have the possibility to overrule only the coordinate scaling, start time and the sampling rate.

11. Optionally, a pre-scan of the SEG-Y file using a limited number of traces can also be done at this point, to QC the modifications made here.

12. After finishing the revision (or if the file was already in correct SEG-Y Rev. 1 format and the "Yes" option was chosen), press OK. This automatically fills the "Survey ranges" and "Coordinate settings" fields and provides a full survey scan report. If the survey is in the depth domain, manually change the "Z range" to meters.

13. The scan report can be analyzed to check whether the range of inlines/crosslines and X/Y-coordinates is sensible and whether there are any gaps in inlines or crosslines. The report also gives the amplitude range of the raw data along with various clipping ranges. This information can be used to clip the seismic amplitudes in order to reduce the file size (e.g. a 5% clipping range means that, after clipping, all samples with an amplitude count of less than 5% are given the amplitude of the samples exactly at the 5% count, thus changing the overall amplitude range of the seismic data). The clipping range is applied by scaling the amplitude values of the seismic data. You can use the two scalars to store seismic data in either 16-bit or 8-bit format when importing from SEG-Y files. (Optional)

14. Finally, press OK to leave the "Survey selection" window. The software will then ask whether you want to import the SEG-Y file (i.e. the file used for setting up the survey) now or later.
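The trace-number calculation in step 6 can be sketched as follows. This is a minimal illustration with a hypothetical survey geometry; the function and argument names are illustrative, not part of OpendTect:

```python
def trace_number(il, xl, first_il, first_xl, n_crosslines):
    """Sequential trace number for a rectangular, gap-free survey:
    every inline contributes n_crosslines traces; numbering starts at 1."""
    return n_crosslines * (il - first_il) + (xl - first_xl + 1)

# Hypothetical survey: inlines 100-199, crosslines 300-549 (250 per inline).
print(trace_number(100, 300, 100, 300, 250))  # first trace of the survey -> 1
print(trace_number(101, 300, 100, 300, 250))  # first trace of inline 101 -> 251
```

Note that the "+ 1" term makes the numbering start at 1 for the first inline/crossline pair, which is the usual SEG-Y convention.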

Tips:

1. If you have a license for Workstation Access, you can get the survey settings for a new OpendTect project directly from a seismic volume in a SeisWorks/OpenWorks or GeoFrame-IESX project. After setting up the project, you can import seismic data (and horizons, wells) from these data stores with Workstation Access.

For more info, see this Tutorial video:

Start new project (flash video)


1.3. Import data

1.3.1. Import Seismic

Purpose: Import seismic data into a new or existing OpendTect project.

Theory: Seismic data is imported into OpendTect from SEG-Y files.

Software: OpendTect

Workflow: Seismic data is imported via the 'Survey' menu in the main toolbar (Survey - Import - Seismics).

1. Seismic data can be imported to OpendTect in various formats: SEG-Y (Survey - Import - Seismic - SEG-Y), Scanned SEG-Y (Survey - Import - Seismic - SEG-Y scanned), Simple file (Survey - Import - Seismic - Simple file - 3D/2D/Pre-Stack 3D/Pre-Stack 2D) and CBVS (Survey - Import - Seismic - CBVS - From file/From other survey).

2. Most often, SEG-Y files are used to import seismic data. Go to Survey - Import - Seismic - SEG-Y to access the SEG-Y tool window. Select the input SEG-Y file(s) and define the type. Multiple files can also be input (LINK). OpendTect scans the input file before the actual loading process to check for discrepancies. For this, specify the number of traces to examine and define the SEG-Y format. The other options can be left at their defaults; press Next.

3. In the Determine SEG-Y revision window specify if the file is 100% correct SEG-Y Rev.1 (choose "Yes") or not (choose "No" or "Almost").

4. If "No" is chosen, you can do a modification in the Inline/Crossline or X-coordinate/Y-coordinate byte location. The coordinate scaling, start time and the sampling rate can be overruled as well.

5. If "Almost" is chosen, only overruling the coordinate scaling, start time and the sampling rate is possible.

6. Finally, in all three cases (i.e. "No", "Almost" and "Yes") you can make a volume sub-selection and do a pre-scan of the SEG-Y file (for quick inspection). (Optional)

7. You can also specify Format/Scaling options to store the seismic data in various formats and to scale the seismic amplitude values to reduce the OpendTect file size. The scaling is achieved using the clipping information from the pre-scan report: use the two amplitude scalars to store the seismic data in 16-bit or 8-bit format respectively (e.g. 2.5% clipping range: -6513 - 6427 [scl 16/8-bits: 5.031015; 0.019499], i.e. for 2.5% clipping use a "Factor" of 5.031015 to "Scale values" for the 16-bit format, such that the final amplitude value range of the imported seismic data becomes -6513 - 6427). (Optional) (LINK)

8. Additionally, you can toggle Optimize horizontal slice access to "Yes" to store the data on disk in such a way that loading a Z-slice becomes faster. Toggle Adjust Z range to survey range to "Yes" if the Z-range in the SEG-Y file differs from that of the survey. (Optional)

9. In the end, give a name to the output cube and press OK to finish the SEG-Y import. Optionally, check the Depth option if the survey is in the depth domain.
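The 16/8-bit scale factors quoted in step 7 can be reproduced with a small sketch. This mirrors the numbers in the example above, not OpendTect's internal code; the function name is illustrative:

```python
def scale_factor(lo, hi, nbits):
    """Factor that maps a clipped amplitude range onto a signed
    nbits-integer format, driven by the largest absolute amplitude."""
    max_int = 2 ** (nbits - 1) - 1          # 32767 for 16-bit, 127 for 8-bit
    return max_int / max(abs(lo), abs(hi))

# Example clip range from the text: 2.5% clipping -> -6513 .. 6427
lo, hi = -6513, 6427
print(round(scale_factor(lo, hi, 16), 6))   # 5.031015, the 16-bit "Factor"
print(round(scale_factor(lo, hi, 8), 6))    # 0.019499, the 8-bit "Factor"
```

The factor simply stretches the clipped range so that the largest absolute amplitude lands on the maximum value the integer format can hold.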

1.3.2. Import Horizons

Purpose: Import horizon data into a new or existing OpendTect project.

Theory: Horizons are imported into OpendTect from general ASCII files.

Software: OpendTect

Workflow: Horizons are imported via the 'Survey' menu in the main toolbar (Survey - Import - Horizons). This can be done in three different ways: (1) Geometry 3D (Survey - Import - Horizons - Ascii - Geometry 3D), (2) Attributes 3D (Survey - Import - Horizons - Ascii - Attributes 3D), and (3) Geometry 2D (Survey - Import - Horizons - Ascii - Geometry 2D).

Geometry 3D

1. For importing 3D horizons, go to Survey - Import - Horizons - Ascii - Geometry 3D and select an input ASCII file containing the horizon data.

2. If you want to view/inspect the file, press Examine. Select the attributes you want to import (if already defined) or define new attribute names by pressing Add new. Afterwards, specify whether the file contains header information (fixed size or variable).

3. Press Define format to link the X/Y-coordinates (or Inline/Crossline), Z and the above selected attributes with their respective columns in the input ASCII file.

4. Additionally, you can press Scan Input File to verify the Inline/Crossline, X/Y-coordinate and the Time ranges. You could also check if there are any gaps in the inlines/crosslines or if duplicate positions are present.(Optional)

5. If you decide to fill the gaps in the horizon, toggle Fill undefined parts to "Yes" and choose amongst the various methods available for interpolation.(Optional)(LINK)

6. Finally, provide an output horizon name and choose the Base colour. If you want you could also tie it to any previously loaded level/horizon. Press Go to begin the import.

Attributes 3D

1. If you want to load additional attributes on an already existing 3D horizon go to (Survey - Import - Horizons - Ascii - Attributes 3D ). Follow steps (2-4) as described above.

2. Choose an existing 3D horizon in Add to Horizon and press Go. In addition, you can also make an area sub-selection to load the attributes only in a particular area for the 3D horizon.


Geometry 2D

1. For importing 2D horizons go to (Survey - Import - Horizons - Ascii - Geometry 2D ).

2. Select the Input ASCII File containing 2D horizons and a 2D line set. You may also Examine the file.

3. Next, choose the horizon names (if already defined) to import by pressing Select Horizons to import or create new horizon names by pressing Add new.

4. Provide the File header information followed by Define format to link the selected horizons with their respective columns in the input ASCII file.

5. Finally, press Go to begin the import.

1.3.3. Import Wells

Purpose: Import well/checkshot data into a new or existing OpendTect project.

Theory: Well data is imported into OpendTect from SEG-Y, general ASCII, or LAS files.

Software: OpendTect

Workflow: All import functions are accessed from the 'Survey' menu in the main toolbar (Survey - Import).

1. Well data can be imported into OpendTect in various formats: ASCII (Survey - Import - Wells - ASCII - Track/Logs/Markers), VSP (SEG-Y) (Survey - Import - Wells - VSP (SEG-Y)), and Simple Multi-Well (Survey - Import - Wells - Simple Multi-Well).

2. For importing well data from ASCII file, first specify the type of data (tracks, logs or markers).

3. For well tracks, specify the input file, header and format (column order in data file). If the well is vertical or no well track file is available, uncheck the well track check-box and provide the surface coordinate in meters, KB elevation and total depth (TD).

4. Afterwards, in the second section, specify the depth-to-time model, header, and format definition. Here you also have the option to specify whether it is check-shot data or normal well data. If no depth-to-time model is available, uncheck the check-box and specify a temporary, constant model velocity (by default 2000 m/s).

5. Finally, in the advanced/optional settings the surface coordinate can be set (if different from the starting coordinate in the well track). If known, the SRD can also be given (by default in meters, check the 'Feet' checkbox in case your SRD is in feet). A number of additional fields are available for further well information such as the Well ID, Operator and State. All information filled out here will also show up in the well information window in the well manager.

6. For importing well logs, go to Survey - Import - Wells - ASCII - Logs and in the newly opened Manage Wells window press Import. Select the input LAS log file, specify (or change) the depth interval to load, and indicate whether the depths are "TVDSS" or "MD" values. Finally, select the logs that you want to import.

7. Similarly, for importing the well markers, in the Manage Wells window press Edit markers. In the newly opened Well Markers pop-up window press Read new and select the input ASCII file, define the file header and format (column order in data file) and specify if you want to keep any existing markers or replace them completely.


8. If well data is available in VSP (SEG-Y) format, first select and Examine the input SEG-Y file in the SEG-Y tool window. Then, specify if the data is in time or depth. In the second section, select to which well the VSP data should be added and specify a name for the new log. The output interval can be optionally limited.

9. The Simple Multi-Well import option allows creation of several horizontal wells at once, either manually or by reading data from a file (click Read file.. in the bottom left corner of the Simple Multi-Well Creation window, then specify the file and format in the popup dialog).

1.3.4. Import Faults

Purpose: Import faults into a new or existing OpendTect project.

Theory: Faults are imported into OpendTect from general ASCII files.

Software: OpendTect

Workflow: The fault import function is accessed via Survey - Import - Faults - Ascii3D.

1. Select the input ASCII file containing fault data and specify the Stick selection (Auto, Inl/Crl separation and Slope threshold). If "Slope threshold" is selected, provide a threshold value. Most of the time using the default option "Auto" will be sufficient.

2. Further, specify the Stick sorting (Geometric, Indexed or File order), file header and define the format (column order in data file).

3. Finally, specify an output name and press Go.

1.3.5. Import FaultStickSets

Purpose: Import FaultStickSets into a new or existing OpendTect project.

Theory: FaultStickSets are imported into OpendTect from general ASCII files.

Software: OpendTect

Workflow: FaultStickSets can be imported via Survey - Import - FaultStickSets - Ascii3D/Ascii2D.

Ascii3D

1. Select the input ASCII file containing FaultStickSet data, define the File header and the Format (i.e. column order in the data file). Optionally, you could press Examine to check and analyze the input file.

2. Finally, specify an output name and press Go.

Ascii2D

1. Select the input ASCII file containing FaultStickSet data and also select the required 2D Line Sets. Optionally, you could press Examine to check and analyze the input file.

2. Afterwards, define the File header and the Format (i.e. column order in the data file).

3. Finally, specify an output FaultStickSet name and press Go.


Chapter 2. Attributes

Table of Contents
2.1. Evaluate attributes
2.2. Dip-steering - Background vs Detailed
2.3. Spectral Decomposition

Attribute analysis is one of the key functionalities in OpendTect. Attributes are used for two purposes:

1. Visualization: you create a new view of the data that may lead to new geologic insight.
2. Integration of information: combine attributes, e.g. using neural networks (Neural Networks plugin needed).

To use attributes, you must define an attribute set (Processing menu - Attributes, or press the Attributes icon). To start, either create your own attribute set or select one of the default sets. It is possible to calculate attributes from attributes and to use mathematics and logic (if .. then .. else ..) to create your own attributes. The sky is the limit! The attributes defined in this window are the recipes for the calculations. You can either use these recipes to create new attribute volumes (Processing menu - Create seismic output) or for quick on-the-fly calculations (right-click on the element in the tree). The system only calculates what is needed to create a display. For optimal use of the system, you are advised to limit on-the-fly calculation of attributes and evaluation of parameter settings to small areas (part of an inline, crossline, time-slice, 3D sub-volume, random line, or 2D line). Inline calculations are in general the fastest. Processing of large volumes takes time and is best done in batch mode, so you can retrieve the stored data afterwards. Attributes are an integral part of OpendTect. If you also have the Dip-Steering plugin, you can improve multi-trace attributes by extracting the information along a virtual horizon. This is called dip-steering and is supported for 2D and 3D seismic. In addition, you get a number of extra attributes: dip, azimuth, and curvatures.

2.1. Evaluate attributes

Purpose: Find the best parameter setting for a particular attribute.

Theory: Use visual inspection, common sense, experience, and seismic knowledge to evaluate attributes that are appropriate for what you are studying.

Software: OpendTect

Workflow:

1. Add an element to the tree, resize and position it to where you want to do the evaluation.

2. Open the Attribute Set window.

3. Add an Attribute to the list (e.g. Energy).

4. Press the Evaluate Attribute icon.

5. Specify the parameter to evaluate (Energy only has one: Timegate).

6. Specify the initial value, increment, and number of slices (e.g. [0,0], [-4,4] and 10 result in 10 energy calculations with time-gates [0,0], [-4,4], [-8,8] ... [-36,36]).

7. Press Calculate. Note that the calculation will be done on the active element in the tree!

8. Use the slider to movie-style inspect the results and select the one you like best.
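The slice series in step 6 can be generated as in this small sketch (illustrative names; the gates match the example above):

```python
def evaluation_gates(initial, increment, n_slices):
    """Time-gate per slice: slice i widens the initial gate by
    i times the increment, as in the Evaluate Attribute window."""
    (lo0, hi0), (dlo, dhi) = initial, increment
    return [(lo0 + i * dlo, hi0 + i * dhi) for i in range(n_slices)]

gates = evaluation_gates((0, 0), (-4, 4), 10)
print(gates[:3])   # [(0, 0), (-4, 4), (-8, 8)]
print(gates[-1])   # (-36, 36)
```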

Tips:

1. To reduce calculation time, reduce the size of the element. Inlines are generally faster than crosslines, which are faster than time-slices.

2. When you add the element to the tree, add a second attribute to the element. Put the seismic in the first container and use the second container for the attribute evaluation. You can now switch element 2 (the top one) on/off to compare the attribute response with the seismic display.

For more info, see this Tutorial video:

Evaluate attributes (flash video)


2.2. Dip-steering - Background vs Detailed

Purpose: Calculate multi-trace attributes on 2D or 3D seismic by extracting data along seismic events, filter along seismic events, auto-track chrono-stratigraphic horizons (SSIS plugin), and compute attributes based on local dip-azimuth information only.

Theory: If we know, at each position in a 3D volume, the direction (dip-azimuth) of the seismic events, we can use this information to create a local horizon at each position in the cube simply by following the dip and azimuth information from the center outwards. We call this process dip-steering. It requires a pre-calculated steering cube and is used to compute better (dip-steered) attributes and to filter the data. In addition, the steering cube can be used to create various attributes that are purely based on dip-azimuth information, for example curvature attributes, but also the dip in the inline and crossline directions. Lastly, the dip-steering process is used to auto-track chrono-stratigraphic horizons in the SSIS plugin. We often use two versions of the steering cube: detailed and background. The detailed steering cube is computed with the default settings and is primarily used for attribute calculations. The background steering cube is a heavily smoothed (e.g. median filtered 4x4x4) version of the detailed steering cube; its main use is in dip-steered filtering of seismic data.

Software: OpendTect + Dip-Steering (+ SSIS)

Workflow:

1. Create a Detailed Steering Cube (Processing - Steering - Create, e.g. FFT 3x3x3).

2. Create a Background Steering Cube (Processing - Steering - Filter, e.g. 4x4x4).

3. Open the Attribute Set Window and add new attributes from the Steering Cube (Dip, Curvature), or that use Steering - Full option where applicable. Examples: Similarity, Volume Statistics (median filter).

4. Apply the attributes to the seismic data in batch: Processing - Create seismic output, or on-the-fly: right-click on the element in the tree (e.g. part of a time-slice).
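The relation between a detailed and a background steering cube can be illustrated with a 1-D stand-in. This is a hedged sketch: the real filter is a 3-D median (e.g. 4x4x4) over dip values, and the dip series below is hypothetical:

```python
from statistics import median

def smooth_dips(dips, half_width=2):
    """Running-median smoothing of a dip series: noisy, detailed dip
    estimates become a stable background dip field."""
    out = []
    for i in range(len(dips)):
        lo, hi = max(0, i - half_width), min(len(dips), i + half_width + 1)
        out.append(median(dips[lo:hi]))
    return out

detailed = [2.0, 2.1, 9.0, 2.0, 1.9, 2.1]   # one spurious dip spike
background = smooth_dips(detailed)
print(background)                            # the spike is suppressed
```

A median (rather than a mean) is used because it discards outlier dip estimates instead of averaging them into the background field.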

Tips:

1. To QC the Steering Cubes: create a Crossline dip attribute (Dip attribute) from the Detailed Steering Cube and from the Background Steering Cube and display both on an inline.

See also chapter 3.2 - dip-steered median filter


2.3. Spectral Decomposition

Purpose: Analyze seismic data at different vertical scales. Typically applied to horizons to visualize thickness variations at sub-seismic resolution.

Theory: When the thickness of an acoustic layer falls in the tuning range, the seismic amplitude increases due to interference of the top and bottom reflections. If you decompose the seismic response into its frequency components (or into wavelet scales), the largest amplitude corresponds to the layer's tuning range, hence to its thickness. In this way, it is possible to interpret thickness variations at sub-seismic resolution by decomposing an interval around a mapped horizon.

Software: OpendTect

Workflow:

1. Add an element (horizon) to the tree.

2. Open the Attribute Set window and add Spectral Decomposition to the list.

3. Press the Evaluate Attribute icon to calculate the individual components (frequencies or wavelet scales).

4. Use the slider to movie-style inspect the components. Pressing Accept updates the Attribute definition to calculate this component only.

5. To use individual components for further work (e.g. to create a cube for frequency 10Hz) you must define each component as a separate attribute in the list.
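The principle of decomposing a trace window into frequency components can be sketched with a naive DFT. This is illustrative only, not OpendTect's implementation (the Spectral Decomposition attribute uses FFT or wavelet transforms); the window and sample rate are hypothetical:

```python
import cmath, math

def amplitude_spectrum(window, dt):
    """Naive DFT of a short trace window; returns (frequency in Hz,
    amplitude) pairs up to Nyquist. The peak amplitude points to the
    dominant (tuning-related) frequency."""
    n = len(window)
    spec = []
    for k in range(n // 2 + 1):
        coeff = sum(window[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        spec.append((k / (n * dt), abs(coeff)))
    return spec

# Hypothetical 10-sample window at 4 ms, dominated by a 25 Hz cosine.
dt = 0.004
window = [math.cos(2 * math.pi * 25 * t * dt) for t in range(10)]
spec = amplitude_spectrum(window, dt)
peak_freq = max(spec, key=lambda fa: fa[1])[0]
print(round(peak_freq))  # 25
```

The peak lands on the 25 Hz bin because the synthetic layer response oscillates at exactly that frequency; for real data the peak shifts with layer thickness.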

Tips:

1. Spectral Decomposition applied to a horizon takes time. Use the "Store slices on Accept" option to save all components as Surface data belonging to that horizon for later use. To scroll through Surface data, use the Page Up and Page Down keys.

2. Color-blended display: RGBA blending is used to create a normalized color-blended display that often shows features with greater clarity and enhances detail in map view. The user can blend the iso-frequency responses from Spectral Decomposition. For more information go to RGB display.
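The normalization behind such a color blend can be sketched as follows (hedged: illustrative data and names; OpendTect handles this internally when you drop attributes into the RGBA layers):

```python
def normalize(values):
    """Rescale one iso-frequency response to 0-1 so it can drive a
    single colour channel of an RGB blend."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Hypothetical iso-frequency responses at three map positions.
resp_20hz = [0.1, 0.8, 0.3]
resp_40hz = [0.5, 0.2, 0.9]
resp_60hz = [0.2, 0.1, 0.4]
rgb = list(zip(normalize(resp_20hz), normalize(resp_40hz), normalize(resp_60hz)))
print(rgb[1])  # (R, G, B) at position 1: red dominates (strong 20 Hz response)
```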


Chapter 3. Filters

Table of Contents
3.1. Dip-steered diffusion
3.2. Dip-steered median filter
3.3. Fault enhancement filter
3.4. Ridge enhancement filter
3.5. Spectral Blueing

Filters in OpendTect are implemented as Attributes that need to be defined in the attribute set window. Filters with a user interface are grouped under Filters. This group includes convolution, frequency filters, gap decon, and velocity fan filters. Filters without a user interface are filters that are constructed from (chains of) attributes. For example, using Reference Shift, Volume Statistics, and Mathematics, you can extract and manipulate data to construct your own filters. This group of filters contains, among others: dip-steered median filter, dip-steered diffusion filter, fault enhancement filter, and ridge-enhancement filter. A number of these filters are delivered with the system as default attribute sets.

3.1. Dip-steered diffusion

Purpose: Sharpen edges (faults) by filtering along the structural dip.

Theory: In diffusion filtering, you evaluate the quality of the seismic data in a dip-steered circle. The center amplitude is replaced by the amplitude at the position where the quality is deemed best. In the vicinity of a fault, the effect is that good-quality seismic data is moved from either side of the fault towards the fault plane, hence the fault becomes sharper.

Software: OpendTect + Dip-Steering

Workflow:

1. Open the attribute set window and open the default set called: Dip-steered diffusion filter. Select seismic and steering cube.


2. Use Evaluate attribute and evaluate the "Stepout" (radius) of the Position attribute on a (small) test element, e.g. part of an inline.

3. Apply the Diffusion filter to the seismic data in batch: Processing - Create seismic output, or on-the-fly: right-click on the element in the tree.
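The replacement rule described under Theory can be sketched in 1-D (hedged: illustrative names and data; real diffusion filtering works in a dip-steered circle with a similarity-based quality measure):

```python
def diffusion_step(amps, quality):
    """Each output sample takes the amplitude of the neighbour (here a
    3-sample window) with the highest data quality, so good data is
    pulled toward low-quality zones such as fault planes."""
    out = []
    for i in range(len(amps)):
        lo, hi = max(0, i - 1), min(len(amps), i + 2)
        best = max(range(lo, hi), key=lambda j: quality[j])
        out.append(amps[best])
    return out

amps    = [1.0, 1.0, 0.2, -1.0, -1.0]   # a fault-like break at index 2
quality = [0.9, 0.9, 0.1,  0.9,  0.9]   # low similarity at the break
print(diffusion_step(amps, quality))    # the break sample is replaced
```

The low-quality sample at the break is overwritten by a good sample from one side, so the amplitude contrast across the break stays sharp instead of being averaged away.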

Tips:

1. Dip-steered filtering works best when you use a heavily smoothed steering cube (background steering). Smoothing of the steering cube is done in Processing - Steering - Filter. Use e.g. a median filter 4x4x4 to create the background steering cube.

2. Calculate Similarity (or Coherency) on dip-steered diffusion filtered seismic data if you need to see sharp faults.

3. Dip-steered diffusion filtering produces unwanted circular patterns in amplitude horizon slices. To reduce this effect, combine dip-steered diffusion filtering with dip-steered median filtering. This is explained in the Fault enhancement filtering work flow.


3.2. Dip-steered median filter

Purpose: Remove random noise and enhance laterally continuous seismic events by filtering along the structural dip.

Theory: In median filtering, the center amplitude in a dip-steered circle is replaced by the median amplitude within the extraction circle. The effect is an edge-preserving smoothing of the seismic data.

Software: OpendTect + Dip-Steering

Workflow:

1. Open the attribute set window and open the default set called: Dip-steered median filter. Select seismic and steering cube.

2. Use Evaluate attribute and evaluate the "Stepout" (radius) of the Volume Statistics attribute on a (small) test element, e.g. part of an inline.

3. Apply the Median filter to the seismic data in batch: Processing - Create seismic output, or on-the-fly: right-click on the element in the tree.

Tips:

1. Dip-steered filtering works best when you use a heavily smoothed steering cube (background steering). Smoothing of the steering cube is done in Processing - Steering - Filter. Use e.g. a median filter 4x4x4 to create the background steering cube.

2. Dip-steered median filtering does not create faults as sharp as a diffusion filter but the smoothing in unfaulted areas is better. To get the best of both worlds you can combine dip-steered median filtering with dip-steered diffusion filtering. This is explained in the Fault enhancement filtering work flow.
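
A minimal sketch of the median idea in Python (a toy stand-in assuming a flat dip; the actual filter collects samples along the dip-steered circle):

```python
import numpy as np

def median_filter(traces, stepout=1):
    """Toy dip-steered median filter: each output sample is the median
    over the neighboring traces within +/- stepout positions. Spikes are
    rejected while laterally continuous events survive."""
    n_pos = traces.shape[0]
    out = np.empty_like(traces, dtype=float)
    for i in range(n_pos):
        lo, hi = max(0, i - stepout), min(n_pos, i + stepout + 1)
        out[i] = np.median(traces[lo:hi], axis=0)  # edge-preserving smoothing
    return out
```

The stepout here corresponds to the Volume Statistics stepout (radius) evaluated in step 2 of the workflow.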

For more info, see this Tutorial video:

Dip steered median filter (flash video)

3.3. Fault enhancement filter

Purpose: Sharpen edges (faults) by median or diffusion filtering along the structural dip. Theory: In Fault enhancement filtering, you evaluate the quality of the seismic data in a dip-steered circle. If the quality is good (Similarity is high), you apply a dip-steered median filter. If the quality is low (near faults), you apply a dip-steered diffusion filter. The effect is smoothed seismic with sharp fault breaks. Software: OpendTect + Dip-Steering Workflow:

1. Open the attribute set window and open the default set called: Fault enhancement filter. Select seismic and steering cube.

2. Use Evaluate attribute and evaluate the "Stepout" (radius) of the median filter defined in the Volume statistics attribute on a (small) test element, e.g. part of an inline.

3. Use Evaluate attribute and evaluate the "Stepout" (radius) of the diffusion filter defined in the Position attribute on the (small) test element.

4. Test different cut-off values c0 (range 0 - 1) in the Mathematics attribute by applying the Mathematics attribute on the (small) test element.

5. Apply the Fault enhancement filter to the seismic data in batch: Processing - Create seismic output, or on-the-fly: right-click on the element in the tree.

Tips:

1. Dip-steered filtering works best when you use a heavily smoothed steering cube (background steering). Smoothing of the steering cube is done in Processing - Steering - Filter. Use e.g. a median filter 4x4x4 to create the background steering cube.

2. Calculate Similarity (or Coherency) on dip-steered Fault enhancement filtered seismic data if you need to see sharp faults.
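
The cut-off logic tested in step 4 reduces to a one-line selection. A hedged sketch, where the input arrays are hypothetical precomputed outputs of the two filters and c0 is the cut-off from the Mathematics attribute:

```python
import numpy as np

def fault_enhancement(median_out, diffusion_out, similarity, c0=0.5):
    """Combine the two filters: where similarity >= c0 (good data, away
    from faults) keep the dip-steered median result; where it is lower
    (near faults) keep the dip-steered diffusion result."""
    return np.where(np.asarray(similarity) >= c0, median_out, diffusion_out)
```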

3.4. Ridge enhancement filter

Purpose: Sharpen ridges in a Similarity Cube. Theory: In the time-slice domain, the filter compares three neighboring similarity values in six different directions and outputs the largest ridge value. The ridge in each direction is: (sum of the values on either side) / 2 - center value. At most evaluation points there are no ridges and the values thus tend to be small, but when you cross a fault there will be a large ridge perpendicular to the fault direction. The filter outputs the largest value, i.e. the ridge corresponding to the perpendicular direction. Software: OpendTect + Dip-Steering Workflow:

1. Open the attribute set window and open the default set called: Ridge enhancement filter. Select seismic and steering cube.

2. Apply the Ridge enhancement filter to the seismic data in batch: Processing - Create seismic output, or on-the-fly: right-click on the element in the tree (e.g. part of a time-slice).

Tips:

1. In the default attribute set you calculate 9 similarity values at each output point. The process can be sped up (almost 9 times) by calculating a Similarity Cube first and extracting the similarity values from this pre-calculated cube. You must change the attribute set: use the "Reference Shift" attribute instead of Similarity.

2. Ridge-enhancement can be used to enhance any type of ridge cube. Note that ridges are either positive or negative and that you need to modify the ridge enhancement attribute (Mathematics max or min) accordingly.

3. In the default attribute set Dip-steering is used to create better (dip-steered) similarity. This is not a pre-requisite for ridge-enhancement filtering.
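
The ridge measure from the Theory section can be written out directly. The sketch below evaluates the four directions available on a 3x3 grid as a stand-in for the six directions used by the actual filter:

```python
import numpy as np

def ridge(sim, i, j):
    """Largest ridge value at (i, j) on a time slice of similarity values:
    for each direction, ridge = (sum of the two flanking values) / 2 - center;
    the output is the maximum over the directions."""
    center = sim[i, j]
    directions = [((0, 1), (0, -1)), ((1, 0), (-1, 0)),
                  ((1, 1), (-1, -1)), ((1, -1), (-1, 1))]
    return max((sim[i + a, j + b] + sim[i + c, j + d]) / 2.0 - center
               for (a, b), (c, d) in directions)
```

A low-similarity point flanked by high similarity (a fault trace) yields a large positive ridge; flat similarity yields values near zero.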

3.5. Spectral Blueing

Purpose: Increase the vertical resolution of the seismic data. Theory: A matching filter is designed that shapes the seismic amplitude spectrum to resemble the amplitude spectrum of measured logs. In the general case, amplitudes at low frequencies are reduced while amplitudes at higher frequencies are enhanced. Detailed information that is inherent in the seismic signal becomes visible, while noise is not increased to unacceptable levels. Software: OpendTect + Seismic Spectral Blueing Workflow:

1. Open the attribute set window, select attribute: Spectral Blueing, and select the input data.

2. Press Analyze and Create ... to open the Spectral Blueing application.

3. Select Input Seismic and Well data, select well logs (right-click on the well) and time windows for seismic and wells. To load, press Load seismic and Reload wells, respectively.

4. Select Design controls and play with the parameters (increase the smoothing operator, toggle range and try reducing the max. frequency, toggle Auto Calc. and change low-cut and high-cut). Notice how the curves change interactively. Choose parameters that yield a smooth spectrum over the seismic frequency band.

5. Apply the Spectral Blueing attribute to the seismic data in batch: Processing - Create seismic output, or on-the-fly: right-click on the element in the tree (e.g. part of an inline).

Tips:

1. Sometimes mono-frequency noise trails at the low and high ends of the spectrum are observed in the spectrally blued data. The frequencies of such noise trails correspond to the cut-off frequencies specified in the design window. To reduce these effects, try changing the cut-offs. Look at the spectrum of the operator (the blue curve): it should be smooth over the seismic bandwidth, without side lobes.

2. Use the Chart Controller and Zoom options (View menu) to see all graphs simultaneously.
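
The operator design can be illustrated with a small numpy sketch. The spectra here are hypothetical inputs; the real application derives them from the loaded seismic and well logs and applies the resulting zero-phase operator by convolution:

```python
import numpy as np

def blueing_operator(seismic_spec, target_spec, smooth=5):
    """Design a shaping-operator amplitude spectrum: the (smoothed) ratio
    of the desired well-log spectrum to the observed seismic spectrum.
    Smoothing keeps the operator free of narrow lobes that would ring."""
    ratio = np.asarray(target_spec, float) / np.maximum(seismic_spec, 1e-12)
    kernel = np.ones(smooth) / smooth            # simple moving average
    return np.convolve(ratio, kernel, mode="same")
```

This mirrors the design-window controls: the smoothing operator length corresponds to `smooth`, and the cut-offs limit the band over which the ratio is trusted.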

Chapter 4. Inversion and Rock Property Prediction

Table of Contents 4.1. Colored Inversion 4.2. MPSI Stochastic Inversion 4.3. Neural Network Rock Property Prediction

OpendTect offers various plugins for quantitative seismic interpretation. Seismic Colored Inversion (by Ark cls) is a fast way to convert seismic data to band-limited acoustic impedance. Full-bandwidth Deterministic and Stochastic inversion is offered in plugins with the same names by Earthworks and Ark cls. Using Neural Networks, it is possible to convert seismic information (e.g. acoustic and/or elastic impedance) to rock properties (e.g. Porosity, Vshale etc). The supervised network is trained along well tracks to find the (non-linear) relationship between seismic and well logs.

4.1. Colored Inversion

Purpose: A fast-track approach to invert seismic data to band-limited (relative) acoustic impedance. Theory: A single convolution inversion operator is derived that optimally inverts the data and honours available well data in a global sense. In this way, the process is intrinsically stable and broadly consistent with known AI behaviour in the area. Construction of the operator is a simple process and implementation can be readily performed within the processing module included in SCI. As an explicit wavelet is not required, other than testing for a residual constant phase rotation as the last step, this removes an inherently weak link that more sophisticated processes rely on. Software: OpendTect + Seismic Colored Inversion Workflow:

1. Open the attribute set window, add a new Attribute: Colored Inversion, select input seismic and specify a wavelet name.

2. Press Analyze and Create ... to open the Colored Inversion application.

3. Select Input Seismic and Well data, select well logs (right-click on the well) and time windows for seismic and wells. To load, press Load seismic and Reload wells, respectively.

4. Select Design controls and play with the parameters (increase the smoothing operator, toggle range and try reducing the max. frequency, toggle Auto Calc. and change low-cut and high-cut). Notice how the curves change interactively. Choose parameters that yield a smooth spectrum over the seismic frequency band.

5. Apply the Colored Inversion attribute to the seismic data in batch: Processing - Create seismic output, or on-the-fly: right-click on the element in the tree (e.g. part of an inline).

Tips:

1. Use the Chart Controller and Zoom options (View menu) to see all graphs simultaneously.

4.2. MPSI Stochastic Inversion

Purpose: Invert seismic data to full-bandwidth acoustic impedance either in a deterministic mode (the optimal AI volume is produced) or in a stochastic mode (N realizations are computed), so that uncertainties in the inversion process can be analyzed and probability output volumes can be generated. Theory: Stochastic inversion is complementary to deterministic inversion. Stochastic inversion is used to understand the uncertainty in seismic inversion, via the stochastic inversion utilities, and allows the user to explore the impact of this geophysical uncertainty on the lithology, porosity or reservoir volumes over the 3D seismic volume inverted. For thin intervals, a stochastic inversion is particularly appropriate for understanding uncertainty in reservoir volume and connectivity. It also allows the user to improve the estimation obtained from a deterministic inversion, as the mean of many realizations (calculated using the utilities) will behave like a global convergence solution from computationally expensive algorithms. Software: OpendTect + MPSI (MPSI consists of five modules that are implemented as attributes in OpendTect's attribute engine: Model Building, Error Grid, and Deterministic Inversion are released as a bundle to perform deterministic inversion; Stochastic Inversion and Utilities are add-ons needed for the stochastic inversion). Workflow:

1. Open the attribute set window and select "EW3DModelBuilder" attribute. Define the model by selecting the wells, zones (horizons) and the smoothing parameter.

2. Select "EW2DErrorGrid" attribute and specify a variogram that captures the uncertainty in the model as a function of distance from the wells.

3. Select "EWDeterministicInversion" and specify the 3D model, the error grid, the seismic data, the wavelet and some parameters to perform the deterministic inversion. Press "Perform Preprocessing" to compute the required pre-processing step. Test in the usual way (see tips below) and if satisfied invert the entire volume by applying the attribute in batch mode under "Processing - Create Seismic output".

4. Select "EWStochasticInversion" and specify the 3D model, the error grid, the deterministic inversion result, the seismic data and some parameters to compute N stochastic realizations. Typically the number N is 100. Press "Perform pre-processing" or create a parameter file that can be executed in batch mode to perform the required pre-processing. Stochastic realizations can be inspected on the current element (say an inline) with the "Evaluate attributes" option. Press this icon and select the realizations that you wish to inspect in a movie-style manner using the slider in the Evaluate attributes window.

5. Select "EWUtilities" to perform post-processing on the N stochastic realizations. You can choose to create a Mean volume (which will be more or less similar to the volume created in the deterministic mode), a standard deviation volume (which gives you an estimate of the uncertainties in the inversion process), probability cubes and trends and joined probability cubes and trends. Probability cubes return the probability of finding a certain range of acoustic impedance values, e.g. corresponding to the sand distribution.

Tips:

1. The implementation as attributes allows on-the-fly calculation and inspection of each step in the workflow. For example use the "Redisplay" icon to inspect the 3D model, the error grid, the deterministic inversion result, or any of the post-processing results from the stochastic realizations on the current element (say an inline) in the scene.

2. The stochastic realizations are the only output that cannot be inspected with the "Redisplay" icon. As explained above, use the "Evaluate attributes" icon to inspect the different realizations movie-style.

3. The version of MPSI in OpendTect v3.2 does not support wavelet estimation functionality. Use "Manage wavelets" to either import a wavelet, or to create a synthetic (Ricker, Sinc) one.
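
The post-processing of step 5 is simple arithmetic over the N realizations. A sketch with a hypothetical array layout (realizations stacked along the first axis):

```python
import numpy as np

def mpsi_utilities(realizations, ai_min, ai_max):
    """From N stochastic AI realizations compute the mean volume, the
    standard-deviation volume (an estimate of inversion uncertainty), and
    a probability cube for AI inside [ai_min, ai_max] (e.g. the sand range)."""
    r = np.asarray(realizations, float)
    mean = r.mean(axis=0)                         # ~ deterministic result
    std = r.std(axis=0)                           # uncertainty estimate
    prob = ((r >= ai_min) & (r <= ai_max)).mean(axis=0)
    return mean, std, prob
```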

For more info, see this Tutorial video:

MPSI stochastic inversion (flash video)

4.2.1. Variogram Analysis

Purpose: Analyze the data prior to the MPSI deterministic and stochastic inversion. Provides the necessary "variogram ranges" and "variogram model" parameters. Theory: Variogram models are used to model the spatial correlation between rock properties when performing a stochastic impedance inversion, or when gridding rock properties to a 2D/3D space. The model is defined by a measure of the cross-correlation of a given rock property (impedance) as a function of the distance, vertical or lateral, between the data points. The variogram analysis tool computes a semi-variogram graph from the actual data and allows you to fit a model through it. The model parameters can be input into the MPSI attributes EW3DModelBuilder and EW2DErrorGrid. Software: OpendTect. Workflow - Vertical variogram analysis:

1. Open the "Well logs Cross-plot" window.

2. Select one or more wells and one or more logs.

3. Adjust the extraction interval using "Extract between" and "Distance above/below".

4. Make sure the "Radius around wells" parameter is set to 0.

5. Make sure the "Log resampling method" parameter is set to "Nearest sample".

6. Launch the data extraction.

7. Fit the variogram model (green) to your data (blue) by moving the sill and range sliders or by typing values.

8. Find one single optimal range for all wells, even if their sill (total variance) varies a lot.

Workflow - Horizontal variogram analysis:

1. Load saved horizon data on a horizon. Ideally you would like to use relative impedance data, such as an average map from a colored inversion volume (can be computed from Processing --> Create Horizon output --> Stratal amplitude). Alternatively you may use energy or spectral decomposition attributes.

2. Right-click on the horizon data --> Tools --> Variogram.

3. You may want to increase the maximum number of pairs for the final run to a larger number (by powers of 10) to increase the output quality. Press OK.

4. Fit the variogram model (green) to your data (blue) by moving the sill and range sliders or by typing values.

5. Do not try to do a global fit; instead concentrate on the fit for the small lags. Also do not try to force the sill to be at 1 (the total variance of the attribute): the horizontal variogram on measured data always keeps increasing.

6. Check whether the variogram model you deduced fits equally well in the three directions defined by the grid. You may want to keep the same sill and model, but change the range.
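
What the tool computes under the hood is the experimental semi-variogram. A brute-force sketch for the 1D (vertical) case, with hypothetical sample values and positions:

```python
import numpy as np

def semivariogram(z, pos, lags, tol=0.5):
    """Experimental semi-variogram: for each lag h, average
    0.5 * (z_i - z_j)^2 over all pairs whose separation is within tol of h.
    In the fitted model, the sill is the plateau of this curve and the
    range is the distance at which the plateau is reached."""
    z, pos = np.asarray(z, float), np.asarray(pos, float)
    gamma = []
    for h in lags:
        pairs = [0.5 * (z[i] - z[j]) ** 2
                 for i in range(len(z)) for j in range(i + 1, len(z))
                 if abs(abs(pos[i] - pos[j]) - h) <= tol]
        gamma.append(float(np.mean(pairs)) if pairs else np.nan)
    return np.array(gamma)
```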

4.3. Neural Network Rock Property Prediction

Purpose: Predict rock properties from seismic information and well data. Theory: A supervised neural network is trained on examples (seismic + logs) extracted along a well track. Typically, input is (absolute) acoustic impedance and/or elastic impedance. Typical outputs are porosity logs, lithology logs (Vshale, gamma ray, lithology codes), and saturation logs. Software: OpendTect + Neural Networks Workflow:

1. Open the attribute set window and specify the seismic attributes that you wish to extract along the well track (see Tips below).

2. Open the Neural Networks window and Select Property Prediction. In the new window specify the input attributes, the target well log, the wells, the well interval, the Location selection (see Tips below), the log type (values or binaries) and the percentage to set aside for testing the network during training.

3. Crossplot the target values against each of the input attributes. If need be, remove or edit points.

4. Balance the data. This step ensures that each bin in the training set has the same number of examples, which improves training.

5. Train the neural network. Stop where the test set has reached minimum error (beyond that point overfitting occurs: the network learns to recognize individual examples from the training set but loses general prediction capabilities). Store the trained neural network.

6. Apply the trained neural network to the seismic data in batch: Processing - Create seismic output, or on-the-fly: right-click on the element in the tree (e.g. part of an inline).

Tips:

1. To compensate for mis-alignment problems, you can extract a small time-window from the input (e.g. the acoustic impedance volume). To do this, use the Reference Shift attribute and construct an attribute set that extracts at each point e.g. the AI values at -8, -4, 0, 4, 8 ms.

2. Use the "All corners" option for the Location selection parameter to compensate for uncertainties in the well track and to increase the statistics. Each example is then extracted along 4 well tracks that run along the corner grid points of the actual well track.

Chapter 5. Object detection

Table of Contents 5.1. Common Contour Binning 5.2. Chimney Cube 5.3. Fault Cube 5.4. Fingerprint 5.5. UVQ waveform segmentation

Seismic attributes are used to visualize data such that relevant information becomes easier to interpret. However, calculating many attributes leads to a data explosion and confusion: which view is best, and how can I combine interesting attributes into one output representing the optimal view? We have introduced the term meta-attribute for attributes that are combined in an intelligent way. In OpendTect you can create meta-attributes using math and logic (Mathematics attribute in OpendTect), neural networks (commercial plugin), and the fingerprint attribute (OpendTect).

5.1. Common Contour Binning

This plugin works only on 3D surveys. Purpose: To detect subtle hydrocarbon-related seismic anomalies and to pin-point Gas-Water, Gas-Oil and Oil-Water contacts. Theory: In a structure filled with hydrocarbons, seismic traces that lie on the same (depth) contour line will have similar hydrocarbon effects because these positions sample the same column lengths. Stacking of traces along contour lines will therefore stack up hydrocarbon effects while stratigraphic effects and noise are canceled. CCB (Common Contour Binning) produces two outputs: a CCB volume that consists of traces stacked along contour lines that are re-distributed along the same contour lines, and a CCB stack. The latter is a 2D section with stacked traces flattened along the mapped reference horizon. The ideas behind CCB originate with Jan Gabe van der Weide and Andries Wever of Wintershall, who are the IP owners. Software: OpendTect + Common Contour Binning (CCB) plugin

Workflow:

1. Create a new polygon over the prospect by right clicking on Pickset. Close the Polygon and Save it (right-click). In general: restrict the polygon to one fault block per analysis.

2. Launch the CCB plugin by clicking on the CCB icon or selecting it from the Processing menu.

3. Specify the horizon, the seismic and the volume sub-selection (the polygon). The horizon inside the polygon area determines the Contour Z division (you probably don't want to change this). The step determines the contour bin-size (all traces within this contour interval are stacked). Optionally change the Z range (this is the vertical slice around the horizon that will be stacked). Press Go.

4. The traces inside the polygon are retrieved and sorted. A histogram is shown. For QC purposes you can display the traces that will be stacked at each contour bin (single Z option). To produce the CCB stack (Normal or RMS), press Go. To produce the CCB volume, toggle "write output" on before pressing Go.

5. The CCB volume can now be used for further analysis. For example: display the amplitude at the horizon (add attribute and select CCB volume from stored data), or create new attributes from the CCB volume (energy, max, min amplitude) and display these on the horizon.

Tips:

1. To determine the spill point you can add a Timeslice element (preferably depth) and move this down over the displayed horizon map (with CCB amplitude display) until you see the contour line that determines the spill point. A spill-point coinciding with a step-change in amplitudes can be explained by a contact and supports the hydrocarbon fill hypothesis.

2. To avoid stacking in traces of bad quality and to ensure that you are not stacking over multiple fault blocks (which may have different fluid-fills and/or contacts) display the similarity attribute on the horizon. Use this display to guide the polygon picking.
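
The binning-and-stacking core of CCB is compact. A sketch with hypothetical arrays: traces already flattened on the mapped horizon, one horizon depth value per trace:

```python
import numpy as np

def ccb_stack(traces, depths, contour_step):
    """Stack traces that share a depth-contour bin: positions on the same
    contour sample the same hydrocarbon column height, so stacking adds
    up HC effects while stratigraphy and noise average out."""
    traces = np.asarray(traces, float)
    bins = np.floor(np.asarray(depths, float) / contour_step).astype(int)
    # one stacked (mean) trace per occupied contour bin
    return {int(b): traces[bins == b].mean(axis=0) for b in np.unique(bins)}
```

The `contour_step` argument plays the role of the contour bin-size (step) set in the CCB window.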

For more info, see this Tutorial video:

Common Contour Binning (flash video)

5.2. Chimney Cube

Purpose: Create a Chimney "probability" Cube for fluid migration path interpretation. Theory: When fluids (oil, gas, brine) move up through strata, rocks are cracked, chemically altered, and connate gas stays behind causing changes in the acoustic properties of the rocks. The actual path often remains visible in post-stack seismic data as subtle vertical noise trails. A Chimney Cube is a new seismic volume that highlights such vertical disturbances so that these can be interpreted as fluid migration paths. This requires studying the spatial relationships between the paths (chimneys) and other elements of the petroleum system: faults, traps, HC anomalies, (paleo) mud-volcanos, pockmarks, etc. The Chimney Cube is created by training a neural network on two sets of attributes extracted at example locations picked by a human interpreter: one set representing the chimney class (vertically disturbed noise trails) and the other representing the non-chimneys class (i.e. normal seismic response). Software: OpendTect + Dip-Steering + Neural Networks Workflow:

1. Scan your data set for obvious chimneys. For example calculate Similarity attributes at different time-slices, look for circular features in these slices and position seismic lines through these circles. Chimneys are often associated with faults, high-amplitude anomalies, and seepage-related features such as pockmarks and mud-volcanos (look at the seabed reflection for mounds).

2. Create two New Picksets: one for Chimneys and one for Non-chimneys (right-mouse click in the tree).

3. Pick examples for chimneys and non-chimneys (Select the Pickset in the tree and then left-click in the scene on the element at the position you want to add; Control-click to remove a pick). Try to pick a representative set for both chimneys and non-chimneys. This means: pick different chimneys, pick as many points as possible (several hundred picks for each is typical); for non-chimneys pick both low- and high energy zones, also pick (non-leaking) faults and other noisy zones that are not vertically disturbed.

4. Open the attribute set window and open the default set called: NN Chimney Cube. Select seismic and steering cube and Save the attribute set.

5. Open the Neural Networks window. Select Pattern recognition (Picksets). Select Supervised, the input attributes, the Picksets (Chimneys and Non-chimneys) and the Percentage to set aside for testing (e.g. 30%).

6. Train the neural network. Stop where the test set has reached minimum error (beyond that point overfitting occurs: the network learns to recognize individual examples from the training set but looses general prediction capabillities). Store the trained neural network.

7. Apply the trained neural network to the seismic data in batch: Processing - Create seismic output, or on-the-fly: right-click on the element in the tree (e.g. part of an inline). You can choose between 4 outputs: choose Chimneys to create the Chimney "probability" Cube.

Tips:

1. The default attribute set can be tuned to your data set by changing parameters, and adding or removing attributes.

2. The colors in the neural network indicate the relative weight attached to each attribute (ranging from white to red). White nodes indicate low weights meaning the attributes are not contributing much and can be removed to speed up processing time.

3. Display the Mis-classified points (Pickset tree) to evaluate why these are mis-classified. If you agree with the network you may want to remove some of these points from the input sets and retrain the network. This will improve the classification results, but the process is dangerous as you are working towards a preconceived solution.

For more info, see this Tutorial video:

Chimney Cube (flash video)

5.3. Fault Cube

Purpose: Create a Fault "probability" Cube for fault interpretation. A Fault cube is typically used to visualize the larger scale faults. Detailed faults are better visualized by Similarity on Fault enhancement filtered seismic data. Theory: Several attributes in OpendTect can be used as fault indicators e.g. Similarity, Curvature, and Energy. A Fault Cube is a new seismic volume that highlights faults by combining the information from several fault indicators into a fault "probability". This is done by training a neural network on two sets of attributes extracted at example locations picked by the human interpreter: one set representing the fault class and the other representing the non-fault class (i.e. normal seismic response). Software: OpendTect + Dip-Steering + Neural Networks Workflow:

1. Scan your data set for obvious faults.

2. Create two New Picksets: one for Faults and one for Non-faults (right-mouse click in the tree).

3. Pick examples for faults and non-faults (Select the Pickset in the tree and then left-click in the scene on the element at the position you want to add; Control-click to remove a pick). Try to pick a representative set for both faults and non-faults. This means: pick different faults, pick as many points as possible (several hundred picks for each is typical); for non-faults pick both low- and high energy zones, also pick noisy zones that are not faulted.

4. Open the attribute set window and open the default set called: NN Fault Cube. Select seismic and steering cube and Save the attribute set.

5. Open the Neural Networks window. Select Pattern recognition (Picksets). Select Supervised, the input attributes, the Picksets (Faults and Non-faults) and the Percentage to set aside for testing (e.g. 30%).

6. Train the neural network. Stop where the test set has reached minimum error (beyond that point overfitting occurs: the network learns to recognize individual examples from the training set but loses general prediction capabilities). Store the trained neural network.

7. Apply the trained neural network to the seismic data in batch: Processing - Create seismic output, or on-the-fly: right-click on the element in the tree (e.g. part of an inline). You can choose between 4 outputs (Faults_yes, Faults_no, Classification, Confidence): choose Faults_yes to create the Fault "probability" Cube.

Tips:

1. The default attribute set can be tuned to your data set by changing parameters, and adding or removing attributes.

2. The colors in the neural network indicate the relative weight attached to each attribute (ranging from white via yellow to red). White nodes indicate low weights meaning the attributes are not contributing much and can be removed to speed up processing time.

3. Display the Mis-classified points (Pickset tree) to evaluate why these are mis-classified. If you agree with the network you may want to remove some of these points from the input sets and retrain the network. This will improve the classification results, but the process is dangerous as you are working towards a preconceived solution.

5.4. Fingerprint

Purpose: Create a Fingerprint "probability" Cube, i.e. a cube that shows how similar each position is to the position(s) where you created the fingerprint. Theory: The fingerprint attribute has the same objective as the neural network object detection method (e.g. Chimney Cube, Fault Cube): to detect seismic responses similar to the target response (e.g. HC bearing). The advantage of the fingerprint is that you only need to give examples of the object class itself (one point is sufficient). You don't have to pick counter-examples (non-objects) as is the case in the neural network workflow. A fingerprint is created from selected attributes at the given input location(s). The output is a "probability" cube with values ranging from 0 (completely dissimilar) to 1 (identical to the fingerprint response). Software: OpendTect Workflow:

1. Create a new Pickset, e.g. to capture the response at a hydrocarbon anomaly.

2. Pick one or more examples of the object under study.

3. Open the attribute set window and create a new attribute set with attributes on which your fingerprint should be based. Use Evaluate attributes to select attributes that show the object most clearly. To create a fingerprint for hydrocarbons investigate: energy, frequencies, AVO attributes etc.

4. Add the Fingerprint attribute, select the Pickset file, add the attributes that were defined above, and Calculate the parameters (this means the attributes at the picked locations are extracted and averaged to calculate the fingerprint).

5. Apply the Fingerprint attribute to the seismic data in batch: Processing - Create seismic output, or on-the-fly: right-click on the element in the tree (e.g. part of an inline).

Tips:

1. The Fingerprint attribute assigns equal weights to each input attribute (this is where the fingerprint falls short of the (non-linear) neural network seismic object detection technique). Therefore, try to limit the number of input attributes and do not add attributes that carry virtually the same information.
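
A toy version of the fingerprint scoring: the averaging over picked locations follows the workflow above, but the exact similarity measure used by the attribute is not documented here, so an inverse-distance score stands in for it:

```python
import numpy as np

def fingerprint_score(attributes, pick_idx):
    """Average the attribute vectors at the picked locations into one
    fingerprint, then score every position by closeness to it:
    1 at the fingerprint itself, falling toward 0 with distance."""
    a = np.asarray(attributes, float)        # positions x n_attributes
    fp = a[pick_idx].mean(axis=0)            # the averaged fingerprint
    return 1.0 / (1.0 + np.linalg.norm(a - fp, axis=1))
```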

5.5. UVQ waveform segmentation

Purpose: Visualize patterns pertaining to a seismic window that is cut out along a horizon (horizon slice). Theory: In UVQ waveform segmentation, an unsupervised vector quantizer type of neural network clusters seismic trace segments (= waveforms) into a user-specified number of clusters (typically between 4 and 10). Two output grids are generated: the segment grid reveals patterns pertaining to the studied interval, and the match grid shows the confidence in the clustering result, ranging from 0 (no confidence) to 1 (waveform is identical to the winning cluster center). A third useful output is a display of the actual cluster centers. As of v3.2, UVQ waveform segmentation can be done in two ways: a fast-track approach called "Quick UVQ" and the conventional way, which is a two-step approach. Step 1: the user-defined number of cluster centers are found by training the network on a representative sub-set of the data (typically 1000 trace segments extracted at random locations along the horizon). Step 2: the trained network is applied to the horizon. Each trace segment is compared to the cluster centers to generate two outputs: segment (index of the winning cluster) and match (how close the waveform is to the winning cluster center). Software: OpendTect + Neural Networks Workflow - Quick UVQ:

1. Load a horizon and select "Quick UVQ" from the horizon menu (right-click).

2. Specify the seismic data set, the number of classes and the time-gate and press OK.

3. The software selects waveforms at random positions and starts training the UVQ network. The Average match (%) should reach approx. 90. If it does not, reduce the number of clusters and/or shorten the time window. Press OK to continue.

4. The trained UVQ network is automatically applied to the horizon. Class (=segment) and match grids are computed and added as "attributes" to the horizon.

5. A new window with neural network info pops up. Press Display to display the class centers. Press Dismiss to close the window.

6. Class and Match are not automatically saved! To store these grids as "Surface data" with the horizon use the Save option in the horizon menu (right click).
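The two-step clustering described under Theory can be sketched with a simple k-means-style vector quantizer standing in for the UVQ network. This is illustrative only; the function name, the match formula, and the use of k-means are assumptions, not OpendTect code:

```python
import numpy as np

def uvq_segment(waveforms, n_classes=4, n_iter=20, seed=0):
    """Cluster trace segments into n_classes cluster centers and return,
    per waveform, the winning class index ("segment") and a 0-1 "match"
    score (1 = identical to the winning center). K-means stands in for
    the unsupervised vector quantizer; illustrative sketch only."""
    rng = np.random.default_rng(seed)
    # Step 1: train -- initialize centers from random waveforms, then iterate
    centers = waveforms[rng.choice(len(waveforms), n_classes, replace=False)]
    for _ in range(n_iter):
        d = np.linalg.norm(waveforms[:, None, :] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for k in range(n_classes):
            if np.any(labels == k):
                centers[k] = waveforms[labels == k].mean(axis=0)
    # Step 2: apply -- segment index and match score per waveform
    d = np.linalg.norm(waveforms[:, None, :] - centers[None], axis=2)
    segment = d.argmin(axis=1)
    match = 1.0 / (1.0 + d.min(axis=1))   # 1 = perfect match, decays with distance
    return segment, match, centers
```

In the real workflow, step 1 would run on the random subset of ~1000 waveforms and step 2 on every trace along the horizon.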

Workflow - Conventional:

1. Create a new PickSet and fill it with 1000 random locations along the mapped horizon.

2. Open the default attribute set called "Unsupervised Segmentation 2D".

3. Open the Neural Networks window and select Pattern Recognition. In the new window specify Unsupervised, the input attributes, the PickSet generated above, and the Number of classes. Note that the window (the input attributes) should be chosen such that it captures the seismic response of the geologic interval to study. To visualize patterns pertaining to a reservoir interval of approx. 30ms TWT thickness on a mapped top reservoir horizon, select a window length of approx. -12 to +42ms. This captures the interval plus the bulk of convolutional side effects on zero-phase data.

4. Train the UVQ network. The Average match (%) should reach approx. 90. If it does not, reduce the number of clusters and/or shorten the time window.

5. Store the Neural Network and display the cluster centers by pressing Info ... followed by Display ...

6. Apply the Neural Network Segment (or Match) to the horizon in batch: Processing - Create output using Horizon, or on-the-fly: right-click on the horizon in the tree.

Tips:

1. If you apply the network on-the-fly you probably want to save the result as Surface data with the horizon for later retrieval.
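The time-gate extraction that feeds the network (e.g. -12 to +42ms around the mapped horizon) amounts to cutting a fixed window out of each trace at the horizon's sample position. A hypothetical helper, for illustration only; OpendTect does this internally:

```python
import numpy as np

def extract_waveforms(cube, horizon_t, dt, gate=(-12.0, 42.0)):
    """Cut trace segments from seismic data along a mapped horizon.
    `cube` has shape (n_traces, n_samples), `horizon_t` is the horizon
    time (ms) per trace, `dt` is the sample rate in ms. Sketch only."""
    i0 = int(round(gate[0] / dt))         # gate start, in samples
    i1 = int(round(gate[1] / dt))         # gate end, in samples
    segs = []
    for trace, t in zip(cube, horizon_t):
        ih = int(round(t / dt))           # sample index of the horizon
        segs.append(trace[ih + i0 : ih + i1 + 1])
    return np.asarray(segs)
```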


Chapter 6. Sequence Stratigraphy

Table of Contents
6.1. Chrono-stratigraphy
6.2. Wheeler Transformation
6.3. Stratal Slicing
6.4. Systems Tracts Interpretation

Seismic sequence stratigraphic interpretation has two primary goals: (1) unraveling the depositional environment and (2) predicting potential stratigraphic traps. The OpendTect SSIS plugin offers unique interpretation capabilities in this domain. In SSIS all possible horizons are tracked (data-driven mode) or modeled (interpolated between mapped horizons, or shifted parallel to the upper or lower bounding surface). Each horizon is a chrono-stratigraphic event that can be used to reconstruct the depositional history (chrono-strat slider), to flatten seismic data and attributes (Wheeler transform), and to interpret systems tracts (relating units to the relative sea level curve).

6.1. Chrono-stratigraphy

Purpose: Track or model all possible horizons within a given interval.

Theory: Map the major bounding surfaces (horizons) in a conventional way (the minimum is two horizons: top and bottom). Specify per interval how SSIS should create chrono-stratigraphic horizons (data-driven or model-driven). In data-driven mode the seismic events are followed. This mode requires a SteeringCube (dip-azimuth information at every sample position, computed with the Dip-Steering plugin). In model-driven mode, you can choose to interpolate between horizons (a.k.a. stratal slicing, or proportional slicing), to shift parallel to the upper horizon (emulating onlap situations), or to shift parallel to the lower horizon (emulating unconformable settings). All modes work for 2D and 3D seismic data. In practice, the data-driven mode is used for 2D (or on 2D sections from a 3D cube) while the model-driven mode is used for 3D seismic data.

Software: OpendTect + Dip-Steering + SSIS

Workflow:


1. Use Data Preparation to prepare the horizons: horizons cannot cross and they should be continuous.

2. For the data-driven mode ensure that you have a Steering Cube (Processing - Steering). Filter the steering cube if you observe that the tracked chrono-stratigraphic horizons do not follow the seismic events correctly (Data Preparation - Filter Steering Cube).

3. Create a New Chrono-stratigraphy. Read the horizons and specify per interval and per horizon whether the horizon is isochronous (parallel layering) or diachronous (on-/off-lapping settings). For data-driven mode (both horizons diachronous) select the steering cube, the minimum spacing (when to stop), and the maximum spacing (when to insert a new chrono-stratigraphic horizon).

4. Proceed to calculate the chrono-stratigraphy. When the batch processing is finished: Select the chrono-stratigraphy.

5. Display the chrono-stratigraphy (right-click on the element in the tree).

6. To study the depositional history use the chrono-strat slider (right-click on chrono-stratigraphy) and add / remove geologic time from the sequence.

7. Add a Wheeler scene and use the chrono-stratigraphy to flatten the seismic (or attribute).
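In data-driven mode, each chrono-stratigraphic horizon is essentially followed laterally along the local dip stored in the SteeringCube. A heavily simplified sketch of that idea; the function and the dip-field layout are assumptions, not the SSIS tracker:

```python
import numpy as np

def track_horizon(dip, z0, dx=1.0, dz=1.0):
    """Follow a seismic event laterally through a dip field. `dip` holds
    local dip (vertical units per trace spacing) on a (n_traces,
    n_samples) grid; `z0` is the seed depth/time at trace 0. Sketch only."""
    n_traces, n_samples = dip.shape
    z = np.empty(n_traces)
    z[0] = z0
    for i in range(1, n_traces):
        iz = int(np.clip(round(z[i - 1] / dz), 0, n_samples - 1))
        z[i] = z[i - 1] + dip[i - 1, iz] * dx   # step along the local dip
    return z
```

A poorly filtered SteeringCube gives noisy `dip` values, which is why step 2 recommends filtering when tracked horizons stray from the seismic events.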

Tips:

1. To highlight unconformities with 2D (data-driven) chrono-stratigraphy use the Fill between lines option (Options menu) and play with colors.

2. To improve the results, add more horizons (map unconformities).

For more info, see this Tutorial video:

Chrono-stratigraphy (flash video)


6.2. Wheeler Transformation

Purpose: To flatten the seismic data while honoring hiatuses caused by non-deposition and erosion.

Theory: The Wheeler transform is the seismic equivalent of the geologic Wheeler diagram (= chrono-stratigraphic chart). In a Wheeler diagram, rock units are plotted in a 2D chart of geologic time (y-axis) versus space (x-axis). The diagram shows the temporal-spatial relationship between rock units. Gaps in the chart represent non-deposition or erosion. In a Wheeler transform, we flatten the seismic data (or derived attributes) along flattened chrono-stratigraphic horizons. The differences with the Wheeler diagram are: the vertical axis in the Wheeler transformed domain is relative geologic time (as opposed to absolute geologic time), and a Wheeler transform can be performed in 2D and 3D, whereas the Wheeler diagram is always 2D.

Software: OpendTect + Dip-Steering + SSIS

Workflow:

1. Select a pre-calculated chrono-stratigraphy.

2. Add a Wheeler scene and use the selected chrono-stratigraphy to flatten the seismic data (or attribute).
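The flattening itself can be pictured as sampling each trace at its chrono-stratigraphic horizon times and stacking those samples on a relative-geologic-time axis, with hiatuses left as gaps. A minimal sketch, assuming NaN marks an absent horizon; not the SSIS implementation:

```python
import numpy as np

def wheeler_transform(section, chrono, dt):
    """Flatten a seismic section along chrono-stratigraphic horizons.
    `section`: (n_traces, n_samples) amplitudes; `chrono`: (n_horizons,
    n_traces) horizon times (ms), NaN where a horizon is absent. The
    output axis is relative geologic time; gaps stay NaN. Sketch only."""
    n_h, n_tr = chrono.shape
    out = np.full((n_h, n_tr), np.nan)
    for h in range(n_h):
        for j in range(n_tr):
            t = chrono[h, j]
            if not np.isnan(t):           # NaN marks non-deposition/erosion
                out[h, j] = section[j, int(round(t / dt))]
    return out
```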

Tips:

1. Instead of displaying the flattened seismic, display the chrono-stratigraphy itself in the Wheeler scene. This display is useful for studying the depositional history, especially if you also display the same line (+ chrono-stratigraphy) in the structural domain.


6.3. Stratal Slicing

Purpose: To flatten 3D seismic data (or attribute volumes) so that we can slice through the cubes to pick up stratigraphic details.

Theory: Stratal slicing (or proportional slicing) is the interpolation option of the model-driven mode for creating chrono-stratigraphic horizons. It is the mode in which the thickness of the interval between the mapped top and bottom horizons is distributed equally over the number of chrono-stratigraphic horizons.

Software: OpendTect + Dip-Steering + SSIS

Workflow:

1. Select a pre-calculated chrono-stratigraphy.

2. Create a Wheeler Cube. (This step is required for displaying data in a Volume viewer in the Wheeler domain.)

3. Add a Wheeler scene and load the Wheeler cube in the Volume viewer. Use the time slicer to inspect the data movie-style. Remember that a time slice in the Wheeler domain corresponds to a horizon slice in the structural domain.
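The proportional-slicing idea is simple to state: divide the interval between the mapped top and bottom equally at every position. A minimal sketch (hypothetical function, not SSIS code):

```python
import numpy as np

def proportional_horizons(top, bottom, n_horizons):
    """Model-driven interpolation: distribute n_horizons chrono horizons
    so the interval between the mapped top and bottom is divided equally
    at every position. `top`/`bottom` are time (or depth) grids of equal
    shape; the result stacks the horizons along a new first axis."""
    # Fractional position of each horizon: 0 = top, 1 = bottom
    frac = np.linspace(0.0, 1.0, n_horizons).reshape((-1,) + (1,) * top.ndim)
    return top + frac * (bottom - top)
```

Adding more mapped horizons (tip 1) simply means running this interpolation per sub-interval, so each package is sliced proportionally within its own bounding surfaces.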

Tips:

1. To improve time-slicing in the Wheeler domain you need to improve the underlying chrono-stratigraphy, which can be done by adding more mapped horizons.


6.4. Systems Tracts Interpretation

Purpose: To interpret genetically associated stratigraphic units that were deposited during specific phases of relative sea-level cycles.

Theory: In systems tract interpretation, you look at stacking patterns and lapout patterns (onlap, toplap, offlap) in both the structural domain and the Wheeler transformed domain. Based on these observations you decide whether you are dealing with a transgression (Transgressive Systems Tract), a normal regression (either a Highstand Systems Tract or a Lowstand Systems Tract), or a falling stage (Falling Stage Systems Tract). Nomenclature according to Hunt and Tucker. (In SSIS you can choose which model (nomenclature) to use.)

Software: OpendTect + Dip-Steering + SSIS

Workflow:

1. Select a pre-calculated chrono-stratigraphy.

2. Add a Wheeler scene and use the selected chrono-stratigraphy to flatten the seismic data (or attribute).

3. Open the Interpretation window, choose a model and use the chrono-strat sliders to set the systems tract boundaries. Assign the systems tract by right-clicking in the interpretation column. Note the automatic reconstruction of the relative sea level curve.

Tips:

1. Use arrows to indicate lapout patterns to help you decide where to set systems tract boundaries.

For more info, see this Tutorial video:

SSIS interpretation (flash video)


Chapter 7. Processing

Table of Contents
7.1. Time-depth conversion of 2D data

7.1. Time-depth conversion of 2D data

Although we do not officially support time-depth conversion of 2D data, this workflow allows you to do it using a combination of OpendTect and Madagascar tools.

1. First of all you will need a 3D velocity model, in time or depth, in the same domain as your 2D seismic data. Madagascar needs interval velocities, so if your velocity type is different the model must be converted first (Processing menu - Velocity conversion).

2. Then you will need to project the 3D velocity model along the 2D lines. This can be done in the 2D seismic line data management. The third icon in the middle launches the "Extract from 3D cube" module, which should be used to convert the 3D velocity model to a set of 2D lines.

For each 2D line:

3a. The projected velocity model must be converted to Madagascar format. This can be done in the Madagascar link window, with an empty flow whose input is the OpendTect 2D line containing the velocity model, and with the output set to a Madagascar file.

3b. Secondly, the Madagascar link is used for the time-depth conversion itself. The input is the 2D seismic line from OpendTect; the output will be the 2D seismic line in the output domain, placed in the same 2D grid. The program to run is either sftime2depth or sfdepth2time, depending on the direction of the conversion. You must specify a few parameters:

The output sampling rate: dz=
The output start time/depth: t0= or z0=
The domain of the input seismic and velocity data: intime=y or intime=n
Whether the velocity model contains interval velocities or slownesses: slow=n or slow=y
Always in OpendTect: twoway=y

Thus an example for converting 2D seismic from time to depth would be:

sftime2depth velocity=velocity_dip3.rsf dz=5 z0=0 intime=y slow=n twoway=y eps=0.01

This will generate a seismic section in depth, starting at z0=0m, with a constant sampling rate of 5m.

4. Export the depth converted seismic to SEG-Y.

5. Set up the depth survey by copying the survey definition from the time survey and adjusting the survey domain and range.

6. Import the depth converted seismic from the SEG-Y files. Since the lines were created in the time survey, the stored sampling rate will be wrong and will need to be overruled.
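For intuition, the core of such a time-to-depth conversion is integrating interval velocity over one-way travel time. A minimal sketch of that computation (hypothetical function, assuming depth 0 at the first sample; not the Madagascar source):

```python
import numpy as np

def time_to_depth(t_ms, v_int, twoway=True):
    """Convert travel times (ms) to depth (m) using interval velocities
    (m/s) defined between successive time samples, in the spirit of
    sftime2depth with intime=y, slow=n, twoway=y. Sketch only."""
    t = np.asarray(t_ms) / 1000.0                 # times in seconds
    dt = np.diff(t)                               # interval durations
    if twoway:
        dt = dt / 2.0                             # two-way -> one-way time
    # Depth at each sample = cumulative sum of v * dt over the intervals,
    # starting from an assumed depth of 0 at the first sample
    z = np.concatenate(([0.0], np.cumsum(v_int * dt)))
    return z
```

For example, a sample at 2000ms two-way time below a 2000 m/s layer (0-1000ms) and a 3000 m/s layer (1000-2000ms) maps to 2500m depth.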
