Hybrid target-oriented salt interpretation in the Gulf of Mexico

2014 ◽  
Vol 2 (4) ◽  
pp. SL21-SL28 ◽  
Author(s):  
Ravi Maranganti ◽  
Yogesh Agnihotri

The top-down method for salt interpretation starts with the search for salt-sediment interfaces, which begins at the shallowest depths and progressively moves deeper. This search is typically conducted on intermediate seismic products such as sediment flood, salt flood, overhang sediment flood, and overhang salt flood depth migrations. However, with this approach, poorly imaged subsalt areas become known only after considerable time has been spent interpreting intermediate salt features. We evaluated a new method for streamlining this traditional approach to salt-model building wherein a reference salt geometry is obtained earlier in the salt interpretation process. Having a reference seismic volume helped to identify poorly imaged subsalt targets much sooner. A hybrid of model-based and data-based interpretation was then performed for the poorly imaged subsalt areas, whereby interpretation was completed by proposing geologically viable models that were continuously verified against their impact on the geophysical (seismic) data. In poorly imaged areas, we worked bottom-up, rather than top-down, to establish a salt geometry that best fit the geologic model as well as the geophysical data. Our hybrid target-oriented approach is useful not only for reducing the interpretation cycle time but also for improving images beneath the salt.

Geophysics ◽  
2016 ◽  
Vol 81 (5) ◽  
pp. C177-C191 ◽  
Author(s):  
Yunyue Li ◽  
Biondo Biondi ◽  
Robert Clapp ◽  
Dave Nichols

Seismic anisotropy plays an important role in structural imaging and lithologic interpretation. However, anisotropic model building is a challenging, underdetermined inverse problem. It is well understood that single-component pressure-wave seismic data recorded on the upper surface are insufficient to resolve a unique solution for the velocity and anisotropy parameters. To overcome the limitations of seismic data, we have developed an integrated model building scheme based on Bayesian inference that considers seismic data, geologic information, and rock-physics knowledge simultaneously. We have performed the prestack seismic inversion using wave-equation migration velocity analysis (WEMVA) for vertical transverse isotropic (VTI) models. This image-space method enabled automatic geologic interpretation. We have integrated the geologic information as spatial model correlations, applied to each parameter individually, and the rock-physics information as lithologic model correlations, bringing in additional information so that the parameters weakly constrained by the seismic data are updated along with the strongly constrained ones. The constraints provided by the additional information help the inversion converge faster, mitigate the ambiguities among the parameters, and yield VTI models that are consistent with the underlying geologic and lithologic assumptions. We have developed the theoretical framework for the proposed integrated WEMVA for VTI models and determined the added information contained in the regularization terms, especially the rock-physics constraints.
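The flavor of regularized updating described above can be illustrated with a toy example. This is not the authors' WEMVA implementation; it is a minimal sketch in which a first-difference roughness penalty stands in for the spatial (geologic) correlation prior, and the operator `G`, data `d`, and weight `lam` are all hypothetical.

```python
import numpy as np

# Toy 1-D analogue of regularized model building:
# minimize ||G m - d||^2 + lam * ||D m||^2,
# where D penalizes spatial roughness (a stand-in for the
# geologic-correlation prior in the regularization terms).

def regularized_update(G, d, lam):
    """Solve the damped normal equations for the model vector m."""
    n = G.shape[1]
    # First-difference operator: a simple spatial-smoothness prior
    D = np.eye(n) - np.eye(n, k=1)
    A = G.T @ G + lam * (D.T @ D)
    return np.linalg.solve(A, G.T @ d)

rng = np.random.default_rng(0)
G = rng.standard_normal((20, 10))        # hypothetical forward operator
m_true = np.linspace(1.5, 3.0, 10)       # smooth "velocity" profile
d = G @ m_true                           # noise-free synthetic data
m_est = regularized_update(G, d, lam=0.1)
print(np.round(m_est, 2))
```

With `lam = 0` the update reduces to ordinary least squares; a positive `lam` trades data fit for smoothness, which is how the prior keeps weakly constrained parameters from drifting.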


2021 ◽  
Author(s):  
Farah Syazana Dzulkefli ◽  
Kefeng Xin ◽  
Ahmad Riza Ghazali ◽  
Guo Qiang ◽  
Tariq Alkhalifah

Abstract Salt is known for having a generally lower density and higher velocity than the surrounding rock layers, which causes the energy to scatter once the seismic wavefield hits the salt body, so that relatively little energy is transmitted through the salt to the deeper subsurface. As a result, most imaging approaches are unable to image the base of the salt and the reservoirs below it. Even velocity model building methods such as FWI often fail to illuminate the deeper parts of the salt area. In this paper, we show that Full Wavefield Redatuming (FWR) can be used to retrieve and enhance the seismic data below the salt, leading to better seismic image quality and allowing us to focus on updating the velocity in the target area below the salt. However, this redatuming approach requires a good overburden velocity model to retrieve good redatumed data. Thus, using the synthetic SEAM model, our objective is to study the accuracy of the overburden velocity model required for imaging beneath a complex overburden. The results show that the kinematic components of wave propagation are preserved through redatuming even with a heavily smoothed overburden velocity model.
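The claim that kinematics survive heavy smoothing can be illustrated with a toy computation (this is not the paper's FWR code): the one-way vertical traveltime through a layered sediment-salt-sediment overburden changes only by a few percent when the velocity profile is heavily smoothed. The layer velocities and smoothing width below are hypothetical.

```python
import numpy as np

def traveltime(velocity, dz):
    """One-way vertical traveltime (s) through a stack of cells."""
    return np.sum(dz / velocity)

def smooth(velocity, width):
    """Moving-average smoothing with edge padding (keeps length)."""
    pad = width // 2
    padded = np.pad(velocity, pad, mode="edge")
    kernel = np.ones(width) / width
    return np.convolve(padded, kernel, mode="valid")

dz = 10.0                                   # cell thickness, meters
v = np.concatenate([np.full(50, 2000.0),    # sediments
                    np.full(30, 4500.0),    # salt
                    np.full(40, 3000.0)])   # sub-salt sediments
t_exact = traveltime(v, dz)
t_smooth = traveltime(smooth(v, 21), dz)    # heavily smoothed model
print(t_exact, t_smooth)
```

The traveltimes agree to within a few percent even though the sharp salt boundaries are gone, which is the kinematic property the redatuming relies on; amplitudes, by contrast, are not preserved by such smoothing.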


1995 ◽  
Vol 23 (1) ◽  
pp. 3-14 ◽  
Author(s):  
John A. Ingram

Model building in Christian psychology has become increasingly outdated and unsophisticated over the past decade, particularly in light of postmodern challenges to the limitations of received modern scientific perspectives and social practices. The present article draws on Rychlak's (1993) “complementarity” model, Sperry's (1993) “bidirectional determinism” concept, and Engel's (1977) biopsychosocial formulation to develop a multiperspectival, holistic framework that builds on the strengths of both modern and postmodern approaches. The proposed model includes inferences from both top-down and bottom-up formulations, as well as the potential for interactions between or among any of the various “groundings” for psychological theories. Such a model seems more faithful to both biblical and scientific perspectives, and thus may provide a more accurate and comprehensive view of persons, facilitating more effective research and treatment. A clinical example is provided with DSM-IV descriptive and criterion referents.


2019 ◽  
Vol 38 (11) ◽  
pp. 872a1-872a9 ◽  
Author(s):  
Mauricio Araya-Polo ◽  
Stuart Farris ◽  
Manuel Florez

Exploration seismic data are heavily manipulated before human interpreters are able to extract meaningful information regarding subsurface structures. This manipulation adds modeling and human biases and is limited by methodological shortcomings. Alternatively, using seismic data directly is becoming possible thanks to deep learning (DL) techniques. A DL-based workflow is introduced that uses analog velocity models and realistic raw seismic waveforms as input and produces subsurface velocity models as output. When insufficient data are used for training, DL algorithms tend to overfit or fail. Gathering large amounts of labeled and standardized seismic data sets is not straightforward. This shortage of quality data is addressed by building a generative adversarial network (GAN) to augment the original training data set, which is then used by DL-driven seismic tomography as input. The DL tomographic operator predicts velocity models with high statistical and structural accuracy after being trained with GAN-generated velocity models. Beyond the field of exploration geophysics, the use of machine learning in earth science is challenged by the lack of labeled data or properly interpreted ground truth, since we seldom know what truly exists beneath the earth's surface. The unsupervised approach (using GANs to generate labeled data) illustrates a way to mitigate this problem and opens geology, geophysics, and planetary sciences to more DL applications.
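The augmentation step can be sketched in miniature. The snippet below is a simple random layered-model generator used as a stand-in for the paper's GAN (no adversarial training here): it produces synthetic depth-increasing velocity profiles of the kind a trained generator might emit, to grow a training set for a DL tomographic operator. All sizes and velocity ranges are hypothetical.

```python
import numpy as np

def random_layered_model(n_cells, n_layers, rng):
    """Random depth-increasing layered velocity profile (m/s)."""
    # Random layer boundaries and sorted layer velocities
    boundaries = np.sort(rng.choice(np.arange(1, n_cells),
                                    size=n_layers - 1, replace=False))
    velocities = np.sort(rng.uniform(1500.0, 5500.0, size=n_layers))
    model = np.empty(n_cells)
    prev = 0
    for b, v in zip(np.append(boundaries, n_cells), velocities):
        model[prev:b] = v                 # fill each layer
        prev = b
    return model

rng = np.random.default_rng(42)
# Augmented training set: one velocity model per row
augmented = np.stack([random_layered_model(128, 6, rng)
                      for _ in range(100)])
print(augmented.shape)
```

A real GAN replaces the hand-coded sampler with a generator network trained against a discriminator so that the synthetic models match the statistics of the analog training models, rather than a fixed parametric family like this one.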


2000 ◽  
Vol 19 (5) ◽  
pp. 466-472 ◽  
Author(s):  
Louis Liro ◽  
Mark Lahr ◽  
Kim Cline ◽  
Jerry Young ◽  
Mary Kadri ◽  
...  

Geophysics ◽  
1989 ◽  
Vol 54 (2) ◽  
pp. 181-190 ◽  
Author(s):  
Jakob B. U. Haldorsen ◽  
Paul A. Farmer

Occasionally, seismic data contain transient noise that can range from being a nuisance to becoming intolerable when several seismic vessels try simultaneously to collect data in an area. The traditional approach to solving this problem has been to allocate time slots to the different acquisition crews; the procedure, although effective, is very expensive. In this paper a statistical method called “trimmed mean stack” is evaluated as a tool for reducing the detrimental effects of noise from interfering seismic crews. Synthetic data, as well as field data, are used to illustrate the efficacy of the technique. Although a conventional stack gives a marginally better signal‐to‐noise ratio (S/N) for data without interference noise, typical usage of the trimmed mean stack gives a reduced S/N equivalent to a fold reduction of about 1 or 2 percent. On the other hand, for a data set containing high‐energy transient noise, trimming produces stacked sections without visible high‐amplitude contaminating energy. Equivalent sections produced with conventional processing techniques would be totally unacceptable. The application of a trimming procedure could mean a significant reduction in the costs of data acquisition by allowing several seismic crews to work simultaneously.
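The trimmed mean stack described above is straightforward to sketch. In this toy version (trace values and trim fraction are hypothetical), a fixed fraction of the largest and smallest samples at each time index is discarded across the fold before averaging, so a high-energy transient burst on one trace is excluded rather than smeared into the stack.

```python
import numpy as np

def trimmed_mean_stack(gather, trim_frac=0.1):
    """Stack traces, trimming trim_frac of samples from each tail
    of the amplitude distribution at every time index."""
    n_traces = gather.shape[0]
    k = int(n_traces * trim_frac)          # samples trimmed per tail
    ordered = np.sort(gather, axis=0)      # sort across the fold
    return ordered[k:n_traces - k].mean(axis=0)

rng = np.random.default_rng(1)
signal = np.sin(np.linspace(0, 2 * np.pi, 200))
gather = signal + 0.1 * rng.standard_normal((24, 200))  # 24-fold gather
gather[5, 80:90] += 50.0          # interference burst on one trace
plain = gather.mean(axis=0)
trimmed = trimmed_mean_stack(gather, trim_frac=0.1)
print(np.abs(plain - signal).max(), np.abs(trimmed - signal).max())
```

The conventional stack carries the burst into the section (error of roughly burst amplitude divided by fold), while the trimmed stack stays near the noise floor, at the cost of the slight S/N penalty noted in the abstract.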

