Latest trends in seismic acquisition technology

1st February 2013


Global spending on seismic reached USD 7 billion in 2011 and is expected to increase by around 30 per cent. This investment reflects the need for more, better and more cost-effective data, and the response of the seismic industry is strong, with innovative solutions being developed to meet the challenge. Eric de Graaff explains.

Advances in seismic technology have driven much of the improvement in our ability to explore for new oil and gas, to develop fields more effectively and safely, and to monitor production. Some of these advances have followed an evolutionary path, such as the ever-increasing numbers of sensors and streamers, improved spatial sampling, and better imaging and migration algorithms. Other approaches, like the switch from 2D to 3D, the introduction of wide azimuth surveys, and seabed technologies, have changed the landscape more suddenly.

Passive seismic has emerged as an effective tool for monitoring fracking and injection, and may evolve into a technique capable of reliably monitoring more subtle changes in the reservoir.

Recently, a lot of focus has been put on ways to increase the bandwidth of the data, and some interesting ideas and approaches have been proposed. In addition, several technologies have been introduced that increase the cost-effectiveness of data acquisition.

The quest for more densely sampled data is most evident in the onshore efforts, where the vision of a million channel system is rapidly becoming reality, through the introduction of low weight sensors and wireless communication.

In offshore seismic acquisition, towed streamers can easily achieve dense sensor spacing in the inline direction, as in the Q implementation for instance, but finer cross-line sampling is much harder to achieve, leaving an asymmetrically sampled wavefield.

The emergence of wide azimuth acquisition is an effective solution for this asymmetry, but its significant cost restricts this style of acquisition to areas with very significant illumination challenges, such as pre-salt targets. Ocean bottom nodes are opening the way to good sampling through dense shooting, but the cost-effectiveness of this method would need to improve by a factor of 3-5 to become a mainstream alternative to towed seismic.

The search for increased bandwidth has seen the introduction of dual sensors in seismic streamers, as well as slanted cable configurations. In both cases, towing the cable at greater depth allows operation in rougher sea state conditions, although the correspondingly increased variability of the sea-air reflection coefficient limits the effectiveness of the deghosting process. In addition, data-dependent matched filtering is necessary to match the noisy geophone data to the hydrophone data in the dual-sensor case, while the slanted streamer approach requires intricate data-dependent processing to handle the variable ghost. This can only be done effectively poststack, which severely limits the use of prestack data for reservoir characterisation.
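The bandwidth trade-off behind tow depth can be illustrated with the textbook receiver-ghost notch relation. The sketch below is illustrative only: it assumes vertical incidence, a sea-surface reflection coefficient of -1, and a nominal water velocity; none of these values come from the article.

```python
# Receiver-ghost notch frequencies for a horizontally towed streamer.
# Illustrative sketch: assumes vertical incidence and a perfect sea-surface
# reflection coefficient of -1 (assumptions, not values from the article).

WATER_VELOCITY = 1500.0  # m/s, nominal speed of sound in seawater

def ghost_notch_frequencies(tow_depth_m, max_freq_hz=250.0):
    """Return the ghost notch frequencies (Hz) up to max_freq_hz.

    For a receiver at depth d, the surface ghost interferes destructively
    at f_n = n * v / (2 * d), n = 0, 1, 2, ...  (n = 0 is the DC notch).
    """
    notch_spacing = WATER_VELOCITY / (2.0 * tow_depth_m)
    notches = []
    n = 0
    while n * notch_spacing <= max_freq_hz:
        notches.append(n * notch_spacing)
        n += 1
    return notches

# A shallow tow pushes the first non-zero notch up (wider usable band),
# while a deep tow in rough seas pulls it down into the signal band.
print(ghost_notch_frequencies(6.0))   # shallow tow: first notch at 125 Hz
print(ghost_notch_frequencies(25.0))  # deep tow: first notch at 30 Hz
```

This is why dual-sensor and slanted-cable designs matter: they attempt to fill the notches that a single deep-towed hydrophone cable cannot avoid.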

Source deghosting through time and space staggering does show promising results, and can be used to good effect. New streamer deghosting techniques are therefore required to mitigate the aforementioned drawback, and are the subject of active research in the industry. Unfortunately, the increased patenting efforts by major players are limiting the ability to build on the current state of knowledge, and would seem to go far beyond the legitimate desire to justify R&D expenditure by protecting new inventions.

An interesting view of the use of wide bandwidth data was presented by Philip Fontana et al. (SEG, 2011, San Antonio), who argued that a deeper understanding of the target reservoirs is required. For many deeper targets, the earth spectrum has a limited bandwidth, and wide bandwidth acquisition will actually decrease the signal-to-noise ratio through the dispersion, absorption, and scattering of the higher frequencies in the seismic signal. An underestimated recent development is the emergence of new noise-eliminating processing technologies (Fig. 1), which do not have the disadvantages of conventional filtering approaches, which tended to smear the data and limit spatial resolution.
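The argument that the earth itself limits usable bandwidth at depth can be sketched with the standard constant-Q absorption model, where the amplitude factor is exp(-pi * f * t / Q). The Q value and travel time below are illustrative assumptions, not figures from the article; the point is simply that the loss grows with frequency.

```python
import math

# Constant-Q absorption sketch: amplitude factor exp(-pi * f * t / Q).
# Q and two-way time are illustrative assumptions, not from the article.

def attenuation_db(freq_hz, twt_s, q_factor):
    """Amplitude loss in dB after two-way time twt_s in a constant-Q earth."""
    amplitude = math.exp(-math.pi * freq_hz * twt_s / q_factor)
    return -20.0 * math.log10(amplitude)

# Deep target: ~3 s two-way time, effective Q of ~100.
for f in (10, 40, 80, 120):
    print(f"{f:4d} Hz: {attenuation_db(f, 3.0, 100.0):6.1f} dB down")
```

For these assumed values the 120 Hz component arrives roughly 90 dB weaker than at the surface while the 10 Hz component loses under 10 dB, which is why recording ever-higher frequencies for a deep target mainly records noise.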

As is often the case, the need to drive down acquisition cost has also resulted in the elimination of unnecessary cost. A good example is the need for seismic infill, which traditionally can be as high as 20-30 per cent of the total acquisition cost. An understanding of the need for dense sampling at longer shooting offsets is emerging, initially through office-based analysis of streamer spacing and the possibility of using fanned streamers. A more effective solution is offered by the recent introduction of Fresnel Zone Binning (Fig. 2), which allows for a full analysis of target depth, seismic frequency dependency, and offsets, resulting in an optimised acquisition with real-time analysis driving the streamer positioning.
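The intuition behind Fresnel-zone-based infill analysis can be shown with the standard first-Fresnel-zone radius for a zero-offset reflection, r = (v/2) * sqrt(t/f). The velocities, times and frequencies below are illustrative assumptions, not values from the article.

```python
import math

# First Fresnel zone radius sketch, r = (v/2) * sqrt(t/f), for a
# zero-offset reflection at two-way time t. All numeric values here
# are illustrative assumptions, not from the article.

def fresnel_radius(avg_velocity_ms, twt_s, freq_hz):
    """Radius (m) of the first Fresnel zone at two-way time twt_s."""
    return (avg_velocity_ms / 2.0) * math.sqrt(twt_s / freq_hz)

# Deeper targets and lower frequencies give a larger Fresnel zone, so a
# coverage gap that matters at a shallow target can be harmless deeper.
print(fresnel_radius(2000.0, 1.0, 40.0))  # shallow, higher frequency: ~158 m
print(fresnel_radius(3500.0, 3.0, 20.0))  # deep, lower frequency: ~678 m
```

Because the zone widens with depth and with lower frequency, a bin that looks empty on a surface-geometry plot may still be fully illuminated at the target, which is what lets Fresnel Zone Binning cut infill steaming.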

Further to cost reduction, significant effort is being spent on simultaneous seismic acquisition, also called blended shooting. Wide azimuth acquisition in particular, both towed and ocean bottom, can benefit tremendously from simultaneous shooting. This has been demonstrated convincingly for exploration and development surveys, and the industry is now attempting to solve it for production monitoring surveys, which require a lower level of remnant deblending noise to achieve reliable analysis of the time-lapse signal.

However, no matter how seismic acquisition technology improves, limitations remain in the acquisition of low frequencies. In onshore acquisition these are being solved through the use of sensors that record down to 0 Hz, yielding startling improvements in the ability to create a correct earth model through full waveform inversion rather than travel time inversion methods.
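The core idea of blended shooting, firing a second source with random time dithers so that its energy becomes incoherent after alignment, can be sketched in a few lines. This is a toy illustration under assumed numbers, not a description of any specific commercial deblending scheme.

```python
import random

# Toy sketch of blended (simultaneous-source) acquisition: two sources fire
# into one continuous recording, source B with random time dithers. After
# shifting the record to source A's firing times, A's arrivals line up while
# B's energy scatters at random offsets (the "blending noise" that
# deblending must remove). All numbers are illustrative assumptions.

random.seed(0)
N = 200                       # samples in the continuous record
record = [0.0] * N

def fire(record, t0, wavelet):
    """Add a source wavelet into the continuous record at sample t0."""
    for i, amp in enumerate(wavelet):
        if t0 + i < len(record):
            record[t0 + i] += amp

wavelet = [1.0, -0.5, 0.25]           # tiny stand-in for a source signature
shots_a = [10, 60, 110, 160]           # source A fires on a regular schedule
dithers = [random.randint(5, 25) for _ in shots_a]
shots_b = [t + d for t, d in zip(shots_a, dithers)]  # B fires dithered

for t in shots_a:
    fire(record, t, wavelet)
for t in shots_b:
    fire(record, t, wavelet)

# Aligned to A's schedule, A's arrivals stack coherently at offset 0; B's
# arrivals fall at random offsets and would average out over many shots.
aligned = [record[t] for t in shots_a]
print(aligned)
```

For time-lapse work, the remnant of source B after deblending must be far below the 4D signal, which is why production monitoring is the hardest case mentioned above.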

Offshore, however, the inherent limitations in the acquisition of low frequencies require different geophysical methods. A first step towards acquiring these is the addition of a limited number of ocean bottom seismic sensors, yielding a significantly improved frequency spectrum, as shown by recent experiments in the Barents Sea. As these ocean bottom sensors can be deployed at relatively large spacing, this approach promises to be a very cost-effective way to acquire better data. The use of gravity/gradiometry, magnetic, magnetotelluric, and controlled source electromagnetic data offers great opportunities for improved definition of the earth model, and for derisking prospects with techniques complementary to seismic analysis. The introduction of towed CSEM acquisition would be a step change, lowering the threshold for the acquisition of richer datasets.

In conclusion, the drive to get more, better, and more cost-effective data is stronger than ever, and it is expected that, in addition to currently used technologies, some more radical changes will be needed.


Eric de Graaff is Chief Scientist, Fugro, Leidschendam, The Netherlands.



