Article Title [English]
In current seismic data acquisition, sources are fired with large time intervals to avoid interference between the responses of successively fired sources as measured by the receivers. This leads to a time-consuming and expensive survey. In theory, the waiting time between two successively fired sources would have to be infinite, since the wavefield never vanishes completely; in practice, it varies from a few seconds (s) up to 30 s, meaning that the source responses are considered negligible after the waiting time. As an example, within a time interval of 200 s, 40 source locations can be fired with a 5 s waiting time, or 20 source locations with a 10 s waiting time. Since decision making at the business level is usually based on minimizing acquisition costs, the source domain is usually poorly sampled to limit the survey duration, causing spatial aliasing (Mahdad, 2011). On the other hand, modifying the waiting times brings flexibility to the source sampling and the survey time. The concept of simultaneous, or blended, acquisition addresses these issues either by reducing the waiting time between firing sources, which lowers acquisition costs, or by increasing the number of sources within the same survey time, which yields higher data quality; a combination of the two approaches combines these benefits. The price paid for higher data quality at lower acquisition cost is dealing with the interfered data, called blended data, acquired in blended acquisition. Before further processing and imaging algorithms can proceed, however, the blended data must first be broken down into their original components (single-source responses) by a processing step called deblending, which attempts to retrieve the data as if they had been acquired in a conventional, unblended way. In this paper, we introduce the concept of simultaneous acquisition and examine three methods of deblending:
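The superposition of source responses in a blended record can be illustrated with a minimal sketch. The record length, sampling rate, firing delay, and the impulsive stand-ins for the source responses are all hypothetical values chosen for brevity:

```python
import numpy as np

# Toy illustration of blended acquisition: two sources are fired with a
# short delay instead of waiting for each wavefield to die out.
nt, fs = 2000, 500          # samples per record, sampling rate (Hz) -- assumed values
t_delay = int(0.8 * fs)     # source 2 fires 0.8 s after source 1 (assumed delay)

shot1 = np.zeros(nt); shot1[100] = 1.0   # stand-in for the response of source 1
shot2 = np.zeros(nt); shot2[300] = 1.0   # stand-in for the response of source 2

# The receiver measures the superposition of both responses, with the
# response of source 2 shifted by its firing delay.
blended = shot1.copy()
blended[t_delay:] += shot2[:nt - t_delay]

print(np.flatnonzero(blended))  # both events are interleaved in one record: [100 700]
```

One record now carries two shot responses at the cost of only 0.8 s of extra recording time, rather than a full listening period per source; the events of the two shots overlap in time, which is exactly the crosstalk that deblending must undo.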
1) The least-squares method (pseudo-deblending), which perfectly predicts the blended data, but whose solution suffers from the interference noise caused by the interfering sources in the observations, the so-called blending noise (crosstalk noise). This noise has different characteristics in different domains of the data; for example, in the common-midpoint (CMP) domain it is incoherent and spike-like, and can therefore be tackled by a denoising algorithm.
2) Noise attenuation by the multidirectional vector-median filter (MD-VMF), a generalization of the well-known conventional median filter from a scalar implementation to a vector form. More specifically, a vector-median filter is applied along many trial directions and the median vector is then selected.
3) Regularization of the deblending operator matrix. Deblending is by itself an underdetermined and thus ill-posed problem, meaning that it has infinitely many solutions; constraints are therefore necessary to solve it. One possibility is a spatially band-limiting constraint, which is useful when the sources are densely sampled. It has been shown that under such a constraint, the deblending operator matrix can be regularized to form a well-behaved direct deblending operator.
Finally, by examining the wavefields deblended from synthetic and field data, we conclude that regularization of the deblending operator matrix is the most reliable approach, owing to its accuracy in attenuating the blending noise while preserving the signal, and to the speed of the algorithm.