Seismic data reconstruction based on the spectral projection gradient L1 algorithm (SPGL1) via compressive sensing

2019, Vol. 58, No. 2
(College of Geo-exploration Science and Technology, Jilin University, Changchun 130026, China)

 


Abstract: With the development of oil and gas exploration, the scale and complexity of acquired data keep increasing, and the accuracy and efficiency with which missing data are reconstructed affect all subsequent seismic processing. Compressive sensing theory and the reconstruction algorithms commonly used with it each trade off accuracy against efficiency; for large-scale, complex seismic data, a reconstruction method that balances accuracy and computation time is therefore proposed, based on compressive sensing and the spectral projection gradient L1 algorithm (SPGL1). First, a sampling matrix is selected according to the pattern of missing traces; next, the missing sparse coefficients are reconstructed with SPGL1 in the contourlet domain; finally, the inverse contourlet transform is applied to recover the seismic data. Experiments on synthetic data show that the reconstruction is accurate and computationally efficient. On field data, with the same sparse transform basis, the method was compared with orthogonal matching pursuit (OMP), a representative greedy algorithm, and with gradient projection for sparse reconstruction (GPSR); SPGL1 combined with compressive sensing proved more robust, less sensitive to noise, and more accurate, while still meeting the demand for computational efficiency.
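The workflow described in the abstract (choose a sampling operator from the missing-trace pattern, recover the sparse coefficients with an L1 solver, then apply the inverse sparsifying transform) can be sketched in a few lines of Python. The sketch below is a minimal illustration under stated assumptions, not the paper's implementation: a 2-D DCT stands in for the contourlet transform, a plain ISTA iteration stands in for SPGL1, and all names (data, keep, coef, lam) are introduced here purely for illustration. If the open-source spgl1 Python package is available, its SPGL1 basis-pursuit denoising solver could replace the ISTA loop.

```python
# Minimal compressive-sensing reconstruction sketch (illustrative only):
# 2-D DCT stands in for the contourlet transform, ISTA stands in for SPGL1.
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)

# Synthetic "complete" section: two dipping sinusoidal events.
nt, nx = 128, 64                       # time samples, traces
t = np.arange(nt)[:, None]
x = np.arange(nx)[None, :]
data = (np.sin(2 * np.pi * (t + 0.5 * x) / 32.0)
        + 0.5 * np.sin(2 * np.pi * (t - 0.8 * x) / 48.0))

# Sampling: randomly keep about 60 % of the traces.
keep = rng.random(nx) < 0.6            # mask of acquired traces
observed = data[:, keep]               # decimated (measured) data

def A(coef):
    # Forward operator: inverse sparsifying transform, then trace restriction.
    return idctn(coef, norm="ortho")[:, keep]

def At(res):
    # Adjoint operator: zero-fill the missing traces, then forward transform.
    full = np.zeros((nt, nx))
    full[:, keep] = res
    return dctn(full, norm="ortho")

# ISTA: minimize 0.5*||A c - b||^2 + lam*||c||_1 (stand-in for SPGL1).
lam, step, niters = 0.01, 0.9, 300     # step < 1 is safe: A restricts an orthonormal transform
coef = np.zeros((nt, nx))
for _ in range(niters):
    grad = At(A(coef) - observed)                                      # data-misfit gradient
    coef = coef - step * grad
    coef = np.sign(coef) * np.maximum(np.abs(coef) - step * lam, 0.0)  # soft threshold

recon = idctn(coef, norm="ortho")      # inverse transform gives the reconstructed section
err = np.linalg.norm(recon - data) / np.linalg.norm(data)
print(f"relative reconstruction error: {err:.3f}")
```

In this setting the trace-restriction operator plays the role of the measurement matrix and the transform coefficients are the sparse vector being recovered; OMP or GPSR could be substituted at the same point as the L1 solver to reproduce the comparison described above.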

Keywords: compressive sensing; measurement matrix; contourlet transform; seismic data reconstruction; greedy algorithm; diffraction waves; spectral projection gradient L1 algorithm (SPGL1)

Supported by the National Key Research and Development Program of China, project "Research on high-precision 3D seismic data processing and imaging technology for natural gas hydrates" (2017YFC0307405).

DOI: 10.3969/j.issn.1000-1441.2019.02.007