ABSTRACT

The deployment of oceanic seismic arrays has provided the scientific community with unique data sets for imaging seismic structures and understanding lithosphere and mantle dynamics at subduction zones and other tectonic settings. Data quality is fundamental to obtaining reliable seismic results from ocean‐bottom seismometer records. In this study, we conduct a comprehensive analysis of factors that may affect the signal‐to‐noise ratio (SNR) of fundamental‐mode Rayleigh waves, used as a proxy for waveform quality, within the Cascadia subduction zone. We use stations from the Cascadia Initiative, the Gorda deformation zone experiment, the Blanco transform fault experiment, and the Neptune Canada array. Empirical Green’s functions (EGFs) of Rayleigh waves are extracted from ambient‐noise seismic waveforms and filtered at 10‐ to 35‐s periods. In general, the SNR of the EGFs decreases with increasing interstation distance and increasing sediment thickness. A subset of stations, mainly located within the Gorda plate and along the trench, shows temporal variations in data quality, with the highest SNR observed during the fall and winter seasons. The SNR exhibits a more complicated pattern with respect to the length of the time series used to extract the EGFs. Most stations within the Juan de Fuca (JDF) plate show improved data quality with increasing record length, whereas for many stations within the accretionary wedge and the Gorda plate, the ratio does not increase much when more data are stacked. The distinctly different SNR patterns between the Gorda and JDF plates indicate possible impacts of lithosphere properties on data quality.
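The abstract does not specify how the SNR of an EGF is defined, but a common convention in ambient-noise studies is the peak amplitude in a surface-wave signal window (bounded by assumed minimum and maximum group velocities) divided by the RMS amplitude in a trailing noise window. The sketch below illustrates that convention; the function name, window parameters, and velocity bounds are assumptions for illustration, not values taken from this study.

```python
import numpy as np

def egf_snr(egf, dt, distance_km, vmin=2.5, vmax=4.5,
            noise_offset=100.0, noise_len=100.0):
    """Illustrative SNR for a one-sided EGF whose lag starts at t = 0.

    Signal window: arrivals between distance/vmax and distance/vmin
    (a typical Rayleigh-wave group-velocity window; the bounds here are
    assumed, not from the paper). Noise window: a `noise_len`-second
    segment starting `noise_offset` seconds after the signal window.
    """
    t = np.arange(len(egf)) * dt
    sig_mask = (t >= distance_km / vmax) & (t <= distance_km / vmin)
    noise_start = distance_km / vmin + noise_offset
    noise_mask = (t >= noise_start) & (t <= noise_start + noise_len)
    if not sig_mask.any() or not noise_mask.any():
        raise ValueError("windows fall outside the EGF time series")
    # SNR = peak signal amplitude / RMS noise amplitude
    return np.abs(egf[sig_mask]).max() / np.sqrt(np.mean(egf[noise_mask] ** 2))
```

Under this definition, stacking more daily cross-correlations suppresses the incoherent noise-window amplitude, which is why the SNR is expected to grow with time-series length when the noise sources are stationary.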
