Abstract

Isolating seismic instruments from temperature fluctuations is routine practice within the seismological community. However, the degree of thermal stability required in broadband installations to avoid generating noise or compromising the fidelity of the seismic records is largely unknown and likely application dependent. To quantify the temperature sensitivity of seismometers over a broad range of frequencies, we artificially induced local temperature changes on three different models of seismometer and measured the effect of these thermal variations on seismometer output. We found that diurnal temperature changes above 0.002°C root mean square (rms) produced significant changes in velocity and acceleration output relative to thermally stable reference measurements. We also found that incoherent sensor self‐noise increased with temperature variation; these increases in noise can be modeled as 1/f (pink) noise and are unlikely to be easily corrected for. These experimental results are compared with data from Incorporated Research Institutions for Seismology (IRIS) U.S. Geological Survey (USGS) Global Seismographic Network (GSN) station TUC (Tucson, Arizona). This station is well instrumented with temperature sensors and hosts three different broadband seismometers, each of which uses a different method of thermal isolation. We show that the water‐brick and borehole installations attenuate temperature variations sufficiently to isolate the seismometers from diurnal thermal variability that would otherwise compromise seismic data. We find that seismometer installations providing thermal stability below 0.002°C rms could help improve long‐period vertical seismic data across the GSN by reducing temperature‐driven 1/f noise.
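The abstract characterizes the temperature-driven noise as 1/f (pink) noise with a threshold of 0.002°C rms. As an illustration only (not part of the study's methodology), the sketch below generates a synthetic 1/f noise series by spectral shaping and scales it to that rms level; the sampling choices (one sample per second over one day) are hypothetical.

```python
import numpy as np

def pink_noise(n, seed=0):
    """Generate n samples of 1/f (pink) noise by spectral shaping.

    White Gaussian noise is transformed to the frequency domain,
    each Fourier amplitude is scaled by 1/sqrt(f) so that power
    falls off as 1/f, and the result is transformed back.
    """
    rng = np.random.default_rng(seed)
    white = rng.standard_normal(n)
    spec = np.fft.rfft(white)
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]           # avoid division by zero at DC
    spec *= 1.0 / np.sqrt(freqs)  # amplitude ~ f^(-1/2), power ~ 1/f
    pink = np.fft.irfft(spec, n)
    return pink / np.std(pink)    # normalize to unit rms

# Scale to a hypothetical temperature series at the 0.002°C rms
# threshold reported in the abstract (one "day" at 1 sample/s).
series = 0.002 * pink_noise(86400)
print(round(float(np.std(series)), 6))  # → 0.002
```

Spectral shaping is a common way to synthesize colored noise; the study itself does not specify how its 1/f noise model was fit.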