Abstract

Correlation detectors are of considerable interest to seismic monitoring communities because they offer reduced detection thresholds and combine detection, location, and identification functions into a single operation. They appear ideal for applications that require screening of frequent repeating events, but questions remain about how broadly empirical correlation methods are applicable. We characterize the effectiveness of banks of correlation detectors in a system that combines them with traditional power detectors, using efficiency, which we define as the fraction of events detected by the correlators, as the performance measure. This article elaborates and extends the concept of a dynamic correlation detection framework: a system that autonomously creates correlation detectors from event waveforms detected by power detectors, and it reports observed performance, in terms of efficiency, on a network of arrays. We performed a large-scale test of dynamic correlation processors on an 11 TB global dataset using 25 arrays in the 1–3 Hz frequency band. The system found over 3.2 million unique signals and produced 459,747 screened detections. A very satisfying result is that, on average, efficiency grows with time; after nearly 16 years of operation it exceeds 47% for events observed over all distance ranges, approaches 70% for near-regional events, and approaches 90% for local events. This suggests that future pipeline architectures should make extensive use of correlation detectors, principally to declutter observations of local and near-regional events. Our results also suggest that future operations based on correlation detection will require commodity large-scale computing infrastructure, because the number of correlators in an autonomous system can grow into the hundreds of thousands.
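The abstract does not include implementation details, but the core operation it relies on, matching a waveform template against a continuous data stream by normalized cross-correlation and declaring a detection when the correlation exceeds a threshold, can be illustrated with a minimal sketch. All names, the sampling parameters, and the 0.7 threshold below are illustrative assumptions, not values taken from the article or its actual pipeline.

```python
# Minimal sketch of a single correlation detector: slide a template over a
# continuous stream and flag samples where the normalized cross-correlation
# exceeds a detection threshold. Hypothetical parameters, not from the article.
import numpy as np

def correlation_detector(stream, template, threshold=0.7):
    """Return sample indices where the normalized correlation of `template`
    against `stream` exceeds `threshold`, plus the full correlation trace."""
    n = len(template)
    t = template - template.mean()
    t /= np.linalg.norm(t) + 1e-12          # unit-norm, zero-mean template
    cc = np.empty(len(stream) - n + 1)
    for i in range(len(cc)):
        w = stream[i:i + n] - stream[i:i + n].mean()
        cc[i] = np.dot(w, t) / (np.linalg.norm(w) + 1e-12)
    return np.flatnonzero(cc >= threshold), cc

# Usage: embed one noisy repeat of the template in background noise.
rng = np.random.default_rng(0)
template = np.sin(2 * np.pi * 2.0 * np.arange(0, 1, 0.01))  # 2 Hz, 1 s, 100 sps
stream = rng.normal(scale=0.2, size=2000)
stream[500:600] += template
hits, cc = correlation_detector(stream, template)
print(hits)  # indices clustered near sample 500
```

In a dynamic framework of the kind the article describes, each waveform segment detected by a power detector would seed a new template of this sort, so the bank of active correlators grows as operation proceeds; the plain Python loop above would be replaced by FFT-based correlation in any production setting.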
