Abstract

After 120 years of unsuccessful endeavor, a paradigm shift is required before earthquakes can be predicted. The most sensitive diagnostic of low-level changes of stress in in situ rock, variations in microcrack geometry, can be monitored by analyzing shear-wave splitting. The suggested paradigm shift is that, instead of investigating the source zone, we monitor stress accumulation before earthquakes at, possibly, substantial distances from the source. Characteristic temporal variations of shear-wave time delays have been observed in retrospect before 14 earthquakes worldwide. On one occasion, when changes were recognized early enough, the time, magnitude, and fault break of an M = 5 earthquake in southwest Iceland were successfully stress-forecast in a narrow time-magnitude window. Such stress accumulation can be theoretically modeled and is believed to be at least partially understood. When sufficient shear-wave source earthquakes are available, increasing time delays also show an abrupt decrease shortly before the impending earthquake occurs. This is not fully understood but is thought to be caused by stress relaxation as microcracks coalesce onto the eventual fault break. The new result confirming these ideas, and justifying the paradigm shift, is that logarithms of the durations of both increases and decreases in time delays are found to be proportional (self-similar) to the magnitudes of impending earthquakes.
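Stated schematically, the reported self-similarity can be written as a scaling relation of the form

\log_{10}(\mathrm{duration}) \approx a + b\,M,

where M is the magnitude of the impending earthquake and the duration refers to either the increase or the subsequent decrease in shear-wave time delays; the constants a and b here are illustrative placeholders and are not values given in this abstract.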