Stress drops determined from small earthquakes (m < 4.5) might, in principle, provide a means of monitoring stress variations in tectonic regions, but to date most such stress-drop determinations have shown too much scatter, in both space and time, for any trends to be detectable. Results from some recent studies suggest that inferring stress drops from determinations of the apparent stress might reduce this scatter enough to allow the detection of tectonic variations in stress.
The determination of the apparent stress from data is largely model independent, but the same is not true of the usual corner-frequency method for calculating the static stress drop. Inferring a stress drop from the apparent stress requires assumptions about the dynamics of the rupture process, whereas the static stress drop, by definition, has no inherent dependence on the dynamics. For far-field data, however, determining the static stress drop involves about as much model dependence as interpreting the apparent stress. We have compared the two approaches to estimating the stress drop in the context of several currently used seismic source models, and we conclude that, when the quality of the available data is sufficient to allow a good estimate of the radiated energy, determining the apparent stress should lead to more robust estimates.
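For orientation, the two quantities compared above can be sketched with their standard textbook definitions: the apparent stress as rigidity times the ratio of radiated energy to seismic moment, and the static stress drop via the widely used Brune (1970) corner-frequency relation. The numerical values below (moment, radiated energy, corner frequency, rigidity, shear-wave speed, and the Brune constant k ≈ 0.372) are illustrative assumptions, not values from this study.

```python
def apparent_stress(radiated_energy, seismic_moment, rigidity=3.0e10):
    """Apparent stress sigma_a = mu * E_R / M0 (SI units: J, N*m, Pa)."""
    return rigidity * radiated_energy / seismic_moment

def brune_stress_drop(seismic_moment, corner_freq, beta=3500.0, k=0.372):
    """Static stress drop from the Brune (1970) corner-frequency model:
    source radius r = k * beta / f_c, stress drop = (7/16) * M0 / r**3."""
    r = k * beta / corner_freq            # source radius in meters
    return 7.0 / 16.0 * seismic_moment / r ** 3

# Illustrative (hypothetical) values, roughly an M_w ~ 4 event:
M0 = 1.0e15   # seismic moment, N*m
ER = 1.0e10   # radiated energy, J  (so E_R / M0 = 1e-5)
fc = 2.0      # corner frequency, Hz

print(apparent_stress(ER, M0) / 1e6)    # apparent stress in MPa (~0.3)
print(brune_stress_drop(M0, fc) / 1e6)  # static stress drop in MPa (~1.6)
```

The sketch makes the asymmetry concrete: the apparent stress uses only measured quantities plus a rigidity, while the corner-frequency route imports two model-dependent constants (k and the 7/16 geometric factor).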