The magnitudes of any collection of earthquakes nucleating in a region are generally observed to follow the Gutenberg–Richter (GR) distribution. On some major faults, however, paleoseismic rates are higher than a GR extrapolation from the modern rate of small earthquakes would predict. This, along with other observations, led to the formulation of the characteristic earthquake hypothesis, which holds that the rate of small‐to‐moderate earthquakes is permanently low on large faults relative to the large‐earthquake rate (Wesnousky et al., 1983; Schwartz and Coppersmith, 1984).
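The GR extrapolation at issue can be sketched numerically. The following is a minimal illustration of the standard GR relation log10 N(≥M) = a − bM; all parameter values (b-value, small-earthquake rate, paleoseismic rate) are hypothetical placeholders, not values from this study.

```python
import math

# Hypothetical inputs: a b-value of 1 and one M>=4 event per year.
b = 1.0
rate_m4 = 1.0  # events/yr with M >= 4

# Solve for the GR a-value from the small-earthquake rate.
a = math.log10(rate_m4) + b * 4.0

def gr_rate(m):
    """GR-extrapolated annual rate of events with magnitude >= m."""
    return 10.0 ** (a - b * m)

rate_m7_gr = gr_rate(7.0)       # extrapolated M>=7 rate: 10**(4 - 7) = 1e-3 /yr
rate_m7_paleo = 1.0 / 200.0     # hypothetical paleoseismic rate (1 per 200 yr)

# Factor by which the paleoseismic rate exceeds the GR extrapolation.
discrepancy = rate_m7_paleo / rate_m7_gr
print(discrepancy)
```

With these illustrative numbers the paleoseismic rate is five times the GR extrapolation; the characteristic earthquake hypothesis attributes such a factor to a permanent deficit of small-to-moderate events.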
We examine the rate difference between recent small‐to‐moderate earthquakes on the southern San Andreas fault (SSAF) and the paleoseismic record, hypothesizing that the discrepancy can be explained as a rate change in time rather than a deviation from GR statistics. We find that, under reasonable assumptions, the rate changes necessary to bring the small‐ and large‐earthquake rates into alignment are comparable in size to the fluctuations seen in epidemic‐type aftershock sequence (ETAS) modeling, in which aftershock triggering of large earthquakes drives strong fluctuations in seismicity rates across all magnitudes. The necessary rate changes are also comparable to rate changes observed on other faults worldwide. These results are consistent with paleoseismic observations of temporally clustered bursts of large earthquakes on the SSAF and with the absence of M≥7 earthquakes on the SSAF since 1857.
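The ETAS mechanism invoked above can be illustrated with a minimal conditional-intensity calculation. This sketch uses the standard ETAS ingredients (a constant background rate, exponential productivity, Omori–Utsu decay); the parameter values are illustrative placeholders, not values fitted to the SSAF.

```python
# Hypothetical ETAS parameters (not fitted to any catalog).
mu = 0.1                       # background rate, events/day
K, alpha, c, p = 0.01, 1.0, 0.01, 1.1
m0 = 3.0                       # reference magnitude for productivity

def etas_rate(t, mainshock_mag, t_main=0.0):
    """Conditional intensity t days after a single mainshock at t_main."""
    if t <= t_main:
        return mu
    # Exponential productivity and Omori-Utsu temporal decay.
    productivity = K * 10.0 ** (alpha * (mainshock_mag - m0))
    return mu + productivity / (t - t_main + c) ** p

# One day after a hypothetical M7 mainshock, the seismicity rate is
# orders of magnitude above the background rate.
boost = etas_rate(1.0, 7.0) / mu
```

Fluctuations of this size, decaying over years to decades after a large event, are the kind of time-varying rate the hypothesis appeals to, as opposed to a permanent departure from GR scaling.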