We propose a probabilistic framework in which different types of information about the recurrence of large earthquakes on a fault can be combined to constrain the parameter space of candidate recurrence models and to identify the best combination of models given the chosen data set and priors.
We use Bayesian inference for parameter and error estimation, graphical models (Bayesian networks) for modeling, and stochastic modeling to link cumulative offsets (CO) to coseismic slip. The cumulative offset‐based Bayesian approach (COBBRA; Fitzenz et al., 2010) was initially developed to use CO data to further constrain and discriminate between recurrence models built from historical and archaeological catalogs of large earthquakes (CLE). We discuss this method and present an extension of it that incorporates trench data (TD). For our case study, the Jordan Valley fault (JVF), the relative evidence of each model slightly favors the Brownian passage time (BPT) and lognormal models.
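As a toy illustration of ranking renewal models by relative evidence (not the authors' implementation; the interval data, parameter grids, and uniform priors below are hypothetical), one can integrate each model's likelihood over a coarse parameter grid and compare the resulting marginal likelihoods:

```python
import math

def bpt_pdf(t, mu, alpha):
    """Brownian passage time (inverse Gaussian) density:
    mu = mean recurrence interval, alpha = aperiodicity."""
    return math.sqrt(mu / (2 * math.pi * alpha**2 * t**3)) * \
           math.exp(-(t - mu)**2 / (2 * alpha**2 * mu * t))

def lognormal_pdf(t, m, s):
    """Lognormal density with log-mean m and log-standard-deviation s."""
    return math.exp(-(math.log(t) - m)**2 / (2 * s**2)) / (t * s * math.sqrt(2 * math.pi))

def log_evidence(intervals, pdf, grid):
    """Marginal likelihood under a uniform prior on a discrete parameter grid:
    average the likelihood over all grid points, then take the log."""
    like = [math.exp(sum(math.log(pdf(t, *theta)) for t in intervals)) for theta in grid]
    return math.log(sum(like) / len(like))

# hypothetical recurrence intervals (years) between consecutive large events
intervals = [820.0, 1040.0, 950.0, 1180.0, 890.0]

bpt_grid = [(mu, a) for mu in range(800, 1300, 50) for a in (0.1, 0.2, 0.3, 0.5)]
ln_grid = [(m / 100.0, s) for m in range(660, 720, 5) for s in (0.1, 0.2, 0.3, 0.5)]

lev_bpt = log_evidence(intervals, bpt_pdf, bpt_grid)
lev_ln = log_evidence(intervals, lognormal_pdf, ln_grid)
print("log-evidence BPT      :", round(lev_bpt, 2))
print("log-evidence lognormal:", round(lev_ln, 2))
```

The difference of the two log evidences is a log Bayes factor; in a full analysis the grid average would be replaced by a proper integral over the prior, and the priors themselves chosen with care (see the closing remark of the abstract).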
We emphasize that (1) the time variability of fault slip rate is critical for constraining recurrence models; (2) the shape of the probability density functions (PDFs) of paleoseismic event dates is very important, is in most cases not Gaussian, and should be reported in full; (3) renewal models are formulated in terms of intervals between consecutive earthquakes, not event dates, and algorithms should account for this; and (4) maximum‐likelihood methods are inadequate for evaluating parameter uncertainty and for model combination or ranking. Finally, more work is needed to define proper priors and to model the relationship between cumulative slip and coseismic slip, in particular when the fault behavior is more complex.
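Points (2) and (3) can be sketched with a small Monte Carlo experiment (hypothetical event-date PDFs, not data from the JVF study): each paleoseismic event date is drawn from an asymmetric distribution, and the differences between consecutive draws yield interval distributions that correctly inherit the correlation induced by shared event dates, something that differencing summary statistics of the dates would miss:

```python
import random
import statistics

random.seed(0)

# hypothetical paleoseismic event-date PDFs (years CE), asymmetric
# triangular distributions given as (low, high, mode)
event_dates = [(-2300, -1900, -2250), (-1200, -800, -900),
               (150, 450, 200), (1000, 1250, 1033)]

def sample_intervals():
    """Draw one realization of all event dates and return the intervals
    between consecutive events. Working on intervals (differences of one
    joint draw) rather than on independent date PDFs preserves the
    correlation between adjacent intervals that share an event date."""
    dates = sorted(random.triangular(lo, hi, mode) for lo, hi, mode in event_dates)
    return [b - a for a, b in zip(dates, dates[1:])]

samples = [sample_intervals() for _ in range(20000)]
for k in range(len(event_dates) - 1):
    col = [s[k] for s in samples]
    print(f"interval {k}: mean = {statistics.mean(col):8.1f} yr, "
          f"std = {statistics.pstdev(col):6.1f} yr")
```

The resulting interval distributions are themselves non-Gaussian whenever the event-date PDFs are, which is why reporting only a mean and standard deviation for each dated event discards information that a renewal-model fit needs.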