The common observation that smaller particle-size fractions of sedimentary rocks yield younger K-Ar apparent ages than larger fractions of the same stratigraphic age was analyzed using ⁴⁰Ar/⁴⁰K ratios from 14 stratigraphically and regionally distinct sections. Loss of radiogenic ⁴⁰Ar from the various clay-rich size fractions was estimated with two models: an empirical relationship between particle size and the ⁴⁰Ar/⁴⁰K ratio, and theoretical diffusional loss from spherical particles. The differences between the two models and the reconciliation of their results are discussed. For the smallest fractions (up to <0.5 μm), the percentage losses of ⁴⁰Ar given by the spherical-particle model increase from the Upper Carboniferous and Permian (38±10%), to the Late Triassic (47±10%), and to the Miocene and Late Neogene (65±8%). This trend suggests that escape of ⁴⁰Ar from the smaller particles in older sediments slowed or even stopped after deposition of the sedimentary sections.
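The diffusional-loss estimate for spherical particles can be illustrated with the standard series solution for fractional loss from a uniform sphere (a Crank-type solution). This is a generic sketch under that assumption, not necessarily the exact formulation used in the study; the function name and arguments are illustrative:

```python
import math

def fractional_loss_sphere(Dt_over_a2, terms=200):
    """Fractional diffusive loss f from a uniform sphere of radius a after
    time t, with diffusivity D (standard series solution):
        f = 1 - (6/pi^2) * sum_{n>=1} (1/n^2) * exp(-n^2 * pi^2 * D*t / a^2)
    The single dimensionless argument is D*t/a^2."""
    s = sum(math.exp(-n**2 * math.pi**2 * Dt_over_a2) / n**2
            for n in range(1, terms + 1))
    return 1.0 - (6.0 / math.pi**2) * s
```

Because the loss depends only on Dt/a², smaller particles (smaller radius a) lose a proportionally larger share of their radiogenic ⁴⁰Ar over the same interval, consistent with the particle-size trend described above.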
The large ⁴⁰Ar losses inferred from the small ⁴⁰Ar/⁴⁰K ratios of the younger Tertiary sediments indicate that addition of K to the small fractions is at least partly responsible for the young K-Ar apparent ages in geologically different settings. In several 10²–10³ m thick sections, authigenic illite in the <0.1 to <2 μm fractions yields young K-Ar apparent ages that result from simultaneous ⁴⁰Ar production and release during clay authigenesis. In a production-and-loss model, a first-order escape-rate parameter (ε) was estimated at 0.2×10⁻⁸ to 4×10⁻⁸ y⁻¹, depending on the K-Ar apparent age of the size fraction and the stratigraphic age of the section. The limitations and uncertainties of these methods for evaluating diagenetic ⁴⁰Ar losses from fine clay particles are discussed.
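A production-and-loss model of this kind can be sketched as a first-order balance between ⁴⁰Ar ingrowth from ⁴⁰K and escape at rate ε, which has a closed-form solution. The decay constants below are the conventional values (Steiger and Jäger, 1977); the function names and the sample ε are illustrative choices, not the paper's own code:

```python
import math

LAMBDA = 5.543e-10    # total 40K decay constant, 1/y
LAMBDA_E = 0.581e-10  # electron-capture branch producing 40Ar, 1/y

def ar40_with_loss(k40_now, t, eps):
    """Radiogenic 40Ar retained after t years when production competes with
    first-order escape at rate eps (1/y):
        d(40Ar)/dt = lambda_e * 40K(t) - eps * 40Ar,  40K(t) = 40K0 * exp(-lambda*t)
    Closed form (with 40Ar(0) = 0):
        40Ar(t) = lambda_e * 40K0 * (exp(-lambda*t) - exp(-eps*t)) / (eps - lambda)
    k40_now is present-day 40K, so 40K0 = k40_now * exp(lambda*t)."""
    k40_0 = k40_now * math.exp(LAMBDA * t)
    return (LAMBDA_E * k40_0
            * (math.exp(-LAMBDA * t) - math.exp(-eps * t)) / (eps - LAMBDA))

def apparent_age(ar40, k40):
    """K-Ar apparent age from the standard age equation."""
    return math.log(1.0 + (LAMBDA / LAMBDA_E) * ar40 / k40) / LAMBDA
```

With ε = 0 the apparent age recovers the true age, while an ε within the quoted range (for example 2×10⁻⁸ y⁻¹) makes a 100 Myr illite appear far less than half its true age, illustrating how simultaneous production and release yields young K-Ar apparent ages.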