Although topographic steady state is often used as a simplifying assumption in sediment yield studies and landscape evolution models, the temporal and spatial scales over which this assumption applies in natural landscapes are poorly defined. We used visible–near-infrared (visNIR) spectroscopy to measure the weathering of hilltop soils and quantify local erosional variability in two watersheds in the Oregon Coast Range (United States). One watershed appears adjusted to base-level lowering driven by rock uplift in the Cascadia forearc, whereas the other is pinned by a gabbroic dike that locally slows river incision and hillslope erosion. Models for uniformly eroding hillslopes imply uniform soil residence times; instead, we observe significant variability around the mean value of 18.8 k.y. (+31.2/–11.8 k.y.) for our adjusted watershed. The magnitude of erosional variability likely reflects the time scales associated with stochastic processes that drive bedrock weathering, soil production, and soil transport (e.g., tree turnover). The residence time distribution for our pinned watershed has a mean value of 72.9 k.y. (+165.6/–50.6 k.y.) and is highly skewed, with a substantial fraction of long-residence-time soils. We speculate that this pattern results from the lithologic control of base level and lateral divide migration driven by erosional contrasts with neighboring catchments. Our novel and inexpensive methodology enables us to quantify for the first time the magnitude of erosional variability in a natural landscape, and thus provides important geomorphic context for studies characterizing regolith development. More generally, we demonstrate that soils can record catchment-scale landscape dynamics that may arise from lithologic controls or forcing due to climate or tectonics.
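The asymmetric uncertainties reported above are consistent with statistics computed in log space, as is common for skewed residence-time distributions: if the ±1σ bounds are multiplicative, the central value should equal the geometric mean of the lower and upper bounds. The following is a minimal sketch of that consistency check; the numbers are taken directly from the abstract, while the lognormal-style interpretation of the spread is our assumption, not a statement of the authors' method.

```python
import math

# Reported residence-time statistics (k.y.): central value with
# asymmetric (+upper / -lower) spread, as quoted in the abstract.
watersheds = {
    "adjusted": (18.8, 31.2, 11.8),
    "pinned":   (72.9, 165.6, 50.6),
}

for name, (central, plus, minus) in watersheds.items():
    lo, hi = central - minus, central + plus   # linear-space bounds
    geo_mean = math.sqrt(lo * hi)              # geometric mean of the bounds
    sigma_ln = 0.5 * math.log(hi / lo)         # log-space half-width (assumption)
    print(f"{name}: bounds [{lo:.1f}, {hi:.1f}] k.y., "
          f"geometric mean {geo_mean:.1f} k.y., ln-sigma {sigma_ln:.2f}")
```

For both watersheds the geometric mean of the bounds reproduces the quoted central value to within rounding (18.7 vs. 18.8 k.y. and 72.9 vs. 72.9 k.y.), which illustrates why the plus-side uncertainty is so much larger than the minus-side one for a skewed, heavy-tailed distribution.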