Hydrograph separation methods for estimating recharge are not useful in all watersheds, but in this article, the hypothesis is tested that they should be reasonably accurate in humid montane settings where dissipation times for runoff are short. The study area is a montane carbonate/clastic Paleozoic aquifer in the Appalachian Mountains of West Virginia and Virginia. For 1 yr, groundwater discharge was continuously measured at three stream reaches; each has grossly similar climate, soils, and topography, but differing geology underlying the main stream channels and different stream orientations with respect to geologic structure. One station measured base flow in a strike-parallel stream of moderate length (13.4 km [8.3 mi]); the other two measured base flow in shorter strike-normal streams (2.7 km [1.7 mi]). Pronounced differences in apparent recharge rate were observed between the strike-normal and strike-parallel streams. The strike-parallel stream was similar in apparent recharge to 21 other regional streams, draining both large and small catchments under similar climatic and geologic conditions, whereas the strike-normal streams in the study area showed significantly lower apparent recharge than these. The precise location of stream losses, if any, could not be pinpointed despite detailed streamflow measurements. Although the low apparent recharge rates are primarily ascribed to stream losses, underflow, and/or hyporheic interactions, the evidence for these mechanisms was inconclusive. This case underscores that even in humid montane settings of moderate relief, complications in the groundwater budget may arise that make recharge estimates based on surface hydrograph analysis inaccurate.
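For readers unfamiliar with the general class of methods discussed above, the sketch below illustrates one widely used hydrograph separation technique, the Lyne-Hollick one-parameter recursive digital filter. This is offered only as a generic illustration of how base flow is partitioned from a streamflow record; it is not the specific method or parameterization used in this study, and the filter parameter value is an assumed common default.

```python
def baseflow_lyne_hollick(q, alpha=0.925):
    """Partition a streamflow series into base flow using the
    Lyne-Hollick recursive digital filter (single forward pass).

    q     : sequence of streamflow values (equal time steps)
    alpha : filter parameter; 0.925 is a commonly cited default
            (an assumption here, not a value from this study)

    Returns a list of base-flow values constrained to 0 <= b <= q.
    """
    quickflow = 0.0
    baseflow = []
    prev_q = q[0]
    for qk in q:
        # Recursive high-pass filter isolates the quickflow component.
        quickflow = alpha * quickflow + (1 + alpha) / 2 * (qk - prev_q)
        quickflow = max(quickflow, 0.0)  # quickflow cannot be negative
        # Base flow is the remainder, bounded by total streamflow.
        baseflow.append(min(max(qk - quickflow, 0.0), qk))
        prev_q = qk
    return baseflow
```

In practice the filter is often applied in multiple forward and backward passes and integrated over the year to yield an apparent recharge rate per unit catchment area; the assumption that this base-flow volume approximates recharge is exactly what the study tests.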