Abstract

The recycling of agricultural drainage water for irrigation is increasingly viewed as a desirable management practice in areas with limited options for drainage disposal. Modeling is potentially a cost-effective approach to examining design and management options for drainage reuse systems, but questions exist about the accuracy of simulated root water uptake in dynamic, highly saline conditions such as those encountered in reuse operations. This study compares HYDRUS-1D simulations of root water uptake and drainage with lysimeter data collected during an experiment in which forage crops (alfalfa [Medicago sativa L.] and tall wheatgrass [Agropyron elongatum (Host) P. Beauv.]) were irrigated with synthetic drainage waters. A trial-and-error fitting procedure was used to determine uptake reduction parameters for each crop. Good agreement between the model simulations and the data was achieved, a noteworthy result given the broad range of experimental conditions considered: irrigation waters with salinities ranging from 2.5 to 28 dS m−1 and irrigation rates ranging from deficit to luxurious. The approximations required to derive uptake reduction parameters from published salt tolerance data are examined, and simulations are presented that attempt to account for the uncertainty in the derived uptake reduction functions. Overall, we conclude that the general modeling approach captures many essential features of root water uptake under stressed conditions and may be useful in designing and analyzing reuse operations.
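The uptake reduction functions discussed above can take several standard forms. As a minimal illustration, the sketch below implements two common salinity stress response functions of the kind used in HYDRUS-1D: a piecewise-linear (threshold-slope) Maas-Hoffman form and a van Genuchten-style S-shaped form. All parameter values here are illustrative placeholders, not the fitted values reported in this study, and the functions are expressed directly in terms of salinity (dS m−1) for simplicity rather than osmotic head.

```python
# Illustrative salinity stress response functions (reduction factors in [0, 1])
# applied multiplicatively to potential root water uptake. Parameter values
# are hypothetical examples, NOT the parameters fitted in this study.

def maas_hoffman(ec, ec_threshold=2.0, slope=0.073):
    """Threshold-slope (Maas-Hoffman type) reduction factor.

    ec           -- salinity of the soil water (dS/m)
    ec_threshold -- salinity below which uptake is unreduced (dS/m)
    slope        -- fractional reduction per dS/m above the threshold
    """
    if ec <= ec_threshold:
        return 1.0
    return max(0.0, 1.0 - slope * (ec - ec_threshold))

def s_shaped(ec, ec50=10.0, p=3.0):
    """van Genuchten-style S-shaped reduction factor.

    ec50 -- salinity at which uptake is reduced by half (dS/m)
    p    -- exponent controlling the steepness of the response
    """
    return 1.0 / (1.0 + (ec / ec50) ** p)
```

In a trial-and-error fitting procedure such as the one described in the abstract, parameters like `ec50` and `p` (or the threshold and slope) would be adjusted until simulated uptake and drainage match the lysimeter observations.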
