Abstract

A model for simulating the decrease in albedo of melting prairie snow covers is presented. Its application for calculating net radiation and establishing the time of melt is demonstrated. It is expected that the routine will find use in operational systems for synthesizing and forecasting streamflow runoff from snowmelt.

The model is based on point and areal observations of incoming and reflected global radiation taken from February 1 to the end of ablation of the seasonal snow cover, over a 14 year period, in the open grassland area of western Canada. For complete snow covers not subject to frequent melt events, the albedo-depletion curve is approximated by three line segments of constant slope describing the periods of (1) premelt—the months preceding the occurrence of "active" melt; (2) melt—the period of rapid ablation that leads to the disappearance of the seasonal snow cover; and (3) postmelt—the days following melt.

An algorithm of the model is developed, and procedures for defining the start of melt and albedo depletion from daily inputs of net radiation, maximum air temperature, and snow-cover and snowfall depths are described. Data are presented that demonstrate close agreement between simulated and measured albedo-depletion curves for "deep" (depth > 25 cm) and "shallow" (depth ≤ 25 cm) snow covers. The mean difference between simulated and measured albedo on 74 days of melt over 7 years of record was calculated to be −0.0007, with a standard deviation of 0.17.

The model is applied to calculate daily net radiation during the melt period. This analysis makes use of an empirical relationship to estimate net radiation from the clear-sky insolation, sunshine hours, and simulated albedo. Comparison of the differences between simulated and measured values for 62 days of melt gave a mean and standard deviation of 0.49 and 2.05 MJ/(m²·d), respectively.
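The three-segment albedo-depletion curve described above can be sketched as a piecewise-linear function. This is an illustrative reconstruction only: the slopes, the initial albedo `a0`, and the minimum albedo `a_min` below are placeholder assumptions, not values taken from the article.

```python
# Hypothetical sketch of a three-segment albedo-depletion curve
# (premelt, melt, postmelt). All numeric parameters are illustrative
# assumptions, not coefficients from the article.

PREMELT_SLOPE = -0.001   # assumed slow daily decay before active melt
MELT_SLOPE = -0.071      # assumed rapid daily decay during active melt
POSTMELT_SLOPE = -0.02   # assumed decay after the snow cover disappears

def albedo(day, melt_start, melt_end, a0=0.9, a_min=0.17):
    """Piecewise-linear albedo for `day` (days since Feb 1).

    melt_start / melt_end bound the period of "active" melt; the curve
    is clamped at a_min, a nominal bare-ground albedo.
    """
    if day <= melt_start:
        a = a0 + PREMELT_SLOPE * day
    elif day <= melt_end:
        a = (a0 + PREMELT_SLOPE * melt_start
             + MELT_SLOPE * (day - melt_start))
    else:
        a = (a0 + PREMELT_SLOPE * melt_start
             + MELT_SLOPE * (melt_end - melt_start)
             + POSTMELT_SLOPE * (day - melt_end))
    return max(a, a_min)
```

Under these assumed parameters, albedo stays near 0.9 through premelt, drops steeply once active melt begins, and levels off at the bare-ground value after the snow cover disappears.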
