For ambient noise seismic interferometry, a long time series is typically used to measure surface-wave dispersion, and it is preferable to measure dispersion over a broad period range. The reliability of such measurements is often assessed using the signal-to-noise ratio (S/N) of the crosscorrelation function (CCF). Although many studies have shown that the S/N evolves as the length of the time series increases, the conditions required for reliable measurements remain unclear. We maximized the period range suitable for dispersion measurements by examining variations in the amplitudes of the signal and noise of CCFs. To preserve the broadband amplitude information, we apply neither filtering in the frequency domain nor signal normalization in the time domain. The preserved signals and the trailing-noise levels of the CCFs exhibit different time-varying features that agree with theoretical predictions for amplitudes: as the duration of the crosscorrelated time series increases, the signal amplitude remains constant while the trailing noise decreases. Moreover, the trailing noise exhibits a power-law dependence on period. The period range in which the maximum CCF amplitude exceeds the level expected from this power law coincides with the period range in which dispersion can be measured reliably with frequency-time analysis (FTAN). This approach quantitatively determines the optimal period range for dispersion measurements. Results obtained with this method indicate that long-duration records used for crosscorrelation not only yield high S/Ns but also broaden the period range over which dispersion can be measured.
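The two behaviors described above (a signal amplitude that stays constant under stacking while the trailing noise shrinks, and a power-law model for the trailing-noise level) can be illustrated with a minimal synthetic sketch. This is not the authors' implementation: the function names, window parameters, velocities, and noise model below are illustrative assumptions.

```python
import numpy as np

def snr_windows(ccf, dt, dist, vmin=2.0, vmax=5.0, gap=100.0, noise_len=200.0):
    """Peak amplitude in the surface-wave signal window (arrivals between
    dist/vmax and dist/vmin) and RMS of a trailing-noise window starting
    `gap` seconds after the signal window ends. All parameters are
    illustrative choices, not values from the paper."""
    t = np.arange(len(ccf)) * dt
    sig = (t >= dist / vmax) & (t <= dist / vmin)
    t0 = dist / vmin + gap
    noi = (t >= t0) & (t <= t0 + noise_len)
    return np.abs(ccf[sig]).max(), np.sqrt(np.mean(ccf[noi] ** 2))

def powerlaw_fit(periods, noise):
    """Least-squares fit of noise ~ A * T**b in log-log space; comparing the
    measured maximum CCF amplitude against this fitted level is one way to
    delimit a usable period range."""
    b, log_a = np.polyfit(np.log(periods), np.log(noise), 1)
    return np.exp(log_a), b

# Synthetic one-sided CCF: a Gaussian "surface-wave" pulse arriving at
# 3 km/s over a 600 km path, plus incoherent noise whose RMS shrinks as
# 1/sqrt(number of stacked daily CCFs) -- the coherent signal does not.
rng = np.random.default_rng(0)
dt, dist = 1.0, 600.0
t = np.arange(2048) * dt
pulse = np.exp(-0.5 * ((t - dist / 3.0) / 10.0) ** 2)

for nstack in (1, 16, 256):
    ccf = pulse + 0.5 * rng.standard_normal(t.size) / np.sqrt(nstack)
    s, n = snr_windows(ccf, dt, dist)
    print(f"stacks={nstack:4d}  signal={s:.2f}  trailing noise={n:.3f}")
```

Running the loop shows the signal amplitude hovering near a constant value while the trailing-noise RMS falls roughly as the square root of the number of stacks, mirroring the amplitude behavior the abstract reports.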