The direction of theoretical seismology today is toward the realistic description of ground motion in the time domain. The adequacy of theoretical models is tested by comparing recorded ground motions with theoretically predicted instrumental records. Such a comparison requires that the response of the seismograph system to an impulse in ground displacement be known, which in turn requires knowledge of the system's amplitude and phase response. This paper reviews calibration techniques currently in use and indicates their deficiencies. A multistep procedure is developed in which the Hilbert transform relating the phase response to the amplitude response of a minimum-phase causal system is used together with experimental band-limited data for the phase and amplitude response of a seismograph system. Because assumptions must be made about the amplitude response outside the range for which data are available, an additional step is proposed: the computed frequency response is used to generate the theoretical system response to a step in acceleration, which can then be compared with actual calibration pulses. Using this approach, the impulse response of the SRO long-period system is estimated.
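The Hilbert-transform relation between amplitude and phase for a minimum-phase causal system can be sketched numerically. The following is a minimal illustration only, not the paper's actual procedure (which works with band-limited experimental data); it uses the standard real-cepstrum (homomorphic) form of the relation to recover the phase of a minimum-phase response from its magnitude alone, and checks the result against a filter known to be minimum phase.

```python
import numpy as np

def minimum_phase_response(mag):
    """Reconstruct the minimum-phase frequency response whose magnitude
    is `mag` on a full N-point DFT grid, via the real-cepstrum
    (homomorphic) form of the Hilbert-transform amplitude-phase relation."""
    mag = np.asarray(mag, dtype=float)
    N = mag.size
    # Real cepstrum of the log magnitude.
    c = np.fft.ifft(np.log(mag)).real
    # Fold the cepstrum: keep c[0] (and c[N/2] for even N), double the
    # causal part, zero the anticausal part.  This imposes minimum phase.
    w = np.zeros(N)
    w[0] = 1.0
    w[1:N // 2] = 2.0
    if N % 2 == 0:
        w[N // 2] = 1.0
    return np.exp(np.fft.fft(c * w))

# Sanity check with a filter that is already minimum phase
# (single zero at z = 0.5, inside the unit circle): the phase
# reconstructed from the magnitude alone should match the true phase.
N = 512
h = np.array([1.0, -0.5])
H = np.fft.fft(h, N)
H_min = minimum_phase_response(np.abs(H))
print(np.allclose(H_min, H, atol=1e-8))  # → True
```

Note that this form assumes the magnitude is known on the full frequency axis; the band-limited nature of real calibration data is precisely why the paper adds the further step of comparing computed step-in-acceleration responses against actual calibration pulses.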