The effect of noise on response spectra is quantitatively estimated by simulating earthquake signals and ambient noise separately. The distortion is measured for three types of response spectra: absolute acceleration response spectra (AAs), relative velocity response spectra (RVs), and relative displacement response spectra (RDs). For all three types, the distortion correlates well with the root-mean-square (rms) signal-to-noise (S/N) ratio. The distortion of AAs is almost independent of earthquake magnitude, whereas that of RDs increases with decreasing magnitude; the distortion of RVs falls between the two. Based on these observations, it is concluded that, when response spectra are calculated from earthquake time histories, the distortion of AAs can be predicted simply by measuring the rms S/N ratio. For RDs, a combination of the rms S/N ratio and the earthquake magnitude is needed; for RVs, the combination is not strictly required.
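The quantities discussed above can be illustrated with a minimal sketch. The code below is not the authors' procedure; it is a generic, assumed implementation that (a) computes the rms S/N ratio of a synthetic signal and noise trace, and (b) extracts the three spectral ordinates (RD, RV, AA) for one oscillator period by time-stepping a damped single-degree-of-freedom oscillator with a simple semi-implicit Euler scheme. All signal parameters (sampling step, period, damping, amplitudes) are illustrative choices.

```python
import numpy as np

def rms(x):
    """Root mean square of a time series."""
    return np.sqrt(np.mean(np.asarray(x, dtype=float) ** 2))

def sdof_response(acc, dt, period, damping=0.05):
    """Peak response of a damped SDOF oscillator to base acceleration `acc`.

    Returns (RD, RV, AA): peak relative displacement, peak relative
    velocity, and peak absolute acceleration. Integration uses a
    semi-implicit Euler scheme, adequate when dt << period.
    """
    wn = 2.0 * np.pi / period          # natural circular frequency
    u = v = 0.0                        # relative displacement, velocity
    rd = rv = aa = 0.0
    for ag in acc:
        # relative acceleration from the equation of motion:
        # u'' + 2*z*wn*u' + wn^2*u = -ag
        a_rel = -ag - 2.0 * damping * wn * v - wn ** 2 * u
        v += a_rel * dt
        u += v * dt
        rd = max(rd, abs(u))
        rv = max(rv, abs(v))
        aa = max(aa, abs(a_rel + ag))  # absolute = relative + ground
    return rd, rv, aa

# Synthetic example: a sine burst as "earthquake" and white noise.
rng = np.random.default_rng(0)
dt = 0.01
t = np.arange(0.0, 20.0, dt)
signal = np.sin(2.0 * np.pi * 1.0 * t) * np.exp(-0.2 * t)  # decaying burst
noise = 0.1 * rng.standard_normal(t.size)

snr_rms = rms(signal) / rms(noise)                 # rms S/N ratio
rd, rv, aa = sdof_response(signal + noise, dt, period=1.0)
print(f"rms S/N = {snr_rms:.2f}, RD = {rd:.3f}, RV = {rv:.3f}, AA = {aa:.3f}")
```

In a distortion study of this kind, one would compare the ordinates of the noisy record (`signal + noise`) against those of the clean `signal` across many periods and noise levels, and relate the discrepancy to `snr_rms`.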