We present a methodology for detecting small, impulsive signal transients in time-frequency spectrograms that is closely related to the emerging field of scan statistics. In local monitoring situations, single-channel detection of small explosions can be difficult because of the complicated nature of the local noise field. Small, impulsive signals appear as vertical stripes on spectrograms and are enhanced in grayscale representations using vertical detection masks. Bitmap images are formed in which pixels above a defined threshold are set to one. A short-duration, large-bandwidth signal produces a large number of illuminated bits in the column corresponding to its arrival time. We form the marginal distribution of bit counts as a function of time, n_i, by summing columnwise over frequency. For each time window we perform a hypothesis test, H0: signal plus noise, by defining the probability model expected when a signal is present. This model is Bernoulli for signal versus no signal, with probability of signal ρ1. We assume that n_i follows a binomial distribution and compute a probability of detection (expressed as a p value) for a given ρ1. We apply the spectrogram detector to 1 hr of single-channel acoustic data containing a signal from a 1 lb chemical surface explosion recorded at 3.1 km distance and compare its performance with that of a short-term average to long-term average (STA/LTA) detector. Both detectors are optimized through a grid search and successfully detect the acoustic arrival from the 1 lb explosion; however, STA/LTA produces 70% more false detections than the spectrogram detector. At great range, attenuation properties of the earth reduce the effectiveness of the spectrogram detector relative to STA/LTA. Data fusion techniques using multiple channels from a network are shown to reduce the number of false detections.
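The processing chain described above (spectrogram, thresholded bitmap, columnwise bit counts n_i, binomial test against the signal-present model) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the window length, threshold quantile, ρ1, and p-value cutoff are all assumed placeholder values, and the decision rule (flagging columns whose lower-tail binomial p value is large enough to be consistent with "signal present") is one plausible reading of the test described in the abstract.

```python
import numpy as np
from scipy import signal, stats


def spectrogram_detector(x, fs, nperseg=256, bit_quantile=0.9,
                         rho1=0.5, p_cutoff=1e-3):
    """Sketch of the spectrogram bit-count detector.

    All parameter defaults here are illustrative assumptions,
    not values tuned in the paper.
    """
    # Time-frequency spectrogram of the single-channel record.
    f, t, Sxx = signal.spectrogram(x, fs=fs, nperseg=nperseg)

    # Bitmap image: pixels above a defined threshold are set to one.
    thresh = np.quantile(Sxx, bit_quantile)
    bits = (Sxx > thresh).astype(int)

    # Marginal bit counts n_i: sum columnwise over frequency.
    n_i = bits.sum(axis=0)
    n_freq = bits.shape[0]  # frequency bins per time column

    # Under the signal-present (H0) model each pixel is Bernoulli(rho1),
    # so n_i ~ Binomial(n_freq, rho1). The lower-tail p value
    # P(X <= n_i) is small when the column has too few illuminated
    # bits to be consistent with a signal being present.
    p = stats.binom.cdf(n_i, n_freq, rho1)

    # Columns consistent with "signal present" under this reading.
    detections = t[p > p_cutoff]
    return t, n_i, p, detections
```

A broadband impulsive transient fills its spectrogram column with illuminated bits, pushing n_i toward n_freq·ρ1 and above, so its p value survives the cutoff while quiet noise columns do not.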