Coherent interferometry is an array-imaging method in which we back-propagate, or migrate, cross-correlations of the traces over appropriately chosen space-time windows rather than the traces themselves. The size of the space-time windows is critical and depends on two parameters. One is the decoherence frequency, which is proportional to the reciprocal of the delay spread in the traces produced by the clutter. The other is the decoherence length, which also depends on the clutter. As is usual, the clutter is modeled by random fluctuations in the medium properties. In isotropic clutter, the decoherence length is typically much shorter than the array aperture. In layered random media, the decoherence length along the layers can be quite long. We show that when the cross-correlations of the traces are calculated adaptively, coherent interferometry can provide images that are statistically stable relative to small-scale clutter in the environment. This means that the images we obtain are not sensitive to the detailed form of the clutter; they depend only on its overall statistical properties. However, clutter does reduce the resolution of the images by blurring. We show how the amount of blurring can be minimized by using adaptive interferometric imaging algorithms, and we discuss the relation between the coherence properties of the array data and the loss in resolution caused by the blurring.
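The imaging principle described above can be sketched numerically. The toy model below is not the paper's algorithm: it assumes a homogeneous background, a single point source, and mimics clutter by a small random arrival-time jitter per receiver; the decoherence parameters `Omega_d` and `X_d`, the array geometry, and the pulse are all illustrative assumptions. The sketch forms an image by evaluating smoothed cross-correlations of nearby traces at the travel-time differences of each trial point, i.e. by migrating windowed cross-correlations rather than the traces themselves.

```python
import numpy as np

# --- illustrative setup (all parameters are assumptions for this sketch) ---
rng = np.random.default_rng(0)
c = 1.0                                        # homogeneous background wave speed
nr = 11
receivers = np.stack([np.linspace(-2.0, 2.0, nr), np.zeros(nr)], axis=1)
target = np.array([0.3, 5.0])                  # true (cross-range, range) location
nt, dt, f0 = 512, 0.02, 3.0                    # samples, time step, pulse center frequency
t = np.arange(nt) * dt

def travel_time(xr, y):
    return np.linalg.norm(xr - y) / c

def pulse(s):                                  # Gaussian-modulated cosine pulse
    return np.cos(2 * np.pi * f0 * s) * np.exp(-72.0 * s**2)

# Synthetic traces: clutter is mimicked by a small random delay per receiver,
# a crude stand-in for the delay spread a random medium produces.
jitter = rng.normal(0.0, 0.02, nr)
traces = np.array([pulse(t - travel_time(receivers[r], target) - jitter[r])
                   for r in range(nr)])

def xcorr(a, b):
    """Cross-correlation C(lag) = sum_t a(t+lag) b(t), lags -(nt-1)..(nt-1)."""
    A = np.fft.rfft(a, 2 * nt)
    B = np.fft.rfft(b, 2 * nt)
    cc = np.fft.irfft(A * np.conj(B), 2 * nt)
    return np.concatenate([cc[nt + 1:], cc[:nt]])

def cint_image(traces, points, X_d=1.0, Omega_d=10.0):
    """Migrate smoothed cross-correlations of nearby traces to each trial point."""
    win = max(1, int(round(1.0 / (Omega_d * dt))))      # lag window ~ 1 / Omega_d
    kernel = np.ones(win) / win
    # keep only receiver pairs closer than the decoherence length X_d
    pairs = {(r, s): np.convolve(xcorr(traces[r], traces[s]), kernel, mode="same")
             for r in range(nr) for s in range(nr)
             if abs(receivers[r, 0] - receivers[s, 0]) <= X_d}
    lags = np.arange(-(nt - 1), nt) * dt
    img = np.zeros(len(points))
    for k, y in enumerate(points):
        tau = np.array([travel_time(xr, y) for xr in receivers])
        # evaluate each pair's correlation at the travel-time difference to y
        img[k] = sum(np.interp(tau[r] - tau[s], lags, C)
                     for (r, s), C in pairs.items())
    return img

xs = np.linspace(-0.5, 1.1, 17)                # cross-range search at the true range
img = cint_image(traces, [np.array([x, 5.0]) for x in xs])
print("estimated cross-range:", xs[np.argmax(img)])
```

Shrinking `X_d` or `Omega_d` widens the smoothing and pair restriction, which stabilizes the image against the jitter at the cost of resolution, in line with the trade-off discussed in the abstract.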