The partitioning interwell tracer test (PITT) was introduced into the practice of contaminant hydrogeology in 1994, since which time about 50 PITTs have been conducted in the field. Confirmation of the results of vadose-zone and ground-water zone PITTs by subsequent field testing of aquifers in Utah and New Mexico has demonstrated the reliability of the technique for operational purposes, i.e., the measurement of average inter-well NAPL (non-aqueous phase liquid) saturations that provide meaningful values upon which to base remedial decisions. However, use of the PITT has become increasingly infrequent, partly because it has attracted criticism concerning its ability to quantify DNAPL (dense non-aqueous-phase liquid) either in low-permeability units surrounded by high-permeability materials or present in depressions at the base of aquifers. In the vadose zone, where gaseous partitioning tracers are used, the criticism maintains that the partitioning tracers will necessarily follow cleaned pathways created during soil vapor extraction and will thus bypass DNAPL zones. In the ground-water zone, the criticism is that the PITT fails to detect DNAPL in low-permeability lenses either in the middle or at the base of aquifers, again because of bypassing. This paper presents numerical simulations of two laboratory experiments demonstrating that these criticisms are incorrect and that the PITT accurately detects residual NAPL volume when the test is properly designed and implemented in the field.
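The "average inter-well NAPL saturation" a PITT reports is conventionally estimated from the retardation of a partitioning tracer relative to a conservative tracer. The following sketch illustrates that standard first-moment calculation; the function name and numerical values are illustrative assumptions, not data from this paper.

```python
# Standard PITT interpretation: the partitioning tracer travels more
# slowly than the conservative tracer because it dissolves reversibly
# into the NAPL. With mean travel times t_p (partitioning) and t_c
# (conservative), the retardation factor is R = t_p / t_c, and the
# average NAPL saturation in the swept volume is
#     S_N = (R - 1) / (R - 1 + K_N),
# where K_N is the NAPL-water (or NAPL-gas) partition coefficient.

def napl_saturation(t_partitioning: float, t_conservative: float,
                    partition_coefficient: float) -> float:
    """Estimate average NAPL saturation from tracer first-moment travel times."""
    retardation = t_partitioning / t_conservative
    return (retardation - 1.0) / (retardation - 1.0 + partition_coefficient)

# Illustrative values: a partitioning tracer arriving 1.5 times later
# than the conservative tracer, with K_N = 30, implies S_N of about 1.6%.
s_n = napl_saturation(t_partitioning=150.0, t_conservative=100.0,
                      partition_coefficient=30.0)
print(f"Estimated average NAPL saturation: {s_n:.4f}")
```

Because the estimate depends only on the ratio of travel times, any flow path that both tracers share cancels out; the bypassing criticisms addressed in this paper concern paths the partitioning tracer never samples at all.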