Control of analytical data quality is usually referred to in the mining industry as Quality Assurance and Quality Control (QAQC), and involves the monitoring of sample quality and the quantification of analytical accuracy and precision. QAQC procedures normally involve sample duplicates and specially prepared standards whose grades are known. Numerous case studies indicate that reliable control of sample precision is achieved when field duplicates constitute approximately 5% to 10% of the samples and pulp duplicates 3% to 5%. These duplicate samples should be prepared and analyzed in the primary laboratory.
Bias in the analytical results can be identified by including standards at a rate of 3% to 5% in each sample batch. Several different standards should be used, with values spanning the practical range of grades in the actual samples. A blank (a sample in which the concentration of the metal of interest is below the detection limit) should also be included. Standards alone cannot identify biases introduced during sample preparation, and therefore approximately 5% of the duplicate samples (coarse rejects and pulps) should be processed and assayed at another, external, reputable laboratory.
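The batch-level bias check described above amounts to comparing the mean of repeated assays of a standard against its certified value. A minimal sketch, using an assumed certified grade and illustrative assay values (not data from this study):

```python
# Sketch of a bias check using a certified reference material (CRM).
# The certified value and assay results below are illustrative assumptions.

certified_value = 2.50  # certified Au grade of the CRM, g/t (assumed)
crm_assays = [2.47, 2.53, 2.55, 2.44, 2.58, 2.49, 2.52, 2.61]  # CRM assays returned in routine batches

# Relative bias: deviation of the mean assay from the certified value, in percent.
mean_assay = sum(crm_assays) / len(crm_assays)
bias_pct = 100.0 * (mean_assay - certified_value) / certified_value

print(f"mean assay: {mean_assay:.3f} g/t, relative bias: {bias_pct:+.2f}%")
```

In practice each standard is also plotted on a control chart against its certified value with tolerance limits, so that drifting or stepwise bias is caught batch by batch rather than only in aggregate.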
This paper discusses techniques used for the estimation of precision and accuracy errors, and reviews the relevant diagnostic tools. It is shown that one of the most commonly used methods, the Thompson-Howarth technique, produces consistently lower estimates of precision error than other methods. This reflects the nature of the method, which relies on the assumption of normally distributed errors and thus produces biased results when the errors have a skewed distribution. This study concurs with the suggestion of Stanley and Lawie (2007: Exploration and Mining Geology, v. 16, p. 265–274) to use the average coefficient of variation (CVAVR(%)) as the universal measure of relative precision error in mine geology applications:

CVAVR(%) = 100 × sqrt( (1/N) Σ [ 2(a_i − b_i)² / (a_i + b_i)² ] ),

where a_i and b_i are the assays of the i-th of N duplicate pairs.
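The CVAVR(%) statistic can be sketched directly from its definition; the duplicate pairs below are illustrative values, not data from the paper:

```python
import math

def cv_avr(pairs):
    """Average coefficient of variation (%) from paired duplicate assays:
    CVAVR(%) = 100 * sqrt( (1/N) * sum( 2*(a - b)**2 / (a + b)**2 ) ),
    where each pair (a, b) is an original assay and its duplicate."""
    n = len(pairs)
    s = sum(2.0 * (a - b) ** 2 / (a + b) ** 2 for a, b in pairs)
    return 100.0 * math.sqrt(s / n)

# Illustrative duplicate pairs (original assay, duplicate assay), e.g. g/t Au.
pairs = [(1.20, 1.10), (0.85, 0.90), (2.40, 2.20), (0.50, 0.55)]
print(f"CVAVR = {cv_avr(pairs):.2f}%")
```

Because each pair contributes its own relative variance, the statistic remains meaningful across the full grade range of the duplicates, unlike methods that presuppose a normally distributed error.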
Based on case studies, acceptable levels of sample precision are proposed for several different deposit types.