Suppose that we have measurements of the vertical intensity of gravity or the magnetic field at n points; we wish to estimate a second derivative or the like at a specified point. If there were no experimental error, we could interpolate and differentiate well enough with a polynomial of degree k-1 < n-1. What shall we do when experimental error is present? The minimum-variance principle says: adjust the n-k arbitrary constants of the polynomial formula so as to minimize the mean square error (variance) of the estimated quantity. By applying this principle directly, we can get "best" estimation formulas of specified degree k. We can also get "best" formulas by a least-squares method; it gives the same results as the minimum-variance method, by a theorem of mathematical statistics. The theorem furthermore enables us to estimate the variance and to choose a satisfactory value of k. For shortening the calculations, orthogonal polynomials are useful; tables of them are available in several places, and tables of their derivatives are given here. The methods are illustrated by applying them to several made-up examples of data with known random errors.
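The least-squares route described above can be sketched as follows. This is a minimal illustration, not the paper's own formulas: the profile, noise level, and point count are assumed for the example. It fits a polynomial with k coefficients (degree k-1 < n-1) to n noisy measurements, differentiates the fit to estimate a second derivative at a chosen point, and uses the residual sum of squares over n-k degrees of freedom to estimate the error variance, the quantity one would inspect when choosing a satisfactory k.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 11                          # number of measurement points
x = np.linspace(-1.0, 1.0, n)   # measurement locations
true = lambda t: t**3 - 2.0 * t**2 + t   # hypothetical error-free profile
sigma = 0.01                    # standard deviation of the random errors
y = true(x) + rng.normal(0.0, sigma, n)  # measurements with known errors

k = 4                                    # k coefficients -> degree k-1 fit
coeffs = np.polyfit(x, y, k - 1)         # least-squares polynomial fit
fit = np.poly1d(coeffs)

# Residual variance estimate: sum of squared residuals divided by the
# n - k remaining degrees of freedom.
s2 = np.sum((y - fit(x)) ** 2) / (n - k)

x0 = 0.0
estimate = fit.deriv(2)(x0)     # second derivative of the fit at x0
# True second derivative of t^3 - 2t^2 + t at t = 0 is -4.
print(estimate, s2)
```

Because the error-free profile here is itself a cubic, the degree-3 fit recovers the true second derivative up to the random errors; with a profile of higher degree, the choice of k would trade truncation bias against noise variance, which is where the residual-variance estimate earns its keep.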