

If an inverse problem can be represented with the explicit linear equation d = Gm, it is said to be LINEAR. If a perfect (or exact) relationship exists between the observations d and the model parameters m, then we can use very simple procedures to invert our measurements for m. In many practical situations, the observed data may not all lie on a straight line (see Fig. 4.1.1). If we decide to fit a line to these data, the fitted line may be some appreciable distance away from some data values. For a collection of n data pairs {(x1, y1), (x2, y2), …, (xn, yn)}, the fitted line (known as the REGRESSION line) is described by the equation

yi = a + bxi + ei,  i = 1, 2, …, n,
where ei is the vertical distance between the ith data point and the regression line (Fig. 4.1.1). The quantity ei is called the RESIDUAL, MISFIT or prediction ERROR. The solution to the straight line inverse problem in this case is not an exact solution since the relation yi = a + bxi cannot be satisfied for every i and the problem is also overdetermined. This type of problem is generally solved using the LEAST SQUARES method.
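As a concrete illustration of this definition, the residuals of a candidate line can be computed directly as the vertical distance of each point from the line; the values of a, b and the data pairs in this sketch are hypothetical.

```python
# Residual of each data point with respect to a candidate line y = a + b*x:
#   e_i = y_i - (a + b*x_i),
# i.e. the vertical distance between the i-th point and the line.
# The line parameters and data below are made up for illustration.
a, b = 1.0, 2.0
x = [0.0, 1.0, 2.0]
y = [1.2, 2.8, 5.1]
e = [yi - (a + b * xi) for xi, yi in zip(x, y)]
```

A nonzero e signals that the relation yi = a + bxi cannot be satisfied exactly for every i, which is what makes the least squares treatment below necessary.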

In the least squares method we try to MINIMIZE the error e by determining those parameters a, b such that the sum of squares of the errors (S) is minimal, i.e.,

S = Σ ei² = Σ (yi − a − bxi)²,

where the sums run over i = 1 to n.
Minimization is accomplished by differentiating S with respect to the model parameters and setting the derivatives equal to zero.
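The two conditions ∂S/∂a = 0 and ∂S/∂b = 0 yield a pair of linear equations (the normal equations) whose closed-form solution can be sketched as follows; the data points used here are hypothetical.

```python
# A minimal sketch of fitting the regression line y = a + b*x by least
# squares. Differentiating S = sum((y_i - a - b*x_i)^2) with respect to
# a and b and setting the derivatives to zero gives the normal equations,
# solved in closed form below. The data points are made up for illustration.

def fit_line(x, y):
    """Return intercept a and slope b minimizing the sum of squared residuals."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    # From dS/db = 0:  b = (n*Sxy - Sx*Sy) / (n*Sxx - Sx^2)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    # From dS/da = 0:  a = (Sy - b*Sx) / n
    # (equivalently, the fitted line passes through the mean point).
    a = (sy - b * sx) / n
    return a, b

x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [1.1, 2.9, 5.2, 6.8, 9.1]
a, b = fit_line(x, y)
```

When the data lie exactly on a line, the residuals vanish and the formulas recover that line exactly; with scattered data they return the line of minimum S.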

Recall that y = (a + bx) + e. Assuming that
