The dipmeter is a borehole tool used to estimate the dip direction and magnitude of subsurface bedding. Conventional computer dipmeter analysis relies on cross-correlation techniques, which can suffer from cycle skipping. As an alternative, a Monte Carlo approach has been developed, based on a method successfully applied to determining seismic residual statics.

The technique, which can process either raw or derivative dipmeter data, searches for the best-fit plane through a depth interval of the data; this plane approximates a bedding interface intersecting the borehole. Operating on a parameterized forward model, the statistical, iterative algorithm randomly perturbs a local depth shift applied to a dipmeter trace, then evaluates how well that perturbation fits the data by locally stacking the data interval. Random perturbations that improve the solution are always accepted as parameter updates; perturbations that degrade the solution are not necessarily rejected. Occasionally these bad guesses are retained, which gives the algorithm the ability to escape local minima.

Setting the proper initial acceptance-to-rejection ratio for bad guesses is important for successful convergence. The ratio is initially large and is decreased substantially over many iterations. If the initial ratio is too high, convergence may not occur within reasonable CPU time; if it is too low, the algorithm may converge to a local minimum. Testing showed that only a narrow range of initial bad-guess acceptance-to-rejection ratios led to efficient global optimization.

For both synthetic and real dipmeter data, processing differentiated dipmeter data gave better results than processing raw dipmeter data.
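The iterative scheme described above — randomly perturbing one trace's depth shift, scoring the fit by the power of the locally stacked interval, always accepting improvements, and occasionally accepting degradations with a probability that shrinks over the run — can be sketched as a simulated-annealing loop. The sketch below is illustrative, not the authors' implementation: the trace construction, circular shifting, and all parameter values (`t0`, `cooling`, `iters_per_temp`, `t_min`) are assumptions.

```python
import math
import random

def stack_power(traces, shifts):
    """Objective: power of the stack formed by shifting each trace and summing.
    Higher power means the traces (and thus the bedding event) are better aligned."""
    n = len(traces[0])
    power = 0.0
    for i in range(n):
        s = sum(tr[(i + sh) % n] for tr, sh in zip(traces, shifts))
        power += s * s
    return power

def anneal_shifts(traces, t0=10.0, cooling=0.9, iters_per_temp=300,
                  t_min=0.01, seed=0):
    """Simulated-annealing search for per-trace depth shifts (a sketch of the
    Monte Carlo approach described in the text; parameter values are assumptions)."""
    rng = random.Random(seed)
    n_tr = len(traces)
    shifts = [0] * n_tr
    e = stack_power(traces, shifts)
    best, e_best = list(shifts), e
    t = t0
    while t > t_min:
        for _ in range(iters_per_temp):
            k = rng.randrange(n_tr)                 # pick one trace at random
            old = shifts[k]
            shifts[k] = old + rng.choice([-1, 1])   # perturb its depth shift
            e_new = stack_power(traces, shifts)
            # Always accept improvements; occasionally accept degradations
            # (Metropolis rule), so the search can escape local minima.
            if e_new >= e or rng.random() < math.exp((e_new - e) / t):
                e = e_new
                if e > e_best:
                    e_best, best = e, list(shifts)
            else:
                shifts[k] = old                     # reject: restore old shift
        t *= cooling  # lower the "temperature", shrinking the bad-guess acceptance ratio
    return best, e_best

# Demo: three traces that are circularly shifted copies of a simple pulse.
n = 32
base = [0.0] * n
base[5], base[6], base[7] = 1.0, 2.0, 1.0
true_delays = [0, 2, 5]
traces = [[base[(i - d) % n] for i in range(n)] for d in true_delays]
best, e_best = anneal_shifts(traces)
```

The geometric cooling of `t` plays the role of the decreasing bad-guess acceptance-to-rejection ratio discussed in the text: at high `t` most degradations are accepted, while at low `t` the search becomes nearly greedy.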
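Once per-pad depth shifts are found, the best-fit plane through them yields the dip estimate. The helper below is a hypothetical illustration of that final step, assuming an idealized four-pad tool with pads at 0°, 90°, 180°, and 270° around a borehole of known radius; the function name, geometry, and units are assumptions, not the paper's formulation. For this symmetric geometry, the least-squares plane z = a·x + b·y + c reduces to a = (z₀ − z₂)/(2r) and b = (z₁ − z₃)/(2r).

```python
import math

def dip_from_pad_shifts(z, radius_m=0.1):
    """Estimate dip magnitude and azimuth from the relative depths z[0..3]
    (metres, positive downward) at which a bedding plane crosses four
    orthogonal dipmeter pads, pad k sitting at azimuth 90*k degrees on a
    borehole wall of radius radius_m. Illustrative geometry, not the
    paper's formulation."""
    # Least-squares plane z = a*x + b*y + c through the four pad points.
    a = (z[0] - z[2]) / (2.0 * radius_m)
    b = (z[1] - z[3]) / (2.0 * radius_m)
    # Dip magnitude is the tilt of the plane; dip azimuth points along the
    # gradient of depth, i.e. the direction in which the bed deepens fastest.
    dip_deg = math.degrees(math.atan(math.hypot(a, b)))
    azimuth_deg = math.degrees(math.atan2(b, a)) % 360.0
    return dip_deg, azimuth_deg

# Demo: bed 0.1 m deeper at pad 0 than at pad 2, flat along pads 1 and 3.
dip, azi = dip_from_pad_shifts([0.1, 0.0, -0.1, 0.0])
```

Here the azimuth is measured in the pad frame (from pad 0 toward pad 1); converting it to a compass bearing would require the tool's orientation measurement, which this sketch omits.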