An algorithm for estimating time delays and weights in an arbitrary single- or three-component seismic array is developed using a linearized waveform inversion technique. This algorithm differs from conventional cross-correlation methods in its ability to obtain time delays and weights simultaneously by minimizing the residuals of all possible waveform fits, and in its robustness to high random noise levels and local geological scattering. For an array of N stations, a beam is formed at each station as a weighted linear combination of the remaining (N - 1) seismic traces. The time delays and weights are the model parameters, found by minimizing the sum of N objective functions. Two optimization algorithms for solving the least-squares problem, singular-value decomposition and conjugate gradients, are compared; the conjugate gradient method proves satisfactory and faster for large arrays. The algorithm was tested on synthetic array data with high noise, real data from borehole shots recorded by a linear array on land, and Ms 6.7 earthquake data recorded by a broadband three-component array. The success with both synthetic and real data shows the algorithm to be useful for seismic data stacking, residual static corrections, and phase picking when data quality is poor.
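The leave-one-out beam misfit described above can be sketched numerically. This is a minimal illustration, not the paper's implementation: the function names, the Fourier-domain fractional shift, and the exact form of the residual (here, each aligned trace compared against the weighted average beam of the other N - 1 aligned traces) are assumptions for illustration; the paper linearizes this objective and solves the resulting least-squares system with SVD or conjugate gradients rather than evaluating it directly.

```python
import numpy as np

def shift(trace, delay, dt):
    # Apply a (possibly fractional) time delay via a Fourier phase shift.
    # Positive delay moves the trace later in time (circular wrap at edges).
    n = trace.size
    freqs = np.fft.rfftfreq(n, dt)
    return np.fft.irfft(np.fft.rfft(trace) * np.exp(-2j * np.pi * freqs * delay), n)

def objective(delays, weights, traces, dt):
    # Sum of N objective functions: for each station i, form a beam from
    # the other N - 1 delay-corrected traces (weighted linear combination)
    # and accumulate the squared residual against station i's aligned trace.
    n_sta = len(traces)
    aligned = np.array([shift(tr, -d, dt) for tr, d in zip(traces, delays)])
    total = 0.0
    for i in range(n_sta):
        mask = np.arange(n_sta) != i
        beam = np.average(aligned[mask], axis=0, weights=weights[mask])
        total += np.sum((aligned[i] - beam) ** 2)
    return total

# Hypothetical synthetic test: delayed copies of one wavelet. The true
# delays should drive the misfit to (numerically) zero.
dt = 0.01
t = np.arange(256) * dt
wavelet = np.exp(-((t - 1.28) ** 2) / (2 * 0.05 ** 2))
true_delays = np.array([0.0, 0.05, -0.03, 0.02])
traces = [shift(wavelet, d, dt) for d in true_delays]
weights = np.ones(4)
obj_true = objective(true_delays, weights, traces, dt)
obj_zero = objective(np.zeros(4), weights, traces, dt)
```

In practice the delays and weights would be found by feeding `objective` (or its linearization) to a least-squares solver; with noisy data the minimum is no longer exactly zero, which is where the weights help down-rank poor traces.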