We present a theoretical and experimental performance study of a method for time delay estimation (TDE) based on the signal integral (TDE-SI). The TDE-SI method defines the delay between two transient signals as the difference between the centers of mass of these signals. The method has three special cases: in the first, time is the mass coordinate and the signal s(t) is the mass distribution (estimate D̂_s); in the second, the squared signal s²(t) is the mass distribution (estimate D̂(s²)); the third is a variant of the second. The bias and the standard deviation (σ) of the estimate have been evaluated when the signal is contaminated by Gaussian white noise. D̂_s is unbiased, but the σ of its TDE is higher than that obtained with D̂(s²). Moreover, the D̂(s²) estimate is biased. A bias-corrected estimate, D̂′(s²), is presented; D̂′(s²) yields a lower σ of its TDE than the estimate D̂_s. Hence, D̂′(s²) is the most suitable of the three TDE-SI options for TDE. Theoretical estimations are validated by simulation results with artificially generated signals and by real QRS complex waves (ventricular activity) from an electrocardiographic (ECG) signal.
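The following is a minimal sketch of the center-of-mass idea behind the D̂_s and D̂(s²) estimates, assuming uniformly sampled signals; the bias correction that defines D̂′(s²) is specific to the paper and is not reproduced here. The function names and the Gaussian test pulse are illustrative, not the authors' implementation.

```python
import numpy as np

def center_of_mass(t, s, weight="signal"):
    """Centroid of a transient signal along the time axis.

    weight="signal"  -> mass distribution is s(t)   (estimate D̂_s)
    weight="squared" -> mass distribution is s²(t)  (estimate D̂(s²))
    """
    w = s if weight == "signal" else s**2
    return np.sum(t * w) / np.sum(w)

def tde_si(t, s1, s2, weight="signal"):
    """TDE-SI delay: difference between the two centers of mass."""
    return center_of_mass(t, s2, weight) - center_of_mass(t, s1, weight)

# Illustrative check: a Gaussian transient and a copy delayed by 0.2 s,
# both contaminated by additive Gaussian white noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 2000)
pulse = lambda t0: np.exp(-((t - t0) ** 2) / (2 * 0.05**2))
s1 = pulse(0.8) + 0.01 * rng.standard_normal(t.size)
s2 = pulse(1.0) + 0.01 * rng.standard_normal(t.size)
print(tde_si(t, s1, s2, weight="squared"))  # expected to be close to 0.2
```

As the abstract notes, squaring the signal concentrates the mass around the transient and lowers the σ of the estimate, at the cost of a bias that D̂′(s²) is designed to remove.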