This paper addresses the estimation of the time delay between two spatially separated noisy signals through system identification modeling, with both the input and the output corrupted by additive white Gaussian noise. The proposed method is based on a modified adaptive Butler-Cantoni equalizer that decouples noise variance estimation from channel estimation. The bias in the time delay estimates induced by the input noise is reduced by an IIR whitening filter whose coefficients are found by the Burg algorithm. For step time-variant delays, a dual-mode operation scheme is adopted, consisting of a normal operating (tracking) mode and an interrupt operating (optimization) mode. In the tracking mode, only a few coefficients of the impulse response vector are monitored through L1-norm finite forward differences; when a step change is detected, the interrupt mode is entered and the time delay estimate is re-optimized. Simulation results confirm the superiority of the proposed approach at low signal-to-noise ratios.
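The following is a minimal sketch of the pipeline the abstract describes, under stated simplifying assumptions: the channel is a pure delay, a normalized LMS identifier stands in for the modified Butler-Cantoni equalizer, and the whitening stage is realized as the FIR prediction-error filter A(z) from Burg's method (the paper's whitening filter is IIR). All names, parameters, and thresholds here are illustrative, not the paper's implementation.

```python
import numpy as np
from scipy.signal import lfilter

def burg_ar(x, order):
    """Burg's method: AR polynomial a = [1, a1, ..., a_order] for x."""
    f = np.asarray(x, float).copy()   # forward prediction errors
    b = f.copy()                      # backward prediction errors
    a = np.array([1.0])
    for m in range(order):
        fs, bs = f[m + 1:], b[m:-1]
        k = -2.0 * fs.dot(bs) / (fs.dot(fs) + bs.dot(bs))  # reflection coefficient
        f[m + 1:], b[m + 1:] = fs + k * bs, bs + k * fs
        a = np.append(a, 0.0)
        a = a + k * a[::-1]           # Levinson-Durbin order update
    return a

def nlms_identify(x, y, n_taps, mu=0.5):
    """Identify the x -> y channel with an adaptive FIR filter (normalized LMS)."""
    w = np.zeros(n_taps)
    for n in range(n_taps, len(x)):
        u = x[n - n_taps:n][::-1]                # regressor, most recent sample first
        e = y[n] - w.dot(u)                      # a priori output error
        w += mu * e * u / (u.dot(u) + 1e-12)     # normalized LMS update
    return w

rng = np.random.default_rng(0)
true_delay = 7
s = lfilter([1.0], [1.0, -0.8], rng.standard_normal(4000))  # colored source signal
x = s + 0.5 * rng.standard_normal(s.size)                   # noisy reference channel
y = np.roll(s, true_delay) + 0.5 * rng.standard_normal(s.size)  # noisy delayed channel

# Whiten both channels with the prediction-error filter found by Burg's method,
# which is what reduces the input-noise-induced bias in the delay estimate.
a = burg_ar(x, order=8)
xw, yw = lfilter(a, [1.0], x), lfilter(a, [1.0], y)

# Optimization mode: estimate the delay as the peak of the identified impulse response.
w = nlms_identify(xw, yw, n_taps=32)
delay_hat = int(np.argmax(np.abs(w)))
print("estimated delay:", delay_hat)  # should recover true_delay under these settings

# Tracking mode: monitor a few taps around the current peak; a jump in the L1 norm
# of their finite forward differences flags a step delay change (ad hoc threshold).
w_prev = w.copy()  # taps retained from the previous data block
monitored = slice(max(delay_hat - 2, 0), delay_hat + 3)
if np.sum(np.abs(w[monitored] - w_prev[monitored])) > 0.1:
    pass  # re-enter the optimization mode and re-estimate the delay
```

The dual-mode structure keeps the steady-state cost low: only the handful of monitored taps is updated per sample in tracking mode, and the full identification runs only after the L1-norm difference test signals a step change in the delay.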